• Create and maintain ELT/ETL processes for existing and new systems
• Collaborate with development and business teams to understand requirements and define source system data flows
• Develop and maintain ETL/ELT specifications for data integration development
• Define and deliver consistent data modeling and data architecture standards, methodologies, guidelines, and techniques
• Document, implement, and maintain the data pipeline architecture and related business processes
• Serve as a source of knowledge on industry practices and processes
• Participate in the development of enterprise standards and guidelines for data model quality and accuracy
• Audit project-level data model deliverables to ensure that practices and standards are met
• Analyze information and data requirements and understand the effects of data inconsistencies
• Identify inefficiencies in the current architecture and processes, and communicate solutions in a way that earns support from the teams involved
• Perform cost and sizing estimates for projects
• Collaborate with the project coordinator and the rest of the agile team to identify epics and stories and to estimate effort
• Create and maintain data dictionary documents and table and data lineage models, and produce artifacts that support project development and communicate project information to customers
• Bachelor's degree in Computer Science or an equivalent engineering degree
• 2+ years of hands-on experience building data pipelines (ETL/ELT) on a cloud platform
• Mastery of SQL and Python
• Experience using Airflow
• Equal Employment Opportunity employer
• High-growth, high-autonomy culture
• Competitive compensation philosophy
• Commitment to diversity