Lead Data Engineer

October 17

The Baldwin Group

Commercial Risk Management • Private Risk Management • Personal Insurance • Employee Benefits • Asset and Income Protection

1,001 - 5,000 employees

Description

• Oversee the design, building, and optimization of data orchestration and pipelines.
• Optimize data collection and flow for cross-functional teams.
• Mentor junior engineers and provide technical guidance.
• Create and maintain optimal data orchestration architecture.
• Identify, design, and implement internal process improvements.
• Build the infrastructure for optimal extraction, transformation, and ingestion of data.
• Support analytics tools that utilize the data pipeline.
• Collaborate with stakeholders to assist with data-related technical issues.
• Create data tools for analytics and data science teams.
• Communicate internally and externally to collect and validate data.

Requirements

• Bachelor’s degree in a related field preferred; equivalent years of experience considered.
• Seven to ten years of data-related or analytical work experience in a Data Engineer role, preferably with three of those years within the Azure ecosystem.
• Experience building and optimizing ‘big data’ pipelines, architectures, and data sets.
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Advanced working knowledge of SQL, including query authoring and experience with relational databases, as well as working familiarity with a variety of data platforms.
• Advanced understanding of, and experience implementing, data lakes and lakehouses.
• Advanced understanding of, and experience with, file storage layer management in a data lake environment, including Parquet and Delta file formats.
• Solid experience with Spark (PySpark) and data processing techniques.
• Solid understanding of, and experience with, Azure Synapse tools and services.
• Some knowledge of Python preferred.
• Strong analytical skills for working with structured, semi-structured, and unstructured datasets and blob storage.
• Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
• A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
• Strong project management and organizational skills.
• Experience supporting and working with cross-functional teams in a dynamic environment.
• Insurance industry experience preferred.
