IT Tech Lead

January 9

OhioHealth

Healthcare • Home Care • Rehabilitation • Sports Medicine • Outpatient

Description

• Looking for an experienced Lead Data Engineer with a background in Informatica ETL and cloud technologies.
• In this senior role, the tech lead will be responsible for overseeing the architecture, development, and maintenance of our data platform, ensuring data quality and efficiency.
• Serve as the primary SME within their respective functions, with deep knowledge of the applications and platforms the team is responsible for.
• Lead the design and implementation of ETL processes using various ETL tools and diverse data sources, covering both batch and real-time data (an illustrative sketch of this kind of batch pipeline follows this list).
• Develop and manage data solutions on cloud platforms (Azure).
• Provide technical leadership and mentorship to a team of data engineers, ensuring best practices and high-quality deliverables.
• Optimize data pipelines and systems for performance, scalability, and reliability.
• Work closely with data analysts, data scientists, and other business stakeholders to understand data requirements and deliver effective solutions.
• Maintain comprehensive documentation of data architecture/design, pipelines, and processes.
• Identify and implement improvements to data engineering practices and technologies.
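
For context, here is a minimal PySpark sketch of the kind of batch ETL job described above. It is illustrative only and not part of the posting; the ADLS path, table names, and column names are assumptions.

```python
# Illustrative sketch only: a minimal batch ETL job in PySpark on Azure.
# The ADLS path, table, and column names are hypothetical, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("encounters_daily_load").getOrCreate()

# Ingest: raw extracts landed in an Azure Data Lake Storage container.
raw = spark.read.option("header", True).csv(
    "abfss://raw@examplelake.dfs.core.windows.net/encounters/"
)

# Transform: de-duplicate, type the date column, and drop rows missing the key.
curated = (
    raw.dropDuplicates(["encounter_id"])
       .withColumn("admit_date", F.to_date("admit_date", "yyyy-MM-dd"))
       .filter(F.col("encounter_id").isNotNull())
)

# Load: publish a curated Delta table for downstream analysts and data scientists.
curated.write.format("delta").mode("overwrite").saveAsTable("curated.encounters")
```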

Requirements

• Minimum of 12 – 15 years of experience in data engineering, with a focus on ETL and cloud technologies (can be less than 10 years).
• Proficiency in Informatica ETL tools, SQL, and cloud platforms (preferably Azure, but others acceptable).
• Experience with Informatica IDMC: architect, design, and develop ETL processes on IDMC and the Informatica suite of tools.
• Familiarity with MDM architecture and data flow.
• Experience with potential migration from existing data platforms to Databricks or Microsoft Fabric.
• Must have working experience with Azure-based data pipelining, scheduling, and monitoring, and with PySpark, including the ability to debug troublesome pipelines.
• Must have hands-on expertise dealing with data pipelines.
• Strong working experience with big data technologies (Spark, Databricks) for data integration and processing (ingestion, transformation, curation, etc.), preferably on Azure, and a clear understanding of how the resources work and integrate with cloud and on-prem systems.
• High level of proficiency with database and data warehouse development, including replication, staging, ETL, stored procedures, partitioning, change data capture, triggers, scheduling tools, cubes, and data marts (an illustrative change-data-capture sketch follows this list).
• Experience working with backend languages such as Python.
• Strong computer literacy and proficiency in data manipulation using analytics platforms such as Databricks and Microsoft Fabric on the Spark engine.
• Expertise in at least one technology stack for designing, developing, testing, and/or delivering complex software (e.g., Java, Python, PySpark).
• Excellent debugging, troubleshooting, and analytical skills.
• Strong analytical and problem-solving skills, with the ability to own, troubleshoot, and resolve complex data issues.
• Collaborate with architects and managers to develop metrics and KPIs.
• Identify technical risks and form contingency plans as soon as possible.
• Good communication and collaboration skills to work effectively with cross-functional teams.
• Experience working in an Agile development environment preferred.
• Experience building data pipelines to support ML workflows is a plus.
• Experience working with geographically distributed teams (different time zones).
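
For illustration only (not part of the posting), here is a minimal sketch of the kind of change-data-capture merge the requirements describe, written in PySpark against Delta Lake in a Databricks-style environment. The table names, key column, and staging source are assumptions.

```python
# Illustrative sketch only: a change-data-capture style upsert with Delta Lake.
# Assumes a Databricks / delta-spark environment; table names and keys are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("patients_cdc_merge").getOrCreate()

# Incremental changes landed by an upstream extract (hypothetical staging table).
changes = spark.read.table("staging.patient_changes")

# Target curated table, maintained as a Delta table.
target = DeltaTable.forName(spark, "curated.patients")

# Upsert: update rows that already exist, insert the rest.
(
    target.alias("t")
    .merge(changes.alias("s"), "t.patient_id = s.patient_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```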

January 9

Allstate

10,000+ employees

As a Lead Consultant, you will architect digital products and manage applications at Allstate, ensuring success through innovative technology.

January 9

Dropbox

As an Infrastructure Engineer, you'll develop systems for Dropbox's search platform, impacting millions of users.

Discover 70,000+ Remote Jobs!

Join now to unlock all job opportunities.

Find your dream remote job

Discover hidden jobs

We scan the internet every day and find jobs not posted on LinkedIn or other job boards.

Head start against the competition

We find jobs within 24 hours of being posted, so you can apply before everyone else.

Be the first to know

Daily emails with new job openings straight to your inbox.

Choose your membership

Cancel anytime

Loved by 10,000+ remote workers

Wall of Love

Frequently asked questions

We use powerful scraping tech to scan the internet for thousands of remote jobs daily. It runs 24/7 and is costly to operate, so we charge for access to keep the site running.

Of course! You can cancel your subscription at any time with no hidden fees or penalties. Once canceled, you’ll still have access until the end of your current billing period.

Other job boards only list jobs from companies that pay to post. This means you miss out on jobs from companies that don't want to pay. Remote Rocketship, on the other hand, scrapes the internet for jobs and doesn't accept payments from companies. This means we have thousands more jobs!

New jobs are constantly being posted. We check each company website every day to ensure we have the most up-to-date job listings.

Yes! We’re always looking to expand our listings and appreciate any suggestions from our community. Just send an email to Lior@remoterocketship.com. I read every request.

Why I created Remote Rocketship

Remote Rocketship is a solo project by me, Lior Neu-ner. I built this website for my wife when she was looking for a job! She was having a hard time finding remote jobs, so I decided to build her a tool that would search the internet for her.

Built by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or lior@remoterocketship.com