September 20
• Lead and provide advanced data engineering expertise, within a team of engineers, for projects that enable analytics to drive optimization of decisions for clients.
• Design new methods and processes to ensure maximum effectiveness of client data.
• Partner with data analysts/scientists to provide solutions enabling statistical analysis tools and data visualization applications.
• Identify processes and tools that can be shifted toward automation to enable seamless development and self-service analytics workloads.
• Partner with various business units and data stewards to understand business needs.
• Obtain and/or maintain technical expertise in available data manipulation and preparation tools (ADF, Talend, Informatica, Matillion, etc.) as well as programming languages (Python, Spark, etc.).
• Ensure data is secure, relevant, and maintains high quality standards.
• Identify and implement industry best practices.
• Evaluate new data sets to determine appropriate ingestion techniques.
• Build, manage, and optimize data pipelines through a variety of ETL tools, including custom infrastructure and 3rd-party tooling (Azure/AWS, Databricks, Snowflake).
• Work with internal engineering teams and vendors to understand business logic and ensure veracity in datasets.
• Generate documentation on existing production data logic and the business processes that influence it, in order to reconcile knowledge gaps between the business, engineering, and data collection.
• Bachelor's or Master's degree in Computer Science or a relevant field is required.
• 10+ years of experience delivering data engineering solutions that include batch and streaming capabilities.
• 5+ years of strategic/management consulting experience, including pre-sales activities (RFP response, proposal development, SOW development, etc.), is required.
• Experience implementing AWS HealthLake for a healthcare provider organization is required.
• Expert knowledge of cloud services (AWS preferred).
• In-depth expertise in design, implementation, engineering, automation, DevOps implementation, service operation, and service improvement initiatives.
• Experience migrating from on-prem or other cloud providers to the cloud using cloud migration tools.
• Experience creating and delivering end-to-end roadmaps that address a business problem.
• Strong understanding of the Unix operating system and experience in a scripting language such as Python, Shell, Awk, etc.
• Experience building, testing, automating, and optimizing data pipelines is required.
• Experience using emerging technologies (Snowflake, Databricks, Synapse, etc.) is required.
• Strong understanding and prior use of SQL, with high proficiency in the workings of data technologies (Hadoop, Hive, Spark, Kafka, low-latency data stores, Airflow, etc.).
• Passion for software development and data, with strong skills in data extraction, transformation, and processing to optimize quantitative analyses of various business functions.
• Familiarity with Scrum, DevOps, and DataOps methodologies, and supporting tools such as JIRA.
• Excellent oral and written communication skills.
• Strong presentation skills and the ability to communicate analytical and technical concepts with confidence, in an easy-to-understand fashion, to technical and non-technical audiences.
• Candidates can reside anywhere in the U.S. and must be flexible to travel to the client location 1 month out of the month.
• Healthcare
• Retirement
• Life insurance
• Short/long-term disability
• Unlimited paid time off
• Short-term incentive plans (annual bonus)
• Long-term incentive plans