HealthEdge is a company that specializes in providing advanced solutions for healthcare payers through its HealthRules Solutions Suite. This suite includes a comprehensive digital claims administration system, care management workflow solutions, and payment integrity solutions, which aim to enhance operational efficiency and improve quality of care for health plans. By leveraging integrated technology and automation, HealthEdge helps health plans eliminate data silos, increase payment accuracy, and elevate the member experience, thereby transforming the healthcare landscape for better collaboration and accessibility.
Healthcare Technology
4 days ago
Airflow
Amazon Redshift
AWS
Azure
BigQuery
Cloud
ETL
Google Cloud Platform
Kafka
MySQL
Oracle
Postgres
Python
RDBMS
Spark
SQL
• In this role you will be an integral part of the Data Engineering team, responsible for building high-quality, robust, and scalable data platform environments and solutions to support our data lake and data warehouse environments
• Our team collaborates closely with other engineering, analytics, and data science teams to identify and implement optimal data solutions for the Wellframe platform
• Build data pipelines to assemble large, complex sets of data that meet functional and non-functional business requirements (a minimal orchestration sketch follows this list)
• Work closely with the data architect, SMEs, and other technology partners to develop and execute the data architecture and product roadmap
• Build analytical tools that utilize the data pipeline, providing actionable insight into key business performance, including operational efficiency and business metrics
• Work with stakeholders, including leadership, product, and customer teams, to support their data infrastructure needs while assisting with data-related technical issues
• Act as a subject matter expert for other team members, providing technical guidance, solution design, and best practices within the customer organization
• Keep current on big data and data visualization technology trends; evaluate tools, work on proofs of concept, and make recommendations on cloud technologies
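To make the pipeline-building and dependency-workflow responsibilities above concrete, here is a minimal sketch of an Airflow DAG with a simple extract → transform → load chain. It assumes a recent Airflow 2.x release with the TaskFlow API, and every DAG, task, and field name is an illustrative placeholder rather than anything from the Wellframe platform.

```python
# A minimal, hypothetical Airflow DAG sketch (assumes Airflow 2.x, TaskFlow API).
# DAG, task, and field names are illustrative only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def claims_pipeline():
    @task
    def extract_claims() -> list[dict]:
        # In practice this would read from an RDBMS (Postgres, SQL Server, ...)
        # or an object store; a stub payload keeps the sketch self-contained.
        return [{"claim_id": 1, "billed_amount": 125.50}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Apply cleansing / business rules before loading.
        return [r for r in rows if r["billed_amount"] > 0]

    @task
    def load_to_warehouse(rows: list[dict]) -> None:
        # Load into the analytical warehouse (BigQuery, Redshift, Snowflake, ...).
        print(f"Loading {len(rows)} rows")

    # Chaining the calls wires up the dependency graph:
    # extract_claims -> transform -> load_to_warehouse
    load_to_warehouse(transform(extract_claims()))


claims_pipeline()
```

In a real pipeline the stubbed tasks would be replaced by reads and writes against the actual sources and warehouse, while the same dependency structure handles scheduling and retries.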
• 3+ years of data engineering experience working with large data sets (preferably terabyte scale)
• Experience building data pipelines using ETL tools such as Glue, ADF, notebooks, stored procedures, SQL/Python constructs, or similar
• Deep experience working with industry-standard RDBMSs such as Postgres, SQL Server, Oracle, and MySQL, and with analytical cloud databases such as BigQuery, Redshift, Snowflake, or similar
• Advanced SQL expertise and solid programming experience with Python and/or Spark
• Experience with orchestration tools such as Airflow and with building complex dependency workflows
• Experience developing and implementing data warehouse or data lake architectures, OLAP technologies, and data modeling with star/snowflake schemas to enable analytics and reporting (a brief star-schema sketch follows this list)
• Strong problem-solving capabilities, experience troubleshooting data issues, and experience stabilizing big data systems
• Excellent communication and presentation skills, as you'll be regularly interacting with stakeholders and engineering leadership
• Bachelor's or Master's degree in a quantitative discipline such as Computer Science, Computer Engineering, Analytics, Mathematics, Statistics, Information Systems, or another scientific field
• Bonus points: hands-on experience with cloud data migration and with analytics platforms such as Fabric or Databricks on the cloud
• Certification in one of the cloud platforms (AWS/GCP/Azure)
• Experience with, or demonstrated understanding of, real-time data streaming tools such as Kafka, Kinesis, or similar
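As a rough illustration of the star-schema modeling called out above, the sketch below builds a claims fact table keyed to member and provider dimensions using PySpark. All paths, table names, and columns (dim_member, member_sk, fact_claims, and so on) are hypothetical, and it assumes the dimension tables already carry surrogate keys.

```python
# A rough star-schema load sketch in PySpark. All paths, tables, and column
# names are hypothetical examples, not HealthEdge or Wellframe schemas.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

claims = spark.read.parquet("s3://example-bucket/raw/claims/")          # raw source rows
dim_member = spark.read.parquet("s3://example-bucket/dims/member/")     # surrogate-keyed dimension
dim_provider = spark.read.parquet("s3://example-bucket/dims/provider/")

# Resolve natural keys to dimension surrogate keys and keep only the foreign
# keys plus measures -- the classic fact-table shape for analytics/reporting.
fact_claims = (
    claims
    .join(dim_member.select("member_id", "member_sk"), on="member_id", how="left")
    .join(dim_provider.select("provider_id", "provider_sk"), on="provider_id", how="left")
    .select(
        "claim_id",
        "member_sk",
        "provider_sk",
        F.col("service_date").cast("date").alias("service_date"),
        F.col("billed_amount").cast("decimal(18,2)").alias("billed_amount"),
        F.col("paid_amount").cast("decimal(18,2)").alias("paid_amount"),
    )
)

fact_claims.write.mode("overwrite").partitionBy("service_date").parquet(
    "s3://example-bucket/warehouse/fact_claims/"
)
```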
Meetsta seeks a Data Engineer to design and develop mobile applications for iOS. (posted 4 days ago)
As a Data Architect at ITC Federal, you'll maintain databases for federal projects, ensuring architecture and data integrity. (posted 4 days ago, 51-200 employees)
As a Data Engineer at Elder Research, design and build data pipelines for analytics challenges. (posted 4 days ago, 51-200 employees)
Serve the Department of Defense as a Data Engineer, enhancing search capabilities with AI, and engage in data analytics to drive rapid delivery of solutions. (posted 4 days ago, 51-200 employees)
As a Data Engineer, design and build data pipelines at a data science consulting firm.