Airflow
Apache
AWS
D3.js
Distributed Systems
ETL
JavaScript
Kafka
Kubernetes
MongoDB
Node.js
Postgres
RabbitMQ
TypeScript
• OneImaging is revolutionizing the radiology landscape by providing high-quality imaging services at significantly more affordable rates.
• We assist patients throughout their care journey, eliminating common obstacles such as high out-of-pocket costs, limited availability, poor scheduling, and tedious insurance follow-ups.
• Join us in making a tangible impact on healthcare.
• We are seeking a skilled and motivated Data Engineer to join our team.
• You will play a key role in developing and maintaining the core infrastructure that powers our platform.
• The ideal candidate has a strong foundation in building scalable ETL services interfacing with multiple data sources, experience with modern web application frameworks and backend development, and a passion for improving healthcare through technology.
• Implement and manage ETL scripts and processes for data ingestion, validation, transformation, database updates, and reporting using Apache Airflow (a minimal example DAG is sketched after this list).
• Develop and maintain scalable API-adjacent services using Node.js (Express), MongoDB, Mongoose, and Postgres within an AWS ecosystem.
• Understand the data sources and data flow within the platform, and make recommendations to optimize data models and schemas in MongoDB and Postgres.
• Manage database migrations and helper scripts where needed to ensure smooth and efficient updates between versions.
• Optimize query and controller performance to ensure responsive and efficient operations.
• Collaborate with the engineering team to develop and scale event-driven communication between backend services using technologies such as Kafka, SQS, or RabbitMQ.
• Support integration and maintenance of databases with BI tools such as Looker and data visualization tools such as D3.js.
• Maintain and optimize API services for improved interaction with frontend applications.
• Create documentation for key parts of the platform that fall under your domain.
• B.S./M.S. in Computer Science, Engineering, Applied Math & Statistics, or a related computational field with 3+ years of experience in backend-focused software or data engineering, OR 6+ years of experience in backend-focused software or data engineering.
• Proven experience in backend development, with strong proficiency in Node.js (Express), MongoDB, Mongoose, and Postgres.
• Familiarity with modern scaling approaches and techniques for FTP and for delivering statically stored assets to the API server.
• Hands-on experience with AWS services including S3, Route 53, App Runner, Fargate, Bedrock, and Transfer Family.
• Experience with ETL processes using Apache Airflow or similar DAG schedulers.
• Familiarity with integrating databases with BI tools and data visualization frameworks.
• Knowledge of event-driven architecture and communication between services using Kafka, SQS, or RabbitMQ.
• Experience with data extraction from scanned documents using OCR or Anthropic Claude via Amazon Bedrock.
• Strong understanding of data security, encryption, and compliance best practices.
• Ability to work independently and as part of a collaborative team.
• Excellent problem-solving skills and attention to detail.
• English proficiency, strong communication skills, and the ability to work effectively with cross-functional teams.
• Exceptional problem-solving skills and innovative thinking.
• Ability to work effectively in a team-oriented environment.
• Experience working with healthcare data standards such as FHIR, HL7, and DICOM.
• Document OCR experience.
• Knowledge of container orchestration with Kubernetes for managing scalable, distributed systems.
• Interest and experience in backend development on Node.js/TypeScript-based projects.
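For context on the Airflow responsibility above, here is a minimal sketch of the kind of ETL DAG the role involves, written with the Airflow 2.x TaskFlow API. The DAG id, table and column names, and the "warehouse_postgres" connection id are hypothetical placeholders, not details of the actual OneImaging platform.

from datetime import datetime

from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def imaging_claims_etl():
    """Hypothetical ingestion -> validation -> load pipeline."""

    @task
    def extract() -> list[dict]:
        # Pull raw rows from an upstream source (e.g. an S3 export or SFTP drop).
        return [{"claim_id": "A123", "amount": "250.00"}]

    @task
    def validate(rows: list[dict]) -> list[dict]:
        # Keep only rows that carry the fields required downstream.
        return [r for r in rows if r.get("claim_id") and r.get("amount")]

    @task
    def load(rows: list[dict]) -> None:
        # Upsert validated rows into Postgres via an Airflow connection.
        hook = PostgresHook(postgres_conn_id="warehouse_postgres")
        for r in rows:
            hook.run(
                "INSERT INTO claims (claim_id, amount) VALUES (%s, %s) "
                "ON CONFLICT (claim_id) DO UPDATE SET amount = EXCLUDED.amount",
                parameters=(r["claim_id"], r["amount"]),
            )

    load(validate(extract()))


imaging_claims_etl()

The row-by-row hook.run call is for illustration only; for larger batches a bulk insert (for example PostgresHook.insert_rows) would likely be the better choice.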
• Competitive salary and benefits package.
• Make a tangible impact on patients and the healthcare system in an undersaturated field full of opportunity.
• Work with a team of passionate and innovative individuals experienced in the health tech field.
• Opportunity for leadership within teams, career growth, and acquiring cross-team skills.
• Flexible work environment with remote opportunities.