October 22
Airflow
AWS
Cloud
Docker
Google Cloud Platform
JavaScript
Jenkins
Kafka
Kubernetes
Postgres
PySpark
Python
SQL
Tableau
Terraform
Go
• We're looking for a savvy and experienced Senior Data Engineer to join the Data Platform Engineering team at Hims.
• As a Senior Data Engineer, you will work with the analytics engineers, product managers, engineers, security, DevOps, analytics, and machine learning teams to build a data platform that backs the self-service analytics, machine learning models, and data products serving million+ Hims & Hers subscribers.
• Architect and develop data pipelines to optimize performance, quality, and scalability
• Build, maintain & operate scalable, performant, and containerized infrastructure required for optimal extraction, transformation, and loading of data from various data sources
• Design, develop, and own robust, scalable data processing and data integration pipelines using Python, dbt, Kafka, Airflow, PySpark, SparkSQL, and REST API endpoints to ingest data from various external data sources into the Data Lake
• Develop testing frameworks and monitoring to improve data quality, observability, pipeline reliability, and performance
• Orchestrate sophisticated data flow patterns across a variety of disparate tooling
• Support analytics engineers, data analysts, and business partners in building tools and data marts that enable self-service analytics
• Partner with the rest of the Data Platform team to set best practices and ensure they are followed
• Partner with the analytics engineers to ensure the performance and reliability of our data sources
• Partner with machine learning engineers to deploy predictive models
• Partner with the legal and security teams to build frameworks and implement data compliance and security policies
• Partner with DevOps to build IaC and CI/CD pipelines
• Support code versioning and code deployments for data pipelines
• 8+ years of professional experience designing, creating, and maintaining scalable data pipelines using Python, API calls, SQL, and scripting languages
• Demonstrated experience writing clean, efficient, and well-documented Python code, with a willingness to become effective in other languages as needed
• Demonstrated experience writing complex, highly optimized SQL queries across large data sets
• Experience with cloud technologies such as AWS and/or Google Cloud Platform
• Experience with the Databricks platform
• Experience with IaC technologies like Terraform
• Experience with data warehouses like BigQuery, Databricks, Snowflake, and Postgres
• Experience building event streaming pipelines using Kafka/Confluent Kafka
• Experience with the modern data stack, e.g. Airflow/Astronomer, Databricks, dbt, Fivetran, Confluent, Tableau/Looker
• Experience with containers and container orchestration tools such as Docker or Kubernetes
• Experience with Machine Learning & MLOps
• Experience with CI/CD (Jenkins, GitHub Actions, Circle CI)
• Competitive salary & equity compensation for full-time roles
• Unlimited PTO, company holidays, and quarterly mental health days
• Comprehensive health benefits including medical, dental & vision, and parental leave
• Employee Stock Purchase Program (ESPP)
• Employee discounts on hims & hers & Apostrophe online products
• 401k benefits with employer matching contribution
• Offsite team retreats
October 21
10,000+
Integrates healthcare data and maintains systems for user consumption at a university.
October 21
1001 - 5000
Develop scalable data platforms for ServiceTitan's data-driven products.
October 21
201 - 500
Develop systems to detect intent signals for B2B sales and marketing teams.
🇺🇸 United States – Remote
💵 $150k - $170k / year
💰 Debt Financing on 2021-04
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🗽 H1B Visa Sponsor
October 20
11 - 50
Data Engineer builds scalable data pipelines for large language models at CDS.