5 days ago
Airflow
Amazon Redshift
Apache
AWS
Azure
Backbone
Cloud
Distributed Systems
ETL
Google Cloud Platform
Hadoop
Kafka
Python
Scala
Spark
SQL
Go
• Are you passionate about building data infrastructure that powers advanced analytics and machine learning?
• Do you thrive on transforming raw data into well-organized, accessible, and reliable datasets?
• As a Data Engineer, you’ll be responsible for constructing efficient, scalable data pipelines.
• You’ll work with large datasets, implement ETL processes, and build the infrastructure that powers analytics and AI-driven insights.
• Build and maintain robust, scalable, and efficient data pipelines to ingest, process, and store data.
• Architect and maintain data warehouses or data lakes using cloud platforms.
• Work closely with data scientists, analysts, and other stakeholders to understand data requirements.
• Implement data quality checks and monitoring systems to ensure the accuracy, completeness, and consistency of data.
• Optimize the performance of data systems, ensuring fast and reliable data access.
• Automate data workflows, pipeline deployments, and data quality checks.
• Implement security protocols to protect sensitive data, ensuring compliance with relevant regulations.
• Data Engineering Expertise: Strong experience building and maintaining data pipelines, ETL processes, and data warehouses using cloud platforms (AWS, GCP, Azure). You’re skilled at handling large, complex datasets efficiently.
• Programming and Scripting: Proficiency in languages such as Python, SQL, or Scala, and experience with data engineering tools like Apache Spark, Airflow, or Kafka. You can write efficient code to process and transform large datasets.
• Data Warehousing and Storage: Expertise in managing and optimizing data warehouses or data lakes (e.g., Redshift, BigQuery, Snowflake). You understand partitioning, indexing, and storage optimization techniques.
• Database and Query Optimization: Strong knowledge of database design and query optimization for performance. You can fine-tune SQL queries and structure databases for fast, reliable access to large volumes of data.
• Data Governance and Security: Solid understanding of data governance practices, security protocols, and compliance regulations. You can enforce data privacy and implement measures to safeguard sensitive information.
• Health and Wellness: Comprehensive medical, dental, and vision insurance plans with low co-pays and premiums.
• Paid Time Off: Competitive vacation, sick leave, and 20 paid holidays per year.
• Work-Life Balance: Flexible work schedules and telecommuting options.
• Professional Development: Opportunities for training, certification reimbursement, and career advancement programs.
• Wellness Programs: Access to wellness programs, including gym memberships, health screenings, and mental health resources.
• Life and Disability Insurance: Life insurance and short-term/long-term disability coverage.
• Employee Assistance Program (EAP): Confidential counseling and support services for personal and professional challenges.
• Tuition Reimbursement: Financial assistance for continuing education and professional development.
• Community Engagement: Opportunities to participate in community service and volunteer activities.
• Recognition Programs: Employee recognition programs to celebrate achievements and milestones.
5 days ago
11 - 50
Design and maintain scalable data platform for social change with ForceMetrics.
5 days ago
51 - 200
Data Engineer for CDC Foundation's public health data infrastructure development.
🇺🇸 United States – Remote
💵 $103.5k - $143.5k / year
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
🗽 H1B Visa Sponsor
5 days ago
501 - 1000
DataOps Data Engineer to build and maintain scalable data pipelines for Fetch’s business.
🇺🇸 United States – Remote
💰 Debt Financing on 2022-04
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
🗽 H1B Visa Sponsor