We Turn Challengers Into Champions
Paid Search & Media • Creative Services • SEO • Earned Social • Content Marketing
201 - 500
August 2
🇺🇸 United States – Remote
💵 $130k - $165k / year
⏰ Full Time
🟠 Senior
🚰 Data Engineer
Airflow
Benchmarks
Cloud
ETL
Google Cloud Platform
Grafana
Kubernetes
Postgres
Prometheus
Python
SQL
Terraform
Go
• Act as a technical leader and Subject Matter Expert (SME) in data pipeline automation and workflow orchestration.
• Design, implement, and maintain complex, reliable data solutions with a focus on automation using Airflow, dbt, and Google Cloud data products.
• Manage a large portfolio of dbt models, leveraging macros and DRY patterns.
• Advocate for test-driven development and assist QA in developing a robust, reliable process for continuous integration and delivery.
• Monitor, troubleshoot, and optimize the performance of data pipelines and workflows.
• Architect solutions and reusable patterns that scale with business needs.
• Provide implementation, configuration, and deployment documentation.
• Proactively address issues and problems, generating and implementing innovative solutions.
• Participate in all agile ceremonies, including daily standups and regular sprint planning.
• Mentor other engineers and foster a culture of technical excellence.
• Stay up to date with the latest industry trends and technologies to drive continuous improvement and innovation in data engineering practices.
• Ensure data security, governance, and compliance with relevant standards and privacy restrictions.
• 7+ years of experience in software development, with extensive experience in Python, SQL, and data pipelines.
• Expertise in building and automating ETL/ELT pipelines using Airflow or other DAG-based workflow management software.
• Deep experience managing dbt models using macros and dbt tests.
• Mastery of Python or a comparable scripting language, API integrations, and software architecture.
• Deep experience with databases such as PostgreSQL and BigQuery, including query optimization for performance and cost.
• Good understanding of business intelligence tools like Looker or comparable alternatives.
• Experience with Google Cloud Platform, Kubernetes, and managing infrastructure as code using Terraform.
• Proficiency with advanced data formats (Parquet, Avro, Delta Lake, Hive, JSONL) and data integration techniques.
• Experience with monitoring and logging tools (e.g., Prometheus, Grafana) is a plus.
• Familiarity with version control systems and CI/CD tools like GitHub Actions.
• Strong command of agile methodologies, continuous integration, and test-driven development.
• Exceptional problem-solving skills and technical leadership.
• Ability to influence and guide cross-functional teams and projects.
• Half-day Fridays year round
• Unlimited PTO
• Extended holiday break (Winter)
• Flexible schedules
• Work from anywhere options*
• 100% paid parental leave
• 401(k) matching
• Medical, Dental, Vision, Life, Pet Insurance
• Sponsored life insurance
• Short Term Disability insurance and additional voluntary insurance
• Annual Class Pass Credits and more!
August 1
51 - 200
Improve and transform VRChat's data capabilities while collaborating across teams.
🇺🇸 United States – Remote
💰 $80M Series D on 2021-06
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
July 31
2 - 10
Design and scale data infrastructure to support data-driven decisions in the crypto sector.
July 30
51 - 200
Build scalable data pipelines and implement test automation for Device42's software.
🇺🇸 United States – Remote
💰 $34M Series A on 2019-03
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
July 29
11 - 50
Enhance data analytics pipeline using SQL, Scala, and ETL technologies.
🇺🇸 United States – Remote
💰 Seed Round on 2020-02
⏰ Full Time
🟠 Senior
🚰 Data Engineer
July 29
51 - 200
Balance BI architecture, analysis, and operations to enhance data privacy solutions.