Data Engineer

September 3

Apply Now

Description

• Design, develop, and deploy data pipelines and data models across the various Data Lake / DWH layers
• Ingest data from, and export data to, multiple third-party systems and platforms (e.g., Salesforce, Braze, SurveyMonkey)
• Architect and implement data-related microservices and products
• Ensure the implementation of best practices in data management, including data lineage, observability, and data contracts
• Maintain, support, and refactor legacy models and layers within the DWH

Requirements

• Minimum of 3 years of experience in software development, data engineering, or business intelligence
• Proficiency in Python - a must
• Advanced SQL skills - a must
• Strong background in data modeling, ETL development, and data warehousing - a must
• Experience with big data technologies, particularly Airflow - a must
• General understanding of cloud environments such as AWS, GCP, or Azure - a must
• Familiarity with tools such as Spark, Hive, Airbyte, Kafka, ClickHouse, Postgres, Great Expectations, DataHub, or Iceberg - advantageous
• Experience with Terraform, Kubernetes (K8s), or ArgoCD - advantageous
