September 3
Airflow
AWS
Azure
Cloud
ETL
Google Cloud Platform
Kafka
Kubernetes
Microservices
Postgres
Python
Spark
SQL
Terraform
• Design, develop, and deploy data pipelines and data models across the various Data Lake / DWH layers.
• Ingest data from, and export data to, multiple third-party systems and platforms (e.g., Salesforce, Braze, SurveyMonkey).
• Architect and implement data-related microservices and products.
• Ensure the implementation of best practices in data management, including data lineage, observability, and data contracts.
• Maintain, support, and refactor legacy models and layers within the DWH.
• Minimum of 3 years of experience in software development, data engineering, or business intelligence.
• Proficiency in Python - a must.
• Advanced SQL skills - a must.
• Strong background in data modeling, ETL development, and data warehousing - a must.
• Experience with big data technologies, particularly Airflow - a must.
• General understanding of cloud environments such as AWS, GCP, or Azure - a must.
• Familiarity with tools such as Spark, Hive, Airbyte, Kafka, ClickHouse, Postgres, Great Expectations, DataHub, or Iceberg - advantageous.
• Experience with Terraform, Kubernetes (K8s), or ArgoCD - advantageous.
August 15
51 - 200
Build and maintain efficient data pipelines for Captiv8's influencer marketing platform.
June 14
51 - 200
🇵🇱 Poland – Remote
💵 $90k - $110k / year
💰 $55M Series A on 2021-09
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
March 19
11 - 50
🇵🇱 Poland – Remote
💵 €60k - €100k / year
💰 $4.5M Pre Seed Round on 2022-12
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer