August 25
Airflow
AWS
Azure
Cloud
ETL
Google Cloud Platform
Kafka
Kubernetes
Microservices
Postgres
Python
Spark
SQL
Terraform
• Design, develop, and deploy data pipelines and data models across various Data Lake / DWH layers
• Ingest data from and export data to multiple third-party systems and platforms (e.g., Salesforce, Braze, SurveyMonkey)
• Architect and implement data-related microservices and products
• Ensure the implementation of best practices in data management, including data lineage, observability, and data contracts
• Maintain, support, and refactor legacy models and layers within the DWH
• Minimum of 3 years of experience in software development, data engineering, or business intelligence
• Proficiency in Python - a must
• Advanced SQL skills - a must
• Strong background in data modeling, ETL development, and data warehousing - a must
• Experience with big data technologies, particularly Airflow - a must
• General understanding of cloud environments such as AWS, GCP, or Azure - a must
• Familiarity with tools such as Spark, Hive, Airbyte, Kafka, ClickHouse, Postgres, Great Expectations, DataHub, or Iceberg - advantageous
• Experience with Terraform, Kubernetes (K8s), or ArgoCD - advantageous
• A bachelor's degree in Computer Science, Engineering, or a related field - advantageous but not mandatory
August 15
51 - 200
Build and maintain efficient data pipelines for Captiv8's influencer marketing platform.
June 14
51 - 200
🇵🇱 Poland – Remote
💵 $90k - $110k / year
💰 $55M Series A on 2021-09
⏰ Full Time
🟡 Mid-level
🟠 Senior
💰 Data Engineer
March 19
11 - 50
🇵🇱 Poland – Remote
💵 €60k - €100k / year
💰 $4.5M Pre Seed Round on 2022-12
⏰ Full Time
🟡 Mid-level
🟠 Senior
💰 Data Engineer