December 6
Airflow
Apache
Cloud
Docker
Google Cloud Platform
Grafana
Kafka
Kubernetes
Prometheus
Python
RabbitMQ
SQL
Terraform
Go
• Ensure that data is available and exploitable by Data Scientists and Analysts.
• Contribute to the construction and maintenance of Shippeo’s modern data stack.
• Work with various technology blocks: data acquisition, batch data transformation, cloud data warehousing, stream/event data processing, and the underlying infrastructure.
• Degree (MSc or equivalent) in Computer Science.
• 3+ years of experience as a Data Engineer.
• Experience building, maintaining, testing, and optimizing data pipelines and architectures.
• Programming skills in Python.
• Advanced working knowledge of SQL.
• Familiarity with a variety of databases.
• Working knowledge of message queuing and stream processing.
• Advanced knowledge of Docker and Kubernetes.
• Advanced knowledge of a cloud platform (preferably GCP).
• Advanced knowledge of a cloud-based data warehouse solution (preferably Snowflake).
• Experience with Infrastructure as Code (Terraform/Terragrunt).
• Experience building and evolving CI/CD pipelines (GitHub Actions).
December 1
As a Senior Data Engineer, develop scalable data systems for a leading European fintech. Join us in shaping the future of collaboration and decision-making through data.
November 20
Lead the Data Platform team at Homa, driving data innovation for mobile game development.
October 5
Kpler simplifies global trade data for clients in the commodities and energy sectors.