Senior GCP Data Engineer


Xebia

IT Consultancy • Continuous Delivery • Offshore Services • Deployment Automation • Digital Transformation

Description

• Work closely with engineering, product, and data teams to deliver scalable data solutions.
• Design, build, and maintain data platforms and pipelines.
• Mentor new engineers.
• Deliver software systems and apply engineering best practices to build robust solutions.
• Engineer data platforms for scale, performance, reliability, and security.
• Integrate data sources and optimize data processing.
• Proactively address challenges and drive effective communication.
• Continuously enhance data systems and ensure alignment with business needs.

Requirements

• Available to start immediately.
• 5+ years in a senior developer role.
• Hands-on experience building data processing pipelines.
• Proficiency with GCP services, especially BigQuery and BigQuery SQL, for large-scale data processing and optimization.
• Extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization (a brief sketch of such a DAG follows this list).
• Knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing.
• Strong Python proficiency, with expertise in modern data libraries and frameworks (e.g., Databricks, Snowflake, Spark, SQL).
• Experience with unit testing, pre-commit checks, and strict type enforcement for data pipelines.
• Deep understanding of relational and NoSQL databases, data modelling, and data warehousing concepts.
• Excellent command of oral and written English.
• Expertise in optimizing BigQuery performance using tools like Query Profiler (nice to have).
• Prior experience developing or testing custom operators in Apache Airflow (nice to have).
• Familiarity with Docker, Kubernetes, Helm, Terraform, Kafka, and CI/CD pipelines for data environments (nice to have).
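
For context, the sketch below illustrates the kind of work the Airflow and BigQuery requirements refer to: a minimal daily DAG that runs a partition-pruned BigQuery aggregation. It is only an example under assumed names; the project, dataset, and table identifiers are placeholders and not part of this posting.

```python
# Minimal illustrative Airflow DAG: one daily task that runs a BigQuery query.
# Names such as "my-project.analytics.events" are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_aggregation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Filtering on the partition column ({{ ds }} is the run date) keeps the
    # scanned data bounded, which matters for terabyte-scale tables.
    aggregate_events = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": """
                    SELECT event_type, COUNT(*) AS events
                    FROM `my-project.analytics.events`
                    WHERE event_date = '{{ ds }}'
                    GROUP BY event_type
                """,
                "useLegacySql": False,
            }
        },
    )
```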
