Airflow
AWS
Azure
Cloud
Docker
Flask
Google Cloud Platform
Java
Jenkins
Kafka
Kubernetes
PySpark
Python
RabbitMQ
Shell Scripting
Spring
Spring Boot
SQL
Terraform
We are seeking a highly skilled and motivated Cloud Data Engineer with a strong backend development focus and experience with Cloud platforms (AWS/GCP). You will be responsible for the design, implementation, and maintenance of our data-centric, Cloud-hosted backend infrastructure and services, and will play a critical role in building and scaling our data processing pipelines and APIs, ensuring high availability, security, and performance.
• Build and optimize data pipelines using ELT frameworks such as Fivetran, dbt, Airflow, and PySpark, ensuring efficient data movement and transformation (see the orchestration sketch after this list). Experience with schema design and data modeling is essential.
• Design, develop, and maintain high-throughput, low-latency backend services for data ingestion, processing, and transformation using languages such as Python and Java.
• Develop and maintain robust, scalable, and secure RESTful APIs using frameworks such as Spring Boot (Java) or Flask/FastAPI (Python). Experience with API gateway services (e.g., Apigee, Kong) is a plus.
• Advise on and design OData APIs.
• Implement and manage security protocols such as OAuth, SSO, and SSL.
• Design and implement event-driven architectures leveraging technologies such as Kafka or Pub/Sub.
• Develop and implement CI/CD pipelines using tools such as Jenkins, GitLab CI, or similar to automate the deployment and scaling of backend services.
• Containerize applications using Docker and orchestrate them using Kubernetes. Experience managing Kubernetes clusters (e.g., GKE, EKS) is preferred.
• Write and maintain infrastructure as code (IaC) using tools such as Pulumi, Terraform, or CloudFormation.
• Collaborate with Architects and Security teams to ensure that the Cloud infrastructure and services are secure and compliant with industry standards (e.g., SOC 2, HIPAA, GDPR).
• Troubleshoot and resolve service-related issues, providing timely and efficient solutions.
• Stay up to date with the latest developments in Cloud technology, backend development best practices, and security vulnerabilities.
• Mentor and provide guidance to other team members as needed.
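For illustration only, here is a minimal sketch of the kind of pipeline orchestration described above, assuming Airflow 2.4+. The DAG id, task name, and transform logic are hypothetical placeholders, not the team's actual pipeline (in practice the transform step might invoke dbt or submit a PySpark job).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_orders(**context):
    # Placeholder transform step; real pipelines would run dbt models,
    # submit a PySpark job, or load data extracted by a tool like Fivetran.
    print("transforming orders for logical date", context["ds"])


with DAG(
    dag_id="orders_elt",              # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # one run per day
    catchup=False,
) as dag:
    transform = PythonOperator(
        task_id="transform_orders",
        python_callable=transform_orders,
    )
```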
• Bachelor’s degree in Computer Science, Computer Engineering, or a related field.
• At least 5 years of experience in backend development in a Cloud environment (AWS or GCP preferred).
• At least 3 years of experience in Data Engineering in a Cloud environment.
• Strong proficiency in Python and Java; SQL proficiency is required.
• Extensive experience with RESTful API design, implementation, and testing using relevant frameworks (see the service sketch after this list). Experience with OpenAPI/Swagger specifications is a plus.
• Experience with event-driven architectures and message queues (e.g., Kafka, RabbitMQ, Azure Event Hubs, Amazon MSK).
• Experience with ELT data processing using frameworks such as Fivetran, dbt, Airflow, and/or PySpark.
• Proven experience with containerization (Docker) and orchestration (Kubernetes).
• Experience with CI/CD pipelines and infrastructure as code (Pulumi/Terraform).
• Working knowledge of Linux, shell scripting, and Git.
• Strong understanding of cloud security best practices, including IAM and data encryption.
• Excellent problem-solving and analytical skills.
• Excellent written and verbal communication skills.
• Ability to collaborate effectively in a team environment.
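For illustration only, a minimal FastAPI sketch of a RESTful read endpoint of the sort referenced above. The route, model fields, and stubbed response are assumptions for demonstration; a real service would back this with a datastore, authentication (e.g., OAuth), and an OpenAPI contract.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class PipelineRun(BaseModel):
    # Hypothetical response schema for a pipeline-run lookup.
    run_id: str
    status: str


@app.get("/runs/{run_id}", response_model=PipelineRun)
def get_run(run_id: str) -> PipelineRun:
    # Stubbed lookup; a real service would query a database or metadata store.
    return PipelineRun(run_id=run_id, status="success")
```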