Data Engineer - Spark, Airflow

November 21


accesa.eu

Cloud Solutions • Custom Development in Microsoft Azure • Custom Development in AWS • SAP Commerce Cloud (Hybris) Platform Development • Performance Testing & Capacity Planning Services

Description

• Drive Data Efficiency: Create and maintain optimal data transformation pipelines (see the sketch after this list).
• Master Complex Data Handling: Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements.
• Lead Innovation and Process Optimization: Identify, design, and implement process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for higher scalability.
• Architect Scalable Data Infrastructure: Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source technologies.
• Unlock Actionable Insights: Build and use analytics tools that draw on the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
• Collaborate with Cross-Functional Teams: Work with clients and internal stakeholders, including Senior Management, Department Heads, and Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
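To make the day-to-day work concrete, here is a minimal, hypothetical sketch of the kind of transformation pipeline described above, written in PySpark. The source paths, column names, and aggregation are illustrative assumptions, not part of the posting:

```python
# A minimal PySpark sketch of an extract-transform-load pipeline over
# financial records. All paths, columns, and logic are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transactions_pipeline").getOrCreate()

# Extract: read raw financial records (hypothetical source location).
raw = spark.read.parquet("s3://example-bucket/raw/transactions/")

# Transform: drop malformed rows, then aggregate daily totals per account.
daily_totals = (
    raw.filter(F.col("amount").isNotNull())
       .groupBy("account_id", F.to_date("event_time").alias("day"))
       .agg(F.sum("amount").alias("daily_total"))
)

# Load: write the curated output for downstream analytics tools.
daily_totals.write.mode("overwrite").parquet(
    "s3://example-bucket/curated/daily_totals/"
)
```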

Requirements

• Must have 3+ years of experience in a similar role, preferably within Agile teams.
• Strong analytical skills in working with both structured and unstructured data.
• Skilled in SQL and relational databases for data manipulation.
• Experience in building and optimizing Big Data pipelines and architectures.
• Knowledge of the Apache Spark framework and object-oriented programming in Java; experience with Python is a plus.
• Experience with ETL processes, including scheduling and orchestration using tools like Apache Airflow or similar (see the example DAG after this list).
• Proven experience in performing data analysis and root cause analysis on diverse datasets to identify opportunities for improvement.
• Nice to have: expertise in manipulating and processing large, disconnected datasets to extract actionable insights.
• Automate CI/CD pipelines using ArgoCD, Tekton, and Helm to streamline deployment and improve efficiency across the SDLC.
• Manage Kubernetes deployments on OpenShift, focusing on scalability, security, and optimized container orchestration.
• Technical skills in the following areas are a plus: relational databases (e.g. PostgreSQL), Big Data tools (e.g. Databricks), workflow management (e.g. Airflow), and backend development using Spring Boot.
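As an illustration of the Airflow scheduling and orchestration skills listed above, here is a minimal sketch of a daily ETL DAG. The DAG id, task names, and spark-submit job path are hypothetical examples, assuming Airflow 2.4 or later:

```python
# A minimal Airflow DAG sketching daily ETL orchestration around a Spark job.
# DAG id, task names, and the job path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="financial_data_etl",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run the ETL once per day
    catchup=False,
) as dag:
    # Extract: pull raw files from a source system (placeholder command).
    extract = BashOperator(
        task_id="extract_raw_data",
        bash_command="echo 'extract step'",
    )

    # Transform: submit a Spark job that cleans and aggregates the data.
    transform = BashOperator(
        task_id="run_spark_transform",
        bash_command="spark-submit /opt/jobs/transform.py",  # hypothetical path
    )

    # Load: publish the transformed output to the warehouse (placeholder).
    load = BashOperator(
        task_id="load_to_warehouse",
        bash_command="echo 'load step'",
    )

    # Standard Airflow syntax for declaring task dependencies.
    extract >> transform >> load
```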

Benefits

• Premium medical package for both our colleagues and their children
• Dental coverage up to a yearly amount
• Eyeglasses reimbursement every two years
• Voucher for sport equipment expenses
• In-house personal trainer
• Individual therapy sessions with a certified psychotherapist
• Webinars on self-development topics
• Virtual activities
• Sports challenges
• Special-occasion get-togethers
• Yearly increase in days off
• Flexible working schedule
• Birthday, holiday, and loyalty gifts for major milestones
