Cloud Solutions • Custom Development in Microsoft Azure • Custom Development in AWS • SAP Commerce Cloud (Hybris) Platform Development • Performance Testing & Capacity Planning Services
1001 - 5000
• Drive Data Efficiency: Create and maintain optimal data transformation pipelines.
• Master Complex Data Handling: Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements.
• Lead Innovation and Process Optimization: Identify, design, and implement process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
• Architect Scalable Data Infrastructure: Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source technologies.
• Unlock Actionable Insights: Build and use analytics tools that leverage the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
• Collaborate with Cross-Functional Teams: Work with clients and internal stakeholders, including Senior Management, Department Heads, and the Product, Data, and Design teams, to resolve data-related technical issues and support their data infrastructure needs.
• 3+ years of experience in a similar role, preferably within Agile teams.
• Strong analytical skills in working with both structured and unstructured data.
• Skilled in SQL and relational databases for data manipulation.
• Experience building and optimizing Big Data pipelines and architectures.
• Knowledge of the Apache Spark framework and object-oriented programming in Java; experience with Python is a plus.
• Experience with ETL processes, including scheduling and orchestration with tools such as Apache Airflow (or similar).
• Proven experience performing data analysis and root cause analysis on diverse datasets to identify opportunities for improvement.
• Nice to have: expertise in manipulating and processing large, disconnected datasets to extract actionable insights.
• Automate CI/CD pipelines using ArgoCD, Tekton, and Helm to streamline deployment and improve efficiency across the SDLC.
• Manage Kubernetes deployments on OpenShift, focusing on scalability, security, and optimized container orchestration.
• Technical skills in the following areas are a plus: relational databases (e.g., PostgreSQL), Big Data tools (e.g., Databricks), workflow management (e.g., Airflow), and backend development using Spring Boot.
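The requirements above center on SQL-based data manipulation and ETL pipeline work. As an illustration only (not part of the listing), a minimal sketch of the extract-transform-load pattern using Python's standard-library sqlite3 might look like the following; all table and column names here are hypothetical:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Minimal ETL sketch: extract raw rows, transform them in Python,
    and load the aggregates into a reporting table. Returns the number
    of accounts loaded. Table names are illustrative assumptions."""
    cur = conn.cursor()
    # Extract: read raw transaction rows.
    cur.execute("SELECT account, amount FROM raw_transactions")
    rows = cur.fetchall()
    # Transform: aggregate amounts per account.
    totals: dict[str, float] = {}
    for account, amount in rows:
        totals[account] = totals.get(account, 0.0) + amount
    # Load: upsert the aggregates into the reporting table.
    cur.executemany(
        "INSERT OR REPLACE INTO account_totals (account, total) VALUES (?, ?)",
        totals.items(),
    )
    conn.commit()
    return len(totals)

# Usage with an in-memory database and sample data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_transactions (account TEXT, amount REAL)")
conn.execute("CREATE TABLE account_totals (account TEXT PRIMARY KEY, total REAL)")
conn.executemany(
    "INSERT INTO raw_transactions VALUES (?, ?)",
    [("A", 10.0), ("A", 5.0), ("B", 7.5)],
)
loaded = run_etl(conn)
```

In production, each of these steps would typically run as a separate task in an orchestrator such as Apache Airflow, with the transform pushed into SQL or Spark for large datasets.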
• Premium medical package for both our colleagues and their children
• Dental coverage up to a yearly amount
• Eyeglasses reimbursement every two years
• Voucher for sports equipment expenses
• In-house personal trainer
• Individual therapy sessions with a certified psychotherapist
• Webinars on self-development topics
• Virtual activities, sports challenges, and get-togethers for special occasions
• Yearly increase in days off
• Flexible working schedule
• Birthday, holiday, and loyalty gifts for major milestones
5001 - 10000
As a Data Engineer at Xebia, you will develop data solutions for a recognized American brand. Join us in building scalable data infrastructure and workflows using Cloud technologies.
November 8
51 - 200
Designing data architecture for a global tech consulting firm, enhancing productivity.
October 24
51 - 200
Data Engineer at Tecknoworks, optimizing data pipelines for client solutions.
October 24
51 - 200