Data Engineer

August 7

🇵🇪 Peru – Remote

💵 PLN12.5k - PLN24.9k / year

⏳ Contract/Temporary

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

Apply Now

Xebia Poland

A place where experts grow.

Software Development • Agile Development • DevOps • Scrum • Mobile Applications

1001 - 5000 employees

Description

• Working with teams of a globally recognized American apparel brand
• Responsible for at-scale infrastructure design, build, and deployment, with a focus on distributed systems
• Building and maintaining architecture patterns for data processing, workflow definitions, and system-to-system integrations using Big Data and cloud technologies
• Evaluating and translating technical designs into workable technical solutions/code and technical specifications on par with industry standards
• Driving the creation of reusable artifacts
• Establishing scalable, efficient, automated processes for data analysis, data model development, validation, and implementation
• Working closely with analysts/data scientists to understand the impact on downstream data models
• Writing efficient and well-organized software to ship products in an iterative, continual-release environment
• Contributing to and promoting good software engineering practices across the team
• Communicating clearly and effectively to technical and non-technical audiences
• Defining data retention policies
• Monitoring performance and advising on any necessary infrastructure changes
• Responsible for dashboard development (Tableau, Power BI, Qlik, etc.)
• Responsible for data analytics model development (R, Python, Spark)

Requirements

• 5+ years’ experience as a software developer/data engineer
• Experience with Big Data technologies and the AI/ML lifecycle
• University or advanced degree in engineering, computer science, mathematics, or a related field
• Strong hands-on experience in Databricks using PySpark and Spark SQL (Unity Catalog, workflows, optimization techniques)
• Experience with at least one cloud provider solution: Azure, AWS, or GCP (preferred)
• Strong experience working with relational SQL databases
• Strong experience with an object-oriented/functional scripting language: Python
• Working knowledge of a transformation tool (dbt preferred)
• Ability to work on the Linux platform
• Strong knowledge of data pipeline and workflow management tools (Airflow preferred)
• Working knowledge of GitHub and the Git toolkit
• Expertise in standard software engineering methodology, e.g. unit testing, code reviews, design documentation
• Experience creating data pipelines that prepare data appropriately for ingestion and consumption
• Experience maintaining and optimizing databases/filesystems for production use in reporting and analytics
• Good verbal and written communication skills (English)
• Experience with e-commerce, retail, or supply chains is welcome
• Cooperation with US West Coast-based teams is part of the job – up to twice a week, overlapping with 9am PDT (18:00 CET)
