Software Development • Agile Development • DevOps • Scrum • Mobile Applications
1001 - 5000
September 10
• Working with teams of a globally recognized American apparel brand
• Responsible for at-scale infrastructure design, build, and deployment with a focus on distributed systems
• Building and maintaining architecture patterns for data processing, workflow definitions, and system-to-system integrations using Big Data and Cloud technologies
• Evaluating and translating technical designs into workable technical solutions/code and technical specifications on par with industry standards
• Driving the creation of reusable artifacts
• Establishing scalable, efficient, automated processes for data analysis, data model development, validation, and implementation
• Working closely with analysts/data scientists to understand the impact on downstream data models
• Writing efficient and well-organized software to ship products in an iterative, continual-release environment
• Contributing to and promoting good software engineering practices across the team
• Communicating clearly and effectively to technical and non-technical audiences
• Defining data retention policies
• Monitoring performance and advising on any necessary infrastructure changes
• Responsible for dashboard development (Tableau, Power BI, Qlik, etc.)
• Responsible for data analytics model development (R, Python, Spark)
• 5+ years’ experience as a software developer/data engineer
• Experience with Big Data technologies and the AI/ML life cycle
• University or advanced degree in engineering, computer science, mathematics, or a related field
• Strong hands-on experience in Databricks using PySpark and Spark SQL (Unity Catalog, workflows, optimization techniques)
• Experience with at least one cloud provider solution: Azure, AWS, or GCP (preferred)
• Strong experience working with relational SQL databases
• Strong experience with an object-oriented/object-function scripting language: Python
• Working knowledge of a transformation tool (dbt preferred)
• Ability to work on the Linux platform
• Strong knowledge of data pipeline and workflow management tools (Airflow preferred)
• Working knowledge of GitHub/the Git toolkit
• Expertise in standard software engineering methodology, e.g. unit testing, code reviews, design documentation
• Experience creating data pipelines that prepare data appropriately for ingestion and consumption
• Experience maintaining and optimizing databases/filesystems for production use in reporting and analytics
• Ability to work in a collaborative environment and interact effectively with technical and non-technical team members alike
• Good verbal and written communication skills (English)
• Experience with ecommerce, retail, or supply chains is welcome