Artificial Intelligence • Robotic Process Automation • Distributed Teams • Engineering • Tech
51 - 200
September 23
• Design and implement data pipelines for extracting and loading data into a central data store
• Collaborate with stakeholders to understand business requirements and design data models
• Optimise and enhance existing data pipelines for high performance and reliability
• Develop and maintain data quality checks to ensure accuracy and completeness
• Monitor and maintain data pipelines, identifying areas for improvement
• Participate in code reviews and collaborate with other engineers to maintain high-quality code
• Create visualisations using tools like Power BI
• Stay updated with industry trends, recommending new tools and technologies for data infrastructure improvement
• Participate in discovery workshops and solutions design activities with customers
• You have at least 5 years of related experience
• You have strong experience with Databricks, Snowflake, Redshift, or Synapse
• You demonstrate proficiency in AWS and/or Azure
• You possess extensive knowledge of ETL processes and data integration
• You showcase expertise in data warehousing, modelling, and lakehouse architecture
• You possess strong programming skills in Python, SQL, and Spark
• You excel in communication and collaboration
• You have hands-on experience with Python (PySpark, Lambda, Batch, Glue, DataFlow, etc.)
• You work with customers to elicit requirements and scope projects
• You demonstrate proficiency in data transformation and migration
• You understand data governance concepts (MDM, metadata management, data security, data quality, etc.)
• You continuously monitor and maintain data pipelines, ensuring they run smoothly and identifying areas for improvement
• You possess visualisation skills with an eye for aesthetics
• You demonstrate descriptive data analytics skills with Power BI
• You have experience in data lake establishment and/or data pipeline development
• Your dashboard proficiency includes Tableau, Kibana, QuickSight, and Power BI
• You are familiar with commercial ETL tools like dbt, Talend, Informatica, or Attunity
• You have exposure to infrastructure as code and cloud deployment
• You are familiar with Bitbucket, GitHub, and Azure DevOps
• Competitive salary and professional development
• Collaborative client partnerships and a fun, inclusive work environment
• Opportunities for leadership, expertise, and global client exposure
• Monthly coaching, training, and career development
• Remote work flexibility and a flexible hybrid model
• Engagement in social impact activities and industry impact
• Work-life balance and flexibility to support personal commitments
• Comprehensive HMO coverage with one free dependent