Data Engineer - Databricks

September 23

Sharesource

Artificial Intelligence • Robotic Process Automation • Distributed Teams • Engineering • Tech

51 - 200 employees

Description

• Design and implement data pipelines for extracting and loading data into a central data store
• Collaborate with stakeholders to understand business requirements and design data models
• Optimise and enhance existing data pipelines for high performance and reliability
• Develop and maintain data quality checks to ensure accuracy and completeness
• Monitor and maintain data pipelines, identifying areas for improvement
• Participate in code reviews and collaborate with other engineers to maintain high-quality code
• Create visualisations using tools like Power BI
• Stay updated with industry trends, recommending new tools and technologies for data infrastructure improvement
• Participate in discovery workshops and solutions design activities with customers
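The "data quality checks" responsibility above can be sketched in plain Python (a minimal, hypothetical illustration using a list-of-dicts batch; in the role itself this would typically be a Databricks/PySpark job, and all names below are invented for the example):

```python
# Minimal sketch of a row-level completeness check: split a batch into
# valid rows and rejected rows that are missing required fields.
# Illustrative only; not tied to any specific tool in the posting.

def check_quality(rows, required_fields):
    """Return (valid, rejected); rejected entries record the missing fields."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append({"row": row, "missing": missing})
        else:
            valid.append(row)
    return valid, rejected

batch = [
    {"id": 1, "amount": 9.5, "region": "APAC"},
    {"id": 2, "amount": None, "region": "EU"},
]
valid, rejected = check_quality(batch, ["id", "amount", "region"])
# valid keeps row 1; row 2 is rejected for its null "amount"
```

In a monitored pipeline, the `rejected` list would typically be written to a quarantine table and surfaced in alerts rather than silently dropped.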

Requirements

• You have at least 5 years of related experience
• You have strong experience with Databricks, Snowflake, Redshift, or Synapse
• You demonstrate proficiency in AWS and/or Azure
• You possess extensive knowledge of ETL processes and data integration
• You showcase expertise in data warehousing, modelling, and lakehouse architecture
• You possess strong programming skills in Python, SQL, and Spark
• You excel in communication and collaboration
• You have hands-on experience with Python (PySpark, Lambda, Batch, Glue, DataFlow, etc.)
• You work with customers to elicit requirements and scope projects
• You demonstrate proficiency in data transformation and migration
• You understand Data Governance concepts (MDM, metadata management, data security, data quality, etc.)
• You continuously monitor and maintain data pipelines, ensuring they run smoothly and identifying areas for improvement
• You possess visualisation skills with an eye for aesthetics
• You demonstrate descriptive data analytics skills with Power BI
• You have experience in Data Lake establishment and/or data pipeline development
• Your dashboard proficiency includes Tableau, Kibana, QuickSight, Power BI
• You are familiar with ETL tools like dbt, Talend, Informatica, or Attunity
• You have exposure to Infrastructure as Code and cloud deployment
• You are familiar with Bitbucket, GitHub, Azure DevOps
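As a rough illustration of the ETL skills the list above asks for, here is a toy extract-transform-load step in plain Python (all names and data are hypothetical; a real pipeline would target Databricks, Snowflake, or another warehouse named in the posting):

```python
# Toy ETL step: extract rows from CSV text, transform (type-cast and
# filter out non-positive amounts), load into an in-memory "table".
# Purely illustrative; not a representation of any specific tool.
import csv
import io

def extract(csv_text):
    """Parse CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Cast amounts to float and keep only positive values."""
    out = []
    for r in rows:
        r["amount"] = float(r["amount"])
        if r["amount"] > 0:
            out.append(r)
    return out

def load(rows, table):
    """Append rows to the target table; return the count loaded."""
    table.extend(rows)
    return len(rows)

raw = "id,amount\n1,10.5\n2,-3\n3,7\n"
table = []
loaded = load(transform(extract(raw)), table)
# loaded == 2: rows 1 and 3 pass the filter, row 2 is dropped
```

Separating extract, transform, and load into distinct functions mirrors how orchestrated pipelines keep each stage independently testable and monitorable.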

Benefits

• Competitive salary and professional development
• Collaborative client partnerships and a fun, inclusive work environment
• Opportunities for leadership, expertise, and global client exposure
• Monthly coaching, training, and career development
• Remote work flexibility and a flexible hybrid model
• Engage in social impact activities and industry impact
• Achieve work-life balance and flexibility to support personal commitments
• Comprehensive HMO coverage with one free dependent
