Data Engineer - Databricks

September 23


Sharesource

Artificial Intelligence • Robotic Process Automation • Distributed Teams • Engineering • Tech

51 - 200 employees

Description

• Design and implement data pipelines for extracting and loading data into a central data store
• Collaborate with stakeholders to understand business requirements and design data models
• Optimise and enhance existing data pipelines for high performance and reliability
• Develop and maintain data quality checks to ensure accuracy and completeness
• Monitor and maintain data pipelines, identifying areas for improvement
• Participate in code reviews and collaborate with other engineers to maintain high-quality code
• Create visualisations using tools like Power BI
• Stay updated with industry trends, recommending new tools and technologies for data infrastructure improvement
• Participate in discovery workshops and solutions design activities with customers

Requirements

• You have at least 5 years of related experience
• You have strong experience with Databricks, Snowflake, Redshift, or Synapse
• You demonstrate proficiency in AWS and/or Azure
• You possess extensive knowledge of ETL processes and data integration
• You showcase expertise in data warehousing, modelling, and lakehouse architecture
• You possess strong programming skills in Python, SQL, and Spark
• You excel in communication and collaboration
• You have hands-on experience with Python (PySpark, Lambda, Batch, Glue, DataFlow, etc.)
• You work with customers to elicit requirements and scope projects
• You demonstrate proficiency in data transformation and migration
• You understand Data Governance concepts (MDM, metadata management, data security, data quality, etc.)
• You continuously monitor and maintain data pipelines, ensuring they run smoothly and identifying areas for improvement
• You possess visualisation skills with an eye for aesthetics
• You demonstrate descriptive data analytics skills with Power BI
• You have experience in data lake establishment and/or data pipeline development
• Your dashboard proficiency includes Tableau, Kibana, QuickSight, and Power BI
• You are familiar with commercial ETL tools like dbt, Talend, Informatica, or Attunity
• You have exposure to Infrastructure as Code and cloud deployment
• You are familiar with Bitbucket, GitHub, or Azure DevOps

Benefits

• Competitive salary and professional development
• Collaborative client partnerships and a fun, inclusive work environment
• Opportunities for leadership, expertise, and global client exposure
• Monthly coaching, training, and career development
• Remote work flexibility and a flexible hybrid model
• Engage in social impact activities and industry impact
• Achieve work-life balance and flexibility to support personal commitments
• Comprehensive HMO coverage with one free dependent
