November 26
• Develop and optimise ETL/ELT pipelines to support data warehouse and analytics needs
• Build and improve dashboards and reports for internal stakeholders
• Work closely with stakeholders to understand data requirements and drive insights
• Manage and optimise the Postgres DWH (with DBT on AWS, orchestrated by Airflow)
• Ensure data availability and quality
• Lead initiatives to improve data structure, documentation, and analytical processes
• 2+ years in data/analytics engineering or business intelligence roles
• Experience with PostgreSQL, Python, DBT, AWS, and Airflow
• Strong skills in dashboarding tools such as Metabase or similar solutions
• Proven ability to balance technical data engineering tasks with a hands-on, analyst-oriented approach
• Comfortable working in a remote-first, English-speaking environment
• Skilled in Git & CI/CD, Google Suite
• Advanced knowledge of data warehousing concepts and schema optimisation based on usage patterns
• Experience developing cross-platform ETL/ELT processes and maintaining systems for tracking data quality and consistency
• Experience using databases in a business environment with large-scale, complex data sets
• High standard of quality and personal responsibility, as well as attention to detail
• Proactive, creative, self-starter attitude
• Very good written and spoken English skills