Analytics Engineer

August 30


Superside

Superside is the leading Creative-as-a-Service (CaaS) company that helps over 450 ambitious brands get great design.

design • brand • brand identity • visual identity • graphic design

Company size: 501 - 1000 employees

Description

• Develop and maintain well-documented, scalable, and performant data models using dbt, ensuring business data is well structured, logically modeled, rich in analytical value, and ready for analysis (a minimal dbt sketch follows this list).
• Design and optimize SQL-based transformation pipelines that deliver high-quality, analytics-ready datasets to business intelligence and analytics teams.
• Collaborate with business analysts, product managers, and stakeholders to gather requirements and ensure data models support accurate and meaningful business reporting.
• Implement and embed business logic within data models so that KPIs and other business metrics accurately reflect the underlying datasets.
• Establish and maintain data quality checks, automated testing, and documentation to keep business-critical data accurate, consistent, and reliable.
• Continuously monitor and improve the performance of data transformations and queries, optimizing for speed, efficiency, and scalability.
• Maintain clear and comprehensive documentation for data models, business logic, and data flows so that analysts and stakeholders can easily use the data.
• Help mentor junior team members on analytics engineering best practices and actively contribute to developing the team's skills and capabilities.
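
As a rough illustration of the kind of dbt work described above (the model name, columns, and business rule are hypothetical examples, not Superside's actual code):

-- models/marts/fct_orders.sql -- hypothetical dbt model
-- Builds an analytics-ready fact table from a staging model,
-- embedding business logic once so every downstream report agrees.

with orders as (

    select * from {{ ref('stg_orders') }}

),

final as (

    select
        order_id,
        customer_id,
        order_date,
        -- example business rule: only completed orders count toward revenue
        case when status = 'completed' then amount else 0 end as recognized_revenue
    from orders

)

select * from final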

Requirements

• Fluency in SQL, with experience writing efficient, scalable queries to transform, model, and analyze large datasets.
• Experience with data modeling tools such as dbt to build and manage structured, reusable data models.
• Ability to translate business requirements into well-defined, scalable data models that stay aligned with key business metrics.
• Familiarity with BI tools such as Looker, Tableau, or Power BI, and a strong understanding of how to structure data for optimal use in these platforms.
• Experience implementing automated data quality checks and tests within data pipelines to maintain data integrity (see the test sketch after this list); some Python experience is advantageous.
• Strong communication skills and the ability to collaborate with technical and non-technical stakeholders to gather requirements and deliver data solutions that meet business needs.
• A basic understanding of cloud-based data platforms such as Snowflake, BigQuery, or Redshift is beneficial.
• Experience working in a remote-first, collaborative environment, with a proactive and self-directed approach to work.
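
As a rough sketch of the automated data quality checks mentioned above: dbt supports "singular" data tests written as plain SQL, which pass when the query returns zero rows (the file, table, and column names here are hypothetical):

-- tests/assert_no_negative_revenue.sql -- hypothetical dbt singular test
-- Fails the build if any order carries negative recognized revenue.
select
    order_id,
    recognized_revenue
from {{ ref('fct_orders') }}
where recognized_revenue < 0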
