Astrafy is a data consultancy that transforms businesses into data-driven organizations by offering modern, scalable, and customized data strategies. With a focus on aligning data initiatives with business goals, Astrafy helps organizations implement comprehensive data solutions, including data engineering, analytics, machine learning, and data management. Their expertise in managing the full data life cycle enables clients to harness the power of their data effectively, driving insights and competitive advantage.
google cloud • data analytics • analytics engineering • data engineering • business intelligence
February 20
• Work on various projects to help Astrafy customers get the most out of their data.
• Design and maintain scalable data pipelines leveraging technologies such as Airflow, dbt, BigQuery, and Snowflake.
• Develop and optimize data infrastructure in the Google Cloud environment.
• Use Terraform and Kubernetes to automate infrastructure provisioning and manage containerized workloads.
• Implement robust data governance and quality measures throughout the data lifecycle.
• Collaborate with cross-functional teams to design and deploy Looker dashboards and other analytics solutions.
• Champion a culture of innovation by researching, evaluating, and recommending emerging data technologies and industry best practices.
• Educational Background: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field (or equivalent practical experience).
• Professional Experience: 0+ years of hands-on experience in data engineering or software engineering roles, building and optimizing data pipelines.
• Technical Expertise: Proficiency in SQL and at least one programming language (e.g., Python), with a proven track record of working with modern data stack components such as Airflow, dbt, and BigQuery or Snowflake.
• Cloud Knowledge: Familiarity with Google Cloud Platform (GCP) services for data storage, orchestration, and analytics; experience with other cloud providers is a plus.
• Infrastructure as Code (IaC): Understanding of Terraform for automating infrastructure setup and management, plus exposure to container orchestration with Kubernetes.
• Analytics & Visualization: Experience with BI tools such as Looker, including designing and developing dashboards for data-driven insights.
• Data Governance & Security: Knowledge of best practices for data quality, lineage, and security, ensuring compliance with relevant regulations and standards.
• Team Player: Strong communication and collaboration skills, with the ability to work effectively within cross-functional teams to deliver impactful data solutions.
• Continuous Learning: Curiosity to stay up to date on emerging data technologies, practices, and frameworks, and to share knowledge across the organization.
• Strong data visualization skills to convey information and results clearly.
• Fluent English; knowledge of French and/or Spanish is a plus.
• Attractive Salary Package: No blurry or hidden clauses.
• Genuine Innovation: An exciting role where technology innovation is more than a buzzword — it's how we operate daily.
• Strong Values & Culture: Become part of a dynamic team that lives by solid values.
• Continuous Learning: We offer ongoing training and development for both soft and hard skills.
• Flexible Work Environment: Enjoy flexible hours and remote work options.
• Team-Building & Retreats: Thrive in a supportive environment with regular team activities and retreats.
• And Much More: Our handbook covers all you need to know about who we are and how we work.
Apply Now