DataOps Engineer

December 13


Alter Solutions Portugal

IT Outsourcing • Nearshore • Software • Turn-Key Projects • Tailored Solutions

Description

• Understand problems from a user perspective and communicate clearly to scope the issue.
• Reproduce bugs or issues that users are facing.
• Apply root cause analysis to resolve problems quickly and efficiently: find the root cause, patch it, test it, and communicate with the end user.
• Write postmortems summarizing every step of the resolution, helping the team track all issues.
• Monitor flows and infrastructure to identify potential issues, and apply the same resolution process to bugs or issues discovered through monitoring and alerting.
• Adapt configurations to keep flows and infrastructure working as expected, keeping operations incident-free.
• Track processing costs and times through dedicated dashboards.
• Alert people who query tables inefficiently and incur high costs.
• Track down jobs, views, and tables that run inefficiently, incurring high costs or slow execution.
• Optimize jobs, queries, and tables for both cost and speed of execution.
• Manage infrastructure through Terraform.
• Share and propose good practices.
• Decommission unused infrastructure such as services, tables, or virtual machines.
• Track future deployments with a Data Architect and participate in Deployment Reviews.
• Share and propose good deployment practices.
• Accompany Data Engineers through the entire deployment process and the subsequent period of active monitoring.
• Ensure diligent application of the deployment process, logging, and monitoring strategy.
• Take over newly deployed flows in the run process.

Requirements

• Google Cloud Platform: general knowledge of the platform and its various services, with at least one year of experience with GCP.
• Apache Airflow: at least two years of experience with the Airflow orchestrator; experience with Google Composer is a plus.
• Google BigQuery: extensive experience (at least 4 years) with BigQuery, including how to optimize tables and queries and design database architecture.
• Terraform: at least two years of experience with Terraform, plus knowledge of GitOps good practices.
• Apache Spark: an optional expertise we would value; some of our pipelines use PySpark.
• Additional knowledge and experience that are a plus:
  o Pub/Sub
  o Kafka
  o Azure Analysis Services
  o Google Cloud Storage optimization
