Cloud Consultant

October 19

Apply Now

Pythian

Oracle β€’ Microsoft SQL Server β€’ MySQL β€’ NoSQL β€’ Big Data

201 - 500 employees

Founded 1997

πŸ€– Artificial Intelligence

πŸ’° $15M Venture Round on 2017-05

Description

• As a Cloud Consultant, you will work within a team of globally dispersed cloud specialists to design and build impactful software powering enterprise data platform solutions, focused mainly on cloud platforms.
• You will deliver outcomes for real customer projects and help create software artifacts that automate data platform implementations and data migrations.

Requirements

• Proficiency in a programming language such as Python, Java, Go, or Scala
• Experience with big data cloud technologies such as EMR, Athena, Glue, BigQuery, Dataproc, and Dataflow
• Ideally, strong hands-on experience with Google Cloud Platform data technologies: Google BigQuery, Google Dataflow, and executing PySpark and SparkSQL code on Dataproc
• Understand the fundamentals of Spark (PySpark or SparkSQL), including using the DataFrame API as well as analyzing and performance-tuning Spark queries
• Experience developing and supporting robust, automated, and reliable data pipelines
• Develop frameworks and solutions that enable us to acquire, process, monitor, and extract value from large datasets
• Strong SQL skills
• Good knowledge of popular database and data warehouse technologies and concepts from Google, Amazon, or Microsoft (cloud and conventional RDBMS), such as BigQuery, Redshift, Microsoft Azure SQL Data Warehouse, Snowflake, etc.
• Strong knowledge of data orchestration solutions such as Airflow, Oozie, Luigi, or Talend
• Strong knowledge of dbt (Data Build Tool) or Dataform
• Experience with Apache Iceberg, Hudi, and query engines like Presto (Trino) is a plus
• Knowledge of data catalogs (AWS Glue, Google Dataplex, etc.), data governance, and data quality solutions (e.g. Great Expectations) is an added advantage
• Knowledge of how to design distributed systems and the trade-offs involved
• Experience with software engineering best practices for development, including source control systems, automated deployment pipelines like Jenkins, and DevOps tools like Terraform
• Experience in data modeling, data design, and persistence (e.g. warehousing, data marts, data lakes)
• Experience performing DevOps activities such as IaC using Terraform, provisioning infrastructure in GCP/AWS/Azure, defining data security layers, etc.
• Good to have: knowledge of GenAI tools and frameworks such as Vertex AI and LangChain, plus proficiency in prompt engineering

Benefits

• Competitive total rewards package with excellent take-home salaries, a shifted-work-time bonus (if applicable), and an annual bonus plan!
• Hone your skills or learn new ones with an annual training allowance and 2 paid professional development days; attend conferences, become certified, whatever you like!
• 3 weeks of paid time off and flexible working hours. All you need is a stable internet connection!
• We give you all the equipment you need to work from home, including a laptop with your choice of OS and a budget to personalize your work environment!
• Blog during work hours; take a day off and volunteer for your favorite charity.

Apply Now