Cloud Consultant

3 days ago


Pythian

Oracle • Microsoft SQL Server • MySQL • NoSQL • Big Data

201 - 500 employees

💰 $15M Venture Round on 2017-05

Description

• As a Cloud Consultant, you work within a team of globally dispersed cloud specialists to design and create impactful software powering enterprise data platform solutions, mainly focused on cloud platforms.
• You will produce outcomes for real customer projects and help create software artifacts that enable automation of data platform implementations and data migrations.

Requirements

• Proficiency in a programming language such as Python, Java, Go, or Scala
• Experience with big data cloud technologies such as EMR, Athena, Glue, BigQuery, Dataproc, and Dataflow
• Ideally, strong hands-on experience with Google Cloud Platform data technologies: Google BigQuery, Google Dataflow, and executing PySpark and SparkSQL code on Dataproc
• Understanding of the fundamentals of Spark (PySpark or SparkSQL), including the DataFrame API, as well as analyzing and performance-tuning Spark queries
• Experience developing and supporting robust, automated, and reliable data pipelines
• Ability to develop frameworks and solutions that enable us to acquire, process, monitor, and extract value from large datasets
• Strong SQL skills
• Good knowledge of popular database and data warehouse technologies and concepts from Google, Amazon, or Microsoft (cloud and conventional RDBMS), such as BigQuery, Redshift, Microsoft Azure SQL Data Warehouse, and Snowflake
• Strong knowledge of data orchestration solutions such as Airflow, Oozie, Luigi, or Talend
• Strong knowledge of dbt (Data Build Tool) or Dataform
• Experience with Apache Iceberg, Hudi, and query engines like Presto (Trino) is a plus
• Knowledge of data catalogs (AWS Glue, Google Dataplex, etc.), data governance, and data quality solutions (e.g. Great Expectations) is an added advantage
• Knowledge of how to design distributed systems and the trade-offs involved
• Experience with software engineering best practices for development, including source control systems, automated deployment pipelines like Jenkins, and DevOps tools like Terraform
• Experience in data modeling, data design, and persistence (e.g. warehousing, data marts, data lakes)
• Experience performing DevOps activities such as IaC with Terraform, provisioning infrastructure in GCP/AWS/Azure, and defining data security layers
• Good to have: knowledge of GenAI tools and frameworks such as Vertex AI and LangChain, and proficiency in prompt engineering

Benefits

• Competitive total rewards package with excellent take-home salaries, a shifted work time bonus (if applicable), and an annual bonus plan!
• Hone your skills or learn new ones with an annual training allowance and 2 paid professional development days; attend conferences, become certified, whatever you like!
• 3 weeks of paid time off and flexible working hours. All you need is a stable internet connection!
• We give you all the equipment you need to work from home, including a laptop with your choice of OS, and a budget to personalize your work environment!
• Blog during work hours; take a day off and volunteer for your favorite charity.
