Data Architect - GCP

6 days ago

66degrees

Google Cloud • Cloud Adoption • Cloud Security • Google Cloud Cost Optimization • Cloud-Native Application Development

501 - 1000 employees

Description

• Overview of 66degrees: 66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions that leverage the latest advancements in cloud technology.

• Overview of Role: We are seeking an experienced Data Architect for a 3-month contract-to-hire opportunity. The Data Architect will design, develop, and maintain our Google Cloud data architecture.

• Responsibilities:

1. GCP Cloud Architecture: Design, implement, and manage robust, scalable, and cost-effective cloud-based data architectures on Google Cloud Platform (GCP), leveraging services such as BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud Dataproc, Cloud Run, and Cloud Composer. Experience designing cloud architectures on Oracle Cloud is a plus.
2. Data Modeling: Develop and maintain conceptual, logical, and physical data models to support various business needs.
3. Big Data Processing: Design and implement solutions for processing large datasets using technologies such as Spark and Hadoop.
4. Data Governance: Establish and enforce data governance policies covering data quality, security, compliance, and metadata management.
5. Data Pipelines: Build and optimize data pipelines for efficient data ingestion, transformation, and loading.
6. Performance Optimization: Monitor and tune data systems to ensure high performance and availability.
7. Collaboration: Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and provide architectural guidance.
8. Innovation: Stay current with the latest technologies and trends in data architecture and cloud computing.

Requirements

• GCP Core Services: In-depth knowledge of GCP data services, including BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud Dataproc, Cloud Run, and Cloud Composer.

• Data Modeling: Expertise in data modeling techniques and best practices.

• Big Data Technologies: Hands-on experience with Spark and Hadoop.

• Cloud Architecture: Proven ability to design scalable, reliable, and cost-effective cloud architectures.

• Data Governance: Understanding of data quality, security, compliance, and metadata management.

• Programming: Proficiency in SQL, Python, and dbt (Data Build Tool).

• Problem-Solving: Strong analytical and problem-solving skills.

• Communication: Excellent written and verbal communication skills.

• Education: A Bachelor's degree in Computer Science, Computer Engineering, a data-related field, or equivalent work experience is required.

• Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification is a plus.
