Solutions Architect

November 7


Cloudera

Big Data • Cloud Computing • Machine Learning • Cloud • Analytics

1001 - 5000 employees

💰 $4.1M Venture Round on 2013-01

Description

• At Cloudera, we empower people to transform complex data into clear and actionable insights.
• Cloudera is seeking a Solutions Architect to join its APAC Professional Services team in Singapore.
• Develop massively scalable solutions to solve complex data problems using Hadoop, NiFi, Spark, and related Big Data technology.
• Client-facing opportunity that combines consulting skills with deep technical design and development in the Big Data space.
• Work directly with customers to implement Big Data solutions at scale using the Cloudera Data Platform and Cloudera DataFlow.
• Design and implement Hadoop and NiFi platform architectures and configurations for customers.
• Perform platform installation and upgrades for advanced secured cluster configurations.
• Analyze complex distributed production deployments and make recommendations to optimize performance.
• Document and present complex architectures for the customers' technical teams.
• Drive projects with customers to successful completion.

Requirements

• 15+ years of Information Technology and System Architecture experience
• 8+ years of Professional Services (customer-facing) experience architecting large-scale storage, data centre, and/or globally distributed solutions
• 10+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
• Expert in big data use cases, able to recommend standard design patterns commonly used in Hadoop-based and streaming data deployments
• In-depth knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
• Expert at understanding and translating customer requirements into technical requirements
• Extensive experience implementing data transformation and processing solutions
• Extensive experience designing data queries against data in the HDFS environment using tools such as Apache Hive
• Extensive experience setting up multi-node Hadoop clusters
• Extensive experience configuring security (LDAP/AD, Kerberos/SPNEGO)
• Cloudera Software and/or HDP certification (HDPCA / HDPCD) is a plus
• Strong experience implementing software and/or solutions in the enterprise Linux environment
• Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos
• Extensive experience migrating data platforms to a data lakehouse
• Strong understanding of network configuration, devices, protocols, speeds, and optimisations
• Strong understanding of the Java ecosystem, including debugging, logging, monitoring, and profiling tools
• Experience with scripting tools such as bash shell scripts, Python and/or Perl, Ansible, Chef, Puppet
• Experience in architecting data centre solutions – properly selecting server and storage hardware based on performance, availability, and ROI requirements

Benefits

• Generous PTO Policy
• Support work life balance with Unplugged Days
• Flexible WFH Policy
• Mental & Physical Wellness programs
• Phone and Internet Reimbursement program
• Access to Continued Career Development
• Comprehensive Benefits and Competitive Packages
• Paid Volunteer Time
• Employee Resource Groups

Apply Now

