June 14
Airflow
Amazon Redshift
AWS
Azure
Cassandra
Cloud
ElasticSearch
Google Cloud Platform
Hadoop
HDFS
Java
Kafka
Matillion
NoSQL
Python
Scala
Spark
SQL
Tableau
•We are seeking qualified Solution Architects proficient in software/data engineering and DevOps to help deliver our Elastic Operations service.
•This position reports to our Managed Services team in Bangalore, India.
•This is a hands-on technical Developer/Architect position.
•Only experienced candidates with a deep passion for understanding and designing complex data solutions should apply.
•You will be responsible for designing, validating, optimizing, and maintaining small- and large-scale complex data integration and data pipeline workloads.
•You will work on large-scale, complex data platform projects running on Snowflake and other native cloud platform services in AWS and Azure.
•You will also participate in data integration, data modeling, data governance, and data security tasks.
•You will need the ability to learn and quickly upskill on data ecosystem technologies related to data ingestion, data transformation, data modeling, data migration, platform design, and architecture, with some exposure to data visualization tools such as Power BI.
•9-12 years of hands-on experience as a software engineer, DevOps engineer, or data engineer in data modeling and in designing, implementing, and supporting modern data solutions.
•Experience with core cloud data platforms such as Snowflake, AWS, Azure, or Databricks.
•Deep working knowledge of end-to-end pipelines for small and large-scale data sets from various application sources, with the ability to diagnose and fix broken pipelines.
•Understanding of common data integration and data transformation patterns for small and large-scale data sets.
•Deep understanding of data validation processes, whether through utilities or manual checks.
•Hands-on experience troubleshooting, optimizing, and enhancing data pipelines and delivering improvements in the production environment.
•Extensive experience providing operational support across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift).
•Programming expertise in Java, Python, and/or Scala.
•Ability to write, debug, and optimize SQL queries.
•Unmatched troubleshooting and performance tuning skills (data warehouse).
•Willingness to work in developer and support roles across customers; proficiency in incident management and troubleshooting.
•Excellent client-facing written and verbal communication skills and experience.
•Ability to create and deliver detailed technical presentations for an executive audience.
•4-year Bachelor's degree in Computer Science or a related field.
•Medical Insurance for Self & Family
•Medical Insurance for Parents
•Term Life & Personal Accident
•Wellness Allowance
•Broadband Reimbursement
•Professional Development Allowance
•Reimbursement of Skill Upgrade Certifications
•Certification Reimbursement
Apply Now