IT Consultancy • Continuous Delivery • Offshore Services • Deployment Automation • Digital Transformation
5001 - 10000
2 days ago
Apache
AWS
Azure
Cloud
Distributed Systems
Docker
ETL
Google Cloud Platform
Hadoop
Informatica
Kubernetes
NoSQL
Numpy
Pandas
PySpark
Python
Spark
SQL
Unity
Go
• Responsible for at-scale infrastructure design, build, and deployment with a focus on distributed systems
• Building and maintaining architecture patterns for data processing, workflow definitions, and system-to-system integrations using Big Data and Cloud technologies
• Evaluating and translating technical designs into workable technical solutions/code and technical specifications on par with industry standards
• Driving the creation of reusable artifacts
• Establishing scalable, efficient, automated processes for data analysis, data model development, validation, and implementation
• Working closely with analysts/data scientists to understand the impact on downstream data models
• Writing efficient, well-organized software to ship products in an iterative, continual-release environment
• Contributing to and promoting good software engineering practices across the team
• Communicating clearly and effectively with technical and non-technical audiences
• Defining data retention policies
• Monitoring performance and advising on any necessary infrastructure changes
• 3+ years’ experience with GCP (BigQuery, Dataflow, Pub/Sub, Bigtable or another NoSQL database, Dataproc, Storage, Kubernetes Engine)
• 5+ years’ experience with data engineering or backend/full-stack software development
• Strong SQL skills
• Python scripting proficiency
• Experience with data transformation tools such as Databricks and Spark
• Experience with data manipulation libraries (such as Pandas, NumPy, PySpark)
• Experience structuring and modelling data in both relational and non-relational forms
• Ability to evaluate and propose relational/non-relational approaches, normalization/denormalization, and data warehousing concepts (star and snowflake schemas)
• Designing for transactional and analytical operations
• Experience with CI/CD tooling (GitHub, Azure DevOps, Harness, etc.)
• Good verbal and written communication skills in English
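The data-modelling skills listed above (star/snowflake schemas, denormalization, data manipulation libraries) can be illustrated with a minimal Pandas sketch. All table and column names here are hypothetical, chosen only to show the shape of the work:

```python
import pandas as pd

# Hypothetical miniature star schema: a fact table of sales events
# and a date dimension keyed by a surrogate date_key.
dim_date = pd.DataFrame({
    "date_key": [20240101, 20240102],
    "month": ["2024-01", "2024-01"],
})
fact_sales = pd.DataFrame({
    "date_key": [20240101, 20240101, 20240102],
    "amount": [100.0, 50.0, 75.0],
})

# Denormalize: join the fact table to its dimension on the surrogate key.
sales = fact_sales.merge(dim_date, on="date_key", how="left")

# Aggregate into a monthly analytical view.
monthly = sales.groupby("month", as_index=False)["amount"].sum()
print(monthly)
```

The same join-then-aggregate pattern carries over to Spark/PySpark at scale, where `merge` and `groupby` become `join` and `groupBy` on distributed DataFrames.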
10,000+
As a Senior Data Engineer, design and maintain Oracle PBCS database infrastructure and ensure data quality.
November 10
1001 - 5000
Data Engineer optimizing financial data management for a major client.
November 8
51 - 200
Designing data architecture for a global tech consulting firm, enhancing productivity.
November 7
1001 - 5000
Enhance financial operations through data management and integration strategies.
October 31
11 - 50
Deliver data solutions and design scalable data pipelines for Business Analytics & Insights.