Analytics • Credit ratings • Finance • Energy & commodities information • Intelligence
10,000+
November 5
Angular
AWS
Azure
Benchmarks
Cloud
ETL
Informatica
J2EE
Java
Jenkins
Microservices
Python
RDBMS
React
Scala
SDLC
Shell Scripting
SOAP
TensorFlow
Unix
Go
• You will be an expert contributor on the Ratings Organization's Ingestion Pipelines Engineering Team.
• Design and build ingestion pipelines using ELT and schema-on-read in the Databricks Delta Lake.
• Provide technical expertise in the design and implementation of Ratings data ingestion pipelines using modern AWS cloud technologies such as S3, Hive, Databricks, Scala, Python, and large-scale data analytics tools.
• Work closely with other data teams and the Data Science team, and participate in the development of ingestion pipelines.
• Ensure data governance principles are adopted, and that data quality checks and data lineage are implemented at each hop of the data.
• Stay in tune with emerging trends in Big Data and cloud technologies, and participate in the evaluation of new technologies.
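To illustrate the ELT and schema-on-read pattern the role centers on, here is a minimal plain-Python sketch: raw records are loaded as-is, and a schema is only projected onto them at read time, with a simple data-quality flag of the kind a governance check might record. This is an assumption-laden toy (the field names, SCHEMA, and read_with_schema are hypothetical), not the team's actual Databricks/Delta Lake implementation.

```python
import json

# ELT: raw records land untouched; transformation happens later, at read time.
# These sample records and field names are purely illustrative.
raw_records = [
    '{"rating": "AA", "entity": "Acme Corp", "as_of": "2024-01-15"}',
    '{"rating": "BBB", "entity": "Globex", "sector": "Energy"}',  # shape drifted
]

# Schema-on-read: the expected shape is declared by the reader, not enforced at load.
SCHEMA = {"rating": str, "entity": str, "as_of": str}

def read_with_schema(line: str) -> dict:
    """Parse one raw record, project it onto SCHEMA, and attach a
    data-quality marker listing any expected fields that were absent."""
    record = json.loads(line)
    projected = {field: record.get(field) for field in SCHEMA}
    projected["_dq_missing"] = [f for f in SCHEMA if f not in record]
    return projected

rows = [read_with_schema(r) for r in raw_records]
```

Because the schema lives with the reader, drifted records (extra or missing fields) are still ingested rather than rejected at load time, and the `_dq_missing` marker gives downstream quality checks something concrete to act on.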
• BE, MCA, or MS degree in Computer Science or Information Technology.
• 10+ years of experience as a Data Engineer at an innovative organization.
• 5+ years of hands-on experience implementing data lake systems using AWS/Azure cloud technologies such as S3, Databricks, Hive, large-scale data analytics tools, Scala, and Python.
• 3+ years of expertise building applications with data stream processing tools (e.g., GoldenGate) to create ingestion pipelines for bulk and incremental data loads.
• Strong expertise (preferably 5 years) across the SDLC, including stakeholder management, people management, risk management, strategic management, and communication management.
• Experience with development frameworks as well as data and integration technologies such as Python and Scala.
• Experience in microservices and API design and implementation, with service-oriented architectures and SOAP and RESTful APIs.
• Hands-on experience developing scalable data pipelines using technologies such as data stream processing tools, Databricks, large-scale data analytics tools, and Scala, applying ETL and ELT concepts.
• Deep experience with three or more of: Java/J2EE, C#, AWS, large-scale data analytics tools, Python, Scala, any RDBMS, data stream processing tools, Informatica, Angular/ReactJS, Databricks, and cloud-native orchestration tools.
• Experience with continuous integration and deployment tools such as Jenkins and Azure DevOps.
• Experience working in UNIX/Linux environments, including shell scripting.
• Strong understanding of cloud-native architectures, design patterns, and best practices.
• Knowledgeable in technology and industry trends, with the ability to develop and present substantive technical solutions.
• Knowledge of Agile approaches to software development, with the ability to put key Agile principles into practice to deliver solutions incrementally.
• Quality-first mindset with a strong background and experience developing products for a global audience at scale.
• Excellent analytical thinking, interpersonal, oral, and written communication skills with strong ability to influence both IT and business partners.
• Health & Wellness: Health care coverage designed for the mind and body.
• Flexible Downtime: Generous time off helps keep you energized for your time on.
• Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
• Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
• Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
• Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
Lead Data Engineer for S&P Global focusing on ingestion pipelines and data architecture.