Quantrics Enterprises Inc. is a leading global provider of world-class customer care, IT, and software solutions. Committed to a people-first approach, Quantrics focuses on creating career growth opportunities for its employees while delivering customer satisfaction. The company's culture emphasizes diversity, community involvement, and excellence, making it a recognized name in the digital services industry.
Information Technology • Customer Experience
January 16
Apache
BigQuery
Cloud
Google Cloud Platform
Hadoop
Java
Kafka
Kubernetes
NoSQL
Python
Scala
SDLC
Shell Scripting
Spark
SQL
Go
• Join our award-winning IT team as we lead the way in digital, cloud, and security technology services.
• Deliver innovative solutions for our biggest client, Canada’s leading telecommunications, tech, and media corporation.
• Play a critical role in designing, developing, and operating scalable and efficient data pipelines.
• Collaborate with cross-functional teams to translate business requirements into performant data pipelines with service level objectives (SLOs).
• Drive best practices in data engineering, ensuring high code quality, maintainability, and user documentation.
• Leverage cloud technologies with a focus on Google Cloud services such as BigQuery, Dataflow, and Pub/Sub (see the pipeline sketch after this list).
• Optimize data pipelines for end-to-end performance, reliability, security, and resource efficiency.
• Develop and implement monitoring, alerting, and incident response processes for data pipeline systems and infrastructure.
• Assure the operation of data pipelines through 24/7 monitoring of data service level indicators (SLIs).
• Assure the performance of data infrastructure, ensuring 99.99% availability.
• Provide DevOps mentorship and guidance to team members.
• Contribute to the development of data systems disaster recovery plans and conduct regular compliance validation drills.
• Mentor and coach junior data engineers, fostering a culture of continuous improvement and learning.
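To give a concrete sense of the kind of pipeline work this role involves, here is a minimal sketch of a streaming Apache Beam job that reads events from Pub/Sub and appends them to BigQuery. The project, topic, and table names are hypothetical placeholders, and the Dataflow runner and its flags would be supplied on the command line.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical project, topic, and table names -- for illustration only.
PROJECT = "example-gcp-project"
TOPIC = f"projects/{PROJECT}/topics/events"
TABLE = f"{PROJECT}:analytics.events"


def parse_message(msg: bytes) -> dict:
    """Decode a raw Pub/Sub payload into a dict matching the BigQuery schema."""
    return json.loads(msg.decode("utf-8"))


def run():
    # streaming=True marks this as an unbounded pipeline; runner flags
    # (e.g. --runner=DataflowRunner --project=... --region=...) come from the CLI.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
            | "Parse" >> beam.Map(parse_message)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

A job like this would typically be submitted to Dataflow and then tracked against the SLIs and alerting processes described above.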
• Minimum Required Skills & Experience:
• Bachelor’s degree in Computer Science, Software Engineering, or a related field; advanced degree preferred.
• 4+ years of experience in engineering and operating end-to-end data systems across multi-cloud environments, with expertise in GCP.
• 4+ years of proven experience in software architecture and design, driving the technical direction of projects.
• 4+ years of expertise in on-premises data engineering using technologies such as Hadoop, Kubernetes, and Kafka.
• 4+ years of experience using Scala, Apache Spark, Apache Kafka, SQL/NoSQL, shell scripting, Python, and Hadoop frameworks (see the streaming sketch after this list).
• At least 2 years of hands-on experience with stream and batch processing technologies such as Apache Kafka, and with cloud platforms such as Google Cloud Platform.
• At least 2 years of hands-on experience building data pipelines in Java and Python and applying CI/CD automation within the data pipeline SDLC.
• Nice to have: knowledge of database design, optimization, and performance tuning; Site Reliability Engineering or software system performance optimization.
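As a rough illustration of the Spark-plus-Kafka experience listed above, here is a minimal PySpark Structured Streaming sketch that consumes JSON events from a Kafka topic and lands them as Parquet. The broker address, topic name, schema, and bucket paths are hypothetical, and the Kafka connector package (spark-sql-kafka) must be on the Spark classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Requires the Kafka connector, e.g. spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<version>
spark = SparkSession.builder.appName("kafka-events-stream").getOrCreate()

# Hypothetical event schema -- for illustration only.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
    # Kafka delivers key/value as binary; decode the value and parse the JSON payload.
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "gs://example-bucket/events/")                    # hypothetical sink
    .option("checkpointLocation", "gs://example-bucket/checkpoints/")  # hypothetical checkpoint dir
    .outputMode("append")
    .start()
)

query.awaitTermination()
```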