RxCloud is a full-scale quality assurance and professional services organization that focuses exclusively on the pharmaceutical, biotechnology, and medical device industries. With over 1000 years of combined IT expertise and knowledge of quality and compliance, RxCloud delivers tailored solutions for regulatory compliance and quality engineering. The company specializes in providing diverse audit services, quality management system consulting, and IT system implementations to ensure adherence to stringent industry standards, enhancing operational efficiency and product safety across the life sciences sector.
Validation • Testing • Automation • GxP • GMP
January 3
• This is a remote position.
• Job Title: Lead Snowflake Data Engineer
• Location: Boca Raton, FL (initially remote OK)
• Duration: Long term
• Total of 10+ years in a data engineering role, with 4+ years of recent experience with Snowflake, Teradata, and cloud data migration.
• Extensive experience in the design, development, and support of complex ETL solutions.
• Ability to design and implement highly performant data ingestion pipelines from multiple sources using DataStage, Snowpipe, and SnowSQL.
• In-depth knowledge of Snowpipe, SnowSQL, and stored procedures.
• Good knowledge of Agile processes and ability to work with Scrum teams.
• Experience in DataStage and Snowflake performance optimization.
• Hands-on development experience with Snowflake data platform features, including Snowpipe, SnowSQL, tasks, stored procedures, streams, resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, cloning, time travel, data sharing, and their respective use cases.
• Advanced proficiency in writing complex SQL statements and manipulating large structured and semi-structured datasets.
• Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
• Demonstrable experience designing and implementing modern data warehouse/data lake solutions with an understanding of best practices.
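To give a flavor of the Snowpipe-based ingestion work described above, here is a minimal sketch of an auto-ingesting pipe. All object names (raw_db, landing_stage, orders_raw, orders_pipe) and the S3 URL are hypothetical, not taken from the posting:

```sql
-- Hypothetical example: auto-ingest JSON files landing in cloud storage.
-- All names and the bucket URL are illustrative assumptions.
CREATE STAGE raw_db.public.landing_stage
  URL = 's3://example-bucket/orders/'      -- assumed external location
  FILE_FORMAT = (TYPE = JSON);

CREATE TABLE raw_db.public.orders_raw (
  payload   VARIANT,                       -- semi-structured source record
  loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

CREATE PIPE raw_db.public.orders_pipe
  AUTO_INGEST = TRUE                       -- load on storage event notifications
AS
  COPY INTO raw_db.public.orders_raw (payload)
  FROM @raw_db.public.landing_stage;
```

With AUTO_INGEST enabled, Snowpipe loads new files as cloud storage event notifications arrive, rather than on a polling schedule.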
• Readiness to cover on-call support on a rotation basis.
• Proficiency in the Snowflake Cloud Data Platform and familiarity with AWS/Azure cloud platforms.
• Strong leadership qualities and ability to coordinate with an offshore team.
• Ability to provide technical guidance to the data engineering team on data pipeline design and enhancements.
• Strong experience in ETL with data migration and data consolidation.
• Hands-on experience in ETL data loading around event and messaging patterns, streaming data, Kafka, and APIs.
• Understanding of the fundamentals of DevOps CI/CD, Git and Git workflows, and SaaS-based Git tools such as GitHub, GitLab, and Bitbucket.
• Experience working in an agile application development environment.
• Ability to proactively prioritize tasks in consultation with business stakeholders, Product Owners, and Product Managers.
• Design, build, deploy, and support DataStage ETL jobs that extract data from disparate source systems, transform it, and load it into the EDW for data mart consumption, self-service analytics, and data visualization tools.
• Ensure data quality, efficient processing, and timely delivery of accurate and trusted data.
• The ability to design, implement, and optimize large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is essential.
• Establish ongoing end-to-end monitoring for the data pipelines.
• Strong understanding of the full CI/CD lifecycle.
• Convert business requirements into technical solutions.
• Ensure adherence to architectural guidelines and strategic business needs.
• Technical feasibility analysis, recommendations, and effort estimation.
• Provide operational instructions for dev, QA, and production code deployments while adhering to internal Change Management processes.
• Performance optimization, QA support, and automation.
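The streams-and-tasks pattern mentioned in the requirements can be sketched as follows. This assumes a raw landing table already exists; every object name (orders_stream, merge_orders_task, transform_wh, edw.public.orders) and the payload fields are hypothetical:

```sql
-- Hypothetical example: incremental transform using a stream and a scheduled task.
-- Assumes raw_db.public.orders_raw (payload VARIANT, loaded_at TIMESTAMP_NTZ) exists.
CREATE STREAM raw_db.public.orders_stream
  ON TABLE raw_db.public.orders_raw;       -- tracks new rows since last consumption

CREATE TASK raw_db.public.merge_orders_task
  WAREHOUSE = transform_wh                 -- assumed virtual warehouse
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_db.public.orders_stream')  -- skip empty runs
AS
  INSERT INTO edw.public.orders (order_id, amount, loaded_at)
  SELECT payload:order_id::NUMBER,         -- illustrative payload fields
         payload:amount::NUMBER(12,2),
         loaded_at
  FROM raw_db.public.orders_stream;

ALTER TASK raw_db.public.merge_orders_task RESUME;  -- tasks are created suspended
```

Consuming the stream inside the task's DML advances the stream's offset, so each run processes only rows that arrived since the previous successful run.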
• Valid professional certification.
• Experience in Python and big data cloud platforms.
• Expertise in Unix and shell scripting.