Ingenius Technologies and Consulting is a company that specializes in Hybrid Intelligence, merging artificial intelligence analytics with human insight to provide dynamic and efficient business solutions. They offer a variety of products and services including AI-driven forecasting tools, custom AI solutions, data analytics and insights, as well as AI strategy and advisory services. Ingenius is dedicated to helping clients succeed in their AI initiatives by leveraging advanced technologies and human creativity.
Consulting • Platform for Physical Commodity Companies • Risk Management • Analytics
March 19
• We are an innovative AI SaaS startup that develops cutting-edge AI solutions and provides expert consulting services.
• We're seeking a talented Data Engineer to join our team and help drive our product development and consulting initiatives.
• A Mid-Senior Azure Data Engineer will play a key role in designing, building, and optimizing data pipelines and architectures within the Azure ecosystem.
• This role involves working with Azure Data Factory (ADF), Databricks, PySpark, Scala, and Python to support data ingestion, transformation, and orchestration workflows.
• Collaborating with senior team members, they will enhance data solutions, contribute to AI-driven projects, and ensure scalability and performance.
• Additionally, they will provide technical guidance to junior engineers and help implement best practices for data security and compliance.
Responsibilities:
• Design and deploy scalable data pipelines using Azure Synapse, Azure Databricks, Azure Stream Analytics, Azure ML, and Azure Data Factory (a small PySpark sketch follows this list).
• Optimize data storage (e.g., partitioning in ADLS, Cosmos DB for real-time data).
• Implement DevOps practices (CI/CD, IaC) via Azure DevOps or GitHub Actions.
• Integrate data solutions with AI/ML services (e.g., Azure Machine Learning, OpenAI models).
• Develop data governance frameworks (security, RBAC, encryption).
• Troubleshoot performance bottlenecks and automate monitoring/alerting.
• Collaborate with AI engineers to productionize GenAI models.

Requirements:
• 3–5 years of Azure data engineering experience.
• Proficiency in advanced SQL, PySpark, and data orchestration tools (e.g., Airflow).
• Expertise in performance tuning (partitioning, indexing, caching).
• Strong understanding of DevOps and IaC (Terraform, ARM templates).
• Experience with AI/ML workflows and real-time data streaming.
• Azure Data Engineer Associate (DP-203) certification (preferred).
• Familiarity with GenAI tools (Azure OpenAI, LangChain) (preferred).
• Bachelor's degree in any stream (Engineering, Science, or Commerce) (preferred).
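To make the PySpark/ADLS portion concrete, here is a minimal sketch of the kind of pipeline work described above: read raw JSON from an ADLS landing zone, cleanse it, and write it back as date-partitioned Parquet. The storage account, container names, paths, and column names are hypothetical; in a real Databricks workspace, authentication would normally come from a mounted storage account or a service principal rather than anything hard-coded.

```python
# Hypothetical PySpark pipeline sketch; paths and columns are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-ingest").getOrCreate()

# Ingest raw JSON landed in an ADLS "raw" zone (hypothetical path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Basic cleansing and enrichment.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write to a curated zone as Parquet, partitioned by date so downstream
# scans stay cheap (the "partitioning in ADLS" point above).
(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

In practice a job like this would typically be packaged as a Databricks notebook or job and triggered on a schedule by Azure Data Factory, which covers the orchestration side of the role.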
• Cutting-edge Technology: Work on cutting-edge AI projects and shape the future of data visualization.
• Rapid Growth: Be part of a high-growth startup with endless opportunities for career advancement.
• Impactful Work: See your contributions make a real difference in how businesses operate.
• Collaborative Culture: Join a diverse team of brilliant minds from around the world.
• Flexible Work Environment: Enjoy remote work options and a healthy work-life balance.
• Competitive Compensation: In line with market rates.
March 14
Join Oportun as a Senior Data Engineer responsible for enhancing data platforms for their fintech solutions.
March 14
Senior Manager at Wex, overseeing Data Lake technologies and the engineering team in Bangalore.
March 14
Join Wex as a Senior Staff Engineer for Data Lake House platform development and innovation.
March 11
The role focuses on data modeling and management for a global fintech company. It requires collaboration with architects and analysts to meet business data needs.
March 10
Join a leading media client to build and maintain scalable data pipelines using Snowflake and Python.