Nimble Gravity is a company specializing in AI acceleration and automation services, leveraging cutting-edge data science, generative AI, and digital technologies to transform business challenges into growth opportunities. They provide a comprehensive suite of services, including predictive and prescriptive analytics, digital transformation strategies, software engineering, e-commerce solutions, and CRM optimization, particularly using Salesforce. Nimble Gravity is known for creating AI-powered solutions such as automated generative AI agents and data-driven e-commerce strategies, helping businesses accelerate their digital transformation and improve decision making.
Artificial Intelligence • Data Science • Machine Learning • AI/ML • E-commerce
• Location: LATAM - Remote
• Are you passionate about turning messy, disconnected data into clean, actionable insights? Do you thrive on building custom solutions that bridge legacy systems with modern platforms? Are you fluent in Python, Databricks-certified, and excited by the challenge of automating complex data pipelines? If you are nodding along, then you belong with us!
• We are looking for a skilled and driven Data Engineer & Integration Developer - Databricks to join our team. This role focuses on developing custom connectors for legacy systems, automating data extraction processes, and integrating data from diverse sources. The successful candidate will possess a Databricks certification, strong Python development expertise, a proven track record of working with complex data extraction scenarios, and excellent communication skills to collaborate effectively with Data Stewards and stakeholders.
• Essential Duties and Responsibilities:
• Custom Connector Development: Design and develop Python-based custom connectors for legacy systems to facilitate seamless data integration.
• Automation and Data Extraction: Create automation scripts for data extraction from a variety of sources, including legacy systems, APIs, Excel files, SharePoint, and PDFs.
• Web Scraping: Utilize web scraping techniques to gather data from online sources, ensuring accuracy and reliability.
• Collaboration and Stakeholder Engagement: Work closely with Data Stewards and stakeholders to understand requirements, provide updates, and ensure that solutions align with business objectives.
• Integration and Deployment: Employ tools such as Airflow and dbt to ensure efficient integration and deployment of data workflows.
• Process Documentation: Create detailed technical documentation to support process replication, maintenance, and knowledge sharing.
• Continuous Improvement: Stay updated with emerging tools and techniques to enhance data integration and automation practices.
• Databricks Integration and Development: Design, develop, and optimize data workflows and pipelines within the Databricks platform, ensuring high performance, scalability, and alignment with best practices.
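To give a concrete sense of the "custom connector" and web-scraping work described above, here is a minimal, hedged sketch of pulling tabular data out of a legacy HTML report using only Python's standard library. The report structure, names, and sample data are illustrative assumptions, not the company's actual systems (real connectors would likely use libraries such as requests or BeautifulSoup).

```python
# Hypothetical sketch: extract rows from a legacy HTML report.
# Uses only the stdlib html.parser; all names/structure are assumptions.
from html.parser import HTMLParser


class TableExtractor(HTMLParser):
    """Collects the text of each <td> cell, grouped by <tr> row."""

    def __init__(self):
        super().__init__()
        self.rows = []        # completed rows
        self._row = None      # row currently being built
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())


def extract_rows(html: str) -> list[list[str]]:
    """Parse an HTML fragment and return its table cells as rows."""
    parser = TableExtractor()
    parser.feed(html)
    return parser.rows


# Example legacy report fragment (hypothetical):
sample = ("<table><tr><td>SKU-1</td><td>42</td></tr>"
          "<tr><td>SKU-2</td><td>7</td></tr></table>")
print(extract_rows(sample))  # [['SKU-1', '42'], ['SKU-2', '7']]
```

In practice the extracted rows would then be loaded into a staging table rather than printed, but the shape of the task (parse a source no API exists for, normalize it, hand it downstream) is the same.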
• Databricks certification is required.
• 5+ years in a Data Scientist or related role.
• Proven expertise in data integration and automation.
• Familiarity with handling diverse and complex datasets.
• Proficiency in Python development.
• Experience with Airflow, dbt, and similar tools and frameworks.
• Experience with cloud providers.
• Expertise in web scraping and automation.
• Strong understanding of data extraction methods from various sources: legacy systems, APIs, Excel files, SharePoint, and PDFs.
• Demonstrated experience in designing and implementing data integration solutions.
• Fluent in English (spoken and written).
• Excellent communication and problem-solving skills.
• Ability to work effectively in a collaborative, remote environment.
• US Visa (preferred)
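The pipeline work the role calls for (cleaning raw extracts, then aggregating them for consumption) can be sketched in miniature. On Databricks these stages would typically be PySpark DataFrames written to Delta tables and orchestrated by Airflow or dbt; plain Python with made-up sample data is used here only to show the staged shape of such a workflow.

```python
# Hypothetical staged pipeline sketch: raw -> cleaned -> aggregated.
# All data and field names are illustrative assumptions.
from collections import defaultdict


def clean(raw_rows):
    """Drop malformed rows and normalize types (the cleaning stage)."""
    cleaned = []
    for row in raw_rows:
        try:
            cleaned.append({"sku": row["sku"].strip().upper(),
                            "qty": int(row["qty"])})
        except (KeyError, ValueError, AttributeError):
            continue  # skip rows that fail validation
    return cleaned


def aggregate(cleaned_rows):
    """Sum quantities per SKU (the aggregation stage)."""
    totals = defaultdict(int)
    for row in cleaned_rows:
        totals[row["sku"]] += row["qty"]
    return dict(totals)


raw = [{"sku": " sku-1 ", "qty": "3"},
       {"sku": "sku-1", "qty": "4"},
       {"sku": "sku-2", "qty": "oops"}]  # malformed row, dropped by clean()
print(aggregate(clean(raw)))  # {'SKU-1': 7}
```

Each stage is a pure function over its input, which keeps the steps easy to test and to replicate in documentation, as the posting's "Process Documentation" duty suggests.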