Particle41 specializes in technology development, data science, and DevOps. The company offers CTO advisory services, acting as a partner to strengthen tech strategy and deliver robust software solutions tailored to clients' needs. Particle41 emphasizes modernizing operations with cloud architecture, integrating software systems, and leveraging artificial intelligence to build innovative digital products. It works closely with businesses to ensure on-time project delivery, scalability, and data security, with a particular focus on sharpening clients' competitive edge through strategic tech solutions and ongoing support.
Application Development • DevOps • Data Science
March 8
AWS
Azure
Cloud
ElasticSearch
ETL
Flask
Google Cloud Platform
Java
Linux
Microservices
MongoDB
MySQL
NoSQL
Pandas
Postgres
PySpark
Python
Redis
Scikit-Learn
Spark
SQL
Data Engineer (Elasticsearch + Data warehousing)

Particle41 is seeking a talented and versatile Data Engineer to join our innovative team. As a Data Engineer, you will play a key role in designing, building, and maintaining robust data pipelines and infrastructure to support our clients' data needs. You will work on end-to-end data solutions, collaborating with cross-functional teams to ensure high-quality, scalable, and efficient data delivery. This is an exciting opportunity to contribute to impactful projects, solve complex data challenges, and grow your skills in a supportive and dynamic environment.

In This Role, You Will:

Software Development
• Design, implement, and optimize Elasticsearch clusters for high-performance querying and data retrieval.
• Build and manage Elasticsearch indexes, ensuring data is stored, indexed, and queried efficiently (see the sketch after this section).
• Build and optimize data storage solutions such as data lakes and warehouses.
• Integrate structured and unstructured data from internal and external systems to create a unified view for analysis.
• Ensure data accuracy, consistency, and completeness through rigorous validation, cleansing, and transformation processes.
• Maintain comprehensive documentation for data processes, tools, and systems, and promote best practices for efficient workflows.

Requirements Gathering and Analysis
• Collaborate with product managers and other stakeholders to gather requirements and translate them into technical solutions.
• Participate in requirement analysis sessions to understand business needs and user requirements.
• Provide technical insights and recommendations during the requirements-gathering process.

Agile Development
• Participate in Agile development processes, including sprint planning, daily stand-ups, and sprint reviews.
• Work closely with Agile teams to deliver software solutions on time and within scope.
• Adapt to changing priorities and requirements in a fast-paced Agile environment.

Testing and Debugging
• Conduct thorough testing and debugging to ensure the reliability, security, and performance of applications.
• Write unit tests to validate the functionality of developed features and individual components.
• Write integration tests to ensure the different parts of an application work together as intended and meet requirements.
• Identify and resolve software defects, code smells, and performance bottlenecks.

Continuous Learning and Innovation
• Stay current with the latest technologies and trends in full-stack development.
• Propose innovative solutions to improve the performance, security, scalability, and maintainability of applications.
• Continuously seek opportunities to optimize and refactor the existing codebase for better efficiency.
• Stay up to date with cloud platforms such as AWS, Azure, and Google Cloud Platform.

Collaboration
• Collaborate effectively with cross-functional teams, including testers and product managers.
• Foster a collaborative and inclusive work environment where ideas are shared and valued.
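As a concrete illustration of the index-management work described above, here is a minimal sketch using the official Elasticsearch Python client. The cluster URL, index name, mapping fields, and sample documents are hypothetical assumptions for illustration, not details from the posting.

```python
# Minimal sketch: create an index with an explicit mapping, bulk-index
# documents, and run an aggregation query. All names are hypothetical.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

INDEX = "orders"  # hypothetical index name

# Explicit mapping so fields are stored and queried efficiently:
# keyword for exact-match filters, date/float for range queries.
mappings = {
    "properties": {
        "order_id": {"type": "keyword"},
        "customer": {"type": "keyword"},
        "amount": {"type": "float"},
        "created_at": {"type": "date"},
    }
}

if not es.indices.exists(index=INDEX):
    es.indices.create(index=INDEX, mappings=mappings)

# Bulk-index a couple of sample documents.
docs = [
    {"_index": INDEX, "_source": {"order_id": "o-1", "customer": "acme",
                                  "amount": 42.0, "created_at": "2025-01-01"}},
    {"_index": INDEX, "_source": {"order_id": "o-2", "customer": "acme",
                                  "amount": 13.5, "created_at": "2025-01-02"}},
]
bulk(es, docs)
es.indices.refresh(index=INDEX)  # make new docs visible to search

# Query: total spend per customer via a terms aggregation.
resp = es.search(
    index=INDEX,
    size=0,
    aggs={"by_customer": {
        "terms": {"field": "customer"},
        "aggs": {"spend": {"sum": {"field": "amount"}}},
    }},
)
for bucket in resp["aggregations"]["by_customer"]["buckets"]:
    print(bucket["key"], bucket["spend"]["value"])
```

Defining the mapping explicitly, rather than relying on dynamic mapping, keeps field types predictable and avoids costly reindexing when a string that should have been a keyword is auto-mapped as analyzed text.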
Requirements:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Proven experience as a Data Engineer, with a minimum of 3 years of experience.
• Proficiency in Elasticsearch and the Python programming language is a must.
• Experience with SQL databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB).
• Strong grasp of common libraries, frameworks, and technologies: Flask and other API frameworks, data warehousing/lakehouse principles, databases and ORMs, data analysis, Databricks, Pandas, Spark/PySpark, and machine-learning libraries such as OpenCV and Scikit-learn (see the ETL sketch after this list).
• Ability to use Java to build and enhance backend systems, particularly for integration with Elasticsearch and databases, and to develop APIs, microservices, and automation scripts as needed.
• Utilities and tools: logging, requests, subprocess, regex, pytest.
• Experience with the ELK stack, Redis, and distributed task queues.
• Strong understanding of data warehousing/lakehouse principles and concurrent/parallel processing concepts.
• Familiarity with at least one cloud data engineering stack (Azure, AWS, or GCP) and the ability to quickly learn and adapt to new ETL/ELT tools across cloud providers.
• Familiarity with version control systems like Git and collaborative development workflows.
• Competence working on Linux and writing shell scripts.
• Solid understanding of software engineering principles, design patterns, and best practices.
• Excellent problem-solving and analytical skills, with keen attention to detail.
• Effective written and verbal communication skills, and the ability to collaborate in a team environment.
• Adaptability and willingness to learn new technologies and tools as needed.
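To make the PySpark and ETL expectations concrete, here is a minimal, hypothetical pipeline sketch: load raw CSV, validate and cleanse it, and write warehouse-ready Parquet. The file paths, column names, and cleansing rules are illustrative assumptions, not part of the job description.

```python
# Hypothetical ETL sketch in PySpark. Paths and columns are assumptions.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def clean_orders(df: DataFrame) -> DataFrame:
    """Validation/cleansing step: drop incomplete rows, normalize types."""
    return (
        df.dropna(subset=["order_id", "customer"])    # completeness check
          .withColumn("amount", F.col("amount").cast("double"))
          .withColumn("created_at", F.to_date("created_at"))
          .dropDuplicates(["order_id"])               # consistency check
    )


if __name__ == "__main__":
    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    raw = spark.read.csv("data/raw/orders.csv", header=True)  # assumed path
    clean = clean_orders(raw)

    # Partitioned Parquet is one common warehouse/lake layout choice.
    clean.write.mode("overwrite").partitionBy("created_at") \
         .parquet("data/warehouse/orders")  # assumed path

    spark.stop()
```

Keeping clean_orders a pure DataFrame-in/DataFrame-out function makes it straightforward to cover with a pytest unit test against a small in-memory DataFrame, independent of the input and output paths.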
March 6
Join Syniti as a Data Migration Consultant to deliver projects for global clients using proprietary tools.
March 4
201 - 500 employees
Join GroundTruth to build data engineering solutions enhancing advertising platform capabilities.
February 16
Join Codvo as a Data Integration Specialist responsible for API solutions and data processes.
February 12
Join Bungee Engineering as a Data Engineer, leading efforts in analytical systems and developing automation tools.
February 8
Role focuses on data engineering, security solutions, and collaboration with IT teams.