Airflow
Amazon Redshift
Apache
Cassandra
Cloud
Distributed Systems
ElasticSearch
ETL
Google Cloud Platform
Grafana
HBase
JavaScript
MariaDB
MongoDB
MySQL
NoSQL
Postgres
Prometheus
Python
RDBMS
Tableau
Go
• Join a US-based outsourcing company focusing on providing exceptional software experiences.
• Engage in a blockchain project transforming the art market while collaborating with world-class brands.
• Contribute to building processes, tools, and products from scratch in a startup environment.
• 4+ years of experience in data engineering, encompassing data extraction, transformation, and migration.
• Advanced experience with data extraction from unstructured files and legacy systems.
• Proven expertise in migrating data from file-based storage systems to cloud storage solutions, ideally on Google Cloud Platform.
• Proficiency with relational databases, specifically MariaDB or MySQL, as well as cloud-native solutions such as Google Cloud Storage, Google BigQuery, and optionally Snowflake or Amazon Redshift.
• Strong programming skills in Python, with a focus on data manipulation, automation, and re-implementing custom tools.
• Extensive experience with ETL/ELT pipeline development and workflow orchestration tools (e.g., Apache Airflow, Luigi, Google Dataflow, Prefect).
• Hands-on experience with both batch and real-time data processing frameworks.
• Experience developing data pipelines in code, including batch processing implementations.
• In-depth understanding of data modeling, data warehousing, and best practices for designing scalable data architectures.
• Practical experience developing or re-engineering data mastering tools for data cleaning, standardization, and preparation.
• Expertise in RDBMS functionality such as stored procedures, triggers, partitioning, indexes, and structural changes.
• Ability to handle Personally Identifiable Information (PII) within pipelines and data storage systems.
• Experience with NoSQL databases such as MongoDB, Cassandra, or HBase.
• Experience with monitoring tools such as Prometheus, Grafana, and CloudWatch to oversee data pipelines and systems.
• Knowledge of best practices in database management, performance optimization, data security, and data consistency across distributed systems.
• Ability to critically evaluate data architecture and provide strategic recommendations for infrastructure improvements.
• Upper-Intermediate+ English level.
• Familiarity with JavaScript for maintaining or enhancing legacy systems and cross-functional integration.
• Experience with ElasticSearch for indexing and querying large datasets.
• Proficiency with analytical tools such as Tableau, Power BI, Looker, or similar platforms for data visualization and insights generation.
• Interest or background in the art industry, particularly related to digital asset management and tokenization.
• Demonstrated ability to collaborate in cross-functional teams and contribute to multidisciplinary projects.
• Experience with PostgreSQL and an understanding of its application in data engineering environments.
• Knowledge of domain-specific services, key metrics, and business processes relevant to the industry.
• Experience with MLOps tools and practices to streamline machine learning deployment and operations.
• Basic understanding of existing machine learning models and algorithms.
• Get 30 paid rest days per year to use as holidays, vacation, or other time off on the dates you request
• 5 sick leave days, up to 60 days of medical leave, and up to 6 days of leave per year for family reasons (e.g., wedding/funeral/baby birth)
• Get a health insurance package fully compensated by Dev.Pro
• Join fun online activities and team-building events
• Get continuous remote HR and payroll support, plus overtime coverage
• Join English/Polish lessons
• Grow your expertise with mentorship support and DP University