October 31
Airflow
AWS
Docker
DynamoDB
ElasticSearch
ETL
GraphQL
Kafka
Microservices
NoSQL
Pandas
Postgres
PySpark
Python
SDLC
Spark
SQL
Terraform
Go
About Craft:
• Craft is the leader in supplier intelligence, enabling enterprises to discover, evaluate, and continuously monitor their suppliers at scale.
• Craft’s open supplier profiles appear in over 100 million organic search results each month, driving over 2 million monthly active users on our website.
• We're looking for innovative, driven people who are passionate about building the future of Enterprise Intelligence to join our growing team!

About the Role:
• Craft is looking for an experienced and motivated Data Engineer to join a team responsible for a key product within the organization.
• We're growing quickly and hiring data engineers for several of our teams. Each team is looking for someone with strong data engineering experience, Python coding experience, and solid software engineering practices.

What You'll Do:
• Build and optimize data pipelines (batch and streaming)
• Extract, analyze, and model rich and diverse datasets
• Design software that is easily testable and maintainable

What We're Looking For:
• 4+ years of experience in Data Engineering
• 4+ years of experience with Python
• Experience developing, maintaining, and ensuring the reliability, scalability, fault tolerance, and observability of data pipelines in a production environment
• Strong knowledge of the SDLC and solid software engineering practices
• Knowledge and experience of Amazon Web Services (AWS) and Databricks (nice to have)
• Demonstrated curiosity: asking questions, digging into new technologies, and always trying to grow
• Strong problem-solving skills and the ability to communicate ideas effectively
• Familiarity with the infrastructure-as-code approach
• Self-starter: independent and takes initiative
• Fundamental knowledge of data engineering techniques: ETL/ELT, batch and streaming, DWH, data lakes, distributed processing
• Familiarity with at least some technologies in our current tech stack: Python, PySpark, Pandas, SQL (PostgreSQL), Airflow, Docker, Databricks & AWS (S3, Batch, Athena, RDS, DynamoDB, Glue, ECS), CircleCI, GitHub, Terraform
What We Offer:
• Option to work as a B2B contractor or full-time employee
• Competitive salary at a well-funded, fast-growing startup
• PTO days so you can take the time you need to refresh!
• Full-time employees: 28 PTO days + 13 public holidays (41 total paid days off)
• B2B contractors: 15 PTO days + 13 public holidays (28 total paid days off)
• Uncapped sick leave so you can focus on your health when you need it
• 100% remote work (or hybrid if you prefer; we have a coworking space in the center of Warsaw)
• zł 400 monthly wellness/learning stipend (gym memberships, meals, snacks, books, classes, conferences, etc.)