June 5
• Design and implement ETL batch and streaming processes following best practices.
• Design and build data structures and/or APIs that allow all areas of the company and our customers to explore the required information.
• Translate business requirements into data models for implementation (dimensional models and data marts).
• Develop SQL queries for information analysis and exploration.
• Implement and drive continuous improvement and Data Governance strategies.
• At least 4 years of experience working with database management systems such as SQL Server, PostgreSQL, Redshift, Oracle, or Snowflake.
• At least 4 years of experience with AWS.
• At least 3 years of Python and PySpark development.
• At least 3 years with ETL tools (Glue, dbt, DMS, Pentaho, or IPC).
• At least 4 years of DWH and/or Data Lake experience.
• At least 2 years of experience with streaming tools such as Kafka or AWS Kinesis.
• At least 2 years of experience developing AWS Lambda functions.
• At least 3 years of experience with NoSQL databases (MongoDB, DynamoDB, DocumentDB, or Cassandra).
• 11 paid holidays
• Generous accrued time off, increasing with years of service
• Generous paid sick time
• Annual day of service