Azure Data Engineer

February 14


LAAgencia

Digital Technology • Human Resources • Information Technology and Services

Description

This role implements data ingestion pipelines from diverse data sources using Azure Data Factory, Azure Databricks, and other ETL tools, and develops scalable, reusable self-service frameworks for data ingestion and processing. It covers the design, construction, and administration of SQL Server databases in the Azure cloud, as well as data modeling and the integration of data from multiple systems. The engineer assesses and applies best practices for data manipulation, creates and maintains Azure Data Factory pipelines, and integrates end-to-end data pipelines so that data moves seamlessly from source to target repositories with consistent quality.
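For illustration only, the following is a minimal sketch of the kind of ingestion step this role involves, assuming an Azure Databricks/PySpark environment with Delta Lake available; the storage paths, container names, and column names are hypothetical and not taken from the posting:

```python
# Minimal illustrative ingestion sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

# Read semi-structured JSON landed by an upstream process (hypothetical path).
raw_orders = (
    spark.read
    .option("multiLine", "true")
    .json("abfss://landing@examplestorage.dfs.core.windows.net/orders/")
)

# Light standardization before writing to the curated zone: add load metadata,
# drop duplicate records on a hypothetical business key.
curated_orders = (
    raw_orders
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["order_id"])
)

# Write to the target zone as Delta (hypothetical path); overwrite kept simple for the sketch.
(
    curated_orders.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://curated@examplestorage.dfs.core.windows.net/orders/")
)
```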

Requirements

• Over 3 years of Azure development experience
• Proficiency in cloud-based solutions
• Comprehensive understanding and practical experience with Git
• Sound familiarity with Microsoft ETL tools, including Azure Databricks, Azure Data Factory, Data Lake, and SSIS
• Hands-on experience with both structured and unstructured data
• Proficiency with ARM templates
• Proficiency in working with JSON
• Good grasp of Azure DevOps or Jira
• Knowledge of SQL
• Effective communication skills, including the ability to give customers technical insights and interpret data for them
• Ability to work independently with a strong sense of ownership for assigned tasks
• Ability to work effectively both independently and in collaborative, cross-functional, and cross-cultural teams
• Nice to have: proficiency in data analysis programming, particularly PySpark and SparkSQL, or a willingness to learn (a brief sketch follows this list)
• Understanding of Azure Analysis Services (AAS)
• Familiarity with Power BI
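The PySpark and SparkSQL item above is only a nice-to-have; as a rough illustration of what that combination looks like in practice (all table, column, and sample values below are made up):

```python
# Small PySpark + SparkSQL illustration (hypothetical data).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sparksql_example").getOrCreate()

# Build a tiny in-memory DataFrame standing in for ingested data.
orders = spark.createDataFrame(
    [("A-1", "EMEA", 120.0), ("A-2", "EMEA", 80.0), ("A-3", "APAC", 200.0)],
    ["order_id", "region", "amount"],
)

# Expose the DataFrame to SparkSQL and aggregate with plain SQL.
orders.createOrReplaceTempView("orders")
totals = spark.sql(
    "SELECT region, SUM(amount) AS total_amount FROM orders GROUP BY region"
)
totals.show()
```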

Benefits

• End-to-end data services partner to global brands and enterprises
• Opportunity to work with diverse data sources
• Implementation of data ingestion pipelines
• Development of scalable and reusable self-service frameworks
• Design, construction, and administration of SQL Server databases
• Integration of data from various systems
• Assessing and implementing best practices for data manipulation
• Creating and maintaining Azure Data Factory pipelines
• Integration of end-to-end data pipelines
• Ensuring data quality and consistency
• Opportunity to work with cloud-based solutions
• Ability to provide technical insights and interpret data
• Ownership of assigned tasks
• Collaboration in cross-functional and cross-cultural environments
