D365 Data Engineer

ITtude

Staffing • Consultancy • IT Services • Hiring • IT Recruitment

2 - 10 employees

Description

Requirements

• Proficiency in building data pipelines using Azure Data Factory (ADF), particularly for ETL/ELT processes from D365 F&O and other systems into the data lake.
• Experience connecting ADF with D365 F&O, other databases, and external sources.
• Experience implementing complex data transformations using ADF's data flow capabilities.
• Experience with real-time data ingestion and processing using tools such as Azure Stream Analytics, especially for pushing data into the data lake in near-real time.
• Knowledge of Azure Databricks or Synapse for data processing, transformation, and analytics within the data lake environment, including large-scale distributed data processing.
• Skills in optimizing data storage within the data lake, including choosing the right storage tiers (e.g., hot, cool, archive) and compressing large datasets.
• Familiarity with optimized file formats such as Parquet, Avro, or Delta Lake for efficient querying and data storage.
• Experience monitoring data pipelines, troubleshooting performance bottlenecks, and optimizing for cost-effective use of Azure resources.
