Data Architect

September 12


Description

• Lead the project team, driving project success through effective leadership, communication, and collaboration.
• As a member of the project governance team, collaborate to ensure alignment with project objectives, budget, timeline, and quality standards.
• Lead requirements gathering, solutioning, and estimation efforts to meet client needs.
• Ensure data integrity, security, and compliance with industry standards.
• Design and lead the implementation of data ingestion, storage, transformation, modeling, and data visualization solutions, ensuring efficiency, performance, scalability, security, and reliability across data and AI solutions.
• Provide technical leadership and mentorship to project team members.
• Drive client success by delivering high-quality data solutions that follow best practices and industry standards and meet business needs.

Requirements

• Bachelor’s or master’s degree in computer science, information technology, or a related field expected.
• Master’s degree or PhD in data science or business administration preferred.
• 5–10 years of experience as a Data Architect or in a similar role.
• 10–15 years of experience as a data professional.
• Strong understanding of data management systems, including data warehouses and data lakes.
• Experience with data modeling to support common data stories in the following industries:
  • Distribution, Supply Chain, Warehousing, Manufacturing
  • Retail, Consumer Goods
  • Financial Services, Professional Services
• Knowledge of data governance, security, and privacy regulations.
• Excellent problem-solving and communication skills.
• DP-600 Fabric Analytics Engineer Associate certification.
• PL-300 Power BI Data Analyst Associate or DP-203 Azure Data Engineer Associate certification.
• AZ-305 Azure Solutions Architect Expert certification.
• Proficiency in creating and maintaining conceptual, logical, and physical data models.
• Expertise in unified data platforms such as Azure Synapse Analytics, Microsoft Fabric, and Databricks.
• Experience designing and implementing data integration solutions; knowledge of ETL (Extract, Transform, Load) processes and tools.
• Ability to ensure data quality and implement data quality frameworks.
• Knowledge of data security best practices and experience implementing data security measures.
• Proficiency in programming languages such as SQL, Python, or others used for data manipulation and analysis.
• Strong communication skills to collaborate with cross-functional teams and explain complex technical concepts to non-technical stakeholders.
• Advanced problem-solving and analytical skills.
• Willingness to learn and adapt to new technologies and tools.
