July 16
• Design, implement, test, and maintain data models within our data lake architecture to support advanced analytics
• Implement processes that ensure models are scalable, maintainable, and optimized for high performance
• Design and manage workflows using Apache Airflow
• Collaborate with data architects, analytic engineers, software engineers, and IT team members to achieve project goals
• Bachelor’s degree in Computer Science, Engineering, or a related field
• 3+ years of experience in a data engineering role, preferably with exposure to data lake environments
• Proficient in SQL and Python for complex queries and analytics
• Extensive experience with ETL processes and tools
• Solid understanding of the Hadoop ecosystem, including Hive and Apache Spark, for processing large datasets
• Experience with Apache Airflow for workflow management
• Familiarity with major cloud platforms (AWS, Azure)
• Ability to work in a team environment and collaborate on projects
• Excellent problem-solving and troubleshooting skills
• Strong communication and organizational skills
• Certifications in Big Data technologies and cloud platforms are strongly preferred
• Competitive salary
• Friendly, pleasant, and creative working environment
• Remote working
• Development opportunities
• Private health insurance