Education • Special Education • Data and AI • Modern Data Estate for Education • Exceptional Learning
11 - 50
October 20
• Be an integral part of our engineering team, responsible for designing, developing, and maintaining the Education Intelligence Platform.
• Ensure the scalability, reliability, and performance of the platform.
• Construct, manage, and enhance fault-tolerant data infrastructure while upholding data quality and integrity.
• Collaborate with the engineering and product teams to execute on product goals.
• Develop and manage fault-tolerant, scalable data pipelines capable of handling terabytes of data using distributed cloud technologies.
• Develop data ingestion, processing, and transformation techniques that maintain data integrity and quality.
• Assist in the construction of control-plane infrastructure using event-driven services.
• Conduct POCs to validate new tools and services that enhance our data engineering solutions and products.
• Troubleshoot production data quality issues and ensure data integrity.
• Stay abreast of industry standards and technological advancements.
• Minimum of 2 years of hands-on experience with Python and related data libraries (e.g., pandas, DataFrames).
• Practical expertise in ETL/ELT technologies and methodologies.
• Prior experience using Databricks.
• Proven experience in data wrangling and cleaning across structured, semi-structured, and unstructured data formats.
• Solid design and development background in modern technologies such as API management, REST API integration, containers, and microservices.
• Experience in designing or working with data warehouses, including an understanding of associated data flows.
• Exceptional communication skills, both written and verbal.
• English fluency is required to communicate effectively with our clients and other key stakeholders, both internal and external.

Nice to Have:
• A background in Education and/or EdTech.
• Hands-on experience with data streaming technology such as Databricks DLT (Delta Live Tables).
• Databricks certifications.
• Experience using Azure, AWS, or GCP, especially from a DevOps or data engineering perspective.
• Experience building multi-tenant systems.
• Experience with OAuth.
• Knowledge of privacy concerns related to user data.
June 21, 2023