• Design and implement scalable, reliable data pipelines suitable for ingesting, processing, and storing large datasets from different sources.
• Improve existing data infrastructure components and create new ones to better automate extraction, transformation, loading, and other data management processes.
• Develop tools to proactively measure, monitor, and improve data quality and consistency during loading and analysis.
• Define and promote best practices for data management and analysis, and build and improve the systems that implement and support them.
• Collaborate with cross-functional Moneytree teams from multidisciplinary science, engineering, and business backgrounds to enhance current business processes.
• Learn and understand a broad range of Moneytree data sources, and know when and how to use each, and which not to use.
• Maintain technical documentation and communicate results to diverse audiences through effective writing, visualisations, and presentations.
• Own meaningful parts of the service, have an impact, and grow with the company.

In the first 30/60/90 days, your expected deliverables include:
• Complete security and privacy training.
• Complete onboarding training with a mentor.
• Understand technical solutions from other teams.
• Become a strong team player and raise our overall quality.
• Identify and suggest several improvements to the data platform's current data pipeline.
• Contribute to the vision of the LINK Data team services, for example through code improvement, documentation, and production deployment.
• 2+ years of working experience in software development in Python (alternatively, Scala or Java).
• 2+ years of working experience in software development using JavaScript or TypeScript (alternatively, Ruby).
• 4+ years of relevant work experience with big data infrastructure systems in the cloud (AWS preferred), such as, but not limited to, distributed computing, Hadoop, Spark, Kafka, Presto, or other big data systems.
• Comfortable with Spark, Python, and cloud service deployment.
• A proven record of designing scalable data pipelines or ETL workflows.
• Expertise with some AWS services, especially AWS CloudFormation.
• Experience with Docker containers or similar technologies.
• Experience with CI/CD pipelines.
• Solid understanding of backend architectures and design patterns.
• Expertise with, and attention to, security best practices.
• Interest in building efficient pipelines for machine learning applications.
• The discipline to work independently and drive results.
• Business-level written and spoken English.
• Work remotely from anywhere in Japan
• Flexible working hours
• Employee stock option program participation
• Referral bonus of 250,000 JPY per successful hire
• Communication allowance (10,000 JPY/month)
• Remote work allowance (3,000 JPY/month)
• 20 days of annual paid leave
• 10 days of annual sick and carer leave
• Health and social insurance support
• Ability to work from overseas for short periods
• Learning support (7,500 JPY/quarter)