Fleet payments • Healthcare payments • Travel payments • Virtual payments • Corporate payment solutions
5001 - 10000
💰 $310M Post-IPO Debt on 2020-06
October 31
• Work with the Kafka as a Service (KaaS) Engineering team to configure and run Kafka at the enterprise level.
• Create canonical Kafka topics for the entire enterprise.
• Define standards for message content, serialization schemas, and canonical topic naming conventions.
• Collaborate with divisions to understand their data structure.
• Ensure the Eventing Platform meets customer data needs.
• Estimate data storage and platform growth, assist with monitoring metrics, and ensure performance.
• Develop tools and pipelines to increase deployment speed using GitOps principles.
• Learn, research, and prototype new tools for teams.
• Create Proof of Concepts (PoCs) and share findings.
• Enrich events with meaningful information from data sources, using Apache Flink or similar solutions.
• Provide architectural blueprints and roll out solutions.
• Maintain exemplary standards for modern end-to-end software development.
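The "canonical topic naming conventions" responsibility above could take many forms; as a purely illustrative sketch (the listing does not specify the actual convention), here is a validator for a hypothetical `<domain>.<entity>.<event-type>.v<version>` scheme:

```python
import re

# Hypothetical convention for illustration only: the real standard would be
# defined by the team, per the responsibilities above.
TOPIC_PATTERN = re.compile(
    r"^[a-z][a-z0-9-]*"   # domain, e.g. "payments"
    r"\.[a-z][a-z0-9-]*"  # entity, e.g. "invoice"
    r"\.[a-z][a-z0-9-]*"  # event type, e.g. "created"
    r"\.v[0-9]+$"         # schema version, e.g. "v1"
)

def is_canonical_topic(name: str) -> bool:
    """Return True if the topic name matches the (hypothetical) convention."""
    return TOPIC_PATTERN.match(name) is not None
```

Under this scheme, `is_canonical_topic("payments.invoice.created.v1")` passes while a free-form name like `"InvoiceCreated"` does not; a check like this would typically run in CI or in a GitOps pipeline that provisions topics.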
• Solid development experience with at least one major programming language (Java, C#, Python, or Golang).
• Proven success in delivering software features through all phases of the SDLC and building, testing, and deploying using DevOps principles.
• Good experience designing and implementing data solutions.
• Hands-on data warehousing, data lake, and/or data pipeline experience required.
• Hands-on experience with at least one major RDBMS and NoSQL data store.
• Knowledge of optimizing and troubleshooting large data stores.
• Working knowledge of designing and building data pipelines that meet business SLAs.
• Experience delivering solutions in the cloud, preferably AWS.
• Familiarity with major AWS components such as EC2, S3, SQS, etc.
• Equivalent experience with GCP or Azure is also acceptable.
• Academic degree in Computer Science or an equivalent field is nice to have.
• Ability to code and deliver containerized solutions with Docker.
• Experience with AI/ML.
• Experience with Apache Flink or similar tools.
• Familiarity with GitHub and GitHub Actions or equivalent.
• Familiarity with Terraform or an equivalent tool.
• Experience delivering projects with strong failover capabilities, with multi-region or multi-cloud support.
• Good understanding of security-related concepts and best practices, such as OWASP, SSO, ACLs, TLS, tokenization, etc.
• Experience delivering solutions with PCI-DSS and/or HIPAA requirements.
• Participation in data and process audits.
• Comprehensive, market-competitive benefits designed to support personal and professional well-being.
October 31
51 - 200
Senior automation engineer at Consensus, ensuring software quality through testing and automation.
October 31
51 - 200
Lead engineering for a platform transforming Instagram into a sales channel.
October 30
51 - 200
Develop tools for analysts at dbt Labs with an emphasis on user experience.
October 29
501 - 1000
Software Engineer for EDB’s database infrastructure team maintaining Postgres variants.