We’re the Enterprise Architecture team at WEX, focusing on Enablement and Eventing. We improve existing solutions and build new ones to continuously modernize our products and technologies, reaching for the latest and most efficient tools to deliver the best products to our customers. We have more work than we can handle and we’re looking for great people to come along for the ride.
Our team works hard, we cover for one another, and we maintain a healthy work-life balance. We check our egos at the door and take pride in owning everything we do. We are all comfortable balancing the need to move fast with the realities of working in a highly regulated space like payments.
You will help us create canonical Kafka topics for the entire enterprise. Our eventing platform is growing, and we need a single source of truth built from multiple data sources across the company.
You will help us define standards for message content, serialization schemas, canonical topic naming conventions, and anything else that helps our eventing platform reach new levels of excellence. You will work closely with one or more of our divisions to understand how their data is structured.
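To give a flavor of what "canonical topic naming conventions" can look like, here is a minimal sketch in Java. The `<domain>.<entity>.<event-type>.v<version>` scheme and the example names are illustrative assumptions, not WEX’s actual convention:

```java
import java.util.regex.Pattern;

/**
 * Illustrative only: validates topic names against a hypothetical
 * <domain>.<entity>.<event-type>.v<version> convention,
 * e.g. "payments.transaction.created.v1".
 */
public final class TopicNameConvention {

    private static final Pattern CANONICAL = Pattern.compile(
            "^[a-z][a-z0-9-]*\\.[a-z][a-z0-9-]*\\.[a-z][a-z0-9-]*\\.v[0-9]+$");

    public static boolean isCanonical(String topic) {
        return CANONICAL.matcher(topic).matches();
    }

    public static void main(String[] args) {
        System.out.println(isCanonical("payments.transaction.created.v1")); // true
        System.out.println(isCanonical("MyRandomTopic"));                   // false
    }
}
```

Encoding a convention as an enforceable check like this (for example, gating topic creation in CI) is one way to keep naming drift out of a growing platform.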
You will work with our Kafka Engineering team to ensure our Eventing Platform meets the data needs of our customers. This involves estimating data storage, cluster size, and platform data growth; helping define new monitoring and observability metrics; and helping ensure optimal performance and high availability.
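To make the sizing work concrete, here is a back-of-the-envelope storage estimate. Every input (throughput, message size, retention, replication factor) is a made-up assumption for illustration:

```java
/** Back-of-the-envelope Kafka storage estimate; every input is an assumption. */
public final class ClusterSizingSketch {
    public static void main(String[] args) {
        double messagesPerSecond = 10_000;  // assumed average throughput
        double avgMessageBytes   = 1_024;   // assumed average message size
        int retentionDays        = 7;       // assumed topic retention
        int replicationFactor    = 3;       // assumed replication factor

        double bytesPerDay = messagesPerSecond * avgMessageBytes * 86_400;
        double totalBytes  = bytesPerDay * retentionDays * replicationFactor;

        // 10k msg/s x 1 KiB x 7 days x RF 3 comes to roughly 16.9 TiB
        // on disk, before compression.
        System.out.printf("Estimated on-disk footprint: %.1f TiB%n",
                totalBytes / Math.pow(1024, 4));
    }
}
```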
You will learn, research, and prototype modern tools for our teams. You will create proofs of concept (PoCs) and share your findings with a broader audience. As a result, you will constantly learn new eventing technologies, processes, and tools.
You will help design permanent data storage structures for our events in Snowflake or other data warehousing solutions, working closely with WEX’s Snowflake team.
You will help us enrich our events with meaningful information from various topics and data sources, working with Apache Flink or similar tools.
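Below is a minimal sketch of what such an enrichment job can look like with Flink’s DataStream API. The broker address, topic name, record format, and in-memory lookup table are all hypothetical; a production job would typically source reference data from another topic (for example, via broadcast state) or an external store:

```java
import java.util.Map;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

/** Minimal sketch: enrich raw Kafka events with reference data in Flink. */
public final class EnrichmentJobSketch {

    // Stand-in for reference data; a real job would read this from another
    // topic (e.g. via broadcast state) or an external store.
    private static final Map<String, String> MERCHANT_NAMES =
            Map.of("m-001", "Acme Fuel", "m-002", "Globex Fleet");

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")           // assumed broker
                .setTopics("payments.transaction.created.v1")    // hypothetical topic
                .setGroupId("enrichment-poc")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "raw-events")
           // Toy enrichment: append a merchant name to "merchantId,amount" records.
           .map((MapFunction<String, String>) line -> {
               String merchantId = line.split(",")[0];
               return line + "," + MERCHANT_NAMES.getOrDefault(merchantId, "unknown");
           })
           .print();

        env.execute("enrichment-poc");
    }
}
```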
You will help provide architectural blueprints, prototype solutions, analyze data flows, and create the necessary specs to roll out various solutions.
You’re open-minded, with strong soft skills: you relate, collaborate, and communicate well with a diverse audience.
You love to learn and code! This role changes rapidly: technology evolves quickly, and we want to continuously provide the best options for our customers.
You’re passionate about data and technology and love to learn and try new things.
You’re creative and feel comfortable with constant changes in the industry.
5+ years of development experience with at least one major programming language.
3+ years of experience designing and architecting data solutions. Data warehousing experience is required.
Hands-on experience with at least one major RDBMS and NoSQL data store.
Working knowledge of enterprise data warehousing solutions. You know how to optimize and troubleshoot large data stores.
Experience in delivering solutions in the cloud, preferably AWS. You’ve worked with major AWS components such as EC2, S3, and SQS. Equivalent experience with GCP or Azure is also acceptable.
Academic degree in Computer Science or equivalent field.
You know how to code and deliver containerized solutions with Docker.
Experience with AI/ML.
Experience with Apache Flink or similar tools.
Familiarity with GitHub and GitHub Actions or equivalent.
Familiarity with Terraform or an equivalent tool.
You’ve delivered projects with strong failover capabilities, including multi-region or even multi-cloud support.
Good understanding of security-related concepts and best practices, such as OWASP, SSO, ACLs, TLS, tokenization, etc.
You’ve delivered solutions subject to PCI-DSS and/or HIPAA requirements and participated in data and process audits.
AWS, Azure
Kafka
Python, Java, C#, others
Docker
Kubernetes
Argo CD
OpsLevel
GitHub, GitHub Actions
Terraform
Anything that moves us forward