We’re the Enterprise Architecture team at WEX, focused on Enablement and Eventing. We are improving existing solutions and building new ones to continuously modernize our products and technologies, and we reach for the latest and most efficient tools to deliver the best products to our customers. We have more work than we can handle, and we’re looking for great people to come along for the ride.
Our team works hard, we cover for one another, and we maintain a healthy work-life balance. We own our results and take pride in everything we do (check your ego at the door!). We are all comfortable balancing the need to move fast with the realities of working in a highly regulated space like payments.
About the role
You will help us create canonical Kafka topics for the entire enterprise. Our eventing platform is growing and we need a single source of truth from multiple data sources within our company.
You will help us define standards for message content, serialization schemas, canonical topic naming conventions, and anything else that helps our eventing platform reach new levels of excellence. You will work closely with one or more of our divisions to understand how their data is structured.
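To give a flavor of that standards work, here is a purely illustrative Python sketch of one possible canonical topic naming convention and a matching Avro-style value schema. The domain, event names, and fields are hypothetical examples, not WEX's actual standards.

```python
# Purely illustrative: one possible canonical topic naming convention and a
# matching Avro-style value schema. All names and fields are hypothetical.

def canonical_topic(domain: str, entity: str, event: str, version: int) -> str:
    """Build a topic name such as 'payments.transaction.authorized.v1'."""
    return f"{domain}.{entity}.{event}.v{version}"

# Hypothetical Avro schema for the topic's value payload.
TRANSACTION_AUTHORIZED_V1 = {
    "type": "record",
    "name": "TransactionAuthorized",
    "namespace": "com.example.payments.events",
    "fields": [
        {"name": "transaction_id", "type": "string"},
        {"name": "account_id", "type": "string"},
        {"name": "amount_cents", "type": "long"},
        {"name": "currency", "type": "string"},
        {"name": "occurred_at",
         "type": {"type": "long", "logicalType": "timestamp-millis"}},
    ],
}

if __name__ == "__main__":
    print(canonical_topic("payments", "transaction", "authorized", 1))
```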
You will work with our Kafka Engineering team to ensure our Eventing Platform meets the data needs of our customers. This involves estimating data storage, cluster size, and platform data growth, helping define new monitoring and observability metrics, and helping ensure optimal performance and high availability.
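As a hint of what that sizing work involves, here is a back-of-the-envelope Python sketch; every number in it is an assumption chosen for illustration, and real estimates would come from measured message rates together with the Kafka Engineering team.

```python
# Back-of-the-envelope Kafka storage sizing; all inputs are assumed values.
messages_per_second = 5_000    # assumed average ingest rate across topics
avg_message_bytes = 1_024      # assumed serialized message size
replication_factor = 3         # typical replication for durability
retention_days = 7             # assumed topic retention

daily_bytes = messages_per_second * avg_message_bytes * 86_400
retained_bytes = daily_bytes * retention_days * replication_factor

print(f"~{retained_bytes / 1e12:.2f} TB of broker storage "
      f"for {retention_days} days of retention")
```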
You will learn, research, and prototype modern tools for our teams. You will create proofs of concept (PoCs) and share your findings with a broader audience. As a result, you will constantly learn new eventing technologies, processes, and tools.
You will help create and design permanent data storage structures for our events in Snowflake or other data warehousing solutions. You will work with Snowflake and WEX’s Snowflake team.
You will help us enrich our events with meaningful information from various topics and data sources, working with Apache Flink or similar solutions.
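For a feel of what that enrichment looks like, here is a minimal PyFlink Table API sketch that joins a hypothetical event stream with a hypothetical reference table. The fields and values are made up, and a production job would read from Kafka topics rather than in-memory rows.

```python
# Minimal enrichment sketch with PyFlink's Table API. The event and account
# data are hypothetical in-memory rows; a real job would use Kafka sources.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

events = t_env.from_elements(
    [(1, "card_auth", "acct-100"), (2, "card_auth", "acct-200")],
    ["event_id", "event_type", "account_id"],
)
accounts = t_env.from_elements(
    [("acct-100", "Fleet"), ("acct-200", "Health")],
    ["account_id", "line_of_business"],
)
t_env.create_temporary_view("events", events)
t_env.create_temporary_view("accounts", accounts)

# Enrich each event with the account's line of business.
enriched = t_env.sql_query("""
    SELECT e.event_id, e.event_type, a.line_of_business
    FROM events AS e
    JOIN accounts AS a ON e.account_id = a.account_id
""")
enriched.execute().print()
```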
You will help provide architectural blueprints, prototype solutions, analyze data flows, and create the necessary specs to roll out various solutions.
You’re open-minded and have very strong soft skills to relate, collaborate, and communicate well with a diverse audience.
You love to learn and code! This role changes rapidly: technology evolves quickly, and we want to continuously provide the best options for our customers.
Qualifications
Must have
You’re passionate about data and technology and love to learn and try new things.
You’re creative and feel comfortable with constant changes in the industry.
2+ years of development experience with at least one major programming language (Java, C#, Python, or Golang).
2+ years of designing and architecting data solutions. Hands-on data warehousing, data lake, and/or data pipeline experience is required.
Hands-on experience with at least one major RDBMS and NoSQL data store. You know how to optimize and troubleshoot large data stores.
Experience in delivering solutions in the cloud, preferably AWS. You’ve worked with major AWS components such as EC2, S3, SQS, etc. Equivalent experience with GCP or Azure is also acceptable.
It would be nice if you have
Academic degree in Computer Science or equivalent field.
You know how to code and deliver containerized solutions with Docker.
Experience with AI/ML.
Experience with Apache Flink or similar tools.
Familiarity with GitHub and GitHub Actions or equivalent.
Familiarity with Terraform or equivalent tool.
Delivered projects with strong failover capabilities, with multi-region or even multi-cloud support.
Good understanding of security-related concepts and best practices, such as OWASP, SSO, ACLs, TLS, tokenization, etc.
You delivered solutions with PCI-DSS and/or HIPAA requirements. You participated in data and process audits.
Some technologies we use and teach
AWS, Azure
Apache Kafka
Apache Flink
Python, Java, C#, others
Docker
Kubernetes
Argo CD
OpsLevel
GitHub, GitHub Actions
Terraform
Anything that moves us forward
The base pay range represents the anticipated low and high end of the pay range for this position. Actual pay rates will vary and will be based on various factors, such as your qualifications, skills, competencies, and proficiency for the role. Base pay is one component of WEX's total compensation package. Most sales positions are eligible for commission under the terms of an applicable plan. Non-sales roles are typically eligible for a quarterly or annual bonus based on their role and applicable plan. WEX's comprehensive and market-competitive benefits are designed to support your personal and professional well-being. Benefits include health, dental, and vision insurance, a retirement savings plan, paid time off, a health savings account, flexible spending accounts, life insurance, disability insurance, tuition reimbursement, and more. For more information, check out the "About Us" section.
Salary Pay Range: $86,000.00 - $114,000.00