Senior Data Engineer, Tech Lead

WEX


This job is no longer open

We are seeking an experienced Data Engineer to play a critical role in the development of WEX’s data and analytics capabilities. You will be part of an organization focused on the development and delivery of data solutions and capabilities for WEX’s data platform. The successful candidate thinks big about data, is technically proficient, and enjoys working in a fast-paced environment.

Must-Haves:

Exceptional critical thinking, analytical reasoning, and problem-solving skills. Strong people skills with a gift for relationship building and mentorship.

 In addition, you:

  • Are self-motivated, work independently with little or no supervision, and can lead others

  • Bring thought leadership to your area of responsibility and enjoy staying ahead in your field

General Responsibilities

 You’ll be part of a team that is responsible for:

  • Solutions architecting new data platform features to ensure future success and sustainability

  • Creating optimized data pipelines using Qlik Replicate, Fivetran, custom Python code, and Snowpipe

  • Working with stakeholders to understand business requirements, then implementing SQL-first transformation workflows to deploy analytics code using dbt (data build tool)

  • Designing efficient data marts tailored to the needs of specific business units, functions, or departments
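The data-mart bullet above refers to dimensional (star-schema) design, which the Technical Skills section also asks for. As a minimal sketch, with hypothetical table and column names and SQLite standing in for Snowflake:

```python
import sqlite3

# Toy star schema for a hypothetical sales data mart:
# one dimension table plus one fact table keyed to it.
# (Table names and data are illustrative, not WEX's schema.)
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        segment       TEXT
    );
    CREATE TABLE fact_orders (
        order_id     INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        order_total  REAL
    );
""")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                 [(1, "Acme", "fleet"), (2, "Globex", "travel")])
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)",
                 [(10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)])

# A mart-style rollup: revenue by customer segment.
rows = conn.execute("""
    SELECT d.segment, SUM(f.order_total)
    FROM fact_orders f
    JOIN dim_customer d USING (customer_key)
    GROUP BY d.segment
    ORDER BY d.segment
""").fetchall()
print(rows)  # [('fleet', 350.0), ('travel', 75.0)]
```

In practice each business unit would get its own conformed set of dimensions and facts; the point of the shape above is that facts stay narrow and numeric while descriptive attributes live in the dimensions.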

You’ll be part of a collaborative scrum team consisting of an Agility Engineer, a Technical Program Manager, Data Engineers, QA, and DevOps. You’ll also be supported by a manager who is here to listen and help you grow.

Technical Skills

Our core stack for this position consists of dbt, Snowflake, Python, Airflow, Docker, Qlik Replicate, Fivetran, and several AWS services. You possess the following skills and experiences:

 

  • Solid understanding of dbt to perform data transformation tasks (3+ years)

  • Strong understanding of data design principles and dimensional data modeling (2+ years)

  • Advanced SQL skills (3+ years) and understanding of query optimization strategies in Snowflake

  • Fundamental understanding of DAGs and operators in Airflow, with 1+ years of hands-on experience with action and sensor operators

  • Fundamental understanding of Docker and deploying code as a container

  • Solid understanding of basic programming concepts in Python or similar modern languages (2+ years)

  • Solid understanding of Qlik Replicate and Enterprise Manager with 1+ years of experience managing endpoints and streaming CDC (change data capture) tasks

  • Solid understanding of Fivetran with 1+ years of experience ingesting files and SaaS (e.g. Salesforce, Workday) data sources

  • Fundamental understanding of the following AWS services:

    • Elastic Compute Cloud (EC2) - performance monitoring & optimization and knowledge of instance types

    • Virtual Private Cloud (VPC) - provisioning of virtual networks and their basic components (security groups, network access control lists, route tables, subnets, etc.)

    • Elastic Map Reduce (EMR) - managing and running PySpark code in a distributed processing framework, including troubleshooting and cluster optimization

    • S3, IAM, Secrets Manager, AWS CLI 

Note: Qlik Replicate and Fivetran experiences can be substituted with ETL/ELT data pipeline development experience using Python (or similar tools).
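For the Python substitution mentioned in the note, a minimal extract-transform-load sketch might look like the following. The field names and reject-row handling are illustrative assumptions, with SQLite standing in for a warehouse target:

```python
import csv
import io
import sqlite3

# Extract: read raw records (here from an in-memory CSV; in practice
# this might be an S3 object or a vendor API response).
raw = "order_id,amount\n10,250.00\n11,not_a_number\n12,75.50\n"
reader = csv.DictReader(io.StringIO(raw))

# Transform: validate and coerce types, routing bad rows to a reject list
# instead of failing the whole batch.
clean, rejected = [], []
for row in reader:
    try:
        clean.append((int(row["order_id"]), float(row["amount"])))
    except ValueError:
        rejected.append(row)

# Load: write validated rows to a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)
loaded = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(loaded, len(rejected))  # 2 1
```

A tool like Qlik Replicate or Fivetran handles the extract/load steps (including CDC and schema drift) as managed infrastructure; the custom-code path trades that convenience for full control over validation and routing, as in the reject list above.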

This position requires solutions architecting new data platform features. It would be nice if you:

  • Have implemented CI/CD using Jenkins or GitHub Actions in a production environment

  • Are familiar with DevOps practices and procedures

  • Are certified as an AWS Solutions Architect and/or Snowflake SnowPro

  • Have implemented disaster recovery solutions for system platforms
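For the CI/CD item above, a GitHub Actions workflow that builds a dbt project on pull requests might be sketched as follows. The job layout, package choice (dbt-snowflake), target name, and secret names are assumptions for illustration, not a known WEX configuration:

```yaml
# Hypothetical CI workflow: build dbt models on every pull request.
name: dbt-ci
on:
  pull_request:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake
      - run: dbt deps && dbt build --target ci
        env:
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```

Running `dbt build` against a disposable CI target catches broken models and failing tests before merge, which is the usual motivation for wiring dbt into CI.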

Minimum Qualifications

8+ years of experience as a Data Engineer creating data pipelines using tools and custom code

3+ years of experience with dbt (or similar tool) developing complex data models using macros and Jinja

2+ years of experience with Snowflake and its advanced features

2+ years of experience with cloud service providers such as AWS (preferred), GCP, or Azure, with a solutions architecting background

BS in a technical or quantitative field; or you can make us feel intensely confident that you don’t need one.

The base pay range represents the anticipated low and high end of the pay range for this position. Actual pay rates will vary and will be based on various factors, such as your qualifications, skills, competencies, and proficiency for the role. Base pay is one component of WEX’s total compensation package. Most sales positions are eligible for commission under the terms of an applicable plan. Non-sales roles are typically eligible for a quarterly or annual bonus based on their role and applicable plan. WEX’s comprehensive and market-competitive benefits are designed to support your personal and professional well-being. Benefits include health, dental, and vision insurance, a retirement savings plan, paid time off, a health savings account, flexible spending accounts, life insurance, disability insurance, tuition reimbursement, and more. For more information, check out the "About Us" section.

Salary Pay Range: $113,000.00 - $150,000.00

Outer Join is the premier job board for remote jobs in data science, analytics, and engineering.