Data Architect

This job is no longer open

We are building the next generation of data/analytics and machine learning-driven products and features for our B2B portfolio. We are looking for a Data Architect with deep experience across data/analytics platforms, infrastructure, pipelines, and modeling. You have a strong command of SQL and can traverse many domain-specific data models, and you have a working understanding of integrated data solutions and products that require low-latency, performant, and resilient back-end data platforms.

As an Architect, you will define and lead the implementation of our analytics data platforms and pipelines. This role requires a systems-design thinker who can traverse interconnected tiers (databases, data engineering/pipelines, multi-tenant low-latency distributed systems, event streaming, etc.) and define an optimal implementation strategy.

You may also lead, shape, and mentor a small team to support our growing analytics platform needs. You are pragmatic and creative, and can serve as a liaison between technical and product/functional teams to understand and solicit high-value requirements. Above all, you are a constant and resourceful learner who is quick to evaluate and apply technologies that complement the overall vision.

What You'll Do:

  • Define the end-to-end architecture and standards for our core data platforms and data engineering pipelines.
  • Drive the implementation of a Modern Data Stack to ensure code-driven version control, idempotent data pipelines (Airflow, etc.), and transform layers (DBT), as well as optimization of extract/load logic.
  • Work with application development teams to design a central metadata/middleware layer that reflects the complex needs of our business units and Order2Cash product families.
  • Directly assist with data analysis, design approach, and functional mapping of business needs to data models and pipelines.
  • Define standards for low-latency data pipelines and ETL to ensure they are resilient, fault tolerant, and within SLA.
  • Define conceptual, logical, and physical data/query models and abstractions between various OLTP/OLAP databases and front-end consumers.
  • Ensure solutions are designed for performance, scalability, and production stability.
  • Automate processes to optimize, alert, and ensure quality and consistency.
  • Define flexible standards and frameworks for cross-tier components (front-end UI/UX, middleware/API, etc.) while remaining keenly aware of enterprise security and compliance.
  • Assist with the definition and implementation of the next generation of our software in AWS Cloud, including technology assessment, migration strategy, and overall cost/benefit analysis.
  • Continuously research evolving (proprietary and open source) technologies and determine their viability to further improve our products.

What You'll Bring to the Team:

  • Bachelor’s degree, preferably in Computer Science, Engineering, or a similar discipline.
  • Hands-on experience both as an IC and leading distributed, cross-functional teams to achieve successful business outcomes.
  • 10+ years of foundational data engineering experience supporting production-grade enterprise systems.
  • 5+ years’ experience with performance tuning of data pipelines (ETL/ELT).
  • 3+ years’ experience with modern data engineering pipelines for large-scale databases, Python/Airflow, and event streaming (Kafka, Boomi, etc.). Experience or ability to integrate with DBT, Airbyte, and modern reverse-ETL solutions is a plus.
  • 5+ years’ experience managing and mentoring cross-functional teams, working with business stakeholders, and handling roadmap/project management.
  • 4+ years’ experience building highly scaled distributed solutions for high concurrency and performance.
  • 3+ years’ experience with SaaS product development and design in line with market adoption and impact.
  • 3+ years’ experience designing containerized and distributed solutions using Kubernetes/Docker and implementing DevOps.
  • Experience across SQL and NoSQL databases, and the ability to adapt via a fundamental understanding of SQL and data/application concepts.
  • Familiarity with Docker/Kubernetes, Linux, Git, Jenkins, and modern CI/CD pipelines.
  • Experience in finance/fintech is a plus.

SALARY RANGE BELOW FOR COLORADO-BASED APPLICANTS ONLY

Target Base Compensation: $140,000.00 - $160,000.00  (bonus and equity offered in addition to base compensation)

Please note that the compensation information that follows is a good faith estimate for Colorado-based hires only and is provided pursuant to the Colorado Equal Pay for Equal Work Act and Equal Pay Transparency Rules. Billtrust intends to offer the selected candidate base pay within this range, dependent on job-related, non-discriminatory factors such as experience and location. We encourage you to apply and speak with our Talent Acquisition team to learn more about the total compensation package.

Outer Join is the premier job board for remote jobs in data science, analytics, and engineering.