Senior Data Engineer

This job is no longer open

Why this role

As a tech-enabled disruptor in the vacation rental industry, Evolve relies on data as the backbone of how it is fundamentally changing the way guests interact with vacation rentals and homeowners maximize rental income. Evolve's Business Intelligence team enables the company to innovate and make data-informed decisions at scale by ensuring data is a trusted and valued asset, available to all levels of the organization for innovation, visualization, analytics, reporting, data hygiene, and data science.

The Senior Data Engineer is a SQL, Python, AWS, data warehouse, and data lake expert who is passionate about building, supporting, and optimizing the data pipelines, structures, and transformations that enable mission-critical workflows for our reporting, analytics, business operations, and data science teams. The Senior Data Engineer is a technical leadership role on the team and will play a key part in establishing best practices, defining standards, and mentoring teammates. This team and role are critical to Evolve's success and help position us as an innovator and thought leader in the vacation rental space.

What you’ll do

  • Own the availability and performance of our Snowflake data lakehouse environment
  • Monitor Snowflake compute and storage use, optimizing costs and ensuring we don’t exceed budgetary or contractual limits
  • Mentor BI Developers, Data Engineers, and Analytics Engineers to aid in their growth and ensure best practices are followed
  • Own production data ingestion and transformation processes, ensuring the successful, on-time execution of all pipelines
  • Develop, own, and ensure adherence to quality standards for ETL/ELT, Python, Prefect, Snowflake, Fivetran, Matillion, and AWS, as well as documenting these standards and training teammates on them
  • Research, recommend, and implement new and enhanced tools and methods that support Evolve’s data pipelines and processes for data ingestion, storage, transformation, automation, and hygiene
  • Serve as the data engineering subject matter expert on cross-functional teams developing scalable solutions for new and modified data sources
  • Lead the design, development, and optimization of data pipelines using tools like Fivetran, Matillion, Prefect, and Python to move data between Snowflake, SaaS APIs, and other data stores
  • Design, modify, and implement data structures in Snowflake to support data ingestion, integration, and analytics
  • Curate and transform data into appropriate structures for analytics and data science using SQL, Python, Snowflake scripting, and data transformation tools like Matillion and dbt
  • Design and implement processes to automate monitoring and alerting on source data quality, data ingestion and transformation processes, and the overall health of our data infrastructure

What makes you a great fit

  • 5+ years of relevant data warehouse and data lake engineering experience
  • Expertise in data pipeline design and automation
  • Advanced SQL and AWS expertise
  • Advanced expertise in one or more cloud data platforms such as Snowflake, Redshift, or Azure SQL is required, with Snowflake highly preferred
  • Experience with advanced ETL/ELT tool administration (code deployment, security, setup, and configuration)
  • Experience with enterprise ETL/ELT tools like Fivetran, dbt, Matillion, or similar
  • Advanced expertise coding in Python including advanced data structures and libraries
  • Drive to mentor and guide data engineers, BI developers, and data analysts
  • Ambition to design and implement code from scratch and build new data infrastructure
  • Enjoy supporting team members and helping solve problems that arise
  • Enjoy a connected, collegial environment even though we are remote, hybrid, and on-site
  • Expertise with documenting data definitions and code
  • Experience deploying applications in Docker
  • Experience with a cloud-based BI platform like Sisense, Tableau, Qlik, or Power BI
  • Experience with Airflow or Prefect
  • Experience with AWS Fargate and Lambda

Location 

Evolve has a flexible working environment, so teammates can work remotely anywhere in the state of Colorado, in our beautiful downtown Denver office, or a hybrid of both! As we grow, we are working toward opening remote opportunities across the entire U.S. We are currently able to hire across the U.S. except in the following locations: California, District of Columbia, Hawaii, New Jersey, New Mexico, and Pennsylvania.

Compensation

For this role, our salary range is $136,000 - $156,000, depending on relevant experience. This role will also be eligible for a variable annual bonus based on both company and individual performance.

In addition to the base salary, you will receive:

Equity: 50% of salary equivalent RSUs
Variable comp: 10% of salary target

