Why this role
At Evolve, a tech-enabled disruptor in the vacation rental industry, data is the backbone of how we are fundamentally changing the way guests interact with vacation rentals and homeowners maximize rental income. Evolve’s Business Intelligence team enables the company to innovate and make data-informed decisions at scale by ensuring data is a trusted, valued asset available to all levels of the organization for visualization, analytics, reporting, data hygiene, and data science.
The Data Engineer is a SQL, data warehouse, and data lake expert who is passionate about building, supporting, and optimizing the data pipelines that power mission-critical workflows for our reporting, analytics, business operations, and data science teams. This team and role are critical to Evolve’s success and help position us as an innovator and thought leader in the vacation rental space.
What you’ll do
- Build, support, and optimize data pipelines using tools like Fivetran, Matillion, Prefect, and Python to move data to/from Snowflake, SaaS APIs, and other data stores.
- Design, modify, and implement data structures in Snowflake to support data ingestion, integration, and analytics.
- Curate and transform data into appropriate structures for analytics and data science purposes using SQL, Python, Snowflake scripting, and data transformation tools like Matillion and dbt.
- Design and implement processes to automate monitoring and alerting on source data quality, data ingestion and transformation processes, and the overall health of our data infrastructure.
- Research, recommend, and implement new and enhanced tools and methods that support Evolve’s data pipelines and processes for data ingestion, storage, transformation, automation, and hygiene.
- Manage the deployment and monitoring of scheduled data ingestion and transformation processes.
- Assist in defining quality standards for ETL/ELT, Python, Prefect, Snowflake, Fivetran, Matillion, and AWS, as well as documenting and training other teammates on these standards.
- Partner with Business Intelligence, Data Architecture, Data Science, Product, and other teams to develop scalable solutions for new and modified data sources.
What makes you a great fit
- 5+ years of relevant data warehouse and data lake engineering experience
- Experience with data pipeline design and automation
- Advanced SQL knowledge
- Experience with databases such as Oracle, Snowflake, Redshift, Azure SQL, or MongoDB is required, with Snowflake highly preferred
- Experience administering advanced ETL/ELT tools (code deployment, security, setup, and configuration)
- Experience with enterprise ETL/ELT tools such as Fivetran, dbt, Matillion, or similar
- Experience coding in Python
- Ambition to design and implement code from scratch and build new infrastructure
- Enjoy supporting team members and helping solve problems that arise
- Enjoy a connected, collegial environment even though we are remote, hybrid, and on-site
- Familiarity with documenting data definitions and code
- Driven by a fast-paced, energetic, results-oriented environment
- Experience deploying applications in Docker a plus
- Experience with a cloud-based BI platform and data warehouse a plus
- Experience with Airflow or Prefect a plus
- Experience with AWS Fargate and Lambda a plus
Evolve has a flexible working environment, so teammates can work remotely anywhere in the state of Colorado, in our beautiful downtown Denver office, or a hybrid of both!
For this role, our salary range is $109,000 - $125,000, depending on relevant experience.
Additional compensation details:
Equity: $25,000 of equivalent RSUs