Own the design and optimization of the data infrastructure.
Collaborate with product and engineering to advocate best practices and build the systems and infrastructure that support the company's data needs.
Create reliable data pipelines and associated documentation for source of truth tables.
Implement data governance and error monitoring, ideally using an orchestration framework like Airflow.
Unlock self-serve analytics by building out comprehensive data models for different teams.
You’ll Be a Good Fit If:
You have 5+ years of experience in a data engineering role building products, ideally in a high-growth, fast-paced environment.
You have experience with Snowflake, Fivetran, dbt, SQL, Airflow, and LookML.
You have strong foundations in Python, data modeling, and system design; Spark experience is a nice-to-have.
You have excellent communication and collaboration skills, and can work effectively with cross-functional teams to achieve common goals.
You are proactive and self-motivated, and can work independently to identify and prioritize opportunities to improve our data pipelines and infrastructure.
Please upload your resume in PDF format.
The salary range for this role is $175,000–$210,000.