What we’re building and why we’re building it.
Fetch is a build-first technology company creating a rewards program to power the world. Over the last 5 years we’ve grown from 0 to 7M active users and taken over the rewards game in the US with our free app. The foundation has been laid. In the next 5 years we will become a global platform that completely transforms how people connect with brands.
It all comes down to two core beliefs. First, that people deserve to be rewarded when they create value. If a third party directly benefits from an action you take or data you provide, you should be rewarded for it. And not just the “you get to use our product!” cop-out. We’re talkin’ real, explicit value. Fetch points, perhaps.
Second, we believe brands need a better and more direct connection with what matters most to them: their customers. Brands need to understand what people are doing and have a direct line to do something about it. Not just advertise, but ACT. Sounds nice, right?
That’s why we’re building the world’s rewards platform. A closed-loop, standardized rewards layer across all consumer behavior that will lead to happier shoppers and stronger brands.
Fetch Rewards is an equal employment opportunity employer.
In this role, you can expect to:
- Manage large datasets, applying SQL best practices for OLAP/OLTP query and database performance.
- Architect data models consumed by dashboards, data analysts, and other downstream clients.
- Leverage dbt (Data Build Tool), Snowflake, CI/CD, testing, Git, and other engineering best practices to create data assets for end users.
- Create and maintain alerting and testing around data availability and quality.
- Generate innovative approaches to performance tuning on datasets spanning millions of daily active users and terabytes of data.
- Convert business requirements into clean, reliable data assets.
- Communicate findings both verbally and in writing to a broad range of stakeholders.
- Perform administrative duties for Snowflake, Tableau, and dbt infrastructure.
- Lead the charge on data documentation and data discovery initiatives.
You are a good fit if you:
- Are exceptional with SQL; data and business analysts come to you for help with query writing and performance tuning.
- Excel at, and enjoy, clearly communicating about data with internal and external customers.
- Have hands-on experience building ETL or ELT processes that power data warehouses and business intelligence tools.
- Have worked with an orchestration framework to coordinate complex data pipelines and workflows.
- Have experience with relational (SQL), non-relational (NoSQL), and/or object data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB).
- Are highly motivated to work autonomously, with the ability to manage personal work streams.
- Are interested in building and experimenting with different tools and tech, and in sharing your learnings with the broader organization.
- Love dogs! ... Or at least tolerate them. We're a very canine-friendly workplace!
You have an edge if you:
- Have developed or worked with a real-time OLAP database (e.g., Pinot, Druid, ClickHouse).
- Have experience programmatically deploying cloud resources with Terraform, Ansible, or AWS CDK.
- Have successfully implemented data quality, data governance, or disaster recovery initiatives.
- Are proficient in at least one scripting language (e.g., Python, Bash).
#BI-remote
#LI-remote