Sardine helps neobanks and crypto wallets with fraud indemnification and advanced risk scoring. We're rapidly growing but still a small team. You'll be an essential part of building the next phase of the company.
You will:
Build scalable and reliable data infrastructure that combines data from multiple sources
Build serving-layer storage that supports online data processing, in collaboration with other engineers
Implement automation for machine learning model training, in collaboration with data scientists
Formulate business problems as technical data problems, in collaboration with product management, while ensuring key business drivers are captured
Develop ETL/ELT data pipelines that are easy to maintain and monitor
Build a data warehouse solution for BI reporting
We’re looking for someone who has:
3+ years of experience with large-scale data projects
Expertise in designing complex data models and data engineering solutions
Ability to write complex SQL for processing raw data, data transformations, and data validations
Experience with the GCP technology stack, including Dataflow, BigQuery, Bigtable, and Pub/Sub
Nice to haves:
BS/MS in Computer Science or equivalent
Experience with data integration tools (Informatica, Python, Spark, etc.), analytics tools (Metabase, Tableau, Mode, Looker, etc.), and scheduling tools (Airflow)
GCP Data Engineer certification
Technologies we use include:
Bigtable
Postgres
GCP Dataflow (Apache Beam)
BigQuery
Elasticsearch
Metabase
TypeScript (React + Node)
Golang
Python
Ruby
Benefits
Remote-first
Flexible PTO
Health care and 401(k)
Work from home stipend
7-year post-termination option exercise window (vs. the standard 90 days)