- 5-7 years of experience in Data Warehousing and Data Modeling, ETL/ELT, SQL, Python, Kafka, Spark, and Airflow
- Bachelor's Degree in Library Science, Information Systems, Finance, Information Technology or equivalent experience
- Any Public Cloud certification focused on data warehousing
- Any certification on specific database technology
- Strong understanding of Normalized/Dimensional model disciplines and similar data warehousing techniques.
- Strong experience with ETL/ELT concepts of data integration, consolidation, enrichment, and aggregation on petabyte-scale data sets.
- Expert in SQL and/or SQL-based languages, including performance tuning of SQL queries.
- Collaborate with Product and analytics teams on normalizing and aggregating large data sets based on business needs and requirements.
- Data auditing skills to verify data integrity, understand discrepancies and resolve them with the highest sense of urgency.
- Experience with cloud-based data warehouses, e.g., Snowflake, BigQuery, Synapse, Redshift, etc.
- Experience with message queuing, stream processing, and highly scalable "big data" stores (e.g., Kafka, Pub/Sub)
- Experience building data pipelines on modern public cloud platforms such as AWS, GCP, or Azure, or on cloud data services like Snowflake
- Ability to create supporting documentation, such as metadata, entity-relationship diagrams, business processes, and process flows.
- Familiarity with analytical/reporting solutions such as Qlik Sense or Power BI is a plus
- Proficient in Linux/Unix environments