About The Position
We are hiring ETL Data Engineers with strong experience across the entire Cloud Data stack. The ideal candidate will have extensive experience in data pipelines (ELT/ETL), data warehousing and dimensional modeling, and curation of data sets for Data Scientists and Business Intelligence users. This candidate will also have excellent problem-solving skills when working with large volumes of data.
Responsibilities
- Building scalable Cloud data solutions using MPP Data Warehouses (Snowflake, Redshift, or Azure Synapse Analytics), data storage (S3 or Azure Blob Storage), and analytics platforms (e.g. Spark, Databricks)
- Creation of data pipelines and transformations (ELT – Informatica, Matillion, Fivetran, dbt, Talend, etc.)
- Creating data integrations with scripting languages such as Python, including serverless functions such as AWS Lambda
- Writing complex SQL queries, stored procedures, etc.
Requirements
Must haves:
- Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field; commensurate work experience will be considered in lieu of a degree
- Experience building scalable Cloud data solutions using MPP Data Warehouses (Snowflake, Redshift, or Azure Synapse Analytics), data storage (S3 or Azure Blob Storage), and analytics platforms (e.g. Spark, Databricks)
- 5+ years of experience with complex SQL queries and scripting
- 3+ years’ experience with Azure and/or AWS Cloud
- 3+ years developing and deploying scalable enterprise data solutions (Enterprise Data Warehouses, Data Marts, ETL/ELT workloads, etc.)
- 3+ years supporting business intelligence and analytics projects
- Good understanding of code repositories such as Git
- Excellent written and oral communication skills
Pluses:
- Experience with Power BI or Tableau
- Experience with Snowflake