Analytics Engineer - Data

This job is no longer open

About the role:

Our Data & Analytics group is growing and now includes close to 30 multidisciplinary members! As we continue to expand and innovate, we are looking for an Analytics Engineer to join the Data Engineering team and work alongside Data Analysts and Data Scientists. As an Analytics Engineer, an integral part of the Data team, you will play a pivotal role in storing, modelling, and processing our data. You will also design ETL pipelines, ingest data from multiple sources, derive business insights, and enable Dashlaners to receive feedback on their work while improving the product and growing the business. The team has full ownership of the data platform, and we promote a multidisciplinary approach, giving team members the opportunity to bring their own expertise and to expand their knowledge on all things data related.

The role in numbers:

  • 1 Team: Data Engineering 
  • 2 Roles: 6 Data Engineers, 1 Analytics Engineer 
  • >10TB of raw data 

1 day:

  • 100M client events
  • ~50 data pipelines 
  • ~250 SQL transactions 

About our Stack:

  • Data acquisition: ECS, Kinesis, Firehose, Lambda, DMS, Airbyte
  • Scheduling/Orchestration: Airflow, dbt
  • Transform: SQL, Glue Jobs (Python and Spark)
  • Data lake: S3, Glue Crawlers, Athena
  • Data warehouse: Redshift
  • Dashboards: Tableau
  • Code versioning: GitLab
  • Preferred languages: SQL and Python
  • Infrastructure: Terraform, Ansible, Packer
  • Monitoring: CloudWatch

At Dashlane you will:

  • Develop data models and schemas that enable performant and intuitive analysis in a Data Warehouse setting
  • Build and run data pipelines on Airflow (Python) and dbt (SQL templating) to acquire, process, and deliver insights (see the sketch after this list)
  • Communicate and collaborate with stakeholders such as Analysts, Product Managers, and Data Scientists to gather model requirements and detailed pipeline designs
  • Facilitate and guide the definition and documentation of analytics application logs
  • Work in a fast-paced, distributed environment, collaborating with our New York, Paris, and Lisbon teams
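
To give a concrete flavour of the Airflow + dbt work described above, here is a minimal sketch of a daily pipeline. The DAG name, dbt project path, and schedule are illustrative assumptions, not Dashlane's actual setup:

    # Minimal sketch: a daily Airflow DAG that runs dbt models, then dbt tests.
    # The dag_id, project path, and schedule are illustrative assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_run",          # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models in the warehouse (e.g. Redshift)
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt",  # assumed project location
        )
        # Then run the dbt data tests against the freshly built models
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt",
        )
        dbt_run >> dbt_test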

We're looking for:

  • 3+ years of SQL experience with the ability to write complex queries
  • 3+ years of experience in translating business requirements to data pipelines
  • 2+ years of experience using ETL and Orchestration tools
  • 2+ years of experience working with large data sets, optimising storage and querying
  • 1+ year(s) of experience with a scripting language (preferably Python) 

The ideal candidate will also have:

  • A passion for sharing the value of data and communicating insights to a broad audience with varying levels of technical expertise
  • Experience with dashboarding tools like Tableau, whether working with them directly or indirectly
  • Experience with streaming on AWS: Kinesis or Kafka
  • Experience with serverless architectures, on AWS or elsewhere
  • Excellent English verbal/written communication and presentation skills

If you are passionate about data, security, and technology, looking to grow, and have what we are looking for, please reach out; we'd love to meet you.
