Senior Data Engineer

This job is no longer open

About the team

At Typeform, the Data Engineering team is making data our #1 asset, and we know how important that is: our product helps people collect information in the best way possible, through a user experience that is as human and conversational as it can be. Typeform’s data needs are growing, and we need new technical solutions to meet them.

About the role

We’re looking for a Senior Data Engineer embedded in the broader Engineering organization to support all departments at Typeform (Product, Marketing, Customer Success, and Software Engineering), providing them with the data they need to drive the business forward and push our product to the next level.

 

Things you will do

  • Be part of the team responsible for the near-real-time transfer of data from source systems to the Data Lake, between systems within Typeform, and to/from our external tools.
  • Collaborate in designing, engineering, developing, and delivering an information technology infrastructure supporting the entire Data Team.
  • Help develop a scalable data architecture with a team of data engineers who populate and maintain Typeform’s Data Lake.
  • Lead and support Typeform’s engineering efforts in designing, developing, and rolling out Data systems and services.
  • Partner with and learn from system architects, functional managers, and program managers to deliver mission-critical data wherever it is needed.
  • Help in building infrastructure and services that have an immediate impact on our business and customers.
  • Mentor and upskill junior team members.

 

What you already bring to the table:

  • 4+ years of experience in the Data Engineering field, with a proven track record of technical ability
  • Previous experience in using an event-driven architecture with Kafka
  • Excellent stakeholder management and communication skills 
  • Experience with Scala, Python, and SQL for data pipelines
  • Experience with modern cloud data warehouses (e.g., Amazon Redshift, Google BigQuery, Azure Synapse, or Snowflake)
  • Strong problem-solving skills 
  • Strong mentorship skills

 

Extra awesome:

  • Experience with Apache Spark (in both batch and streaming)
  • Experience with a job orchestrator (Airflow, Google Cloud Composer, Flyte, Prefect, Dagster)
  • Experience with dbt

Outer Join is the premier job board for remote jobs in data science, analytics, and engineering.