Senior Data Engineer

This job is no longer open

What is Calendly?

Calendly takes the work out of scheduling so our customers have more time to work on what’s really important. Our software is used by millions of people worldwide—with hundreds more signing up each day. To maintain this exciting growth, we’re looking for top talent to join our team and help shape the future of our product and data usage at Calendly. 

Why join Calendly’s Data Engineering team? 

Calendly is looking for a Senior Data Engineer to join our fast-growing team. This role will reside on the Data Engineering (DE) team with a focus on building our Data Platform. This platform, built on Google Cloud Platform (GCP), ensures our company has access to timely, accurate, relevant, and actionable insights to grow the business efficiently.

Our ideal candidate will develop and maintain data architectures that align with business requirements. They will work closely with our product, sales, and marketing teams to support and implement high-quality, data-driven decisions while ensuring data accuracy and consistency. The right person for the job will have experience developing data pipelines built for both current and future needs, a keen eye for reliability and performance, and a team-player mindset. This individual should have a hunger and passion for helping the business by asking the right questions and delivering pipelines that push it forward.

What are some of the high-impact opportunities you’ll tackle? 

  • Build a high-volume data platform linking all aspects of Calendly’s business
  • Ensure proper, secure, and scalable instrumentation of data models in partnership with Product Engineering
  • Manage performance, integrity, and security of data transfer and storage
  • Identify process improvements for data intake, validation, mining, and engineering as well as modeling, visualization, and communication deliverables
  • Work with stakeholders to design and deliver assets necessary for their objectives and key results. These deliverables will span both Business Intelligence and customer-facing reporting
  • Ensure timeliness, accuracy, and relevance of data and deliverables 

This opportunity is for you if you have/are:

  • 5–7 years of experience designing and maintaining data pipelines, data models, and reporting systems
  • Strong programming skills in imperative and declarative languages, especially Python and SQL
  • Strong data modeling background and cloud data warehouse experience
  • Experience with big data tools such as Google BigQuery
  • Experience with batch and streaming pipelines, e.g., Apache Spark and Apache Beam
  • Experience with orchestration and messaging frameworks such as Airflow and Pub/Sub
  • Authorized to work lawfully in the United States of America as Calendly does not engage in immigration sponsorship at this time

If you are an individual with a disability and would like to request a reasonable accommodation as part of the application or recruiting process, please contact us at recruiting@calendly.com.

Calendly is registered as an employer in many, but not all, states. If you are not located in or able to work from a state where Calendly is registered, you will not be eligible for employment.

