Data Engineer II

This job is no longer open

PagerDuty is growing and we are looking for an experienced Data Engineer for our Technical Operations team to manage and contribute to the software and services that we provide to our users. As a Data Engineer at PagerDuty, you will help lead the team responsible for designing, building, deploying, and supporting solutions for teams across PagerDuty's growing global user base. You are scrappy, independent, and excited about having a big impact on a small but growing team.

Together with the other members of the Data Platform team, you will have the opportunity to re-define how PagerDuty designs, builds, integrates, and maintains a growing set of software and SaaS solutions. In this role, you will work cross-functionally with business domain experts, analytics, and engineering teams to re-design and re-implement our Data Warehouse model(s).

How You Contribute to Our Vision: Key Responsibilities 

  • Translate business requirements into data models that are easy to understand and used by different disciplines across the company. Design, implement, and build pipelines that deliver data with measurable quality within SLAs 
  • Partner with business domain experts, data analysts and engineering teams to build foundational data sets that are trusted, well understood, aligned with business strategy and enable self-service 
  • Be a champion of the overall strategy for data governance, security, privacy, quality and retention that will satisfy business policies and requirements 
  • Own and document foundational company metrics with a clear definition and data lineage 
  • Identify, document and promote best practices

About You: Skills and Attributes 

  • You will design, implement and scale data pipelines that transform billions of records into actionable data models that enable data insights.
  • You will help lead initiatives to formalize data governance and management practices, rationalize our information lifecycle and key company metrics. 
  • You will provide mentorship and hands-on technical support to build trusted and reliable domain-specific datasets and metrics. 
  • You will have deep technical skills and be comfortable contributing to a nascent data ecosystem and building a strong data foundation for the company. 
  • You will be a self-starter, detail and quality oriented, and passionate about having a huge impact at PagerDuty.

Minimum Requirements 

  • Bachelor's degree in Computer Science, Engineering or related field, or equivalent training, fellowship, or work experience 
  • 5+ years of experience working in data architecture, data pipelines, data modeling, master data management, and metadata management
  • Experience designing, deploying, and leading end-to-end data platforms in cloud-based and Agile environments
  • Strong experience scaling and optimizing schemas, and writing and performance-tuning complex SQL and data pipelines in OLTP, OLAP, and data warehouse environments
  • Advanced knowledge of relational databases and the ability to write complex SQL
  • Experience with cloud-based data warehousing platforms such as Snowflake or Amazon Redshift
  • Experience with data pipeline and workflow tools such as Airflow, including upgrades and administration tasks; experience with Amazon Managed Workflows for Apache Airflow (MWAA) is a plus
  • Proficiency with one or more object-oriented and/or functional programming languages, such as Python or Scala (including PySpark)
  • Hands-on experience with big data technologies such as Spark (Databricks preferred)
  • Experience with AWS services such as S3, SQS, Lambda, and Athena
  • Experience with ETL tools such as Fivetran, Segment, or MuleSoft
  • Knowledge of at least one data visualization tool, such as Tableau, Periscope, or Looker
  • A consistent track record of close collaboration with business partners and crafting data solutions to meet their needs 
  • Excellent written and verbal communication and interpersonal skills, able to effectively collaborate with technical and business partners
  • Ability to quickly pick up any new technology as needed, and a strong team player

Preferred Qualifications 

  • Experience with real-time data processing and machine learning frameworks
  • Familiarity with infrastructure as code (e.g., Terraform) is a plus