Data Engineer (L3)

This job is no longer open

See yourself at Twilio

Join the team as our next Data Engineer (L3)

Who we are & why we’re hiring

Twilio powers real-time business communications and data solutions that help companies and developers worldwide build better applications and customer experiences.

Although we're headquartered in San Francisco, we have a presence throughout South America, Europe, Asia, and Australia. We're on a journey to becoming a globally anti-racist, anti-oppressive, anti-bias company. At Twilio, we support diversity, equity & inclusion wherever we do business. We employ thousands of Twilions worldwide, and we're looking for more builders, creators, and visionaries to help fuel our growth momentum.

About the job

This position is needed to design and build data pipelines and services in cloud environments that fuel strategic business decisions across Twilio. You will partner with other engineers and product managers to translate data needs into scalable data platforms and self-service tools. We are looking for someone who is passionate about solving problems with engineering and data, thrives in an evolving environment, brings an enthusiastic and collaborative attitude, and delights in making a difference. As a successful candidate, you must have a deep background in data engineering and a proven track record of solving data problems at scale with distributed data systems. You are a self-starter, have a growth mindset, and can collaborate effectively across the entire Twilio organization.

Responsibilities

In this role, you’ll:

  • Design, develop, and maintain scalable data pipelines and data warehouses to support continuing increases in data volume and complexity.
  • Establish standard processes for analytics, data modeling, and data production.
  • Collaborate with Data Architects and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering educated decision-making across the organization.
  • Implement processes and systems to manage data quality, ensuring production data is always accurate and available for key partners and business processes that depend on it.
  • Write unit/integration tests, contribute to the engineering wiki, and document your work.
  • Perform the data analysis required to troubleshoot and resolve data-related issues.
  • Work closely with a team of frontend and backend engineers, product managers, and analysts.
  • Design data integrations and a data quality framework (see the sketch after this list).
  • Prepare accurate database design and architecture reports for management and business teams.
  • Execute the migration of data from legacy systems to new solutions.
  • Manage system performance by running regular tests, solving problems, and integrating new features.
  • Recommend solutions to improve new and existing database systems.
  • Offer support by responding to system problems in a timely manner.
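
Because the data quality and integration bullets above are open-ended, here is one minimal sketch of what a pipeline-level quality gate can look like. It is an illustration only, not Twilio's actual framework: the table names, the order_id column, and the 1% threshold are all hypothetical, and PySpark is just one of the technologies named in this posting.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq_checks").getOrCreate()

    # Hypothetical staging table produced by an upstream pipeline.
    df = spark.table("staging.orders")

    total = df.count()
    null_ids = df.filter(F.col("order_id").isNull()).count()

    # Gate publication on basic invariants so bad data never reaches the
    # partners and business processes that depend on it.
    if total == 0:
        raise ValueError("staging.orders is empty; refusing to publish")
    if null_ids / total > 0.01:
        raise ValueError("order_id null rate exceeds 1%; refusing to publish")

    # Promote to production only after every check passes.
    df.write.mode("overwrite").saveAsTable("prod.orders")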

Qualifications

Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having the "desired" qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required:

  • 5+ years of diverse experience building data pipelines and data lakes/warehouses
  • Experience working with distributed analytics platforms such as Hive and Presto
  • Experience working with data processing technologies such as Kafka and Spark (see the sketch after this list)
  • Expertise in SQL
  • Expert coding skills across a number of languages, including Scala, Java, and Python
  • Experience processing various file formats such as Parquet, Avro, and JSON
  • Experience with AWS services such as EC2, S3, and Redshift, and with Snowflake
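
To make the required stack concrete, the sketch below reads raw Parquet events from S3 with Spark, expresses the rollup in SQL, and writes a partitioned result back for downstream consumers. It is illustrative only: the bucket, paths, and schema are hypothetical, and it assumes a Spark deployment with the Hadoop S3 connector (the s3a:// scheme) configured.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("daily_events_rollup").getOrCreate()

    # Hypothetical raw event data landed in S3 as Parquet files.
    events = spark.read.parquet("s3a://example-bucket/raw/events/")
    events.createOrReplaceTempView("events")

    # Warehouse logic is often easiest to express and review as SQL.
    daily = spark.sql("""
        SELECT account_id,
               DATE(event_ts) AS event_date,
               COUNT(*)       AS event_count
        FROM events
        GROUP BY account_id, DATE(event_ts)
    """)

    # Partitioned Parquet output that downstream tools (e.g. Redshift
    # Spectrum or Snowflake external tables) can query directly.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3a://example-bucket/curated/daily_events/"
    )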

Desired:

  • Experience with DAG-based open-source workflow management solutions such as Apache Airflow (see the sketch below)
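
As a minimal illustration of the DAG-based orchestration mentioned above, here is a sketch using the Apache Airflow 2.x API; the DAG id, schedule, and task bodies are hypothetical placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull data from a hypothetical source system.
        pass

    def load():
        # Placeholder: load the extracted data into the warehouse.
        pass

    with DAG(
        dag_id="daily_events_rollup",
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # The >> operator declares the dependency edge: load runs only
        # after extract succeeds.
        extract_task >> load_task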


Location 

This role will be remote within the US, but candidates are not eligible to be hired in San Francisco, CA; Oakland, CA; San Jose, CA; or the surrounding areas.

What We Offer

There are many benefits to working at Twilio. In addition to competitive pay, we offer generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.

Twilio thinks big. Do you?

We like to solve problems, take initiative, and pitch in when needed, and we're always up for trying new things. That's why we seek out colleagues who embody our values (something we call Twilio Magic). Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts.

So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now!

If this role isn't what you're looking for, please consider other open positions.

*Please note this role is open to candidates outside of Colorado as well. The information below is provided for those hired in Colorado only.

*If you are a Colorado applicant:

  • The estimated pay range for this role, based in Colorado, is $132,000.00-$165,000.00

The successful candidate's starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location within the state. This role is also eligible to participate in Twilio's equity plan and for the following benefits: health care insurance, 401(k) retirement account, paid sick time, paid personal time off, and paid parental leave.
