Software Engineer - Data Platform

This job is no longer open

See yourself at Twilio

Join the team as our next Software Engineer (Data Platform)

Who we are & why we’re hiring

Twilio powers real-time business communications and data solutions that help companies and developers worldwide build better applications and customer experiences.

Although we're headquartered in San Francisco, we have a presence throughout South America, Europe, Asia, and Australia. We're on a journey to becoming a globally anti-racist, anti-oppressive, anti-bias company that actively opposes all forms of oppression and bias. At Twilio, we support diversity, equity & inclusion wherever we do business. We employ thousands of Twilions worldwide, and we're looking for more builders, creators, and visionaries to help fuel our growth momentum.

About the job

This position exists to design and build data ingestion frameworks and services in cloud environments that fuel strategic business decisions across Twilio. You will partner with other engineers and product managers to translate data needs into requirements for scalable data platforms and self-service tools. We are looking for someone who is passionate about solving problems with engineering and data, thrives in an evolving environment, brings an enthusiastic and collaborative attitude, and delights in making a difference. As a successful candidate, you have a deep background in data engineering and a proven track record of solving data problems at scale with distributed data systems. You are a self-starter, embody a growth mindset, and can collaborate effectively across the entire Twilio organization.

Responsibilities

In this role, you’ll:

  • Design and implement a scalable data lake, including data integration and curation.
  • Build a modular set of data services using Python/Scala, Presto SQL, API Gateway, Kafka, and Apache Spark on EMR, among others.
  • Research, design, and experiment to execute fast proofs of concept for evaluating similar products.
  • Participate in the strategic development of methods, techniques, and evaluation criteria for projects and programs. This includes assessing build-vs-buy decisions at every stage, backed by proofs of concept, benchmarking, and similar evidence.
  • Work autonomously and take ownership of projects.
  • Create data applications over large volumes of data that support search, real-time data alerts, and APIs for pulling data.
  • Design and implement innovative data services using microservices and other UI- and API-related technologies.
  • Implement processes and systems to manage data quality, ensuring production data is always accurate and available for the key partners and business processes that depend on it.
  • Write unit/integration tests, contribute to the engineering wiki, and document your work.
  • Work closely with a team of frontend and backend engineers, product managers, and analysts.
  • Coach other engineers on best practices for designing and operating reliable systems at scale.
  • Design data integrations and a data quality framework.
  • Execute the migration of data and processes from legacy systems to new solutions.
  • Perform production support and deployment activities.
  • Manage system performance by running regular tests, solving problems, and integrating new features.
  • Offer support by responding to system problems in a timely manner.

Qualifications 

Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having “desired” qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

 

Required:

  • 5+ years of diverse experience building scalable real-time and batch data ingestion frameworks that populate a data lake.
  • Experience working with data technologies that power analytics (e.g. EMR, Apache Sqoop, Airflow, Hadoop, Hive, Spark, Presto, Cassandra, Kafka, Pinot, Flink, or similar technologies).
  • In-depth knowledge of a few of the production technologies we use, including AWS (CloudFormation, EC2), Docker, Terraform, Kubernetes, and Chef.
  • Ability to lead technical architecture discussions and help drive technical decisions within your team.
  • Expert understanding of design patterns and OOP concepts.
  • Experience with performance management, logging, and monitoring tools.
  • Familiarity with Spring Boot and related technologies.
  • Expert coding skills across a number of languages, including Scala, Java, and Python.
  • A systematic problem-solving approach, coupled with strong communication skills and a sense of ownership and drive.
  • Experience processing file formats such as Parquet, Avro, and JSON.
  • Experience with AWS EC2 and S3.

 

Desired:

  • Experience with DAG-based open-source workflow management solutions such as Apache Airflow.
  • Familiarity with low-latency database solutions such as Dremio, Pinot, and ClickHouse.
  • Strong SQL skills.


Location 

This role will be based in the United States (Remote).

What We Offer

There are many benefits to working at Twilio: in addition to competitive pay, we offer generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.

Twilio thinks big. Do you?

We like to solve problems, take initiative, and pitch in when needed, and we're always up for trying new things. That's why we seek out colleagues who embody our values, something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts.

 

So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now!

If this role isn't what you're looking for, please consider other open positions.

*Please note this role is open to candidates outside of Colorado as well. The information below is provided for those hired in Colorado only.

*If you are a Colorado applicant:

  • The estimated pay range for this role, based in Colorado, is $132,000 - $165,000.

Additionally, this role is eligible to participate in Twilio's equity plan and for the following benefits: health care insurance, 401(k) retirement account, paid sick time, paid personal time off, and paid parental leave.

The successful candidate’s starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location within the state.
