Senior Data Engineer

This job is no longer open

Kettle's mission is to balance risk in a changing climate. Kettle uses deep learning and proprietary algorithms to reshape the reinsurance industry and better protect people from the growing risks associated with climate change. Kettle's first product protects Californians' businesses, homes, and livelihoods with wildfire reinsurance. To learn more, visit ourkettle.com.

You probably have never heard of the $300bn reinsurance industry, but it is the single most important industry in the world for protecting ourselves and helping society recover from climate change disasters. This industry is failing due to a 3x increase in $1B+ crises caused by climate change. Kettle's aim is to use deep learning to ensure that people's lives are not destroyed when these events happen. By building the most sophisticated machine learning models to predict when and where wildfires happen, Kettle accurately prices the cost of covering wildfire-prone areas in California. We are a machine-learning-powered reinsurer: we sell reinsurance to insurers; we don't sell software to reinsurers.

Who we are/what we value:

  • Obsessive Fanatics - We obsess over big problems. We are fanatical about our business and mission. We drive the company towards the areas people describe as 'impossible to understand or stop'.
  • Initiative - Obsessive people don't need micromanagement.
  • Data Focused - With the world burning worse every year, we can't afford to let opinions and bias derail data-driven decisions.
  • Questioning Renegades - We work in a 600-year-old industry. If everyone is running in one direction, we generally sprint in the other.
  • Impact Driven - We are a mission-driven business.


The Role

We are looking for a passionate Senior Data Engineer to join a growing team of engineers, deep learning experts, and data scientists and model risk using the most advanced tools available.

We strongly encourage people from traditionally underrepresented populations in tech - such as women, People of Color, People with Disabilities, and LGBTQ+ people - to apply!

As a member of Kettle, you will:

  • Design and develop a cloud-based data architecture that is optimal for data science
  • Design and develop a research pipeline and support data scientists
  • Work with diverse data formats with varying data quality
  • Create the data science pipeline from start to finish, including ETL, model runs, and metrics tracking (see the sketch after this list)
  • Explore the data space to constantly improve the quality of data sources
  • Advise data scientists on optimal data engineering practices
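
As a rough illustration of the kind of end-to-end pipeline this role owns, here is a minimal sketch of an ETL, model run, and metrics-tracking flow. The function names, data path, and placeholder model below are hypothetical examples, not Kettle's actual stack or data.

```python
# Minimal ETL -> model run -> metrics sketch (hypothetical names and paths).
import json

import pandas as pd


def extract(path: str) -> pd.DataFrame:
    """Load raw feature records into a DataFrame."""
    return pd.read_csv(path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Clean and normalize the raw data before modeling."""
    df = df.dropna(subset=["risk_score"])           # drop incomplete rows
    df["risk_score"] = df["risk_score"].clip(0, 1)  # keep scores in [0, 1]
    return df


def run_model(df: pd.DataFrame) -> pd.Series:
    """Stand-in for a real model call (e.g. a hosted inference endpoint)."""
    return df["risk_score"] * 0.9  # placeholder prediction


def track_metrics(predictions: pd.Series) -> dict:
    """Record simple summary metrics for the run."""
    return {"count": int(predictions.size), "mean": float(predictions.mean())}


if __name__ == "__main__":
    raw = extract("data/wildfire_features.csv")  # hypothetical path
    clean = transform(raw)
    preds = run_model(clean)
    print(json.dumps(track_metrics(preds), indent=2))
```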

We are looking for a highly motivated, successful data engineer to lead a growing data engineering team. You should be a demonstrated self-starter with at least five years' experience in data analysis or data engineering, and the drive to keep our technology and models efficient, well-run systems that produce the most accurate predictions of risk in the world. Below we have listed some of the things we are looking for; you do not need to check every box to be eligible for this position.

Essential Experience of a Successful Candidate

  • Bachelor’s, Master’s, or PhD in computer science or related field
  • In-depth experience designing and developing data systems
  • In-depth experience with satellite image data
  • In-depth knowledge of Amazon Web Services
  • Experience building data science research pipelines
  • Some knowledge of Google Earth Engine data
  • Familiarity with machine learning
  • Excellent coding proficiency in Python and JavaScript
  • Self-starter with the ability to work in a fast-paced, rapidly evolving startup
  • Eagerness to learn new skills and help with the task at hand

Useful Experience

  • Coding proficiency in building serverless APIs
  • Familiarity with database design and architecture
  • Experience using SageMaker, TensorFlow, and other machine learning tools
  • Experience developing simulation pipelines
  • Coding proficiency in Go and C++

We offer a competitive package that is based on location and experience. We also offer the following benefits:

  • Stock: Ownership in a fast-growing venture-backed company.
  • 401k matching: We care about your ability to save for your future.
  • Family Focus: Parental leave and flexibility for families.
  • Time Off: Flexible vacation policy to encourage people to get out and see the world.
  • Healthcare: Platinum-level medical, dental, and vision policies.
  • Goodies: Whatever hardware and software you need to get the job done.
  • Team Fun: Regularly scheduled events, annual retreat, and celebrations.
  • Learning: Learning & Development Opportunities to grow your skills and career.
  • Great team: Working with fun, hard-working, kind people committed to making a difference!
  • Flexible culture: We are results-focused. We don’t work at the office every day.
  • ...And much more! Lots of other perks make this company an incredible place to work.