Firebolt

Tel Aviv
51-200 employees
Firebolt is a complete redesign of the cloud data warehouse for the era of cloud and data lakes. Data warehousing with extreme speed & elasticity at scale.

Python Data Engineer

This job is no longer open

Description

Who we are

Firebolt is a major shake-up to the cloud data warehouse industry. We have built the fastest, most scalable, and most hardware-efficient cloud data warehouse on the market. By far. We've proven incredible market fit, turned the most data-forward companies from amazing design partners into paying customers, raised $37M in our recent Series A, and are poised to grow quickly to help companies do with big data what they can't with Snowflake, Redshift, and the rest.


About the tech stack

Firebolt is composed of several open-source projects combined with unique IP that accelerates data analytics and enables full scalability by decoupling compute from storage.

Terraform is part of the application itself: we dynamically provision AWS accounts, network infrastructure, and workloads in response to client activity. The infrastructure is managed as code with Terraform, Kops, and Helm, and services are monitored with Prometheus, Thanos, Grafana, and Loki. CI/CD is handled by a combination of CircleCI and ArgoCD, GitOps-style, to test and deploy code to production.

We’re pushing to run everything on Kubernetes, both stateless and stateful workloads. We love CRDs and operators, and we develop our own.

Our SQL core teams work in C++. Our backend teams work in Go, Python, and Rust, building gRPC microservices that expose REST APIs and a GraphQL interface. We use CockroachDB, FoundationDB, Temporal, and Kafka as application infrastructure. Our frontend teams work with TypeScript, React, and Redux + Apollo.


About the job to be done:

  • Take a key role on our ecosystem team
  • Create and maintain high-scale data pipelines
  • Create connectors and integrations with ELT tools
  • Build public SDKs
  • Integrate with 3rd-party products
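To give a flavor of the pipeline work described above, here is a minimal, illustrative extract-transform-load step in Python. All names (`fetch_events`, the record shape, the in-memory sink) are hypothetical stand-ins, not Firebolt APIs; a real pipeline would pull from a 3rd-party API and load into the warehouse.

```python
from datetime import datetime

def fetch_events():
    """Extract: stand-in for a call to a hypothetical 3rd-party API."""
    return [
        {"id": 1, "ts": "2021-01-01T00:00:00Z", "value": "10"},
        {"id": 2, "ts": "2021-01-01T01:00:00Z", "value": "not-a-number"},
    ]

def transform(raw):
    """Transform: normalize types and drop rows that fail validation."""
    clean = []
    for row in raw:
        try:
            clean.append({
                "id": int(row["id"]),
                "ts": datetime.fromisoformat(row["ts"].replace("Z", "+00:00")),
                "value": int(row["value"]),
            })
        except (KeyError, ValueError):
            continue  # in production, route bad rows to a dead-letter queue
    return clean

def load(rows, sink):
    """Load: append clean rows to a sink (here, just a list)."""
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(fetch_events()), sink)
```

In a production setting each step would typically be an Airflow task or dbt model rather than plain functions, but the extract/validate/load shape is the same.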

Requirements

  • BS/BA in Computer Science or an equivalent technical field
  • At least 3 years of experience developing data pipelines
  • Experience in custom ETL design, implementation, and maintenance
  • Experience with ETL/ELT tools (e.g., Airflow, dbt, Fivetran)
  • Python knowledge
  • Familiarity with at least one of the following: Java/Scala/C#
  • 3+ years experience in the data warehouse space
  • 3+ years of hands-on experience writing and optimizing sophisticated, highly efficient SQL queries
  • Experience integrating with 3rd party APIs
  • Communication skills including the ability to identify and communicate data-driven insights

Outer Join is the premier job board for remote jobs in data science, analytics, and engineering.