Sr. Data Platform Engineer

This job is no longer open

YOUR MISSION

We are looking for a Senior Software Engineer to help our data platform meet the challenges presented by the phenomenal growth of our business. You will work in the Data Platform team and collaborate closely with the Product, Analytics, and Data Science teams. You will understand their use cases and establish SLAs that give confidence those use cases can be supported.

The ideal candidate will have at least 5 years' experience on Big Data projects, with knowledge of data architectures, APIs, and the reliable delivery and transformation of data. Your track record will demonstrate the engineering skills to design and build complex projects, and the leadership skills to get them into production.

As a member of the Data Platform team, you will:

  • Develop and test data models and platform architectures to accommodate rates of data ingest, storage requirements, cost constraints, and SLAs.
  • Scale our data platform up to handle a 10x increase in data ingest, storage, and high availability requirements.
  • Ensure that the data platform is compliant with CCPA, GDPR, and other mandates from our compliance team.
  • Research established and emerging technologies to determine their relevance for the data platform.
  • Help create the platform, tools and APIs necessary to enable other teams to work with data.
  • Work closely with Product teams to explore the feasibility of experimental data-driven features, narrow down preliminary or unclear requirements, and build the tools and APIs necessary to support those features. A strong analytical mindset is a must.
  • Efficiently handle vast amounts of data from multiple sources and destinations, including relational and NoSQL databases as well as external systems, both in batch processing and real-time delivery.
  • Follow modern development best practices such as code reviews, unit testing, continuous integration, and agile methodology.
  • Work as part of a team. We value team players who share their knowledge and like collaborating with others.
  • Take full ownership of the solutions you build. This means analyzing requirements, building them, monitoring them, and troubleshooting them if problems arise.

YOUR PROFILE

  • Bachelor’s degree in CS or related field
  • Excellent communication and people skills, and the ability to work with multiple teams
  • Excellent system design skills
  • Experience designing and developing web services and REST APIs
  • Strong coding skills, preferably in Python
  • 5+ years of experience with AWS Redshift, Hadoop, NoSQL, and open-source data management technologies
  • 8+ years of experience implementing IT platforms in a highly technical and analytical role
  • 5+ years' experience implementing Data Lake/Hadoop platforms, including 3+ years of hands-on experience implementing and performance-tuning data warehouses

Ideally you have...

  • Advanced degree in CS or related field
  • AWS certification, e.g. AWS Solutions Architect, Developer, or SysOps Associate/Professional
  • Track record of building data platforms with AWS services in a variety of businesses, from large enterprises to start-ups
  • 2+ years' experience with Apache Spark
  • 2+ years' experience with Airflow, with an emphasis on developing ETL pipelines
  • 2+ years' experience with MongoDB
  • 2+ years' experience with Kafka
  • ETL development experience on large datasets (terabyte scale or larger)
  • Experience designing and building data platforms with Azure is a plus
  • Experience with data streaming technology, e.g. Flink or Spark, is a plus



Outer Join is the premier job board for remote jobs in data science, analytics, and engineering.