Mid-level / Senior Data Engineer (AWS/Python/Spark)

This job is no longer open

About the Role

Data is the driver for our future at CARS. We’re searching for a collaborative, analytical, and innovative engineer to build scalable and highly performant platforms, systems and tools to enable innovations with data.

Working within a dynamic, forward-thinking team environment, you will design, develop, and maintain mission-critical, highly visible Big Data pipelines in direct support of our business objectives. As a member of our Enterprise Data Team, you will also work in close partnership with teams from every part of the company, including Product Engineering, Data Science, Business Intelligence, and Digital Marketing.

If you are passionate about building large-scale systems and data-driven products, we want to hear from you.

Responsibilities Include:

  • Build data pipelines to ingest data from various source systems such as Kafka, databases, APIs, and flat files.
  • Lead or participate in development projects from end to end, including requirement gathering, design, development, deployment, and debugging.
  • Work closely with domain experts and stakeholders from across the company.
  • Develop Spark jobs to cleanse/enrich/process/aggregate large amounts of data.
  • Monitor and optimize pipelines, tuning Spark jobs for efficient performance with attention to cluster resources, execution time, and memory usage.
  • Identify and drive innovative improvements to our data ecosystem in areas such as data validation and meeting SLAs.
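As a rough illustration of the cleanse-and-aggregate work described above, here is a minimal sketch in plain Python. The record fields (`vehicle_id`, `make`, `price`) are hypothetical; in a production Spark job, the same logic would typically be expressed as DataFrame transformations such as `filter`, `withColumn`, and `groupBy(...).avg(...)`.

```python
from collections import defaultdict

def cleanse(records):
    """Drop records missing required fields and normalize text casing,
    analogous to a Spark filter() plus withColumn() step."""
    for r in records:
        if r.get("vehicle_id") and r.get("price") is not None:
            yield {**r, "make": r.get("make", "").strip().lower()}

def aggregate(records):
    """Average price per make, analogous to groupBy("make").avg("price")."""
    totals = defaultdict(lambda: [0.0, 0])
    for r in records:
        t = totals[r["make"]]
        t[0] += r["price"]
        t[1] += 1
    return {make: total / count for make, (total, count) in totals.items()}
```

For example, `aggregate(cleanse(raw_records))` yields a per-make average over only the records that survived cleansing.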

Required Skills & Experience:

  • Bachelor’s Degree in Computer Science, Mathematics, Engineering or related field/equivalent experience.
  • Software or Data Engineering | 3-6 years of designing and developing complex applications at enterprise scale, specifically in Python, Java, and/or Scala.
  • Big Data Ecosystem | 2+ years of hands-on, professional experience with tools such as Spark and Kafka.
  • AWS Cloud | 2+ years of professional experience in developing Big Data applications in the cloud, specifically AWS.
  • Experience with SQL, writing analytics queries and designing/optimizing databases.
  • Excellent communication and collaboration skills.
  • Experience with Agile development methodology.

Preferred:

  • Experience with Apache Airflow.
  • Comfortable working in a Linux environment and using IDEs such as IntelliJ.
  • Experience using Infrastructure as Code tools like Terraform and Packer.
  • Experience working with digital advertising data providers (Facebook, Bing, Google).
  • Experience with data warehouses and BI reporting tools (Snowflake, Redshift, Tableau).

Benefits & Perks*:

  • 18 days of paid time off, plus select paid holidays
  • Paid Volunteer Day & Paid Pet Wellness Day
  • Fully stocked kitchen and refrigerator
  • Robust Health Insurance Options: BCBS, Delta Dental, EyeMed
  • 401k plan with company match
  • Subsidized internet access for your home
  • Subsidized gym membership
  • Parental Leave
  • Life & Disability Insurance
  • Tuition Reimbursement

*Not a complete, detailed list. Benefits have terms and requirements before employees are eligible. 

#LI-REMOTE
