Software Engineer: Analytics

About Bevy

Bevy is an early-stage startup with a mission to help brands build strong global communities. Founded in April 2017 by the core team behind Startup Grind, Bevy is an enterprise-grade SaaS platform used by companies including Adobe, Amazon, Asana, Atlassian, eBay, Epic Games, IDEO, Intuit, MongoDB, Red Bull, Roblox, Salesforce, SAP, Slack, and many more. In April 2019, Bevy acquired CMX, the world’s largest network of community professionals. CMX offers world-class training, events, and research.

The Role

As a software engineer focused on data and analytics, you will be embedded in Bevy's data analytics and insights team. You will be deeply involved in the development of analytics-focused product features to help make increasingly advanced data insights available to event organizers and community organizations.

Responsibilities

  • Create and maintain testable ETL pipelines using GCP services and Python
  • Optimize data ingestion, storage, and processing architecture to meet product, business, and performance needs
  • Care deeply about data quality, privacy, and security
  • Support data scientists and product engineers
  • Proactively support continuous learning and software engineering projects
  • Support data/analytics application development and R&D efforts

What we’re looking for:

  • A solid technical background: at least five years of professional software development experience and a track record of caring about software engineering practices and of solving increasingly challenging problems
  • Comfort with Python/pandas, JavaScript/React.js, and the shell command line, as well as the standard data analytics packages for Python and/or JavaScript
  • A record of interesting data science/analytics projects you can discuss, ideally with some available in open-source repositories
  • A Bachelor’s degree, or substantial coursework, in Computer Science, Software Engineering, or a similar field
  • Familiarity with Hadoop ecosystem tools
  • Experience building product data pipelines on GCP using Pub/Sub, Dataflow, BigQuery, and Python
  • Experience working on distributed development teams, or better yet, thriving on them. You don’t need a structured office environment or a manager who checks in on you once a day, and you know you do your best work from your home office.
  • Excellent communication skills. On our small team, English is the official language, and you need to be able to articulate complex ideas efficiently and effectively. When people do not share an office, it is essential to pay extra attention to communication and to speak up.

We welcome candidates from traditionally underrepresented groups to apply. We are proud to foster a workplace free from discrimination. We strongly believe that diversity of experience, perspectives, and background will lead to a better environment for our employees and a better product for our users and the communities we serve.

Our Team 

We are a small but powerful team, dedicated to achieving our mission to bring more community to the world. Many of us have worked in community roles before and understand the struggles and the high points that come with them. Our team communicates candidly, giving feedback early and often. We set ambitious goals and do what it takes to achieve them, while making sure we take care of our personal health and mental wellbeing. We want you to be ready to take on a lot of responsibility, with guidance and mentorship along the way.