DataRobot

Boston
1,001-5,000 employees
Delivering a unified platform for all users, all data types, and all environments to accelerate delivery of AI to production for every organization.

Senior Data Infrastructure & Ops Engineer

This job is no longer open

Job Description:

About DataRobot

DataRobot is the leader in enterprise AI, delivering trusted AI technology and enablement services to global enterprises competing in today’s Intelligence Revolution. DataRobot’s enterprise AI platform democratizes data science with end-to-end automation for building, deploying, and managing machine learning models. This platform maximizes business value by delivering AI at scale and continuously optimizing performance over time. The company’s proven combination of cutting-edge software and world-class AI implementation, training, and support services empowers any organization – regardless of size, industry, or resources – to drive better business outcomes with AI.

You will be responsible for: 

  • Work with analytics leadership, the enterprise data architect, data engineers, and data analysts to understand data needs.

  • Provide technical leadership and drive the requirements, design, and delivery of scalable data infrastructure.

  • Design, build, and maintain highly efficient and reliable data pipelines to move data across platforms, including applications, databases, data warehouses, and BI tools.

  • Provide data engineers with guidance and design support around workflows and pipelines, conduct code reviews, and select the tools the team will use.

  • Enable and automate Role-Based Access Control (RBAC) for the enterprise data warehouse.

  • Drive day-to-day execution using an agile Scrum development process to maintain a consistent sprint velocity.

Requirements

  • 5+ years of hands-on development experience in custom ETL design, implementation, and maintenance.

  • 5+ years of Python development experience.

  • 5+ years of SQL experience.

  • Proven track record of architecting large-scale data pipelines and CI/CD workflows, with experience in technologies such as Airflow, dbt, Docker, and Terraform.

  • Experience implementing DataOps tools and software engineering best practices (e.g., Gitflow).

  • Experience working with cloud MPP analytics platforms (Snowflake, AWS Redshift, Azure Data Warehouse, or similar).

  • Experience building data pipeline monitoring and alerting tools that provide real-time visibility into scalability and performance metrics.

  • Experience with, or certification in, deploying AWS cloud infrastructure to host data platforms.

  • Experience with JIRA and GitHub.

  • Experience with data model design is a plus.

DataRobot is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. DataRobot is committed to working with and providing reasonable accommodations to applicants with physical and mental disabilities. Please see the United States Department of Labor’s EEO poster and EEO poster supplement for additional information.

All applicant data submitted is handled in accordance with our Applicant Privacy Policy.


Outer Join is the premier job board for remote jobs in data science, analytics, and engineering.