Data Platform Engineer

Job Requisition ID # 22WD60316

Remote Work

Position Overview

We’re growing our Cloud (Data) Management team within the Enterprise IT organization. This team leverages Dev/DataOps principles to provide foundations and support for Autodesk’s AWS development communities. Our engineering culture will empower you to make effective decisions, work collaboratively, and take accountability for engineering projects at the core of the company and at the leading edge of industry technology trends.

The team is seeking a Site Reliability Engineer within our Data & Analytics Platform Support team, part of the Enterprise Systems & Experience (ESE) group, to support our Amazon Web Services (AWS) environments. The ideal candidate is comfortable wearing multiple hats: diplomat, coach, reporter, translator, firefighter.

Responsibilities

  • Build efficient and scalable data pipelines using Airflow to enable a data-driven business

  • Work as part of a multi-disciplinary squad to establish our next generation of data pipelines and tools

  • Create, monitor, and support reliable data pipelines, and manage their SLAs

  • Apply knowledge of SQL, dimensional modeling, and analytical data warehouses such as Snowflake

  • Develop software hands-on in a programming language such as Python, including standard data processing and API calls

  • Contribute to the development and education plans on data engineering capabilities, systems, standards, and processes

  • Run and troubleshoot Apache Airflow pipelines, including monitoring logs for errors

  • Be involved from the inception of projects: understand requirements, design and develop solutions, and incorporate them into the designs of our platforms

  • Build and maintain strong relationships between Business, Engineering and our DataOps teams

  • Design and build automation solutions to reduce manual efforts and increase team efficiency

  • Provide visibility into metrics for the AWS/Azure management, operations, and support processes

  • Field data management requests and provide support for the AWS and Data & BI platform user communities

  • Troubleshoot and resolve issues in the AWS environments

  • Interact with the internal AWS user communities to understand requests/issues, establish clear expectations, and provide effective communication throughout the support process

  • Work with other members of the globally based Data Management team to ensure efficient resolution of requests

  • Ensure cloud services are used within compliance guidelines

  • Write technical documentation

Minimum Qualifications

  • Bachelor’s Degree or College Diploma in Computer Science, Information Systems, or equivalent experience

  • Demonstrable knowledge of AWS infrastructure as code (IaC) and DataOps expertise

  • Comfortable performing requirements analysis, interfacing with stakeholders at various levels, and documenting solutions

  • Able to articulate technical topics to non-technical audiences in writing, in diagrams, and in person

  • Energetic team player who works well across boundaries, readily adapts to change, and enjoys rapid development

  • Able to manage daily support activities to ensure completion of operational requests within agreed Service Level Objectives

  • Able to think and act with an automation mindset when solving problems

  • A focus on action, a shipping mindset, and iterative deployment practices

  • Strong written and oral communication skills

  • Required technical skills include:

      • Airflow or Oozie

      • AWS

      • Spark

      • SQL

      • Python

      • CI/CD (Jenkins, GitHub, etc.)

      • ETL tools such as dbt, AWS Glue, etc.

Preferred Qualifications

  • 2+ years of hands-on experience with AWS (CloudFormation, EC2, Lambda, S3, DynamoDB, RDS, VPC, Route 53, CloudWatch, CloudTrail, and IAM)

  • 1+ years of development/automation scripting experience (with tools such as Python, Ruby, Node.js, Go, REST APIs, etc.)

  • Exposure to data exploration and analytics tools such as Power BI, Looker, and Tableau

  • Experience with DevOps, serverless, containers (Docker, Kubernetes, ECS), and CI/CD

  • Experience with databases or datastores, e.g., RDS, DynamoDB, Redis, Redshift, Snowflake, S3

  • Experience with monitoring tools such as Dynatrace, Datadog, and Pepperdata

  • Literacy with Agile practices and tools (such as Jira, Aha, Kanban, etc.)

  • AWS Developer Certification is desirable


At Autodesk, we're building a diverse workplace and an inclusive culture to give more people the chance to imagine, design, and make a better world. Autodesk is proud to be an equal opportunity employer and considers all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender, gender identity, national origin, disability, veteran status or any other legally protected characteristic. We also consider for employment all qualified applicants regardless of criminal histories, consistent with applicable law.

Are you an existing contractor or consultant with Autodesk? Please search for open jobs and apply internally (not on this external site). If you have any questions or require support, contact Autodesk Careers.
