Data Platform Engineer

Job Title: Data Platform Engineer

Job Location: Hershey, PA

This position is eligible for 100% remote work.

Summary: 

The Enterprise Data organization drives value for Hershey by providing high-quality, well-governed data to the Enterprise for analytics and decision-making.

As part of a team of highly skilled technologists, the Data Platform Engineer will build monitoring tools that derive actionable insights for informed decision-making about Hershey’s data platforms. This role will partner with engineering and DevOps teams to develop enhancements and recommendations that optimize data pipelines, jobs, tools, and platforms for cost and performance. The role also offers the opportunity to work with the larger team on innovative data technologies and proofs of concept (PoCs) that exercise new ideas and influence the future direction of our data stack to continue driving business value.

This role is part of the Platform & Operations team, whose focus is to enable Hershey’s business partners to work with data efficiently, securely, and responsibly. In addition, the Data Platform Engineer will be responsible for the management and administration of mixed data platform environments, utilizing tools such as Snowflake, Teradata, Databricks and Informatica.

Major Duties/Responsibilities: 

Platform Budget and Monitoring: 

  • Develop and deliver high-quality reports for monitoring platform performance and cost, and identify opportunities for optimization.

Platform Maintenance:

  • Manage platform upgrades, review product releases for enhancements and additional functionality, and serve as the point of contact for support teams and technical issues.

Platform Domain:

  • Collaborate with IT and business partners to define, manage, and deliver innovative Data Platform solutions to drive growth and adoption of capabilities at Hershey.

Platform Advocacy:

  • Evangelize future data platform solutions identified by Enterprise Data leadership, including innovations such as metadata management, data security and governance, cloud-based systems for data storage, multi-environment integration, and automation of data tasks and movement.

Specific Job Responsibilities:

  • Manage SLAs, performance, and optimization opportunities for the cloud platforms
  • Leverage industry-standard KPIs to measure the performance of data platforms and processes
  • Collaborate with data governance to ensure adherence and adoption of data policies
  • Create, maintain, and monitor dashboards and KPIs for cloud environments (e.g., Azure, Teradata, Snowflake) utilizing various reporting platforms
  • Monitor and centralize multi-cloud budget, expenses, and user activity to provide recommendations on cost avoidance and optimization of cloud environments
  • Collaborate with technical engineers to operationalize data solutions and ensure automation through the data value chain
  • Ensure relevant data and analytics are available to meet business needs by continuously consolidating and modernizing existing solutions and platforms
  • Think strategically and with holistic vision, focusing on identifying existing manual processes to automate in order to drive key business performance
  • Champion platform standards, tooling, and processes
  • Develop and/or maintain relationships with third-party vendors for appropriate solutions
  • Routinely communicate complex cloud reports and technical concepts to users, clients, development teams, and management
  • Oversee incidents and enhancement requests handled by our managed services provider

Minimum knowledge, skills and abilities required to successfully perform major duties/responsibilities:

  • Ability to manage multiple priorities, meet deadlines and produce quality results under pressure
  • Demonstrated leadership and managerial skills
  • Strong problem solving and analytical skills
  • Strong team player, change agent, and advocate
  • Excellent customer service skills
  • High energy self-starter
  • Excellent verbal and written communication skills
  • Working knowledge of agile frameworks

Minimum Education and Experience Requirements: 

Education:        

  • Bachelor’s degree in a STEM field
  • Master’s degree and/or related equivalent experience preferred

Experience:       

  • Experience working with data, much of it focused on cross-functional teams and enterprise-wide data management programs
  • 2+ years’ experience with public and private cloud solutions (e.g., Azure, GCP, AWS)
  • 1+ years’ experience with Snowflake, including best practices, development, monitoring, reporting
  • Advanced working knowledge and experience with relational/non-relational databases (e.g., Teradata, Snowflake, Databricks, or Azure Data solutions)
  • Experience building data visualizations or analytics (e.g., Power BI, Tableau, SSRS)
  • Experience leading a project team or project function to deliver an enterprise data solution, application and/or ERP solution
  • Experience building, configuring and consuming APIs
  • Experience with SAP S/4 is a plus

Nice To Have:

  • Experience leveraging data integration tools to build data pipelines and microservices (e.g., Informatica, Talend, Matillion)
  • Experience working in a high-performing agile delivery model, aligning with Scrum Masters, Product Owners, and other data execution team members to deliver rapid and impactful solutions that align with business partner strategy
  • Excellent problem-solving skills; able to help triage operational issues and proactively work to eliminate repetitive or manual tasks by leveraging scripting and off-the-shelf tools

#LI-CW1
