Data and Senior Data Engineer, Productivity Engineering

This job is no longer open
Xero is a beautiful, easy-to-use platform that helps small businesses and their accounting and bookkeeping advisors grow and thrive. 

At Xero, our purpose is to make life better for people in small business, their advisors, and communities around the world. This purpose sits at the centre of everything we do. We support our people to do the best work of their lives so that they can help small businesses succeed through better tools, information and connections. Because when they succeed they make a difference, and when millions of small businesses are making a difference, the world is a more beautiful place.

About the team 

We are on the lookout for multiple Data Engineers in Productivity Engineering, which is part of the wider Technology group. Productivity Engineering is an internal tooling team, focused on decreasing the cognitive load of our engineers to allow faster delivery of value to our small business and accounting customers. 

We plan to make world-class engineering at Xero easy and fun: automating repeatable tasks and providing engineering insights, reports and dashboards to support our DevOps transformation and drive productivity forward. 

Within this specific team, Engineering Insights, we aim to provide insights and reporting to our internal customers and stakeholders from data sets ingested into our internal data platform.

About the role

You’ll contribute to our cross-functional environment by working towards shared objectives, using modern principles and practices. As a Data Engineer, you will engineer large and complex data sets and build robust, reusable and optimised data pipelines. You will develop, administer and operate the data pipelines and data enrichment activities on an internal data platform. You will bring broad technical skills in developing and implementing data pipelines, as well as the data infrastructure that supports data transformation for analytics, business intelligence, machine learning models and other data projects.
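To make that concrete, here is a minimal, purely illustrative sketch of the kind of batch extract-transform-load step the role involves, written in Python against S3. The bucket names, object keys and column names are all hypothetical, and a real pipeline would add logging, retries and schema handling:

```python
import csv
import io

import boto3  # AWS SDK for Python; used here for S3 access

s3 = boto3.client("s3")

def extract(bucket: str, key: str) -> list[dict]:
    """Pull a raw CSV extract from S3 and parse it into rows."""
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(body)))

def transform(rows: list[dict]) -> list[dict]:
    """Apply a simple enrichment: normalise names and drop incomplete rows."""
    return [
        {**row, "team": row["team"].strip().lower()}
        for row in rows
        if row.get("team")  # basic data-quality gate
    ]

def load(rows: list[dict], bucket: str, key: str) -> None:
    """Write the transformed rows back to a curated S3 prefix."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    s3.put_object(Bucket=bucket, Key=key, Body=out.getvalue().encode("utf-8"))

if __name__ == "__main__":
    # Hypothetical bucket and key names, for illustration only.
    raw = extract("raw-engineering-data", "extracts/deploys.csv")
    load(transform(raw), "curated-engineering-data", "deploys/deploys.csv")
```

In practice much of this shaping would likely live in dbt models over Snowflake rather than hand-rolled Python, with the Python layer handling ingestion and orchestration.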


We would love to talk to you if you:

    • Are comfortable with, or keen to learn, most of the following technologies: SQL, Snowflake, dbt, Python, AWS, AWS Serverless (Lambda functions), SQS, SNS, S3, Glue, Athena, Terraform, Kubernetes, and CI/CD pipelines (e.g. TeamCity). A small Lambda sketch follows this list.

    • Implement data pipelines to:
      - extract data from multiple sources
      - transform data using stream, micro-batch or batch methods

    • Deploy data pipelines following DevOps principles and incorporating automated code & data testing.
    • Build highly scalable and resilient data pipelines
    • Develop and maintain ETL processes and processing schedules required to produce data sets
    • Provide support for data quality and data validation by analysing data and helping others understand what the data means
    • Build and manage robust cloud architectures to provide efficient processing and transformation of data.
    • Have familiarity with data integration tools (such as dbt) and data testing tools (for example dbt-expectations or Deequ; a minimal hand-rolled analogue follows this list)
    • Have some experience in data modelling (for example Data Vault) and in visualising data in ways that are most valuable to customers
    • Are experienced with orchestration tools (such as Apache Airflow or Prefect; a minimal DAG sketch follows this list)
    • Have hands-on experience using Terraform to provision and manage infrastructure (Infrastructure as Code).
    • Understand best practices for data storage formats, storage and processing, as well as software best practices, automated testing, and distributed systems.
    • Collaborate with the team to help create and deliver beautiful software to our customers
    • Contribute to discussions around technical, architectural and process improvements
    • Advocate for continuous improvement of systems and processes within the team, and across the organisation
    • Establish and maintain good working relationships both within your immediate team and the wider organisation
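
As flagged in the technology bullet above, here is a small, hypothetical AWS Lambda handler in Python that drains an SQS-triggered event into S3. The bucket name and key scheme are invented for illustration; a production handler would also deal with partial batch failures and dead-lettering:

```python
import json

import boto3

s3 = boto3.client("s3")

LANDING_BUCKET = "engineering-insights-landing"  # hypothetical bucket name

def handler(event, context):
    """Lambda entry point for an SQS event source mapping.

    Each SQS record body (assumed to be JSON) is written to S3 as a raw
    object, keyed by message ID so retries overwrite rather than duplicate.
    """
    for record in event["Records"]:
        payload = json.loads(record["body"])
        s3.put_object(
            Bucket=LANDING_BUCKET,
            Key=f"raw/{record['messageId']}.json",
            Body=json.dumps(payload).encode("utf-8"),
        )
    return {"written": len(event["Records"])}
```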
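On the data-testing tools mentioned above: at heart, tools like dbt-expectations and Deequ run declarative assertions over a data set. A library-free Python sketch of two common expectations, not-null and unique, shows the idea (the column and row values here are made up):

```python
def check_not_null(rows: list[dict], column: str) -> list[dict]:
    """Return the rows that violate a NOT NULL expectation on `column`."""
    return [row for row in rows if row.get(column) in (None, "")]

def check_unique(rows: list[dict], column: str) -> list[dict]:
    """Return the rows whose `column` value occurs more than once."""
    groups: dict = {}
    for row in rows:
        groups.setdefault(row.get(column), []).append(row)
    return [row for dupes in groups.values() if len(dupes) > 1 for row in dupes]

# Made-up rows with one null id and one duplicated id:
rows = [{"id": 1}, {"id": 2}, {"id": 2}, {"id": None}]
print(check_not_null(rows, "id"))  # [{'id': None}]
print(check_unique(rows, "id"))    # [{'id': 2}, {'id': 2}]
```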
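And for the orchestration bullet, a minimal Airflow 2.x-style DAG sketch that runs a transform only after an extract succeeds. The DAG id, schedule and callables are placeholders, not a description of any real Xero pipeline:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Pull source data; in practice this might unload a table or call an API."""
    print("extracting...")

def transform():
    """Reshape and enrich the extracted data for reporting."""
    print("transforming...")

with DAG(
    dag_id="engineering_insights_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                    # Airflow 2.4+ spelling of schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # transform depends on extract succeeding
```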

What you’ll bring with you:

    • A growth mindset
    • Proficient use of one or more programming languages and tools
    • A specialisation in one or more specific technologies or areas of the development stack
    • Ability to communicate with and manage stakeholders and customers
    • Ability to debug across a technology stack
    • Ability to analyse complex data sets
    • Confident in upgrading tooling and technology underlying products
    • Proficient in roll out and maintenance of cloud infrastructure
    • Familiar with Data and infrastructure security concerns and proactively mitigates issues
    • Regular practice of test-driven and trunk-based development
    • Demonstrated ability to respond to production incidents
    • Familiarity with eventing (event-driven architectures)


Why you should become a Xero

It’s a diverse and inclusive environment, with people who will respect, challenge, support and mentor you to do the best work of your life. We’re a place where innovation and change are not only encouraged but also celebrated. We value our people and want them to enjoy and take pride in their work. 

We’re very supportive of flexible working arrangements and offer a competitive remuneration package including shares and life insurance, in addition to your base salary. We have a culture we’re proud of. Whether you're after a workplace with a social vibe, or one that understands your family is a priority, Xero is all of that and more.

Xero is an NZ Immigration Accredited Employer and Rainbow Tick certified too.

Please include a cover letter in your application, telling us why you’re a great fit for this position.