Sensyne Health

Oxford, UK
51-200 employees
We combine clinical artificial intelligence technology and ethically sourced, anonymised patient data to help people everywhere get better care.

ETL Engineer

This job is no longer open

About Us:

At Sensyne Health we combine technology and ethically sourced patient data to help people everywhere get better care. To do this, we have created a unique partnership with the NHS that delivers a return to our partner Trusts and unlocks the value of clinical data for research while safeguarding patient privacy. Alongside this, we develop clinically validated software applications that create clinician and patient benefit while providing highly curated data. Our products include vital-signs monitoring in hospitals and patient-to-clinician apps to support self-care and remote monitoring of gestational diabetes and chronic diseases such as COPD and heart failure.

We use our proprietary clinical AI technology to analyse ethically sourced, clinically curated, anonymised patient data to solve serious unmet medical needs across a wide range of therapeutic areas, enabling a new approach to clinical trial design, drug discovery, development and post-marketing surveillance.

The Role:

As an ETL Engineer, you will provide hands-on data engineering and best-practice support across all of Sensyne Health's departments and customers. You'll be responsible for building and maintaining ETL pipelines in SQL, Python, and R; ensuring the performance and operation of the data infrastructure, data products, and data APIs; and designing and implementing new solutions for the data team and the wider business. You'll provide subject-matter expertise and keep up to date with emerging tools and technologies, proactively seeking improvements.
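
For illustration, here is a minimal sketch of the kind of ETL pipeline the role describes, in Python with pandas and SQLAlchemy. The file name, column names, and SQLite target are hypothetical stand-ins so the sketch runs locally; a real pipeline here would run against Azure services such as Data Factory or Azure SQL.

    # Minimal ETL sketch; all names below are hypothetical.
    import pandas as pd
    from sqlalchemy import create_engine

    def extract(path: str) -> pd.DataFrame:
        # Extract: read a raw vital-signs export from CSV.
        return pd.read_csv(path)

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Transform: normalise column names, drop rows missing a patient ID.
        df = df.rename(columns=str.lower)
        return df.dropna(subset=["patient_id"])

    def load(df: pd.DataFrame, engine) -> None:
        # Load: append the cleaned batch to a warehouse staging table.
        df.to_sql("stg_vitals", engine, if_exists="append", index=False)

    if __name__ == "__main__":
        # SQLite stands in for an Azure SQL target in this sketch.
        engine = create_engine("sqlite:///warehouse.db")
        load(transform(extract("vitals_export.csv")), engine)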

Responsibilities:

  • Lead the way in defining pipelines and architecture for all ETL data projects
  • Be hands-on, always developing, running and enhancing data pipelines
  • Review and manage the end-to-end ETL lifecycle including process management, data modelling, data warehouse architecture, ETL pipeline development and testing
  • Build objects in Azure to facilitate the flow of data across the business in the most efficient and sustainable way possible
  • Identify areas where improvements can be made, whether that is with applications, architectures, or the processes we use
  • Implement and maintain best practices across all software engineering
  • Liaise and work closely with other departments to meet both their business and technical ETL/data manipulation requirements
  • Coordinate and deliver Data Lake and Big Data tooling and integrations
  • Provide technical leadership and mentor other ETL engineers

Essential:

  • Hands-on experience with SQL, R, and Python, and with building ETL pipelines in Azure
  • Experience and knowledge of Azure DevOps, MLOps, SSIS, Databricks, Data Factory, and Synapse Analytics
  • Extensive knowledge of Microsoft Azure, SQL, SSIS, NoSQL databases, and TDD
  • Advanced skills in ETL pipelines, data warehousing and analytics frameworks, as well as PostgreSQL and MySQL
  • A thought leader for data who can present clear insights to both technical and non-technical stakeholders to demonstrate the value of data
  • Extensive experience identifying and solving problems in databases, data processes or data services as they occur
  • Experience and understanding of Data APIs
  • Experience of building infrastructure with ARM templates or equivalent
  • Experience of full life-cycle software development, including Agile, Git, and CI/CD
  • Experience operating, automating and supporting complex ETL processes/pipelines, software products and testing concepts, alongside container orchestration with AKS, App Service, Service Fabric, etc.
  • Knowledge of optimisation technologies (profiling, indexing, routine maintenance, server configuration, file structures, etc.)
  • Fault resolution across Dev/Test, Pre-prod/Prod, QA, and other environments
  • Knowledge, exposure and experience with NumPy, pandas, Dask, Modin, Ray, the Tidyverse, and dplyr/dbplyr
  • Experience building self-service tooling and workflows for Machine Learning and Analytics users
  • Knowledge of, experience with, and strong opinions on specific tooling (Hadoop, Kafka, Spark) to support technical architecture choices

Desirable:

  • Familiarity and experience with medical data
  • Hands-on experience with MongoDB and Neo4j

Personal Qualities:

  • Communication: You are able to discuss technical issues at all levels of the business and provide clear presentations of technical work.
  • Technical: You will be a data geek! You'll enjoy seeing value and insight derived from data, and you'll be a technology and cloud enthusiast who embraces new ideas and processes while keeping a keen eye on delivery and value.

Skills and Qualifications:

Advanced IT Knowledge, Critical Thinking, Interpersonal Skills, Technological Analysis, Data Analytics, Big Data, Computational Skills, Excellent Written, Oral and Presentation Communication Skills

Benefits:

  • Company share option scheme
  • 5% employer-matched salary sacrifice pension scheme
  • Life Assurance & Income protection
  • A range of health, wealth and lifestyle benefit plans including BUPA, Gym and holiday trade options
  • Electric vehicle & cycle-to-work schemes
  • Proactive career development planning