Data Operations Engineer

About the Opportunity 

How often are you given the opportunity to build something from the ground up, with an abundance of resources at your disposal, and to be part of a team accomplished in diverse scientific and engineering disciplines, focused on using the best of what lies at the forefront of technology to address complex, real-world problems that positively impact potentially millions of people's lives? This is that kind of opportunity.

We are seeking a thoughtful team player with a real passion for developing highly performant, well-thought-out solutions to join the rapidly growing Lokavant team as a Data Operations Engineer. Lokavant is looking for a seasoned Data Operations Engineer who has diverse experience with data integration across multiple database systems and applications. This is a very hands-on role focused on helping to design, build, and deploy our schemas and ETL pipelines and on supporting integrations between systems. It also involves designing the APIs that connect to the databases and deliver data to the application stack. Your primary stakeholders will be the data engineering team and the front-end development team. This role will own the processes that maintain availability of the production systems in both the active production AWS regions and the disaster recovery region.

Key Responsibilities 

  • Develop data models and database topologies for data engineering and data science 
  • Implement and support cloud-based databases and data warehouses 
  • Configure and optimize database performance
  • Create ETL pipelines between databases 
  • Code API interfaces to data models to support applications 
  • Implement and perform system backup and recovery
  • Implement disaster recovery capabilities in separate geographic regions
  • Build performance test frameworks for data access  
  • Implement RDBMS upgrades and maintenance  
  • Perform troubleshooting and problem analysis 

Minimum Requirements 

  • 7+ years of relevant experience managing relational database systems 
  • Experience coding in Python, Node.js, or another relevant scripting language
  • Experience with Postgres RDBMS
  • Experience building SQL stored procedures and functions
  • Experience with troubleshooting and optimizing query performance  
  • Experience writing ETL pipelines 
  • Experience working in an Agile software development environment  
  • Exceptional written and verbal communication skills  
  • Detail-oriented and highly organized, with effective multi-tasking and prioritization skills
  • Proactive, self-motivated, and self-directed, with the ability to learn quickly and autonomously  
  • Comfortable with ambiguity  
  • Superior problem-solving and troubleshooting skills  
  • Ability to work as part of a collaborative cross-functional team in a fast-paced environment  
  • Sincere interest in working at a rapidly changing start-up and scaling with the company as we grow  
  • 4-year bachelor's degree in a relevant field

Preferred (Nice-to-have) Qualifications 

  • Experience with AWS, Snowflake, Docker, and Flyway 
  • Experience in the life sciences industry 
  • Experience with healthcare data, ideally clinical/operational clinical trial data 
  • Knowledge of clinical data standards (e.g. CDISC, FHIR, HL7, etc.) 
  • Knowledge of e-clinical systems and technologies (e.g. EDC, CTMS, IRT, etc.) 

Salary Range: $115,000 - $140,000
