Senior Data Engineer

This job is no longer open

We are looking for a Senior Data Engineer to join our team in Business Technology as part of a core, centralized business intelligence function serving the entire organization.  In this role, you will be responsible for designing and developing scalable solutions for a large data infrastructure in a fast-paced, Agile environment. You will participate in detailed technical design, development, and implementation of applications using cutting-edge technology stacks.

Our main focus is on building platforms and capabilities that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering background with the ability to tie engineering initiatives to business impact.

Responsibilities

  • Design scalable and reliable data pipelines to consume, integrate and analyze large volumes of complex data from different sources to support the growing needs of our business
  • Build a data access layer to provide data services to internal and external stakeholders
  • Analyze a variety of data sources, structures, and metadata, and develop mappings, transformation rules, aggregations, and ETL specifications
  • Proactively develop architectural patterns to improve efficiency 
  • Interface with stakeholders to gather requirements and build functionality
  • Support and enhance existing data infrastructure
  • Build data expertise and own data quality for areas of ownership
  • Experiment with different tools and technologies, and share learnings with the team
  • Contribute to the evaluation of new technologies such as Docker, AWS Lambda, and ECS

Requirements

  • BS in Computer Science, Engineering, or another quantitative field of study
  • 5+ years in a data engineering role
  • 3+ years in the data warehouse and data lake space
  • Experience with building distributed systems
  • Expertise in a programming language (preferably Python or Java)
  • 5+ years of experience working with SQL, with strong knowledge of SQL capabilities and best practices
  • 3+ years of experience with ETL tools such as Airflow, Oozie, Luigi, or Informatica
  • 3+ years of experience with relational databases and columnar MPP databases such as Snowflake, Athena, or Redshift
  • 3+ years of experience with database and application performance tuning
  • Experience with CI/CD tools and procedures such as Jenkins, Git, Chef, and Ansible
  • Experience with cloud infrastructure/platforms (AWS, Azure, Google Cloud Platform)
  • Familiarity with Jira, Confluence, and Agile methodologies
  • Experience with real-time data streaming using Storm, Kinesis, or Spark
  • Experience building APIs using Java Spring Boot or other programming languages
  • Familiarity with data visualization tools such as Tableau, Looker, QlikView, and MicroStrategy
  • Detail-oriented and innovative, with the ability to execute
  • Excellent oral and written communication skills with both technical and non-technical audiences

Okta is an Equal Opportunity Employer

Okta is rethinking the traditional work environment, providing our employees with the flexibility to be their most creative and successful versions of themselves, no matter where they are located.  We enable a flexible approach to work, meaning for roles where it makes sense, you can work from the office, or from home, regardless of where you live.  Okta invests in the best technologies and provides flexible benefits and collaborative work environments/experiences, empowering employees to work productively in a setting that best and uniquely suits their needs.  Find your place at Okta https://www.okta.com/company/careers/.

By submitting an application, you agree to the retention of your personal data for consideration for a future position at Okta.  More details about Okta’s privacy practices can be found at: https://www.okta.com/privacy-policy.
