Sr. Data Engineer

This job is no longer open

We are looking for a savvy Data Engineer to join our growing team of analytics experts. You will be responsible for expanding and optimizing our data and data pipeline architecture, as well as our critical data flows. You are an experienced data pipeline builder and data gatherer who enjoys optimizing data systems and building them from the ground up.

As the Data Engineer, you will support our Data Analysts and Data Scientists on data initiatives and ensure that optimal data delivery architecture is consistent across ongoing projects. You are self-directed and comfortable supporting the data needs of multiple teams, systems and products. In addition, you are excited by the prospect of optimizing, or even re-designing, the company’s data architecture to support next-generation processes and data initiatives.

To help with your efforts, you can expect a comprehensive technology stack, talented co-workers, an AWS-based analytics environment with a comprehensive array of online and offline data, and a maturing data science team developing algorithms to enhance business performance and the customer experience. Thanks to this dedication and effort, Lands’ End is an Internet Retailer Top 50 company, and it can be a great part of your future.

Responsibilities

  • Create and maintain an optimal data pipeline architecture.
  • Assemble large, complex datasets that meet functional and non-functional business requirements.
  • Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build production reports and convert existing reports from legacy formats, including SAS, to current open-source technologies.
  • Build the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using SQL and AWS Glue technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Keep the data separated and secure across national boundaries.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Share technical knowledge and mentor data engineering staff and others within the broader analytics and e-commerce communities.

Qualifications

  • Advanced working SQL knowledge, including experience with relational databases and query authoring, as well as working familiarity with a variety of database systems.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and datasets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytical skills related to working with unstructured datasets.
  • Experience building processes that support data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large, disconnected datasets.
  • Working knowledge of message queuing and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • 5+ years of experience in a Data or Software Engineering role and a BS in Computer Science or a related field. Applicable graduate work is a plus.
  • You should also have experience using the following software and tools:
    • 5+ years of experience with Linux
    • 3+ years of experience with object-oriented or functional scripting languages: Python, Scala, Java 8, etc.
    • Experience with big data tools: Hadoop, Spark, Kafka/Kinesis, etc.; 2+ years preferred.
    • Experience with relational SQL and NoSQL databases, including Oracle, Postgres/Netezza and Elasticsearch.
    • Experience with AWS cloud services: EC2, EMR, ECS/EKS, Redshift, Elasticache, Glue ETL preferred.
    • Experience with BI tools: Kibana, Tableau, etc.
    • Experience with RESTful API and Docker preferred.
    • Experience with Airflow and MLflow.