Sr. Data Engineer (Python+Snowflake+AWS)

This job is no longer open

Where You’ll Work 

Andela is a network of technology leaders dedicated to advancing human potential. We help companies build high-performing distributed engineering teams by investing in the world’s most talented software engineers. 

Based globally and operating remotely, Andela is catalyzing the growth of tech ecosystems across the globe while solving the global technical talent shortage.

Andela is hiring for a top-priority placement at a leading digital delivery service organization. Our partner is looking for stable, long-term talent to join them as a Senior Data Engineer (Python + Snowflake + AWS).

What You’ll Do

We are seeking a Data Engineer who is excited about creating innovative solutions that make life effortless for our customers! The kind of people we are looking for want to build optimized routing systems that deliver to our customers efficiently, create an end-to-end shopping experience that will delight our customers, devise warehouse management systems that enable us to always fulfill our customers’ needs, or design mobile and web applications that are joyful to use. In short, we are looking for people who are eager to help create for the future!

We are seeking a Data Engineer responsible for designing, building, and maintaining a data platform that supports a world-class customer experience. Your primary responsibility will be to continually create and improve a cloud-based data platform to support technical and business growth. A commitment to collaborative problem solving, elegant design, and a quality product is essential.

We are currently migrating from Redshift to Snowflake, and you’ll likely be heavily involved in that project.

Job Responsibilities

  • Promote and support the client engineering team’s culture of inclusion and diversity
  • Participate in cross-functional projects in an agile environment 
  • Build, deploy, and maintain your own code 
  • Design, build, and maintain big data processing pipelines for real-time streaming and batch workloads
  • Configure, deploy, manage, and document data extraction, transformation, enrichment, and governance processes in cloud data platforms, including AWS and Microsoft Azure
  • Implement and monitor analytics to ensure the health of data
  • Support engineering and business analytics use cases

Job Requirements

  • Bachelor’s degree in Computer Science (or a related field) or 4 years of production experience
  • Experience working with large datasets (terabyte scale and growing) and tooling
  • Experience developing complex ETL processes, including SLA definition and performance measurements 
  • Production experience with building, maintaining, and improving big data processing pipelines
  • Production experience with stream and batch data processing, data distribution optimization, and data monitoring 
  • Understanding of the data lifecycle

Required Skills:

  • Experience with Python
  • Experience with Snowflake
  • Experience with AWS or Azure

Bonus Skills:

  • Experience with Spark
  • Experience with Airflow
  • Experience with Snowpipe
  • Experience with Scala
  • Experience with Node.js
  • Experience with Function Apps / AWS Lambda 

What You’ll Get

  • Competitive compensation
  • Opportunity to work with the brightest minds inside and outside of your field 
  • A chance to change the world for the better  

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
