Senior Data Platform Engineer

This job is no longer open
Upgrade is a fintech unicorn founded in 2017. In the last four years, over 15 million people have applied for an Upgrade card or loan, and we have delivered over $7 billion in affordable and responsible credit. Our innovative Upgrade Card combines the flexibility of a credit card with the low cost of an installment loan. Our latest offering, Rewards Checking, gives customers access to no-fee checking accounts with 2% cash back rewards on common everyday spending. Learn more about the team.
 
Upgrade has been named a “Best Place to Work in the Bay Area” by the San Francisco Business Times and Silicon Valley Business Journal 3 years in a row, and received “Best Company for Women” and “Best Company for Diversity” awards from Comparably. Upgrade has been included in the 2021 Inc. 5000 list of the fastest-growing private companies in America.
 
We are looking for new team members who get excited about designing and implementing new and better products to join a team of 750 talented and passionate professionals. Come join us if you like to tackle big problems and make a meaningful difference in people's lives.

This is a remote position based in the United States. At this time we are unable to consider international applicants for this role.

What You'll Do:

    • Design, architect, and maintain distributed, fault-tolerant data infrastructure that supports the needs of data pipeline, data warehouse, and business intelligence engineers.
    • Quickly gain a deep understanding of the business, and of how data flows through the organization and the data engineering codebase, while playing a key role in building an efficient and scalable data and reporting layer for the organization.
    • Build and scale the tools data engineers need to create robust data pipelines that enrich our Enterprise Data Warehouse.
    • Set up tools and processes for effective data loading, lineage tracking, and monitoring, and drive data quality across the data warehouse.
    • Continuously improve our data infrastructure and stay ahead of technology.

What We Look For:

    • Solid understanding of core computer science including algorithms & data structures, operating systems, distributed systems, networking, and concurrent programming.
    • Experience scaling data environments with distributed batch and real-time systems and self-serve visualization environments.
    • High development standards, especially for code quality, code reviews, unit testing, and continuous integration and deployment.
    • Proficiency in writing object-oriented code for data processing in Python and/or Java, including development of web services.
    • Experience building complex, customized, highly scalable data pipelines with task orchestrators such as Airflow, while building and maintaining the data integration codebase.
    • Experience building complex, fault-tolerant, Docker-containerized batch data processing tasks in Python and SQL, distributed over a Kubernetes cluster using task frameworks such as Celery.
    • Expertise producing and consuming data in real time from event-driven microservices using streaming platforms like Kafka, Kinesis, or RabbitMQ.
    • Experience building and managing real-time and near-real-time data replication systems from OLTP databases into OLAP databases.
    • A good understanding of cloud-based columnar data warehouses/data lakes (Redshift and/or Snowflake) backed by distributed file systems such as S3 or HDFS, maintaining data in standard and columnar compressed file formats.
    • Expertise in working with third-party APIs to push and pull data.
    • Understanding of how to securely store sensitive data, both in transit and at rest.
    • Excellent verbal and written communication skills – Ability to synthesize complex ideas and communicate them in very simple ways.
    • Highly analytical and detail-oriented.
    • Ability to troubleshoot and fix issues quickly in a fast-paced environment.

Strong Plus:

    • Experience building web services using Flask, Django, and/or FastAPI.
    • Knowledge of serverless data computing with AWS Lambda, Iron.io, etc.
    • Financial services experience.
    • Reporting and data visualization skills.

What We Offer You:

    • Competitive salary and stock option plan. 
    • 100% paid coverage of medical, dental and vision insurance. 
    • Unlimited vacation. 
    • Learning stipend for personal growth and development. 
    • Paid parental leave.  
    • Health and wellness initiatives.
Interested in joining Upgrade but don't think this role is for you? Check out our careers page!

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.