Due to the temporary nature of the engagement, this position is not eligible for visa sponsorship.
The Role:
SoFi runs on data! We are seeking a highly motivated Senior Software Engineer to join our Data Platform team. As a Senior Software Engineer, you will work alongside our experienced team of data engineers and product managers to develop and maintain our cutting-edge data platform using Snowflake, dbt, SageMaker, and Airflow. In this role, you will contribute to the long-term success of SoFi’s data vision by building distributed systems and scalable data platforms.
As a Senior Engineer on the Data Platform team at SoFi, you'll be tasked with building critical components and features. You will implement battle-tested patterns and interfaces, squash bugs, refactor code, and continually grow as an engineer. The ideal candidate has a strong software engineering background and problem-solving ability, along with cloud computing (AWS) and data engineering skills and prior experience with technologies such as Snowflake, Airflow, dbt, Kafka, Spark, Dask, Python, and Tableau. Additionally, you will demonstrate SoFi’s core values by honing your skills as an effective communicator, showing personal responsibility, and setting ambitious goals. If you like working on problems with tangible and lasting impact, we would love to have you on our team!
What You’ll Do:
- Collaborate with cross-functional teams to understand data requirements and design scalable data solutions.
- Write high-quality, efficient, and scalable code to implement new features and functionality on the data platform.
- Participate in code reviews and provide feedback to other team members to ensure code quality and maintainability.
- Work with product managers and other stakeholders to understand user requirements and implement solutions that meet their needs.
- Participate in team meetings and contribute to discussions on technology, design, and implementation.
- Keep up to date with the latest developments in data engineering and cloud technologies.
- Develop and optimize data pipelines using dbt and Airflow to ensure efficient data processing and data quality checks.
- Architect and implement data governance and metadata management solutions to maintain data integrity and compliance.
- Build a system to identify data quality issues and implement solutions to address them effectively.
- Utilize your proficiency in Python to develop custom scripts and tools to enhance data operations and automation.
- Mentor and guide junior team members, providing technical expertise and fostering a culture of continuous learning.
What You’ll Need:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 5 years of experience as a Software Engineer, with a focus on data engineering and data platform development.
- Extensive hands-on experience with Snowflake, AWS services, dbt, and Airflow.
- Strong understanding of data quality best practices and data governance principles.
- Proven track record in metadata management and data infrastructure design.
- Proficiency in Python for data manipulation, scripting, and automation.
- Excellent problem-solving skills and the ability to thrive in a fast-paced, collaborative environment.
- Strong communication skills to effectively work with diverse stakeholders and present technical concepts.
Nice to Have:
- Interest in personal finance.
**Please note the following benefits only apply to full-time employees**