Evolver Federal is seeking a Data Engineer/ETL Developer to join our team, developing and maintaining solutions that migrate, transform, analyze, and display large and disparate data sets in a cloud-based environment. The candidate must be an innovator, able to work both in a team environment and as an individual contributor, and able to prioritize work efforts and meet strict deadlines while producing high-quality solutions for our government customer.
Responsibilities Include:
- Design, develop, and implement ETL processes using AWS Glue, Python, and Apache Spark.
- Maintain and optimize AWS Glue workflows and data catalog.
- Implement best practices for data extraction, transformation, and loading (ETL).
- Transform complex datasets into actionable insights using Python and Spark.
- Work closely with the data science team to ensure data quality and integrity.
- Develop AWS CloudFormation templates and Lambda functions.
- Manage data integration pipelines and automate workflows to support business requirements.
- Troubleshoot and resolve issues related to data infrastructure.
- Optimize query performance by tuning AWS Glue and Spark jobs.
- Collaborate effectively with other team members, including data scientists, business analysts, and stakeholders.
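As an illustration of the extract-transform-load pattern the responsibilities above describe, here is a minimal sketch in plain Python. The data, field names, and transformation are hypothetical; a production Glue job would use AWS Glue's PySpark APIs and read from sources such as S3 rather than an inline string:

```python
import csv
import io

# Hypothetical raw source data; a real job would extract from S3 or a database.
RAW_CSV = """id,amount,status
1,100.50,complete
2,25.00,pending
3,310.75,complete
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV into rows of dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: keep completed records and cast fields to typed values."""
    return [
        {"id": int(r["id"]), "amount": float(r["amount"])}
        for r in rows
        if r["status"] == "complete"
    ]

def load(rows: list[dict]) -> dict:
    """Load: aggregate into a summary here; a real job would write
    to a data warehouse or an S3 target instead."""
    return {"count": len(rows), "total": sum(r["amount"] for r in rows)}

if __name__ == "__main__":
    print(load(transform(extract(RAW_CSV))))  # {'count': 2, 'total': 411.25}
```

The three-stage structure mirrors how Glue jobs are typically organized: each stage is a small, testable function, which makes the debugging and performance tuning mentioned above much easier.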
Minimum Qualifications and Requirements
Basic Qualifications:
- 5 years of software engineering experience
- 3 years of experience working with AWS Glue, Python, and Apache Spark in a professional environment.
- 3 years of experience with relevant AWS cloud services (such as S3, EC2, IAM, RDS, and Lambda) and an understanding of cloud architecture.
- 3 years of experience writing, optimizing, and debugging complex ETL jobs.
- 3 years of experience with scripting languages such as Python, plus knowledge of SQL.
- 3 years of experience with data modeling and an understanding of different data structures.
Preferred Qualifications:
- 8 years of software engineering experience
- 5 years of experience working with AWS Glue, Python, and Apache Spark in a professional environment.
- 5 years of experience with relevant AWS cloud services (such as S3, EC2, IAM, RDS, and Lambda) and an understanding of cloud architecture.
- 5 years of experience writing, optimizing, and debugging complex ETL jobs.
- 5 years of experience with scripting languages such as Python, plus knowledge of SQL.
- 5 years of experience with data modeling and an understanding of different data structures.
- General knowledge of index migrations, debugging, and researching new concepts is a major plus
- Familiarity with CI/CD pipelines and proficiency with GitLab for creating the required CI/CD pipelines
- Understanding of log monitoring and analytics
- Experience meeting both technical and consumer needs
- Experience testing software to ensure responsiveness and efficiency