Use data to make strategic decisions.
GEICO Technology Solutions' vision and strategy for modern data, analytics, and artificial intelligence/machine learning (AI/ML) is to provide an end-to-end ecosystem for data storage, ingestion, transformation, analytics, and AI/ML. We aim to leverage data and AI/ML to continuously improve our products, claims, services, and all other aspects of our business, efficiently turning data into actionable insights, execution plans, and profitable growth. To enable this vision, the Data Ingestion team of Data Security & Infrastructure (DSI) is seeking a highly motivated Data Engineer to start or continue an IT career in the GEICO Data Operations division.
In this role, you will:
- Team up with architects, scrum masters, leads, managers, and directors in an Agile environment to make the data on our Enterprise Data Platform accessible for the organization's needs
- Work in a team with Data Architects and Analysts to build our next-generation data platform in Azure
- Blaze the trail in applying software development techniques such as automated testing and CI/CD to building data products
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions using full-stack development tools and technologies
- Utilize programming languages such as Scala and Python, open-source RDBMS and NoSQL databases, and cloud-based data warehousing services such as Snowflake and Redshift
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community
- Perform unit testing and conduct code reviews with other team members to ensure your code is rigorously designed, elegantly coded, and effectively tuned for performance
- Bring intellectual curiosity, a solutions-oriented attitude, and a passion for learning new tools and techniques
Experience & Skills:
- 3+ years of experience designing, developing, implementing, and maintaining solutions for Big Data or data warehouse systems
- 2+ years of experience working in a cloud environment such as Azure, AWS, or other private or public clouds
- Experience bringing data into a centralized data repository and manipulating available data to build additional data sets for analytics and reporting purposes
- Experience maintaining data quality throughout the data lifecycle
- Experience with data modeling, source-to-target mapping, automated testing frameworks, CI/CD pipelines, and task automation using scripting
- Experience working in an Agile environment and with end-to-end automation
- Experience developing new and enhancing existing data processing components (data ingestion, data transformation, data storage, data management, and data quality)
- Strong working knowledge of SQL and the ability to write, debug, and optimize SQL queries and ETL jobs to reduce execution time and resource utilization
- Data engineering experience focused on batch and real-time data pipeline development, and on data processing and transformation using ETL tools, Snowflake, and dbt
- Experience with cloud data warehouse solutions (Snowflake, Azure DW, Redshift, or similar technologies in other private or public clouds)
- Complete software development lifecycle experience including design, documentation, implementation, testing, and deployment
- Bachelor’s Degree in a computer-related field or equivalent professional experience required
- At least 1 year of experience in data engineering using an open-source technology stack along with cloud computing (AWS, Microsoft Azure, Google Cloud)
- At least 1 year of experience designing, developing, implementing, and maintaining solutions for data transformation projects
- At least 1 year of advanced working SQL knowledge, including experience with relational databases and query authoring, as well as working familiarity with a variety of databases
- Familiarity with Data Vault, Databricks, dbt (Fishtown Analytics), and graph databases
- At least 1 year of experience working with real-time data and streaming applications (Spark Streaming or Kafka)
- At least 1 year of experience with Agile engineering practices
- At least 1 year of experience working with cloud data warehouse solutions (e.g., Snowflake, Synapse, Redshift)
At GEICO, we make sure you have the support and resources to leverage and develop your skills, secure your financial future, and take care of your health and well-being. GEICO continually seeks to provide a workplace where everyone can be their authentic self. To help achieve this goal, we support associate-led Employee Resource Groups that foster a true sense of community. Through GEICO’s competitive benefits offerings and various training and development opportunities, we have you covered with our Total Rewards Program* that includes:
- Premier Medical, Dental and Vision Insurance with no waiting period**
- Paid Vacation, Sick and Parental Leave
- 401(k) Profit Sharing Plan
- Tuition Assistance including Direct Billing and Reimbursement payment plan options
- Paid Training, Licensures, and Certificates
*Benefits may be different by location. Benefit eligibility requirements vary and may include length of service.
**Coverage begins with the pay period after hire date. Must enroll in New Hire Benefits within 30 days of the date of hire for coverage to take effect.
GEICO is proud to be an equal opportunity employer. We are committed to cultivating an environment where equal employment opportunities are available to all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO celebrates diversity and believes it is critical to our success. As such, we are committed to recruit, develop and retain the most talented individuals to join our team.