Senior Data Engineer

This job is no longer open

About SecurityScorecard:

SecurityScorecard is the global leader in cybersecurity ratings, with over 12 million companies continuously rated and operations in 64 countries. Founded in 2013 by security and risk experts Dr. Alex Yampolskiy and Sam Kassoumeh and funded by world-class investors, SecurityScorecard's patented rating technology is used by over 25,000 organizations for self-monitoring, third-party risk management, board reporting, and cyber insurance underwriting, making all organizations more resilient by allowing them to easily find and fix cybersecurity risks across their digital footprint.

Headquartered in New York City, our culture has been recognized by Inc. Magazine as a "Best Workplace," by Crain's NY as one of the "Best Places to Work in NYC," and as one of the 10 hottest SaaS startups in New York for two years in a row. Most recently, SecurityScorecard was named to Fast Company's annual list of the World's Most Innovative Companies for 2023 and to the Achievers 50 Most Engaged Workplaces in 2023 award, recognizing "forward-thinking employers for their unwavering commitment to employee engagement." SecurityScorecard is proud to be funded by world-class investors including Silver Lake Waterman, Moody's, Sequoia Capital, GV, and Riverwood Capital.

About the Team

The Data Analytics Engineering team is responsible for managing our ratings data infrastructure, architecting and implementing business-critical data solutions and pipelines, and enabling data-driven decisions within the organization and for our customers.

About the Role

As a Senior Data Engineer on our Data Analytics Engineering team, you will develop solutions to turn millions of data points into actionable insights, build complex yet robust data pipelines, drive data automation, and constantly strive to evolve the data architecture and its scalability. You will work in a high-performance, fast-paced environment and contribute to an inclusive work culture.

Responsibilities

  • Data Integration and ETL Development 
    • Design, develop, and maintain complex data solutions that provide data to our Platform
    • Create processes for data extraction, transformation, and loading (ETL) into target storage systems based on batch, micro-batch, or real-time ingestion needs, using tools and technologies such as Scala, Spark, Databricks, Kafka, and ClickHouse
  • Architecture and Design
    • Collaborate with architects and data engineers within and across teams to design scalable, efficient, and robust data pipelines based on technical best practices
    • Understand requirements, build business logic, and quickly adapt to changing needs
    • Design and deploy analytical data models, demonstrating ownership from inception to delivery
  • Performance Optimization 
    • Identify and resolve performance bottlenecks in data integration processes and recommend best practices for optimization
  • Data Quality and Governance  
    • Implement data quality checks and data governance principles to maintain high data integrity
    • Monitor and resolve data issues promptly
    • Automate and improve processes to sustainably maintain current features and pipelines
  • Client Engagement 
    • Interact with Solution Architects and our customers to understand their data needs and provide technical guidance and solutions 
    • Communicate effectively with non-technical stakeholders to ensure clear project requirements and objectives

Required Qualifications:

  • BS/MS in computer science or equivalent technical experience, with at least 2 years of experience in data engineering
  • Must have experience with NoSQL databases, preferably Cassandra / Scylla
  • Must have 4+ years of experience building and maintaining data pipelines using Scala, Spark, Airflow, Hive, Presto, and Redis
  • Must have 4+ years of experience with SQL databases, preferably Postgres
  • Must have 4+ years of development experience handling a variety of data (structured/unstructured) and data formats (flat files, XML, JSON, relational, legacy, Parquet)
  • Must have experience with cloud environments, preferably AWS
  • Experience in developing batch and real time data streams to create meaningful insights and analytics
  • Should have experience with Kafka and event-based processing
  • ClickHouse experience is a plus
  • Experience working in an Agile methodology

Benefits:

Specific to each country, we offer a competitive salary, stock options, health benefits, unlimited PTO, parental leave, tuition reimbursement, and much more!

SecurityScorecard is committed to Equal Employment Opportunity and embraces diversity. We believe that our team is strengthened through hiring and retaining employees with diverse backgrounds, skill sets, ideas, and perspectives. We make hiring decisions based on merit and do not discriminate based on race, color, religion, national origin, sex or gender (including pregnancy), gender identity or expression (including transgender status), sexual orientation, age, marital status, veteran status, disability status, or any other protected category in accordance with applicable law.

We also consider qualified applicants regardless of criminal histories, in accordance with applicable law. We are committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need assistance or accommodation due to a disability, please contact talentacquisitionoperations@securityscorecard.io.

Any information you submit to SecurityScorecard as part of your application will be processed in accordance with the Company’s privacy policy and applicable law. 

SecurityScorecard does not accept unsolicited resumes from employment agencies. #LI-DNI
