About Nerdery and Being a “Nerd”
Nerdery is a digital product consultancy. Much more than consultants, we’re allies and guides on our clients’ digital journey – helping them to grow their business and delight their customers through intuitive, thoughtfully designed technology. As true partners, we prepare our clients for the opportunities in front of them, help them achieve their goals, and quickly deliver value for their customers. We do this by solving problems in creative ways across strategy, design, and technology.
At Nerdery, we’re not defined by our job titles but by the impact we make. You’ll work directly and closely with some of the world’s best brands to help create innovative digital products that serve everyone. As Nerds, our insight, innovation, and expertise are celebrated, and our growth is not only encouraged but expected. Being a Nerd means stepping up and pushing the boundaries of what’s possible. We’re curious, fearless (well, not totally fearless – there are heights and spiders, after all), and always our authentic selves.
We are looking for a Lead Data Architect (Principal) to join our team! We invite you to check out the details below and consider whether becoming a Nerd is the next step in your career journey.
The Lead Data Architect is a customer-facing, hands-on technical leader responsible for designing and delivering scalable, cloud-native data solutions on Google Cloud Platform (GCP). This role is pivotal in helping our customers leverage technology and data to drive efficiency, innovation, and competitive advantage.
As the lead technologist for all data capabilities, this individual will mentor and guide a team of data engineers, senior engineers, principals, and analysts, ensuring best practices in data strategy, analytics, governance, security, and performance optimization. As a member of the Technical Tiger Team (Nerdery’s highly skilled and agile team of experts who work together to address critical and complex technical challenges), this role will be at the forefront of innovation, developing proofs of concept (POCs) and prototypes that validate cutting-edge GCP solutions.
Requirements
- 12+ years of experience in data architecture, cloud data engineering, and analytics solutions.
- 7+ years of experience designing and implementing large-scale data solutions on Google Cloud Platform (GCP).
- Proven ability to lead technical teams, mentor engineers, and drive data strategy & governance.
- Strong experience defining conceptual, logical, and physical data models, and experience with a range of data warehousing and data lake architectures.
- Extensive experience in ETL/ELT, real-time data streaming, and big data processing frameworks.
- Strong experience in pre-sales support, client engagement, and solution architecture for data platforms.
- Proven ability to engage with clients, understand their business requirements, and translate them into technical solutions.
- Experience managing data-related projects from inception to completion.
- Deep knowledge of data security, compliance (GDPR, HIPAA, etc.), and performance optimization.
- Excellent communication skills, with the ability to articulate complex data concepts to both technical and non-technical stakeholders.
Technical Expertise Requirements:
- Core GCP Data Services:
- Data Warehousing & Databases: BigQuery, Cloud SQL, Spanner, AlloyDB, Memorystore.
- Data Integration & ETL: Cloud Dataflow (Apache Beam), Cloud Composer (Apache Airflow), Datastream (CDC).
- Analytics: BigQuery, Cloud SQL, Looker & Looker Studio, Connected Sheets, Pub/Sub.
- Preferred but not required: Bigtable, Firestore.
- Data Engineering Fundamentals:
- ETL/ELT pipelines, real-time data streaming, big data processing frameworks.
- Data modeling (conceptual, logical, physical), data governance, security, and compliance.
What Things Will You Do As A Nerd?
- Architect & Strategize:
- Design and oversee scalable, high-performance cloud data solutions on GCP, including BigQuery, Vertex AI, Dataflow, Cloud SQL, Spanner, Bigtable, Firestore, AlloyDB, and more.
- Develop multi-phased cloud data strategies and implementation roadmaps tailored to client needs.
- Architect end-to-end data pipelines for structured, semi-structured, and unstructured data.
- Lead data modeling efforts, including conceptual, logical, and physical data models.
- Define and implement robust data governance, security, and compliance frameworks.
- Conduct in-depth research and analysis to recommend optimal technical approaches.
- Customer Engagement & Delivery:
- Engage with executive stakeholders and technical teams to translate business requirements into scalable GCP solutions.
- Gather technical requirements, assess client capabilities, and design cloud adoption strategies.
- Develop and deliver compelling POCs and prototypes showcasing the value of BigQuery, Cloud Composer, Dataflow, and other GCP Data technologies.
- Lead collaborative workshops and project meetings to ensure successful solution deployment.
- Design and implement cost-effective and secure end-to-end cloud analytics solutions.
- Technical Leadership & Mentorship:
- Act as the technical authority for all data-related decisions across engineering, presales, and delivery teams.
- Provide hands-on mentorship to data engineers and solutions architects.
- Establish and enforce data standards, best practices, and frameworks.
- Foster a culture of innovation and continuous learning.
- Provide strategic guidance on cloud data adoption, modernization, and migration.
- GCP data engineering services: BigQuery, Cloud Composer (Airflow), Dataflow.
- Hands-on development of Python-based ETL pipelines, including reviewing PRs and setting and maintaining high code standards within the team.
- Data warehousing knowledge and strong SQL skills.
- Working experience migrating data from on-premises databases to the cloud, including conversion to BigQuery.
- Experience building data pipelines and scheduling with Cloud Composer (Airflow), and transforming data and files with Python.
- Enterprise data warehouse (EDW) and data model design.
- Experience with data modeling, data warehousing, and ETL processes.
- Innovation & Prototyping:
- Be a core member of Nerdery’s Technical Tiger Team, rapidly prototyping and validating data architectures.
- Optimize data engineering pipelines, ensuring performance at scale.
What Skills Will Help You Be A Successful Nerd?
- Strong communication skills: Able to effectively explain technical decisions to non-technical stakeholders.
- Process improvement: Experienced in identifying process pain points and taking ownership of refining processes through to completion.
- Collaborative problem solver: Able to take the initiative to understand a problem and make critical decisions about the next actionable steps.
- Ability to work in a fast-paced, dynamic environment.
Are We the Right Fit For You?
The best way to get the scoop on whether Nerdery is the right place for you is to chat with current Nerds. We would be delighted to have a conversation with you and share insight into what it’s really like to work at our organization and if it’s a place where you can thrive. Our interview process will provide you ample opportunity to talk with other team members and assess whether the role is a good fit for your next chapter. Take the first step and apply today – our Talent Advocates will then reach out to you to get the ball rolling!