YouGov is an international research data and analytics group.
Our mission is to supply a continuous stream of accurate data and insight into what the world thinks so that companies, governments and institutions can better serve the people and communities that sustain them.
We have the best data and the best tools. We continuously challenge conventional approaches to research, and we disrupt our industry to ensure that our clients always get the best solutions.
We are driven by a set of shared values. We are fast, fearless and innovative. We work diligently to get it right. We are guided by accuracy, ethics and proven methodologies. We trust each other and bring these values into everything that we do.
Each day, our highly engaged proprietary global panel of over 8 million people provides us with thousands of data points on consumer opinions, attitudes and behaviour. We combine this continuous stream of data with our research expertise to provide insights that enable intelligent decision-making and informed conversations.
With operations in the UK, North America, Mainland Europe, the Nordics, the Middle East and Asia Pacific, YouGov has one of the world’s largest research networks.
You will help build some of the largest and most innovative research data products in the world. YouGov has 20 years of historical survey data, including billions of data points across dozens of countries. In this position, you will own how we structure, standardize, process, and analyze that data internally — so that we can then build cutting-edge data products for our external customers. You’ll be building data models for our users for years to come.
If you think of yourself as a “data generalist” or a “data wrangler,” this position is a perfect fit for you. This isn’t a position for a pure statistician, data modeler, or machine learning specialist.
Every day will be very different from the day before. An example day might be something like this:
The night before, a user opened a ticket with a puzzling question about some metadata. Your morning starts with data detective work to identify the issue, and you discover an anomaly. You realize there’s a historical inconsistency in some metadata, so you write code in Python or R to parse out that metadata, clean it, and standardize it. Then you update ETLs to Redshift to reflect the new metadata, and you add functionality to the web-frontend tool you built to allow users to search through that metadata in a new way.
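As a concrete illustration of that kind of metadata cleanup (the column names and label values below are hypothetical, not YouGov’s actual schema), a standardization pass in pandas might look like:

```python
import pandas as pd

# Hypothetical survey metadata with inconsistent historical country labels
metadata = pd.DataFrame({
    "question_id": ["q1", "q2", "q3", "q4"],
    "country": ["UK", "United Kingdom", "uk", "USA"],
})

# Map the historical spellings onto one canonical code,
# leaving any unrecognized values untouched
country_map = {"uk": "GB", "united kingdom": "GB", "usa": "US"}
metadata["country"] = (
    metadata["country"]
    .str.strip()
    .str.lower()
    .map(country_map)
    .fillna(metadata["country"])
)
```

Once the labels are canonical, the downstream ETL and search tooling only ever sees one spelling per country.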
- Collect data from various sources, including survey instruments, APIs, and relational databases, covering tens of thousands of survey questions and billions of answers
- Perform exploratory data analysis
- Assess the effectiveness and accuracy of new data sources and data gathering techniques
- Develop scripts, workflows, and ETL pipelines for data pre-processing, cleaning and model inputs
- Deploy statistical models developed by senior data scientists into production environments, with a focus on accuracy, speed, and efficiency
- Build and maintain summary dashboards of key survey information for tracking purposes
- Develop and deploy processes and tools to monitor and analyze pipeline performance and data accuracy
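A data-accuracy monitor of the kind the last bullet describes can start very simply: validate each pipeline batch before it is loaded. This sketch uses hypothetical column names and a hypothetical 5% missingness threshold:

```python
import pandas as pd

def validate_batch(batch: pd.DataFrame) -> list[str]:
    """Return a list of data-accuracy problems found in one pipeline batch."""
    problems = []
    if batch["respondent_id"].duplicated().any():
        problems.append("duplicate respondent_id values")
    if batch["answer"].isna().mean() > 0.05:  # hypothetical 5% threshold
        problems.append("answer column is more than 5% missing")
    return problems

# Example batch: one duplicated respondent, one missing answer (25% of rows)
batch = pd.DataFrame({
    "respondent_id": [1, 2, 2, 3],
    "answer": ["yes", "no", "no", None],
})
```

A check like this would typically run as a gate in the pipeline, with any returned problems logged or pushed to a dashboard rather than silently loaded.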
Demonstrated Knowledge and Experience:
- 2+ continuous years of professional experience as a data engineer or analyst with an engineering focus
- Demonstrated experience working cross-functionally with engineering, product, and data science teams
- Experience with polling, surveys, and public opinion analysis preferred but not required
- Degree in a quantitative field of study
- Experience using Python and SQL; R helpful as well
- Experience using tools such as Pandas to manipulate data and draw insights from data sets rapidly
- Preferably, experience working as a data engineer/analyst in a cloud-based environment, using AWS (preferred), GCP or Azure
- Ideally experience building small tools or interfaces for less technical users to interact with data
- You’re resourceful
- You deal well with open-ended problems (e.g., figuring out the right questions to ask about a data set) rather than needing to be told exactly what to analyze or how to analyze it
- You consider yourself an excellent “data wrangler”
- You’re driven to learn and master new technologies and techniques
- You bring a collaborative mindset
All your information will be kept confidential according to EEO guidelines.