Who we are:
We are a group of researchers and engineers working to help Twitter improve how we apply machine learning in a range of impactful systems, such as recommendations, safety, abuse detection, content understanding, and advertising. We investigate these systems at scale with the goal of anticipating, discovering, and mitigating any harmful impact they might have on our global community.
We believe in the power of bringing diverse perspectives together. Our team operates at the intersection of machine learning, the social sciences, policy, law, and user research, in collaboration with numerous partners from across Twitter.
What you will do:
You will apply your research expertise to help understand the implications of automated decision systems, as well as related societal and representational harms. You will work with the team to conceptualize difficult problems, devise measurement and audit methodologies, work toward effective interventions, and propose more inclusive and fair alternatives to existing practices.
You’ll partner with other META leaders (EM, PM, Sr Engineers) and work with our product teams, researchers, and engineers to build a high-value roadmap of innovation for your organization. You will lead by example to build your team culture in keeping with Twitter’s culture. You will partner with Engineering Managers in multiple organizations across Twitter and guide engineers with your technical expertise by contributing to design reviews and ideation. You will help mentor, coach, and grow people.
You will lead research projects to enable Twitter to better apply machine learning on its platform in a way that benefits all our customers and society at large. You will be contributing to strategic decisions and future roadmaps for products and technologies at Twitter.
Who you are:
You have broad knowledge of machine learning within the larger societal context, as well as deep domain expertise in a relevant application area. You have a track record of impactful publications in top-tier venues. You are passionate about applying your skills to difficult problems and driving real-world impact. You are a recognized expert in your field, and your colleagues seek your advice and mentorship. You have an established reputation in the research community and have presented at conferences and workshops. You proactively identify problems as well as opportunities, formulate machine learning problems, and drive research to address them. You communicate effectively with your partners and have the experience to guide and mentor others.
A postgraduate degree or PhD in a relevant field, including (but not limited to) Computational Social Science, Economics, Political Science, Public Policy, Sociology, or Computer Science
3+ years of machine learning research experience, broadly construed, including several years of research experience in ML fairness
1–3 years of independent research experience, e.g., as university faculty or as a senior researcher in industry
Ability to mentor and lead others
Excellent communication skills, both with technical and non-technical audiences
Strong theoretical grounding in core machine learning concepts and techniques
Proven ability to translate research into practical outcomes
Evidence of independence, originality and creativity in research
Excellent publication record in top conferences in the ML fairness or related fields
Recognition in the research community, as evidenced by, e.g., invited keynote talks, program committee service, or editorial board membership
Strong proficiency in Python, R, or SQL
Nice to haves:
Experience as an academic faculty member
Experience in an industrial research lab or a startup
Experience with student supervision (ideally with graduates now working successfully in industry or academia)
Experience with large-scale systems and data, e.g., Hadoop or other distributed systems
Twitter is what’s happening and what people are talking about right now. For us, life's not about a job, it's about purpose. We believe real change starts with conversation. Here, your voice matters. Come as you are and together we'll do what's right (not what's easy) to serve the public conversation.