Data Scientist II - Algorithmic Justice Specialist

This job is no longer open

ABOUT THE JOB

The ACLU seeks applicants for the full-time position of Data Scientist II - Algorithmic Justice Specialist in the Analytics Department of the ACLU’s National office in New York, NY / Remote*.

ACLU Analytics partners with teams across the organization to enable the ACLU to make smart, evidence-based decisions and bring quantitative insights on our issues to the courtroom and the public. Our team's work ranges from social science research for litigation & advocacy, to analysis & reporting for fundraising and engagement, to building and maintaining our data infrastructure. We strive to ensure the ACLU leads by example in the ethical use of data and technology. This includes maintaining our privacy and security standards, pushing for transparent data practices from government and corporate actors, and helping to steward high standards for algorithmic fairness, accountability, and transparency.

Reporting to the Director, Legal Analytics & Quantitative Research, the Data Scientist II - Algorithmic Justice Specialist will be part of the Legal Analytics & Quantitative Research team and will work with team members across Analytics, as well as stakeholders on our Legal and Policy teams and at ACLU affiliates. You will also collaborate closely with other Data Analysts, Data Scientists, and Data Journalists on the Analytics team.

* Note: this position can function remotely from a U.S. location on an interim basis and may be eligible for permanent remote work from a location within the U.S.

RESPONSIBILITIES

Below is a sampling of projects you can expect to dive into:

  • Evaluate algorithms and predictive models using a range of techniques to assess their impact on marginalized communities, especially but not limited to disparities based on race, gender, sexuality, disability status, immigration status, income, and housing status
  • Analyze disparities in policing, housing, criminal law enforcement, employment, and education, sometimes in contexts where algorithms are used or considered, and sometimes in contexts where they are not
  • Review local, state and federal policies and legislation related to data, algorithms, and predictive analytics, and provide recommendations from a technical lens on how to make those policies more fair, transparent, and accountable
  • Submit FOIA requests to obtain data and models from jurisdictions using predictive analytics for further analysis
  • Build models that "flip the script" on decision-makers to evaluate judges, prosecutors, and other players to identify bias in the criminal justice and immigration systems
  • Identify new areas of opportunity for algorithmic justice work
  • Engage in special projects and other duties as assigned
  • Center principles of equity, inclusion, and belonging in all work, embedding the values in program development, policy application, and organizational practices and processes

EXPERIENCE & QUALIFICATIONS

  • Extensive experience conducting descriptive analysis and using or evaluating predictive models in a research or applied setting
  • Experience evaluating the ethics of algorithm scoping, development, testing and/or deployment in a research or applied setting
  • Mastery of R or Python, and the ability to rapidly learn new frameworks and methodologies
  • Ability to distill complex quantitative results and methodologies into plain language for lawyers, advocates, judges and the public
  • Familiarity with modern methods for assessing fairness in predictive models (one illustrative group-fairness sketch follows this list)
  • Experience acting as a technical mentor or guide to junior colleagues and interns (e.g., acting as primary code reviewer in a team setting)
  • Commitment to the mission of the ACLU
  • Demonstrated commitment to diversity within the office, using a personal approach that values all individuals and respects differences with regard to race, ethnicity, age, gender identity and expression, sexual orientation, religion, disability, and socio-economic circumstance
  • Commitment to work collaboratively and respectfully toward resolving obstacles and/or conflicts
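
For illustration only, and not a description of the ACLU's actual tooling or methodology, the hypothetical Python sketch below shows two common group-fairness checks for a binary classifier: the demographic parity difference (the gap in positive-prediction rates across groups) and the equal opportunity difference (the gap in true-positive rates). All data, column names, and numbers here are invented for the example.

```python
# Minimal sketch of two group-fairness metrics for a binary classifier.
# Hypothetical example only; data and group labels are simulated.
import numpy as np
import pandas as pd


def demographic_parity_difference(y_pred: pd.Series, group: pd.Series) -> float:
    """Gap in positive-prediction rates between the highest- and lowest-rate groups."""
    rates = pd.Series(y_pred).groupby(group).mean()
    return float(rates.max() - rates.min())


def equal_opportunity_difference(y_true, y_pred, group) -> float:
    """Gap in true-positive rates (recall) between groups,
    computed only over individuals whose true label is positive."""
    df = pd.DataFrame({"y_true": y_true, "y_pred": y_pred, "group": group})
    positives = df[df["y_true"] == 1]
    tpr = positives.groupby("group")["y_pred"].mean()
    return float(tpr.max() - tpr.min())


if __name__ == "__main__":
    # Simulated scored data: model predictions plus a protected attribute.
    rng = np.random.default_rng(0)
    n = 1_000
    group = rng.choice(["A", "B"], size=n)
    y_true = rng.integers(0, 2, size=n)
    # Simulate a model that flags members of group "B" slightly more often.
    flag_prob = 0.3 + 0.1 * (group == "B")
    y_pred = (rng.random(n) < flag_prob).astype(int)

    print("Demographic parity difference:",
          round(demographic_parity_difference(pd.Series(y_pred), pd.Series(group)), 3))
    print("Equal opportunity difference:",
          round(equal_opportunity_difference(y_true, y_pred, group), 3))
```

Which metric is appropriate depends on the decision context; these are only two of many ways fairness in predictive systems can be assessed.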

PREFERRED QUALIFICATIONS

  • Knowledge of concepts such as AI/ML auditing, algorithmic impact assessments, explainability, transparency, recourse, robustness, human-AI interaction, and/or other topics in the AI ethics domain
  • Experience with algorithmic decision-making systems or predictive analytics in a governmental or industry setting
  • Experience conducting causal inference using natural experiments, administrative data, or other observational methods
  • Experience with criminal justice, immigration, or other public sector data

COMPENSATION

The annual salary for this position is $115,638 (Level F). This salary reflects a position based in New York, NY. If authorization is granted to work outside the location listed in this posting, the salary will be subject to a locality adjustment based on the city and state of the work location. Note that most of the salaries listed on our job postings reflect New York, NY salaries, where our National offices are headquartered.

ABOUT THE ACLU

The ACLU dares to create a more perfect union – beyond one person, party, or side. Our mission is to realize this promise of the United States Constitution for all and expand the reach of its guarantees.

For over 100 years, the ACLU has worked to defend and preserve the individual rights and liberties guaranteed by the Constitution and laws of the United States. Whether it’s ending mass incarceration, achieving full equality for the LGBTQ+ community, establishing new privacy protections for our digital age, or preserving the right to vote or the right to have an abortion, the ACLU takes up the toughest civil liberties cases and issues to defend all people from government abuse and overreach.

Equity, diversity, and inclusion are core values of the ACLU and central to our work to advance liberty, equality, and justice for all. We are a community committed to learning and growth, humility and grace, transparency and accountability. We believe in a collective responsibility to create a culture of belonging for all people within our organization – one that respects and embraces difference; treats everyone equitably; and empowers our colleagues to do the best work possible. We are as committed to anti-oppression and anti-racism internally as we are externally. Because whether we’re in the courts or in the office, we believe ‘We the People’ means all of us.

The ACLU is an equal opportunity employer. We value a diverse workforce and an inclusive culture. The ACLU encourages applications from all qualified individuals without regard to race, color, religion, gender, sexual orientation, gender identity or expression, age, national origin, marital status, citizenship, disability, veteran status and record of arrest or conviction, or any other characteristic protected by applicable law. Black people, Indigenous people, people of color; lesbian, gay, bisexual, transgender, queer, and intersex people; women; people with disabilities, protected veterans, and formerly incarcerated individuals are all strongly encouraged to apply.

The ACLU makes every effort to assure that its recruitment and employment provide all qualified persons, including persons with disabilities, with full opportunities for employment in all positions.

The ACLU is committed to providing reasonable accommodation to individuals with disabilities. If you are a qualified individual with a disability and need assistance applying online, please email benefits.hrdept@aclu.org. If you are selected for an interview, you will receive additional information regarding how to request an accommodation for the interview process.

The Department of Education has determined that employment in this position at the ACLU does not qualify for the Public Service Loan Forgiveness Program.
