Our artificial intelligence (AI) project

We’re building a language processor that will overcome barriers between help-seekers and legal help providers in Australia.

Why we’re building AI-based legal technology

Research shows that when people search for legal help online, they often struggle to correctly articulate their legal problem. This poses a major challenge when connecting them with the right information and services, and helping them access justice.

This issue is growing as demand for legal help increases each year, particularly as a result of the impacts of COVID-19.

Our AI project aims to make it easier to connect people with legal help. It also extends beyond Justice Connect’s own services, strengthening the sector’s ability to develop information and connect people with the right support.

Stretched funding is placing the community legal sector under pressure, making it harder for lawyers to provide efficient and effective legal help when every single legal problem must be identified manually.

Investing in tools and resources is wasted unless we build them to reach and serve the people who need them.

Justice Connect, alongside our funding partners and AI experts, is tackling this challenge.

Instead of expecting people to learn technical legal language to explain their problem, we are building natural language processing AI that assists with the accurate diagnosis of legal issues and empowers people to find the right legal services quickly and easily.

The result of this project will be an AI-driven tool that can understand people’s everyday language and correctly diagnose their legal problem. Our hope is to share this technology at no cost with other legal service organisations across Australia, cutting down the time it takes to triage legal enquiries while also serving as an additional assistance tool for volunteers and lawyers alike.  
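To make the idea of “understanding everyday language” concrete, the sketch below shows the simplest form such a tool can take: a text classifier that maps a free-text description to a broad legal problem area. This is a minimal illustration only, using scikit-learn with hypothetical example sentences and category labels, and is not a description of the actual model being built for this project.

```python
# Minimal text-classification sketch (hypothetical data and labels).
# A real legal triage model would be trained on thousands of annotated samples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Everyday-language descriptions paired with legal problem areas (illustrative only).
train_texts = [
    "my landlord says I have to move out by Friday",
    "my boss hasn't paid me for the last three weeks",
    "I got a letter saying I owe money I can't pay back",
    "the real estate agent is keeping my bond",
]
train_labels = ["housing", "employment", "debt", "housing"]

# TF-IDF features plus logistic regression is a simple, transparent baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# The trained model can then suggest a problem area for a new enquiry.
print(model.predict(["I'm being evicted and don't know what to do"]))  # e.g. ['housing']
```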

We hope by building and sharing this technology, our sector will be better placed to understand and meet people’s legal needs.

How we’re doing it: inclusive design

Most AI-driven text classification models are problematically biased, performing substantially worse for under-represented or socially disadvantaged communities. Often, language processors are built using samples only from majority groups. Our project has been intentionally designed to address potential issues experienced by people from marginalised community groups by capturing the voices of different groups across the country.
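One common way to surface this kind of bias is to report model performance separately for each community group rather than as a single overall score. The sketch below assumes a trained model with a scikit-learn-style predict method and test samples tagged with a hypothetical group field; it illustrates the check rather than reproducing our actual evaluation.

```python
# Sketch of slice-based evaluation: report accuracy per community group,
# not just overall, so under-performing groups are visible (hypothetical data).
from collections import defaultdict

def accuracy_by_group(model, samples):
    """samples: list of dicts with 'text', 'label', and 'group' keys."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for s in samples:
        predicted = model.predict([s["text"]])[0]
        total[s["group"]] += 1
        if predicted == s["label"]:
            correct[s["group"]] += 1
    return {group: correct[group] / total[group] for group in total}

# A large gap between groups signals the model needs more (or better) samples
# from the lower-scoring group before it should be relied on.
```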

This project was designed in response to the findings from a range of evaluations we have undertaken internally and externally with our sector peers.

We have actively incorporated recent ethical AI and inclusive technology best practice principles released by the Australian Human Rights Commission. These principles focus on eliminating bias in decision-making AI algorithms and ensuring that AI respects human rights by design.

While our model is a natural language processing classifier rather than a decision-making algorithm, the risks identified by the Human Rights Commission and the recommended approaches to ameliorate those risks are equally relevant to our project.

We’re working with different communities to train our model

Building an unbiased AI model takes more effort, but it is particularly important for the legal service sector. We know that the people who most need legal help are also those at greatest risk from biased digital systems that don’t perform as well for them.

We are actively seeking natural language samples from a diverse range of people including:

  • Older people
  • People with disability
  • People with mental health issues and chronic illnesses
  • People with HIV
  • First Nations people
  • People without tertiary qualifications
  • People from culturally and linguistically diverse communities
  • LGBTQIA+ people
  • People who have recently migrated to Australia

Contribute to our project

Contribute as an organisation

If you would like to support our project by contributing an unedited sample of your client describing a legal problem they’ve experienced in their own words, please use this simple form to upload a text or sound file. Please upload the language sample(s) in English.

Any data shared with Justice Connect will only be used for the purpose of training our AI language processor tool. All data is stored securely and is not shared with any parties outside the project. We will de-identify and anonymise every language sample before it is legally annotated. Our upload form does not ask which organisation you work for, so samples cannot be traced back to your organisation and your anonymity will be protected.

Read our privacy policy at justiceconnect.org.au/privacy.

Our Not-for-profit Law program has free legal resources on privacy law for charities and community organisations at nfplaw.org.au/privacy.

Upload a sample

  Please note that these samples do not qualify as an application for legal help. To make a referral for someone to access legal help, please make an application.

Spread the word

If you would like to support our project by helping us spread the word, you can:

  • Download a poster to print and display in your community or workplace.
    Justice Connect’s artificial intelligence (AI) project poster (March 2022) – Download PDF (2 MB)
  • Download a graphic to share on social media.
    Justice Connect’s artificial intelligence (AI) project social media graphic (March 2022) – Download PNG (467 KB)

The most effective language samples are true and unedited transcripts. This research will help us understand the way people from diverse backgrounds use syntax, grammar, shorthand, and slang to describe their problems.

Illustration showing a range of language samples of people explaining their legal issue in their own language.

Pro bono powered AI

Across 2020 and early 2021, we worked to produce a proof-of-concept natural language processing AI model using annotated language samples from legal help seekers.

Having launched an online intake system in 2018, we now have thousands of samples of natural language text describing legal problems.

A key challenge in natural language-based AI projects is generating a training data set. In our project, we need legal professionals to annotate language samples – a potentially expensive exercise. However, we’ve partnered with pro bono lawyers to achieve expert annotation at huge volumes.

Our in-house Digital Innovation team has built a Training AI Game (TAG) that presents language samples to participating pro bono lawyers and asks them to annotate each sample in several ways, with each sample annotated multiple times to ensure accuracy. The samples can then be exported in an annotated format and provided to the University of Melbourne team helping to train our AI model.
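To illustrate why each sample is annotated multiple times, the sketch below shows one generic way repeated annotations can be reconciled into a single training label: keep the majority label, and flag samples where annotators disagree for further review. The data, threshold, and function here are hypothetical and are not the actual TAG export logic.

```python
# Sketch: combine multiple annotations per sample into one training label
# by majority vote, flagging disagreements for human review (hypothetical data).
from collections import Counter

def consolidate(annotations, min_agreement=0.6):
    """annotations: dict mapping sample_id -> list of labels from different lawyers."""
    labels, needs_review = {}, []
    for sample_id, votes in annotations.items():
        label, count = Counter(votes).most_common(1)[0]
        if count / len(votes) >= min_agreement:
            labels[sample_id] = label
        else:
            needs_review.append(sample_id)
    return labels, needs_review

annotations = {
    "sample-1": ["housing", "housing", "debt"],
    "sample-2": ["employment", "fines", "debt"],
}
labels, needs_review = consolidate(annotations)
print(labels)        # {'sample-1': 'housing'}
print(needs_review)  # ['sample-2']
```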

By September 2021, we had onboarded 245 lawyers who collectively made over 90,000 annotations to 9,000+ natural language samples uploaded to our TAG tool.

Photograph showing two people looking at a computer screen that shows our in-house Training AI Game (TAG)

Our research partners

To deliver this ambitious project, we’ve partnered with Australian academics at the University of Melbourne School of Computing and Information Systems who specialise in artificial intelligence. Professor Tim Baldwin, Director of the ARC Centre in Cognitive Computing for Medical Technologies, has been working closely with us to plan and build the AI model.

Read more about our approach to digital innovation