Asset Details
Evaluating gender bias in large language models in long-term care
by
Rickman, Sam
in
Aged
/ Aged, 80 and over
/ Artificial intelligence
/ Benchmarks
/ Bias
/ Discrimination
/ Documentation
/ Electronic Health Records - statistics & numerical data
/ Female
/ Gender
/ Gender differences
/ Health care disparities
/ Health Informatics
/ Human bias
/ Humans
/ Information Systems and Communication Service
/ Language
/ Large Language Models
/ LLMs
/ Local government
/ Long term health care
/ Long-term care
/ Long-Term Care - statistics & numerical data
/ Long-term care of the sick
/ Male
/ Management of Computing and Information Systems
/ Medicine
/ Medicine & Public Health
/ Men
/ Mental disorders
/ Natural language processing
/ Natural language processing in medical informatics
/ Older people
/ Sentiment analysis
/ Sex discrimination
/ Sexism - statistics & numerical data
/ Social aspects
/ Stereotypes
/ Summaries
/ Women
2025
Journal Article
Overview
Background
Large language models (LLMs) are being used to reduce the administrative burden in long-term care by automatically generating and summarising case notes. However, LLMs can reproduce bias in their training data. This study evaluates gender bias in summaries of long-term care records generated with two state-of-the-art, open-source LLMs released in 2024: Meta's Llama 3 and Google's Gemma.
Methods
Gender-swapped versions of long-term care records for 617 older people from a London local authority were created. Summaries of the male and female versions were generated with Llama 3 and Gemma, as well as with two benchmark models released in 2019: Google's T5 and Meta's BART. Counterfactual bias was quantified through sentiment analysis alongside an evaluation of word frequency and thematic patterns.
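The counterfactual design described here can be illustrated with a short sketch. The snippet below is not the authors' released code (that is in the GitHub repository cited in the conclusion); it assumes a simple regex-based swap table, NLTK's VADER sentiment scorer, and a hypothetical `summarise` callable standing in for whichever LLM is under test.

```python
# Minimal sketch of a counterfactual gender-swap evaluation (illustrative only).
import re
from nltk.sentiment import SentimentIntensityAnalyzer  # needs nltk.download("vader_lexicon")
from scipy.stats import wilcoxon

# Illustrative swap table; real records would also need names, titles and
# grammatical agreement handled.
SWAPS = {
    "she": "he", "he": "she",
    "her": "his", "his": "her", "hers": "his", "him": "her",
    "woman": "man", "man": "woman",
    "mrs": "mr", "ms": "mr", "mr": "mrs",
}

def gender_swap(text: str) -> str:
    """Replace gendered tokens with their counterparts, preserving initial capitals."""
    pattern = re.compile(r"\b(" + "|".join(SWAPS) + r")\b", re.IGNORECASE)
    def repl(match):
        word = match.group(0)
        swapped = SWAPS[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    return pattern.sub(repl, text)

def counterfactual_sentiment_test(summarise, records):
    """Paired test of summary sentiment for original vs. gender-swapped records.

    `summarise` is a hypothetical callable mapping a care record to an
    LLM-generated summary; `records` is a list of case-note strings.
    """
    sia = SentimentIntensityAnalyzer()
    original = [sia.polarity_scores(summarise(r))["compound"] for r in records]
    swapped = [sia.polarity_scores(summarise(gender_swap(r)))["compound"] for r in records]
    # A significant Wilcoxon signed-rank result suggests the model's tone
    # shifts when only the recorded gender changes.
    return wilcoxon(original, swapped)
```

In practice the sentiment scorer itself should also be checked for gender skew before being used as the yardstick.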
Results
The benchmark models exhibited some variation in output on the basis of gender. Llama 3 showed no gender-based differences across any metrics. Gemma displayed the most significant gender-based differences. Male summaries focused more on physical and mental health issues. Language used for men was more direct, with women's needs downplayed more often than men's.
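The word-frequency side of the comparison can be sketched in the same spirit. Again this is an assumption-laden illustration rather than the study's pipeline: it ranks words by a smoothed log-odds ratio between male- and female-version summaries, so terms over-represented for one gender appear at the extremes.

```python
# Illustrative word-frequency comparison between male- and female-version summaries.
import math
from collections import Counter

def log_odds_by_gender(male_summaries, female_summaries, min_count=5):
    """Return {word: log-odds of appearing in male vs. female summaries}."""
    male = Counter(w for s in male_summaries for w in s.lower().split())
    female = Counter(w for s in female_summaries for w in s.lower().split())
    m_total, f_total = sum(male.values()), sum(female.values())
    scores = {}
    for word in set(male) | set(female):
        m, f = male[word], female[word]
        if m + f < min_count:
            continue  # skip rare words that would add noise
        # Add-one smoothing avoids log(0) for words seen on one side only.
        scores[word] = math.log((m + 1) / (m_total + 1)) - math.log((f + 1) / (f_total + 1))
    return scores

# Usage: sorted(scores.items(), key=lambda kv: kv[1]) puts female-skewed
# words first and male-skewed words last.
```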
Conclusion
Care services are allocated on the basis of need. If women's health issues are underemphasised, this may lead to gender-based disparities in service receipt. LLMs may offer substantial benefits in easing administrative burden, but the findings highlight the variation across state-of-the-art LLMs and the need to evaluate bias. The methods in this paper provide a practical framework for the quantitative evaluation of gender bias in LLMs. The code is available on GitHub.
Publisher
BioMed Central, BioMed Central Ltd, Springer Nature B.V., BMC
Subject
/ Bias
/ Electronic Health Records - statistics & numerical data
/ Female
/ Gender
/ Humans
/ Information Systems and Communication Service
/ Language
/ LLMs
/ Long-Term Care - statistics & numerical data
/ Male
/ Management of Computing and Information Systems
/ Medicine
/ Men
/ Natural language processing in medical informatics
/ Sexism - statistics & numerical data
/ Women