Do chatbot avatars prompt bias in health care?

June 6, 2023

Credit: Pixabay/CC0 Public Domain

Chatbots are increasingly becoming a part of health care around the world, but do they encourage bias? That’s what University of Colorado School of Medicine researchers are asking as they dig into patients’ experiences with the artificial intelligence (AI) programs that simulate conversation.

“Sometimes overlooked is what a chatbot looks like—its avatar,” the researchers write in a new paper published in Annals of Internal Medicine. “Current chatbot avatars vary from faceless health system logos to cartoon characters or human-like caricatures. Chatbots could one day be digitized versions of a patient’s physician, with that physician’s likeness and voice. Far from an innocuous design decision, chatbot avatars raise novel ethical questions about nudging and bias.”

The paper, titled “More than just a pretty face? Nudging and bias in chatbots”, challenges researchers and health care professionals to closely examine chatbots through a health equity lens and investigate whether the technology truly improves patient outcomes.

In 2021, the Greenwall Foundation awarded funds to Matthew DeCamp, MD, Ph.D., associate professor in the CU Division of General Internal Medicine, and his team of researchers in the CU School of Medicine to investigate ethical questions surrounding chatbots. The research team also included internal medicine professor Annie Moore, MD, MBA, the Joyce and Dick Brown Endowed Professor in Compassion in the Patient Experience; incoming medical student Marlee Akerson; and UCHealth Experience and Innovation Manager Matt Andazola.

“If chatbots are patients’ so-called ‘first touch’ with the health care system, we really need to understand how they experience them and what the effects could be on trust and compassion,” Moore says.

So far, the team has surveyed more than 300 people and interviewed 30 others about their interactions with health care-related chatbots. For Akerson, who led the survey efforts, it’s been her first experience with bioethics research.

“I am thrilled that I had the chance to work at the Center for Bioethics and Humanities, and even more thrilled that I can continue this while a medical student here at CU,” she says.

The face of health care

The researchers observed that chatbots became especially common during the COVID-19 pandemic.

“Many health systems created chatbots as symptom-checkers,” DeCamp explains. “You can go online and type in symptoms such as cough and fever and it would tell you what to do. As a result, we became interested in the ethics around the broader use of this technology.”
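
As a rough illustration of the kind of symptom-checker DeCamp describes, a basic version can be little more than a lookup from reported symptoms to triage advice. The sketch below is hypothetical: the rules, symptom names, and advice strings are invented for illustration, not drawn from any health system’s tool, and are not clinical guidance.

# Minimal, hypothetical rule-based symptom checker (illustration only).
TRIAGE_RULES = [
    # (required symptoms, advice) -- invented examples, not clinical guidance
    ({"cough", "fever", "shortness of breath"}, "Seek in-person care promptly."),
    ({"cough", "fever"}, "Consider a telehealth visit and monitor symptoms."),
    ({"cough"}, "Self-care at home; contact a clinician if symptoms worsen."),
]

def triage(reported_symptoms):
    """Return advice for the first rule whose symptoms are all reported."""
    reported = {s.strip().lower() for s in reported_symptoms}
    for required, advice in TRIAGE_RULES:
        if required <= reported:  # subset check: every required symptom was reported
            return advice
    return "No matching rule; contact your health system directly."

print(triage(["Cough", "Fever"]))  # -> "Consider a telehealth visit and monitor symptoms."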

Oftentimes, DeCamp says, chatbot avatars are thought of as a marketing tool, but their appearance can have a much deeper meaning.

“One of the things we noticed early on was this question of how people perceive the race or ethnicity of the chatbot and what effect that might have on their experience,” he says. “It could be that you share more with the chatbot if you perceive the chatbot to be the same race as you.”

For DeCamp and his team, this prompted many ethical questions, such as how health care systems should design chatbots and whether a design decision could unintentionally manipulate patients.

“There does seem to be evidence that people may share more information with chatbots than they do with humans, and that’s where the ethics tension comes in: We can manipulate avatars to make the chatbot more effective, but should we? Does it cross a line around overly influencing a person’s health decisions?” DeCamp says.

A chatbot’s avatar might also reinforce social stereotypes. Chatbots that exhibit feminine features, for example, may reinforce biases about women’s roles in health care.

On the other hand, an avatar may also increase trust among some patient groups, especially those that have been historically underserved and underrepresented in health care, if those patients are able to choose the avatar they interact with.

“That’s more demonstrative of respect,” DeCamp explains. “And that’s good because it creates more trust and more engagement. That person now feels like the health system cared more about them.”

Marketing or nudging?

While there is currently little evidence, an emerging hypothesis holds that a chatbot’s perceived race or ethnicity can affect patient disclosure, experience, and willingness to follow health care recommendations.

“This is not surprising,” the CU researchers write in the Annals paper. “Decades of research highlight how patient-physician concordance according to gender, race, or ethnicity in traditional, face-to-face care supports health care quality, patient trust, and satisfaction. Patient-chatbot concordance may be next.”

That’s enough reason to scrutinize the avatars as “nudges,” they say. Nudges are typically defined as low-cost changes in a design that influence behavior without limiting choice. Just as a cafeteria putting fruit near the entrance might “nudge” patrons to pick up a healthier option first, a chatbot could have a similar effect.

“A patient’s choice can’t actually be restricted,” DeCamp emphasizes. “And the information presented must be accurate. It wouldn’t be a nudge if you presented misleading information.”

In that way, the avatar can make a difference in the health care setting, even if the nudges aren’t harmful.

DeCamp and his team urge the medical community to use chatbots to promote health equity and to recognize the implications these artificial intelligence tools may have, so that they can best serve patients.

“Addressing biases in chatbots will do more than help their performance,” the researchers write. “If and when chatbots become a first touch for many patients’ health care, intentional design can promote greater trust in clinicians and health systems broadly.”

More information:
Marlee Akerson et al, More Than Just a Pretty Face? Nudging and Bias in Chatbots, Annals of Internal Medicine (2023). DOI: 10.7326/M23-0877

Provided by
CU Anschutz Medical Campus


Citation:
Do chatbot avatars prompt bias in health care? (2023, June 6)
retrieved 6 June 2023
from https://medicalxpress.com/news/2023-06-chatbot-avatars-prompt-bias-health.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.
