It is no secret that the COVID-19 pandemic has touched our lives in countless ways. For health care providers, the burden of providing guidance to those with symptoms has been a particular strain. The crisis prompted researchers at the IU Kelley School of Business to study software applications that could relieve some of that burden on the medical community.
"The primary factor driving user response to screening hotlines -- human or chatbot -- is perceptions of the agent's ability," said Alan Dennis, the John T. Chambers Chair of Internet Systems at Kelley and corresponding author of the paper, "User reactions to COVID-19 screening chatbots from reputable providers." "When ability is the same, users view chatbots no differently or more positively than human agents."
The study aims to help an anxious public get reliable information about COVID-19 while allowing medical providers to focus on treatment. In an online experiment, 371 participants viewed a COVID-19 screening session between a hotline agent, either human or chatbot, and a user with mild to severe symptoms. The researchers then examined how the agents delivered relevant information and how users perceived them.
"Chatbots are scalable, so they can meet an unexpected surge in demand when there is a shortage of qualified human agents," Dennis, Kim and their co-authors wrote, adding that chatbots "can provide round-the-clock service at a low operational cost."
The findings, published in the Journal of the American Medical Informatics Association, indicate that users viewed chatbots no less positively than human agents. This was especially good news for medical organizations struggling to meet the demand for screening services. Even before the pandemic, chatbots were seen as a potential tool for speeding up user access to medical information.
"This positive response may be because users feel more comfortable disclosing information to a chatbot, especially socially undesirable information, because a chatbot makes no judgment," researchers wrote. "The CDC, the World Health Organization, UNICEF and other health organizations caution that the COVID-19 outbreak has provoked social stigma and discriminatory behaviors against people of certain ethnic backgrounds, as well as those perceived to have been in contact with the virus. This is truly an unfortunate situation, and perhaps chatbots can assist those who are hesitant to seek help because of the stigma."
"Proactively informing users of the chatbot's ability is important," the authors wrote. "Users need to understand that chatbots use the same up-to-date knowledge base and follow the same set of screening protocols as human agents. ... Because trust in the provider strongly influences perceptions of ability, building on the organization's reputation may also prove useful."
Source: Science Daily