I attended an online talk this morning about how AI is going to transform health care.
Oh good!
I was sure I'd hear more about how ChatGPT, Claude, and the rest of the consumer-facing LLM apps can help patients access and use health information quickly.
At the start, the speakers even named my fav topic - Chatbots!
But instead, there was plenty of talk about:
- Workflow management
- Data governance
- Regulations on health data analytics
- Process improvement in implementing new technology
- New organizational mindsets
Not one word about Chatbots - how readable and usable the information is for consumers.
Example: Yesterday I was reading about the new Covid vaccine and posed this question to ChatGPT and then Claude.
Chris to ChatGPT
"How do I know this new Covid vaccine will not create mutations in my own genes?"
(Below is an excerpt of ChatGPT's much longer response.)
It’s understandable to have concerns about vaccines and how they might affect your body. Here’s a simple explanation: COVID-19 vaccines, including the ones updated for new variants, do not change or interact with your own genes. Here’s how they work:
1. Vaccine Mechanism: Most COVID-19 vaccines (like the mRNA vaccines) teach your immune system how to recognize and fight the virus. They deliver a small piece of information (like the virus's spike protein) to your cells, which helps your body learn to respond if it encounters the actual virus.
2. No Gene Editing: The mRNA or other components in the vaccines do not enter your DNA. They don’t change your genetic material. After the vaccine does its job by teaching your immune system, the mRNA is broken down and eliminated by your body.
Then I asked Claude (Anthropic), got a very complicated response, and asked to have it simplified (my highlighting of the analogy Claude used).
...
Full disclosure - I've been spending more time than I should looking into how Chatbots might play a helpful role in communicating accurate, understandable information to the majority of the public who have difficulty reading and using health and science information.
What I've found:
- They use medical/scientific/complex words but do not consistently anticipate that those words need to be explained, repeated, and reinforced.
- Sentence structures are often complex, without support for the reader.
- They use list-like chunks of sentences with no cohesion.
- They resort to analogies that are usually not appropriate for adults.
Chatbots have some evolving to do.