Sunday, February 17, 2019

Health literacy is in the cracks


Here's an example of a fairly simple sentence that isn't so simple unless you have the needed, underlying health literacy - the health concept behind it.



If you have been exposed to measles and feel ill, stay home to help prevent the spread of the disease. Call your healthcare provider to ask about testing and advice. If you don’t have a healthcare provider, call your local health department. If you need help with getting access to health care, call the Family Health Hotline at 1-800-322-2588.


Let’s focus on the seemingly easy-to-read sentence:
If you have been exposed to measles and feel ill, stay home to help prevent the spread of the disease.


When you unpack the sentence you get:

1. You have been exposed to measles.
2. You feel ill.
3. Stay home.
4. Staying home will help prevent the disease from spreading.

This text (sentence/message) assumes a specific health literacy that the reader may not have. The text assumes the reader knows the relationship between staying home and spreading a disease. The needed health literacy is found in the cracks - the small connective tissue (phrases & clauses) of the sentence. In other words, the reader has to know that measles is contagious and that if you go to a provider's waiting room (doctor, emergency room, clinic), you can spread the disease.

Try This 
     



If you have been exposed to measles and feel ill, stay home and call your provider. If you go directly to your provider, you can spread measles in the waiting room. Stay home and call your provider, and you can help stop the spread of measles.






To watch a short educational video on "unpacking sentences," visit my website, Healthliteracylab.com, for a library of free lessons:

"Unpacking Sentences" 
http://healthliteracylab.com/healthliteracy/lesson/unpacking-sentences-1/

Friday, February 15, 2019

Explaining Measles: Language Acting Badly














This is an image of the Clark County, Washington, Public Health website: https://www.clark.wa.gov/public-health/measles-investigation














Language Acting Badly 




Clark County Public Health is urging anyone who has been exposed and believes they have symptoms of measles to call their health care provider prior to visiting the medical office to make a plan that avoids exposing others in the waiting room. People who believe they have symptoms of measles should not go directly to medical offices, urgent care centers or emergency departments (unless experiencing a medical emergency) without calling in advance.




Language diagnosis:

          
Unnecessarily long, complex sentence with multiply embedded clauses and phrases:

     who has been exposed
     and believes...
     prior to visiting...
     to make a plan...
     that avoids...

Here are the individual statements/propositions of this complex sentence:

1. Clark County is urging you to do x (something).
2. You are someone who has measles symptoms.
3. You are someone who thinks you have been exposed to measles.
4. You should call your provider before you go to the provider.
5. You could expose other people to measles.
6. The people are in the waiting room.
7. The provider will give you a plan to avoid exposing others to measles.

Seven statements are way too many to be crunched into one sentence if you're writing for clarity and ease of reading.




How About This 




If you have symptoms of measles, or if you think you were exposed to measles, call your health provider first. Do not go to the provider until you call, because measles can spread from person to person. When you call the provider, they will tell you the best way to keep other patients safe in the waiting room.
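If you like a quick, rough check on a rewrite like this, a few lines of Python can compare how many words each version packs into a sentence. This is just a minimal sketch (the text strings are copied from the Clark County passage and the rewrite above); it only splits on end punctuation and counts words, so it ignores clause depth and vocabulary, which matter just as much for readability.

```python
# Rough readability check: compare words per sentence in the original
# Clark County sentence and the plain-language rewrite above.
# A minimal sketch - real readability tools also weigh syllables,
# embedded clauses, and vocabulary, which this does not attempt.

import re

original = (
    "Clark County Public Health is urging anyone who has been exposed and "
    "believes they have symptoms of measles to call their health care provider "
    "prior to visiting the medical office to make a plan that avoids exposing "
    "others in the waiting room."
)

rewrite = (
    "If you have symptoms of measles, or if you think you were exposed to "
    "measles, call your health provider first. Do not go to the provider until "
    "you call, because measles can spread from person to person. When you call "
    "the provider, they will tell you the best way to keep other patients safe "
    "in the waiting room."
)

def words_per_sentence(text):
    # Split on sentence-ending punctuation; crude, but good enough here.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return [len(s.split()) for s in sentences]

for label, text in [("original", original), ("rewrite", rewrite)]:
    counts = words_per_sentence(text)
    avg = sum(counts) / len(counts)
    print(f"{label}: {len(counts)} sentence(s), {counts} words each, "
          f"average {avg:.0f} words per sentence")
```

The expected shape of the output is the point of the exercise: the original packs roughly forty words into a single sentence, while the rewrite spreads the same propositions over three much shorter ones.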

  

        


Thursday, February 14, 2019

Hilton checkout screen - Say What?

I like Hilton as far as big chain hotels go. Good beds, clean rooms, helpful staff.  Good signage, nice pool and bar.

But what were they thinking when they posted these checkout instructions on my room TV!



“Upon” sounded better in the 18th century. “When I gaze upon your face...”
“Retain” works best in the legal world. “Retain this document for your records.”

How about -
When you check out, keep your room key...

Sunday, January 20, 2019

Immoral, Obscene ALEXA



I was really struck by the hypocrisy and bigotry...and sexism in the handling of the Osé, a product that debuted at CES, the consumer technology show in Las Vegas, last week. The product - a sex toy for women - was stripped of an award at CES. Officials damned it as “immoral, obscene, indecent, profane or not in keeping with CTA’s image.” The simple reason: it's made for women and their pleasure.


I find this blatant anti-women action even more absurd given that women - their voices, their bodies, their sensuality, their "femaleness" - or at least the traditional tropes for women - are used ubiquitously to sell AI: Siri, Alexa, Hanson Robotics' robot Sophia... The high-tech industry today is using femaleness, but it is certainly not feminist in any way.

So, here's an update of a guest post by Roberta Duarte that appeared earlier on Public Linguist.

--------------------


In the past few years we've been experiencing a huge surge in conversational artificial intelligence assistants. Whether it's Apple's Siri, Amazon's Alexa, or Microsoft's Cortana, the ability of AI machines to interact with their users is becoming an ordinary and sometimes integral part of everyday life.

Through voice queries and natural language aptitude, they answer questions, make recommendations, and perform actions for us. These computerized personalities can seem almost human. The more human we make them, the more important it seems that we give them names, personalities and, more worryingly, gender.


But robotic assistants don't really have a gender. Strip them of the names and voices added by their human creators, and there's nothing there that requires them to be ‘he’ or ‘she’, other than our own assumptions. Yet many indeed do have names, and a disproportionate number of them are sold to the public as ‘female’.







Sure, you can change Siri to a different gender and even a different accent. But presently AI assistants seem to default to a female persona. 

To a certain extent, we humans are led by our assumptions and biased truths. As we move into a new age of automation, the technology being created says an uncomfortable amount about the way society understands both women and work.



Assigning gender to these AI personalities may be saying something about the roles we expect them to play. Virtual assistants like Siri, Cortana and Alexa perform functions historically assigned to women.




Society has long relegated the female role to administrative, accommodating and aid-lending positions. Assistant and secretary positions are especially stratified as female. With its roots in the early 20th century, the employment of secretaries quickly became women's work as companies realized they could pay women lower wages.






In fact, the preponderance of the work anticipated to one day be carried out by robots is currently undertaken by women and girls, for low pay or no pay at all. A report by the UK Office for National Statistics (ONS) quantifies the annual value of the “home production economy” - the housework, childcare and organizational chores done largely by women - at £1 trillion, almost 60% of the “official” economy. From nurses, secretaries, and sex workers to wives and girlfriends, the emotional labor that keeps society running is still feminized - and still stigmatized.

It is no mistake that the face of AI is female. Always ready and predisposed, these technologies take on a distilled and idealized femininity. Your AI is always working, always available, always ready at any minute to provide assistance with a positive attitude. The gendering of AI is purposely linked to our culture's underlying sexism. Customers interpret these AI personalities through the lens of their own biases. Stereotypes about women in service roles make female AIs easier to accept, which is the ultimate goal for tech companies that want to make AI mainstream.

The fast-approaching world of competent, faithful automated assistants is sadly all too susceptible to our long-standing, faulty presumptions about the female role in society. Right now, as we expect advancing AI technology to serve our everyday personal and organizational needs, we need to be conscious of our outstanding biases. It is imperative to question why we feel the need to gender this innovative tool.




Consider the artificially intelligent voices you hear on a regular basis. Your personal assistant device should be helpful, compliant, and do as you say.
But should it also be female?

Are technology companies catering to our desire for robotic assistants with personality, or are they reinforcing our biases about gender, and the roles that women play?