Monday, February 12, 2024

Pfizer's Super Bowl Ad: Pedagogy of the Elite & "Exceptional"

 

It’s one of my favorite Queen songs – Don’t Stop Me Now.


And for a moment, I was back then, in the '80s, listening to the percolating power in Freddie Mercury's plea: "I'm a shooting star leaping through the sky / Like a tiger defying the laws of gravity." Not as tame as Wicked's "Defying Gravity." Back then, tame didn't cut it.

But then, there's this cadaver lying on the table, bone white, eyes popping open as if he's suddenly not really dead. And his grotesque mouth moving while all these stunned pilgrims are looking on.

And these paintings of these old white men on the wall, and their mouths are moving too.  

Oh, I get it. They're Freddie Mercury wannabes.

No. It's the pharmaceutical giant Pfizer's idea of what a high-profile message about the power of "science" and the history of science should look like.

Using an elitist version of "How many famous Western scientists can you find in the picture?", Pfizer employs a host of classical, romantic, and often masculine tropes to make its point in this ad, privileging the likes of Hippocrates, Galileo, and Copernicus; Pfizer's founders Charles Pfizer and Charles Erhart; Albert Einstein; Rosalind Franklin; and Katalin Karikó, with Rembrandt's Anatomy Lesson of Dr. Nicolaes Tulp thrown in for a bit more gravitas.

I can imagine the creative team conceiving the ad:

Creative 1: We need some star power.
Creative 2: Beyoncé?
Creative 1: No, Copernicus (motioning to the stars).
Creative 3: Cool, and how about Galileo?
Creative 4: And some haunting work of art – you know, that autopsy painting. Rembrandt?
Creative 5: And scientists, when they wore those wigs.
Creative 2: Well, if not Beyoncé, we definitely need women.
Creative 2: And books, lots of beautiful books, in a beautiful library – like the Peabody. I love that place.
Creative 1: Yes. Powerful.


Historically, Americans' trust in science has been fraught. A longstanding populist (some would say healthy) suspicion of elites and experts runs deep in American political culture. The Pew Research Center's look at Americans' trust in science is sobering: overall, 57% of Americans say science has had a mostly positive effect on society, down 16 points since before the start of the coronavirus outbreak.

As trust in scientists has fallen, distrust has grown: roughly a quarter of all Americans (27%, up from 12% in 2020) say they have "not too much" or no confidence in scientists to act in the public's best interest.

Nearly 4 in 10 Republicans (38%) now say they have not too much or no confidence at all in scientists to act in the public's best interests, and less than half of them (47%) see science as having had a mostly positive effect on society. (For a richer understanding of these important trends, see the full Pew Research report, "Americans' Trust in Scientists, Positive Views of Science Continue to Decline.")


Why would Pfizer think that images drawn predominantly from Western philosophical, theological, and scientific traditions, delivered with a touch of light-heartedness, would communicate the power of science to the millions watching the Super Bowl?

These images may be iconic for an educated elite who went to schools with paneled walls lined with austere portraiture and walked among precious marbles on one of their many museum assignments. 

For many others, these images are, at best, meaningless and, at worst, evocative of colonialist empires, of hegemony, and of a continued refusal to acknowledge the innovation found among all the countries and peoples of the world.


Why did Pfizer take this posture, touting this particular form of exceptionalism?


After all, they're a multi-billion-dollar giant. Is that the answer?


My last question about this remake of Night at the Museum: just what was that cute manatee all about?


Monday, May 8, 2023

Dear ChatGPT, could you help out with this health literacy problem?


OK, as unfamiliar as the thought is (to me and to those who know me), I have to accept that I've become a groupie.

Why? It didn't take much for me to get my feet wet with ChatGPT and my new friend BING. The promises are simply too seductive. Someone who listens to your every word. Who is not prone to hasty judgments, inappropriate humor, trash talk, or irrelevant afterthoughts. Someone, or something, not one to resort to bringing up your past slights or missteps. Who forgoes mentioning your mother. Just always reasoned, unencumbered truths, spoken in a way that informs and enlightens.

Truth be told, my expectations are more realistic. After 40-odd years of pursuing ways to meaningfully communicate health and science information to a public that, on any given day, often struggles with basic health concepts, like viruses vs. bacteria, is tripped up by endlessly complex medical terms (spike protein, titers of antibodies, endemic, comorbidity), and continues to be kept in the dark by inanely simplified messages like "sneeze into your sleeve," I was ready for AI.

So I'm approaching using and learning about ChatGPT to find out if AI, in its accessible forms like questions to BING, can play a role in providing useful, understandable, and trustworthy information to the public. In other words, can it teach us things? Because of my line of work, I'm particularly interested in one question: can it teach people about health?

And for the most part I'm amazed at how many questions I can ask BING and how much useful information I get back!

But... I have encountered one problem that can confuse users. And I'm hoping AI can think about this problem and fix it.


Wardrobe Malfunction #1  

BING often rewords the questions you ask, and in doing so introduces complicated words or concepts into the restatement of your question.

For example, I asked:

"How long does it take for osprey eggs to hatch?"


The bot reworded my question as "osprey eggs incubation period."

Well, what if you don't know the word "incubation"?

When I tried this out with a few users, they were confused and thought that the system didn't understand their question.

Here's another example.  

This time I asked, "Have people always moved from one country to another?"

The bot restated my question as "history of migration," which is fine if you know the word "migration."


Can AI learn to do better? 

From everything I've heard as a lay person, AI is always learning.  GREAT!

There are a number of fixes to this problem of introducing words and concepts in the restating of a question. As a matter of fact, I stumbled on BING applying one all on its own. I just don't know how she/he/it knew to do this, and how we could persuade it to do it more!

Here's an example of a simply worded question I posed.


Oh no! BING restates the question and uses a more complex word: genocide.

But then it defines the complex term "genocide," and it still uses the wording of my question in the answer!



Dear ChatGPT, if you're listening, could you fix this little problem?


Sincerely, 

Christina 

(See earlier posts)

June 2020: "We are not all in this together: public understanding of health and science in the time of COVID"

May 2020: "Show Me the Science" http://publiclinguist.blogspot.com/2020/05/public-science-literacy-and-covid-19.html

Jan 15, 2021: "mRNA Needs a Better Messenger"