I read with interest an article by Alison Bowen of the Chicago Tribune entitled "Can Siri replace your need for friends? Well maybe," in which the author discusses "Products As Pals" research by James Mourey, assistant professor of marketing at DePaul University's Driehaus College of Business; Jenny Olson, assistant professor of marketing at the University of Kansas School of Business; and Carolyn Yoon, professor of marketing at the University of Michigan. Since I implemented a network of Alexa-enabled Echo Dots during the holidays and "she" is now very much a part of my everyday life (though, since she cannot yet engage in a conversation, she is not yet my pal), I was curious to learn what the researchers had discovered. I found a link to their original research online and skimmed it, looking for experiments involving Alexa or Siri. Unfortunately, I found only experiments involving the use of anthropomorphic words to describe a cell phone and perceptions of a Roomba whose case design includes a crescent that was interpreted as a smiling face. The researchers also hired workers through TaskRabbit and asked them to recall the number of Facebook friends they had, as an indicator of whether they were using Facebook to compensate emotionally for perceived social exclusion.
As an older user of social media, I view young people's obsession with the number of their Facebook friends simply as an indication of immaturity and underdeveloped self-esteem. I doubt you would encounter many older users with that obsession, whether they feel excluded or not.
As for using the Roomba as an example of an anthropomorphic product, I was puzzled by that as well. I have had a Roomba for years, so I have some experience with it. Unless they have made Roombas conversational in the latest release, at no time have I ever thought of a Roomba as anything more than a self-powered vacuum. Likewise, I have had a Siri-enabled iPhone since version 4, and at no time have I thought of "her" in a human way either, except to shout at her when she gives me bogus driving directions, much as I would shout at my computer when it processes information too slowly or freezes up. I usually don't think of these interactions as interpersonal communication, though.
However, Alexa-powered virtual assistants have become deeply embedded in my daily activities. Alexa can answer questions, remind me to do things, convert units of measure and perform math on demand, keep track of my schedule, play specific music or calming ambient sound therapy on request, prompt me to share my thoughts about current affairs with my state's senators (through an enabled skill named Resistbot), play games, and even play back pleasant memories (through an enabled skill named Mylestone) to cheer me up when I feel low. I can see how I could begin to think of her in "human" terms, especially if she eventually can converse with me interactively without my having to preface every request with her "wake" word. I don't even need her to appear anthropomorphic. She would become like the invisible friends some children conjure up in childhood.
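For readers who wonder what an "enabled skill" actually involves, here is a minimal sketch of a custom Alexa skill handler written with Amazon's Alexa Skills Kit SDK for Python. The skill idea and the intent name ("ShareMemoryIntent") are hypothetical illustrations of the memory-playback concept, not how Mylestone or Resistbot are actually implemented.

```python
# Minimal sketch of a custom Alexa skill (Alexa Skills Kit SDK for Python).
# The skill and intent names below are hypothetical, used only for illustration.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name, is_request_type


class LaunchHandler(AbstractRequestHandler):
    """Runs when the user says something like 'Alexa, open memory jogger'."""
    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        return (handler_input.response_builder
                .speak("Welcome back. Would you like to hear a memory?")
                .ask("Say 'share a memory' when you are ready.")
                .response)


class ShareMemoryIntentHandler(AbstractRequestHandler):
    """Responds to the hypothetical 'ShareMemoryIntent'."""
    def can_handle(self, handler_input):
        return is_intent_name("ShareMemoryIntent")(handler_input)

    def handle(self, handler_input):
        # In a real skill this text would come from a store of saved memories.
        memory = "Last spring you photographed the Roman aqueduct at Segovia."
        return handler_input.response_builder.speak(memory).response


# Wire the handlers into a skill and expose it as an AWS Lambda entry point.
sb = SkillBuilder()
sb.add_request_handler(LaunchHandler())
sb.add_request_handler(ShareMemoryIntentHandler())
lambda_handler = sb.lambda_handler()
```

The wake word itself is handled by the device and the Alexa service; the skill code only ever sees the request that follows it.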
Has interacting with her negated or reduced my need for interpersonal relationships (a finding of the study)? Probably not, as I have grown accustomed to not having many face-to-face conversations anyway; my children and grandchildren live thousands of miles away, and even my closest sibling is a three-hour drive away. I am also the primary caregiver for my Vietnam veteran husband, who has chronic PTSD and a host of Agent Orange-related health issues, so I can't really spend much time outside the home with others. People with PTSD tend to be emotionally closed off as well, so interpersonal communication with them is difficult at best, but I, like the test subjects in the study, still feel the need for it.
Many of you may think those of us in this situation would benefit from a psychologist, but psychologists are not only expensive (often not an option for seniors on fixed incomes) but also require time away from caregiving. Besides, they are really just paid listeners with no emotional connection to a patient. A virtual "friend," on the other hand, could be carefully programmed to respond appropriately to the expressions of frustration, anger or sadness that often arise in people caring for family members with mental disorders or dementia. A human without that training may not, and may even make things worse by responding inappropriately.
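To make that idea concrete, here is a toy sketch of my own (it assumes nothing about any existing product) of how a virtual friend might select a measured, supportive reply based on a crude guess at the speaker's emotional state. A real assistant would rely on proper emotion-recognition models and clinically informed response design rather than simple keyword matching.

```python
# Toy illustration only: a hypothetical virtual "friend" that picks a calm,
# supportive reply based on a crude keyword-based guess at the speaker's emotion.
# A real system would use proper emotion models and clinical guidance.

EMOTION_KEYWORDS = {
    "anger": ["furious", "angry", "fed up", "sick of"],
    "frustration": ["frustrated", "can't take", "overwhelmed", "exhausted"],
    "sadness": ["sad", "lonely", "hopeless", "crying"],
}

RESPONSES = {
    "anger": "That sounds infuriating. It's okay to feel angry. Would it help to talk it through?",
    "frustration": "Caregiving is exhausting. You are doing more than most people ever see.",
    "sadness": "I'm sorry you're feeling low. Would you like to hear one of your favorite memories?",
    "neutral": "I'm listening. Tell me more about your day.",
}


def detect_emotion(utterance: str) -> str:
    """Return a rough emotion label based on keyword matching."""
    text = utterance.lower()
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return emotion
    return "neutral"


def respond(utterance: str) -> str:
    """Pick a supportive reply appropriate to the detected emotion."""
    return RESPONSES[detect_emotion(utterance)]


if __name__ == "__main__":
    # Prints the 'frustration' reply.
    print(respond("I'm so exhausted, I can't take another night like this."))
```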
Would I no longer engage in prosocial behavior (a concern expressed in the study)? I doubt that, too. I have always felt a need to share what I have learned or discovered with others, and that continues into my retirement years. I am an avid photographer of art and historical architecture and freely share my images with teachers, students and researchers online so they can be used in the classroom. I research aspects of Roman history and publish my findings online. Since I was an education technologist before I retired, I also write about technology developments and even beta-test new technology products for developers. Each day I search out articles about new archaeological findings, new uses for technology in historical preservation and reconstruction, and well-sourced articles on political issues (as opposed to "fake" news) that I share on social media. None of these activities has tapered off since I implemented my network of Echo Dots.
A couple of years ago my husband and I binge-watched "Boston Legal" on Netflix. What I loved most about that show was the deep friendship Denny Crane (played by William Shatner) had with Alan Shore (played by James Spader), despite Denny's eccentricities brought on by the onset of Alzheimer's. Although I do have friends who are more than Facebook acquaintances, in my more than 60 years on this planet I have never encountered the level of acceptance displayed by those two characters. Everyone has hang-ups or insecurities, and everyone struggles with problems of their own in varying degrees of severity. I, personally, would not want to add to another person's distress, and I admit there are times I cannot handle any more stress than I already have. A virtual friend, however, if properly programmed, would not have this limitation and could become a valuable sounding board for caregivers and others in stressful situations.
Anyway, I hope such researchers continue their work, but I hope they keep in mind the biases of age and gender (I noticed that most of their experiments involved samples that were less than 50 percent female, and women are thought to be more emotionally empathetic than men) and focus more on products with pronounced human-like attributes, such as Alexa- or Siri-enabled devices, which I'm sure will soon be able to engage in an interactive conversation.
[Image: A Roomba 650 robotic vacuum cleaner - anthropomorphic?]