Persuasive technology

Persuasive technology is broadly defined as technology that is designed to change the attitudes or behaviors of its users through persuasion and social influence, but not through coercion.[1] Such technologies are regularly used in sales, diplomacy, politics, religion, military training, public health, and management, and may potentially be used in any area of human-human or human-computer interaction. Most self-identified persuasive technology research focuses on interactive, computational technologies, including desktop computers, Internet services, video games, and mobile devices;[2] this research incorporates and builds on the results, theories, and methods of experimental psychology, rhetoric,[3] and human-computer interaction. The design of persuasive technologies can be seen as a particular case of design with intent.[4]

Taxonomies

Functional Triad

Persuasive technologies can be categorized by their functional roles. B. J. Fogg proposes the Functional Triad as a classification of three "basic ways that people view or respond to computing technologies": persuasive technologies can function as tools, media, or social actors – or as more than one at once.[5]

  • As tools, technologies can increase people's ability to perform a target behavior by making it easier or by restructuring it.[6] For example, an installation wizard can steer users toward completing tasks – including installing additional software – that they had not planned to perform.
  • As media, interactive technologies can use both interactivity and narrative to create persuasive experiences that support rehearsing a behavior, empathizing, or exploring causal relationships.[7] For example, simulations and games instantiate rules and procedures that express a point of view and can shape behavior and persuade; these use procedural rhetoric.[3]
  • Technologies can also function as social actors.[8] This "opens the door for computers to apply ... social influence".[9] Interactive technologies can cue social responses, e.g., through their use of language, assumption of established social roles, or physical presence. For example, computers can use embodied conversational agents as part of their interface, and a computer that is helpful or that discloses information about itself can lead users to mindlessly reciprocate.[10]

Direct interaction v. mediation

Persuasive technologies can also be categorized by whether they change attitude and behaviors through direct interaction or through a mediating role:[11] do they persuade, for example, through human-computer interaction (HCI) or computer-mediated communication (CMC)? The examples already mentioned are the former, but there are many of the latter. Communication technologies can persuade or amplify the persuasion of others by transforming the social interaction,[12][13] providing shared feedback on interaction,[14] or restructuring communication processes.[15]

Persuasion design

Persuasion design is the design of persuasive messages through the analysis and evaluation of their content, using established theories and methods from psychological research. Andrew Chak[16] argues that the most persuasive web sites focus on making users feel comfortable about making decisions and on helping them act on those decisions.

Persuasion by social motivators

Previous research has also drawn on social motivators, such as competition, for persuasion. By connecting a user with other users,[17] their coworkers,[18] or their friends and family,[19] a persuasive application can apply social motivators to the user to promote behavior change. Social media platforms such as Facebook and Twitter also facilitate the development of such systems. Social influence has been shown to result in greater behavior change than when the user is isolated.[20]

Persuasive strategies

Halko and Kientz[21] conducted an extensive search of the literature for persuasive strategies and methods used in the field of psychology to modify health-related behaviours. They identified eight main types of persuasive strategies, which can be grouped into the following four categories, where each category has two complementary approaches; a sketch of how an application might encode these strategies follows the list:

1 – Instruction style

Authoritative: Persuade the technology user through an authoritative agent. For example, a strict personal trainer who instructs the user to perform the tasks that will meet their goal.

Non-Authoritative: Persuade the user through a neutral agent. For example, a friend who encourages the user to meet their goals.

2 – Social feedback

Cooperative: Persuade the user through cooperation and teamwork. For example, allowing the user to team up with friends to complete their goals.

Competitive: Persuade the user through competition. For example, users can play against friends or peers and be motivated to achieve their goal by winning the competition.

3 – Motivation type

Extrinsic: Persuade the user through external motivators. For example, winning trophies as a reward for completing a task.

Intrinsic: Persuade the user through internal motivators. For example, the good feeling a user has from being healthy or from achieving a goal.

4 – Reinforcement type

Negative Reinforcement: Persuade the user by removing an aversive stimulus. For example, turning a brown, dying nature scene green and healthy as the user performs more healthy behaviors.

Positive Reinforcement: Persuade the user by adding a positive stimulus. For example, adding flowers, butterflies, and other pleasant elements to an empty nature scene as the user performs more healthy behaviors.
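Taken together, these four categories can be read as a small design space from which a persuasive application picks one approach per category. The following sketch (hypothetical Python, not taken from Halko and Kientz's paper) illustrates one way an application might encode the eight strategies so that an intervention can be configured from them:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical encoding of the eight strategies identified by Halko and Kientz,
# grouped into their four categories of complementary approaches.

class InstructionStyle(Enum):
    AUTHORITATIVE = "authoritative"          # strict personal trainer
    NON_AUTHORITATIVE = "non_authoritative"  # encouraging friend

class SocialFeedback(Enum):
    COOPERATIVE = "cooperative"    # team up with friends
    COMPETITIVE = "competitive"    # compete against peers

class MotivationType(Enum):
    EXTRINSIC = "extrinsic"    # trophies and other rewards
    INTRINSIC = "intrinsic"    # feeling good about being healthy

class ReinforcementType(Enum):
    NEGATIVE = "negative"    # remove an aversive stimulus (dying scene recovers)
    POSITIVE = "positive"    # add a pleasant stimulus (flowers, butterflies)

@dataclass
class PersuasionProfile:
    """One approach chosen from each of the four categories."""
    instruction: InstructionStyle
    feedback: SocialFeedback
    motivation: MotivationType
    reinforcement: ReinforcementType

# Example: a cooperative, intrinsically motivated profile with positive reinforcement.
profile = PersuasionProfile(
    InstructionStyle.NON_AUTHORITATIVE,
    SocialFeedback.COOPERATIVE,
    MotivationType.INTRINSIC,
    ReinforcementType.POSITIVE,
)
```

Making the strategies explicit in this way also makes it easier to tailor the chosen profile to individual users, which is in line with the personality-tailoring question explored by Halko and Kientz.[21]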

More recently, Lieto and Vernero[22] have also shown that arguments reducible to logical fallacies are a class of widely adopted persuasive techniques in both web and mobile technologies.

Reciprocal equality

One feature that distinguishes persuasive technology from familiar forms of persuasion is that the individual being persuaded often cannot respond in kind; there is a lack of reciprocal equality. For example, when a conversational agent persuades a user using social influence strategies, the user cannot use similar strategies on the agent in return.[23]

Health behavior change

While persuasive technologies are found in many domains, considerable recent attention has focused on behavior change in health domains. Digital health coaching is the use of computers as persuasive technology to augment the personal care delivered to patients, and is used in numerous medical settings.[24]

Numerous scientific studies show that online health behaviour change interventions can influence users' behaviours. Moreover, the most effective interventions are modelled on health coaching, where users are asked to set goals, educated about the consequences of their behaviour, and then encouraged to track their progress toward their goals. Sophisticated systems even adapt to users who relapse by helping them get back on track.[25]
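As a rough illustration of this coaching pattern, the sketch below (hypothetical code, not drawn from the cited meta-analysis) models the goal-setting, progress-tracking, and relapse-handling loop described above:

```python
from dataclasses import dataclass, field

@dataclass
class HealthGoal:
    """A behavior-change goal tracked by a hypothetical coaching system."""
    description: str   # e.g. "walk 8,000 steps per day"
    target: float      # daily target value
    history: list = field(default_factory=list)  # recorded daily values

    def log(self, value: float) -> None:
        self.history.append(value)

    def progress(self, window: int = 7) -> float:
        """Fraction of the last `window` days on which the target was met."""
        recent = self.history[-window:]
        if not recent:
            return 0.0
        return sum(v >= self.target for v in recent) / len(recent)

def coach_message(goal: HealthGoal) -> str:
    """Choose an encouraging message based on recent progress,
    including a 'get back on track' branch for users who relapse."""
    p = goal.progress()
    if p >= 0.8:
        return f"Great work: you met '{goal.description}' on {p:.0%} of recent days."
    if p >= 0.4:
        return f"You're partway there ({p:.0%}); keep tracking your progress."
    # Relapse branch: suggest a smaller step to re-engage the user.
    return (f"It's been a tough week for '{goal.description}'. "
            "Try a smaller target tomorrow to get back on track.")

goal = HealthGoal("walk 8,000 steps per day", target=8000)
for steps in [9100, 7800, 8300, 2000, 0, 500, 8400]:
    goal.log(steps)
print(coach_message(goal))
```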

Promoting sustainable lifestyles

Previous work has also shown that people are receptive to changing their behaviors toward more sustainable lifestyles. This result has encouraged researchers to develop persuasive technologies to promote, for example, green travel[26] and waste reduction.[18]

One common technique is to raise people's awareness of the benefits of performing eco-friendly behaviors. For example, a review of over twenty studies exploring the effects of feedback on electricity consumption in the home showed that feedback on consumption patterns typically results in savings of 5–12%.[27] Besides environmental benefits such as CO2 savings, health benefits and cost savings are also often used to promote eco-friendly behaviors.[26]
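As a back-of-the-envelope illustration of what a 5–12% saving means in absolute terms, the snippet below applies the reported range to a hypothetical household consuming 4,000 kWh per year (the consumption figure is assumed, not taken from the cited review):

```python
# Illustrative only: apply the 5-12% feedback-induced saving reported in the
# review to a hypothetical household using 4,000 kWh per year.
annual_kwh = 4000
low, high = 0.05 * annual_kwh, 0.12 * annual_kwh
print(f"Estimated saving: {low:.0f}-{high:.0f} kWh per year")
# -> Estimated saving: 200-480 kWh per year
```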

Research challenges

Despite the promising results of existing persuasive technologies, three main challenges remain.

Technical challenges

Existing persuasive technologies rely either on self-report or on automated systems that monitor human behavior using sensors and pattern recognition algorithms. Several studies in the medical field have noted that self-report is subject to bias, recall errors, and low adherence rates. The physical world and human behavior are both highly complex and ambiguous, so using sensors and machine learning algorithms to monitor and predict human behavior remains a challenging problem, especially since most persuasive technologies require just-in-time interventions.
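To make the just-in-time aspect concrete, the sketch below (purely illustrative; a real system must contend with the sensing and modelling challenges just described) uses hourly step counts to decide whether to deliver a prompt to move:

```python
from datetime import datetime

# Hypothetical just-in-time intervention trigger based on step-count sensing.
# Real deployments face the challenges noted above: noisy sensors, ambiguous
# behavior, and the difficulty of choosing the right moment to intervene.

SEDENTARY_THRESHOLD = 50   # steps per hour below which we assume inactivity
SEDENTARY_HOURS = 2        # consecutive inactive hours before prompting

def is_quiet_hour(hour: int) -> bool:
    """Rough 'do not disturb' window from 22:00 to 07:00."""
    return hour >= 22 or hour < 7

def should_prompt(hourly_steps: list, now: datetime) -> bool:
    """Trigger a prompt after SEDENTARY_HOURS consecutive low-activity hours,
    unless the current time falls in the quiet window."""
    if is_quiet_hour(now.hour):
        return False
    recent = hourly_steps[-SEDENTARY_HOURS:]
    return (len(recent) == SEDENTARY_HOURS
            and all(s < SEDENTARY_THRESHOLD for s in recent))

# Example: two inactive hours in the afternoon -> prompt the user to move.
readings = [420, 310, 35, 12]   # steps recorded in each of the last four hours
print(should_prompt(readings, datetime(2023, 5, 4, 15, 0)))   # True
```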

Difficulty in studying behavior change

In general, understanding behavior change requires long-term studies, as multiple internal and external factors can influence these changes, such as personality type, age, income, and willingness to change. This makes it difficult to understand and measure the effect of persuasive technologies.

Ethical challenges

The question of manipulating feelings and desires through persuasive technology remains the subject of open ethical debate. User-centered design guidelines should be developed that encourage ethically and morally responsible designs and that strike a reasonable balance between the pros and cons of persuasive technologies.[28]

References

  1. Fogg 2002
  2. Oinas-Kukkonen et al. 2008
  3. Bogost 2007
  4. Lockton et al. 2010
  5. Fogg 1998
  6. Fogg 2002, ch. 3
  7. Fogg 2002, ch. 4
  8. Reeves & Nass 1996, Turkle 1984
  9. Fogg 2002, p. 90
  10. Fogg 1997b, Moon 2000
  11. Oinas-Kukkonen & Harjumaa 2008
  12. Licklider 1968
  13. Bailenson et al. 2004
  14. DiMicco 2004
  15. Winograd 1986
  16. Chak 2003
  17. de Oliveira, Rodrigo; Cherubini, Mauro; Oliver, Nuria (2010-01-01). "MoviPill: Improving Medication Compliance for Elders Using a Mobile Persuasive Social Game". Proceedings of the 12th ACM International Conference on Ubiquitous Computing. UbiComp '10. New York, NY, USA: ACM: 251–260. doi:10.1145/1864349.1864371. ISBN 9781605588438.
  18. Thieme, Anja; Comber, Rob; Miebach, Julia; Weeden, Jack; Kraemer, Nicole; Lawson, Shaun; Olivier, Patrick (2012-01-01). ""We've Bin Watching You": Designing for Reflection and Social Persuasion to Promote Sustainable Lifestyles". Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI '12. New York, NY, USA: ACM: 2337–2346. doi:10.1145/2207676.2208394. ISBN 9781450310154.
  19. Caraban, Ana; Ferreira, Maria José; Gouveia, Rúben; Karapanos, Evangelos (2015-01-01). "Social Toothbrush: Fostering Family Nudging Around Tooth Brushing Habits". Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers. UbiComp/ISWC'15 Adjunct. New York, NY, USA: ACM: 649–653. doi:10.1145/2800835.2809438. ISBN 9781450335751.
  20. Chiu, Meng-Chieh; Chang, Shih-Ping; Chang, Yu-Chen; Chu, Hao-Hua; Chen, Cheryl Chia-Hui; Hsiao, Fei-Hsiu; Ko, Ju-Chun (2009-01-01). "Playful Bottle: A Mobile Social Persuasion System to Motivate Healthy Water Intake". Proceedings of the 11th International Conference on Ubiquitous Computing. UbiComp '09. New York, NY, USA: ACM: 185–194. doi:10.1145/1620545.1620574. ISBN 9781605584317.
  21. Halko, Sajanee; Kientz, Julie A. (2010-06-07). "Personality and Persuasive Technology: An Exploratory Study on Health-Promoting Mobile Applications". Persuasive Technology. Springer, Berlin, Heidelberg: 150–161. doi:10.1007/978-3-642-13226-1_16.
  22. Lieto, Antonio; Vernero, Fabiana (2014-12-30). "Influencing the others' minds: An experimental evaluation of the use and efficacy of fallacious-reducible arguments in web and mobile technologies" (PDF). PsychNology Journal. University of Padova: 87–105.
  23. Fogg 2002
  24. Elton 2007
  25. Cugelman et al. 2011
  26. Froehlich, Jon; Dillahunt, Tawanna; Klasnja, Predrag; Mankoff, Jennifer; Consolvo, Sunny; Harrison, Beverly; Landay, James A. (2009-01-01). "UbiGreen: Investigating a Mobile Tool for Tracking and Supporting Green Transportation Habits". Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI '09. New York, NY, USA: ACM: 1043–1052. doi:10.1145/1518701.1518861. ISBN 9781605582467.
  27. Fischer, Corinna (2008-02-01). "Feedback on household electricity consumption: a tool for saving energy?". Energy Efficiency. 1 (1): 79–104. doi:10.1007/s12053-008-9009-7. ISSN 1570-646X.
  28. IJsselsteijn, Wijnand; Kort, Yvonne de; Midden, Cees; Eggen, Berry; Hoven, Elise van den (2006-05-18). "Persuasive Technology for Human Well-Being: Setting the Scene". Persuasive Technology. Springer, Berlin, Heidelberg: 1–5. doi:10.1007/11755494_1.

Sources

  • Bailenson, J. N., Beall, A. C., Loomis, J., Blascovich, J., & Turk, M. (2004). Transformed Social Interaction: Decoupling Representation from Behavior and Form in Collaborative Virtual Environments. Presence: Teleoperators & Virtual Environments, 13(4), 428-441.
  • Bogost, I. (2007). Persuasive Games: The Expressive Power of Videogames. MIT Press.
  • Chak, Andrew (2003). Guiding Users with Persuasive Design: An Interview with Andrew Chak, by Christine Perfetti, User Interface Engineering.
  • Cugelman, B., Thelwall, M., & Dawes, P. (2011). Online Interventions for Social Marketing Health Behavior Change Campaigns: A Meta-Analysis of Psychological Architectures and Adherence Factors. Journal of Medical Internet Research, 13(1), e17.
  • DiMicco, J. M., Pandolfo, A., & Bender, W. (2004). Influencing group participation with a shared display. In Proceedings of CSCW 2004 (pp. 614-623). Chicago, Illinois, USA: ACM. doi:10.1145/1031607.1031713.
  • Elton, Catherine. "'Laura' makes digital health coaching personal." The Boston Globe, May 21, 2007.
  • Fogg, B. J., & Nass, C. (1997a). Silicon sycophants: the effects of computers that flatter. International Journal of Human-Computer Studies, 46(5), 551-561.
  • Fogg, B. J., & Nass, C. (1997b) How users reciprocate to computers: an experiment that demonstrates behavior change. In Proceedings of CHI 1997, ACM Press, 331-332. .
  • Fogg, B. J. (1998). Persuasive computers: perspectives and research directions. Proceedings of CHI 1998, ACM Press, 225-232 .
  • Fogg, B. J. (2002). Persuasive Technology: Using Computers to Change What We Think and Do. Morgan Kaufmann
  • Fogg, B. J., & Eckles, D. (Eds.). (2007). Mobile Persuasion: 20 Perspectives on the Future of Behavior Change. Stanford, California: Stanford Captology Media.
  • Licklider, J. C. R., & Taylor, R. W. (1968). The Computer as a Communication Device. Science and Technology, 76(2).
  • Lieto, Antonio, & Vernero, Fabiana (2014). Influencing the others' minds: An experimental evaluation of the use and efficacy of fallacious-reducible arguments in web and mobile technologies. PsychNology Journal, 12(3), pp. 87-105.
  • Lockton, D., Harrison, D., & Stanton, N. A. (2010). The Design with Intent Method: A design tool for influencing user behaviour. Applied Ergonomics, 41(3), 382-392. doi:10.1016/j.apergo.2009.09.001 (preprint version)
  • Moon, Y. (2000). Intimate Exchanges: Using Computers to Elicit Self-Disclosure from Consumers. The Journal of Consumer Research, 26(4), 323-339.
  • Nass, C., & Moon, Y. (2000). Machines and Mindlessness: Social Responses to Computers. Journal of Social Issues, 56(1), 81-103.
  • Oinas-Kukkonen, H., & Harjumaa, M. (2008). A Systematic Framework for Designing and Evaluating Persuasive Systems. Proceedings of Persuasive Technology: Third International Conference, pp. 164-176.
  • Oinas-Kukkonen, H., Hasle, P., Harjumaa, M., Segerståhl, K., Øhrstrøm, P. (Eds.). (2008). Proceedings of Persuasive Technology: Third International Conference. Oulu, Finland, June 4–6, 2008. Lecture Notes in Computer Science. Springer.
  • Reeves, B., & Nass, C. (1996). The Media Equation: how people treat computers, television, and new media like real people and places. Cambridge University Press.
  • Turkle, S. (1984). The second self: computers and the human spirit. Simon & Schuster, Inc. New York, NY, USA.
  • Winograd, T. (1986). A language/action perspective on the design of cooperative work. Proceedings of the 1986 ACM conference on Computer-supported cooperative work, 203-220.