Robotics After Recess: A Call for Enhanced Human-Robot Interaction Focus in Education

Abstract

Robotics and automation are not only becoming ubiquitous across many sectors; they are also advancing at faster rates than ever before. With robots becoming more involved in the workplace and sometimes even our homes, humans need some level of understanding of robot mechanics, artificial intelligence, and our own anthropomorphic tendencies. If the general public’s vision of robots steers closer to I, Robot than to an ATM, or our behavioral lines between robot dogs and living dogs begin to blur, our human-human and human-robot interactions have the potential to suffer. As robots become more integrated, society needs exposure to these concepts from an early age to interact efficiently and intelligently. Examining the dangers of humans’ lack of understanding of robots, this paper takes an interdisciplinary approach, combining human-robot interaction (HRI) studies, early education research, and computer literacy frameworks. This research asks the questions: What dangers result from the general public’s lack of HRI understanding? How can the formal education setting combat these potential societal dangers? This paper calls for formalized HRI education standards and introduces recommendations for how these standards can be implemented.

Introduction

As Richards and Smart (2012) expressed nearly four years ago, echoing many other roboticists and computer scientists over the thirty years prior, “the robots are coming” (1). In many cases, such as health care, the service industry, and entertainment, they are already here. Robots are not only replacing humans in many sectors, for reasons such as factory efficiency and military safety; they are also supplementing humans in the workplace, in the home, and in the classroom. Living in the modern, industrialized world, humans can hardly pass through a single day without at least one interaction with a robot. ATMs, advanced parking meters, smartphone personal assistants such as Siri, and the Roomba vacuum: these devices and many more pervade our everyday lives, often without our conscious recognition that human-robot interaction (HRI) is taking place.

Furthermore, the more we interact with robots for efficiency or safety purposes, the more commonplace they become. For some interactions, like robot telephone operators, we are very aware that we are interfacing with an artificially intelligent device and must respond accordingly in order to reach our desired end. But for others, like a robot puppy, we may be unaware, or our response to the device verges on how we would respond to another living being. These responses are anthropomorphic: we give an inanimate object living or human-like qualities. Sometimes the construction of these characteristics is necessary for the human’s benefit, but other times it can distort our behaviors and how we characterize the object itself (Darling 2015).

If these devices, ranging from simple single-operation machines to highly advanced systems, have become so pervasive, the question arises: Why has robotics education, focusing on usability and understanding, not emerged in primary or secondary education? I argue that exposure to robotics and related concepts in the formal education setting will better equip people from a young age to view, interact with, and benefit from these rapidly advancing technologies, as well as contribute to the future of STEM education.

This paper examines the current state of robotics education in the United States and proposes a preliminary set of standards for the primary education setting. This research asks the questions: What dangers result from the general public’s lack of HRI understanding? How can the formal education setting combat these potential societal dangers? Part I will review the concept of anthropomorphism and illustrate how this natural human response could impact human behavior toward robots. Part II will introduce a series of short case studies demonstrating examples of anthropomorphic responses to robots and their impact on humans. Part III will outline how robots are currently being used in the education sphere and the few instances where HRI education is taking place, mainly in the university setting. Part IV will conclude by recommending a set of loose standards that could be integrated at the primary education level.

Part I: A Brief Introduction to Anthropomorphism

Anthropomorphism, in the context of robotics, is “the inclination to assign lifelike qualities onto robots” (Darling 2015). This inclination is not inherently a good or bad reaction for a human to have when interacting with a robot; it can be either, depending on the circumstances. On one end, researchers tend to caution against these behaviors because they can lead to emotional manipulation and undesirable sexual or other behavior. Humans respond strongly to framing, and framing robots as humans can blur our view of human interaction with that of robot interaction (Bai 2005; Darling 2015). On the other end, framing a robot as merely a tool, as many researchers prefer, can greatly inhibit the effectiveness of the technology (Richards and Smart 2012). Robots of this type are intended to tap into human emotion in order to serve their purpose, as we will discuss in Part II.

In the case of these human-mimicking robots, sociology and robotics have come together in a sense to better understand how we can utilize and benefit most from the rapid development of social robots. Knight (2014) writes, “Sociability is our natural interface, to each other and to living creatures in general. As part of that innate behavior, we quickly seek to identify objects from agents. In fact, as social creatures, it is often our default behavior to anthropomorphize moving robots” (4). The more human-like the robot’s actions, appearance, or gestures, the more we assign it living characteristics in both our minds and words.

In order to determine whether an anthropomorphic response is safe or dangerous, we must identify the context of the robot’s use and how it potentially manipulates humans. The biggest issue lies in the general public’s unawareness of these responses. So far, these concepts are discussed at robotics conferences, medical clinics, and research centers—not while standing in line at the ATM, a robot that replaces a human and is used every day by millions. People must be exposed to these concepts, at least generally, early on in order to protect our social behavior and well-being as humans in a highly robot-integrated society.

Part II: Case Studies

The following examples provide four separate cases of the normalized use of robots as replacements for or supplements to human activities and show how anthropomorphic responses impact their use and integration.

Health and senior care

Paro, an advanced medical device in the shape of a baby seal, is likely the cuddliest ethical quandary the US medical system has ever faced (Inada and Tergesen 2010). Paro is designed to provide comfort to senior citizens in hospitals and senior care centers through its “cuteness” factor while simultaneously gathering information about the user to learn his or her interaction behaviors. Paro has been used in some centers as an alternative to medication and isolation, giving the patient a better late-life experience. However, the ethical issue remains that we are replacing real, life-sustaining human interactions with a simulation of interaction because of personnel or other constraints. Sometimes these patients even become unaware of, or unconcerned with, the fact that Paro does not experience real feeling or emotion, distorting their perception of living emotion.

Home maintenance

Created by MIT researcher Cynthia Breazeal, Jibo is a social or “family” robot that serves as an artificially intelligent home assistant, aiming to make “personal” connections with its users (Tilley 2015). Jibo has hundreds of learning capabilities, with thousands more in development through third-party partnerships. In the home, Jibo can keep the family organized, take photos, surf the Web, engage in educational exercises, facilitate communication between people, and, most importantly, speak directly to its users and learn their preferences. This last function is what sets Jibo apart from other home maintenance applications, but it is also what creates a sense of uneasiness. For Jibo to function at its highest capability, the humans it serves must engage openly with it, which requires some level of personal connection. An issue arises if users come to view Jibo as a replacement for humans in certain interactions. One example is Jibo being used as a homework helper in place of traditional parent-child interaction. Jibo can provide educational content and interactive lessons for its younger users, but these same interactions are among the most important for the development of family bonds between parents and children. Replacing that vital interaction with a robot removes one half of the personal relationship, changing family dynamics and providing the child with false companionship.

Military robots

The United States military currently funds, and has historically funded, the development of robots and other technological innovations (Palus 2014). One of those innovations, set to replace the companion dog for soldiers, is a small, hand-held robot resembling a turtle that will take commands on missions and report back to a soldier, increasing safety and efficiency. The robot’s name is FlipperBot, but it will likely be renamed by each soldier who owns one, an anthropomorphic response. Taking the shape of a baby turtle is both a benefit and a detriment. On one side, the robot’s movement mechanism, which mimics a live turtle, is efficient on both land and sea, and the robot is small enough to fit in a pocket, making it both mobile and difficult to detect when sent out on reconnaissance missions. However, soldiers will likely become attached to their FlipperBots, which can be extremely dangerous for a soldier both mentally and physically if the robot is sent out on missions, like bomb detection, that could result in its destruction.

Sex robots

The development of human-like robots as sex partners provides a somewhat extreme example, but it demonstrates well how anthropomorphizing non-living robots can change our perceptions of human interaction. Sex robots, most often made to look like women, have gained popularity in the robot community because they serve as an alternative for people less likely to engage in human sexual encounters. What draws people to these robots is the fact that they do not have an emotional response to sex and do not turn down the act (Koplowitz 2012). One side of the argument holds that encouraging the use of these robots might prevent sex offenders, or those less willing to seek consent, from abusing human women. On the opposite side, however, research suggests these encouraged interactions could completely distort how humans view sex as an intimate and personal act that requires informed consent (Gutiu 2012).

Part III: Robots Currently Found in Education

Though it sometimes still sounds foreign to the traditional classroom, the use of robots in education has been taking place for decades (Kimbler 1984; Bühler and Knops 1999). These robots have been used primarily as educational aids for purposes such as garnering attention, supplementing demonstrations, and providing student-initiated physical assistance for students with disabilities. A recent review of robots in education finds that the most common uses in the classroom include language training and STEM education, with robots performing tasks as tools, tutors, and peers (Mubin et al. 2013).

However, these robot educators are rarely the focus of the classroom activity at an early age, despite becoming more and more present (Sauppe and Huang 2015). Currently, courses specifically focusing on HRI, problem-solving skills developed through HRI, or HRI as a gateway for advanced computer science education are found only at the postsecondary level or in carefully crafted youth interventions (Mead and Mataric 2015; Simmons and Nourbakhsh 2015; Huang et al. 2015; Sandoval et al. 2015; Tsui 2015). Many important cognitive life skills develop, or begin to develop, even before primary school age, such as working memory, self-regulated learning, and the ability to respond less reactively to situations or stimuli (Welsh et al. 2010). Implementing an intervention that introduces robot mechanisms during this time could create a level of familiarity and critical thinking skills that allow the student to interact more safely and efficiently with automation over the life span. Therefore, introducing students to HRI modes of thinking should take place far sooner than the college years, a level of educational attainment many people never reach for both academic and financial reasons.

An example of a college course trying to bridge the age gap in HRI education was developed and evaluated at North Carolina State University (Huang et al. 2015). The course evaluated undergraduate students’ problem-solving skills, technical abilities, and self-efficacy in designing and building simple computer-based robots. Results showed that students, especially those lacking general conceptual knowledge or understanding rather than technical skills, benefited greatly from the coursework, hands-on projects, and journaling exercises. The college setting served as the testing ground for this course, but the researchers noted the importance of making this type of education available at the K–12 level.

Addressing the issue of anthropomorphism specifically, an online class currently in development by Spanish researchers at Jaume I University uses the widely available platform of the Internet to reach as many people as possible, citing the importance of exposing people to “humanoid robots” as robotic technologies continue to advance (Casañ et al. 2015). The course uses the Robotic Programming Network (RPN) to allow students to access and control human-like robots, aiming “to create an open course which allows students (from high school age onwards) to access real humanoid robots and program them in a realistic human environment, learning how to use them as extensions of their own bodies to solve problems” (1). The course will hopefully give its students a greater understanding of robotics as an extension of self and develop their cognitive skills for a future in STEM fields.
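To make concrete what “programming a real humanoid robot” might involve for a beginner, the sketch below shows the kind of first exercise such a course could assign. Casañ et al. do not specify the underlying software, so this is only an illustrative sketch assuming a generic ROS-style Python interface; the topic names "/robot/say" and "/robot/cmd_vel" are hypothetical placeholders, not the RPN’s actual API.

    #!/usr/bin/env python
    # A minimal sketch, not taken from the course materials, of a first
    # exercise an introductory HRI class might assign. It assumes a generic
    # ROS-style interface; the topic names are hypothetical placeholders.

    import rospy
    from std_msgs.msg import String
    from geometry_msgs.msg import Twist

    def greet_and_step_forward():
        """Have the robot speak a greeting, then roll forward briefly and stop."""
        rospy.init_node('hri_intro_exercise')

        # Publishers for a (hypothetical) text-to-speech topic and a velocity-command topic.
        speech_pub = rospy.Publisher('/robot/say', String, queue_size=1)
        motion_pub = rospy.Publisher('/robot/cmd_vel', Twist, queue_size=1)
        rospy.sleep(1.0)  # give the publishers a moment to connect

        # The robot's "friendly" greeting is nothing more than an instruction a person wrote.
        speech_pub.publish(String(data="Hello! I am following instructions written by a person."))

        step = Twist()
        step.linear.x = 0.1            # move forward slowly (meters per second)
        motion_pub.publish(step)
        rospy.sleep(2.0)               # let the robot move for two seconds
        motion_pub.publish(Twist())    # publish all zeros to stop

    if __name__ == '__main__':
        try:
            greet_and_step_forward()
        except rospy.ROSInterruptException:
            pass

Even a script this small reinforces the thinking behind the standards proposed in Part IV: the robot’s seemingly social behavior is simply a set of instructions written by a person, and students who have written such instructions themselves may be less likely to mistake the machine for an agent with feelings of its own.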

Finally, HRI education can also supplement another vital STEM focus, one that has gained somewhat more mainstream traction than robotics education: computer science. Existing education recommendations by the Computer Science Teachers Association (CSTA) have been implemented across the country to enable students of all ages to gain core computer science competencies. Sauppe and Huang (2015) take these recommendations and apply them to HRI education, concluding that the two fields overlap greatly (computational thinking, programming, practical design, etc.) and should be taught concurrently to give computer science students an embodied platform that will “expose [them to] seemingly unrelated fields—such as psychology and education—that intersect with computer science” (2).

Part IV: Recommended Education Standards

The following list of recommended standards is put forth to provide educators, new and experienced, with guidelines for the competencies their students should exhibit. The standards derive from a similar framework used by the National Association for Media Literacy Education (NAMLE) and from the current Common Core State Standards in English Language Arts. They are intentionally non-specific: as robots continue to become more integrated and more advanced, the standards should remain useful benchmarks for students to attain throughout their education. Each recommended standard is followed by a series of supplementary questions that provide classrooms with the conversation material necessary to achieve the standard. Many tools, such as simulations or videos, and other conversation starters can lead to further understanding, but these questions can help foster an initial discussion for young students.

Students should:

  1. Be able to identify a robotic system in their everyday lives by recognizing (1a) the mechanical action employed by the robot and (1b) the specific need the robot fulfills.
    SUPPLEMENTAL QUESTIONS: What does the robot do? Why was the robot created? Does the robot replace the action a human would traditionally do? If so, what can the robot do that a human cannot?
  2. Be able to recognize humans’ roles in creating and dictating the actions of robots.
    SUPPLEMENTAL QUESTIONS: What are the instructions this robot is following? Who gave the robot these instructions? Why is the robot asked to carry out these specific actions?
  3. Be able to recognize their anthropomorphic tendencies and the tendencies others exhibit.
    SUPPLEMENTAL QUESTIONS: What is anthropomorphism [or a simpler term for younger students]? What parts of the robot make you think of it as something other than a machine?
  4. Be able to articulate, in simple terms, their own intellectual and emotional responses to a robot interaction.
    SUPPLEMENTAL QUESTIONS: How does the robot make you feel? If it does not make you feel a certain way, what are your initial thoughts about the robot? If it does make you feel emotions toward it, what are other times you have felt this emotion?
  5. Be able to distinguish between anthropomorphic responses to a robot interaction and responses one has with other living entities.
    SUPPLEMENTAL QUESTIONS: What are other instances in which you have felt the same response you are having to this robot? Why did you feel this emotion in your interaction with a human or other living thing? Why does this robot make you feel similarly? Does this emotion make you feel differently about robots or living things in any way?
  6. Be able to recognize, using an elementary knowledge of data collection, the heightened capabilities of some advanced robots.
    SUPPLEMENTAL QUESTIONS: Is this robot learning things about you? How can you tell? Is this robot changing the way it works for or interacts with you the more you use it? What do you think this robot knows about you or your habits? How can this robot potentially use this information about you in other ways?
  7. Be able to formulate critical questions about the purpose of the robot and why it exhibits certain design characteristics.
    SUPPLEMENTAL QUESTIONS: Why is the robot shaped in this way? For example, why was the robot given a pair of eyes if it does not use them to see?

Conclusion

As robots become more integrated into society and our technology allows for extremely life-like robotic systems, HRI education will need to become a greater focus of early education. Humans need to better understand how their cognitive abilities and functions are shaped when interacting with robots, but they cannot gain all of these competencies through life experience alone. Because many human cognitive functions develop in a child’s preschool and elementary school years, introducing these learning goals into the pedagogies of early education teachers is vital for gaining both knowledge of and comfort in interacting with robots. To achieve greater efficiency and safety in robot interaction, our education system must implement an approach that provides people with the proper skills and understanding.

References

Bai, Matt. 2005. “The Framing Wars.” The New York Times, July 17. http://www.nytimes.com/2005/07/17/magazine/the-framing-wars.html.

Bühler, Christian, and Harry Knops. 1999. Assistive Technology on the Threshold of the New Millennium. IOS Press.

Calo, Ryan. 2010. “Robots and Privacy.” SSRN Scholarly Paper ID 1599189. Rochester, NY: Social Science Research Network. http://papers.ssrn.com/abstract=1599189.

Casañ, Gustavo, Enric Cervera, and Angel del Pobil. 2015. “Online Teaching of Humanoid Robots.” Portland, OR. http://www.rose-hulman.edu/media/1601875/Casan_FinalVersion-13.pdf.

Darling, Kate. 2015. “‘Who’s Johnny?’ Anthropomorphic Framing in Human-Robot Interaction, Integration, and Policy.” Miami, FL. http://www.werobot2015.org/wp-content/uploads/2015/04/Darling_Whos_Johnny_WeRobot_2015.pdf.

Gutiu, Sinziana. 2012. “Sex Robots and Roboticization of Consent.”

Huang, Lixiao, Douglas Gillan, and Terri Varnado. 2015. “Using Robotics Education to Improve Problem Solving Skills, Metacognition, and Self-Efficacy.” Portland, OR. http://www.rose-hulman.edu/media/1600596/HRI-Education-Workshop-Extended-Abstract-_-Huang-et-al_Revised-final.pdf.

Inada, Miho, and Anne Tergesen. 2010. “It’s Not a Stuffed Animal, It’s a $6,000 Medical Device.” Wall Street Journal, June 21, sec. US Page One. http://www.wsj.com/articles/SB10001424052748704463504575301051844937276.

Jones (Ambrose), Meg Leta. 2015. “Privacy Without Screens & the Internet of Other People’s Things.” SSRN Scholarly Paper ID 2614066. Rochester, NY: Social Science Research Network. http://papers.ssrn.com/abstract=2614066.

Kimbler, D. L. 1984. “Robots and Special Education: The Robot as Extension of Self.” Peabody Journal of Education 62 (1): 67–76.

Knight, Heather. 2014. “How Humans Respond to Robots: Building Public Policy through Good Design.” Brookings Institution Report. The Robots Are Coming: The Project On Civilian Robotics. Robotics Institute: Carnegie Mellon University. http://www.brookings.edu/~/media/Research/Files/Reports/2014/07/29-how-humans-respond-to-robots-knight/HumanRobot-PartnershipsR2.pdf?la=en.

Koplowitz, Howard. 2015. “Sex Robots: Meet Roxxxy, Robot That Comes With ‘Skank Mode’ [NSFW, VIDEO].” International Business Times. Accessed December 14. http://www.ibtimes.com/sex-robots-meet-roxxxy-robot-comes-skank-mode-nsfw-video-439870.

Mead, Ross, and Maja Mataric. 2015. “Social Robotics in the Middle School Classroom: Increasing Student STEM Achievement-Related Choices.” Portland, OR. http://www.rose-hulman.edu/media/1599132/MeadMataric_HRI-E2015.pdf.

Mubin, Omar, Catherine J. Stevens, Suleman Shahid, Abdullah Al Mahmud, and Jian-Jie Dong. 2013. “A Review of the Applicability of Robots in Education.” Technology for Education and Learning 1 (1). doi:10.2316/Journal.209.2013.1.209-0015.

National Association for Media Literacy Education. 2015. “The Core Principles of Media Literacy Education.” National Association for Media Literacy Education. Accessed December 14. http://namle.net/publications/core-principles/.

National Governors Association Center for Best Practices, and Council of Chief State School Officers. 2010. “Common Core State Standards.”

Palus, Shannon. 2014. “The Military Is Funding the Creation of Adorable Robots.” Slate, July 10. http://www.slate.com/articles/technology/future_tense/2014/07/flipperbot_rhex_military_robots_get_adorable_and_creepy.html.

Richards, Neil, and William Smart. 2012. “How Should the Law Think about Robots?” Miami, FL. http://robots.law.miami.edu/wp-content/uploads/2012/03/RichardsSmart_HowShouldTheLawThink.pdf.

Sandoval, Eddie, Omar Mubin, and Juergen Brandstetter. 2015. “Making HRI Accessible to Everyone Through Online Videos. A Proposal for a µMOOC in Human Robot Interaction.” Portland, OR. http://www.rose-hulman.edu/media/1599141/SandovalHRI_Education__From_outreach_to__new_researchers_in_the_field.pdf.

Sauppe, Allison, and Chien-Ming Huang. 2015. “Contextualizing the CSTA Recommendations Using Human-Robot Interaction.” Portland, OR. http://www.rose-hulman.edu/media/1599138/Sauppe-hri-15-ws.pdf.

Simmons, Reid, and Illah Nourbakhsh. 2015. “An Undergraduate Course in Human-Robot Interaction at Carnegie Mellon University.” Portland, OR. http://www.rose-hulman.edu/media/1599135/Simmons-An-Undergraduate-Course-in-HRI.pdf.

Stanley, Jay. 2015. “Computers vs. Humans: What Constitutes A Privacy Invasion?” American Civil Liberties Union. Accessed November 30. https://www.aclu.org/blog/computers-vs-humans-what-constitutes-privacy-invasion.

Tilley, Aaron. 2015. “Jibo Raises Another $16 Million To Bring Its ‘Family Robot’ To China And Japan.” Forbes. Accessed December 14. http://www.forbes.com/sites/aarontilley/2015/12/09/jibo-raises-another-16-million-to-bring-its-family-robot-to-china-and-japan/.

Tsui, Katherine. 2015. “Lessons Learned: Towards Informal Science Education of Human-Robot Interaction at Children’s Science Museums.” Portland, OR. http://www.rose-hulman.edu/media/1599129/Tsui-HRI-informal-education-telepresence-musuem-CAMERA-READY-final.pdf.

Welsh, Janet A., Robert L. Nix, Clancy Blair, Karen L. Bierman, and Keith E. Nelson. 2010. “The Development of Cognitive Skills and Gains in Academic School Readiness for Children from Low-Income Families.” Journal of Educational Psychology 102 (1): 43–53. doi:10.1037/a0016738.

Jilanne Doom

Jilanne Doom is a second-year Master's candidate in the Communication, Culture & Technology program at Georgetown University and the Assistant Multimedia Editor for gnovis. Alongside her studies, she works as a videographer at the Berkley Center for Religion, Peace and World Affairs and a communications specialist for IREX. Her interests include new media and identity, media bias and literacy, and photography. She may be reached through her website, www.jilannedoom.com.