2021 | OriginalPaper | Book chapter

14. Geräusche, Stimmen und natürliche Sprache

Kommunikation mit sozialen Robotern

Author: Kerstin Fischer

Published in: Soziale Roboter

Publisher: Springer Fachmedien Wiesbaden

Abstract

This chapter provides an overview of research on the involuntary sounds of social robots, on non-linguistic yet communicative utterances, on robot voices and speaking styles, and finally on natural-language interaction with social robots. While the noises produced by motors, servos, and other technical components are generally perceived as rather disturbing, other robot sounds are regarded as communicative and are, as a rule, interpreted on the basis of the principles that govern human interaction. The same holds for natural-language dialogue with robots, depending on whether users engage in social interaction with robots at all.
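As a purely illustrative sketch (not taken from the chapter), the following Python snippet shows how such non-linguistic yet communicative utterances might be generated: two short beeps whose pitch contours borrow from human intonation, a rising glide that tends to be heard as question-like and a falling glide that tends to be heard as assertive. The file names, parameter values, and helper functions are assumptions made for illustration only.

# Illustrative sketch (not from the chapter): two non-linguistic robot
# utterances whose pitch contours borrow from human intonation.
# Assumes numpy and the standard-library wave module; all names are invented.
import wave
import numpy as np

SAMPLE_RATE = 16000  # Hz

def pitch_sweep_beep(f_start: float, f_end: float, duration: float) -> np.ndarray:
    """Return a mono sine tone whose frequency glides from f_start to f_end."""
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    freq = np.linspace(f_start, f_end, t.size)
    # The instantaneous phase is the (discrete) integral of the frequency.
    phase = 2.0 * np.pi * np.cumsum(freq) / SAMPLE_RATE
    envelope = np.hanning(t.size)  # fade in/out to avoid clicks
    return 0.5 * envelope * np.sin(phase)

def write_wav(path: str, samples: np.ndarray) -> None:
    """Write a float signal in [-1, 1] as a 16-bit mono WAV file."""
    pcm = (samples * 32767).astype(np.int16)
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(pcm.tobytes())

# Rising contour: tends to be interpreted as question-like or uncertain.
write_wav("robot_query.wav", pitch_sweep_beep(300, 600, 0.4))
# Falling contour: tends to be interpreted as assertive or final.
write_wav("robot_confirm.wav", pitch_sweep_beep(600, 300, 0.4))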

Footnotes
1
See, e.g., the workshop Sounds in HRI at the HRI'21 conference: https://r00binson.wixsite.com/soundinhri.
 
Metadata
Title
Geräusche, Stimmen und natürliche Sprache
Author
Kerstin Fischer
Copyright year
2021
DOI
https://doi.org/10.1007/978-3-658-31114-8_14
