An Empirical Study of Multichannel Communication: Russian Pear Chats and Stories

  • Andrej A. Kibrik (Андрей Александрович Кибрик), Lomonosov Moscow State University
  • Olga V. Fedorova (Ольга Викторовна Федорова), Lomonosov Moscow State University
Keywords: multimodality, multichannel discourse, corpus creation, prosody, gestures, eye gaze, annotation

Abstract

This paper addresses language in its most natural form: spoken multichannel discourse. Multichannel discourse comprises the verbal component, prosody, eye gaze, and the various kinetic aspects of communication, that is, facial, head, hand, and torso gestures. To explore natural multichannel discourse as it is, we created the resource “Russian Pear Chats and Stories”. The resource includes 40 recording sessions with 160 native speakers of Russian aged 18–36 (60 men and 100 women); it consists of 15 hours of recordings and about 170,000 words. This paper details how the corpus was created and how it can be used. First, we provide an overview of the methodology of multimodal studies and multichannel corpora. Then we describe the properties of our resource: the data collection setup, the recording software, and the types of annotation, as well as some avenues of present and future research, including prosody as an interface between the vocal and gestural channels, the specific nature and degree of coordination between manual gestures and elementary discourse units, individual variation and the “portrait” methodology, language production and comprehension in face-to-face communication, and visual attention in natural communication. In its current version, the corpus is available to the scientific community at the project website multidiscourse.ru (in Russian).
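
Illustration: inspecting multichannel annotations. The following sketch is not part of the original article; it only illustrates how time-aligned annotation tiers of the kind described above could be inspected programmatically. It assumes, hypothetically, that annotations are distributed as ELAN (.eaf) files and that a tier named "words" exists; the file name and tier name are placeholders, and the third-party pympi-ling library is used.

    # Minimal sketch (assumptions: ELAN .eaf files; file and tier names are hypothetical).
    import pympi  # pip install pympi-ling

    # Load one session's annotation file (placeholder path).
    eaf = pympi.Elan.Eaf("session01_narrator.eaf")

    # List the annotation tiers available in the file (e.g., speech, gesture, gaze tiers).
    print(eaf.get_tier_names())

    # Print time-aligned annotations from a hypothetical "words" tier;
    # each entry is (start_ms, end_ms, value).
    for start_ms, end_ms, value in eaf.get_annotation_data_for_tier("words"):
        print(f"{start_ms:>8} {end_ms:>8}  {value}")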


Published
2018-11-05
How to Cite
Kibrik, A. A., & Fedorova, O. V. (2018). An Empirical Study of Multichannel Communication: Russian Pear Chats and Stories. Psychology. Journal of the Higher School of Economics, 15(2), 191-200. https://doi.org/10.17323/1813-8918-2018-2-191-200
Section
Neurocognitive Aspects of Language Function and Use