Method of emotional forecasting in online interviewing

Authors

  • Sergey A. Aleynikov, ITMO University
  • Sofia A. Sorokina, ITMO University
  • Alexandr I. Ofitserov, Academy of Federal Security Guard Service of Russian Federation

DOI:

https://doi.org/10.52575/2687-0932-2021-48-1-178-187

Keywords:

multimodality, online interviewing, psycho-emotional analysis, emotional forecasting, machine learning

Abstract

There is currently a marked growth of business interest in digital technologies. Many companies, mostly small and medium-sized businesses, are moving interviews and negotiations to audio and video conferencing systems, and the CEOs of large companies are declaring the need to move the HR function online by adopting new technologies. The media report ever more widely on the capabilities of the tools that implement these technologies and on the impressive results of their deployment. For example, starting in 2021 a number of Russia's leading banks plan to introduce software that reads clients' emotions during phone conversations and visits to the credit institution's offices. However, research shows that users are often disappointed with the results they obtain, because the multimodality of the information flows is perceived far less easily online. These contradictions call for research into and development of new approaches to online interviewing. The fundamentally new method of emotional forecasting considered in the article makes it possible to predict the success of an interview against interactively user-defined parameters, which should ultimately resolve the problem of transforming the HR function into an online format.
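The article's own model is not reproduced on this page. As a purely illustrative sketch of the kind of pipeline the keywords describe (multimodal psycho-emotional analysis feeding a forecast against user-defined parameters), the hypothetical Python fragment below fuses per-modality emotion scores and compares the averaged emotional profile with user-defined targets. All names, weights and the closeness measure are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: late fusion of per-modality emotion scores and a
# forecast against user-defined target parameters. Names and weights are
# hypothetical and are NOT taken from the article.

from dataclasses import dataclass


@dataclass
class SegmentEmotions:
    """Hypothetical per-modality emotion scores in [0, 1] for one interview segment."""
    text: dict[str, float]    # e.g. {"joy": 0.6, "anger": 0.1}
    audio: dict[str, float]
    video: dict[str, float]


def fuse_modalities(seg: SegmentEmotions,
                    weights=(0.4, 0.3, 0.3)) -> dict[str, float]:
    """Weighted late fusion of text, audio and video emotion scores (assumed weights)."""
    w_text, w_audio, w_video = weights
    emotions = set(seg.text) | set(seg.audio) | set(seg.video)
    return {
        e: w_text * seg.text.get(e, 0.0)
           + w_audio * seg.audio.get(e, 0.0)
           + w_video * seg.video.get(e, 0.0)
        for e in emotions
    }


def forecast_success(segments: list[SegmentEmotions],
                     target: dict[str, float]) -> float:
    """Score (0..1) how closely the fused emotional profile of an interview
    matches interactively user-defined target parameters."""
    if not segments:
        return 0.0
    fused = [fuse_modalities(s) for s in segments]
    # Average fused scores over all segments of the interview.
    avg = {e: sum(f.get(e, 0.0) for f in fused) / len(fused) for e in target}
    # Simple closeness measure: 1 minus mean absolute deviation from targets.
    deviation = sum(abs(avg[e] - v) for e, v in target.items()) / len(target)
    return max(0.0, 1.0 - deviation)


if __name__ == "__main__":
    seg = SegmentEmotions(
        text={"joy": 0.7, "anger": 0.1},
        audio={"joy": 0.5, "anger": 0.2},
        video={"joy": 0.6, "anger": 0.1},
    )
    # Hypothetical user-defined parameters for a "successful" interview.
    target = {"joy": 0.6, "anger": 0.1}
    print(f"Forecasted success: {forecast_success([seg], target):.2f}")
```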


Author Biographies

Sergey A. Aleynikov, ITMO University

Engineer, ITMO University, St. Petersburg, Russia; Frontend Developer, Yandex.Technologies LLC, Moscow, Russia

ORCID ID: 0000-0002-6884-2322

Sofia A. Sorokina, ITMO University

Engineer, ITMO University, St. Petersburg, Russia

ORCID ID: 0000-0001-9159-5203

Alexandr I. Ofitserov, Academy of Federal Security Guard Service of Russian Federation

Candidate of Technical Sciences, employee of the Academy of the Federal Security Guard Service of the Russian Federation

ORCID ID: 0000-0002-4379-6948





Published

2022-09-19

How to Cite

Aleynikov, S. A., Sorokina, S. A., & Ofitserov, A. I. (2022). Method of emotional forecasting in online interviewing. Economics. Information Technologies, 48(1), 178-187. https://doi.org/10.52575/2687-0932-2021-48-1-178-187

Section

INFOCOMMUNICATION TECHNOLOGIES
