BERTERS: Multimodal representation learning for expert recommendation system with transformers and graph embeddings
Abstract
An expert recommendation system suggests experts relevant to a particular topic based on three scores: authority, text similarity, and reputation. Most previous studies compute these scores independently and combine them through a linear combination strategy. In this paper, we introduce a transfer learning-based, multimodal approach, called BERTERS, that represents each candidate expert by a single vector that encodes these scores. This representation captures the candidate's level of knowledge, popularity, influence, and history. BERTERS uses transformers and graph embedding techniques to convert the content published by candidates and the collaborative relationships between them into low-dimensional vectors that reflect the candidates' text similarity and authority scores. To further improve recommendation accuracy, BERTERS also incorporates additional features as a reputation score. We conduct extensive experiments on multi-label classification, recommendation, and visualization tasks, and assess performance across four classifiers, different training ratios, and various embedding sizes. In the classification task, BERTERS improves Micro-F1 and Macro-F1 by 23.40% and 34.45%, respectively, compared with single-modality methods, and achieves a 9.12% gain over the baselines. The results also demonstrate that BERTERS extends to a variety of domains, such as academic and CQA settings, for finding experts. Because the proposed expert embeddings contain rich semantic and syntactic information about each candidate, BERTERS significantly outperforms the baselines in all tasks.
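The abstract describes fusing a transformer-based text embedding, a graph embedding of the collaboration network, and reputation features into one expert vector. The following is a minimal sketch of that fusion step, not the authors' implementation: the model name (`bert-base-uncased`), the mean-pooling choice, the precomputed graph embedding, and the example reputation features are all illustrative assumptions.

```python
# Illustrative sketch of a BERTERS-style multimodal expert representation:
# transformer text embedding + collaboration-graph embedding + reputation features,
# concatenated into a single vector. Assumed configuration, not the paper's exact one.
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

def text_embedding(documents):
    """Mean-pool the [CLS] vectors of a candidate's published documents."""
    vectors = []
    for doc in documents:
        inputs = tokenizer(doc, truncation=True, max_length=512, return_tensors="pt")
        with torch.no_grad():
            outputs = bert(**inputs)
        vectors.append(outputs.last_hidden_state[:, 0, :].squeeze(0).numpy())
    return np.mean(vectors, axis=0)

def expert_vector(documents, graph_embedding, reputation_features):
    """Concatenate the three modalities into one expert representation."""
    return np.concatenate([
        text_embedding(documents),        # text-similarity signal
        graph_embedding,                  # authority signal from the collaboration graph
        np.asarray(reputation_features),  # hypothetical reputation features
    ])

# Usage: the graph embedding is assumed precomputed (e.g. by DeepWalk or node2vec).
docs = ["Expert finding with multimodal embeddings.", "Graph representation learning."]
graph_emb = np.random.rand(64)             # placeholder for a learned node embedding
rep = [120.0, 0.87]                        # e.g. citation count, acceptance ratio (illustrative)
vec = expert_vector(docs, graph_emb, rep)  # 768 + 64 + 2 = 834-dimensional expert vector
```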
- Publication:
- Chaos, Solitons and Fractals
- Pub Date:
- October 2021
- DOI:
- 10.1016/j.chaos.2021.111260
- arXiv:
- arXiv:2007.07229
- Bibcode:
- 2021CSF...15111260N
- Keywords:
- Multimodal representation learning;
- Expert recommendation system;
- Transformer;
- Graph embedding;
- Computer Science - Information Retrieval;
- Computer Science - Computation and Language;
- Computer Science - Machine Learning