EntityBot: Supporting everyday digital tasks with entity recommendations

Tung Vuong, Salvatore Andolina, Giulio Jacucci, Pedram Daee, Khalil Klouche, Mats Sjöberg, Tuukka Ruotsalo, Samuel Kaski

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review


Everyday digital tasks can benefit greatly from systems that recommend the right information to use at the right time. However, existing solutions typically support only specific applications and tasks. In this demo, we showcase EntityBot, a system that captures context across application boundaries and recommends information entities related to the current task. The user's digital activity is continuously monitored by capturing all content on the computer screen using optical character recognition. This covers all applications and services in use and the individual's specific computer activities, such as instant messaging, emailing, web browsing, and word processing. A linear model is then applied to detect the user's task context and retrieve entities such as applications, documents, contact information, and keywords characterizing the task. The system has been evaluated with real-world tasks, demonstrating that the recommendations had an impact on the tasks and led to high user satisfaction.
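The pipeline described above (OCR'd screen text as context, a linear model scoring candidate entities) can be illustrated with a minimal sketch. This is not the paper's actual implementation: the feature extraction, the per-entity term weights, and the example entities here are all hypothetical assumptions for illustration.

```python
from collections import Counter

def extract_context(screen_text):
    """Bag-of-words context vector from OCR'd screen text (hypothetical features)."""
    return Counter(screen_text.lower().split())

def score_entity(context, weights):
    """Linear model: dot product of context term counts and per-entity weights."""
    return sum(count * weights.get(term, 0.0) for term, count in context.items())

def recommend(screen_text, entity_weights, k=3):
    """Rank candidate entities (apps, documents, contacts) for the current task."""
    context = extract_context(screen_text)
    scored = [(name, score_entity(context, w)) for name, w in entity_weights.items()]
    return [name for name, score in sorted(scored, key=lambda x: -x[1])[:k] if score > 0]

# Hypothetical learned weights for two candidate entities.
entity_weights = {
    "budget.xlsx": {"budget": 2.0, "invoice": 1.0},
    "alice@example.com": {"email": 1.5, "alice": 2.0},
}
```

For instance, `recommend("Draft email to Alice about budget", entity_weights, k=2)` would rank the contact entity above the document, since the screen text matches more of its weighted terms.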

Original language: English
Title of host publication: RecSys 2021 - 15th ACM Conference on Recommender Systems
Number of pages: 4
ISBN (Electronic): 9781450384582
Publication status: Published - 13 Sept 2021
MoE publication type: A4 Conference publication
Event: ACM International Conference on Recommender Systems - Virtual, Online, Netherlands
Duration: 27 Sept 2021 - 1 Oct 2021
Conference number: 15


Conference: ACM International Conference on Recommender Systems
Abbreviated title: RecSys
City: Virtual, Online


Keywords:
  • Proactive information retrieval
  • Real-world tasks
  • User intent modeling


