June 21, 2023

FUSION R&I project HUMAINE concludes

  • Tagged: Funded Project

Since July 2022, the IDG has been running the HUMAINE research project, funded by the Malta Council for Science and Technology (MCST) under the FUSION R&I: Research Excellence Programme (Project number: REP-2022-017). At the end of June 2023, HUMAINE is coming to a close and a retrospective of its accomplishments is in order. The objective of HUMAINE was inherently ambitious: to endow computational and AI systems with the ability to understand the human element. The work carried out within HUMAINE blends research in Human-Computer Interaction (HCI) and Artificial Intelligence (AI). Innovations are needed in both domains: to allow human users to provide human data at the appropriate level and format for an AI to understand (an HCI problem), and to build AI models that can predict such human data (an AI problem). These form the two core research directions tackled within HUMAINE over the past year.

Interfaces for capturing the human ground truth: HUMAINE aimed to design and develop interaction paradigms and interfaces for capturing subjective notions such as player engagement. The goal is to capture as much information as possible about a human's subjective experience while introducing the least possible bias and cognitive load. This was accomplished through extensions to the Platform for Audiovisual General-purpose ANnotation, focusing on improving current annotation practices based on feedback from annotators and stakeholders. Beyond the interface, however, we needed an interesting stimulus for humans to annotate: we identified fast-paced, high-action first-person shooter games as the ideal stimulus and collected gameplay footage from online Let's Play videos, cleaned and cut into short sessions. Using at least 10 annotators, we collected third-person, time-continuous annotation traces of the player's engagement level as these annotators perceived it throughout the video.
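To give a flavour of what working with such data involves, here is a minimal sketch (not the project's actual pipeline) of how a time-continuous annotation trace might be turned into ordinal labels: the trace is resampled to a fixed rate, split into windows, and consecutive windows are compared to yield "engagement went up / down" pairs. The function names, sampling rate, and threshold are illustrative assumptions.

```python
import numpy as np

def resample_trace(timestamps, values, rate_hz=4):
    """Resample an unevenly sampled annotation trace to a fixed rate."""
    t_new = np.arange(timestamps[0], timestamps[-1], 1.0 / rate_hz)
    return t_new, np.interp(t_new, timestamps, values)

def trace_to_preferences(values, window=12, threshold=0.05):
    """Compare mean engagement of consecutive windows; emit ordinal pairs.

    Returns a list of (i, j) index pairs meaning "window i was more
    engaging than window j". Near-ties (within `threshold`) are skipped,
    since tiny differences in a subjective trace are unreliable.
    """
    means = [values[k:k + window].mean()
             for k in range(0, len(values) - window + 1, window)]
    prefs = []
    for k in range(len(means) - 1):
        if means[k + 1] - means[k] > threshold:
            prefs.append((k + 1, k))
        elif means[k] - means[k + 1] > threshold:
            prefs.append((k, k + 1))
    return prefs

# Example: a synthetic 60-second trace annotated at irregular intervals.
t = np.sort(np.random.uniform(0, 60, 200))
v = np.clip(np.cumsum(np.random.randn(200)) * 0.01 + 0.5, 0, 1)
_, v_fixed = resample_trace(t, v)
print(trace_to_preferences(v_fixed)[:5])
```

Treating the trace ordinally, rather than as absolute values, sidesteps the fact that different annotators use the slider's scale very differently.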

AI that learns how humans feel when playing a game: The human data collected through these interfaces had a further purpose: to build AI models that can predict how engaged a player is simply by watching the playthrough video. To do this, HUMAINE combined state-of-the-art computer vision models with modern AI algorithms for Preference Learning and Learning Using Privileged Information. The goal of HUMAINE was to endow an AI with the ability to predict the impact of visual stimuli (e.g. the raw pixels of a playthrough video) on the player's engagement. Testing these algorithms on gameplay videos annotated for engagement, as well as on traditional affect datasets of facial expressions annotated for arousal and valence, we conducted a number of experiments showing that, to a certain degree, one can learn more from less and predict engagement at reasonable accuracy from the footage alone. This is very important for game development: traditional data collection for affect detection relies on worn biosensors, webcams, voice recordings and more, which are highly intrusive, and few players would want to be monitored by them around the clock.
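As a rough illustration of the Preference Learning side, below is a minimal PyTorch sketch in the spirit of such models, not HUMAINE's actual code: a small convolutional network embeds a short stack of frames into a scalar engagement score, and a RankNet-style pairwise loss trains it so that clips annotated as more engaging score higher. The architecture, tensor shapes, and names are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EngagementScorer(nn.Module):
    """Maps a short clip (frames stacked into channels) to a scalar score."""
    def __init__(self, in_channels=9):  # e.g. 3 RGB frames stacked
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return self.head(z).squeeze(-1)  # one engagement score per clip

def preference_loss(score_preferred, score_other):
    """RankNet-style loss: the probability that the preferred clip scores
    higher is a sigmoid of the score difference, pushed toward 1."""
    return F.binary_cross_entropy_with_logits(
        score_preferred - score_other,
        torch.ones_like(score_preferred))

# One training step on a batch of clip pairs (a preferred over b).
model = EngagementScorer()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
clip_a = torch.randn(8, 9, 128, 128)  # dummy "more engaging" clips
clip_b = torch.randn(8, 9, 128, 128)  # dummy "less engaging" clips
opt.zero_grad()
loss = preference_loss(model(clip_a), model(clip_b))
loss.backward()
opt.step()
```

At deployment time only the scorer is needed, which is precisely why footage-only prediction matters: the biosensors and webcams used during data collection never have to follow the player home.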

Find more details about the achievements of HUMAINE and try out its annotation interface on the HUMAINE website.