Joint research and synergies with sister projects

Together with CAROUSEL, four other projects were selected by the European Commission in the 2020 call for proposals “FET Proactive emerging communities and paradigms – Artificial intelligence for extended social interaction“. SONICOM, EXPERIENCE, CAROUSEL and TOUCHLESS (SECT) pursue complementary objectives while conducting research in the same human-centric AI area. To leverage synergies, foster collaboration between participants and maximize impact, the four SECT projects have decided to keep each other informed on a regular basis: concretely, through regular meetings of a collaboration board representing the four projects and through cross-participation in each other’s main project meetings.

After one year, this strategy of cross-fertilization of ideas and technologies has already yielded the first concepts for concrete co-research items. Updates illustrating concrete results of the collaboration between the sister projects will be posted on this page.

SONICOM, EXPERIENCE, CAROUSEL, TOUCHLESS and GUESTXR (SECTG) are EU ‘sister projects’ funded under the Horizon 2020 FETPROACT-EIC-07-2020 call, subtopic “Artificial Intelligence for extended social interaction”. The sister projects have set up a cross-collaboration plan that expands on the interdisciplinary and ambitious goals of the individual projects, with five main objectives:
1) To examine any relevant synergies and establish two-way communication and dissemination between the sister projects;
2) To promote the development of science and innovation ideas between the sister projects;
3) To support the creation of a strategic expert group on “Artificial Intelligence for extended social interaction” that can influence and support the development of future legislation and funding opportunities aligned with this field;
4) To nurture the creation of future collaborations that are strategically aligned with the key areas identified and support them in identifying future funding opportunities; and
5) To identify and support stakeholder engagement for the research outputs developed in the sister projects to ensure enhanced project impact.
SONICOM – Transforming auditory-based social interaction and communication in AR/VR.
Sound is an integral part of the human experience. As one of the most important ways of sensing and interacting with our environment, sound plays a major role in shaping how the world is perceived. In virtual or augmented reality (VR/AR), simulating spatially correct audio is of vital importance to delivering an immersive virtual experience. However, acoustic VR/AR presents many challenges. Using the power of artificial intelligence, the EU-funded SONICOM project aims to deliver the next milestone in immersive audio simulation. The goal is to design the next generation of 3D audio technologies, provide tailored audio solutions and significantly improve how we interact with the virtual world. Grant agreement ID: 101017743. https://www.sonicom.eu/
EXPERIENCE – The “Extended-Personal Reality”: augmented recording and transmission of virtual senses through artificial-IntelligENCE.
In recent years, virtual reality (VR) has found more commercial applications and a larger market share. However, VR still holds great potential to enhance interaction and expression. The EU-funded EXPERIENCE project seeks to use VR to enhance daily life by allowing brand new ways of social interaction and personal expression. It will develop the technology required to help users easily create and manipulate their own unique VR environments, significantly improving their virtual experiences. The goal is to bring VR into areas such as mental health treatment, entertainment and education, promoting VR as a means of significantly improving the human experience. Grant agreement ID: 101017727. https://experience-project.eu/
CAROUSEL – Real-life interactions in the digital world.
Imagine a digital character taking on the real-world role of a physical therapist, teacher or guide. This may one day become reality. With this in mind, the EU-funded CAROUSEL project is developing technology that will pave the way for social-physical behaviour of digital characters, by better understanding human body language and giving them the capability to interact autonomously with a group of people. It will examine, simulate, develop, test and validate human–digital interaction scenarios. The potential for this “Real-World Social and Physical AI” is vast, as it can be applied in many areas that call for physical interaction in a socially meaningful way. Grant agreement ID: 101017779. https://www.carouseldancing.org/
TOUCHLESS – Haptic Experiences with Neurocognitive AI.
The social distancing measures introduced to restrict the spread of coronavirus and help halt the COVID-19 pandemic have made us realise just how important interpersonal physical contact is to our overall well-being, and how much we miss it. Even though many advanced haptic technologies exist, they cannot replace tactile interaction, which creates a bonding physical sensation of care through the stimulation of skin receptors during touch. The EU-funded TOUCHLESS project aims to overcome this challenge through the development of next-generation touchless haptic technology. It will construct neurocognitive models and a novel AI framework to enable touchless digital inducement of the sensation of touch through the receptor response. It will also enable a soothing, affective, social and cognitive experience. Grant agreement ID: 101017746. https://www.touchlessai.eu/
GUESTXR – A Machine Learning Agent for Social Harmony in eXtended Reality.
User-generated content often stimulates antisocial interaction and abuse, representing a threat to vulnerable adults, teenagers and children. The EU-funded GuestXR project intends to develop a socially interactive multisensory platform that uses extended reality – virtual and augmented – as a vehicle to connect people for immersive, synchronous face-to-face interaction with positive social results. The project will introduce a critical innovation: the intervention of artificial agents that learn over time to assist the virtual social gathering in realising its purposes. This agent, ‘The Guest’, will exploit machine learning to guide the meeting towards specific outcomes. The project will rely on neuroscience and social psychology research on group behaviour to deliver rules to agent-based models.