Conference Update: New Conference Date & Venue
IUI 2026 will take place 13–16 July 2026.
For more information, please refer to the conference website.
About the Workshop
The integration of eXtended Reality (XR) and Artificial Intelligence (AI) is moving immersive experiences from specialized applications to everyday wearable platforms, marking a paradigm shift in how people interact with intelligent systems.
Our workshop focuses on the potential of these interfaces to transform human-AI interaction, from hands-free, spatially contextualized dialogue to embodied, gesture-enhanced communication, enabling seamless integration into daily activities.
We bring together researchers and practitioners to explore how XR, combined with AI-driven adaptation and human-centered evaluation, can create intelligent systems that truly augment human capabilities.
Workshop Topics
AI-Driven Personalization
How AI can drive real-time personalization of immersive interfaces that adapt to user states, contexts, and goals.
Multimodal Interaction
Design principles for XR-based multimodal interaction with AI agents, ensuring efficiency and intuitiveness.
Human Factors
Methods to measure and optimize cognitive load, emotional engagement, and trust in AI-mediated XR interactions.
Inclusive Design
Accessibility-by-design approaches ensuring inclusivity for users with diverse abilities.
Collaborative Dynamics
Social and collaborative dynamics in multi-user AI-enhanced XR spaces.
Ethical Considerations
Ethical frameworks for responsible development of AI-powered XR experiences.
Key Dates
Full Paper Submission Deadline
December 19, 2025 (extended to January 16, 2026, firm)
Full Paper Acceptance Notification
February 2, 2026
Final Paper Submission Deadline
February 16, 2026
Program: July 13, 2026 (Draft)
Session S1
9:00 – 9:30 — Welcome and introduction
9:30 – 10:00 — Occlusion-Robust Multimodal Emotion Recognition in VR via Fusion of Facial Images and EMG
Birgit Nierula, Karam Tomotaki-Dawoud, Mert Akguel, Mustafa-Tevfik, David Przewozny, Anna Hilsmann, Peter Eisert, Sebastian Bosse
10:00 – 10:30 — Spatio-Temporal Representation of Eye-Tracking Data for Cognitive Failure Detection in VR
Tanaz Ghahremani, Sahar Niknam, Berin Venedik, Jean Botev
10:30 – 11:00 — Coffee Break
Session S2
11:00 – 11:30 — Inclusive Hand-Based Region-of-Interest Selection Using On-Device Landmark Recognition
Fabian Glöggler, Markus Hahn, Marianne von Schwerin
11:30 – 12:00 — From Multimodal Signals to Adaptive XR Experiences for De-escalation Training
Birgit Nierula, Karam Tomotaki-Dawoud, Daniel Johannes, Iryna Ignatieva, Mina Mottahedin, Thomas Koch, Sebastian Bosse
12:00 – 12:30 — Adaptive Virtual Reality Museum: A Closed-Loop Framework for Engagement-Aware Cultural Heritage
Joseph Damouni, Wadia Tanus, Naomi Unkelos Shpigel
12:30 – 14:00 — Lunch
Session S3
14:00 – 14:20 — Empowering NAO: Overcoming Humanoid Robot Constraints via Large Language Models
Valerio Colonnese, Gilda Manfredi, Nicola Capece, Ugo Erra, Anthony Rivelli
14:20 – 14:40 — Supporting Medical Doctors with AI-based Smart Wearable Assistance: Case Study of Diagnosis and Therapy Evaluation for Ward Visits
Agnese Augello, Giuseppe Caggianese, Silvia Franchini, Ignazio Infantino, Luca Sabatucci
14:40 – 15:15 — Demo Presentation: Inclusive Hand-Based Region-of-Interest Selection Using On-Device Landmark Recognition
Fabian Glöggler, Markus Hahn, Marianne von Schwerin
15:15 – 15:30 — Workshop closing and next steps
Organizers
Giuseppe Caggianese
National Research Council of Italy (CNR)
Research Scientist heading the Augmented Human-Computer Interaction group. Focuses on designing advanced interfaces for XR in medical and cultural domains.
Marta Mondellini
National Research Council of Italy (CNR)
Researcher focusing on the Human Factor in human-machine interaction, particularly in clinical rehabilitation and telemedicine scenarios.
Nicola Capece
University of Basilicata
Assistant Professor focusing on Real-Time Rendering, Deep Learning, Computer Graphics, and XR applications in cultural heritage and beyond.
Mario Covarrubias
Politecnico di Milano
Associate Professor and Head of the Virtual Prototyping and Augmented Reality Research Laboratory at Politecnico di Milano.
Gilda Manfredi
University of Basilicata
Research Fellow focusing on Deep Learning, Computer Graphics, XR and HCI. Specializes in AI for creating and manipulating 3D content in XR environments.
Program Committee
Ugo Erra, University of Basilicata
Valerio Colonnese, University of Basilicata
Lorenzo Stacchio, University of Macerata
Anran Qi, INRIA
Agnese Augello, National Research Council of Italy
Boriana Koleva, University of Nottingham
Pietro Neroni, National Research Council of Italy
Sara Buonocore, University of Naples Federico II
Luca Sabatucci, National Research Council of Italy
Valerio De Luca, Pegaso University
Luigi Casoria, National Research Council of Italy
Federico Manuri, Politecnico di Torino
Silvia Franchini, National Research Council of Italy
Luigi Gallo, Pegaso University
Katia Lupinetti, National Research Council of Italy
Alfonso Mastropietro, National Research Council of Italy
Ludovica Gargiulo, National Research Council of Italy
Sabatina Criscuolo, National Research Council of Italy
Marta Pizzolante, Università Cattolica del Sacro Cuore
Antonio Esposito, University of Naples Federico II
Sabrina Bartolotta, Università Cattolica del Sacro Cuore
Walter Terkaj, National Research Council of Italy
Luigi Duraccio, University of Naples Federico II
Veronica Sundstedt, Blekinge Institute of Technology
Giovanni D’Errico, University of Salento
Call for Papers
TOPICS OF INTEREST
The list of topics includes (but is not limited to) the following:
- AI-driven real-time personalization of immersive interfaces based on user states, contexts, and goals in XR environments
- Multimodal interaction (speech, gesture, gaze, haptics) with AI agents
- Cognitive load, trust, ethics, and emotional engagement in AI-powered XR
- Accessibility and inclusivity in immersive AI systems
- Evaluation methods for human-centered XR and AI interfaces
- Social and collaborative AI-enhanced XR experiences
- Design principles and methodologies for adaptive intelligent interfaces
- Applications of XR-AI interfaces in healthcare, education, work, or entertainment
SUBMISSION AND DECISIONS
Authors should prepare a Regular or Short paper. Regular papers are at least 10 "standard" pages; short papers are 5–9 "standard" pages (excluding references). Authors must use the new CEURART style format. Papers should clearly state the originality of the contribution, its relevance to the workshop themes, and its potential impact.
All contributions will be reviewed by the Technical Program Committee, and acceptance will be based on quality, originality, and relevance. Authors of accepted papers must submit the final paper version according to the deadline, register for the workshop, and attend to present their papers.
PROCEEDINGS with CEUR-WS
Accepted papers will be published at CEUR-WS.org.
SUBMISSION INSTRUCTIONS
Submissions must be anonymized for double-blind review. To submit your contribution, in the submission platform select "IUI 2026 Workshop SHAPEXR".
REGISTRATION
For registration, follow the instructions available on the IUI Conference website.