CNN-based 360° Scene Recognition for Automatic Generation of Omnidirectional Olfactive Effects
Anderson Augusto Simiscuka, Theo Plantefol, Abid Yaqoob and Gabriel-Miro Muntean
Contact: anderson.simiscuka@insight-centre.org, theo.plantefol2@mail.dcu.ie, abid.yaqoob@dcu.ie, gabriel.muntean@dcu.ie

Introduction
Scents are connected to the brain's memory function and can be used in various areas, including marketing, product design and entertainment. Olfactory stimuli, combined with visual and audio cues, enable the creation of more realistic experiences in immersive environments. In 360° videos, omnidirectional scents can be used as a cue that guides users toward important areas of the video. To automate the generation of olfactory effects, Convolutional Neural Networks (CNNs) can be used to perform scene recognition. Four CNNs were compared on the cloud-based platform Google Colab. The CNNs detect the scent, the tile and the time of effect occurrence; a sketch of this step is given after this section.

System Architecture
Each tile of the 360° video can trigger scents independently, on its respective scent dispenser. The application server controls the scent dispensers via USB or WiFi; it also sends the videos to the Oculus Rift (see the dispatch sketch below). Multiple olfaction devices are placed around the user, releasing scents in the same direction as the triggering scene in the 360° video. Videos are watched with an Oculus Rift VR headset.

Testing and Results
Tests demonstrated an Olfaction Accuracy of up to 71.28%. Participants who evaluated the solution perceived it positively and reported improved Quality of Experience (QoE): 79% of participants agreed or strongly agreed that smells made the VR experience more immersive and enjoyable. 24 participants watched videos processed with the different CNNs and indicated that ResNet-18 was the model with the best results on most questions.
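The poster does not include implementation details for the scene-recognition step. The following is a minimal sketch only, assuming a PyTorch/torchvision setup (which runs on Google Colab) and the pretrained ResNet-18 that participants rated best; the scene-class-to-scent mapping below is hypothetical, as is the event format.

```python
# Minimal sketch of per-tile scene recognition producing (scent, tile, time)
# events. Assumptions: PyTorch/torchvision, an ImageNet-pretrained ResNet-18,
# and a hypothetical mapping from recognized scene classes to scent labels.
import torch
import torchvision.models as models
from PIL import Image

# Hypothetical scene-class -> scent mapping (not from the poster).
SCENE_TO_SCENT = {"seashore": "ocean", "volcano": "smoke", "greenhouse": "grass"}

weights = models.ResNet18_Weights.IMAGENET1K_V1
model = models.resnet18(weights=weights)
model.eval()
preprocess = weights.transforms()          # resizing/normalization pipeline
classes = weights.meta["categories"]       # class-index -> label names

def detect_scent(tile_image: Image.Image, tile_id: int, timestamp_s: float):
    """Classify one tile of a 360° frame; return a (scent, tile, time) event
    or None when the recognized scene has no associated scent."""
    with torch.no_grad():
        logits = model(preprocess(tile_image).unsqueeze(0))
    label = classes[logits.argmax(dim=1).item()]
    scent = SCENE_TO_SCENT.get(label)
    return (scent, tile_id, timestamp_s) if scent is not None else None
```

In practice a model fine-tuned on scene-centric data would likely replace a plain ImageNet classifier; the sketch only illustrates how a scent event could be derived from a single tile.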
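Similarly, the poster states only that the application server controls the dispensers via USB or WiFi and that each tile maps to a dispenser in the matching direction. The sketch below assumes one dispenser per tile and a JSON trigger command; the port names, addresses and command format are all hypothetical.

```python
# Minimal sketch of the application server dispatching a scent trigger to the
# dispenser aligned with the triggering tile. The command format, serial port
# names and network addresses are hypothetical; the poster only states that
# dispensers are controlled via USB or WiFi.
import json
import socket
import serial  # pyserial, for USB-connected dispensers

# Hypothetical layout: one dispenser per tile of the 360° frame.
USB_DISPENSERS = {0: "/dev/ttyUSB0", 1: "/dev/ttyUSB1"}
WIFI_DISPENSERS = {2: ("192.168.1.20", 9000), 3: ("192.168.1.21", 9000)}

def trigger_scent(scent: str, tile_id: int, duration_s: float = 2.0) -> None:
    """Send a release command to the dispenser matching the tile's direction."""
    command = json.dumps({"scent": scent, "duration": duration_s}).encode()
    if tile_id in USB_DISPENSERS:
        with serial.Serial(USB_DISPENSERS[tile_id], 9600, timeout=1) as port:
            port.write(command + b"\n")
    elif tile_id in WIFI_DISPENSERS:
        with socket.create_connection(WIFI_DISPENSERS[tile_id], timeout=1) as conn:
            conn.sendall(command + b"\n")
```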
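Finally, the poster reports an Olfaction Accuracy of up to 71.28% without defining the metric. One plausible reading, sketched purely as an assumption, is the fraction of ground-truth scent events that the system reproduced:

```python
# Hedged sketch only: assumes Olfaction Accuracy is the fraction of
# ground-truth (scent, tile, time) events correctly triggered by the system.
# The poster does not define the metric; this is an illustrative guess.
from typing import List, Tuple

Event = Tuple[str, int, float]  # (scent, tile_id, timestamp_s)

def olfaction_accuracy(triggered: List[Event], ground_truth: List[Event]) -> float:
    """Fraction of expected scent events that were actually triggered."""
    hits = sum(1 for event in ground_truth if event in triggered)
    return hits / len(ground_truth) if ground_truth else 0.0
```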