UEC Int’l Mini-Conference No.52
Our user study provided valuable insights into the effectiveness of olfactory-enhanced multimedia. Participants reported moderate to high levels of match between the presented odors and the video content, with average match scores of 4.5 and 5.33 out of 7 for the two test videos. The addition of olfactory stimuli significantly enhanced the immersive quality of the viewing experience, and most participants preferred the olfactory-enhanced viewing. However, the study also revealed challenges, including the need for improved odor dissipation, calibration of odor intensities, and consideration of individual differences in olfactory perception.
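As a rough illustration of how such per-video match scores are aggregated, the sketch below averages 7-point match ratings per video. The participant ratings are hypothetical values chosen for illustration, not the study's actual data:

```python
from statistics import mean

# Hypothetical 7-point odor-video match ratings from six participants
# (illustrative values only; not the actual study data).
ratings = {
    "video_1": [4, 5, 4, 5, 4, 5],
    "video_2": [6, 5, 5, 6, 5, 5],
}

# Average match score per video on the 1-7 scale.
for video, scores in ratings.items():
    print(f"{video}: mean match = {mean(scores):.2f} / 7")
```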
This research highlights the untapped potential of olfactory integration in multimedia, paving the way for more immersive and engaging experiences. By combining cutting-edge AI techniques with practical implementation strategies, we have laid a foundation for future advances in this field. Moving forward, efforts should focus on refining odor presentation techniques, exploring the complex interplay between visual, auditory, and olfactory cues in multimedia contexts, and investigating personalization strategies that account for individual differences in odor perception.
In conclusion, our work demonstrates the feasibility and potential impact of integrating olfactory experiences into multimedia content using advanced AI techniques. As we continue to push the boundaries of multi-sensory media, such technologies promise to revolutionize how we interact with and experience digital content, opening new avenues for immersive storytelling and information delivery.