Committee |
Date Time |
Place |
Venue |
Paper Title / Authors |
Abstract |
Paper # |
WIT, HI-SIGACI |
2022-12-08 10:25 |
Tokyo |
AIST Tokyo Waterfront (TBD) |
Proposal for Visuospatial Cognitive Training Support System Mio Sakuma (NIT, Sendai), Kei Obata (ISB), Satoshi Funaki (TOYOTA SYSTEMS), Hitoshi Daikoku (TBGU), Takahiro Yonamine (NIT, Okinawa) WIT2022-6 |
We proposed a visuospatial cognitive training support system that reflects the needs of occupational therapists for reha... |
WIT2022-6 pp.11-16 |
ET |
2022-11-05 13:00 |
Online |
Online |
Proposal for Dynamic Gaze Feedback System to Support Programming Debugging Learning Kohei Yoshimori, Toru Kano, Takako Akakura (TUS) ET2022-33 |
Recently, IT human resources have been in high demand to realize Society 5.0. Therefore, the expansion of programming ed... |
ET2022-33 pp.19-24 |
MVE, VRSJ-SIG-MR, IPSJ-EC, HI-SIG-DeMO, VRSJ-SIG-CS |
2022-10-06 13:20 |
Hokkaido |
(Primary: On-site, Secondary: Online) |
Estimation of out-of-view attention region Koki Hara, Atsushi Nakazawa (Kyoto Univ) MVE2022-18 |
Gazing behavior is one of the most important elements of human attention to external objects. The study of saliency e... |
MVE2022-18 pp.1-6 |
MVE, IPSJ-HCI, IPSJ-EC, VRSJ, ITE-HI, HI-SIG-DeMO, ITE-SIP |
2022-06-16 11:00 |
Online |
Online |
An Examination on Eye-Gaze Input Using a Bubble Cursor in AR Tomohiro Fujiwara, Kei Kanari, Mie Sato (Utsunomiya Univ) |
In recent years, research on augmented reality (AR) has been actively conducted, and eye-gaze input has been attracting ... |
|
HCS, HIP, HI-SIGCOASTER [detail] |
2022-05-15 13:10 |
Okinawa |
Okinawa Industry Support Center (Primary: On-site, Secondary: Online) |
Part-based processing in the perception of eye-gaze direction -- Evidence from a spatial Stroop paradigm -- Yoshihiko Tanaka, Kenta Ishikawa, Takato Oyama, Matia Okubo (Senshu Univ.) HCS2022-11 HIP2022-11 |
(To be available after the conference date) |
HCS2022-11 HIP2022-11 pp.53-56 |
HCS, HIP, HI-SIGCOASTER [detail] |
2022-05-15 13:30 |
Okinawa |
Okinawa Industry Support Center (Primary: On-site, Secondary: Online) |
Attentional process for various types of gaze stimuli Kenta Ishikawa, Takato Oyama, Yoshihiko Tanaka, Matia Okubo (Senshu Univ.) HCS2022-12 HIP2022-12 |
(To be available after the conference date) |
HCS2022-12 HIP2022-12 pp.57-62 |
HCS, HIP, HI-SIGCOASTER [detail] |
2022-05-15 14:10 |
Okinawa |
Okinawa Industry Support Center (Primary: On-site, Secondary: Online) |
Social facilitation in gaze cueing tasks Takato Oyama, Kenta Ishikawa, Matia Okubo (Senshu Univ.) HCS2022-14 HIP2022-14 |
(To be available after the conference date) |
HCS2022-14 HIP2022-14 pp.67-70 |
HCS, HIP, HI-SIGCOASTER [detail] |
2022-05-15 15:55 |
Okinawa |
Okinawa Industry Support Center (Primary: On-site, Secondary: Online) |
Utilization of 3DCG with Hollow Mask Effect to Support Drawing of Character's Eyes Jun Asuka, Takafumi Higashi (Tokyo Denki Univ.) HCS2022-18 HIP2022-18 |
In this research, we support beginners who draw compositions from the subjective viewpoint in drawing the eyes... |
HCS2022-18 HIP2022-18 pp.86-91 |
HCS, HIP, HI-SIGCOASTER [detail] |
2022-05-16 09:40 |
Okinawa |
Okinawa Industry Support Center (Primary: On-site, Secondary: Online) |
Analysis of the features of resilience potential to adapt to changing situations -- Toward interaction design -- Haruka Yoshida, Hideki Kama, Ren Nakano (Nihon Univ.), Daisuke Karikawa (Tohoku Univ.), Hisae Aoyama (ENRI), Takashi Toriizuka (Nihon Univ.) HCS2022-20 HIP2022-20 |
This research aims to realize a system interaction to support the improvement of resilience potential (hereafter referre... |
HCS2022-20 HIP2022-20 pp.98-103 |
MBE |
2022-05-13 13:55 |
Online |
Online |
Fundamental Measurement of Driver's Gaze Behavior toward Detection for Pedestrian in an Intersection Ao Ogata, Takashi Imamura (Niigata Univ) MBE2022-2 |
The pedestrian accident rate is high in cases of collision with right-turning vehicles at intersections. One of f... |
MBE2022-2 pp.6-9 |
PRMU, IPSJ-CVIM |
2022-05-13 10:15 |
Aichi |
Toyota Technological Institute |
Shape-preserving Style Transfer by Dense Pixel Correspondences between Input and Output Images and its Application to Gaze Estimation Daiki Mushiake, Norimichi Ukita (TTI) PRMU2022-2 |
Gaze estimation has been applied to a variety of tasks, and many methods using deep learning have been proposed in recen... |
PRMU2022-2 pp.6-11 |
ITS, IEE-ITS |
2022-03-11 14:40 |
Online |
Online |
Gaze Estimation for a Driving Simulator Using Machine Learning Takayuki Furukawa, Yuji Matsuki (FIT) ITS2021-70 |
Several studies have been conducted on gaze estimation in driving simulators due to the widespread attention it has attr... |
ITS2021-70 pp.38-43 |
PRMU, IPSJ-CVIM |
2022-03-10 09:15 |
Online |
Online |
Unsupervised adaptation of appearance-based gaze estimation models for domains with different label distributions Takuru Shimoyama, Yusuke Sugano (The Univ. of Tokyo) PRMU2021-61 |
The annotation of gaze estimation is time-consuming, and it is not easy to collect training data under the exact same li... |
PRMU2021-61 pp.7-12 |
CQ, IMQ, MVE, IE (Joint) [detail] |
2022-03-11 16:05 |
Online |
Online (Zoom) |
Quantification of Visual Exploration Activities on Experiencing the Basketball Game Using VR Simulation Shinya Ishikawa, Hidehiko Shishido, Kenji Yoshida, Yoshinari Kameda (Tsukuba Univ) IMQ2021-64 IE2021-126 MVE2021-93 |
In basketball, players are required to make accurate judgments about complex and rapidly changing situations. An importa... |
IMQ2021-64 IE2021-126 MVE2021-93 pp.284-289 |
HCS |
2022-01-28 15:20 |
Online |
Online |
Observation of verbal and non-verbal interactions related to Japanese food experience Yasuyuki Sumi (Future Univ. Hakodate), Naomi Yamashita (NTT), Naoe Imura (Kyoto Sangyo Univ.), Akane Okuno (Future Univ. Hakodate) HCS2021-49 |
In order to deepen the understanding of verbal and nonverbal interactions related to the Japanese food experience, we ar... |
HCS2021-49 pp.36-42 |
HIP |
2021-12-23 13:30 |
Online |
Online |
Effects of Eye and Head Movement on Image Preference Toshiaki Chiba, Yasuhiro Hatori, Yoshiyuki Sato, Chia-huei Tseng, Satoshi Shioiri (Tohoku Univ.) HIP2021-49 |
Gaze shift is a cue to the human decision-making process as an indicator of visual information uptake. When evaluating a... |
HIP2021-49 pp.7-10 |
HCGSYMPO (2nd) |
2021-12-15 - 2021-12-17 |
Online |
Online |
Proposal of three-dimensional visualization of body parts where the observer's gaze is gathered Ken Kinoshita, Masashi Nishiyama, Yoshio Iwai (Tottori Univ.) |
(To be available after the conference date) |
|
ET |
2021-12-11 14:35 |
Online |
Online |
Proposal of a Method for Constructing a Cognitive Process Model during Debugging Based on Gaze Analysis Kohei Yoshimori, Toru Kano, Takako Akakura (TUS) ET2021-37 |
The purpose of this study is to propose debugging learning support for novice programmers using the gaze behavior of e... |
ET2021-37 pp.47-52 |
HIP |
2021-10-22 13:10 |
Online |
Online |
A scanpath prediction model using deep learning considering the context of the gazing objects Yuhei Ohsawa, Takeshi Kohama (Kindai Univ.) HIP2021-43 |
Since the human gaze is a biological signal which reflects internal states such as consciousness and attention, it is po... |
HIP2021-43 pp.69-74 |
HCS, HIP, HI-SIGCOASTER [detail] |
2021-05-25 12:50 |
Online |
Online |
Development of a Job Interview Training System with Multi-modal Behavior Analysis Nao Takeuchi, Tomoko Koda (OIT) HCS2021-10 HIP2021-10 |
This paper introduces a job interview training system that recognizes the nonverbal behaviors of the interviewee, namely... |
HCS2021-10 HIP2021-10 pp.50-54 |