Committee | Date Time | Place | Paper Title / Authors | Abstract | Paper #
HCGSYMPO (2nd) |
2023-12-11 - 2023-12-13 |
Fukuoka |
Asia-Pacific Import Mart (Kitakyushu) (Primary: On-site, Secondary: Online)
Gaze features representing anticipatory gaze and machine learning models for predicting Self-Efficacy -- From data in use of a rotational transformation mouse -- Yuka Hayakawa, Saki Tanaka, Airi Tsuji (TUAT), Junichi Yamamoto (TMU), Kaori Fujinami (TUAT)
Self-efficacy is the degree of confidence in one's ability to perform a behavior, and attempts have been made to apply it to pr... [more]
|
MI, MICT |
2023-11-14 10:50 |
Fukuoka |
|
A Study on How to Classify Paralytic Strabismus Based on Ocular Motility Photographs Takeshi Noda, Koh Kakusho, Takeshi Okadome (Kwansei Gakuin Univ.), Yoichi Okita, Akiko Kimura, Fumi Gomi (Hyogo Medical Univ.) MICT2023-27 MI2023-20 |
In this paper, we discuss how to obtain the types of paralytic strabismus from ocular motility photographs of the patien... [more] |
MICT2023-27 MI2023-20 pp.9-12 |
NS |
2023-10-06 14:15 |
Hokkaido |
Hokkaido University + Online (Primary: On-site, Secondary: Online)
A study on traffic reduction in cloud gaming using eye information Shintaro Okade (Osaka Univ.), Takumasa Ishioka (Osaka Univ./Kyoto Tachibana Univ.), Takuya Fujihashi, Shunsuke Saruwatari, Takashi Watanabe (Osaka Univ.) NS2023-102
In cloud gaming, a remote cloud server handles the various game processing and video generation tasks that have been per... [more] |
NS2023-102 pp.151-156 |
IMQ, IE, MVE, CQ (Joint) [detail] |
2023-03-17 11:10 |
Okinawa |
Okinawaken Seinenkaikan (Naha-shi) (Primary: On-site, Secondary: Online) |
Creation of human gaze heat maps using skeletal estimation Satoki Nakamura, Dan Mikami (Kogakuin Univ.) IMQ2022-68 IE2022-145 MVE2022-98 |
In sports, one may analyze one's own form or the form of a skilled athlete on video. When analyzing such video in traini... [more] |
IMQ2022-68 IE2022-145 MVE2022-98 pp.241-246 |
ET |
2023-03-15 14:00 |
Tokushima |
Tokushima University (Primary: On-site, Secondary: Online) |
Methods for Estimating Problematic Parts in Presentations Based on Analysis of Slide Data and Audiences' Unconscious Responses Ko Saito (Fukushima Univ.), Hiroki Nakayama (Yamagata Univ.), Ryo Onuma (Tsuda Univ.), Hiroaki Kaminaga (Fukushima Univ.), Youzou Miyadera (Tokyo Gakugei Univ.), Shoichi Nakamura (Fukushima Univ.) ET2022-88 |
Presentation is one of the important means of explaining the results and progress of creative works. Particularly, prese... [more] |
ET2022-88 pp.178-183 |
MVE, VRSJ-SIG-MR, IPSJ-EC, HI-SIG-DeMO, VRSJ-SIG-CS |
2022-10-06 13:20 |
Hokkaido |
(Primary: On-site, Secondary: Online) |
Estimation of out-of-view attention region Koki Hara, Atsushi Nakazawa (Kyoto Univ.) MVE2022-18
Gazing behavior is one of the most important elements of human attention to outward things. The study of saliency e... [more]
MVE2022-18 pp.1-6 |
HCS, HIP, HI-SIGCOASTER [detail] |
2022-05-15 15:55 |
Okinawa |
Okinawa Industry Support Center (Primary: On-site, Secondary: Online) |
Utilization of 3DCG with Hollow Mask Effect to Support Drawing of Character's Eyes Jun Asuka, Takafumi Higashi (Tokyo Denki Univ.) HCS2022-18 HIP2022-18 |
In this research, we support beginners who draw subjective-viewpoint compositions in drawing the eyes... [more]
HCS2022-18 HIP2022-18 pp.86-91 |
HCS, HIP, HI-SIGCOASTER [detail] |
2021-05-25 12:50 |
Online |
Online |
Development of a Job Interview Training System with Multi-modal Behavior Analysis Nao Takeuchi, Tomoko Koda (OIT) HCS2021-10 HIP2021-10 |
This paper introduces a job interview training system that recognizes the nonverbal behaviors of the interviewee, namely... [more] |
HCS2021-10 HIP2021-10 pp.50-54 |
MVE, IMQ, IE, CQ (Joint) [detail] |
2021-03-02 09:20 |
Online |
Online |
Creation of a gazing point database when appreciating a painting and comparison with a saliency map Yusuke Nosaka (Tokai Univ.), Eriko Ishii (Kagoshima Prefectural College), Yuko Hoshino, Mitsuho Yamada (Tokai Univ.) CQ2020-111 |
It is said that painters compose their paintings in order to lead the viewer's gaze to the intended subject. We assumed t... [more]
CQ2020-111 pp.16-21 |
EID, ITE-IDY, IEIJ-SSL, SID-JC, IEE-EDD [detail] |
2021-01-28 13:25 |
Online |
Online |
3D Image Depth Enlargement in Large and Long-Viewing Distance Edge-Based DFD Display by Blurring Edge Images Hideto Matsubara, Shiro Suyama, Haruki Mizushina (Tokushima Univ.) EID2020-20 |
We can successfully extend the depth-fusion limit of the front-rear gap from two image depths to one perceived depth by blurring... [more]
EID2020-20 pp.21-24 |
ET |
2020-12-12 12:50 |
Online |
Online |
Development of a Gaze Information Collecting System in e-Testing for Examinee Authentication and Its Evaluation Toru Tokunaga, Toru Kano, Takako Akakura (TUS) ET2020-38 |
Today, with the development of communication technology, distance learning has improved to a point where it is comparabl... [more] |
ET2020-38 pp.23-28 |
HIP |
2020-10-09 14:35 |
Online |
Online |
A new calibration method for gaze estimation by a camera installed almost by the side of the eye Keisuke Kawamoto, Atsumu Naito, Kiyoshi Hoshino (Univ. Tsukuba) HIP2020-48
In the corneal reflection method, the user is asked to gaze at a number of calibration points whose visual angles are kn... [more] |
HIP2020-48 pp.81-84 |
HCGSYMPO (2nd) |
2019-12-11 - 2019-12-13 |
Hiroshima |
Hiroshima-ken Joho Plaza (Hiroshima) |
Decreasing VR Sickness by information reduction using gaze position Masami Mori, Shinji Uchida (NIT, Nara College), Kiyoshi Kiyokawa (NAIST) |
Recently, despite the release of VR-related devices for the general public, the VR industry is stagnant. The reason is t... [more] |
|
HIP |
2019-10-31 15:10 |
Kyoto |
Kyoto Terrsa |
Simple measurement of eye movements using a camera mounted on a temple of glasses Shinichi Hikita, Shingo Hashida (Osaka Electro-Communication Univ.) HIP2019-62 |
A wearable eye-tracker can enable human-robot communication with gaze gestures in daily life. The conventiona... [more]
HIP2019-62 pp.71-73 |
WIT |
2019-08-10 17:20 |
Tochigi |
Teikyo University, Utsunomiya Campus (TBD) |
Analysis of Gaze Behaviors for Deaf and Hard of Hearing Students on Lecture with Caption Tatsuya Arai, Daisuke Wakatsuki (NTUT), Takeaki Shionome (Teikyo Univ.) WIT2019-15 |
In previous studies, various methods and devices for presenting captions to the deaf and hard of hearing (DHH) students ... [more] |
WIT2019-15 pp.35-39 |
IMQ, IE, MVE, CQ (Joint) [detail] |
2019-03-15 14:50 |
Kagoshima |
Kagoshima University |
Visualization of parts of interest based on gaze of instructor and its evaluation Syota Osumi, Hiroaki Kudo, Tetsuya Matsumoto (Nagoya Univ.), Yoshinori Takeuchi (Daido Univ.) IMQ2018-70 IE2018-154 MVE2018-101
Gazing behaviors based on knowledge of instructors are expressed during tutoring activities. The knowledge is acquired w... [more] |
IMQ2018-70 IE2018-154 MVE2018-101 pp.263-267 |
ITS, IE, ITE-MMS, ITE-HI, ITE-ME, ITE-AIT [detail] |
2019-02-20 16:15 |
Hokkaido |
Hokkaido Univ. |
[Invited Talk] Automatic Gaze Correction based on Deep Learning and Image Warping Masataka Seo, Takahiro Yamamoto (Ritsumeikan Univ.), Toshihiro Kitajima (Samsung), Yen-Wei Chen (Ritsumeikan Univ.)
When people take a selfie photo or talk through a video chat system, they tend to look at the screen. Since the position... [more] |
|
HIP |
2018-10-23 13:30 |
Kyoto |
Kyoto Terrsa |
A gaze input method using relationship between luminance change of visual target and timing of pupillary reaction to light Shoma Ichino, Hirohiko Kaneko (Tokyo Tech.) HIP2018-66 |
Techniques have been developed that allow users to enter characters by looking at one among many items displayed on the screen... [more]
HIP2018-66 pp.39-41 |
MBE, NC (Joint) |
2018-05-19 16:25 |
Toyama |
Univ. of Toyama |
Gaze-step simulation on visual search of target letter Masayuki Tsuno, Ryunosuke Kodea, Sigehito Tanahashi, Atsuhiko Iizima (Niigata Univ.), Akira Tsukada (NIT.Toyama College), Yoshinobu Maeda (Niigata Univ.) MBE2018-4 |
What kind of mechanism of human eye movement makes it possible to find a target from a wide area in a relatively short ti... [more]
MBE2018-4 pp.19-22 |
ICM, IPSJ-CSEC, IPSJ-IOT |
2018-05-18 11:45 |
Toyama |
|
Gaze Estimation Method based on Interaction-informed Electrooculography Jumpei Yamashita, Hidetaka Koya, Hajime Nakajima, Akira Inoue (NTT) ICM2018-7 |
Our goal is to prevent data entry mistakes on enterprise systems due to overlooking important information such as proced... [more] |
ICM2018-7 pp.99-104 |