All Technical Committee Conferences (Searched in: All Years)

Search Results: Conference Papers
Conference Papers (Available on Advance Programs) (Sort by: Date Descending)

Fields: Committee / Date & Time / Place / Paper Title & Authors / Abstract / Paper #
Committee: PRMU
Date & Time: 2020-12-18 16:30
Place: Online
Title & Authors: Estimating 3D regions for grasping an object / Atsuki Tsukamoto, Kiyoshi Kogure (KIT)
Abstract: This paper proposes a method for estimating 3D regions for object grasping. The method takes as its inputs two RGB image... [more]
Paper #: PRMU2020-65, pp.156-160
Committee: MI
Date & Time: 2019-07-06 11:55
Place: Hokkaido (Future Univ. Hakodate)
Title & Authors: [Poster Presentation] Basic study of left atrial appendage segmentation from cardiac CT images / Itaru Takayashiki, Akio Doi, Toru Kato, Hiroki Takahashi (Iwate Prefectural Univ.), Shoto Sekimura (ISP), Maiko Hozawa, Yoshihiro Morino (Iwate Medical Univ.)
Abstract: In this study, we propose a method to automatically extract the left atrial appendage region from the cardiac CT image f... [more]
Paper #: MI2019-30, pp.43-48
Committee: PRMU, IPSJ-CVIM
Date & Time: 2019-05-30 10:50
Place: Tokyo
Title & Authors: Estimating areas in images for grasping an object by a three-fingered robot hand / Atsuki Tsukamoto, Ryosuke Kubota, Kiyoshi Kogure (KIT)
Abstract: This paper proposes a method for estimating areas for grasping an object by a three-fingered robot hand. The method take... [more]
Paper #: PRMU2019-4, pp.19-24
Committee: PRMU, MVE, IPSJ-CVIM [detail]
Date & Time: 2019-01-17 17:00
Place: Kyoto
Title & Authors: Recalling candidates of gripping hand shapes from an object image using neural network / Makoto Sanada, Tadashi Matsuo, Nobutaka Shimada, Yoshiaki Shirai (Ritsumeikan Univ.)
Abstract: Robots are required to support people's work. In order to alleviate the burden on people, it is desirable that robots can... [more]
Paper #: PRMU2018-103, MVE2018-45, pp.51-55
Committee: MICT, MI
Date & Time: 2018-11-06 17:40
Place: Hyogo (University of Hyogo)
Title & Authors: (title not available)
Abstract: (To be available after the conference date) [more]
Paper #: MICT2018-56, MI2018-56, pp.77-78
Committee: PRMU, MI, IE, SIP
Date & Time: 2018-05-18 11:00
Place: Gifu
Title & Authors: Contour Extraction of Transparent Objects Using Fully Convolutional Networks / Ryosuke Kubota, Kiyoshi Kogure (KIT)
Abstract: In this paper, we propose two methods to extract contours of transparent objects from a grayscale image using fully conv... [more]
Paper #: SIP2018-10, IE2018-10, PRMU2018-10, MI2018-10, pp.41-46
Committee: SIP, EA, SP, MI (Joint) [detail]
Date & Time: 2018-03-19 13:40
Place: Okinawa
Title & Authors: (title not available)
Abstract: Diseases appearing in the spine include spondylolysis, spondylolisthesis, vertebral fracture, and the like. A preoperati... [more]
Paper #: MI2017-78, pp.43-44
Committee: CQ, MVE, IE, IMQ (Joint) [detail]
Date & Time: 2018-03-09 13:30
Place: Okinawa (Okinawa Industry Support Center)
Title & Authors: Efficient and Interactive Image Retrieval Based on Semantic Segmentation / Ryosuke Furuta, Naoto Inoue, Toshihiko Yamasaki (Univ. of Tokyo)
Abstract: This paper proposes an efficient image retrieval system. When users wish to retrieve images with semantic and spatial co... [more]
Paper #: IMQ2017-59, IE2017-151, MVE2017-101, pp.189-194
Committee: MI, MICT
Date & Time: 2017-11-06 10:40
Place: Kagawa (Sunport Hall Takamatsu)
Title & Authors: On the influence of Dice loss function in multi-class organ segmentation of abdominal CT using 3D fully convolutional networks / Chen Shen, Holger R. Roth, Hirohisa Oda, Masahiro Oda, Yuichiro Hayashi (Nagoya Univ.), Kazunari Misawa (Aichi Cancer Center Hospital), Kensaku Mori (Nagoya Univ.)
Abstract: Deep learning-based methods achieved impressive results in segmentations from medical images. With the development of 3D... [more]
Paper #: MICT2017-29, MI2017-51, pp.15-20
Copyright and reproduction:
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)