Presentation | 2012-05-23 | Automatic Editing of Scene Movie : Successive Selection of Camera Video through Application of Event-driven Composition Rules | Yuho NODA, Masashi SUGAWARA, Hideji ENOKIZU |
---|---|
Abstract(in Japanese) | (See Japanese page) |
Abstract(in English) | Recently, automatic editing has been investigated as a means of archiving meetings and lectures as video, partly because video is a medium that can convey events to people more accurately. If automatic editing could be applied to a wider variety of situations, it would become even more useful. The present study constructs an automatic editing system for video recordings of a scene performed on the basis of a short scenario. Eight cameras simultaneously shot the scene, which took place in a limited space. The movement of the characters, changes of posture, and changes of direction were derived from the recorded videos every 0.2 seconds. Speech was also recorded by a microphone attached to each character, and the sound data were sampled at the same interval to detect the presence or absence of speech. These four kinds of information defined each event occurring in the scene. On the basis of a shot analysis of two films, we generated event-driven composition rules. These rules were applied to the event information to extract composition information, which comprised the layout of the characters and the shot size. The camera video that best reflected the composition information was selected as the clip for editing. Each selected clip was then cropped to approximate the intended shot size. Finally, the cropped clips were concatenated to make up the scene video. Our system was able to produce several scene videos; however, a number of problems remain to be addressed to improve the system. |
Keyword(in Japanese) | (See Japanese page) |
Keyword(in English) | Filming Space / Automatic Shooting and Editing / Event-driven Composition Rules / Shot Analysis |
Paper # | HCS2012-25,HIP2012-25 |
Date of Issue |
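The pipeline described in the abstract (per-interval events → event-driven composition rules → camera selection) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the `Event` and `Composition` types, the specific rules in `apply_rules`, and the scoring in `select_camera` are all assumptions standing in for the rules the authors derived from their shot analysis of two films.

```python
from dataclasses import dataclass

# One event sample per 0.2 s interval, per character, combining the
# four kinds of information named in the abstract.
@dataclass
class Event:
    character: str
    moving: bool            # movement detected
    posture_change: bool    # change of posture
    direction_change: bool  # change of direction
    speaking: bool          # presence of speech on the character's mic

@dataclass
class Composition:
    shot_size: str  # e.g. "close-up", "medium", "long"
    subjects: list  # characters that should appear in the frame

def apply_rules(events):
    """Map one interval's events to a target composition.
    These rules are illustrative stand-ins, not the paper's rules."""
    speakers = [e.character for e in events if e.speaking]
    movers = [e.character for e in events if e.moving]
    if len(speakers) == 1:
        return Composition("close-up", speakers)   # isolate a lone speaker
    if speakers:
        return Composition("medium", speakers)     # frame a conversation
    if movers:
        return Composition("long", movers)         # show movement in context
    return Composition("long", [e.character for e in events])

def select_camera(cameras, target):
    """Pick the camera whose view best reflects the target composition:
    prefer seeing every subject, then a matching shot size."""
    def score(cam):
        visible = sum(1 for s in target.subjects if s in cam["visible"])
        size_bonus = 1 if cam["shot_size"] == target.shot_size else 0
        return (visible, size_bonus)
    return max(cameras, key=score)

# Hypothetical interval: character A speaks while B moves.
events = [Event("A", False, False, False, True),
          Event("B", True, False, False, False)]
cameras = [{"id": 1, "visible": {"A", "B"}, "shot_size": "long"},
           {"id": 2, "visible": {"A"}, "shot_size": "close-up"}]
target = apply_rules(events)
best = select_camera(cameras, target)
print(target.shot_size, best["id"])  # close-up 2
```

In the full system, the chosen camera's clip would then be cropped toward the target shot size and concatenated with the clips chosen for the neighboring intervals.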
Conference Information | |
Committee | HCS |
---|---|
Conference Date | 2012/5/15 (1 day) |
Place (in Japanese) | (See Japanese page) |
Place (in English) | |
Topics (in Japanese) | (See Japanese page) |
Topics (in English) | |
Chair | |
Vice Chair | |
Secretary | |
Assistant |
Paper Information | |
Registration To | Human Communication Science (HCS) |
---|---|
Language | JPN |
Title (in Japanese) | (See Japanese page) |
Sub Title (in Japanese) | (See Japanese page) |
Title (in English) | Automatic Editing of Scene Movie : Successive Selection of Camera Video through Application of Event-driven Composition Rules |
Sub Title (in English) | |
Keyword(1) | Filming Space |
Keyword(2) | Automatic Shooting and Editing |
Keyword(3) | Event-driven Composition Rules |
Keyword(4) | Shot Analysis |
1st Author's Name | Yuho NODA |
1st Author's Affiliation | Graduate School of Engineering, Shibaura Institute of Technology |
2nd Author's Name | Masashi SUGAWARA |
2nd Author's Affiliation | Shibaura Institute of Technology |
3rd Author's Name | Hideji ENOKIZU |
3rd Author's Affiliation | Shibaura Institute of Technology |
Date | 2012-05-23 |
Paper # | HCS2012-25,HIP2012-25 |
Volume (vol) | vol.112 |
Number (no) | 45 |
Page | pp.- |
#Pages | 6 |