The Journal of The Institute of Image Information and Television Engineers
Online ISSN : 1881-6908
Print ISSN : 1342-6907
ISSN-L : 1342-6907
Volume 66, Issue 4
Displaying 1-25 of 25 articles from this issue
Focus
Invited Paper
Special Issue
Broadcasting and Social Media
Topics
Technical Survey
Technical Guide
Technology Frontier of Augmented Reality (4)
Production File on Broadcast Program (1)
Journey into Media Arts (24)
Media Watch (4)
Activity Notes on Standardization (20)
Keywords you should know (75)
My Recommendations on Research and Development Tools (58)
News
  • Takuya Furukawa, Hironobu Fujiyoshi
    2012 Volume 66 Issue 4 Pages J93-J100
    Published: 2012
    Released on J-STAGE: March 23, 2012
    JOURNAL FREE ACCESS
    We propose a technique for automatically editing personal videos using video and sensor information obtained from the camera. In this study, we used personal videos cut-edited by individual users, together with their editing histories, to analyze tendencies in the cut scenes (used scenes) and cut-edit points. The analysis revealed that used scenes tend to contain a large amount of camerawork. Based on these findings, the proposed technique uses continuous rank-increase measure (CRIM) and motion correlation (MC) values calculated from space-time patches (ST-patches), which capture changes in motion, together with the acceleration and angular speed obtained from the camera's sensors, to select scenes that contain substantial camerawork and subject movement. Because the selection reflects the user's own editing tendencies, the resulting edits better match the user's preferences. Experimental results show that automatic editing by the proposed technique achieves a higher degree of viewer satisfaction than editing based only on ST-patch features (see the sketch after this entry).
    Download PDF (1504K)
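    The scene-selection step described in the abstract above can be pictured as a scoring pass over candidate scenes that combines visual motion features with camera-sensor activity. The sketch below is a minimal illustration only: the feature fields, weights, and threshold are assumptions made for exposition, not values or code from the paper.

        # Illustrative sketch of sensor-assisted scene selection (not the authors' code).
        # Per-scene CRIM/MC values and sensor statistics are assumed to be precomputed;
        # the weights and threshold below are placeholders.
        from dataclasses import dataclass

        @dataclass
        class Scene:
            start: float      # scene start time [s]
            end: float        # scene end time [s]
            crim: float       # continuous rank-increase measure from ST-patches
            mc: float         # motion correlation from ST-patches
            accel_var: float  # variance of accelerometer magnitude (camerawork cue)
            gyro_var: float   # variance of angular speed (camerawork cue)

        def camerawork_score(s: Scene) -> float:
            """Combine video motion features with camera-sensor activity."""
            visual = 0.5 * s.crim + 0.5 * s.mc             # motion seen in the video
            sensor = 0.5 * s.accel_var + 0.5 * s.gyro_var  # motion of the camera itself
            return 0.6 * visual + 0.4 * sensor             # placeholder weighting

        def select_scenes(scenes, threshold=0.7, max_total=60.0):
            """Keep high-motion scenes until the edit reaches max_total seconds."""
            picked, total = [], 0.0
            for s in sorted(scenes, key=camerawork_score, reverse=True):
                length = s.end - s.start
                if camerawork_score(s) >= threshold and total + length <= max_total:
                    picked.append(s)
                    total += length
            return sorted(picked, key=lambda s: s.start)   # restore chronological order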
  • Takumi Yoshida, Sho Kamuro, Kouta Minamizawa, Hideaki Nii, Susumu Tach ...
    2012 Volume 66 Issue 4 Pages J101-J107
    Published: 2012
    Released on J-STAGE: March 23, 2012
    JOURNAL FREE ACCESS
    We propose RePro3D, a novel full-parallax 3D display system suitable for interactive 3D applications. The approach is based on retro-reflective projection technology, in which images from a projector array are projected onto a retro-reflective screen. When viewers look at the screen through a half mirror, they see a 3D image superimposed on the real scene without having to wear glasses. RePro3D also has a sensor function that recognizes user input, so it supports interactive features such as manipulation of 3D objects. In this paper, we describe the optical system of the high-density projector array and a prototype of RePro3D. The prototype displays parallax images on a real scene from 42 different viewpoints, and the user can touch the 3D image with his or her hands (a brief geometry sketch follows this entry).
    Download PDF (989K)
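    For intuition about how a retro-reflective screen and a projector array yield view-dependent images, the sketch below picks, for a given eye position, the projector whose half-mirror-reflected position is nearest; because a retro-reflective screen returns light toward its source, that projector's image dominates what the eye sees. The array size (42) matches the abstract, but the spacing and positions are assumptions for illustration, not the prototype's actual geometry.

        # Illustrative geometry sketch (assumed layout, not the RePro3D prototype's).
        import numpy as np

        NUM_PROJECTORS = 42
        PITCH = 0.015  # assumed horizontal spacing between projector lenses [m]

        # Hypothetical projector positions along x, already reflected by the half
        # mirror into the viewer's space ("virtual" projector positions).
        projector_x = (np.arange(NUM_PROJECTORS) - (NUM_PROJECTORS - 1) / 2) * PITCH

        def visible_view_index(eye_x: float) -> int:
            """Index of the parallax image seen from horizontal eye position eye_x [m]."""
            return int(np.argmin(np.abs(projector_x - eye_x)))

        # Eyes roughly 65 mm apart fall nearest to different projectors, so each eye
        # sees a different parallax image, which is the basis of glasses-free
        # binocular and motion parallax in this kind of display.
        left_view = visible_view_index(-0.0325)
        right_view = visible_view_index(+0.0325)
        print(left_view, right_view)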