Reports of the Technical Conference of the Institute of Image Electronics Engineers of Japan
Online ISSN : 2758-9218
Print ISSN : 0285-3957
Reports of the 294th Technical Conference of the Institute of Image Electronics Engineers of Japan
Displaying 1-9 of 9 articles from this issue
  • Kenta UCHIDA, Tokiichiro TAKAHASHI
    Session ID: 20-01-01
    Published: 2020
    Released on J-STAGE: March 31, 2023
    CONFERENCE PROCEEDINGS RESTRICTED ACCESS
    In recent years, with the development of virtual reality (VR) technology, VR content has increased and is becoming more popular in homes. On the other hand, VR sickness remains a problem. We previously developed a sensory interface that mimics a “swimming motion” and obtained good results in reducing VR sickness. In this study, we use the MSSQ, which measures general susceptibility to motion sickness, to analyze the relationship between that susceptibility and the sickness caused by the “swimming motion”, and thereby verify the relationship between motion sickness susceptibility and VR sickness (an illustrative analysis sketch follows this entry).
    Download PDF (847K)
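    The abstract above relates MSSQ susceptibility scores to sickness measured after VR exposure but gives no code. The following is a minimal Python sketch of one such susceptibility-versus-sickness analysis; the participant scores are hypothetical, and the use of Pearson correlation is an assumption, not necessarily the statistic used in the study.
    import numpy as np
    from scipy import stats

    # Hypothetical data: one MSSQ susceptibility score and one post-exposure
    # sickness rating per participant (values invented for illustration).
    mssq = np.array([12.4, 30.1, 8.7, 22.5, 17.9])
    sickness = np.array([7.5, 26.2, 3.7, 18.7, 11.2])

    r, p = stats.pearsonr(mssq, sickness)
    print(f"Pearson r = {r:.2f}, p = {p:.3f}")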
  • Keito TANIZAKI, Tokiichiro TAKAHASHI
    Session ID: 20-01-02
    Published: 2020
    Released on J-STAGE: March 31, 2023
    CONFERENCE PROCEEDINGS RESTRICTED ACCESS
    VR social platforms, which allow users to communicate with one another while wearing 3DCG character costumes as avatars, have become very popular. They let users express themselves freely, much like cosplay in the real world. However, cosplay photographers in VR are few and far between. One reason is that shooting failures cannot occur: failures such as camera shake and exposure error are part of the fun of shooting, yet they never happen with existing methods. In this paper, we propose a virtual space photography system that allows users to simulate real shooting failures using a real camera (an illustrative sketch of such simulated failures follows this entry).
    Download PDF (2482K)
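    The paper describes the system only at the level above. Purely as an assumed illustration of the two failures it names, the Python/OpenCV sketch below applies camera shake as a linear motion-blur kernel and exposure error as a gain in stops; the file names and parameter values are hypothetical, and the authors' implementation may differ.
    import cv2
    import numpy as np

    def camera_shake(img, length=15):
        # Horizontal motion-blur kernel approximating shake during exposure.
        kernel = np.zeros((length, length), dtype=np.float32)
        kernel[length // 2, :] = 1.0 / length
        return cv2.filter2D(img, -1, kernel)

    def exposure_error(img, stops=1.5):
        # Over- or under-expose by a number of stops (positive = brighter).
        gain = 2.0 ** stops
        return np.clip(img.astype(np.float32) * gain, 0, 255).astype(np.uint8)

    frame = cv2.imread("vr_capture.png")                 # hypothetical captured frame
    failed_shot = exposure_error(camera_shake(frame), 2.0)
    cv2.imwrite("failed_shot.png", failed_shot)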
  • Kazuma OHARA, Keito TANIZAKI, Tokiichiro TAKAHASHI
    Session ID: 20-01-03
    Published: 2020
    Released on J-STAGE: March 31, 2023
    CONFERENCE PROCEEDINGS RESTRICTED ACCESS
    We have already developed an AR support system that allows users to practice the racing line and braking position while driving. In this study, building on that system, we developed a function that lets users check the timing of braking by displaying their own braking position in AR in real time during driving practice. The experimental results showed that the new system reduced the running time compared to the conventional system, demonstrating its usefulness.
    Download PDF (584K)
  • Akihiro OKUDA, Tokiichiro TAKAHASHI
    Session ID: 20-01-04
    Published: 2020
    Released on J-STAGE: March 31, 2023
    CONFERENCE PROCEEDINGS RESTRICTED ACCESS
    In Japan's metropolitan areas, many passengers use the railway lines for commuting to work and other purposes, but chronic delays occur in the morning and evening hours and increase the time required to reach one's destination. In this research, we propose a method for visualizing the tendency of chronic delay by accumulating and analyzing several months of open data on railway operation (an illustrative aggregation sketch follows this entry).
    Download PDF (652K)
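    The paper does not specify its data format or tooling. As one assumed illustration of the aggregation it describes, the pandas sketch below averages several months of delay records by hour of day to expose the morning and evening peaks; the file name and column names are hypothetical.
    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical open-data extract: one row per observed delay, with a
    # timestamp and the delay in minutes.
    records = pd.read_csv("delay_records.csv", parse_dates=["timestamp"])
    records["hour"] = records["timestamp"].dt.hour

    hourly = records.groupby("hour")["delay_minutes"].mean()
    ax = hourly.plot(kind="bar")
    ax.set_xlabel("Hour of day")
    ax.set_ylabel("Mean delay (min)")
    ax.set_title("Chronic delay tendency by time of day")
    plt.tight_layout()
    plt.show()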
  • Nobuo TAKAHASHI, Mayu URATA, Mamoru ENDO, Takami YASUDA
    Session ID: 20-01-05
    Published: 2020
    Released on J-STAGE: March 31, 2023
    CONFERENCE PROCEEDINGS RESTRICTED ACCESS
    In this study, in order to improve the visual consistency of CG/live-action composites, we investigated a method for representing the bloom effect based on a simply measured camera PSF. First, the camera-specific ESF was obtained from captured edge images, and the PSF was estimated from the resulting LSF. We then added the bloom effect by convolving the CG image with the PSF and evaluated the image quality. As a result, the PSF-based bloom was found to be effective in improving the visual consistency of CG/live-action compositing, because its pixel values were almost identical to those of the bloom in live-action images captured by the camera (an illustrative sketch of PSF-based bloom follows this entry).
    Download PDF (838K)
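    The abstract outlines an ESF → LSF → PSF pipeline without implementation details. The NumPy/SciPy sketch below is one assumed reading of it: differentiate a measured ESF to get the LSF, spin the LSF into a rotationally symmetric 2-D PSF (an assumption, since the real PSF need not be symmetric), and add bloom by convolving only the bright part of a CG image.
    import numpy as np
    from scipy.ndimage import convolve

    def psf_from_esf(esf, size=31):
        # Differentiate the measured edge spread function to get the LSF,
        # then index it by radial distance to build a symmetric 2-D PSF.
        lsf = np.abs(np.gradient(np.asarray(esf, dtype=float)))
        lsf /= lsf.sum()
        peak = int(np.argmax(lsf))                   # edge position = PSF centre
        half = size // 2
        yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
        r = np.sqrt(xx**2 + yy**2).astype(int)
        psf = lsf[np.clip(peak + r, 0, len(lsf) - 1)]
        return psf / psf.sum()

    def add_bloom(cg_image, psf, threshold=0.8):
        # Convolve only the bright pixels (values in [0, 1]) with the PSF
        # and add the result back to the original CG image.
        bright = np.where(cg_image > threshold, cg_image, 0.0)
        return np.clip(cg_image + convolve(bright, psf, mode="nearest"), 0.0, 1.0)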
  • Mamoru SATO, Masanori KAKIMOTO, Kai LENZ
    Session ID: 20-01-06
    Published: 2020
    Released on J-STAGE: March 31, 2023
    CONFERENCE PROCEEDINGS RESTRICTED ACCESS
    Download PDF (369K)
  • Kazuki WAKIDA, Ryosuke FURUTA, Yukinobu TANIGUCHI
    Session ID: 20-01-07
    Published: 2020
    Released on J-STAGE: March 31, 2023
    CONFERENCE PROCEEDINGS RESTRICTED ACCESS
    In order to reduce labor costs and monitor the health of dairy cows, we previously proposed a method for identifying individual cows from cameras installed on the ceilings of barns, using their spot patterns as a clue. However, the appearance of the spots varies greatly depending on the angle and posture of the cow in the captured images. To solve this problem, we propose a method for creating a database that can handle these changes in spot-pattern appearance for accurate identification of dairy cows. The proposed method consists of two steps. First, a 3D model of a cow is reconstructed by compositing several RGB-D images taken from different angles. Then, multi-view images with different appearances of the spot patterns are obtained by moving and rotating the 3D model. To confirm the effectiveness of the proposed method, we evaluate its individual-identification performance by matching spot patterns between the obtained multi-view images and query images taken by another camera from a different angle and position (an illustrative matching sketch follows this entry).
    Download PDF (607K)
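    The abstract does not state how the spot patterns are matched. Purely as an illustration of ranking a multi-view database against a query image, the OpenCV sketch below counts mutually consistent ORB feature matches per cow; the matcher choice and the database layout are assumptions, not the authors' method.
    import cv2

    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def match_score(query_gray, db_gray):
        # Number of mutually consistent ORB matches between two grayscale images.
        _, dq = orb.detectAndCompute(query_gray, None)
        _, dd = orb.detectAndCompute(db_gray, None)
        if dq is None or dd is None:
            return 0
        return len(matcher.match(dq, dd))

    def identify(query_gray, database):
        # database: dict mapping cow_id -> list of rendered multi-view images.
        scores = {cow: max(match_score(query_gray, view) for view in views)
                  for cow, views in database.items()}
        return max(scores, key=scores.get)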
  • Daisuke IMOTO, Kenji KUROSAWA, Masakatsu HONMA, Ryo YOKOTA, Manato HIR ...
    Session ID: 20-01-08
    Published: 2020
    Released on J-STAGE: March 31, 2023
    CONFERENCE PROCEEDINGS RESTRICTED ACCESS
    Silhouette-based gait analysis is a technique for determining whether two pedestrian videos depict the same person. However, when the pedestrian is captured close to the camera, projective distortion causes their shape to change nonlinearly. This shape change can make it difficult to identify the same person correctly, even if the camera viewing angle with respect to the person differs only slightly. In this study, we proposed a gait analysis method in which the shooting angle of each pedestrian video, a three-dimensional (3D) camera parameter, is first calibrated, and the silhouette video for learning is then computed by perspective projection (simulation) from a four-dimensional (3D + time) gait database so that the view angle of the silhouette corresponds to that of each video (an illustrative projection sketch follows this entry). We examined the person-identification rate of the proposed gait analysis and the relationship between the degree of projective distortion and the improvement in identification accuracy obtained with the proposed method. Consequently, the proposed method was found to be effective when the projective distortion was strong, i.e., when the pedestrian was close to the camera.
    Download PDF (1174K)
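    The projection step above can be pictured with a plain pinhole model. The NumPy sketch below is an assumed, simplified version that projects one frame of a 3-D gait model into a calibrated camera (intrinsics K, rotation R, translation t) and rasterises the result as a binary silhouette; the calibration and rendering used in the paper may differ.
    import numpy as np

    def project_points(points_3d, K, R, t):
        # Pinhole perspective projection of Nx3 world points to Nx2 pixels.
        cam = R @ points_3d.T + t.reshape(3, 1)      # world -> camera coordinates
        uvw = K @ cam                                # camera -> image plane
        return (uvw[:2] / uvw[2]).T

    def render_silhouette(points_3d, K, R, t, height, width):
        # Rasterise the projected points into a binary silhouette image.
        sil = np.zeros((height, width), dtype=np.uint8)
        px = np.round(project_points(points_3d, K, R, t)).astype(int)
        inside = ((px[:, 0] >= 0) & (px[:, 0] < width) &
                  (px[:, 1] >= 0) & (px[:, 1] < height))
        sil[px[inside, 1], px[inside, 0]] = 255
        return sil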
  • Kunio KONDO
    Session ID: 20-01-09
    Published: 2020
    Released on J-STAGE: March 31, 2023
    CONFERENCE PROCEEDINGS RESTRICTED ACCESS
    The purpose of this paper is to summarize the process of modeling with photogrammetry and the open problems it has revealed. The paper introduces the current status of photogrammetry, the process of creating 3D models, and my own modeling examples using photogrammetry. Finally, I explain the open problems of photogrammetry technology and its applications.
    Download PDF (3577K)