Journal of the Robotics Society of Japan
Online ISSN : 1884-7145
Print ISSN : 0289-1824
ISSN-L : 0289-1824
Volume 41, Issue 7
Paper
  • Toshiyuki Hayashi, Takashi Tsubouchi
    2023 Volume 41 Issue 7 Pages 631-634
    Published: 2023
    Released on J-STAGE: September 09, 2023
    JOURNAL FREE ACCESS

    For reliable maintenance of infrastructure such as bridges, it is desirable to use robotic technologies such as UAVs to support engineers. In this research, we propose a method to identify the amount of one-dimensional directional blur in order to sharpen blurred images acquired by a UAV. To sharpen one-dimensionally blurred images of concrete structures, we focus on simple, uniform features such as the sandy patterns peculiar to concrete surfaces and compute the cepstrum of partial images containing such features to obtain at least one candidate for the amount of blur. In this paper, we apply the proposed "MD method" to blurred images of concrete bridges actually captured by a UAV and sharpen each degraded image with a PSF-MB corresponding to the identified amount of blur. (A minimal cepstrum sketch follows this entry.)

    Download PDF (1102K)
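For intuition only, the sketch below shows the general cepstrum idea behind blur-length estimation, not the authors' MD method or PSF-MB deblurring: a uniform linear blur of length L leaves a pronounced negative cepstral peak near quefrency L. The patch size, peak-search range, and the synthetic "sandy" texture are assumptions made for this example.

```python
# Minimal sketch: estimate a horizontal motion-blur length from the cepstrum
# of a uniformly textured patch (illustrative; not the paper's MD method).
import numpy as np

def cepstrum_1d(signal):
    """Real cepstrum of a 1D signal: IFFT of the log magnitude spectrum."""
    spectrum = np.abs(np.fft.fft(signal)) + 1e-12   # avoid log(0)
    return np.real(np.fft.ifft(np.log(spectrum)))

def estimate_blur_length(patch, max_blur=64):
    """Return a candidate horizontal blur length (pixels) for an image patch.

    Row cepstra are averaged to suppress texture noise; a uniform linear blur
    of length L produces a strong negative peak near quefrency L.
    """
    ceps = np.mean([cepstrum_1d(row) for row in patch.astype(float)], axis=0)
    search = ceps[2:max_blur]            # skip quefrencies 0-1 (DC-dominated)
    return int(np.argmin(search)) + 2    # most negative peak -> blur length

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    texture = rng.normal(size=(128, 256))    # stand-in for sandy concrete texture
    L = 15                                   # simulated blur length
    kernel = np.ones(L) / L
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, texture)
    print("candidate blur length:", estimate_blur_length(blurred))
```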
  • —A Proof-of-Concept Study with Braava Jet m6 at a Geriatric Care Facility—
    Yuji Asaishi, Kazuko Obayashi, Shigeru Masuyama, Naonori Kodate
    2023 Volume 41 Issue 7 Pages 635-638
    Published: 2023
    Released on J-STAGE: September 09, 2023
    JOURNAL FREE ACCESS

    It has been argued that the use of robots for non-personal care tasks can help reduce the workload in welfare facilities. In this study, we conducted a proof-of-concept study to examine the utility of a floor-cleaning robot, a topic on which there are few previous studies. We also carried out a questionnaire survey of staff who observed the robot in operation. The results of these mixed-method analyses revealed that the robot is not practical in its current state. The study suggests, however, that the robot has potential if its performance is improved and the environment is set up appropriately.

    Download PDF (467K)
  • Kaito Ichihara, Tadahiro Hasegawa, Shin'ichi Yuta, Tomoyuki Kato, Moto ...
    2023 Volume 41 Issue 7 Pages 639-642
    Published: 2023
    Released on J-STAGE: September 09, 2023
    JOURNAL FREE ACCESS

    A prototype of a retrofittable deep-snow work assist system was developed and experimentally verified in deep-snow removal work. The assist system displays a vehicle model in real time on a 3D map of the snow-removal site created when there is no snow, enabling operators to accurately identify the vehicle's position and attitude even in deep-snow environments. Furthermore, the assist system can be retrofitted to various types of vehicles. The experimental results showed its potential as a guidance system for deep-snow removal work. (A minimal pose-overlay sketch follows this entry.)

    Download PDF (1517K)
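The geometric core of such a display is placing a vehicle model, at its estimated pose, into the coordinate frame of the snow-free 3D map. The sketch below illustrates only that step under simplifying assumptions: a box stands in for the vehicle model, and the pose is taken as given (e.g. from GNSS/IMU); it is not the developed assist system.

```python
# Minimal sketch: transform a vehicle-frame model into map coordinates so it
# can be overlaid on a prebuilt 3D map for the operator (illustrative only).
import numpy as np

def pose_matrix(x, y, z, yaw):
    """4x4 homogeneous transform for a planar pose with heading `yaw` (rad)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0, x],
                     [s,  c, 0.0, y],
                     [0.0, 0.0, 1.0, z],
                     [0.0, 0.0, 0.0, 1.0]])

def vehicle_in_map(pose, vehicle_points):
    """Transform vehicle-frame points of shape (N, 3) into map coordinates."""
    homo = np.hstack([vehicle_points, np.ones((len(vehicle_points), 1))])
    return (pose @ homo.T).T[:, :3]

if __name__ == "__main__":
    # Corner points of a 4 m x 2 m x 2 m box standing in for the vehicle model.
    box = np.array([[sx * 2.0, sy * 1.0, sz]
                    for sx in (-1, 1) for sy in (-1, 1) for sz in (0.0, 2.0)])
    pose = pose_matrix(x=12.0, y=-3.5, z=0.0, yaw=np.deg2rad(30))
    print(vehicle_in_map(pose, box))   # points to draw on the 3D map display
```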
  • Hao Chen, Takuya Kiyokawa, Zhengtao Hu, Weiwei Wan, Kensuke Harada
    2023 Volume 41 Issue 7 Pages 643-646
    Published: 2023
    Released on J-STAGE: September 09, 2023
    JOURNAL FREE ACCESS

    Diversified manufacturing requires a robot system to generalize to a wide range of novel objects. Previous studies try to achieve this goal with learning-based methods, but these are costly and generalize poorly because of insufficient knowledge in training. In this paper, we propose a new approach to novel object grasping that uses an object ontology to perform similarity matching between known and novel objects. We realize successful grasps on a novel object by imitating robust grasps of its most similar known object. Our method is training-free, and its generalizability is verified with an average success rate of 83% in novel-object grasping experiments. (A minimal similarity-matching sketch follows this entry.)

    Download PDF (1931K)
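The sketch below illustrates the general pattern only: score a novel object against known entries by attribute similarity and reuse the best match's grasps. The attribute set, similarity weights, and grasp representation are assumptions for this example, not the ontology or matching rule used in the paper.

```python
# Minimal sketch: training-free grasp transfer via attribute similarity
# between a novel object and a small database of known objects.
from dataclasses import dataclass, field

@dataclass
class ObjectEntry:
    name: str
    shape: str                                   # e.g. "box", "cylinder"
    size_mm: tuple                               # rough bounding-box dimensions
    grasps: list = field(default_factory=list)   # known robust grasps

def similarity(a: ObjectEntry, b: ObjectEntry) -> float:
    """Crude attribute similarity: shared shape class plus size closeness."""
    shape_score = 1.0 if a.shape == b.shape else 0.0
    size_diff = sum(abs(x - y) for x, y in zip(a.size_mm, b.size_mm))
    return 0.6 * shape_score + 0.4 / (1.0 + size_diff / 100.0)

def transfer_grasps(novel: ObjectEntry, known: list) -> list:
    """Imitate grasps from the most similar known object (no training)."""
    best = max(known, key=lambda k: similarity(novel, k))
    return best.grasps

if __name__ == "__main__":
    known = [
        ObjectEntry("mug", "cylinder", (80, 80, 95), grasps=["side_pinch", "rim_grasp"]),
        ObjectEntry("cracker_box", "box", (60, 160, 210), grasps=["top_grasp"]),
    ]
    novel = ObjectEntry("unseen_can", "cylinder", (66, 66, 120))
    print(transfer_grasps(novel, known))   # -> grasps borrowed from the mug
```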
  • Yuya Okadome, Yutaka Nakamura
    2023 Volume 41 Issue 7 Pages 647-650
    Published: 2023
    Released on J-STAGE: September 09, 2023
    JOURNAL FREE ACCESS

    Human-human interaction includes synchronized behaviors such as nodding, turn-taking, and smiling, and these behaviors are expressed at appropriate timings. Extracting and implementing such synchronization behaviors is crucial for a communication robot intended to support conversations that "feel good". In this research, we propose a framework for extracting synchronization behaviors from dyadic conversation data. A neural network model is trained on data converted with a "lag operation", a time-shifting operation applied to the features of one subject. The representation space after learning is expected to acquire a distinctive structure in which time-dependent behaviors are separated. The proposed framework is applied to four hours of dyadic conversation data, and the representation space of the converted features is obtained. By extracting representations that are differentiated in this space, we confirm that the data includes synchronization behaviors such as turn-taking and nodding. Designing behavior rules for a social robot from the extracted data is one of our future projects. (A minimal lag-operation sketch follows this entry.)

    Download PDF (982K)
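The sketch below shows one plausible reading of the "lag operation" described in the abstract: one subject's feature sequence is time-shifted before being paired with the other subject's features, so that time-shifted (synchronized) behaviors land in the same training sample. The frame rate, lag range, and downstream model are assumptions for this example.

```python
# Minimal sketch of a lag operation on dyadic conversation features
# (illustrative; not the paper's exact preprocessing or network).
import numpy as np

def lag_features(features_a, features_b, lag):
    """Pair subject A's features at time t with subject B's at time t - lag.

    features_a, features_b: arrays of shape (T, D). Returns (T - lag, 2*D),
    letting a model relate time-shifted behaviors (e.g. a nod that follows
    the partner's utterance).
    """
    if lag == 0:
        return np.hstack([features_a, features_b])
    return np.hstack([features_a[lag:], features_b[:-lag]])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, D = 1000, 8                       # e.g. 8 behavioral features per frame
    subj_a, subj_b = rng.normal(size=(T, D)), rng.normal(size=(T, D))
    # Build training inputs for several candidate lags (0-2 s at 10 fps, assumed).
    dataset = [lag_features(subj_a, subj_b, lag) for lag in range(0, 21, 5)]
    print([x.shape for x in dataset])
```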
  • Takuya Kiyokawa, Kotaro Yoshimoto, Jun Takamatsu
    2023 Volume 41 Issue 7 Pages 651-654
    Published: 2023
    Released on J-STAGE: September 09, 2023
    JOURNAL FREE ACCESS

    Through self-supervised learning in simulation, our robotic waste sorter acquires an efficient action policy for scattering densely gathered objects and grasping them. To quickly generalize the model to the short life cycles and varied shapes of waste items carried into recycling facilities, we consider rapidly reconstructing the simulator used for training. The simulator requires object models that reproduce appearance and shape as realistically as possible. To quickly generate models of waste items, including semi-transparent objects, we propose a thermal-based shape-from-silhouette method. Our experiments demonstrate that the proposed method can reconstruct the 3D shapes of transparent objects and that the generated models can be used to train the manipulation policy. (A minimal silhouette-carving sketch follows this entry.)

    Download PDF (3116K)
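The sketch below illustrates only the geometric step common to shape-from-silhouette methods: carving a visual hull by intersecting back-projected silhouettes. Here the "thermal" silhouettes are synthetic binary masks and the cameras are orthographic along the coordinate axes; both are assumptions made to keep the example short, and this is not the paper's pipeline.

```python
# Minimal sketch: visual-hull carving on a voxel grid from axis-aligned
# orthographic silhouettes (illustrative only).
import numpy as np

def carve(silhouettes, resolution=64):
    """Intersect back-projected silhouettes on a voxel grid.

    silhouettes: dict mapping an axis index (0, 1, 2) to a binary mask of
    shape (resolution, resolution) seen along that axis. A voxel survives
    only if it projects inside every silhouette.
    """
    volume = np.ones((resolution,) * 3, dtype=bool)
    for axis, mask in silhouettes.items():
        # Broadcast the 2D mask along its viewing axis and intersect.
        volume &= np.expand_dims(mask.astype(bool), axis=axis)
    return volume

if __name__ == "__main__":
    res = 64
    yy, xx = np.mgrid[0:res, 0:res]
    disc = (yy - res / 2) ** 2 + (xx - res / 2) ** 2 < (res / 3) ** 2
    # Circular silhouettes from three axis-aligned views carve a rounded blob.
    hull = carve({0: disc, 1: disc, 2: disc}, resolution=res)
    print("occupied voxels:", int(hull.sum()))
```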