In recent years, Japan has been facing a rapidly aging society, which has resulted in problems such as a shortage of care workers and a growing number of elderly people living alone. The Cabinet Office of Japan anticipates that these problems will become more serious in the future, making it increasingly difficult for elderly people to receive care. Here, we propose a system for confirming the safety of elderly people. The proposed system is equipped with a magnetoresistive (MR) sensor and infers the safety of elderly people from the operation of electrical home appliances. When a home appliance is operated, the current flowing through its power cable produces a magnetic field, which the MR sensor can detect. The proposed system thus detects the operational status of home appliances, allowing us to confirm the safety of elderly people.
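The detection principle above, inferring appliance operation from the AC magnetic field around the power cable, can be sketched as a threshold test on the RMS amplitude of the sensor signal. This is a minimal illustration, not the authors' implementation; the sample values and the threshold are hypothetical.

```python
import math

def appliance_is_on(samples, threshold=0.5):
    """Decide whether an appliance is operating from MR-sensor samples.

    samples: magnetic-field readings (arbitrary units) covering at least
    one AC mains cycle. A powered cable carries a 50/60 Hz current, so
    its field oscillates; the RMS amplitude rises above the ambient
    noise floor while the appliance is on.
    """
    mean = sum(samples) / len(samples)
    # Remove the DC offset (Earth's field, sensor bias) before taking RMS.
    rms = math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))
    return rms > threshold

# Hypothetical readings: a 50 Hz sinusoid (appliance on) vs. the noise floor.
on_samples = [2.0 * math.sin(2 * math.pi * 50 * t / 1000) for t in range(100)]
off_samples = [0.01 * math.sin(2 * math.pi * 50 * t / 1000) for t in range(100)]
```

Removing the mean first matters in practice: a static field (e.g. the Earth's) would otherwise inflate the RMS and mask the on/off contrast.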
To lighten the heavy burden on care workers in Japan's aging society, many safety confirmation systems for elderly people in daily life have been proposed in recent years. This paper proposes a safety confirmation system for elderly people that employs an Obrid-Sensor and an ultrasonic sensor. The ultrasonic sensor is based on the reflection of ultrasonic waves, and the Obrid-Sensor is constructed from a rod lens and a line sensor. The proposed method uses no image or video information, so the sensors perform well with respect to the privacy-protection requirements of nursing care. In the system, the sensors make it possible to observe the subject's basic motion and to raise an alarm when he/she falls down, while preserving privacy. The effectiveness of the proposed method is confirmed by experiments.
This paper describes a new method of operating a page-turner machine with gaze input. The user's face is captured by a USB camera, and the images are sent to a control computer to recognize the user's gaze direction. Gaze toward the right or left is recognized from the relative distance between the iris and the outer corner of each eye. Whether the user is gazing at the camera or reading a book is judged from the curvature of the eyelid shape, which is approximated by a Bezier curve. As the gaze point falls below the height of the camera, the curvature of the approximating curve decreases because the eyelid shape becomes roughly linear. The proposed recognition system was applied to the operation of the page-turner machine. In the test trials, it was confirmed that the developed system could recognize the user's intention, and the subjects were consequently able to operate the page-turner machine through non-contact gaze input alone.
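The eyelid-flatness cue above can be illustrated by fitting a quadratic Bezier curve through three eyelid landmarks and evaluating its curvature at the midpoint: a flatter eyelid (downward gaze, e.g. reading) yields a smaller curvature. This is a sketch under that assumption; the landmark coordinates below are hypothetical.

```python
def quad_bezier_curvature(p0, p1, p2, t=0.5):
    """Curvature of the quadratic Bezier through control points p0, p1, p2.

    B'(t)  = 2(1-t)(p1-p0) + 2t(p2-p1)
    B''(t) = 2(p2 - 2*p1 + p0)
    kappa  = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2)
    """
    dx = 2 * (1 - t) * (p1[0] - p0[0]) + 2 * t * (p2[0] - p1[0])
    dy = 2 * (1 - t) * (p1[1] - p0[1]) + 2 * t * (p2[1] - p1[1])
    ddx = 2 * (p2[0] - 2 * p1[0] + p0[0])
    ddy = 2 * (p2[1] - 2 * p1[1] + p0[1])
    return abs(dx * ddy - dy * ddx) / (dx * dx + dy * dy) ** 1.5

# Arched eyelid (gazing at the camera) vs. a nearly flat one (reading).
arched = quad_bezier_curvature((0, 0), (1, 1), (2, 0))
flat = quad_bezier_curvature((0, 0), (1, 0.05), (2, 0))
```

Comparing the two curvature values against a tuned threshold would give the camera-vs-book decision described in the abstract.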
The number of people with disabilities is growing in Japan. Since people with upper-limb disabilities have difficulty eating, a system that assists with eating is required. Traditional assistance robots are operated with joysticks or touch sensors; however, people with upper-limb disabilities cannot fully operate such robots. Therefore, we propose a maneuverable meal-assistance robot operated with the eyes. The user can eat using the robot through eye manipulation alone. Its operation interface consists of a tablet PC connected to a web camera, which calculates the motion of the user's eyes. The terminal detects the position of the user's eyes in the face image, and the barycenter vector between the barycenters of the sclera and the iris is calculated. The motion of the eyes is determined from the magnitude and direction of the calculated vector. With the proposed method, the user can manipulate the robot with the eyes. Moreover, the designed robot, with its independent arms, offers stable motion and quiet operation through the use of ultrasonic motors. In this paper, the details of the robot are introduced.
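The barycenter-vector decision described above can be sketched as follows: compute the centroid of the sclera pixels and of the iris pixels, take the vector between them, and map its direction to a command only when its magnitude exceeds a dead zone. The pixel masks, function names, and threshold are all hypothetical illustrations, not the paper's implementation.

```python
import math

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def eye_command(sclera_pixels, iris_pixels, dead_zone=2.0):
    """Map the sclera-to-iris barycenter vector to a gaze command.

    Image coordinates: x grows rightward, y grows downward.
    Returns 'right', 'left', 'up', 'down', or 'center'.
    """
    sx, sy = centroid(sclera_pixels)
    ix, iy = centroid(iris_pixels)
    vx, vy = ix - sx, iy - sy
    if math.hypot(vx, vy) < dead_zone:   # eye roughly centered: no command
        return "center"
    if abs(vx) >= abs(vy):               # dominant horizontal component
        return "right" if vx > 0 else "left"
    return "down" if vy > 0 else "up"

# Hypothetical masks: iris region shifted to the right of the sclera center.
sclera = [(x, y) for x in range(0, 20) for y in range(0, 10)]
iris = [(x, y) for x in range(12, 18) for y in range(2, 8)]
```

The dead zone keeps the robot still while the user merely looks at the food, issuing commands only for deliberate eye movements.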
In this report, we propose an inexpensive and handy system that recognizes the direction of a person's line of sight, built on a Raspberry Pi (open-source hardware with embedded Linux) and its proprietary camera module. To determine the line of sight, the face, the inner corners of the eyes, and the pupils are first detected from a frontal face image, and the distance between the person and the camera and the direction of the line of sight are estimated from their analysis. When the performance of line-of-sight determination in the proposed system is compared with that of a conventional setup using a personal computer, the detection results match perfectly. In addition, it is confirmed that the line of sight can be determined accurately within a distance of 1.0 m between the person and the camera.
In this study, an interface with which a human can obtain information on remote circumstances from a mobile robot using eye-gaze input is considered. With this system, information on the remote circumstances is acquired by a camera installed on the mobile robot, and an image is displayed to a user. The position of human attention in the image is recognized by the eye-gaze input system, and the orientation of the robot and camera is adjusted based on the recognition so that the position of human attention is at the center of the image, which means that the user can recognize his/her target easily. In the eye-gaze input system, particle filter processing is applied to detect the movement of the dark part of an eye (iris and pupil) to improve recognition by the system. The desirable movement speed of the image is examined through psychological evaluations while the orientation of the camera and the mobile robot is adjusted. The results show that desirable speeds exist for pitch and yaw angles. In addition, human experiments were conducted to recognize an object in remote circumstances using the proposed method and speeds. The results show that the trajectory of eye-gaze movement becomes smooth and the user can recognize the object easily, which shows the effectiveness of the proposed method.
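The particle-filter step used above to track the dark part of the eye can be sketched in one dimension under simplifying assumptions: particles are horizontal positions, weights come from a hypothetical `darkness()` likelihood (darker pixels are more likely to be iris/pupil), and resampling is multinomial. This is an illustrative reduction, not the authors' implementation.

```python
import random

def particle_filter_step(particles, darkness, motion_noise=2.0):
    """One predict-weight-resample cycle for tracking a dark region.

    particles: list of horizontal positions (floats).
    darkness:  function giving image darkness at a position; darker
               pixels yield higher likelihood weights.
    Returns (resampled particles, weighted-mean position estimate).
    """
    # Predict: diffuse each particle with Gaussian motion noise.
    moved = [p + random.gauss(0.0, motion_noise) for p in particles]
    # Weight: likelihood proportional to darkness at the particle.
    weights = [max(darkness(p), 1e-12) for p in moved]
    total = sum(weights)
    estimate = sum(w * p for w, p in zip(weights, moved)) / total
    # Resample: draw particles in proportion to their weights.
    resampled = random.choices(moved, weights=weights, k=len(moved))
    return resampled, estimate

# Hypothetical darkness profile: a dark iris centered at x = 40.
def darkness(x):
    return 1.0 / (1.0 + (x - 40.0) ** 2)

random.seed(0)
particles = [random.uniform(0, 80) for _ in range(500)]
for _ in range(10):
    particles, estimate = particle_filter_step(particles, darkness)
```

After a few cycles the particle cloud concentrates on the dark region, and the weighted mean gives a smoothed iris-position estimate suitable for driving the camera orientation.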
This paper proposes a notice-reading support system using an omnidirectional electric wheelchair. First, we examined the distance range within which a user felt the need to move a general wheelchair to see notices located far away on a bulletin board. Users tended to start approaching the target notice by operating the general wheelchair even when the notice was legible from the current position. Next, we carried out the same experiment with an omnidirectional electric wheelchair. Because the omnidirectional electric wheelchair moves more flexibly than a general wheelchair, users tended to move along the notice board. Furthermore, we investigated the characteristics of the user's nostril positions when turning the face toward the target notice, and applied them to the control of the omnidirectional electric wheelchair. Consequently, with the developed system, the user could automatically move to a position in front of the notice being gazed at.
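One simple way to turn nostril positions into a wheelchair command, sketched here as an assumption rather than the paper's actual control law, is to take the midpoint of the two detected nostrils and treat its horizontal offset from the image center as a proxy for where the face is turned. The function name, coordinates, gain, and dead zone below are all hypothetical.

```python
def wheelchair_yaw_command(left_nostril, right_nostril, image_width=640,
                           gain=0.1, dead_zone=10.0):
    """Turn-rate command from the nostril midpoint's horizontal offset.

    left_nostril, right_nostril: (x, y) pixel coordinates.
    Positive return value -> rotate toward the user's right;
    zero inside the dead zone (face roughly frontal).
    """
    mid_x = (left_nostril[0] + right_nostril[0]) / 2.0
    offset = mid_x - image_width / 2.0
    if abs(offset) < dead_zone:
        return 0.0
    return gain * offset

# Face turned toward a notice on the right: nostrils shifted rightward.
cmd = wheelchair_yaw_command((370, 240), (410, 240))
# Face frontal: midpoint at the image center, no rotation commanded.
cmd_frontal = wheelchair_yaw_command((300, 240), (340, 240))
```

A proportional command with a dead zone like this would let the chair rotate toward the gazed notice while ignoring small involuntary head motions.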