Engineering in Agriculture, Environment and Food
Online ISSN : 1881-8366
ISSN-L : 1881-8366
Vision-based furrow line detection for navigating intelligent worker assistance robot
Yoshinari Morio, Kouki Teramoto, Katsusuke Murakami

2017, Volume 10, Issue 2, Pages 87-103

Abstract
In this study, a vision-based furrow line detection method was developed for navigating an autonomous robot vehicle in an agricultural field. The furrow line detection method integrates a crop/non-crop field identification method, two types of box filters (a color-based furrow detection filter and a grayscale separability-based furrow detection filter), and a robust furrow line parameter estimator. In experiments, the performance of our method was tested on more than 8000 images of 17 types of test fields: nine types of crop fields (sweet pea, green pea, snow pea, lettuce, Chinese cabbage, cabbage, green pepper, tomato, and tea) and eight types of tilled soil fields. With a wide camera angle at a low depression angle, the furrow line detection rate was 98.0%, the root mean square error (RMSE) of the furrow line position was 12.1 pixels, and the RMSE of the furrow line angle was 3.8°. With the oblique camera angle, the detection rate was 93.4%, the RMSE of position was 23.3 pixels, and the RMSE of angle was 6.1°. These results show that our method could approximately detect the furrow line in the test fields from both the wide and oblique camera angles. The average processing speed was approximately 2.5 Hz for the crop fields and 4.0 Hz for the tilled soil fields. Our method demonstrated a high potential to robustly and precisely detect a single targeted furrow line in the 17 types of test fields.
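
The abstract does not reproduce the filter or estimator equations. As an illustration only, the sketch below shows one common way a grayscale separability (Fukui-type) box filter can score candidate furrow columns, together with a simple RANSAC-style line fit as a stand-in for a robust furrow line parameter estimator. All function names, box sizes, and thresholds here are assumptions for the sketch, not the authors' implementation.

```python
import numpy as np

def separability(region_a, region_b):
    # Fukui-type separability: between-class variance divided by total
    # variance of the pooled pixels (0 = indistinguishable, 1 = fully separated).
    a = np.asarray(region_a, dtype=float).ravel()
    b = np.asarray(region_b, dtype=float).ravel()
    pooled = np.concatenate([a, b])
    total_var = pooled.var()
    if total_var == 0.0:
        return 0.0
    m = pooled.mean()
    between_var = (a.size * (a.mean() - m) ** 2
                   + b.size * (b.mean() - m) ** 2) / pooled.size
    return between_var / total_var

def furrow_candidates(gray, box_w=20, box_h=30, step=5):
    # Slide a two-part box filter along horizontal bands of a grayscale image
    # and keep, per band, the column where left/right separability peaks
    # (e.g. sunlit ridge vs. shaded furrow). Returns (row, col) candidates.
    h, w = gray.shape
    points = []
    for top in range(0, h - box_h, box_h):
        best_col, best_score = None, 0.0
        for left in range(0, w - 2 * box_w, step):
            left_box = gray[top:top + box_h, left:left + box_w]
            right_box = gray[top:top + box_h, left + box_w:left + 2 * box_w]
            score = separability(left_box, right_box)
            if score > best_score:
                best_score, best_col = score, left + box_w
        if best_col is not None:
            points.append((top + box_h // 2, best_col))
    return np.array(points)

def robust_line(points, n_iter=200, tol=3.0, rng=np.random.default_rng(0)):
    # RANSAC-style stand-in for a robust line parameter estimator:
    # fit col = a * row + b to the candidate points while ignoring outliers.
    best_inliers = None
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        (r1, c1), (r2, c2) = points[i], points[j]
        if r1 == r2:
            continue
        a = (c2 - c1) / (r2 - r1)
        b = c1 - a * r1
        resid = np.abs(points[:, 1] - (a * points[:, 0] + b))
        inliers = resid < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    rows, cols = points[best_inliers, 0], points[best_inliers, 1]
    return np.polyfit(rows, cols, 1)  # slope a, intercept b
```

In this sketch the detected line is parameterized by a slope and an intercept in image coordinates; position and angle errors such as the RMSE values reported above could then be computed by comparing these parameters against a hand-labeled ground-truth line.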
© 2017 Asian Agricultural and Biological Engineering Association