We used the Convolutional Pose Machines (CPM) model, designed to estimate joint positions in the human body, to estimate lower limb and pelvis key points in cattle, and examined whether the distances and angles between key points could be used to predict calving difficulty. Skeletal key points can be acquired either by manual annotation or by CPM inference. Annotated points are taken to be correct; inferred points are estimated from images and therefore contain errors. We compared the accuracy of machine-learning classification of calving difficulty using annotated versus inferred key points, with the distances and angles between key points as explanatory variables: 20 variables for side-view images and 8 for back-view images. Of the five machine-learning models tested (support vector machine, random forest, linear discriminant analysis, logistic regression, and decision tree), the support vector machine trained and tested on annotated key points achieved the highest accuracy: 0.79 for side-view images and 0.77 for back-view images. With inferred key points, however, accuracy fell to 0.52 and 0.53, respectively. Our results show that, given correct skeletal key points, calving difficulty can be predicted accurately.
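The explanatory variables described above are geometric quantities derived from pairs and triples of key points. As a minimal sketch (the key-point names and coordinates below are hypothetical, not taken from the study), the distance and angle features could be computed like this:

```python
import math

def distance(p, q):
    """Euclidean distance between two 2D key points (x, y)."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

def joint_angle(a, b, c):
    """Angle in degrees at key point b, formed by segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    cos = (v1[0] * v2[0] + v1[1] * v2[1]) / (
        math.hypot(*v1) * math.hypot(*v2)
    )
    # Clamp to [-1, 1] to guard against floating-point drift
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

# Hypothetical lower-limb key points from a side-view image (pixels)
hip, stifle, hock = (120.0, 80.0), (150.0, 140.0), (140.0, 200.0)

features = [
    distance(hip, stifle),
    distance(stifle, hock),
    joint_angle(hip, stifle, hock),  # angle at the stifle joint
]
```

A feature vector assembled this way (20 values for side views, 8 for back views) would then be passed to the classifier, e.g. a support vector machine.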
We introduce a new sensing system that uses a time-of-flight (ToF) camera for improved monitoring of canopy growth. The camera is mounted on a tractor together with position sensors to collect canopy growth information without relying on unmanned aerial vehicles (UAVs) or 3D reconstruction. We installed the ToF camera on a tractor and performed sensing in a potato field from June to July 2023, the local pest control period, to test its function during routine field management. The estimated crop volume showed a linear correlation (R² = 0.89) with data from aerial photography taken at a height of 15 m by UAV. This high value indicates that the estimation accuracy of the new system is comparable to that of UAV-based sensing and can serve as a viable alternative. Furthermore, calculating crop volume from the ToF depth data took only 1.64 s per observation. This speed enables sensing to be carried out in tandem with normal tractor operations such as pest control, which are performed at 2–3 km/h, without data loss due to lack of overlap between observations.
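Crop volume from a downward-facing depth sensor is, in principle, the canopy height above ground integrated over the observed area. The sketch below illustrates that idea only; the function name, the flat-ground assumption, and the per-pixel area parameter are assumptions for illustration, not the system's actual pipeline:

```python
def crop_volume(depth_map, ground_depth, pixel_area_m2):
    """Approximate canopy volume (m^3) from a ToF depth map.

    depth_map      : 2D list of camera-to-surface distances (m).
    ground_depth   : camera-to-ground distance (m) with no crop present
                     (assumes flat ground for this sketch).
    pixel_area_m2  : ground area covered by one pixel (m^2).
    """
    volume = 0.0
    for row in depth_map:
        for d in row:
            height = ground_depth - d  # canopy height at this pixel
            if height > 0:             # ignore bare soil / noise below ground
                volume += height * pixel_area_m2
    return volume

# Toy 2x2 depth map: two pixels see the ground (1.0 m), two see canopy
depths = [[1.0, 0.8],
          [0.9, 1.0]]
vol = crop_volume(depths, ground_depth=1.0, pixel_area_m2=0.5)
```

Summing per-pixel columns this way is a single pass over the depth image, which is consistent with the per-observation processing times on the order of seconds reported above.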