Brush-shaped Motion Gesture of UGV Using Hand Gesture Recognition

Authors

  • Agus Murdiono Agus Universitas Trunojoyo Madura
  • Muhammad Fuad Universitas Trunojoyo Madura https://orcid.org/0000-0003-4902-9763
  • Hairil Budiarto Universitas Trunojoyo Madura
  • Faikul Umam Universitas Trunojoyo Madura
  • Vivi Tri Widyaningrum Universitas Trunojoyo Madura
  • Achmad Imam Sudianto Universitas Trunojoyo Madura

DOI:

https://doi.org/10.37802/joti.v8i1.1315

Keywords:

Observation, UGV, KNN, Gesture Control, Brush-shaped, Smart farming

Abstract

Manual observation of corn leaf diseases in agricultural fields often faces challenges related to time, effort, and accuracy. To address these challenges, brush-shaped motion patterns, such as zig-zag and boustrophedon trajectories, provide an effective solution by enabling uniform area coverage while reducing redundant traversal, energy consumption, and sensing gaps, making them well suited for precision agriculture applications. Building on this approach, the system uses the MediaPipe framework for hand landmark tracking and the K-Nearest Neighbors (KNN) algorithm to recognize six navigation commands: forward, backward, stop, turn_right, turn_left, and capture. These commands are transmitted via Wi-Fi with an average latency of 0.001964 s. To ensure navigation accuracy during pattern execution, corrections are made using rotary encoders. Gesture classification experiments on 6,000 samples achieved a maximum accuracy of 99.46% across two participants, with stable KNN performance under both indoor and outdoor lighting variations, as well as hand-to-camera distances from 50 cm upward. Furthermore, the capture gesture produced an average image acquisition latency of 0.3037 s at various UGV observation positions. In summary, these results demonstrate that integrating real-time gesture control with UGV maneuvers enables systematic field surveys for maize leaf disease monitoring and supports Sustainable Development Goal (SDG) 2 through precision agriculture technology.
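The recognition pipeline described above (MediaPipe hand landmarks classified by KNN into six commands) can be sketched in miniature as follows. This is an illustrative sketch, not the paper's implementation: MediaPipe's hand landmarker outputs 21 (x, y) landmarks per hand, which are flattened here into a 42-dimensional feature vector, and the training data, cluster values, and k are made-up placeholders.

```python
# Minimal KNN classifier over flattened hand-landmark vectors.
# Training data and k are illustrative placeholders, not the paper's values.
import math
from collections import Counter

# The six UGV navigation commands named in the abstract.
COMMANDS = ["forward", "backward", "stop", "turn_right", "turn_left", "capture"]

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    """Majority vote among the k nearest training samples.

    train: list of (feature_vector, label) pairs
    query: flattened 42-dim landmark vector (21 landmarks x 2 coords)
    """
    nearest = sorted(train, key=lambda s: euclidean(s[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy training set: one synthetic cluster of 42-dim vectors per command.
train = [([c * 0.1] * 42, cmd) for c, cmd in enumerate(COMMANDS) for _ in range(3)]

print(knn_predict(train, [0.31] * 42))  # nearest cluster -> "turn_right"
```

In a real deployment, the query vector would come from a MediaPipe hand-landmarker result per camera frame, and the predicted label would be sent to the UGV over Wi-Fi as the paper describes.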

Downloads

Download data is not yet available.

References

A. Schmitz, C. Badgujar, H. Mansur, D. Flippo, B. McCornack, and A. Sharda, “Design of a Reconfigurable Crop Scouting Vehicle for Row Crop Navigation: A Proof-of-Concept Study,” Sensors, vol. 22, no. 16, p. 6203, Aug. 2022, doi: 10.3390/s22166203.

X. Zhou, X. Yu, Y. Zhang, Y. Luo, and X. Peng, “Trajectory Planning and Tracking Strategy Applied to an Unmanned Ground Vehicle in the Presence of Obstacles,” IEEE Transactions on Automation Science and Engineering, vol. 18, no. 4, pp. 1575–1589, Oct. 2021, doi: 10.1109/TASE.2020.3010887.

A. Edet, S. Inyang, I. Umoren, and U. E. Etuk, “Machine Learning Approach for Classification of Cyber Threats Actors in Web Region,” Journal of Technology and Informatics (JoTI), vol. 6, no. 1, pp. 70–77, Oct. 2024, doi: 10.37802/joti.v6i1.679.

S. F. P. D. Musa and K. H. Basir, “Smart farming: towards a sustainable agri-food system,” British Food Journal, vol. 123, no. 9, pp. 3085–3099, Sep. 2021, doi: 10.1108/BFJ-03-2021-0325.

R. Yadav and V. S. Vala, “Steering Geometry of Remote Control Agricultural Vehicle,” Ergon Int J, vol. 7, no. 2, pp. 1–6, Mar. 2023, doi: 10.23880/eoij-16000303.

J. Kushwaha, A. Gupta, D. K. Vishwakarma, Indrajeet, and D. Srivastav, “RAKSHAK-The Multipurpose Unmanned Ground Vehicle,” International Research Journal on Advanced Engineering Hub (IRJAEH), vol. 2, no. 06, pp. 1816–1820, Jun. 2024, doi: 10.47392/IRJAEH.2024.0249.

M. Oudah, A. Al-Naji, and J. Chahl, “Hand Gesture Recognition Based on Computer Vision: A Review of Techniques,” J Imaging, vol. 6, no. 8, p. 73, Jul. 2020, doi: 10.3390/jimaging6080073.

K. Anam et al., “Electric wheelchair navigation based on hand gestures prediction using the k-Nearest Neighbor method,” Journal of Mechatronics, Electrical Power, and Vehicular Technology, vol. 16, no. 1, pp. 132–143, Jul. 2025, doi: 10.55981/j.mev.2025.1229.

M. Fuad et al., “Towards Controlling Mobile Robot Using Upper Human Body Gesture Based on Convolutional Neural Network,” Journal of Robotics and Control (JRC), vol. 4, no. 6, pp. 856–867, Dec. 2023, doi: 10.18196/jrc.v4i6.20399.

Y. Xu, Y. Zhang, Y. Leng, and Q. Gao, “AIP-Net: An anchor-free instance-level human part detection network,” Neurocomputing, vol. 573, p. 127254, Mar. 2024, doi: 10.1016/j.neucom.2024.127254.

N. C. I. Natun, M. A. Santhia, and Y. R. Kaesmetan, “Identifikasi Pengenalan Wajah Berdasarkan Jenis Kelamin Menggunakan Metode Convolutional Neural Network (CNN)” [Gender-Based Face Recognition Using a Convolutional Neural Network (CNN)], Journal of Technology and Informatics (JoTI), vol. 6, no. 1, pp. 50–57, Oct. 2024, doi: 10.37802/joti.v6i1.694.

R. Wang, T. Chen, P. Yao, S. Liu, I. Rajapakse, and A. O. Hero, “ASK: Adversarial Soft k-Nearest Neighbor Attack and Defense,” IEEE Access, vol. 10, pp. 103074–103088, 2022, doi: 10.1109/ACCESS.2022.3209243.

R. Z. Megantara, P. I. Faot, R. H. Ito, K. F. Annabel, and O. Karnalim, “Pemanfaatan Machine Learning dalam Prediksi Rating: Studi Kasus pada Data Abstrak Publikasi Ilmiah” [Utilizing Machine Learning for Rating Prediction: A Case Study on Scientific Publication Abstract Data], Journal of Technology and Informatics (JoTI), vol. 7, no. 1, pp. 62–75, Apr. 2025, doi: 10.37802/joti.v7i1.999.

Y. Zhang, C. Cao, J. Cheng, and H. Lu, “EgoGesture: A New Dataset and Benchmark for Egocentric Hand Gesture Recognition,” IEEE Trans Multimedia, vol. 20, no. 5, pp. 1038–1050, May 2018, doi: 10.1109/TMM.2018.2808769.

G. Amprimo, G. Masi, G. Pettiti, G. Olmo, L. Priano, and C. Ferraris, “Hand tracking for clinical applications: validation of the Google MediaPipe Hand (GMH) and the depth-enhanced GMH-D frameworks.” [Online]. Available: https://www.ieiit.cnr.it/people/Ferraris-Claudia

F. Zhang et al., “MediaPipe Hands: On-device Real-time Hand Tracking,” Jun. 2020, [Online]. Available: http://arxiv.org/abs/2006.10214

Google AI for Developers, “Hand landmarks detection guide,” [Online]. Available: https://ai.google.dev/edge/mediapipe/solutions/vision/hand_landmarker

S. Zhang, “Challenges in KNN Classification,” IEEE Trans Knowl Data Eng, vol. 34, no. 10, pp. 4663–4675, Oct. 2022, doi: 10.1109/TKDE.2021.3049250.

S. Zhang, X. Li, M. Zong, X. Zhu, and D. Cheng, “Learning k for kNN Classification,” ACM Trans Intell Syst Technol, vol. 8, no. 3, Jan. 2017, doi: 10.1145/2990508.

L. A. Sanjani, R. B. M. Putra, and U. L. Yuhana, “Exploring the Application of Machine Learning for Automatic Inbound Email Classification in CRM System at XYZ Company,” Journal of Technology and Informatics (JoTI), vol. 6, no. 1, pp. 1–7, Oct. 2024, doi: 10.37802/joti.v6i1.715.

K. S. Chia, “Ziegler-Nichols based proportional-integral-derivative controller for a line tracking robot,” Indonesian Journal of Electrical Engineering and Computer Science, vol. 9, no. 1, pp. 221–226, Jan. 2018, doi: 10.11591/ijeecs.v9.i1.pp221-226.
