(2) Pedro Dinis Gaspar (University of Beira Interior, Portugal; C-MAST Center for Mechanical and Aerospace Science and Technologies, Portugal)
(3) Vasco Nuno da Gama de Jesus Soares (Polytechnic Institute of Castelo Branco, Portugal; Instituto de Telecomunicações, Portugal)
(4) * João Manuel Leitão Pires Caldeira (Polytechnic Institute of Castelo Branco, Portugal; Instituto de Telecomunicações, Portugal)
* Corresponding author
Abstract: Wild flowers and plants play an important role in protecting biodiversity and providing various ecosystem services. However, some of them are endangered or threatened and warrant preservation and protection. This study represents a first step toward developing a computer vision system and a supporting mobile app for detecting and monitoring the development stages of wild flowers and plants, aiming to contribute to their preservation. It first introduces the related concepts. It then surveys related work and categorizes existing solutions, presenting their key features, strengths, and limitations. The most promising solutions and techniques are identified, and insights on open issues and research directions in the topic are provided. This paper paves the way for wider adoption of recent computer vision techniques in this field and for the proposal of a mobile application that uses YOLO convolutional neural networks to detect the development stages of wild flowers and plants.
Keywords: Wild flowers; Development stages; Computer vision; Machine learning; Deep learning
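The abstract above proposes using YOLO convolutional neural networks to detect the development stages of wild flowers and plants. As a rough illustration only, the sketch below shows how such a detector might be fine-tuned and queried with the Ultralytics YOLO Python package; the package choice, the pretrained weights file yolov8n.pt, the dataset config wildflower_stages.yaml and its stage classes, and the image field_photo.jpg are assumptions for illustration, not details taken from the published work.

```python
# Minimal sketch only: fine-tuning and querying a YOLO detector for wild
# flower development stages. Assumes the Ultralytics YOLO Python package
# ("pip install ultralytics"); the weights file, the dataset config
# "wildflower_stages.yaml" (with hypothetical classes such as bud,
# flowering, wilting) and the test image are placeholders.
from ultralytics import YOLO

# Start from generic pretrained weights and fine-tune on the (hypothetical)
# wild flower dataset described in wildflower_stages.yaml.
model = YOLO("yolov8n.pt")
model.train(data="wildflower_stages.yaml", epochs=100, imgsz=640)

# Run detection on a field photograph and report one line per detected plant:
# predicted development stage, confidence, and bounding box coordinates.
results = model("field_photo.jpg")
for box in results[0].boxes:
    stage = results[0].names[int(box.cls)]
    x1, y1, x2, y2 = box.xyxy[0].tolist()
    print(f"{stage} ({float(box.conf):.2f}) at ({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f})")
```

For the envisaged mobile app, such a model would typically be exported to a mobile-friendly format and run on-device or behind a lightweight API, but that deployment path is not detailed in the abstract.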
DOI: https://doi.org/10.26555/ijain.v9i3.1012
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.