Lettuce growth stage identification based on phytomorphological variations using coupled color superpixels and multifold watershed transformation

(1) * Ronnie Sabino Concepcion II (De La Salle University, Philippines)
(2) Jonnel Dorado Alejandrino (De La Salle University, Philippines)
(3) Sandy Cruz Lauguico (De La Salle University, Philippines)
(4) Rogelio Ruzcko Tobias (De La Salle University, Philippines)
(5) Edwin Sybingco (De La Salle University, Philippines)
(6) Elmer Pamisa Dadios (De La Salle University, Philippines)
(7) Argel Alejandro Bandala (De La Salle University, Philippines)
*corresponding author


Identifying a plant's developmental growth stages from the seed leaf onward is crucial to a deep understanding of plant science and to cultivation management. An efficient vision-based system for plant growth monitoring entails optimal segmentation and classification algorithms. This study presents coupled color-based superpixels and multifold watershed transformation for segmenting lettuce plants from the complicated background of a smart-farm aquaponic system, and machine learning models for classifying lettuce growth as vegetative, head development, or harvest stage based on the phytomorphological profile. Morphological computation extracted the number of leaves, biomass area and perimeter, convex hull area and perimeter, major and minor axis lengths of the dominant leaf, and length of the plant skeleton. Phytomorphological variations in biomass compactness, convexity, solidity, skeleton length, and perimeter ratio served as inputs to the classification network. Lab color space information extracted from the training image set was overlaid with 1,000 superpixel regions generated by K-means clustering of each pixel class. Six-level watershed transformation with distance transformation and minima imposition was employed to segment the lettuce plant from other pixel objects. The accuracies of correctly classifying the vegetative, head development, and harvest growth stages were 88.89%, 86.67%, and 79.63%, respectively. Test accuracies of the machine learning models were 60% for LDA, 85% for ANN, and 88.33% for QSVM. Comparative analysis showed that QSVM outperformed optimized LDA and ANN in classifying lettuce growth stages. This research developed a seamless model for segmenting vegetation pixels and predicting lettuce growth stage, which is essential for computational plant phenotyping and the optimization of agricultural practices.
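The shape descriptors named in the abstract (compactness, convexity, solidity) can be computed from a binary plant mask via area, perimeter, and convex hull measurements. The sketch below is a minimal NumPy/SciPy illustration under stated assumptions, not the authors' implementation: the `shape_descriptors` helper and the pixel-count perimeter estimate are hypothetical simplifications.

```python
import numpy as np
from scipy.spatial import ConvexHull

def shape_descriptors(mask):
    """Compactness, convexity, and solidity of a binary plant mask (illustrative)."""
    mask = mask.astype(bool)
    area = float(mask.sum())
    ys, xs = np.nonzero(mask)
    hull = ConvexHull(np.column_stack([xs, ys]).astype(float))
    convex_area = hull.volume       # for a 2-D hull, .volume is the enclosed area
    convex_perimeter = hull.area    # ...and .area is the hull perimeter
    # crude perimeter estimate: foreground pixels with a background 4-neighbour
    p = np.pad(mask, 1)
    interior = p[1:-1, :-2] & p[1:-1, 2:] & p[:-2, 1:-1] & p[2:, 1:-1]
    perimeter = float((mask & ~interior).sum())
    return {
        "compactness": 4.0 * np.pi * area / perimeter ** 2,  # 1.0 for an ideal circle
        "convexity": convex_perimeter / perimeter,
        "solidity": area / convex_area,
    }
```

Note that discretization makes the pixel-based measures approximate: a filled square yields solidity slightly above 1 because the hull passes through pixel centers.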
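The color-based superpixel step rests on K-means clustering of pixel vectors in Lab color space. As a minimal sketch of that clustering core (the `kmeans_pixels` helper, deterministic initialization, and fixed iteration count are assumptions for illustration, not the paper's code):

```python
import numpy as np

def kmeans_pixels(pixels, k, iters=10):
    """Cluster flattened L*a*b* pixel vectors (n, 3) into k color classes."""
    # deterministic init: k evenly spaced samples from the pixel list
    centers = pixels[np.linspace(0, len(pixels) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # assign each pixel to its nearest center (Euclidean distance in Lab)
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its assigned pixels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers
```

In practice a superpixel algorithm such as SLIC augments the color vector with spatial coordinates so that the roughly 1,000 resulting regions stay compact in the image plane.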
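The watershed step combines a distance transform with imposed minima so that touching foreground objects split along basin boundaries. A minimal SciPy sketch of that idea, using two synthetic overlapping discs in place of touching leaves (the threshold factor and disc geometry are illustrative assumptions):

```python
import numpy as np
from scipy import ndimage as ndi

# synthetic binary mask: two overlapping discs standing in for touching leaves
yy, xx = np.mgrid[0:80, 0:120]
mask = (((yy - 40) ** 2 + (xx - 35) ** 2) < 28 ** 2) | \
       (((yy - 40) ** 2 + (xx - 85) ** 2) < 28 ** 2)

# distance transform: each interior pixel scores its distance to the background
dist = ndi.distance_transform_edt(mask)

# markers from the strongest distance peaks (minima of the inverted map)
markers, n_markers = ndi.label(dist > 0.7 * dist.max())

# watershed on the inverted distance map grows a basin from each marker
inverted = (255 * (1 - dist / dist.max())).astype(np.uint8)
labels = ndi.watershed_ift(inverted, markers.astype(np.int32))
labels[~mask] = 0  # keep only the foreground basins
```

The two disc centers end up in different basins even though the discs form one connected component in the original mask.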
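A QSVM is a support vector machine with a quadratic (degree-2 polynomial) kernel. The following sketch shows the general setup with scikit-learn on synthetic stand-in data; the five-feature vectors, class means, and hyperparameters are hypothetical, not the paper's trained model or dataset:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# hypothetical 5-feature phytomorphological vectors for the three growth stages
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(40, 5))
               for m in (0.0, 1.5, 3.0)])
y = np.repeat([0, 1, 2], 40)  # 0=vegetative, 1=head development, 2=harvest

# quadratic SVM: polynomial kernel of degree 2
qsvm = SVC(kernel="poly", degree=2, coef0=1.0).fit(X, y)
```

With well-separated clusters like these, the quadratic decision surfaces classify the training set essentially perfectly; the paper's 88.33% test accuracy reflects the much harder real feature distributions.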


Keywords: Computer vision; Lettuce; Machine learning; Morphological; Superpixels












This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Advances in Intelligent Informatics
ISSN 2442-6571  (print) | 2548-3161 (online)
Organized by Informatics Department - Universitas Ahmad Dahlan,  UTM Big Data Centre - Universiti Teknologi Malaysia, and ASCEE Computer Society
Published by Universitas Ahmad Dahlan
W: http://ijain.org
E: ijain@uad.ac.id (paper handling issues)
    info@ijain.org, andri.pranolo.id@ieee.org (publication issues)

