
Improved Monitoring of Citrus Maturity

Daniel Cooper

The process of building the dataset
(Image credit: Plant Phenomics (2023). DOI: 10.34133/plantphenomics.0057)

A new approach, developed by a team led by researchers at Huazhong Agricultural University, enables more precise monitoring of citrus fruit development and better-informed decisions about optimal harvest timing.

A key to improving citrus fruit quality and post-harvest processes is understanding citrus color change, a critical indicator of fruit maturity that has traditionally been gauged by human judgment. Recent advances in machine vision and neural networks offer more objective and robust color analysis, but these methods struggle under variable imaging conditions and with translating color data into practical maturity assessments.

Research gaps remain in predicting how color will transform over time and in developing user-friendly visualization techniques. In addition, deploying such algorithms on the smartphones typically available in agricultural settings is difficult because of those devices' limited computing power, highlighting the need for optimized, efficient implementations.

In June 2023, Plant Phenomics published a research article titled "Predicting and Visualizing Citrus Colour Transformation Using a Deep Mask-Guided Generative Network." The article was written by seven authors: Zehan Bao, Weifu Li, Jun Chen, Hong Chen, and Yaohui Chen, all with Huazhong Agricultural University in Wuhan, China; Vijay John with the RIKEN Guardian Robot Project in Kyoto, Japan; and Chi Xiao with Hainan University in Haikou, China.

In this study, the researchers developed a novel framework for predicting and visualizing citrus fruit color transformation in orchards, and built an Android application around it. The underlying network takes a citrus image and a specified time interval as input and outputs an image of the fruit's predicted future color.
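
In rough terms, the input/output contract described above looks something like the following sketch. Everything here (the function name, the identity "model" standing in for the trained generator, the placeholder photo) is illustrative, not code released with the paper.

```python
# Hypothetical sketch of the described interface: an orchard photo plus a
# requested time interval goes in, a synthesized future image comes out.
from PIL import Image
import numpy as np


def predict_future_appearance(model, image: np.ndarray, days_ahead: int) -> np.ndarray:
    """Run a generator on one uint8 RGB image of shape (H, W, 3) for a given horizon."""
    batch = image[np.newaxis].astype(np.float32) / 255.0  # normalize to [0, 1]
    predicted = model(batch, days_ahead)                  # assumed shape (1, H, W, 3) in [0, 1]
    return np.rint(predicted[0] * 255.0).astype(np.uint8)


# Usage with a dummy "model" that simply echoes its input:
identity_model = lambda batch, days: batch
photo = np.array(Image.new("RGB", (128, 128), color=(120, 160, 60)))  # placeholder photo
future = predict_future_appearance(identity_model, photo, days_ahead=14)
Image.fromarray(future).save("predicted_in_14_days.png")
```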

The dataset, comprising 107 orange images captured during color transformation, was used to train and validate the network. The framework relies on a deep mask-guided generative network for accurate predictions and has a lightweight design that requires relatively few computing resources, making implementation on mobile devices practical.
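
The "mask-guided" idea can be pictured as a compositing step: a fruit mask confines the generative edit to fruit pixels so the background of the synthesized image matches the input photo. Below is a minimal sketch of that idea with invented shapes and values; it is not the authors' implementation.

```python
# Illustrative mask-guided compositing: only masked (fruit) pixels are replaced.
import numpy as np


def composite_with_mask(original: np.ndarray, generated: np.ndarray,
                        fruit_mask: np.ndarray) -> np.ndarray:
    """Blend generated fruit pixels into the original image.

    original, generated: float arrays of shape (H, W, 3) in [0, 1]
    fruit_mask: float array of shape (H, W), 1.0 on fruit, 0.0 on background
    """
    mask = fruit_mask[..., np.newaxis]          # broadcast over RGB channels
    return mask * generated + (1.0 - mask) * original


# Toy example: shift masked pixels toward orange while leaving the rest untouched.
h, w = 64, 64
original = np.full((h, w, 3), 0.3)
generated = np.zeros_like(original)
generated[..., 0], generated[..., 1] = 0.9, 0.5  # orange-ish hue
mask = np.zeros((h, w))
mask[16:48, 16:48] = 1.0                         # pretend fruit region
output = composite_with_mask(original, generated, mask)
```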

The network also performed well in citrus color prediction and visualization, generating images with little distortion and high fidelity. Its robustness was evident in its ability to reproduce the color transformation accurately, even across different viewing angles and orange colors.
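
As a generic illustration of how "distortion" and "fidelity" between synthesized and real images can be quantified, standard image-similarity metrics such as PSNR and SSIM can be computed. This is only a sketch of such an evaluation, not necessarily the protocol used in the paper.

```python
# Compare a real photo with a synthesized one using PSNR and SSIM (scikit-image).
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity


def fidelity_scores(real: np.ndarray, synthesized: np.ndarray) -> dict:
    """Both inputs: uint8 RGB arrays of shape (H, W, 3)."""
    return {
        "psnr": peak_signal_noise_ratio(real, synthesized, data_range=255),
        "ssim": structural_similarity(real, synthesized, channel_axis=-1, data_range=255),
    }


# Dummy data: a "real" image and a slightly noisy copy standing in for a synthesis.
rng = np.random.default_rng(0)
real = rng.integers(0, 256, size=(128, 128, 3), dtype=np.uint8)
noisy = np.clip(real.astype(int) + rng.integers(-10, 11, real.shape), 0, 255).astype(np.uint8)
print(fidelity_scores(real, noisy))
```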

Additionally, the network’s merged design, incorporating embedding layers, allowed for accurate predictions over various time intervals with a single model, reducing the need for multiple models for different time frames. Sensory panels further validated the network’s effectiveness, with a majority finding high similarity between synthesized and real images.
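
A minimal sketch, assuming a PyTorch-style implementation, of how an embedding of the time interval can condition a single generator on many horizons is shown below. Layer sizes and names are invented for illustration; the published network is considerably more elaborate.

```python
# One generator, many horizons: the requested interval is embedded and merged
# with the image features, so no separate model per time frame is needed.
import torch
import torch.nn as nn


class TimeConditionedGenerator(nn.Module):
    def __init__(self, max_days: int = 60, embed_dim: int = 16):
        super().__init__()
        # One learned vector per possible interval.
        self.time_embed = nn.Embedding(max_days + 1, embed_dim)
        self.encode = nn.Conv2d(3, 32, kernel_size=3, padding=1)
        self.decode = nn.Conv2d(32 + embed_dim, 3, kernel_size=3, padding=1)

    def forward(self, image: torch.Tensor, days_ahead: torch.Tensor) -> torch.Tensor:
        feat = torch.relu(self.encode(image))                    # (B, 32, H, W)
        t = self.time_embed(days_ahead)                          # (B, embed_dim)
        t = t[:, :, None, None].expand(-1, -1, *feat.shape[2:])  # tile over H, W
        return torch.sigmoid(self.decode(torch.cat([feat, t], dim=1)))


gen = TimeConditionedGenerator()
img = torch.rand(1, 3, 128, 128)
# The same weights serve 7-, 14-, and 30-day predictions:
for days in (7, 14, 30):
    out = gen(img, torch.tensor([days]))
    print(days, tuple(out.shape))
```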

In summary, this study’s innovative approach allows for more precise monitoring of fruit development and optimal harvest timing, with potential applications extending to other citrus species and fruit crops. The framework’s adaptability to smartphones makes it highly practical for in-field use, demonstrating the potential of generative models in agriculture and beyond.

Source: TranSpread
