Smartphone App Under Development for Diagnosing Citrus Leaf Symptoms

By Arnold Schumann, Perseverança Mungofa, Laura Waldo and Chris Oswalt

Figure 1. Automatic identification of leaf symptoms by machine vision uses deep learning artificial neural networks.

Since the first Apple iPhone was launched in 2007, smartphone cameras have improved dramatically and their graphics-processing capabilities have grown increasingly powerful. In recent years, these powerful hand-held computers have also made an impact on agriculture, where they are used for communications, mapping, navigation, information retrieval and diagnostic services.

The recent integration of deep-learning artificial neural networks with image-processing and machine-vision technologies has made it possible for trained neural networks running in smartphone apps to identify a wide variety of objects visible to the on-board cameras. Artificial neural networks can be trained to recognize virtually anything the human eye can perceive in images, including objects that are partially obscured.

In principle, an artificial neural network is a massive collection of interconnected, three-dimensional regression models on a computer, designed to mimic the connected architecture of neurons in a brain. A July 2018 Citrus Industry magazine article, “Artificial intelligence for detecting citrus pests, diseases and disorders” (https://citrusindustry.net/2018/07/02/artificial-intelligence-detecting-citrus-pests-diseases-disorders), introduced the technologies behind these smartphone apps.
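
To make that description concrete, here is a minimal sketch in Python with NumPy (for illustration only; it is not the network used in the app) of how a tiny feed-forward network chains simple regression-like layers, each multiplying its inputs by learned weights, adding a bias and applying a nonlinearity before passing the result to the next layer.

```python
import numpy as np

def relu(x):
    # Nonlinearity between layers; without it the stacked layers
    # would collapse into a single linear regression.
    return np.maximum(0.0, x)

def forward(x, layers):
    # Each layer is a (weights, bias) pair: a small regression model
    # whose outputs feed the next layer, mimicking connected neurons.
    for weights, bias in layers:
        x = relu(x @ weights + bias)
    return x

rng = np.random.default_rng(0)
# Toy network: 4 input features -> 8 hidden units -> 3 output scores.
layers = [(rng.normal(size=(4, 8)), np.zeros(8)),
          (rng.normal(size=(8, 3)), np.zeros(3))]
print(forward(rng.normal(size=(1, 4)), layers))
```

A real image classifier stacks many more layers specialized for two-dimensional image data (convolutions), and its weights are set by training on labeled examples rather than chosen at random.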

IDENTIFICATION MADE EASY
Visual identification of nutrient deficiencies in foliage is an important diagnostic tool for fine-tuning nutrient management of citrus, especially in the huanglongbing (HLB) disease era. Disease and pest symptoms on leaves may cause chlorotic patterns that can be confused with nutrient deficiency symptoms. An expertly trained person can distinguish and correctly identify most of the common leaf symptoms seen in Florida citrus, but it can take years to build sufficient experience and confidence.

Due to the abundance of new computer technology in the artificial intelligence realm, it is now possible to package a trained artificial neural network model in a standard smartphone app. The app, operated by an untrained person, can automatically recognize leaf symptoms from video or photos taken with the smartphone camera.

In this article, we describe the development of a smartphone app at the University of Florida Institute of Food and Agricultural Sciences (UF/IFAS) Citrus Research and Education Center (CREC) that uses the front camera to view, detect and diagnose symptoms of nutrient deficiencies, pests and diseases of citrus leaves. The goal of the app is to help growers, homeowners and Extension agents obtain expert diagnoses of leaf symptoms in groves without undergoing special training.

WHAT CAN BE RECOGNIZED
During development of the app, 130 citrus leaves were collected in groves for each of nine recognizable symptoms. The leaves were photographed with a smartphone camera at 8-megapixel resolution. The images were tagged with their respective symptom identities and used to train a convolutional artificial neural network on a powerful computer server. The training process, during which the neural network “learns” the attributes or features of the various symptoms, is computationally intensive and typically takes hours to complete. Fortunately, the training needs to be conducted only once on the server. Thereafter, the trained network can be deployed to multiple smaller devices like smartphones, where it is used to analyze new images and rapidly identify the symptoms for which it has been trained (Figure 1).
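
The article does not specify which software framework or network architecture is used at CREC, so the sketch below is only a generic illustration of the “train once on a server, deploy to many phones” workflow. It assumes TensorFlow/Keras and a hypothetical folder of leaf photos organized into one sub-folder per symptom class.

```python
import tensorflow as tf

# Assumption: photos are sorted into one folder per class, e.g.
# citrus_leaf_images/citrus_canker/, citrus_leaf_images/hlb/, ...
train_ds = tf.keras.utils.image_dataset_from_directory(
    "citrus_leaf_images", image_size=(224, 224), batch_size=32)
num_classes = len(train_ds.class_names)  # the nine classes described above

# Transfer learning from a small ImageNet backbone keeps the model light
# enough for a phone; the architecture choice here is illustrative only.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0),  # pixels to [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# The expensive step: run once on a server, typically for hours.
model.fit(train_ds, epochs=20)

# Package the trained network in a compact format for smartphone apps.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("citrus_leaf_classifier.tflite", "wb") as f:
    f.write(converter.convert())
```

On an iPhone, the equivalent packaging step would typically produce a Core ML model instead; either way, the heavy training happens only once.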

The citrus leaf symptoms that can be recognized by the smartphone app include citrus canker, HLB and greasy spot diseases; two-spotted spider mites; and deficiencies of magnesium, iron, manganese and zinc. Leaves with no symptoms are identified as healthy.

Our tests with the app on an iPhone show that machine vision can diagnose leaf symptoms in less than one second.
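
Inside the app, that classification runs through a native mobile runtime (for example, Core ML on an iPhone). The Python sketch below uses the TensorFlow Lite interpreter on a desktop only to illustrate the same sub-second step, reusing the hypothetical converted model and class folders from the previous sketch.

```python
import time

import numpy as np
import tensorflow as tf

# Hypothetical class labels matching the training folders in the sketch above.
CLASS_NAMES = ["citrus_canker", "greasy_spot", "healthy", "hlb",
               "iron_deficiency", "magnesium_deficiency",
               "manganese_deficiency", "spider_mites", "zinc_deficiency"]

interpreter = tf.lite.Interpreter(model_path="citrus_leaf_classifier.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# A leaf photo resized to the model's 224x224 input (random pixels here).
image = (np.random.rand(1, 224, 224, 3) * 255.0).astype(np.float32)

start = time.perf_counter()
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])[0]
elapsed_ms = (time.perf_counter() - start) * 1000.0

best = int(np.argmax(scores))
print(f"{CLASS_NAMES[best]} ({scores[best]:.0%}) in {elapsed_ms:.0f} ms")
```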

ACCURACY LEVELS
Spider mite, greasy spot and canker damage can be identified by the symptoms visible on the abaxial (lower) surfaces of leaves. In a separate validation study that compared the network's symptom identifications with those of human experts, the trained neural network correctly identified leaf symptoms with an average accuracy of 89 percent, ranging from 63 percent for zinc deficiency to 100 percent for greasy spot disease, citrus canker and asymptomatic healthy leaves.

Symptoms of spider mite damage and deficiencies of manganese, iron and magnesium were identified with more than 91 percent accuracy. Identifying HLB symptoms with the trained network was less accurate (82 percent), which is attributable to the complex blotchy mottle leaf symptoms that are characteristic of the disease.
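
Those per-class figures come down to simple bookkeeping: for every validation leaf, compare the network's label with the expert's label and tally agreements class by class. The sketch below shows that calculation with made-up labels, not the study's data.

```python
from collections import defaultdict

def per_class_accuracy(expert_labels, predicted_labels):
    # Count, for each true class, how often the network agreed
    # with the human expert.
    correct = defaultdict(int)
    total = defaultdict(int)
    for truth, guess in zip(expert_labels, predicted_labels):
        total[truth] += 1
        correct[truth] += int(truth == guess)
    return {c: correct[c] / total[c] for c in total}

# Made-up example labels, not the validation data from the study.
expert = ["zinc", "zinc", "greasy_spot", "hlb", "hlb", "healthy"]
network = ["manganese", "zinc", "greasy_spot", "hlb", "healthy", "healthy"]
print(per_class_accuracy(expert, network))
```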

NEXT STEPS
The smartphone app will be thoroughly field-tested, and additional symptom classes will be added before deploying it to the Apple and Google app stores. It will then be available for free download as a new scouting tool for diagnosing citrus leaf symptoms in Florida. Additional leaf symptoms being added include phytophthora and citrus scab as well as nitrogen and potassium deficiencies.

Figure 2. Citrus leaf symptoms are rapidly identified with the smartphone app. Symptoms are highlighted on the image with a rectangular outline and label.

Since accuracy is the highest priority, users will be encouraged to maximize performance of the app by placing individual leaves on white paper in a well-illuminated environment (a room or car), as shown in Figure 2. A brightly lit indoor or outdoor environment is ideal for obtaining a high-quality image for analysis.

The encouraging 89 percent accuracy achieved by the prototype app will be improved upon with more data and training. Additional enhancements to the app will include links from the identified symptoms to UF/IFAS online Electronic Data Information Source (EDIS) Extension documents. The EDIS documents are written by experts in their research fields and contain many more details about the symptoms, along with remedies for nutrient deficiencies and guidance on pest and disease management.

Smartphone apps will not replace conventional leaf nutrient or disease testing in laboratories for the foreseeable future. Instead, they will increase the availability of expert, in-field advice — in the palm of any user’s hand.

Acknowledgements: This article is based on work supported by the HLB Multi-Agency Coordination System. See the project page at www.makecitrusgreatagain.com/SmartphoneApp.htm.

Arnold Schumann (schumaw@ufl.edu) is a professor, Laura Waldo (ljwaldo@ufl.edu) is a senior biological scientist, and Perseverança Mungofa (pmungofa@ufl.edu) is a graduate student — all at the UF/IFAS CREC in Lake Alfred. Chris Oswalt (wcoswalt@ufl.edu) is a UF/IFAS citrus Extension agent in Bartow.