Unmanned aerial vehicles (UAVs) have been applied to agriculture in various ways, and they are revolutionizing the field by providing farmers with valuable insights into their crops through image analysis. UAVs can detect early signs of plant diseases (Kouadio et al. 2023), identify nutrient deficiencies (Uktoro et al. 2024), and monitor soil moisture levels (Ge et al. 2021), aiding in disease prevention (Neupane et al. 2021) and efficient irrigation (Wenting et al. 2020). Additionally, UAVs can estimate crop yield by counting plants and assessing crop health (Hassan et al. 2022), while also identifying and mapping weeds for targeted control (Hunter et al. 2020). By creating accurate field boundary maps and digital elevation models, UAVs help farmers understand their land and optimize agricultural practices (Ajayi et al. 2017). These data enable precision agriculture, such as variable-rate application of inputs and site-specific management, leading to improved crop yields and reduced resource waste.
These successes were the result of collaboration between multiple engineering techniques and various agricultural disciplines. Such collaborative studies work well when all parties communicate effectively and each domain understands its role and expertise. If any team lacks clarity about its tasks, the entire project may fail from the outset. Unfortunately, this type of failure tends to occur in emerging fields, such as UAV-related research, particularly among those who are just starting out. In the worst case, when policymakers or research planners are overconfident in their knowledge, impractical or unfeasible projects may be developed. Here, we demonstrate what happened when early detection of brown planthopper (Nilaparvata lugens) damage using a UAV was pursued in a rice field without knowledge of the symptom pattern.
This study was carried out in a commercial rice field located in Sagokri, Haimyeon, Goseong-gun, Gyeongsangnam-do (Fig. 1). The field covers an area of 3,248.5 m², with a slope varying between 7% and 15%, and has loam soil (Heuktoram, https://soil.rda.go.kr). The experimental site experiences an average annual temperature of 12°C, an annual temperature range of 25°C, and an annual precipitation of 1,300 mm.
Beginning in mid-August 2024, five surveys were conducted to monitor brown planthopper emergence in Sagokri, a region where the pest had been prevalent in 2023. Rice was cultivated across 126 fields in the area, and a significant increase in the brown planthopper population was noted on August 26 in the field marked in green (Fig. 1). In particular, the location marked in red in Fig. 1 was identified as the critical site of brown planthopper emergence within this field. As a result, the data collected by the UAV over the red-marked area were analyzed to detect any notable differences compared with other regions within the target field.
The DJI P4 multispectral UAV was used for the experiment in the rice field (Fig. 2). This UAV has a take-off weight of 1,487 g and an average flight time of 30 minutes. It is equipped with an imaging system that includes six 1/2.9-inch CMOS sensors, allowing it to capture both RGB images through a visible camera and spectral data simultaneously through a multispectral imaging system covering five spectral bands. The ranges of these spectral bands are: Blue (B): 450 nm±16 nm; Green (G): 560 nm±16 nm; Red (R): 650 nm±16 nm; Red Edge (RE): 730 nm±16 nm; and Near-Infrared (NIR): 840 nm±26 nm.
The UAV mission was conducted using the GS Pro app (DJI, China) at an altitude of 20 m, with 80% overlap and sidelap to ensure complete coverage of the target field. Waypoints were created for the mission, and location information was tagged at each waypoint using camera triggers. The DJI P4 multispectral system, equipped with a real-time kinematic global positioning system (RTK-GPS), provided positional information with an accuracy margin of less than 5 cm, enabling precise flight along the designated waypoints. The flights were carried out between 11:00 am and 12:00 pm on September 2, 2024. Immediately after the flight, an image of the calibrated reflectance panel (CRP, MicaSense, USA) was captured to perform radiometric calibration, converting the sensor's digital numbers into reflectance values (Fig. 3).
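As a rough illustration of how these mission parameters translate into ground coverage, the following Python sketch estimates the ground sampling distance (GSD) and image spacing from the flight altitude and overlap; the focal length, pixel pitch, and image size used here are nominal P4 Multispectral figures assumed for illustration, not values measured or reported in this study.

```python
# Rough mission-geometry check (illustrative only; not the GS Pro planning itself).
# Sensor values are nominal/approximate P4 Multispectral figures and are assumptions.
FLIGHT_ALT_M = 20.0               # flight altitude used in the mission
OVERLAP = 0.80                    # forward overlap and sidelap
FOCAL_MM = 5.74                   # assumed nominal focal length
PIXEL_PITCH_UM = 3.1              # assumed pixel pitch of the 1/2.9-inch CMOS
IMG_W_PX, IMG_H_PX = 1600, 1300   # assumed per-band image size in pixels

def gsd_cm_per_px(alt_m: float, focal_mm: float, pitch_um: float) -> float:
    """Ground sampling distance: centimetres of ground covered by one pixel."""
    return alt_m * (pitch_um * 1e-6) / (focal_mm * 1e-3) * 100.0

gsd = gsd_cm_per_px(FLIGHT_ALT_M, FOCAL_MM, PIXEL_PITCH_UM)
footprint_w_m = gsd / 100.0 * IMG_W_PX               # image footprint across track
footprint_h_m = gsd / 100.0 * IMG_H_PX               # image footprint along track
trigger_spacing_m = footprint_h_m * (1.0 - OVERLAP)  # distance between camera triggers
line_spacing_m = footprint_w_m * (1.0 - OVERLAP)     # spacing between flight lines

print(f"GSD ~ {gsd:.2f} cm/px; footprint ~ {footprint_w_m:.1f} m x {footprint_h_m:.1f} m")
print(f"trigger spacing ~ {trigger_spacing_m:.1f} m; flight-line spacing ~ {line_spacing_m:.1f} m")
```

Under these assumed sensor values, the 20 m flight altitude corresponds to a GSD of roughly 1 cm per pixel, which is why the 80% overlap results in closely spaced camera triggers and flight lines.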
The spectral response data of the UAV's multispectral sensor, together with the CRP reflectance data provided by the manufacturer, were used to calculate the mean reflectance value of the CRP for each band of the multispectral sensor, as described in Equation (2) (Table 1). The overlapping images captured by the UAV were processed into reflectance maps for each spectral band using Pix4Dmapper Pro 3.0.17 (Pix4D SA, Lausanne, Switzerland). During processing, the “Radiometric Processing and Calibration” option was set to “Camera and Sun Irradiance” for correction. The CRP area was manually identified in the CRP image captured on the ground immediately after the flight, and the average reflectance value for this area was recorded following Table 1. This radiometric calibration accounted for changes in irradiance during the UAV flight and converted the multispectral digital numbers into accurate reflectance values (Wang 2021). The reflectance maps generated for each band were then used to evaluate the potential for early detection of brown planthopper infestations.
$$\bar{\rho}_{CRP,k} = \frac{\int \rho_{CRP}(\lambda)\, S_k(\lambda)\, d\lambda}{\int S_k(\lambda)\, d\lambda} \qquad (2)$$

where $\bar{\rho}_{CRP,k}$ is the calculated mean reflectance value of the CRP, $\rho_{CRP}(\lambda)$ denotes the standard reflectance spectrum of the CRP provided by the manufacturer, $S_k(\lambda)$ represents the spectral response of the image sensor for band $k$, and $k$ corresponds to one of the bands: Blue (B), Green (G), Red (R), Red Edge (RE), or Near-Infrared (NIR).
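In this study the conversion itself was handled inside Pix4Dmapper, but the idea behind the CRP-based calibration can be sketched as follows, assuming a manually digitized panel region and placeholder values: a per-band gain is obtained by dividing the band-averaged CRP reflectance (Table 1) by the mean digital number observed over the panel, and that gain rescales the band image into reflectance.

```python
import numpy as np

# Conceptual sketch of CRP-based radiometric calibration; in this study the actual
# correction was performed inside Pix4Dmapper ("Camera and Sun Irradiance").
# The panel reflectance and digital numbers below are placeholders, not measured values.

def crp_gain(panel_dn, panel_reflectance):
    """Per-band gain converting digital numbers (DN) to reflectance using the CRP.

    panel_dn          : 2-D array of DNs over the manually identified CRP area.
    panel_reflectance : band-averaged CRP reflectance for this band (Table 1, Equation 2).
    """
    return panel_reflectance / float(np.mean(panel_dn))

def dn_to_reflectance(band_dn, gain):
    """Apply the gain and clip to the physically valid reflectance range [0, 1]."""
    return np.clip(band_dn.astype(np.float32) * gain, 0.0, 1.0)

# Synthetic stand-ins for a band image and the cropped CRP region:
rng = np.random.default_rng(0)
band_dn = rng.integers(2_000, 30_000, size=(100, 100))   # fake NIR digital numbers
panel_dn = rng.integers(24_000, 26_000, size=(20, 20))   # fake DNs over the CRP
nir_reflectance = dn_to_reflectance(band_dn, crp_gain(panel_dn, panel_reflectance=0.49))
```

This is only a conceptual illustration; the software additionally uses the sun-irradiance sensor data, which is not reproduced here.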
In this study, the normalized difference vegetation index (NDVI) was employed to detect brown planthopper infestations. Previous research has demonstrated that vegetation indices based on NIR band spectral data captured by UAVs often show lower values in crops affected by pests compared to healthy crops. For instance, Choosumrong et al. (2023) detected insect damage in banana trees using UAV-based vegetation indices, noting that affected trees had lower values in several NIR-based indices, including NDVI. Similarly, Santos et al. (2022) analyzed coffee plant infections caused by the leaf miner pest (Leucoptera coffeella) with UAV-acquired multispectral data, finding reduced NDVI values in infected coffee plants. Narmilan et al. (2022) also used NDVI to detect white leaf disease in sugarcane. Building on these studies, this research aimed to test the hypothesis that NDVI values from UAV data could identify damage from brown planthoppers in rice. If this hypothesis is correct, the multispectral observations should indicate lower NDVI values in the red-marked areas of the target field, as illustrated in Fig. 1.
The NDVI map for the target field was generated from the reflectance maps of the NIR and RED bands, following the calculation outlined in Equation (1). Although effective weed management kept the field weed-free, the NDVI map still included shadows and bare soil that required removal. To address this, an Excess Green (ExG) image was used to differentiate crops from soil and shadows, based on the threshold value of 0.045 described by Li et al. (2018), as given in Equation (3). This ExG-based filtering method was applied in this study (Fig. 4). Crop regions identified by the ExG filter were assigned a value of 1, while soil and shadow regions were marked as NaN. The NDVI image was then multiplied by this filtered image to exclude non-crop areas, resulting in an NDVI map that highlighted only the crop regions, as shown in Fig. 4(d). For data extraction, random points were generated at 0.6 m intervals across the target field, totaling 10,000 points. Given the field area of approximately 3,000 m², this point density corresponds to an average spacing of about 0.6 m between points. A buffer of 0.3 m was applied around each random point to define the Region of Interest (RoI) for data extraction.
The NDVI and ExG were calculated as follows:

$$NDVI = \frac{NIR - RED}{NIR + RED} \qquad (1)$$

where $RED$ and $NIR$ represent the reflectance of the red and near-infrared bands, respectively.

$$ExG = 2 \times GREEN - RED - BLUE \qquad (3)$$

where $GREEN$, $BLUE$, and $RED$ represent the reflectance of the green, blue, and red bands, respectively.
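Translating Equations (1) and (3) directly, a minimal NumPy sketch is given below; the arrays are assumed to hold calibrated reflectance values, and the helper names are ours rather than from the processing software used in the study.

```python
import numpy as np

def ndvi(nir, red):
    """Equation (1): NDVI = (NIR - RED) / (NIR + RED), guarding against division by zero."""
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / np.maximum(nir + red, 1e-6)

def excess_green(green, red, blue):
    """Equation (3): ExG = 2*GREEN - RED - BLUE, computed on band reflectance."""
    return (2.0 * green.astype(np.float32)
            - red.astype(np.float32)
            - blue.astype(np.float32))

# Tiny usage example with synthetic reflectance values:
nir_r, red_r = np.array([0.45, 0.40]), np.array([0.05, 0.08])
green_r, blue_r = np.array([0.10, 0.12]), np.array([0.04, 0.05])
print(ndvi(nir_r, red_r))                    # high values expected for healthy vegetation
print(excess_green(green_r, red_r, blue_r))  # values > 0.045 would be classified as crop
```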
The steps for calculating the vegetation index and generating the RoI were carried out using QGIS software. Shadow and soil removal were managed with Python's OpenCV and rasterio modules. The filtered NDVI map was then used to assess whether regions with relatively low NDVI values aligned with the affected areas.
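The snippet below is a condensed sketch of that soil and shadow removal step, assuming single-band reflectance GeoTIFFs exported from Pix4Dmapper (the file names are placeholders); it applies the 0.045 ExG threshold, marks non-crop pixels as NaN, and writes the crop-only NDVI raster on which the QGIS point-and-buffer extraction would operate. Plain NumPy is used here in place of the OpenCV operations mentioned above.

```python
import numpy as np
import rasterio

# Sketch of the ExG-based soil/shadow removal (Fig. 4); file names are placeholders.
EXG_THRESHOLD = 0.045  # threshold reported by Li et al. (2018)

def read_band(path):
    """Read a single-band reflectance GeoTIFF exported from Pix4Dmapper."""
    with rasterio.open(path) as src:
        return src.read(1).astype(np.float32), src.profile

blue, profile = read_band("blue_reflectance.tif")
green, _ = read_band("green_reflectance.tif")
red, _ = read_band("red_reflectance.tif")
nir, _ = read_band("nir_reflectance.tif")

exg = 2.0 * green - red - blue                    # Equation (3)
ndvi = (nir - red) / np.maximum(nir + red, 1e-6)  # Equation (1)

# Crop pixels keep a weight of 1; soil and shadow pixels become NaN.
crop_mask = np.where(exg > EXG_THRESHOLD, 1.0, np.nan)
ndvi_crop_only = ndvi * crop_mask

# Write the crop-only NDVI map used for the point/buffer extraction in QGIS.
profile.update(dtype="float32", count=1, nodata=np.nan)
with rasterio.open("ndvi_crop_only.tif", "w", **profile) as dst:
    dst.write(ndvi_crop_only.astype(np.float32), 1)
```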
The data extracted from the filtered NDVI map were categorized into eight classes to create a choropleth map, and the resulting brown planthopper detection attempt, based on the approach of previous studies, is shown in Fig. 5. Contrary to expectations, although brown planthopper proliferation was detected in the northern region (Fig. 5a), the NDVI values in this area did not appear significantly lower than those in other regions (Fig. 5b). The histogram of NDVI values in Fig. 5 shows that the affected area had values closer to the mean rather than at the lower end of the distribution. In addition, the southern region exhibited relatively low NDVI values. As a result, the initial hypothesis appears to be rejected.
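To make this comparison concrete, the sketch below shows one way the crop-only NDVI values could be binned into eight equal-interval classes for a choropleth map and how values inside the red-marked area compare with the whole-field distribution; the arrays here are synthetic stand-ins, not the study's data.

```python
import numpy as np

# Hypothetical sketch: bin crop-only NDVI values into 8 classes (choropleth map) and
# compare the red-marked (affected) area against the whole-field distribution.
rng = np.random.default_rng(1)
ndvi_points = rng.normal(0.75, 0.05, size=10_000)  # stand-in for the 10,000 RoI means
affected = rng.random(10_000) < 0.05               # stand-in mask for the red-marked area

valid = ndvi_points[~np.isnan(ndvi_points)]
edges = np.linspace(valid.min(), valid.max(), 9)   # 9 edges -> 8 equal-interval classes
classes = np.digitize(ndvi_points, edges[1:-1])    # class label 0..7 for each point

field_mean = np.nanmean(valid)
affected_mean = np.nanmean(ndvi_points[affected])
print(f"field mean NDVI: {field_mean:.3f}; affected-area mean NDVI: {affected_mean:.3f}")
print("affected-area class counts:", np.bincount(classes[affected], minlength=8))
```

In the study, the affected-area values fell near the field mean rather than in the lowest classes, which is the pattern visible in the histogram of Fig. 5.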
Previous studies have demonstrated the effectiveness of UAVs in detecting pest damage in fields, primarily focusing on leaf-based indicators (Duarte et al. 2022, Hentz et al. 2018, Hunt et al. 2017, Tao et al. 2022, Tetila et al. 2020). While those studies effectively utilized UAVs to detect pest damage based on leaf characteristics, our current study highlights a potential limitation. Despite capturing leaf images from a top-down perspective, similar to prior work, we were unable to identify damage symptoms using the UAV-mounted cameras. This suggests that the visual differences between healthy and infested leaves may be subtle or obscured from this vantage point. Observing the symptom progression depicted in Fig. 6, we noticed a temporal discrepancy between stem and leaf discoloration. This indicates that the damage may manifest differently in different plant parts, potentially hindering early detection by UAV if symptoms on stems are more indicative of infestation. Further investigation is required to confirm this hypothesis and to explore alternative detection strategies.
The challenges encountered in our study emphasize the critical role of biological knowledge in experimental design. A solid understanding of the target traits is fundamental for selecting appropriate detection methods and ensuring the validity of research findings. Neglecting this crucial step can undermine the entire project. To avoid such pitfalls, project planners, investigators, and scientists must meticulously examine their objectives and validate the feasibility of their targets before proceeding with experiments.
This research was carried out with the support of the “Cooperative Research Program for Agriculture Science & Technology Development (Development of an approach to analyze timing and locality of migratory pests occurrence for rice using information derived from an automated surveillance system: RS-2021-RD009729)” of the Rural Development Administration, Republic of Korea. This work was also supported by the BK21 FOUR Global Smart Farm Educational Research Center, Seoul National University, Seoul, Korea, and the Advanced Institute of Convergence Technology, Suwon 16229, Republic of Korea.