Developing applications for remote and proximal sensing technologies in agriculture

UAV with standard RGB camera conducting field survey

Exploring emerging remote and proximal sensing technologies for novel applications in crop protection is a nascent area of research at the James Hutton Institute, with the potential to produce many powerful tools for inclusion in Integrated Pest and Disease Management (IPM) programmes. The rationale underpinning this research is that the properties of the light (and other electromagnetic waves) reflected by plant surfaces are dependent on leaf surface attributes, internal structures and biochemical components, all of which are influenced in distinctive ways by crop genotype as well as nutritional and disease status. At the James Hutton Institute we have several ongoing research projects exploring various remote and proximal sensing technologies for potential applications to IPM.

The continuing rise in the availability of unmanned aerial vehicle (UAV) technology, combined with sensors that can measure crop reflectance at wavelengths beyond the visible spectrum, makes it possible to use automated image analysis to inform crop disease management decision support systems. The research project ‘In-field optical detection of potato diseases (Poptical)’ is investigating whether crop image data captured by UAV-mounted sensors can discriminate between diseases of potato plants. The ultimate goal of ‘Poptical’ is to produce morbidity maps that allow growers to treat disease outbreaks in a targeted, reactive manner, rather than unnecessarily treating healthy plants. The project is funded by Innovate UK (TSB 102105) and partners JHI with James Hutton Limited, Agrii and Manor Fresh.
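To illustrate the kind of per-pixel analysis such UAV reflectance data supports, the following is a minimal sketch of computing the Normalised Difference Vegetation Index (NDVI) from red and near-infrared bands. NDVI is a standard vegetation index, not necessarily the method used in Poptical; the array values and the stress threshold here are illustrative assumptions, not field data.

```python
import numpy as np

def ndvi(red, nir, eps=1e-9):
    """Normalised Difference Vegetation Index, computed per pixel.

    Healthy canopy reflects strongly in the near-infrared and absorbs
    red light, so NDVI falls where plants are stressed or diseased.
    """
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + eps)  # eps avoids division by zero

# Toy 2x2 reflectance rasters: left pixels mimic healthy canopy,
# right pixels mimic stressed canopy (values are illustrative only).
red = np.array([[0.05, 0.30],
                [0.06, 0.28]])
nir = np.array([[0.60, 0.35],
                [0.55, 0.33]])

index = ndvi(red, nir)
stressed = index < 0.4  # threshold chosen purely for illustration
```

In a real workflow the same computation would run over whole orthomosaics, and the resulting stress mask is one plausible input to the morbidity maps described above.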

For further information on Poptical, contact Damian Bienkowski.

The RESAS-funded Work Package 2.1.5, ‘In-Field Detection’, underpins IPM by delivering improved methods for detecting, identifying and quantifying key pests and pathogens in the field in a timely manner, using recent and emerging technologies. Within this research deliverable is the objective ‘Improved application of sensors and satellite imaging into early warning systems’, under which sensor technologies that can be deployed in-field will be evaluated for their ability to detect disease.

Information such as leaf morphology and colour will be extracted from mobile phone imagery and augmented with low-cost hyperspectral add-on equipment that is currently at the working-prototype stage. Image texture and spectroscopy analysis methods will be developed with BioSS to optimise the extraction of data from field measurements. Continued development of this technique will build a large set of baseline data for sensor calibration and modelling approaches, leading to a mobile phone app for field-based visual crop disease assessment to assist land managers with decision-making.

Fourier-transform infrared (FTIR) sensors will be used to detect chemical changes in crop plant species in response to pathogen infection. FTIR spectroscopy provides a unique chemical profile of a sample, allowing the qualitative analysis of a large variety of samples. Quantitative estimates of disease will be made by calibrating spectral data sets against laboratory reference data.

A mobile field phenotyping platform integrating thermal, visual, and near- and short-wave infrared sensors/cameras will be developed to measure changes in canopy temperature and leaf spectral properties. These changes indicate plant stress associated with early disease development under field conditions, allowing the utility of imaging as a tool for high-throughput detection and diagnosis of biotic stress to be evaluated. Field-acquired sensor data (mobile phone camera imagery, visible-wavelength spectroscopy) will be integrated with remote sensing data and existing spatial datasets (topography, climate, soil, land cover) using cloud-based processing, for crop disease monitoring in real time.
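The calibration step described above, relating spectral measurements to laboratory-measured disease severity, can be sketched in minimal form as a linear regression. This is a simplification: the actual chemometric methods developed with BioSS would likely be more sophisticated (e.g. partial least squares), and the synthetic data below stands in for real spectra and laboratory reference values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set standing in for laboratory reference data:
# 40 samples x 6 spectral bands, with severity a hidden linear
# function of two bands plus measurement noise (values illustrative).
n_samples, n_bands = 40, 6
spectra = rng.uniform(0.0, 1.0, size=(n_samples, n_bands))
true_weights = np.array([0.0, 1.5, 0.0, -0.8, 0.0, 0.0])
severity = spectra @ true_weights + 0.2 + rng.normal(0, 0.01, n_samples)

# Calibration: ordinary least squares mapping spectra -> severity,
# with a column of ones appended to fit an intercept term.
X = np.hstack([spectra, np.ones((n_samples, 1))])
coef, *_ = np.linalg.lstsq(X, severity, rcond=None)

# Quantitative estimate of disease for a new field measurement
new_spectrum = rng.uniform(0.0, 1.0, size=n_bands)
predicted = np.append(new_spectrum, 1.0) @ coef
```

Once fitted, such a model turns each new field spectrum into a quantitative severity estimate, which is the sense in which spectral data sets are "calibrated" against laboratory reference data.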

For further information on RESAS Work Package 2.1.5, contact Jennie Brierley.