Using digital plant phenotyping to detect pests and pathogens
Imaging technologies have the potential to become a key method for the early detection of pests and diseases, given the need to identify the first signs of infestation and disease rapidly. A way of scanning the crop quickly and accurately is going to be needed.
Initially this could involve a two-stage process, first using a drone or satellite imagery to detect areas of the field which appear to be under stress, and then deploying a ground-based system to look in greater detail at those areas of the field showing initial signs of stress.
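The two-stage workflow described above can be sketched in a few lines. This is a minimal illustration only: the function names, the stress scores and the threshold are all hypothetical placeholders, not a real API or real field data.

```python
# A minimal sketch of the two-stage workflow: aerial screening first,
# then targeted ground inspection. All names and values are hypothetical.

def detect_stress_zones(aerial_scores, threshold=0.5):
    """Stage 1: coarse screening of drone/satellite imagery for stressed areas."""
    return [zone for zone, score in aerial_scores.items() if score > threshold]

def inspect_zone(zone):
    """Stage 2: a ground-based system looks at the flagged zone in detail."""
    return f"ground survey scheduled for {zone}"

# Toy per-zone stress scores from an aerial pass (synthetic values).
field_scan = {"north-west": 0.2, "south-east": 0.8, "centre": 0.7}
for zone in detect_stress_zones(field_scan):
    print(inspect_zone(zone))
```

Only the zones flagged in the first pass are visited in the second, which is the point of the two-stage design: the expensive close-up imaging is spent where the coarse scan suggests stress.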
If we are talking about pests affecting outdoor crops in the UK, aphids are a big problem. These insects not only cause physical damage to crops, but are also vectors for a number of viruses. Additional pests include the infamous cabbage stem flea beetle (CSFB) and nematodes, which are becoming more of a problem because of the increasingly limited control options available to growers.
In terms of diseases, potato blight is one of the greatest problems for broad-acre crops. Additional major diseases include Septoria and mildew in wheat, plus some of the fusarium diseases, which can leave mycotoxin residues in the grain. These are just some of the main threats the UK agriculture sector currently faces, and new problems are always emerging.
Digital phenotyping methods for identifying pests and diseases range from RGB cameras, used at the lower end of the scale because they are more affordable, to multispectral and hyperspectral cameras. The latter are more advanced than regular cameras and consequently more expensive. However, costs are coming down, which is enabling them to be used increasingly in R&D to identify pre-symptomatic signals and responses to pest damage. RGB cameras still have plenty of potential: in the SlugBot project, for example, in which CHAP’s Digital Phenotyping Lab at Rothamsted Research is working with Small Robot Company and AV&N Lee, they are being used to detect slugs in-field.
Where a camera needs to pick up crop stress signals that may not be visible to the naked eye, a more advanced camera, such as a multispectral option, is required, as it can detect signals outside the visible spectrum. You can even go one step further into hyperspectral technology, which provides even more information.
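To give a concrete sense of what an out-of-visible-spectrum signal looks like, the sketch below computes NDVI, a standard vegetation index built from near-infrared and red reflectance. The index itself is well established, but the reflectance values here are toy numbers chosen for illustration, not real field data or a CHAP method.

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Stressed or diseased canopy typically scores lower than healthy tissue,
    because healthy leaves reflect strongly in the near-infrared."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # small epsilon avoids divide-by-zero

# Toy reflectance values (synthetic, for illustration only).
healthy = ndvi([0.80, 0.75], [0.10, 0.12])
stressed = ndvi([0.40, 0.45], [0.20, 0.18])
print(healthy.mean() > stressed.mean())  # healthy canopy scores higher
```

The red band is visible to an RGB camera, but the near-infrared band is not, which is exactly why a multispectral sensor can flag stress before any symptom is apparent to the eye.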
However, proximity to the crop is currently essential for this type of advanced imaging. For this reason, drones cannot yet be used reliably: they lack the granularity to capture sufficient detail. There is also much more variation in images captured from a drone, as the angle of each leaf can create extensive discrepancies, and more complex software is needed to analyse such data. In the end it comes down to machine learning, training models to identify a particular signal, and that is much easier to do, and more accurate, if the camera is close to the leaf and the leaf is flat.
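The "training a model to identify a particular signal" step can be illustrated with a deliberately simple classifier. The sketch below labels synthetic per-pixel spectra as healthy or stressed by matching each new pixel to the nearest class signature; real systems use far larger models, and every spectrum here is made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data (synthetic, not real field spectra): each sample is a
# reflectance vector over three wavelength bands. Healthy canopy reflects
# more near-infrared (last band); stressed canopy less.
healthy_samples = rng.normal(loc=[0.05, 0.10, 0.80], scale=0.02, size=(50, 3))
stressed_samples = rng.normal(loc=[0.08, 0.20, 0.45], scale=0.02, size=(50, 3))

# "Training" here is just learning the mean spectrum of each class.
# The principle is the same as in larger models: label known examples,
# then assign new pixels to the closest learned signature.
centroids = {
    "healthy": healthy_samples.mean(axis=0),
    "stressed": stressed_samples.mean(axis=0),
}

def classify(pixel):
    pixel = np.asarray(pixel, dtype=float)
    return min(centroids, key=lambda c: np.linalg.norm(pixel - centroids[c]))

print(classify([0.05, 0.11, 0.78]))  # close to the healthy signature
print(classify([0.09, 0.21, 0.44]))  # close to the stressed signature
```

The sketch also shows why leaf angle matters: if the measured spectrum of a tilted leaf drifts away from its true signature, pixels start landing nearer the wrong centroid, which is exactly the variation problem described above for drone imagery.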
One follows on from the other: as camera technology and software develop, it will become easier to use cameras that are not necessarily as close to the crop.
The technology is still in its infancy, but I think we are going to see rapid development as costs come down and the number of companies starting to use these tools increases. Digital phenotyping technologies could soon become mainstream in agriculture, just as GPS has become the norm, especially in the battle to protect higher-value crops.
To find out more about how CHAP can help to take your crop protection to another level, see also Digital Phenotyping Lab.