Investigating three proximal remote sensing techniques for vineyard yield monitoring
Abstract
Context and purpose of the study – Yield monitoring can provide winegrowers with information for precise production inputs during the season, thereby ensuring the best possible harvest. Yield estimation is currently achieved through an intensive process that is destructive and time-consuming. Remote sensing, however, offers a group of proximal technologies and techniques for non-destructive and less time-consuming yield monitoring. The objective of this study was to analyse three different approaches for measuring grapevine yield close to harvest. Traditional destructive measurements for yield determination were used as a reference. Each technique was tested under controlled conditions (laboratory) and field conditions (vineyard) at bunch and vine levels.
Material and methods – This study was carried out in a drip-irrigated cv. Shiraz vineyard at the Welgevallen farm (Stellenbosch University, South Africa). The Shiraz block was planted in 2000 with a north-south row orientation (2.7 m x 1.5 m spacing). The vines are spur pruned on a seven-wire vertical shoot positioned (VSP) trellis system. Three proximal remote sensing techniques were analysed for yield monitoring: a) RGB imagery (conventional Red-Green-Blue images), b) infrared depth sensing (Kinect sensor), and c) light detection and ranging (LiDAR). Yield was estimated from bunch volume in three experiments conducted at harvest. Experiment 1 used the Kinect and RGB imagery to estimate bunch volume from a sample of 94 individual bunches under laboratory conditions. Experiment 2 used the Kinect and RGB imagery to estimate the volume of 21 individual bunches in the field. Experiment 3 used the Kinect, RGB imagery, and LiDAR in the field to estimate the total yield per vine for 31 individual vines. Experiments 2 and 3 were undertaken under two canopy treatments: i) full canopy (FC), and ii) leaf removal (LR – 100% leaf removal in the bunch zone, thereby exposing the bunches).
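As a minimal illustration of the kind of volume estimation described above (not the processing pipeline used in the study), the sketch below shows two simplified approximations: a convex-hull volume computed from a Kinect-style 3D point cloud, and an ellipsoid-of-revolution approximation derived from a calibrated RGB bunch silhouette. All function names, parameters, and the synthetic data are hypothetical.

```python
# Hypothetical sketch of bunch volume estimation; assumes a Kinect-style
# point cloud (N x 3 array, metres) and a binary bunch mask from a
# calibrated RGB image. Not the authors' actual pipeline.
import numpy as np
from scipy.spatial import ConvexHull


def volume_from_point_cloud(points_xyz: np.ndarray) -> float:
    """3D approach: convex-hull volume of the bunch point cloud (m^3)."""
    return ConvexHull(points_xyz).volume


def volume_from_rgb_mask(mask: np.ndarray, mm_per_pixel: float) -> float:
    """2D approach: treat the bunch silhouette as the projection of a
    prolate ellipsoid and derive volume from the projected area (mm^3)."""
    area_mm2 = mask.sum() * mm_per_pixel ** 2              # projected area
    height_mm = mask.any(axis=1).sum() * mm_per_pixel      # bunch length
    width_mm = area_mm2 / height_mm                        # mean width
    a, b = height_mm / 2.0, width_mm / 2.0                 # semi-axes
    return 4.0 / 3.0 * np.pi * a * b * b                   # ellipsoid volume


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.normal(scale=0.05, size=(500, 3))          # synthetic "bunch"
    print(f"Hull volume: {volume_from_point_cloud(cloud) * 1e6:.1f} cm^3")
```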
Results – The results obtained in this study show a strong correlation between the volumes calculated from RGB images (2D modelling) and from the Kinect (3D modelling) and the control volumes of the individual bunches (Experiment 1). Experiments 2 and 3 show promising results for the three proximal remote sensing techniques studied, especially in the case of fully exposed bunches (LR treatment). These techniques are therefore feasible as fast, non-destructive alternatives for yield monitoring.
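The correlation reported above can be quantified with a simple linear regression between sensor-estimated and reference (destructive control) volumes, as in the hedged sketch below. The variable names and numbers are purely illustrative and are not data from the study.

```python
# Hypothetical validation sketch: linear fit of sensor-estimated bunch
# volumes against destructive reference volumes. Data are illustrative only.
import numpy as np
from scipy.stats import linregress

reference_cm3 = np.array([310.0, 455.0, 120.0, 260.0, 390.0])  # control volumes
estimated_cm3 = np.array([298.0, 470.0, 135.0, 248.0, 405.0])  # e.g. Kinect 3D model

fit = linregress(reference_cm3, estimated_cm3)
print(f"R^2 = {fit.rvalue ** 2:.3f}, slope = {fit.slope:.2f}, "
      f"intercept = {fit.intercept:.1f} cm^3")
```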
DOI:
Issue: GiESCO 2019
Type: Poster
Authors
1 Department of Geography and Environmental Studies, Stellenbosch University, Private Bag X1, Matieland 7602, South Africa
2 Dipartimento di Scienze AgroAlimentari, Ambientali e Animali, University of Udine, Via delle Scienze 208, Udine, Italy
3 Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, South Africa
Keywords
grapevine, yield monitoring, non-destructive methods, light detection and ranging (LiDAR), infrared depth sensing, conventional Red-Green-Blue (RGB) images