Journal articles on the topic 'Crop row detection'

Consult the top 50 journal articles for your research on the topic 'Crop row detection.'

1

Jiang, Guoquan, Ke Xing, Du Shangfeng, Zhang Man, and Chen Jiao. "Crop Row Detection Based on Machine Vision." Acta Optica Sinica 29, no. 4 (2009): 1015–20. http://dx.doi.org/10.3788/aos20092904.1015.

2

Vidović, Ivan, Robert Cupec, and Željko Hocenski. "Crop row detection by global energy minimization." Pattern Recognition 55 (July 2016): 68–86. http://dx.doi.org/10.1016/j.patcog.2016.01.013.

3

Ronchetti, Giulia, Alice Mayer, Arianna Facchi, Bianca Ortuani, and Giovanna Sona. "Crop Row Detection through UAV Surveys to Optimize On-Farm Irrigation Management." Remote Sensing 12, no. 12 (June 18, 2020): 1967. http://dx.doi.org/10.3390/rs12121967.

Abstract:
Climate change and competition among water users are increasingly leading to a reduction of water availability for irrigation; at the same time, traditionally non-irrigated crops require irrigation to achieve high quality standards. In the context of precision agriculture, particular attention is given to the optimization of on-farm irrigation management, based on the knowledge of within-field variability of crop and soil properties, to increase crop yield quality and ensure an efficient water use. Unmanned Aerial Vehicle (UAV) imagery is used in precision agriculture to monitor crop variability, but in the case of row-crops, image post-processing is required to separate crop rows from soil background and weeds. This study focuses on the crop row detection and extraction from images acquired through a UAV during the cropping season of 2018. Thresholding algorithms, classification algorithms, and Bayesian segmentation are tested and compared on three different crop types, namely grapevine, pear, and tomato, for analyzing the suitability of these methods with respect to the characteristics of each crop. The obtained results are promising, with overall accuracy greater than 90% and producer’s accuracy over 85% for the class “crop canopy”. The methods’ performances vary according to the crop types, input data, and parameters used. Some important outcomes can be pointed out from our study: NIR information does not give any particular added value, and RGB sensors should be preferred to identify crop rows; the presence of shadows in the inter-row distances may affect crop detection on vineyards. Finally, the best methodologies to be adopted for practical applications are discussed.
4

Zhang, Shaolin, Qianglong Ma, Shangkun Cheng, Dong An, Zhenling Yang, Biao Ma, and Yang Yang. "Crop Row Detection in the Middle and Late Periods of Maize under Sheltering Based on Solid State LiDAR." Agriculture 12, no. 12 (November 25, 2022): 2011. http://dx.doi.org/10.3390/agriculture12122011.

Abstract:
As a basic link in autonomous agricultural navigation, accurate crop row detection is vital. Machine vision algorithms are easily affected by factors such as changes in field lighting and weather conditions, and most of them target the early periods of crops; detecting crop rows under heavy sheltering in the middle and late periods remains challenging. In this paper, a LiDAR-based crop row detection algorithm aimed at the middle and late crop periods is proposed, which performs well compared with conventional machine vision algorithms. The algorithm comprises three steps: point cloud preprocessing, feature point extraction, and crop row centerline detection. First, the point cloud is divided into equal horizontal strips, and an improved K-means algorithm together with prior information from the previous horizontal strip is used to obtain the candidate points of the current strip; the candidate point information is then used to filter and extract feature points according to the corresponding thresholds; finally, the least squares method is used to fit the crop row centerlines. The experimental results show that the algorithm can detect the centerlines of maize rows in the middle and late periods under heavy sheltering. In the middle period, the average correct extraction rate of maize row centerlines was 95.1% and the average processing time was 0.181 s; in the late period, the average correct extraction rate was 87.3% and the average processing time was 0.195 s. The results also demonstrate the accuracy and superiority of the algorithm over machine vision algorithms, providing a solid foundation for autonomous navigation in agriculture.
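The final fitting step described above, obtaining each row centerline from its extracted feature points by least squares, can be sketched roughly as follows (an illustrative reconstruction in Python, not the authors' code; the point layout, coordinate convention, and synthetic data are assumptions):

```python
import numpy as np

def fit_row_centerline(points):
    """Least-squares centerline x = a*y + b for one crop row.

    `points` is an (N, 2) array of (x, y) ground-plane coordinates of
    the feature points extracted for that row (layout is an assumption).
    Regressing x on y keeps near-vertical rows numerically stable.
    """
    pts = np.asarray(points, dtype=float)
    a, b = np.polyfit(pts[:, 1], pts[:, 0], deg=1)  # x = a*y + b
    return a, b

# Synthetic feature points scattered around the vertical line x = 2.0 m
rng = np.random.default_rng(0)
y = np.linspace(0.0, 10.0, 50)
x = 2.0 + rng.normal(0.0, 0.02, size=y.size)
a, b = fit_row_centerline(np.column_stack([x, y]))
print(round(a, 3), round(b, 2))  # slope near 0, intercept near 2
```

Fitting x as a function of y (rather than y of x) is a common trick when rows run roughly along the direction of travel, where a y-on-x fit would have near-infinite slope.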
5

Romeo, J., G. Pajares, M. Montalvo, J. M. Guerrero, M. Guijarro, and A. Ribeiro. "Crop Row Detection in Maize Fields Inspired on the Human Visual Perception." Scientific World Journal 2012 (2012): 1–10. http://dx.doi.org/10.1100/2012/484390.

Abstract:
This paper proposes a new method, oriented to real-time image processing, for identifying crop rows in images of maize fields. The vision system is designed to be installed onboard a mobile agricultural vehicle and is therefore subjected to gyrations, vibrations, and other undesired movements. The images are captured under perspective projection and are affected by these undesired effects. The image processing consists of two main processes: image segmentation and crop row detection. The first applies a threshold to separate green plants or pixels (crops and weeds) from the rest (soil, stones, and others). It is based on a fuzzy clustering process, which provides the threshold to be applied during normal operation. The crop row detection applies a method based on the image perspective projection that searches for maximum accumulations of segmented green pixels along straight alignments; these determine the expected crop lines in the images. The method is robust enough to work under the above-mentioned undesired effects, and it compares favorably against the well-tested Hough transform for line detection.
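The segmentation step described in this abstract, separating green plants from soil, is commonly implemented with a greenness index plus a threshold. The paper derives its threshold by fuzzy clustering; the sketch below substitutes a fixed excess-green (ExG) threshold purely for illustration:

```python
import numpy as np

def segment_green(rgb, threshold=20):
    """Separate vegetation (crop + weed) pixels from soil background.

    Uses the widely used excess-green index ExG = 2G - R - B followed
    by a threshold. The paper obtains its threshold from a fuzzy
    clustering process; the fixed value here is a stand-in assumption.
    """
    img = rgb.astype(np.int32)           # avoid uint8 overflow
    exg = 2 * img[..., 1] - img[..., 0] - img[..., 2]
    return exg > threshold               # boolean vegetation mask

# Tiny synthetic image: one green (plant) pixel, one brown (soil) pixel
img = np.array([[[40, 180, 40], [120, 90, 60]]], dtype=np.uint8)
mask = segment_green(img)
print(mask)  # [[ True False]]
```

The resulting binary mask is what the accumulation-along-alignments stage (or a Hough transform) would consume.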
6

Ji, Ronghua, and Lijun Qi. "Crop-row detection algorithm based on Random Hough Transformation." Mathematical and Computer Modelling 54, no. 3-4 (August 2011): 1016–20. http://dx.doi.org/10.1016/j.mcm.2010.11.030.

7

Zhai, Zhiqiang, Zhongxiang Zhu, Yuefeng Du, Zhenghe Song, and Enrong Mao. "Multi-crop-row detection algorithm based on binocular vision." Biosystems Engineering 150 (October 2016): 89–103. http://dx.doi.org/10.1016/j.biosystemseng.2016.07.009.

8

Chen, Pengfei, Xiao Ma, Fangyong Wang, and Jing Li. "A New Method for Crop Row Detection Using Unmanned Aerial Vehicle Images." Remote Sensing 13, no. 17 (September 5, 2021): 3526. http://dx.doi.org/10.3390/rs13173526.

Abstract:
Crop row detection using unmanned aerial vehicle (UAV) images is very helpful for precision agriculture, enabling one to delineate site-specific management zones and to perform precision weeding. For crop row detection in UAV images, the commonly used Hough transform-based method is not sufficiently accurate. Thus, the purpose of this study is to design a new method for crop row detection in orthomosaic UAV images. For this purpose, nitrogen field experiments involving cotton and nitrogen and water field experiments involving wheat were conducted to create different scenarios for crop rows. During the peak square growth stage of cotton and the jointing growth stage of wheat, multispectral UAV images were acquired. Based on these data, a new crop detection method based on least squares fitting was proposed and compared with a Hough transform-based method that uses the same strategy to preprocess images. The crop row detection accuracy (CRDA) was used to evaluate the performance of the different methods. The results showed that the newly proposed method had CRDA values between 0.99 and 1.00 for different nitrogen levels of cotton and CRDA values between 0.66 and 0.82 for different nitrogen and water levels of wheat. In contrast, the Hough transform method had CRDA values between 0.93 and 0.98 for different nitrogen levels of cotton and CRDA values between 0.31 and 0.53 for different nitrogen and water levels of wheat. Thus, the newly proposed method outperforms the Hough transform method. An effective tool for crop row detection using orthomosaic UAV images is proposed herein.
9

Kennedy, HannahJoy, Steven A. Fennimore, David C. Slaughter, Thuy T. Nguyen, Vivian L. Vuong, Rekha Raja, and Richard F. Smith. "Crop signal markers facilitate crop detection and weed removal from lettuce and tomato by an intelligent cultivator." Weed Technology 34, no. 3 (November 14, 2019): 342–50. http://dx.doi.org/10.1017/wet.2019.120.

Abstract:
Increasing weed control costs and limited herbicide options threaten vegetable crop profitability. Traditional interrow mechanical cultivation is very effective at removing weeds between crop rows. However, weed control within the crop rows is necessary to establish the crop and prevent yield loss. Currently, many vegetable crops require hand weeding to remove weeds within the row that remain after traditional cultivation and herbicide use. Intelligent cultivators have come into commercial use to remove intrarow weeds and reduce the cost of hand weeding. Intelligent cultivators currently on the market, such as the Robovator, use pattern recognition to detect the crop row. These cultivators do not differentiate crops from weeds and do not work well among high weed populations. One approach to differentiating weeds is to place a machine-detectable mark or signal on the crop (i.e., the crop has the mark and the weed does not), thereby facilitating weed/crop differentiation. Lettuce and tomato plants were marked with labels and topical markers, then cultivated with an intelligent cultivator programmed to identify the markers. Results from field trials in marked tomato and lettuce found that the intelligent cultivator removed 90% more weeds from tomato and 66% more weeds from lettuce than standard cultivators, without reducing yields. The accurate crop and weed differentiation described here resulted in a 45% to 48% reduction in hand-weeding time per hectare.
10

Hassanein, M., M. Khedr, and N. El-Sheimy. "CROP ROW DETECTION PROCEDURE USING LOW-COST UAV IMAGERY SYSTEM." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W13 (June 4, 2019): 349–56. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w13-349-2019.

Abstract:
Precision Agriculture (PA) management systems are considered among the top ten revolutions in the agriculture industry of the last couple of decades. Generally, PA is a management system that integrates different technologies, such as navigation and imagery systems, to control the use of the agriculture industry's inputs, aiming to enhance the quality and quantity of its output while preserving the surrounding environment from any harm these inputs might cause. During the last decade, Unmanned Aerial Vehicles (UAVs) have shown great potential to enhance the use of remote sensing and imagery sensors for different PA applications such as weed management, crop health monitoring, and crop row detection. UAV imagery systems can fill the gap between aerial and terrestrial imagery systems for PA applications. One important PA application that uses UAV imagery systems, and which has drawn much interest, is crop row detection, especially since it underpins other applications such as weed detection and crop yield prediction. This paper introduces a new crop row detection methodology using a low-cost UAV RGB imagery system. The methodology has three main steps. First, the RGB images are converted into the HSV color space and the Hue image is extracted. Then, sections are generated over the Hue image at different orientation angles; for each section, a PCA of the Hue values is performed to evaluate their variance. The crop row orientation angle is detected as the orientation angle of the section with the minimum Hue variance. Finally, a scan line is generated over the Hue image with the same orientation angle as the crop rows; the scan computes the average of the Hue values along each line parallel to the detected crop row orientation. The resulting values form a profile of peaks and valleys that represent the crop and soil rows. The proposed methodology was evaluated using RGB images of a canola field acquired by a low-cost UAV at different flight heights and on different dates. The achieved results demonstrate the ability of the proposed methodology to detect the crop rows in these different cases.
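The orientation-search idea described above, choosing the angle at which Hue values vary least along the rows, can be approximated with a simple binning scheme (an illustrative sketch, not the authors' PCA-based procedure; the binning strategy and the synthetic Hue image are assumptions):

```python
import numpy as np

def estimate_row_orientation(hue, angles_deg):
    """Estimate crop-row orientation from a Hue image.

    For each candidate angle, pixels are grouped into lines running at
    that angle; the angle whose lines have the lowest mean within-line
    Hue variance is taken as the row orientation, since Hue is nearly
    constant along a crop row and varies across rows.
    """
    h, w = hue.shape
    ys, xs = np.mgrid[0:h, 0:w]
    best_angle, best_score = None, np.inf
    for ang in angles_deg:
        t = np.deg2rad(ang)
        # signed offset of each pixel across lines at angle `ang`
        d = xs * np.sin(t) - ys * np.cos(t)
        bins = np.round(d).astype(int)
        bins -= bins.min()
        score = np.mean([hue[bins == b].var()
                         for b in range(bins.max() + 1) if np.any(bins == b)])
        if score < best_score:
            best_angle, best_score = ang, score
    return best_angle

# Synthetic field: vertical rows, so Hue varies only along x
hue = np.tile(np.array([0.1, 0.1, 0.8, 0.8]), (8, 3))  # shape (8, 12)
angle = estimate_row_orientation(hue, [0, 45, 90])
print(angle)  # 90 (rows run vertically)
```

In the synthetic image every column has a constant Hue, so the 90-degree grouping yields zero within-line variance and wins.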
11

Yang, Ranbing, Yuming Zhai, Jian Zhang, Huan Zhang, Guangbo Tian, Jian Zhang, Peichen Huang, and Lin Li. "Potato Visual Navigation Line Detection Based on Deep Learning and Feature Midpoint Adaptation." Agriculture 12, no. 9 (September 1, 2022): 1363. http://dx.doi.org/10.3390/agriculture12091363.

Full text
Abstract:
Potato machinery has become more intelligent thanks to advancements in autonomous navigation technology. The quality of crop row segmentation directly affects the subsequent extraction work, which is an important part of navigation line detection. However, the differing shapes of crops in different growth periods often lead to poor image segmentation, and noise such as field weeds and variable lighting also affects it; these problems are difficult to address using traditional threshold segmentation methods. To this end, this paper proposes an end-to-end potato crop row detection method. The first step replaces the original U-Net's backbone feature extraction structure with VGG16 to segment the potato crop rows. Secondly, a feature-midpoint-adaptation fitting method is proposed, which adaptively adjusts the position of the visual navigation line according to the growth shape of the potato. The results show that the method has strong robustness and can accurately detect navigation lines in different potato growth periods. Furthermore, compared with the original U-Net model, the crop row segmentation accuracy is improved by 3%, and the average deviation of the fitted navigation lines is 2.16°, which is superior to the traditional visual guidance method.
12

Bah, Mamadou Dian, Adel Hafiane, and Raphael Canals. "CRowNet: Deep Network for Crop Row Detection in UAV Images." IEEE Access 8 (2020): 5189–200. http://dx.doi.org/10.1109/access.2019.2960873.

13

Rabab, Saba, Pieter Badenhorst, Yi-Ping Phoebe Chen, and Hans D. Daetwyler. "A template-free machine vision-based crop row detection algorithm." Precision Agriculture 22, no. 1 (June 26, 2020): 124–53. http://dx.doi.org/10.1007/s11119-020-09732-4.

14

Schmitz, Austin, Chetan Badgujar, Hasib Mansur, Daniel Flippo, Brian McCornack, and Ajay Sharda. "Design of a Reconfigurable Crop Scouting Vehicle for Row Crop Navigation: A Proof-of-Concept Study." Sensors 22, no. 16 (August 18, 2022): 6203. http://dx.doi.org/10.3390/s22166203.

Abstract:
Pest infestation causes significant crop damage during crop production, which reduces the crop yield in terms of quality and quantity. Accurate, precise, and timely information on pest infestation is a crucial aspect of integrated pest management practices. The current manual scouting methods are time-consuming and laborious, particularly for large fields. Therefore, a fleet of scouting vehicles is proposed to monitor and collect crop information at the sub-canopy level. These vehicles would traverse large fields and collect real-time information on pest type, concentration, and infestation level. In addition, the developed vehicle platform would assist in collecting information on soil moisture, nutrient deficiency, and disease severity during crop growth stages. This study established a proof-of-concept of a crop scouting vehicle that can navigate through row crops. A reconfigurable ground vehicle (RGV) was designed and fabricated, and the prototype was tested in the laboratory and in an actual field environment. Moreover, the concept of corn row detection was established by utilizing an array of low-cost ultrasonic sensors. The RGV was successful in navigating through the corn field, and its reconfigurable characteristic provides the ability to move anywhere in the field without damaging the crops. This research shows the promise of using reconfigurable robots for row-crop navigation in crop scouting and monitoring; such robots could be modular and scalable and could be mass-produced quickly. A fleet of these RGVs would empower farmers to make meaningful and timely decisions for their cropping system.
15

Winterhalter, Wera, Freya Veronika Fleckenstein, Christian Dornhege, and Wolfram Burgard. "Crop Row Detection on Tiny Plants With the Pattern Hough Transform." IEEE Robotics and Automation Letters 3, no. 4 (October 2018): 3394–401. http://dx.doi.org/10.1109/lra.2018.2852841.

16

Kise, M., Q. Zhang, and F. Rovira Más. "A Stereovision-based Crop Row Detection Method for Tractor-automated Guidance." Biosystems Engineering 90, no. 4 (April 2005): 357–67. http://dx.doi.org/10.1016/j.biosystemseng.2004.12.008.

17

Huang, Shuangping, Sihang Wu, Chao Sun, Xu Ma, Yu Jiang, and Long Qi. "Deep localization model for intra-row crop detection in paddy field." Computers and Electronics in Agriculture 169 (February 2020): 105203. http://dx.doi.org/10.1016/j.compag.2019.105203.

18

Wiegman, Christopher R., Ramarao Venkatesh, and Scott A. Shearer. "Intra-Canopy Sensing Using Multi-Rotor sUAS: A New Approach for Crop Stress Detection and Diagnosis." Journal of the ASABE 65, no. 4 (2022): 913–25. http://dx.doi.org/10.13031/ja.14342.

Abstract:
Highlights:
- A novel platform was developed for intra-canopy insertion of sensors from a multi-rotor sUAS.
- The system enables real-time data acquisition from inside the crop canopy comparable to an in-person view.
- The system provides ideal data for use in modern CNN-based stress diagnosis in row crop production.

Abstract. Remote sensing is a critical tool in precision agriculture, giving producers the ability to monitor field conditions throughout the growing season. Although several remote sensing platforms are in use today, small unmanned aerial systems (sUAS) provide the greatest flexibility with the highest resolution. As sUAS capabilities continue to increase (i.e., payload, flight time, and speed), their potential in commercial row crop production is substantial. However, like other forms of remote sensing, traditional sUAS are limited to a nadir view of the target and only capture the top of the crop canopy. Although disease epidemiology and stress origins vary, this limited view usually does not capture the impact of stress at the initial manifestation. For example, stresses such as macronutrient deficiencies in corn originate at the base of the plant and then move upward as nutrients translocate. By the time the stress is detectable at the top of the canopy, the opportunity to mitigate yield loss is limited. A new sUAS platform is needed for sensing beneath the upper portion of the canopy. The Stinger platform, developed to meet this need, consists of a 4.0 m fiberglass rod, custom sensor mount, communication network, and radio link. Using this platform, a variety of sensors can be inserted into the crop canopy from a hovering sUAS. The Stinger platform, when combined with artificial intelligence (AI), significantly expands the capabilities of sUAS for diagnosis of crop stress in row crop production.

Keywords: Intra-canopy sensing, Remote sensing, RGB imagery, Stress diagnosis, sUAS.
19

Behfar, Hossein, HamidReza Ghasemzadeh, Ali Rostami, MirHadi Seyedarabi, and Mohammad Moghaddam. "Vision-Based Row Detection Algorithms Evaluation for Weeding Cultivator Guidance in Lentil." Modern Applied Science 8, no. 5 (September 25, 2014): 224. http://dx.doi.org/10.5539/mas.v8n5p224.

Abstract:
Accurate crop row detection is important for field navigation: for precise weeding, the cultivator guidance system must detect the crop center line precisely. Methods of vision-based row detection for lentil fields were studied. Monochrome and color images were used in this research; the color images were transformed into grey-scale images using two different formulas so that they could be compared and an optimal one chosen. In order to detect the center of the crop row rapidly and effectively, Hough transform and gravity-center image processing algorithms were applied to the acquired images. The field crop images were segmented into two parts (plant, and soil as background) by optimal thresholding, and the Hough transform was then applied to these binary images; grey-scale images were used in the gravity-center method. The center line detection algorithms were tested at two weed distribution densities, general and intensive. Both systems successfully detected and calculated the pose and orientation of the crop row on synthetic images. The mean errors between the calculated and manually estimated lines were obtained: for the Hough transform and gravity-center methods they were 8 and 10 mm with standard deviations of 7 and 12 mm at the general distribution density, and 12 and 16 mm with standard deviations of 11 and 15 mm at the high distribution density, respectively. Computational times for the Hough transform and gravity-center methods were 0.7 and 0.4 s at the general distribution density and 1.2 and 0.8 s at the high distribution density, respectively.
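The gravity-center method compared in this study can be sketched as an intensity-weighted centroid computed per horizontal band (an illustrative reconstruction, not the authors' code; the band count and synthetic image are assumptions):

```python
import numpy as np

def gravity_center_line(gray, n_bands=4):
    """Locate a crop-row center line by the gravity-center method.

    The grey-scale image is cut into horizontal bands; in each band the
    intensity-weighted centroid (gravity center) of the columns gives
    one point on the row, and the points trace the row down the image.
    """
    h, w = gray.shape
    xs = np.arange(w, dtype=float)
    centers = []
    for band in np.array_split(gray.astype(float), n_bands, axis=0):
        weights = band.sum(axis=0)                # column-wise mass
        centers.append((weights * xs).sum() / weights.sum())
    return centers

# Bright vertical stripe (the crop row) centered at column 5
img = np.zeros((8, 11))
img[:, 5] = 255.0
centers = gravity_center_line(img)
print(centers)  # [5.0, 5.0, 5.0, 5.0]
```

Because it needs only sums and one division per band, this method is cheaper than the Hough transform, which matches the shorter computation times reported above.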
20

Basso, Maik, and Edison Pignaton de Freitas. "A UAV Guidance System Using Crop Row Detection and Line Follower Algorithms." Journal of Intelligent & Robotic Systems 97, no. 3-4 (March 9, 2019): 605–21. http://dx.doi.org/10.1007/s10846-019-01006-0.

21

Baquero Velasquez, Andres Eduardo, Vitor Akihiro Hisano Higuti, Mateus Valverde Gasparino, Arun Sivakumar, Marcelo Becker, and Girish Chowdhary. "Multi-Sensor Fusion based Robust Row Following for Compact Agricultural Robots." Field Robotics 2, no. 1 (March 10, 2022): 1291–319. http://dx.doi.org/10.55417/fr.2022043.

Abstract:
This paper presents a state-of-the-art light detection and ranging (LiDAR) based autonomous navigation system for under-canopy agricultural robots. Under-canopy agricultural navigation has been a challenging problem because global navigation satellite system (GNSS) and other positioning sensors are prone to loss of accuracy due to attenuation and multi-path errors caused by crop leaves and stems. Reactive navigation by detecting crop rows using LiDAR measurements has proved to be an efficient alternative to GNSS. Nevertheless, it presents challenges due to occlusion from leaves under the canopy. Our system addresses these issues by fusing inertial measurement unit (IMU) and LiDAR measurements in a Bayesian framework on low-cost hardware. In addition, a local goal generator (LGG) is introduced to provide a local reference trajectory to the onboard controller. Our system is validated extensively in real-world field environments over a distance of 50.88 km, on multiple robots, in different field conditions, across different locations. We report leading distance-between-interventions results for LiDAR+IMU-based under-canopy navigation, showing that our system is able to safely navigate without interventions for 386.9 m on average, in fields without significant gaps in the crop rows.
22

Rovira-Más, F., Q. Zhang, J. F. Reid, and J. D. Will. "Hough-transform-based vision algorithm for crop row detection of an automated agricultural vehicle." Proceedings of the Institution of Mechanical Engineers, Part D: Journal of Automobile Engineering 219, no. 8 (August 1, 2005): 999–1010. http://dx.doi.org/10.1243/095440705x34667.

Abstract:
Finding a pathway between crop rows is essential for automated guidance of some agricultural vehicles. The research reported in this paper developed a vision-based method for detecting crop rows. This method applied the Hough transform and connectivity analysis to process images of a vehicle's forward view and to use them to find the appropriate pathway in the field. The Hough transform was used to detect crop rows and the connectivity analysis was applied to identify the most suitable path from all possible choices. This system was implemented in an agricultural tractor and tested in both laboratory and field experiments. The methodology devised overcame image noise problems and successfully determined the proper trajectory for the tractor.
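The voting scheme behind the Hough transform used here can be sketched with a plain accumulator over (rho, theta) (a minimal illustration of the standard transform, not the paper's implementation; the angular resolution and the synthetic mask are assumptions):

```python
import numpy as np

def hough_peak(mask, n_theta=180):
    """Find the dominant line in a binary crop/soil mask via the Hough transform.

    Each foreground pixel votes for all (rho, theta) lines through it,
    with rho = x*cos(theta) + y*sin(theta); the accumulator peak is the
    strongest crop-row candidate.
    """
    ys, xs = np.nonzero(mask)
    thetas = np.deg2rad(np.arange(n_theta))       # 0..179 degrees
    diag = int(np.ceil(np.hypot(*mask.shape)))    # max possible |rho|
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    for t_idx, t in enumerate(thetas):
        rhos = np.round(xs * np.cos(t) + ys * np.sin(t)).astype(int) + diag
        np.add.at(acc, (rhos, t_idx), 1)          # cast the votes
    r_idx, t_idx = np.unravel_index(acc.argmax(), acc.shape)
    return int(r_idx) - diag, int(t_idx)          # (rho in pixels, theta in degrees)

# Diagonal line of plant pixels: x = y, i.e. rho = 0 at theta = 135 degrees
mask = np.zeros((10, 10), dtype=bool)
mask[np.arange(10), np.arange(10)] = True
rho, theta = hough_peak(mask)
print(rho, theta)  # rho 0, theta near 135
```

With rounding to integer rho, several neighboring theta bins can tie for the peak, so the recovered angle is only accurate to within a few degrees of 135 here; the connectivity analysis described in the abstract would then choose among such candidate lines.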
23

Guerrero, J. M., M. Guijarro, M. Montalvo, J. Romeo, L. Emmi, A. Ribeiro, and G. Pajares. "Automatic expert system based on images for accuracy crop row detection in maize fields." Expert Systems with Applications 40, no. 2 (February 2013): 656–64. http://dx.doi.org/10.1016/j.eswa.2012.07.073.

24

Hanks, James E., and James L. Beck. "Sensor-Controlled Hooded Sprayer for Row Crops." Weed Technology 12, no. 2 (June 1998): 308–14. http://dx.doi.org/10.1017/s0890037x00043864.

Abstract:
Methods were developed and evaluated that utilize state of the art weed-sensing technology in row-crop production systems. Spectral differences in green living plants and bare soil allowed ‘real-time’ weed detection, with intermittent spraying of herbicide only where weeds were present. Sensor units were mounted in 0.7-m-wide hooded sprayers providing sensors with an unobstructed view of the area between soybean rows. Single hood and commercial-size eight-row systems were evaluated, and savings in glyphosate spray solution applied using sensors ranged from 63 to 85%, compared to conventional hooded spray systems with continuous application. Weed control by the sensor-controlled spray system was equal to the conventional system. This technology can significantly reduce herbicide usage and decrease production cost without reducing weed control.
25

Pei, Haotian, Youqiang Sun, He Huang, Wei Zhang, Jiajia Sheng, and Zhiying Zhang. "Weed Detection in Maize Fields by UAV Images Based on Crop Row Preprocessing and Improved YOLOv4." Agriculture 12, no. 7 (July 6, 2022): 975. http://dx.doi.org/10.3390/agriculture12070975.

Abstract:
Effective maize and weed detection plays an important role in farmland management, which helps to improve yield and save herbicide resources. Due to their convenience and high resolution, Unmanned Aerial Vehicles (UAVs) are widely used in weed detection. However, there are some challenging problems in weed detection: (i) the cost of labeling is high, the image contains many plants, and annotation of the image is time-consuming and labor-intensive; (ii) the number of maize is much larger than the number of weed in the field, and this imbalance of samples leads to decreased recognition accuracy; and (iii) maize and weed have similar colors, textures, and shapes, which are difficult to identify when an UAV flies at a comparatively high altitude. To solve these problems, we propose a new weed detection framework in this paper. First, to balance the samples and reduce the cost of labeling, a lightweight model YOLOv4-Tiny was exploited to detect and mask the maize rows so that it was only necessary to label weeds on the masked image. Second, the improved YOLOv4 was used as a weed detection model. We introduced the Meta-ACON activation function, added the Convolutional Block Attention Module (CBAM), and replaced the Non-Maximum Suppression (NMS) with Soft Non-Maximum Suppression (Soft-NMS). Moreover, the distributions and counts of weeds were analyzed, which was useful for variable herbicide spraying. The results showed that the total number of labels for 1000 images decrease by half, from 33,572 to 17,126. The improved YOLOv4 had a mean average precision (mAP) of 86.89%.
26

Vong, Chin Nee, and Peter Ako Larbi. "Development and Prototype Testing of an Agricultural Nozzle Clog Detection Device." Transactions of the ASABE 64, no. 1 (2021): 49–61. http://dx.doi.org/10.13031/trans.13519.

Abstract:
Highlights:
- Prototypes of an agricultural nozzle clog detection system (for 18 nozzles) have been successfully developed.
- Spray quality characteristics (droplet size, pattern, and coverage) were not significantly affected when testing the device with extended-range nozzles (TeeJet XR8004).
- Most of the spray quality characteristics were significantly affected when testing the device with ultra low-drift nozzles (John Deere PSULDQ2004).

Abstract. Agricultural nozzles are the main components that perform the spraying of agrochemicals, and their proper functionality is a key element for uniform spray application on crops. Because nozzles have small orifices, they can become clogged when there is debris from the agrochemical in the tank. Nozzle clogging during spray application results in poor pest and weed management and increased cost for re-spraying the affected crop row. Measures used to prevent nozzles from clogging include using screens or strainers to filter out debris before it reaches the nozzle tip, as well as performing regular checks on the nozzles. However, nozzle clogging still occurs during spraying despite the precautions taken. Thus, a device that can detect nozzle clogging during spraying is necessary to enable a quicker response that will ensure uniform application across each row of the crop. A novel, patented device for detecting clogged nozzles that is externally attachable to each nozzle on a sprayer boom was developed in the Precision Application Technology Lab at Arkansas State University. The main objective of this article is to present a general description of this prototype nozzle clog detection device and the nozzle clog detection system. Spray droplet size and pattern tests under controlled conditions and spray coverage tests under field conditions were conducted with and without the device to determine if there were significant differences in droplet size, spray pattern, or spray coverage between using and not using the device. The tests demonstrated that this new technology has potential for detecting clogged nozzles without significantly influencing spray quality for extended-range nozzles, but not for ultra low-drift nozzles. To increase the reliability of the performance of this new technology, further improvements in the design need to be considered.

Keywords: Clogged nozzle, Detection, Droplet size, Prototype device, Spray coverage, Spray pattern.
APA, Harvard, Vancouver, ISO, and other styles
27

Zhu, Zhong Xiang, Yan He, Zhi Qiang Zhai, Jin Yi Liu, and En Rong Mao. "Research on Cotton Row Detection Algorithm Based on Binocular Vision." Applied Mechanics and Materials 670-671 (October 2014): 1222–27. http://dx.doi.org/10.4028/www.scientific.net/amm.670-671.1222.

Full text
Abstract:
As a relative localization method, machine vision is widely used for the automatic navigation of cotton cultivators and cotton insecticide sprayers. However, it is difficult to achieve reliable and stable recognition of crop rows with a monocular vision system: it cannot directly access the depth information of the image, which leads to massive, time-consuming calculation, and it offers neither high-accuracy recognition nor good noise immunity. This paper presents an algorithm for cotton row detection based on binocular stereo vision, intended for the automatic navigation of a cotton cultivator. Zhang's plane calibration method is used to obtain the internal and external parameters of the binocular stereo vision system. Preprocessing is applied to distinguish the cotton from the soil; after preprocessing, stereo matching is conducted using SIFT operators, and the three-dimensional coordinates of the cotton plants are acquired by parallax-based distance measurement. Finally, by combining the elevation information with the Hough transform, the cotton rows are detected. The detection results indicate that this method has an accuracy higher than 90%, which largely meets the needs of automatic navigation for a cotton cultivator.
APA, Harvard, Vancouver, ISO, and other styles
28

Zhang, Zhenqian, Ruyue Cao, Cheng Peng, Renjie Liu, Yifan Sun, Man Zhang, and Han Li. "Cut-Edge Detection Method for Rice Harvesting Based on Machine Vision." Agronomy 10, no. 4 (April 20, 2020): 590. http://dx.doi.org/10.3390/agronomy10040590.

Full text
Abstract:
A cut-edge detection method based on machine vision was developed for obtaining the navigation path of a combine harvester. First, the Cr component of the YCbCr color model was selected as the grayscale feature factor. Then, by detecting the end of the crop row, judging the target demarcation, and extracting the feature points, the region of interest (ROI) was obtained automatically. Subsequently, vertical projection was applied to reduce the noise. All the points in the ROI were evaluated, and a dividing point was found in each row. The hierarchical clustering method was used to extract the outliers. Finally, polynomial fitting was used to acquire the straight or curved cut-edge. The results from the test samples showed that the average error for locating the cut-edge was 2.84 cm. The method is capable of providing support for the automatic navigation of a combine harvester.
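The Cr-channel grayscale step and the vertical projection are simple to state concretely. A hedged sketch, assuming the standard ITU-R BT.601 YCbCr conversion (which the abstract does not spell out):

```python
import numpy as np

def cr_channel(rgb):
    """Cr component of the YCbCr colour model (ITU-R BT.601 coefficients),
    used as the grayscale feature factor."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.5 * r - 0.4187 * g - 0.0813 * b + 128.0

def vertical_projection(binary):
    """Column-wise sum of a binary image; smoothing this profile helps
    suppress isolated noise before locating the cut-edge column."""
    return binary.sum(axis=0)
```

Green (unharvested) regions score low on Cr while stubble and soil score higher, so thresholding Cr and projecting vertically yields a step-like profile whose transition marks the cut-edge.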
APA, Harvard, Vancouver, ISO, and other styles
29

Bakker, Tijmen, Hendrik Wouters, Kees van Asselt, Jan Bontsema, Lie Tang, Joachim Müller, and Gerrit van Straten. "A vision based row detection system for sugar beet." Computers and Electronics in Agriculture 60, no. 1 (January 2008): 87–95. http://dx.doi.org/10.1016/j.compag.2007.07.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Xu, Yanlei, Zongmei Gao, Lav Khot, Xiaotian Meng, and Qin Zhang. "A Real-Time Weed Mapping and Precision Herbicide Spraying System for Row Crops." Sensors 18, no. 12 (December 3, 2018): 4245. http://dx.doi.org/10.3390/s18124245.

Full text
Abstract:
This study developed and field-tested an automated weed mapping and variable-rate herbicide spraying (VRHS) system for row crops. Weed detection was performed by a machine vision sub-system that used a custom threshold segmentation method, an improved particle swarm optimization (IPSO) algorithm, to segment the field images. The VRHS system also used a lateral-histogram-based algorithm for fast extraction of weed maps, which formed the basis for determining real-time herbicide application rates. The central processor of the VRHS system had high logic operation capacity compared to conventional controller-based systems. A custom-developed monitoring system allowed real-time visualization of the spraying system functionalities. Integrated system performance was then evaluated through field experiments. The IPSO successfully segmented weeds within the corn crop at the seedling growth stage and reduced segmentation error rates to 0.1%, from the 7.1% of the traditional particle swarm optimization algorithm. The IPSO processing speed was 0.026 s/frame. The weed-detection-to-chemical-actuation response time of the integrated system was 1.562 s. Overall, the VRHS system met the real-time data processing and actuation requirements for practical weed management applications.
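The paper's IPSO details are not given in the abstract; as a generic illustration of the idea, plain particle swarm optimisation can search a single segmentation threshold scored by Otsu's between-class variance (the inertia and acceleration constants below are conventional choices, not the authors'):

```python
import numpy as np

def pso_threshold(gray, n_particles=12, n_iter=30, seed=0):
    """Search one gray-level threshold with basic PSO, scoring each
    candidate by Otsu's between-class variance."""
    rng = np.random.default_rng(seed)
    hist = np.bincount(gray.ravel(), minlength=256) / gray.size
    vals = np.arange(256)

    def fitness(t):
        t = int(np.clip(t, 1, 254))
        w0, w1 = hist[:t].sum(), hist[t:].sum()
        if w0 == 0 or w1 == 0:
            return 0.0
        mu0 = (hist[:t] * vals[:t]).sum() / w0
        mu1 = (hist[t:] * vals[t:]).sum() / w1
        return w0 * w1 * (mu0 - mu1) ** 2

    x = np.linspace(1.0, 254.0, n_particles)  # spread swarm over gray range
    v = np.zeros(n_particles)
    pbest = x.copy()
    pfit = np.array([fitness(t) for t in x])
    gbest = pbest[pfit.argmax()]
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, 1.0, 254.0)
        fit = np.array([fitness(t) for t in x])
        improved = fit > pfit
        pbest[improved], pfit[improved] = x[improved], fit[improved]
        gbest = pbest[pfit.argmax()]
    return int(gbest)
```

The reported IPSO improves on this baseline (hence the 7.1% to 0.1% error reduction); the sketch only shows the swarm-search-over-threshold structure.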
APA, Harvard, Vancouver, ISO, and other styles
31

Yu, Yue, Yidan Bao, Jichun Wang, Hangjian Chu, Nan Zhao, Yong He, and Yufei Liu. "Crop Row Segmentation and Detection in Paddy Fields Based on Treble-Classification Otsu and Double-Dimensional Clustering Method." Remote Sensing 13, no. 5 (February 27, 2021): 901. http://dx.doi.org/10.3390/rs13050901.

Full text
Abstract:
Visual navigation is developing rapidly and is of great significance for improving agricultural automation. The most important issue in visual navigation is extracting a guidance path from agricultural field images. Traditional image segmentation methods may fail in paddy fields, because the colors of weeds, duckweed, and the eutrophic water surface are very similar to those of real rice seedlings. To deal with these problems, a crop row segmentation and detection algorithm designed for complex paddy fields is proposed. Firstly, the original image is transformed to grayscale, and the treble-classification Otsu method classifies the pixels of the grayscale image into three clusters according to their gray values. Secondly, the binary image is divided into several horizontal strips, and feature points representing green plants are extracted. Lastly, the proposed double-dimensional adaptive clustering method, which can deal with gaps inside a single crop row and misleading points between real crop rows, is applied to obtain the clusters of real crop rows and the corresponding fitted lines. Quantitative validation tests of efficiency and accuracy have shown that the combination of these two methods constitutes a robust integrated solution, with attitude error and distance error within 0.02° and 10 pixels, respectively. The proposed method achieved better quantitative results than a detection method based on the typical Otsu under various conditions.
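The treble-classification Otsu step can be illustrated with a brute-force two-threshold Otsu that splits the histogram into three clusters. This is a sketch, not the authors' exact formulation; it maximises the equivalent criterion Σ w_k·μ_k², using prefix sums so the exhaustive search stays fast:

```python
import numpy as np

def treble_otsu(gray, levels=256):
    """Brute-force two-threshold Otsu ('treble classification'): pick
    (t1, t2) maximising the between-class criterion sum(w_k * mu_k**2)."""
    p = np.bincount(gray.ravel(), minlength=levels) / gray.size
    cw = np.concatenate(([0.0], np.cumsum(p)))                       # weight prefix
    cm = np.concatenate(([0.0], np.cumsum(p * np.arange(levels))))   # moment prefix

    def score_class(lo, hi):
        w = cw[hi] - cw[lo]
        m = cm[hi] - cm[lo]
        return m * m / w if w > 0 else 0.0   # w * mu^2 == m^2 / w

    best, best_t = -1.0, (1, 2)
    for t1 in range(1, levels - 1):
        for t2 in range(t1 + 1, levels):
            s = score_class(0, t1) + score_class(t1, t2) + score_class(t2, levels)
            if s > best:
                best, best_t = s, (t1, t2)
    return best_t
```

The three clusters here would correspond to water/soil background, ambiguous vegetation (weeds, duckweed), and bright rice plants.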
APA, Harvard, Vancouver, ISO, and other styles
32

Ospina, Ricardo, and Noboru Noguchi. "Simultaneous mapping and crop row detection by fusing data from wide angle and telephoto images." Computers and Electronics in Agriculture 162 (July 2019): 602–12. http://dx.doi.org/10.1016/j.compag.2019.05.010.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Fériani, W., S. Rezgui, and M. Cherif. "Detection of QTL and QTL × environment interaction for scald resistance in a two-row × six-row cross of barley." Cereal Research Communications 48, no. 2 (February 27, 2020): 187–93. http://dx.doi.org/10.1007/s42976-020-00024-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Iberraken, Dimia, Florian Gaurier, Jean-Christophe Roux, Colin Chaballier, and Roland Lenain. "Autonomous Vineyard Tracking Using a Four-Wheel-Steering Mobile Robot and a 2D LiDAR." AgriEngineering 4, no. 4 (September 22, 2022): 826–46. http://dx.doi.org/10.3390/agriengineering4040053.

Full text
Abstract:
The intensive advances in robotics have greatly facilitated the accomplishment of tedious and repetitive tasks in our daily lives. While robots are now well established in the manufacturing industry, thanks to full knowledge of the environment, this is still not the case for outdoor applications such as agriculture, where many parameters vary (kind of vegetation, perception conditions, wheel–soil interaction, etc.). The use of robots in such a context is nevertheless important, since reducing environmental impacts requires alternative practices (such as agroecological or organic production) that demand highly accurate work and frequent operations. As a result, the design of robots for agroecology notably implies the availability of highly accurate autonomous navigation processes that relate to the crop and adapt to its variability. This paper proposes several contributions to the problem of crop row tracking using a four-wheel-steering mobile robot that straddles the crops. It uses a 2D LiDAR that allows the detection of crop rows in 3D thanks to the robot's motion. This permits the definition of a reference trajectory that is followed using two different control approaches. The main targeted application is navigation in vineyards, for several kinds of operation such as monitoring, cropping, or accurate spraying. In the first part, a row detection strategy is described, based on a 2D LiDAR inclined in front of the robot to match a predefined shape of the vineyard row in the robot frame. The successive detected regions of interest are aggregated along the local robot motion through the system odometry. This permits the computation of a local trajectory to be followed by the robot. In the second part, a control architecture that allows the control of a four-wheel-steering mobile robot is proposed. 
Two different strategies are investigated: one is based on a backstepping approach, while the second independently regulates the positions of the front and rear steering axles. The results of these control laws are then compared in an extended simulation framework, using a 3D reconstruction of actual vineyards in different seasons.
APA, Harvard, Vancouver, ISO, and other styles
35

Gerhards, Roland, Benjamin Kollenda, Jannis Machleb, Kurt Möller, Andreas Butz, David Reiser, and Hans-Werner Griegentrog. "Camera-guided Weed Hoeing in Winter Cereals with Narrow Row Distance." Gesunde Pflanzen 72, no. 4 (October 29, 2020): 403–11. http://dx.doi.org/10.1007/s10343-020-00523-5.

Full text
Abstract:
Farmers are facing severe problems with weed competition in cereal crops. Grass weeds and perennial weed species have become more abundant in Europe, mainly due to the high percentage of cereal crops in cropping systems and reduced tillage practices combined with continuous applications of herbicides with the same mode of action. Several weed populations have evolved resistance to herbicides. Precision weed hoeing may help to overcome these problems. So far, weed hoeing in cereals was restricted to cropping practices with row distances of more than 200 mm. Hoeing in cereals with conventional row distances of 125–170 mm requires the development of automatic steering systems. The objective of this project was to develop a new automatic guidance system for inter-row hoeing using camera-based row detection and automatic side-shift control. Six field studies were conducted in winter wheat to investigate the accuracy, weed control efficacy, and crop yields of this new hoeing technology. A 3 m prototype and a 6 m segmented hoe were built and tested at three different speeds in winter wheat seeded at 150 mm row distance. The maximum lateral offset from the row center was 22.53 mm for the 3 m wide hoe and 18.42 mm for the 6 m wide hoe. Camera-guided hoeing resulted in 72–96% inter-row and 21–91% intra-row weed control efficacy (WCE). Weed control was 7–15% higher at 8 km h−1 than at 4 km h−1. WCE could be increased by 14–22% when hoeing was combined with weed harrowing. Grain yields after camera-guided hoeing at 8 km h−1 were 15–76% higher than in the untreated control plots and reached the same level as the weed-free herbicide plots. The study characterizes camera-guided hoeing in cereals as a robust and effective method of weed control.
APA, Harvard, Vancouver, ISO, and other styles
36

Lacotte, Virginie, Toan NGuyen, Javier Diaz Sempere, Vivien Novales, Vincent Dufour, Richard Moreau, Minh Tu Pham, et al. "Pesticide-Free Robotic Control of Aphids as Crop Pests." AgriEngineering 4, no. 4 (October 7, 2022): 903–21. http://dx.doi.org/10.3390/agriengineering4040058.

Full text
Abstract:
Because our civilization has relied on pesticides to fight weeds, insects, and diseases since antiquity, the use of these chemicals has become natural and exclusive. Unfortunately, pesticide use has progressively had alarming effects on water quality, biodiversity, and human health. This paper proposes to improve farming practices by replacing pesticides with a laser-based robotic approach. The study focused on the neutralization of aphids, as they are among the most harmful crop pests and among the most complex to control. With the help of deep learning, we developed a mobile robot that spans crop rows, locates aphids, and neutralizes them with laser beams. We built a prototype with the sole purpose of validating the localization-neutralization loop on a single seedling row. The experiments performed in our laboratory demonstrate the feasibility of detecting different lines of aphids (50% detected at 3 cm/s) and of neutralizing them (90% mortality) without impacting the growth of their host plants. The results are encouraging, since aphids are among the most challenging crop pests to eradicate. However, enhancements in detection, and mainly in targeting, are necessary for the approach to be useful in a real farming context. Moreover, robustness to field conditions should be evaluated.
APA, Harvard, Vancouver, ISO, and other styles
37

García-Santillán, Iván, José Miguel Guerrero, Martín Montalvo, and Gonzalo Pajares. "Curved and straight crop row detection by accumulation of green pixels from images in maize fields." Precision Agriculture 19, no. 1 (January 3, 2017): 18–41. http://dx.doi.org/10.1007/s11119-016-9494-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Bah, M., Adel Hafiane, and Raphael Canals. "Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images." Remote Sensing 10, no. 11 (October 26, 2018): 1690. http://dx.doi.org/10.3390/rs10111690.

Full text
Abstract:
In recent years, weeds have been responsible for most agricultural yield losses. To deal with this threat, farmers resort to spraying the fields uniformly with herbicides. This method not only requires huge quantities of herbicides but also impacts the environment and human health. One way to reduce the cost and environmental impact is to allocate the right doses of herbicide to the right place and at the right time (precision agriculture). Nowadays, unmanned aerial vehicles (UAVs) are becoming an interesting acquisition system for weed localization and management due to their ability to obtain images of the entire agricultural field with a very high spatial resolution and at a low cost. However, despite significant advances in UAV acquisition systems, the automatic detection of weeds remains a challenging problem because of their strong similarity to the crops. Recently, deep learning approaches have shown impressive results in different complex classification problems. However, these approaches need a certain amount of training data, and creating large agricultural datasets with pixel-level annotations by an expert is an extremely time-consuming task. In this paper, we propose a novel fully automatic learning method using convolutional neural networks (CNNs) with an unsupervised training dataset collection for weed detection from UAV images. The proposed method comprises three main phases. First, we automatically detect the crop rows and use them to identify the inter-row weeds. In the second phase, inter-row weeds are used to constitute the training dataset. Finally, we perform CNNs on this dataset to build a model able to detect the crop and the weeds in the images. The results obtained are comparable to those of traditional supervised training data labeling, with differences in accuracy of 1.5% in the spinach field and 6% in the bean field.
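The first phase — using detected crop rows to auto-label inter-row vegetation as weeds — reduces, at its core, to a distance test against the row centres. A toy sketch (column-wise, assuming rows roughly aligned with the image's vertical axis; not the authors' code):

```python
import numpy as np

def label_interrow(plant_cols, row_centers, row_half_width):
    """Label detected plant pixels by image column: within row_half_width
    of a crop-row centre -> crop (0); otherwise inter-row -> weed (1)."""
    cols = np.asarray(plant_cols, dtype=float)[:, None]
    centers = np.asarray(row_centers, dtype=float)[None, :]
    nearest = np.abs(cols - centers).min(axis=1)   # distance to closest row
    return (nearest > row_half_width).astype(int)
```

The resulting weed labels then seed the CNN training set, avoiding any manual pixel-level annotation.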
APA, Harvard, Vancouver, ISO, and other styles
39

Zhang, Xiya, Xiaona Li, Baohua Zhang, Jun Zhou, Guangzhao Tian, Yingjun Xiong, and Baoxing Gu. "Automated robust crop-row detection in maize fields based on position clustering algorithm and shortest path method." Computers and Electronics in Agriculture 154 (November 2018): 165–75. http://dx.doi.org/10.1016/j.compag.2018.09.014.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Pang, Yan, Yeyin Shi, Shancheng Gao, Feng Jiang, Arun-Narenthiran Veeranampalayam-Sivakumar, Laura Thompson, Joe Luck, and Chao Liu. "Improved crop row detection with deep neural network for early-season maize stand count in UAV imagery." Computers and Electronics in Agriculture 178 (November 2020): 105766. http://dx.doi.org/10.1016/j.compag.2020.105766.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Pradhan, Nrusingh Charan, Pramod Kumar Sahoo, Dilip Kumar Kushwaha, Indra Mani, Ankur Srivastava, Atish Sagar, Nikul Kumari, Susheel Kumar Sarkar, and Yash Makwana. "A Novel Approach for Development and Evaluation of LiDAR Navigated Electronic Maize Seeding System Using Check Row Quality Index." Sensors 21, no. 17 (September 3, 2021): 5934. http://dx.doi.org/10.3390/s21175934.

Full text
Abstract:
Crop geometry plays a vital role in ensuring proper plant growth and yield. Check row planting allows adequate space for weeding in both directions and lets sunlight reach the bottom of the crop. Therefore, a light detection and ranging (LiDAR) navigated electronic seed metering system for check row planting of maize seeds was developed. The system comprises a LiDAR-based distance measurement unit, an electronic seed metering mechanism, and a wireless communication system. The electronic seed metering mechanism was evaluated in the laboratory for five different cell sizes (8.80, 9.73, 10.82, 11.90 and 12.83 mm) and linear cell speeds (89.15, 99.46, 111.44, 123.41 and 133.72 mm·s−1). The optimised values for cell size and linear cell speed were found to be 11.90 mm and 99.46 mm·s−1, respectively. A light dependent resistor (LDR) and light emitting diode (LED)-based seed flow sensing system was developed to measure the lag time of seed flow from the seed metering box to the bottom of the seed tube. The average lag time of seed fall was observed to be 251.2 ± 5.39 ms at the optimised linear cell speed of 99.46 mm·s−1 and a forward speed of 2 km·h−1. This lag was compensated by advancing the seed drop on the basis of the tractor's forward speed, the lag time, and the targeted position. A check row quality index (ICRQ) was developed to evaluate the check row planter. While evaluating the developed system at different forward speeds (i.e., 2, 3 and 5 km·h−1), a higher standard deviation (14.14%) of the check row quality index was observed at a forward speed of 5 km·h−1.
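The lag compensation described — advancing the seed drop according to forward speed and the measured fall lag — is plain kinematics: the advance distance is ground speed times lag time. A minimal sketch of the unit conversion (not the authors' controller):

```python
def drop_advance_mm(forward_speed_kmh, lag_time_ms):
    """Distance (mm) by which the seed release must be advanced so the
    seed still lands on its target despite the fall lag in the seed tube."""
    v_mm_per_s = forward_speed_kmh * 1e6 / 3600.0  # km/h -> mm/s
    return v_mm_per_s * lag_time_ms / 1000.0       # ms -> s
```

With the reported 251.2 ms lag at 2 km·h−1, the release point must lead the target by roughly 140 mm.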
APA, Harvard, Vancouver, ISO, and other styles
42

Gai, Jingyao, Lirong Xiang, and Lie Tang. "Using a depth camera for crop row detection and mapping for under-canopy navigation of agricultural robotic vehicle." Computers and Electronics in Agriculture 188 (September 2021): 106301. http://dx.doi.org/10.1016/j.compag.2021.106301.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Cruz Ulloa, Christyan, Anne Krus, Antonio Barrientos, Jaime Del Cerro, and Constantino Valero. "Robotic Fertilisation Using Localisation Systems Based on Point Clouds in Strip-Cropping Fields." Agronomy 11, no. 1 (December 23, 2020): 11. http://dx.doi.org/10.3390/agronomy11010011.

Full text
Abstract:
The use of robotic systems in organic farming has taken on a leading role in recent years; the Sureveg CORE Organic Cofund ERA-Net project seeks to evaluate the benefits of strip-cropping to produce organic vegetables. This includes, among other objectives, the development of a robotic tool that facilitates the automation of the fertilisation process, allowing individual treatment at the plant level. In organic production, the slower nutrient release of the fertilisers used poses additional difficulties, as a tardy detection of deficiencies can no longer be corrected. To improve the detection, as well as counter the additional labour stemming from the strip-cropping configuration, an integrated robotic tool is proposed to detect individual crop deficiencies and react on a single-crop basis. For the development of this proof-of-concept, one of the main objectives of this work is implementing a robust localisation method within the vegetative environment based on point clouds, through the generation of general point cloud maps (G-PC) and local point cloud maps (L-PC) of a crop row. The plants' geometric characteristics were extracted from the G-PC as a framework in which the robot's positioning is defined. Through the processing of real-time lidar data, the L-PC is then defined and compared to the predefined reference system previously deduced. Both subsystems are integrated with ROS (Robot Operating System), alongside motion planning, and an inverse kinematics CCD (Cyclic Coordinate Descent) solver, among others. Tests were performed using a simulated environment of the crop row developed in Gazebo, followed by actual measurements in a strip-cropping field. During real-time data acquisition, the localisation error is reduced from 13 mm to 11 mm within the first 120 cm of measurement. The real-time geometric characteristics encountered were found to coincide with those in the G-PC to an extent of 98.6%.
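Comparing the local point cloud (L-PC) against the global map (G-PC) implies some point-to-point distance score. As a hypothetical illustration only (the paper's actual matching pipeline is richer), a brute-force mean nearest-neighbour distance:

```python
import numpy as np

def mean_nn_distance(local_pc, global_pc):
    """Mean distance from each L-PC point to its nearest G-PC point:
    a crude score of how well the local map fits the global one.
    O(N*M) memory, so suitable only for small crop-row sections."""
    d = np.linalg.norm(local_pc[:, None, :] - global_pc[None, :, :], axis=2)
    return d.min(axis=1).mean()
```

In practice a k-d tree (e.g. in a ICP-style registration) replaces the brute-force pairwise matrix for full-row clouds.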
APA, Harvard, Vancouver, ISO, and other styles
44

Faccini, Nadia, Stefano Delbono, Arzu Çelik Oğuz, Luigi Cattivelli, Giampiero Valè, and Alessandro Tondelli. "Resistance of European Spring 2-Row Barley Cultivars to Pyrenophora graminea and Detection of Associated Loci." Agronomy 11, no. 2 (February 20, 2021): 374. http://dx.doi.org/10.3390/agronomy11020374.

Full text
Abstract:
Pyrenophora graminea is the seed-borne pathogen that causes barley leaf stripe disease. In this work, we screened a collection of 206 spring two-row barley cultivars from Europe for their resistance to the fungal pathogen. Artificial inoculation with the highly virulent isolate Dg2 revealed continuous variation in the incidence of infection, with few highly resistant or highly susceptible genotypes. On average, old cultivars showed higher resistance than the more modern ones. A genome-wide association scan was performed by exploiting available molecular data for >4000 SNP markers and revealed a single, highly significant association on the short arm of chromosome 6H, in a genomic position where quantitative trait loci (QTL) for barley resistance to P. graminea had not been detected before. Based on the latest version of the reference barley genome, genes encoding proteins with a kinase domain were suggested as candidates for the locus.
APA, Harvard, Vancouver, ISO, and other styles
45

Van Evert, Frits K., Gerie W. A. M. Van Der Heijden, Lambertus A. P. Lotz, Gerrit Polder, Arjan Lamaker, Arjan De Jong, Marjolijn C. Kuyper, Eltje J. K. Groendijk, Jacques J. Neeteson, and Ton Van Der Zalm. "A Mobile Field Robot with Vision-Based Detection of Volunteer Potato Plants in a Corn Crop." Weed Technology 20, no. 4 (December 2006): 853–61. http://dx.doi.org/10.1614/wt-05-132.1.

Full text
Abstract:
Volunteer potato is a perennial weed that is difficult to control in crop rotations. It was our objective to build a small, low-cost robot capable of detecting volunteer potato plants in a cornfield and thus demonstrate the potential for automatic control of this weed. We used an electric toy truck as the basis for our robot. We developed a fast row-recognition algorithm based on the Hough transform and implemented it using a webcam. We developed an algorithm that detects the presence of a potato plant based on a combination of size, shape, and color of the green elements in an image and implemented it using a second webcam. The robot was able to detect potatoes while navigating autonomously through experimental and commercial cornfields. In a first experiment, 319 out of 324 images were correctly classified (98.5%) as showing, or not showing, a potato plant. In a second experiment, 126 out of 141 images were correctly classified (89.4%). Detection of a potato plant resulted in an acoustic signal, but future robots may be fitted with weed control equipment, or they may use a global positioning system to map the presence of weed plants so that regular equipment can be used for control.
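The detection of "green elements" described above rests on a vegetation segmentation step. The paper does not specify its colour index; a common choice in this literature (used here purely as an assumption for illustration) is the excess-green index ExG = 2G − R − B:

```python
import numpy as np

def excess_green_mask(rgb, thresh=20):
    """Segment green vegetation with the excess-green index (2G - R - B);
    the threshold is an illustrative value, not from the paper."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    return (2 * g - r - b) > thresh
```

The size, shape, and colour features the robot classifies would then be computed on the connected components of such a mask.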
APA, Harvard, Vancouver, ISO, and other styles
46

Li, Dongfang, Boliao Li, Sifang Long, Huaiqu Feng, Te Xi, Shuo Kang, and Jun Wang. "Rice seedling row detection based on morphological anchor points of rice stems." Biosystems Engineering 226 (February 2023): 71–85. http://dx.doi.org/10.1016/j.biosystemseng.2022.12.012.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Hudelson, B. D., M. K. Clayton, K. P. Smith, and C. D. Upper. "Detection and Description of Spatial Patterns of Bacterial Brown Spot of Snap Beans Using Cyclic Samples." Phytopathology® 87, no. 1 (January 1997): 33–41. http://dx.doi.org/10.1094/phyto.1997.87.1.33.

Full text
Abstract:
Snap bean plants within seven row segments that ranged from 65 to 147 m were sampled, using a cyclic sampling plan. In the cyclic sampling plan, only 6 of every 31 plants were sampled, but sampled plants were spaced such that pairs of plants that were 1, 2, 3, 4,…, 1,525 plants apart could be identified within each sample. Every leaflet on every sampled plant was assessed for bacterial brown spot, and the proportion of diseased leaflets per plant was determined. Arcsine square-root-transformed disease incidence values were analyzed for spatial patterns by autocorrelation and spectral analyses. Disease patterns were detected at several different scales within a single snap bean row, at distances that ranged from ~20 to ~100 m. Approximately 23 to 53% of the disease variability in the samples could be described by sine and cosine curves, indicating a substantial component of regularity in the disease patterns. Possible origins for these regular patterns, including cultural practices and seed infestation, are discussed.
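The two analysis ingredients named — the arcsine square-root transform of incidence proportions and autocorrelation of the resulting spatial series — can be sketched as follows (a generic illustration, not the paper's full spectral analysis):

```python
import numpy as np

def asin_sqrt(p):
    """Arcsine square-root (variance-stabilising) transform of
    disease incidence proportions in [0, 1]."""
    return np.arcsin(np.sqrt(np.asarray(p, dtype=float)))

def autocorr(x, lag):
    """Sample autocorrelation of a spatial series at a given plant lag;
    peaks at lag L indicate a pattern repeating every L plants."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return (x[:-lag] * x[lag:]).sum() / (x * x).sum()
```

A regular along-row pattern shows up as positive autocorrelation at its period and negative autocorrelation at half the period, which is what the sine/cosine fits in the paper quantify.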
APA, Harvard, Vancouver, ISO, and other styles
48

Torres-Sánchez, Jorge, Francisco Javier Mesas-Carrascosa, Francisco M. Jiménez-Brenes, Ana I. de Castro, and Francisca López-Granados. "Early Detection of Broad-Leaved and Grass Weeds in Wide Row Crops Using Artificial Neural Networks and UAV Imagery." Agronomy 11, no. 4 (April 12, 2021): 749. http://dx.doi.org/10.3390/agronomy11040749.

Full text
Abstract:
Significant advances in weed mapping from unmanned aerial platforms have been achieved in recent years. The detection of weed location has made possible the generation of site specific weed treatments to reduce the use of herbicides according to weed cover maps. However, the characterization of weed infestations should not be limited to the location of weed stands, but should also be able to distinguish the types of weeds to allow the best possible choice of herbicide treatment to be applied. A first step in this direction should be the discrimination between broad-leaved (dicotyledonous) and grass (monocotyledonous) weeds. Considering the advances in weed detection based on images acquired by unmanned aerial vehicles, and the ability of neural networks to solve hard classification problems in remote sensing, these technologies have been merged in this study with the aim of exploring their potential for broadleaf and grass weed detection in wide-row herbaceous crops such as sunflower and cotton. Overall accuracies of around 80% were obtained in both crops, with user accuracy for broad-leaved and grass weeds around 75% and 65%, respectively. These results confirm the potential of the presented combination of technologies for improving the characterization of different weed infestations, which would allow the generation of timely and adequate herbicide treatment maps according to groups of weeds.
APA, Harvard, Vancouver, ISO, and other styles
49

Su, Daobilige, Yongliang Qiao, He Kong, and Salah Sukkarieh. "Real time detection of inter-row ryegrass in wheat farms using deep learning." Biosystems Engineering 204 (April 2021): 198–211. http://dx.doi.org/10.1016/j.biosystemseng.2021.01.019.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Nardon, Gustavo F., and Guido F. Botta. "Prospective study of the technology for evaluating and measuring in-row seed spacing for precision planting: A review." Spanish Journal of Agricultural Research 20, no. 4 (October 10, 2022): e02R01. http://dx.doi.org/10.5424/sjar/2022204-19269.

Full text
Abstract:
Corn is the most cultivated and consumed cereal in the world. The overall objective of this review was to study the methodologies to measure and evaluate the in-row seed spacing for precision planting as well as to determine the technological alternatives that would allow obtaining information about seed mapping for corn crop planting in precision agriculture applications. As a conceptual synthesis about the electronic measurement system, there are two strategies for determining in-row seed spacing in the precision planting. Indirect methods correspond to the measurement before the seeds reach the furrow, while direct methods correspond to the measurement with the seeds placed in the furrow. The indirect measurement strategy is the most widely used in research publications and commercial planter monitors. Within this method, the seed spacing measurement systems use optical or radio wave type seed sensors. Corn seed counting accuracy through electronic measurement systems with optical-type seed sensor is at least 96%. The microwave seed sensor is used commercially by a few companies whose technologies are patented. The direct measurement strategy is under development and requires further research. The main limitation of these technologies is the seed detection in the furrow, which limits the planter travel speed and the equipment cost. The conceptual proposal for the term ‘seed mapping’ is to provide integrated and geo-referenced information on in-row seed spacing and depth for precision planting.
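Under the indirect strategy described above, in-row seed spacing follows from seed-sensor pulse timing and planter ground speed. A hypothetical minimal sketch:

```python
def in_row_spacings_mm(pulse_times_s, ground_speed_mm_s):
    """Indirect in-row seed spacing estimate: time gaps between successive
    seed-sensor pulses multiplied by the planter's ground speed."""
    return [(b - a) * ground_speed_mm_s
            for a, b in zip(pulse_times_s, pulse_times_s[1:])]
```

Direct methods would instead measure the seeds already placed in the furrow, avoiding the assumption that speed is constant between pulses.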
APA, Harvard, Vancouver, ISO, and other styles