
Old MacDonald Had a Robot: The Future of Automated Agriculture

By Mara Johnson-Groh 

At farms across America, you might notice something new in the fields: robots. Not unlike Roombas, these robots are automating some of the most tedious tasks in agriculture. Some robots till land, others carry crops, and some power entirely autonomous tractors. A product called the Model B Smart Sprayer—a 19-foot-long machine that rides on the back of tractors—can identify, target, and spray up to 700,000 weeds per hour with chemical or organic herbicides, with submillimeter precision. It simultaneously collects valuable information on how each plant is growing over the season.

This is not your grandparents’ farming.

These farming robots are built by Verdant Robotics, one of many companies in a young and burgeoning sector leveraging cutting-edge automation techniques to revolutionize agriculture. In 2022, venture investors put nearly a billion dollars into ag-tech start-ups.

This investment comes at a pivotal time. Farming has never been easy, yet now more than ever, it’s a demanding job. Costs are skyrocketing. The climate is becoming harsher and more unpredictable. In many places, there is also an ongoing labor shortage for a career that is known to be more punishing than profitable. But new technologies are trying to alter that trajectory.

From weeding to detecting disease and harvesting crops, robots are helping improve yields, reduce labor requirements, and lessen environmental impact. This adoption of technology is playing a necessary role in ushering farming into a coming era in which new sustainable practices will be needed to feed the globe without further environmental damage. By 2050, the global population is expected to reach 9.7 billion people, and crop yields are simultaneously expected to see a climate change-driven decline, according to scientific models.

At the heart of this robot revolution are camera systems that will serve as the eyes of agriculture. Optics has already had a place at the agricultural table, from designing better greenhouses to improving artificial LED lighting sources. Now, paired with image analysis software, optical technologies are helping farmers see better and see more.

“Computer vision is going to fundamentally change agriculture,” says Curtis Garner, co-founder and chief commercial officer for Verdant Robotics.

The Smart Sprayer’s success comes from multiple high-resolution cameras paired with an intricate computational program that combines neural networks, video processing, and complex algorithms to determine plant locations and qualities on the go as the robot bounces down dusty rows of crops. The machine not only automates the work of many farmhands but also tailors water, fertilizer, herbicide, and other inputs to the needs of each plant.

“Historically, we farmed on a whole field basis. We would apply everything uniformly across the field—water, fertilizer, herbicide, pesticides,” says Alex Thomasson, a professor of agricultural and biological engineering as well as director of the Agricultural Autonomy Institute at Mississippi State University. “With automation comes the ability to be more precise.”

Today, farmers can use systems like Verdant Robotics’ to apply water and chemicals only where needed. This not only optimizes costs for the farmer but also improves environmental stewardship. Verdant Robotics says its Smart Sprayer uses 96 percent fewer chemicals than standard methods and can reduce the cost of farming an acre from $3,000 to just $30.

“Fundamentally, it’s about doing more with less,” says Gabe Sibley, Verdant Robotics CEO and co-founder. “The amount of efficiency that can be gained with the technology is actually huge.”

The role of cameras in replacing the eyes of farmers dates to the 1920s, when pilots realized airplane flyovers could help detect disease in cotton fields. Today, cameras in agriculture run the gamut from off-the-shelf units to custom-designed models costing thousands of dollars.

The most common, and the ones employed by the Smart Sprayer, are RGB cameras—the same type used in phones, which combine red, green, and blue channels to make full-color images. Their ubiquity means the cameras are relatively cheap and accessible. As a result, many scientists are researching their use in a variety of agricultural automation tasks.

At Wageningen University in the Netherlands, Gerrit Polder works on machine-vision and robotics projects in agriculture. He and a team recently developed a disease detection robot for vineyards. It’s a coffee-table-sized wheeled box that follows behind a tractor and uses RGB cameras and computer vision software to detect disease in grape plants at the earliest stages.

While RGB cameras can work well for disease detection in plants, researchers have known since the 1930s that other wavelengths are even more useful for measuring plant wellbeing. For example, wavelengths from the red edge of visible light to the near infrared are important for calculating vegetation indexes, a proxy for plant health. Similarly, shortwave infrared wavelengths are useful for probing a plant’s water content, which can reveal the level of crop stress.
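
The most widely used of these indexes, the normalized difference vegetation index (NDVI), is simply a normalized ratio of near-infrared to red reflectance: healthy leaves reflect strongly in the near infrared and absorb red light. Below is a minimal sketch of the calculation, assuming reflectance values have already been extracted from the red and near-infrared bands; the example pixel values are made up for illustration.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index, (NIR - Red) / (NIR + Red).

    Values near 1 suggest vigorous vegetation; values near 0 suggest bare
    soil or stressed plants.
    """
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    # Guard against division by zero on pixels where both bands are dark.
    denom = np.where((nir + red) == 0, 1e-9, nir + red)
    return (nir - red) / denom

# Illustrative 2x2 patches of reflectance (0 to 1) from a red and a NIR band.
red_band = np.array([[0.05, 0.40], [0.06, 0.35]])
nir_band = np.array([[0.60, 0.45], [0.55, 0.40]])
print(ndvi(red_band, nir_band))  # higher values -> likely healthier plants
```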

In his work to develop a disease detector for grapes, Polder also tested multispectral cameras, which combine several specific wavelength ranges that include visible light and beyond. While these cameras can detect diseases earlier, they are also more challenging to work with.

That’s because, for a camera to detect a disease or a weed, it must know what it’s looking at. This is typically achieved with artificial intelligence or neural network software trained on a large dataset of images to tell the computer what is and isn’t a disease or a weed. Training datasets for ubiquitous RGB cameras are easy to come by; those for multispectral cameras, however, are not. This means that, contrary to expectations, the resulting software for multispectral cameras can actually be worse at detecting disease because it is trained on smaller datasets.
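
As a rough sketch of what that training looks like in practice, the snippet below fits a small convolutional network to a folder of labeled RGB leaf photos. The directory layout, class names, and hyperparameters are illustrative assumptions, not any company’s or lab’s actual pipeline.

```python
# Illustrative only: train a tiny classifier on RGB crop images labeled by folder,
# e.g. leaves/train/healthy/*.jpg and leaves/train/diseased/*.jpg (hypothetical paths).
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("leaves/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Two convolution/pooling stages, then a linear layer over the flattened features.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, len(train_set.classes)),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

The same recipe applies to multispectral imagery in principle; the practical difference Polder describes is simply that far fewer labeled multispectral images exist to feed such a training loop.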

“In the end we decided to switch back to just [using RGB cameras] because the neural networks work so well for them,” Polder says. Additionally, multispectral cameras remain prohibitively expensive at around $10,000 apiece, whereas RGB cameras cost hundreds of dollars.

“I think in the future this will change,” Polder says, citing the rapid development of photonics technologies in recent years. “More multispectral data will help develop deep learning networks as well.”

Other researchers are testing the limits of hyperspectral imaging, which is a step up from multispectral imaging and covers hundreds of even narrower wavelength ranges. However, because these cameras cost $75,000 to $220,000 apiece, only a limited amount of research has been done on what they can achieve, though prices are starting to drop.

“Hyperspectral imaging is not a new technology. It has been around for 30 years or so,” says Bing Lu, an assistant professor of geography at Simon Fraser University, who studies the use of hyperspectral imaging in agriculture. “But the sensor price is quite high, so it’s not even widely studied in academia and less in the industry to support farm practice.”

Hyperspectral cameras are more commonly flown on satellites to image large areas. That doesn’t help the farmer who is interested in mapping their fields at a per-crop level. Lu and others have completed studies, however, that show hyperspectral cameras could work well when flown with drones or used from ground-based platforms that pass over single fields. The cameras were able to identify weeds and plants’ nutrient levels, such as nitrogen, phosphorus, and potassium. While promising, it could be a while before they’re widely used.

“The hyperspectral and multispectral [cameras] have yet to be successfully commercially deployed into agriculture,” Garner says. But when they are, “There’s going to be a lot of value derived in that color space for early disease detection [and] pest-damage quantification.”

Hyperspectral cameras are also enhancing the building blocks of farming—the plants themselves—by revolutionizing plant phenomics, the study of a plant’s visible traits, which depend on the organism’s environment as well as its genes. Many researchers are studying crop phenotypes, for example, to design plants that can better withstand droughts, floods, and the weather challenges associated with climate change.

