Soybeans Are Ready for Their Closeup

Dec 12, 2022

By Randy Barrett

Photo credit: USDA Agricultural Research Service

Unhappy soybean plants wilt, so scientists are developing artificial intelligence to track that behavior in the field and save farmers time in making crop management decisions.

Experts at the USDA Agricultural Research Service (ARS) have spent the past few years training cameras on soybean, corn and cotton plants to judge their level of moisture stress. It’s a big jump from the age-old practice of bending down and inspecting individual plants, and the key to avoiding that exercise is artificial intelligence. In popular culture, AI spurs nightmares of a world ruled by machines, but the reality is far more humane, and helpful, for humans.

For a camera to recognize different stages of plant stress, it must first be trained on hundreds of images, each manually classified by a scientist on a scale of 0 to 5 for wilt. The AI program then references that repository to match in-situ plant behavior captured by the camera.

“The goal is to have cameras and AI detection operating on farms so growers can do long-term decision making,” says Anna Locke, a research plant physiologist at ARS. “It’s also a useful tool for soybean breeders who have to evaluate the stress level in thousands of plots.”

Right now, the system has an 80% accuracy rate. While that might seem suboptimal, Locke is pleased. “We think for practical purposes it’s pretty good.”
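The article doesn’t publish ARS’s evaluation code, but the 0-to-5 wilt scale and the 80% figure suggest a simple exact-match accuracy check against the scientists’ labels. A minimal sketch, using made-up labels and predictions purely for illustration:

```python
import numpy as np

# Hypothetical wilt scores assigned by a scientist (0 = no wilt, 5 = severe),
# and the classifier's predictions for the same ten images.
# These values are illustrative, not StressCam data.
manual_labels = np.array([0, 1, 2, 3, 4, 5, 2, 3, 1, 0])
predictions   = np.array([0, 1, 2, 3, 5, 5, 2, 2, 1, 0])

# Exact-match accuracy: the fraction of images where the model's
# wilt score agrees with the scientist's.
accuracy = float(np.mean(manual_labels == predictions))
print(f"accuracy = {accuracy:.0%}")  # 8 of 10 scores match, so 80%
```

In practice an off-by-one prediction (a 3 scored as a 4) may still be useful for irrigation decisions, which is one reason an 80% exact-match rate can be “pretty good” for practical purposes.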

The AI technology is early in development, but it offers a leap beyond the current use of soil sensors, says Steven Mirsky, a research ecologist at ARS’ Sustainable Agriculture Research Laboratory in Beltsville, Maryland. “They do a good job but are expensive, and they only measure a point in time.” He emphasizes that the plant, not the soil, is the best indicator of stress. The camera system, dubbed the StressCam, can watch plant behavior and trigger irrigation, as well as help breeders see which varieties are drought-resistant.

Mirsky co-leads the Precision Sustainable Agriculture (PSA) project, an interdisciplinary team of universities, government agencies, private businesses and farmers working to develop new technologies for improved decision making and long-term planning.

The StressCam system uses a low-cost RGB camera, which captures light in red, green and blue wavelengths to approximate human color vision. Both the hardware and the software for the StressCam have been made publicly available by PSA through GitHub. Mirsky wants private industry to pick up the open-source technology and run with it: “We’ve de-risked it. It’s a level playing field.”
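StressCam’s image-processing pipeline isn’t detailed here, but one common way plant-sensing systems exploit those red, green and blue channels is an excess-green index (2*G - R - B), which separates green canopy from soil so that only plant pixels are analyzed for wilt. A toy sketch with illustrative pixel values (not StressCam’s actual method):

```python
import numpy as np

# Toy 2x2 RGB "image" with channel values in [0, 1]: the top row mimics
# green vegetation, the bottom row mimics brownish soil.
img = np.array([
    [[0.2, 0.7, 0.1], [0.3, 0.8, 0.2]],   # vegetation-like pixels
    [[0.5, 0.4, 0.3], [0.6, 0.5, 0.4]],   # soil-like pixels
])

r, g, b = img[..., 0], img[..., 1], img[..., 2]
exg = 2 * g - r - b          # excess-green index for each pixel
canopy = exg > 0.2           # simple threshold flags vegetation pixels

print(canopy)                # top row True (plant), bottom row False (soil)
```

The threshold of 0.2 is an arbitrary choice for this example; real systems tune such values per camera and lighting conditions.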

Farmer Andrew Nelson raises wheat, lentils and peas in Farmington, Washington, and has been one of the first to use the StressCam. He says the system works, but the challenge has been connectivity to the camera. “Cell reception is terrible here,” Nelson says. Once this kind of technology is more robust, he wants to be able to do an online virtual drive-by of his whole farm every day.

In the future, Mirsky sees the creation of larger AI image repositories that could be used for a wide variety of crops. Multi-camera mounts could estimate biomass, and a camera mounted on a tractor could map a field in real time.