The Farmers Business Network (FBN) is building a platform that analyzes agricultural data from millions of acres of U.S. farmland, providing real-world results on the performance of various seed and crop strains as well as fertilizers.
As Datanami’s George Leopold reports, the FBN, which recently announced a $15 million investment round led by Google Ventures, says the network aims to tap into hard-won collective knowledge while making agricultural data more accessible. The goal? Larger crop yields at a lower cost.
“Based in the agricultural hub of Davenport, Iowa, and launched last November, the network claims to have aggregated data on the performance of 7 million acres of farmland spanning 17 states. The database also includes more than 500 seed varieties and a growing list of crops such as alfalfa, corn, wheat and soybeans,” writes Leopold.
“Along with vagaries of weather, farmers are generally forced to rely on university farm extension test plots or seed salesmen in deciding which seed types to plant. Hence, the platform’s ‘whole farm analysis’ feature attempts to pinpoint key factors like cumulative precipitation and planting temperature that affect individual crop yields. The service also includes a ‘seed finder’ function designed to sift through the acreage data to discover the optimum seed for specific farms.”
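A “seed finder” of this kind can be thought of as a simple ranking problem: filter historical acreage records down to conditions resembling a given farm, then rank seed varieties by observed yield. FBN has not published its algorithm, so the sketch below is purely illustrative; every record, field name and threshold here is an assumption.

```python
# Illustrative sketch of a "seed finder"-style ranking (not FBN's actual
# algorithm): filter historical field records to conditions similar to a
# target farm, then rank seed varieties by mean observed yield.
from statistics import mean

# Hypothetical records: (seed variety, cumulative precipitation in inches,
# planting temperature in degrees F, yield in bushels/acre).
records = [
    ("corn-A", 22, 55, 178), ("corn-A", 30, 60, 165),
    ("corn-B", 21, 54, 190), ("corn-B", 23, 56, 185),
    ("corn-C", 35, 62, 150), ("corn-C", 20, 53, 160),
]

def seed_finder(records, precip, temp, precip_tol=5, temp_tol=4):
    """Rank varieties by mean yield among records with similar conditions."""
    by_variety = {}
    for variety, p, t, yld in records:
        if abs(p - precip) <= precip_tol and abs(t - temp) <= temp_tol:
            by_variety.setdefault(variety, []).append(yld)
    return sorted(((mean(ys), v) for v, ys in by_variety.items()), reverse=True)

# For a farm expecting ~22 in. of rain and ~55 degrees F at planting:
ranking = seed_finder(records, precip=22, temp=55)
```

In this toy data, corn-B tops the ranking because its records fall inside the precipitation and temperature windows with the highest yields; a production system would of course use far richer features and models.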
According to Leopold, Big Data could play a more significant role in agriculture by helping to identify more drought-resistant seed strains as farmers are forced to cope with extreme weather conditions. However, says Leopold, there is no sure thing when it comes to farming.
“What the big agricultural data service could do, at least, is help narrow the odds against the next catastrophic crop failure,” he added.
Thousands of miles away, a local Indonesian startup known as Ci-Agriculture is promoting the notion of “precision farming” via the use of sensors, aerial imagery and Big Data analytics. As TechInAsia’s Nadine Freischlad reports, Ci-Agriculture began its first trial last year on a rice paddy outside of Jakarta at the foot of Mount Gede.
Image Credit: TechInAsia
“For about four months, we planted rice from the beginning to the harvest season. We experimented with drones and weather sensors,” Regina Rivani Andani, the agricultural scientist in charge of developing the Ci-Agriculture program, told TechInAsia. “During the planting season, we monitored the soil condition, created aerial photographs and collected data. We also learned about local farming practices, the supply chain, the social dynamics of the people there.”
Ci-Agriculture has already identified three products it wants to develop further, including Crop Accurate, which uses Big Data analytics to help farmers decide when to plant, fertilize and deploy pest control.
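Ci-Agriculture has not published Crop Accurate’s decision logic, but a common agronomic building block for timing planting and fertilization is growing degree day (GDD) accumulation, sketched below as a generic illustration (the base and cap temperatures shown are the standard values for corn, not anything specific to Crop Accurate).

```python
# Growing degree days (GDD) are a standard agronomic measure often used to
# time planting and fertilization. This is a generic illustration, not
# Ci-Agriculture's actual Crop Accurate logic.

def daily_gdd(t_max, t_min, base=50.0, cap=86.0):
    """Single-day GDD using the common base-50/cap-86 rule for corn (in F)."""
    t_max = min(t_max, cap)   # cap the daily high
    t_min = max(t_min, base)  # floor the daily low at the base temperature
    return max((t_max + t_min) / 2.0 - base, 0.0)

def days_to_threshold(daily_temps, threshold):
    """Days until cumulative GDD reaches a growth-stage threshold, else None."""
    total = 0.0
    for day, (t_max, t_min) in enumerate(daily_temps, start=1):
        total += daily_gdd(t_max, t_min)
        if total >= threshold:
            return day
    return None
```

Feeding a weather forecast into an accumulator like this is one simple way a data service can turn raw temperature streams into a concrete “plant now” or “fertilize in N days” recommendation.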
Of course, agricultural challenges aren’t limited to Indonesia or California. Indeed, the global population – growing at around 140 people per minute – is predicted to reach 8 billion by 2030, 9.1 billion by 2050 and possibly as high as 14 billion by 2100. Current projections, cited by Richard Eckard in The Age, indicate food production will have to jump between 60 and 80 per cent by 2050 to meet the needs of an increasing population.
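A 60 to 80 per cent rise by 2050 sounds steep, but spread over roughly 35 years it implies a compound annual increase of only about 1.4 to 1.7 per cent. A quick check (treating the window as 35 years is our assumption, not Eckard’s):

```python
# Quick check of what a 60-80% rise in food production by 2050 implies as a
# compound annual growth rate, assuming a ~35-year window (mid-2010s to 2050).

def required_annual_growth(total_increase, years):
    """CAGR needed to achieve a given total fractional increase."""
    return (1.0 + total_increase) ** (1.0 / years) - 1.0

low = required_annual_growth(0.60, 35)   # ~1.35% per year
high = required_annual_growth(0.80, 35)  # ~1.69% per year
```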
“Precision electronic technologies will both enable real-time and cost effective [agricultural] decision making, through linking these technologies with predictive models,” Eckard added. “Agriculture will also need a more coordinated framework for managing the masses of [Big] Data that will be collected by electronic technologies along the supply chain, to enable the effective use of this data.”
According to Jason Waxman, VP of Intel’s Data Center Group, the ability to bring the right data into the decision making process is essential for managing agricultural operations.
“Moore’s Law has brought dramatic advances in computing and memory technologies, increasing capability and affordability,” Waxman explained in an IQ Intel article. “As a result, the ability to store and analyze large amounts of information in real time is leading to breakthroughs in analytics across industries.”
Vin Sharma, director of strategy and business development for the cloud analytics division at Intel, points out that the Santa Clara-based company is already leveraging Big Data to help solve large-scale food security problems, starting with those in drought-stricken California.
“We’re using Big Data analytic solutions [that are] applied to significant problems,” Sharma told TechRepublic. “Without overstating it, it’s trying to solve problems of world-changing scope.”
More specifically, Intel is working with the University of California, Davis and the World Food Center on Big Data’s role in precision farming techniques.
“It makes it interesting from [a] Big Data analytics [perspective]. One particular stream of data when combined with other sources of data, so it’s not just soil moisture, nature, and composition combined with weather and climate,” Sharma added. “That combo of data sets is a better predictor of that irrigation operation, [whereas before] it was purely based on hindsight, or on guesswork.”
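The value of combining streams is easy to see in miniature: a rule based on soil moisture alone waters whenever the soil is dry, while folding in a rain forecast avoids irrigating just before rain arrives. The sketch below is our own minimal illustration of that point; every threshold in it is invented.

```python
# Minimal illustration of why combined data streams beat a single sensor for
# irrigation decisions: soil moisture alone triggers watering whenever the
# soil is dry, while adding a rain forecast skips irrigation ahead of
# expected rain. All thresholds are invented for illustration.

def irrigate_moisture_only(soil_moisture, dry_threshold=0.25):
    return soil_moisture < dry_threshold

def irrigate_combined(soil_moisture, forecast_rain_mm,
                      dry_threshold=0.25, rain_skip_mm=10.0):
    # Skip irrigation if enough rain is forecast, even when the soil is dry.
    return soil_moisture < dry_threshold and forecast_rain_mm < rain_skip_mm

# Dry soil (0.18 volumetric fraction) with 15 mm of rain forecast:
# the moisture-only rule says water; the combined rule says wait for the rain.
```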
To keep pace with these demands, Intel recently introduced its Xeon processor E7 v3 family, which targets the acceleration of real-time analytics on multi-terabyte and even petabyte-scale datasets.
As we’ve previously discussed on Rambus Press, the Xeon E7 v3 processors are designed to work in the current Brickland server platforms, which debuted with the Ivy Bridge-EX Xeon E7 v2 processors back in February 2014. It should be noted that the Brickland platform also features Intel’s Jordan Creek scalable memory buffer chip.
“This memory chip and controller combination now supports DDR4 main memory, which clocks a little higher and yet burns a bit less power compared to DDR3 memory,” The Platform’s Timothy Prickett Morgan explained. “The memory controllers support the same three DIMMs per channel, and so a single socket supports up to 96 memory sticks and up to 12 TB of memory capacity using 64 GB memory modules.”
Frank Ferro, senior director of product marketing at Rambus, told us that DDR4 memory delivers a 40-50 percent increase in bandwidth, along with a 35 percent reduction in power consumption, compared to the DDR3 memory currently found in servers.
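The bandwidth figure is consistent with simple peak-rate arithmetic: a DDR channel is 64 bits (8 bytes) wide, so DDR3-1600 peaks near 12.8 GB/s per channel while DDR4-2400 peaks near 19.2 GB/s. The speed grades below are typical examples we chose for illustration, and real sustained bandwidth falls short of these theoretical peaks.

```python
# Peak theoretical bandwidth of a 64-bit (8-byte) DDR channel, in GB/s.
# Speed grades are typical examples for server DDR3 vs. early DDR4;
# sustained bandwidth in practice is lower than these peaks.

def channel_bandwidth_gbs(megatransfers_per_sec, bus_bytes=8):
    return megatransfers_per_sec * 1e6 * bus_bytes / 1e9

ddr3 = channel_bandwidth_gbs(1600)  # DDR3-1600 -> 12.8 GB/s
ddr4 = channel_bandwidth_gbs(2400)  # DDR4-2400 -> 19.2 GB/s
increase = ddr4 / ddr3 - 1.0        # 0.5, i.e. a 50% peak-bandwidth gain
```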
“New Xeon processors, coupled with increased memory bandwidth and capacity, go a long way in accelerating real-time analytics for enormous datasets comprising petabytes of information,” he concluded.