Since April 2017 we’ve been hosting a Field Connect weather station at the MicroFarm.
The station offers a set of weather readings comparable to our Plant and Food HortPlus weather station. The main advantage to us is easy access to (nearly) current conditions, as the Field Connect station updates regularly during the day.
Being web-based we can view the data from anywhere, anytime. This has been helpful in checking wind conditions when irrigation or spraying is due and for our records after spray applications.
The online dashboard is easily customised, selecting the date range and sensors reported with a few clicks. That lets us compare soil moisture, PET and rainfall for example as shown below.
Over winter the station has been monitoring soil moisture in our access strip between cropped areas, but we can shift the sensor into crops to monitor those as we want.
This season we are linking with regional agronomists and farmers in Pukekohe/Pukekawa, Hawke’s Bay and Canterbury to further test the Management Action Zone tools.
One of the Pukekawa crops is most advanced and our mapping there starts very soon. Our aim is to map crops at 3 leaf stage, use that to identify canopy zones and take photos in each zone for detailed analysis.
The ground cover calculated from the photos is combined with plant population counts and run through a crop development model on SmartFarm.co.nz to predict final yield in each zone. The model also identifies whether population or plant growth rate is causing lower than expected development, and therefore yield.
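The idea can be sketched in a few lines. This is an illustrative assumption only, not the actual SmartFarm.co.nz model: it scales potential yield by the more limiting of canopy development and plant population, each expressed as a ratio of measured to expected values.

```python
def predict_zone_yield(ground_cover, plants_m2,
                       expected_cover, expected_plants_m2,
                       potential_t_ha):
    """Scale potential yield by the more limiting of canopy development
    and establishment (a simple linear assumption for illustration)."""
    development = min(ground_cover / expected_cover, 1.0)      # canopy growth
    establishment = min(plants_m2 / expected_plants_m2, 1.0)   # population
    return potential_t_ha * min(development, establishment)

# Example: 22% cover against an expected 30%, near-full population,
# 40 t/ha potential -- growth rate, not population, limits this zone.
print(predict_zone_yield(0.22, 48, 0.30, 50, 40))
```

Comparing the two ratios is what tells you whether establishment or growth rate is the cause of the shortfall.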
This field is almost ready for the first mapping exercise. We are somewhat nervous because the weeds can cause errors: green is green! Sometimes we can filter the weeds out of the images, but if there is very little difference between weed and onion, it is not yet possible.
The MicroFarm crop has started to emerge in recent days. We are keeping an eye on this to see what impact emergence has later.
Few activities are more tied to location and the geospatial landscape than agriculture. Agricultural businesses, research and policy makers rely on quantitative data about soils, water, weather, inputs, productivity, outputs, and markets. This summit will tackle the big question on big data for agriculture in New Zealand and globally: how do we make it really work for farmers, policy-makers, markets and consumers?
Presentations and workshops will cover:
Environmental Data and Information
The Internet of Things and new sensor technologies
Applications and mobile
Privacy, security and protections
Maps and models – current and future
Collaborations and standards in action
Join international geospatial experts along with local innovators in Palmerston North for this one day Summit.
Dan Bloomer attended the 20th Symposium on Precision Agriculture in Sydney.
The PA Symposium brings together farmers, growers, researchers, advisors and industry to discuss and absorb developments. Speakers covered cutting edge research, on-farm application by researchers, advisors and farmers, and industry background information such as the state of telecommunications and data ownership.
As Brett Whelan told delegates, “The purpose of precision agriculture has always been to increase the number of correct decisions made in the businesses of crop and animal management. It is a logical step in the evolution of agricultural management systems toward increased efficiency of inputs relative to production, minimized waste and improved product quality, traceability and marketability.”
Crop and soil sensing continues to develop, and there is increasing use of new approaches. Canopy assessment has relied heavily on NDVI, the 1970s vegetation index chosen for distinguishing forest from desert and ocean. In recent years a wider range of sensors capturing more light bands (blue, green, red and infrared) have become affordable and available. Some look at red-edge and thermal infra-red, two bands often related to crop stress of some form. Off the shelf cameras that fit simple UAVs are within farm budgets now.
Ian Yule described research with hyperspectral sensors that capture very detailed images with hundreds of light bands. Hundreds of ground control samples provide “real” information and enormous amounts of data get analysed to identify relationships. The capacity of this to determine species, plant nutrient status and other useful information is remarkable. The current research equipment and processing is very expensive, but prices should drop as commercialisation progresses.
Machine vision including object shape, texture and colour is being used to recognise individual objects such as plants, parts of plants or specific weeds. Discussing robotics research to guide decision making on vegetable farms Zhe Xu noted, “If a human can recognise something, a machine can be taught to as well.” Get used to artificial intelligence, neural programming and autonomous phenotyping!
We presented our own onions research, which is using smartphone cameras to capture very useful crop development information quickly and cost effectively. Combined with crop models and web-based calculation, we can predict final yields with fair accuracy early enough to support crop management decisions.
An Australian vegetable research project is using similar approaches to support decision making in carrot crops and investigating others with promise. That team includes researchers and farmers, and is increasingly using yield monitors for crops such as potatoes and carrots. Converting yield data to value allows farmers to estimate costs of variability and how much to invest to fix problem areas.
Data capture, communications and analysis was a key theme. Kim Bryceson described the establishment of a sensor network and analytics using IoT (internet of things) tools at the University of Queensland's Gatton campus. Rob Bramley explained a process that predicted sugar yields at regional scale to promote better fertiliser management in that industry. Patrick Filippi presented a “big data” approach to predicting grain yield.
The data revolution is changing our world in ways we can’t yet imagine. The increasing number of things measured, the spatial scale and time span of collection, and the development of data science to analyse huge streams of information are revolutionising our understanding. These are exciting times. Some jobs are going to go, but others will be created as we require completely new skills for jobs not heard of a decade ago.
“We are all in the position of making decisions from a limited understanding or a particular perspective, working with biological systems that are incredibly complex and impossible to fully understand,” said Ian Yule. “Recent experience with new sensing technologies and data processing has produced new information that challenges our preconceived ideas and understandings,” he said.
The PA Symposium is presented by SPAA, the Society for Precision Agriculture Australia, and the Precision Agriculture Laboratory at the University of Sydney. There has always been a New Zealand presence because while some details are unique, the tools and processes are for the most part generic.
In 2017 our 15th Annual Conference focuses on automated tools for data collection, decision making and doing actual tasks on the farm (and beyond).
What do you want?
What’s on offer?
How will farms and management have to change?
We have a comprehensive programme. We’ve gone a bit outside the box to bring variety, including speakers from outside the horticultural and arable sectors. We find cross-pollination and hybrid vigour valuable!
So register, come along and listen to excellent presenters, discuss the ideas with colleagues and go away with new understanding and plans.
GrowMaps principal Luke Posthuma completed the survey, and says his observations as the survey progressed suggest there is a reasonable spread of pH across our relatively small area.
As well as Veris sampling, Luke took a number of soil samples for verification and calibration checks.
The Veris equipment also maps soil electrical conductivity (EC) down to 60cm. Soil EC is a measurement of how much electrical current soil can conduct. It is often an effective way to map soil texture because smaller soil particles such as clay conduct more current than larger silt and sand particles.
Part of the Veris pH mapping is post-survey processing to create the most reliable result. We await the processed maps with considerable interest.
We previously had a similar soil conductivity map provided by AgriOptics and it will be interesting to compare the results.
Now in year two of our OnionsNZ SFF project, we have trials at the MicroFarm and monitoring sites at three commercial farms in Hawke’s Bay and three more in Pukekohe.
A summary of Year 1 is on our website. A key aspect was testing a range of sensors and camera systems for assessing crop size and variability. Because onions are like needles poking from the ground, all sensors struggled especially when plants were small. This is when we want to know about the developing crop, as it is the time we make decisions and apply management.
By November our sensing was more satisfactory. At this stage we captured satellite, UAV, smartphone and GreenSeeker data and created a series of maps.
We used the satellite image to create canopy maps and identify zones. We sampled within the zones at harvest, and used the relationship between November canopy and February yield to create yield maps and profit maps.
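In outline, this kind of canopy-to-yield mapping is a fitted regression applied field-wide. The sketch below uses made-up zone numbers for illustration; it does not reproduce the actual fitted relationship from the trial.

```python
import numpy as np

# Illustrative zone means: November canopy cover (fraction) and
# measured February yield (t/ha) from harvest sampling.
cover = np.array([0.18, 0.24, 0.30, 0.35])
yield_t_ha = np.array([22.0, 31.0, 38.0, 44.0])

# Fit a simple linear relationship between cover and yield.
slope, intercept = np.polyfit(cover, yield_t_ha, 1)

def yield_map(canopy_map):
    """Apply the fitted relationship across a whole-field canopy array."""
    return slope * np.asarray(canopy_map) + intercept

# Predicted yields for three unsampled locations.
print(yield_map([0.20, 0.28, 0.33]))
```

A profit map follows the same pattern: multiply predicted yield by price and subtract per-hectare costs.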
We also developed relationships between photographs of ground cover, laboratory measurements of fresh weight and leaf area and the final crop yield.
In reviewing the season’s worth of MicroFarm plot measurements, we noticed there were areas where yield reached its potential, areas where yield was limited by population (establishment), some where yield was limited by canopy growth (development), and some by both population and development.
This observation helped us form a concept of Management Action Zones, based on population and canopy development assessments.
Our aims for Year 2 are on the website. We set out to confirm the relationships we found in Year 1.
This required developing population expectations and determining estimates of canopy development as the season progressed, against which field measurement could be compared.
We had to select our “zones” before the crop got established, as we did a lot of baseline testing of the soil. So our zones were chosen based on paddock history and a fair bit of guesswork. Really, we need to be able to identify zones within an establishing or developing crop, then determine what is going on so we can try to fix it as quickly as possible.
In previous seasons we experimented with smartphone cameras and image processing to assess canopy size and relate that to final yields. We are very pleased that photographs of sampling plots processed using the “Canopeo” app compare very well with Leaf Area Index again this season.
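Apps like Canopeo work by classifying each pixel as green canopy or background and reporting the green fraction. The sketch below shows the general approach using colour-band ratios and an excess-green test; the thresholds here are illustrative assumptions, not Canopeo's actual values.

```python
import numpy as np

def canopy_cover_fraction(rgb):
    """Fraction of pixels classified as green canopy in an RGB image array.

    A pixel counts as canopy when green dominates both red and blue
    (band ratios below 0.95) and excess green (2G - R - B) is high.
    """
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-9  # avoid division by zero on black pixels
    green = (r / (g + eps) < 0.95) & (b / (g + eps) < 0.95) & (2 * g - r - b > 20)
    return green.mean()

# Tiny synthetic example: one green-leaf pixel, one soil-coloured pixel.
img = [[[40, 120, 30], [120, 100, 80]]]
print(canopy_cover_fraction(img))  # 0.5
```

Relating this fraction to Leaf Area Index is then an empirical calibration against laboratory leaf-area measurements, as described above.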
Through the season we tracked crop development in the plots, using plant counts and canopy cover assessments to try to separate the effects of population (establishment) from soil and other management factors.
We built a web calculator to do the maths, aiming for a tool any grower or agronomist can use to aid decision making. The web calculator was used to test our theories about yield prediction and management zones.
ASL Software updated the “CoverMap” smartphone application and we obtained consistent results from it. The app calculates canopy ground cover and logs data against GPS position in real time. Because we have confidence that ground cover from image processing is closely related to Leaf Area Index we are working to turn our maps into predictions of final yields.
The current season’s MicroFarm crop is certainly variable. Some is deliberate: we sat the irrigator over some areas after planting to simulate heavy rain events, and we have a poorly irrigated strip. We know some relates to different soil and cover crop histories.
But some differences are unexpected and so far reasons unexplained.
Together with Plant and Food Research we have been taking additional soil samples to try and uncover the causes of patchiness.
We’ve determined one factor is our artificial rain storm: some crop loss is probably due to runoff from it, and some is historic compaction. We’ve even identified where a shift in our GPS AB line has left 300mm strips of low production where plants are on last year’s wheel tracks!
But there is a long way to go before this tricky crop gives up its secrets.
A version of this article previously appeared in The Grower
Dan Bloomer has been travelling in Australia and Europe asking, “How ready are robots for farmers and how ready are farmers for robots?”
Notable areas of active research and development globally are scouting, weeding and fruit picking. Success requires machines that can determine and follow a route traversing whatever terrain it must, capture information, identify and selectively remove weeds, and identify, pick and transport fruit. They have to sense, analyse, plan and act.
Robotics is widespread in industries such as car manufacturing that have exactly the same task being repeated over and over again. With the possible exception of robotic milking, farm operations are not like that. Virtually every single case is unique, with unique responses needed.
Many groups around the world are looking at robotic weeding. There are many items needing attention. How do we tell weeds from crop plants? Can we do that fast enough and reliably enough to make a robot commercially viable on-farm? Once identified, how do we optimise robotic arm movement to best attack a patch of weeds?
A key theme for Rob Fitch and colleagues is Active Perception: perception being what we can detect with what accuracy and confidence; active meaning in real time and including planning actions. They invest heavily in developing mathematics to get fast results. And they are succeeding.
Using Intel’s RealSense structured light camera it takes them less than half a second to identify and precisely locate groups of apples on a trellis. Within that time they also calculate exactly where to place the camera to get a second confirming view.
Cheryl McCarthy and colleagues at the National Centre for Engineering in Agriculture (NCEA) are conducting a range of research projects that integrate autonomous sensing and control with on-farm operations to robotically manage inputs within a crop. Major projects include automation for weed spot spraying, adaptive control for irrigation optimisation, and remote crop surveillance using cameras and remotely piloted aircraft.
Now Cheryl is using UAVs to capture photos of crops, stitching the pictures to get a whole paddock image, then splitting it up again to efficiently identify and locate individual plants and weeds. This is enabling her to create accurate maps some other weed destroying robot can use.
SwarmFarm founders, Andrew and Jocie Bate grow cereals and pulses near Emerald. Spray-fallow is used to conserve water in this dryland environment and WeedSeeker® and Weedit® technologies reduce chemical use to a very small percentage of traditional broadcast application.
With large areas, most growers move to bigger machinery to maximise labour efficiency. This has a number of adverse effects including significant soil damage and inability to work small areas or work efficiently around obstacles such as trees.
SwarmFarm chose robots as practical lightweight equipment. They reason that several small machines working together reduce soil impact and have the same work rate as one big machine. Andrew estimates that adoption of 8 m booms versus 34 m booms could increase the effective croppable area in Queensland by 2%.
Are these robots ready for farmers? Are farmers ready for these robots?
Only SwarmFarm has multiple machines currently working on farm in Australia. They are finalising a user interface that will allow non-graduate engineers (smart farmers) to manage the machines.
The question that remains is, “Why would I buy a specialised machine when I can put a driver on a cheaper conventional tractor or higher work rate sprayer and achieve the same?”
Is it the same?
Travel to Australia was supported by a Trimble Foundation Study Grant
A desire to reduce soil compaction and avoid high and inefficient use of chemicals and energy inspired Steve Tanner and Aurelien Demaurex to found ecoRobotix in Switzerland.
Their solution is a light-weight fully solar-powered weeding robot, a 2 wheel drive machine with 2D camera vision and basic GPS. Two robotic arms position herbicide nozzles or a mechanical device for precision weed control.
The ecoRobotix design philosophy is simplicity and value: avoiding batteries cuts weight, technology requirements and slashes capital costs. It is a step towards their vision of cheap autonomous machines swarming around the farm.
Naio Technologies’ Oz440, a small French robot bought by small farms, is designed to mechanically weed between rows. The robots are left weeding while the farmer spends time on other jobs or serving customers. Larger machines for vegetable cropping and viticulture are in development.
Naio co-founder Gaetan Severac notes Oz440 has no GPS, relying instead on cameras and LiDAR range finders to identify rows and navigate. These are small machines with a total price similar to a conventional agricultural RTK-GPS system, so alternatives are essential.
Tech companies have responded and several “RTK-GPS” systems are now available under $US1000. Their accuracy and reliability are not known!
Broccoli is one of the world’s largest vegetable crops and is almost entirely manually harvested, which is costly. Leader Tom Duckett says robotic equipment being developed at the University of Lincoln in England is as good as human pickers at detecting broccoli heads of the right size, and a robot can pick through the night. With identification in hand, development is now on mechanical cutting and collecting.
In 1996, Tillett and Hague Technologies demonstrated an autonomous roving machine selectively spraying individual cabbages. Having done that, they determined that tractors were effective and concentrated on automating implements. They are experts in vision systems and integration with row and plant identification and machinery actuation, technology embedded in Garford row crop equipment.
Parrish Farms has their own project adapting a Garford mechanical weeder to strip-spray between onion rows. Nick Parrish explained that Black Grass control was difficult, and because available graminicides strip wax off onions, boom spraying prevents use of other products for up to two weeks.
Route planning to avoid hazards and known obstacles
Laser range finder to sense objects and define them as obstacles
Wide area safety curtain sensing ground objects at 2m
Dead man’s handle possibly via smartphone
Collapsible bumper as a physical soft barrier that activates Stop
Big Red Buttons anyone close can see and use to stop the machine
Machines that are small, slow and light minimise inertia
“Hands Free Hectare” is Harper Adams University’s attempt to grow a commercial crop using open source software and commercially available equipment in an area no-one enters.
Harper Adams research to develop a robotic strawberry harvester is notable for the integration of genetics for varieties with long stalks, a growing system that has plants off the ground, and the robotic technologies to identify, locate and assess the ripeness of individual berries and pick them touching only the peduncle (stalk).
So what have I learned about farm robotics?
People believe our food production systems have to change
Farm labour is in short supply throughout the western world
Machines can’t get bigger as the soil can’t support that
Robotics has huge potential, but when?
Safety is a key issue but manageable
There is huge investment in research at universities, but also in industry
It’s about rethinking the whole system not replacing the driver
There are many technologies available, but probably not the mix you want for your application.
After identifying areas within paddocks that had yields limited by different probable causes, we conceived the idea of Management Action Zones (MAZs).
Some areas showed that yield was limited by plant number: establishment was poor. Others had the expected population, but low biomass: the plants were small due to some other limiting factor.
If we can identify zones easily, and determine the causes, we should be able to target a management response accordingly. So for this season, we set out a revised research aim.
What we want to know:
Can we successfully determine a management action zone in a field?
Why do we need to know this?
Develop a tool to increase uniformity and yield outcomes
Develop a tool to evaluate management practices and crop productivity
If we want to successfully determine a management action zone in a field then there are two main steps to achieve in this year’s work:
Confirm the relationship between digital data and crop model parameters
Does the relationship stay constant over time and sites?
How early in growth can a difference be detected?
Can the relationship be used to show a growth map across a field?
Develop an approach to gather information and ways to input and display results, initially using a website approach.
Can we integrate a plant count and yield information to start developing a management action zone?
How should this be put together in a way growers can start to use to gather information about their crops?
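Putting the two measurements together, a Management Action Zone classification could look something like this. The thresholds and labels below are illustrative assumptions, not the project's actual values; the point is that comparing measured-to-expected ratios for population and canopy separates the two causes.

```python
def classify_zone(population_ratio, canopy_ratio, threshold=0.9):
    """Label a zone from its measured/expected population and canopy ratios.

    population_ratio: plant count relative to expectation (1.0 = on target)
    canopy_ratio:     canopy cover relative to expectation for the date
    """
    low_pop = population_ratio < threshold
    low_canopy = canopy_ratio < threshold
    if low_pop and low_canopy:
        return "limited by establishment and development"
    if low_pop:
        return "limited by establishment (population)"
    if low_canopy:
        return "limited by development (canopy growth)"
    return "on track for potential yield"

# Full population but slow canopy growth: look for soil or management causes.
print(classify_zone(0.95, 0.80))
```

Run over a grid of plant counts and canopy-cover map cells, the same function would produce the zone map a grower could act on.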
At the MicroFarm, we established six research zones based on paddock history and excessive wetness at establishment.
We have three paddock histories: two years of onion production with autumn cover crops of Caliente mustard, two years of onion production with autumn cover crops of oats, and no previous onion crops planted after previous summer sweetcorn and autumn sown rye grass. In each of these areas, we deliberately created sub-zones by applying about 45mm of spray irrigation as a “large rain event”.
The impact of the artificial rainstorm is evident on images taken at the end of November.