In 2017 our 15th Annual Conference focuses on automated tools for data collection, decision making and doing actual tasks on the farm (and beyond).
What do you want?
What’s on offer?
How will farms and management have to change?
We have a comprehensive programme. We’ve gone a bit outside the box to bring variety, including speakers from outside the horticultural and arable sectors. We find cross-pollination and hybrid vigour valuable!
So register, come along and listen to excellent presenters, discuss the ideas with colleagues and go away with new understanding and plans.
GrowMaps principal Luke Posthuma completed the survey, and says his observations as the survey progressed suggest there is a reasonable spread of pH across our relatively small area.
As well as Veris sampling, Luke took a number of soil samples for verification and calibration checks.
The Veris equipment also maps soil electrical conductivity (EC) down to 60cm. Soil EC is a measurement of how much electrical current soil can conduct. It is often an effective way to map soil texture because smaller soil particles such as clay conduct more current than larger silt and sand particles.
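As an illustration of the texture inference, EC readings can be binned into coarse texture classes. This is a minimal sketch with invented thresholds, not Veris calibration values; in practice the relationship is checked against the verification soil samples.

```python
# Illustrative only: bin soil EC readings (mS/m) into coarse texture classes.
# The thresholds below are invented for the example, not calibrated values.
def texture_class(ec_ms_per_m):
    if ec_ms_per_m < 10:
        return "sandy"       # large particles conduct less current
    elif ec_ms_per_m < 30:
        return "loamy"
    else:
        return "clayey"      # small particles conduct more current

readings = [4.2, 15.0, 38.5]
print([texture_class(ec) for ec in readings])  # ['sandy', 'loamy', 'clayey']
```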
Part of the Veris pH mapping is post-survey processing to create the most reliable result. We await the processed maps with considerable interest.
We previously had a similar soil conductivity map provided by AgriOptics and it will be interesting to compare the results.
Now in year two of our OnionsNZ SFF project, we have trials at the MicroFarm and monitoring sites at three commercial farms in Hawke’s Bay and three more in Pukekohe.
A summary of Year 1 is on our website. A key aspect was testing a range of sensors and camera systems for assessing crop size and variability. Because onions are like needles poking from the ground, all sensors struggled, especially when plants were small. Yet this is exactly when we want to know about the developing crop, as it is the time we make decisions and apply management.
By November our sensing was more satisfactory. At this stage we captured satellite, UAV, smartphone and GreenSeeker data and created a series of maps.
We used the satellite image to create canopy maps and identify zones. We sampled within the zones at harvest, and used the relationship between November canopy and February yield to create yield maps and profit maps.
We also developed relationships between photographs of ground cover, laboratory measurements of fresh weight and leaf area and the final crop yield.
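The canopy-to-yield step can be sketched as a simple least-squares fit: relate a canopy measurement taken in November to yield measured at harvest, then apply the fitted line across the canopy map. The plot numbers below are hypothetical, not our measured data.

```python
# Sketch: fit a linear relationship between November canopy cover (%)
# and February yield (t/ha), then apply it to new map cells.
# The sample numbers are hypothetical, for illustration only.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical paired plot samples: (canopy cover %, harvested yield t/ha)
cover = [20, 35, 50, 65, 80]
yield_t = [40, 55, 70, 85, 100]

a, b = fit_line(cover, yield_t)
predicted = [round(a + b * c, 1) for c in [30, 60]]  # estimates for new cells
print(predicted)  # [50.0, 80.0]
```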
In reviewing the season’s worth of MicroFarm plot measurements, we noticed there were areas where yield reached its potential, areas where yield was limited by population (establishment), some where yield was limited by canopy growth (development) and some by both population and development.
This observation helped us form a concept of Management Action Zones, based on population and canopy development assessments.
Our aims for Year 2 are on the website. We set out to confirm the relationships we found in Year 1.
This required developing population expectations and determining estimates of canopy development as the season progressed, against which field measurement could be compared.
We had to select our “zones” before the crop was established, as we did a lot of baseline testing of the soil. So our zones were chosen based on paddock history and a fair bit of guesswork. Really, we need to be able to identify zones within an establishing or developing crop, then determine what is going on so we can try to fix it as quickly as possible.
In previous seasons we experimented with smartphone cameras and image processing to assess canopy size and relate that to final yields. We are very pleased that photographs of sampling plots processed using the “Canopeo” app compare very well with Leaf Area Index again this season.
Through the season we tracked crop development in the plots, using plant counts and canopy cover assessments to try to separate the effects of population (establishment) and soil or other management factors.
We built a web calculator to do the maths, aiming for a tool any grower or agronomist can use to aid decision making. The web calculator was used to test our theories about yield prediction and management zones.
ASL Software updated the “CoverMap” smartphone application and we obtained consistent results from it. The app calculates canopy ground cover and logs data against GPS position in real time. Because we have confidence that ground cover from image processing is closely related to Leaf Area Index we are working to turn our maps into predictions of final yields.
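The idea behind this kind of ground-cover assessment can be sketched as green-pixel classification. The example below uses a generic excess-green threshold; it is not the actual Canopeo or CoverMap algorithm, and the threshold is an assumption.

```python
import numpy as np

# Sketch of canopy ground-cover estimation from an RGB image using an
# excess-green index (ExG = 2G - R - B). Generic approach with an assumed
# threshold -- not the actual Canopeo or CoverMap algorithm.

def ground_cover_fraction(rgb, threshold=20):
    """rgb: HxWx3 uint8 array; returns the fraction of 'green' pixels."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    exg = 2 * g - r - b
    return float((exg > threshold).mean())

# Tiny synthetic image: left half green canopy, right half brown soil.
img = np.zeros((2, 4, 3), dtype=np.uint8)
img[:, :2] = [40, 180, 40]    # green pixels -> ExG = 280
img[:, 2:] = [120, 90, 60]    # soil pixels  -> ExG = 0
print(ground_cover_fraction(img))  # 0.5
```

The same per-image fraction, logged against GPS position, is what turns a drive down the paddock into a cover map.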
The current season’s MicroFarm crop is certainly variable. Some is deliberate: we sat the irrigator over some areas after planting to simulate heavy rain events, and we have a poorly irrigated strip. We know some relates to different soil and cover crop histories.
But some differences are unexpected and so far unexplained.
Together with Plant and Food Research we have been taking additional soil samples to try and uncover the causes of patchiness.
We’ve determined one factor is our artificial rainstorm: some crop loss is probably runoff from that, and some is historic compaction. We’ve even identified where a shift in our GPS AB line has left 300mm strips of low production where plants sit on last year’s wheel tracks!
But there is a long way to go before this tricky crop gives up its secrets.
A version of this article previously appeared in The Grower
Dan Bloomer has been travelling in Australia and Europe asking, “How ready are robots for farmers and how ready are farmers for robots?”
Notable areas of active research and development globally are scouting, weeding and fruit picking. Success requires machines that can determine and follow a route traversing whatever terrain it must, capture information, identify and selectively remove weeds, and identify, pick and transport fruit. They have to sense, analyse, plan and act.
Robotics is widespread in industries such as car manufacturing, where exactly the same task is repeated over and over again. With the possible exception of robotic milking, farm operations are not like that: virtually every case is unique, and needs a unique response.
Many groups around the world are looking at robotic weeding. There are many items needing attention. How do we tell weeds from crop plants? Can we do that fast enough and reliably enough to make a robot commercially viable on-farm? Once weeds are identified, how do we optimise robotic arm movement to best attack a patch of them?
A key theme for Rob Fitch and colleagues is Active Perception: perception being what we can detect with what accuracy and confidence; active meaning in real time and including planning actions. They invest heavily in developing mathematics to get fast results. And they are succeeding.
Using Intel’s RealSense structured light camera it takes them less than half a second to identify and precisely locate groups of apples on a trellis. Within that time they also calculate exactly where to place the camera to get a second confirming view.
Cheryl McCarthy and colleagues at the National Centre for Engineering in Agriculture (NCEA) are conducting a range of research projects that integrate autonomous sensing and control with on-farm operations to robotically manage inputs within a crop. Major projects include automation for weed spot spraying, adaptive control for irrigation optimisation, and remote crop surveillance using cameras and remotely piloted aircraft.
Now Cheryl is using UAVs to capture photos of crops, stitching the pictures to get a whole-paddock image, then splitting it up again to efficiently identify and locate individual plants and weeds. This is enabling her to create accurate maps that a weed-destroying robot can use.
SwarmFarm founders Andrew and Jocie Bate grow cereals and pulses near Emerald. Spray-fallow is used to conserve water in this dryland environment, and WeedSeeker® and Weedit® technologies reduce chemical use to a very small percentage of traditional broadcast application.
With large areas, most growers move to bigger machinery to maximise labour efficiency. This has a number of adverse effects including significant soil damage and inability to work small areas or work efficiently around obstacles such as trees.
SwarmFarm chose robots as practical lightweight equipment. They reason that several small machines working together reduce soil impact and match the work rate of one big machine. Andrew estimates that adopting 8 m booms instead of 34 m booms could increase the effective croppable area in Queensland by 2%.
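The swarm reasoning is simple work-rate arithmetic: coverage is swath width times ground speed. The speeds below are hypothetical, chosen only to illustrate the comparison.

```python
# Illustrative work-rate arithmetic behind the swarm argument: several small
# machines can match one wide boom. The speeds here are hypothetical.

def work_rate_ha_per_h(width_m, speed_kmh):
    # width (m) x speed (km/h) covers width*speed*1000 m2/h = width*speed/10 ha/h
    return width_m * speed_kmh / 10

big = work_rate_ha_per_h(34, 10)        # one 34 m boom
small = 4 * work_rate_ha_per_h(8, 10)   # four 8 m machines working together
print(big, small)  # 34.0 32.0 -> similar coverage, far less weight per machine
```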
Are these robots ready for farmers? Are farmers ready for these robots?
Only SwarmFarm has multiple machines currently working on farm in Australia. They are finalising a user interface that will allow non-graduate engineers (smart farmers) to manage the machines.
The question that remains is, “Why would I buy a specialised machine when I can put a driver on a cheaper conventional tractor or higher work rate sprayer and achieve the same?”
Is it the same?
Travel to Australia was supported by a Trimble Foundation Study Grant
A desire to reduce soil compaction and avoid high and inefficient use of chemicals and energy inspired Steve Tanner and Aurelien Demaurex to found ecoRobotix in Switzerland.
Their solution is a lightweight, fully solar-powered weeding robot: a two-wheel-drive machine with 2D camera vision and basic GPS. Two robotic arms position herbicide nozzles or a mechanical device for precision weed control.
The ecoRobotix design philosophy is simplicity and value: avoiding batteries cuts weight and technology requirements and slashes capital costs. It is a step towards their vision of cheap autonomous machines swarming around the farm.
Bought by small farms, Naio Technologies’ Oz440 is a small French robot designed to mechanically weed between rows. The robots are left weeding while the farmer spends time on other jobs or serving customers. Larger machines for vegetable cropping and viticulture are in development.
Naio co-founder Gaetan Severac notes Oz440 has no GPS, relying instead on cameras and LiDAR range finders to identify rows and navigate. These are small machines with a total price similar to a conventional agricultural RTK-GPS system, so alternatives are essential.
Tech companies have responded and several “RTK-GPS” systems are now available under $US1000. Their accuracy and reliability are not known!
Broccoli is one of the world’s largest vegetable crops and is almost entirely manually harvested, which is costly. Leader Tom Duckett says robotic equipment being developed at the University of Lincoln in England is as good as human pickers at detecting broccoli heads of the right size, especially if the robot can pick through the night. With identification in hand, development is now on mechanical cutting and collecting.
In 1996, Tillett and Hague Technologies demonstrated an autonomous roving machine selectively spraying individual cabbages. Having done that, they determined that tractors were effective and concentrated on automating implements. They are experts in vision systems and integration with row and plant identification and machinery actuation, technology embedded in Garford row crop equipment.
Parrish Farms has its own project adapting a Garford mechanical implement to strip-spray between onion rows. Nick Parrish explained that Black Grass control was difficult, and because available graminicides strip wax off onions, boom spraying prevents use of other products for up to two weeks.
Safety approaches we saw in use include:
Route planning to avoid hazards and known obstacles
Laser range finder to sense objects and define them as obstacles
Wide area safety curtain sensing ground objects at 2m
Dead man’s handle possibly via smartphone
Collapsible bumper as a physical soft barrier that activates Stop
Big Red Buttons anyone close can see and use to stop the machine
Machines that are small, slow and light minimise inertia
“Hands Free Hectare” is Harper Adams University’s attempt to grow a commercial crop using open source software and commercially available equipment in an area no-one enters.
Harper Adams research to develop a robotic strawberry harvester is notable for the integration of genetics for varieties with long stalks, a growing system that has plants off the ground, and the robotic technologies to identify, locate and assess the ripeness of individual berries and pick them touching only the peduncle (stalk).
So what have I learned about farm robotics?
People believe our food production systems have to change
Farm labour is in short supply throughout the western world
Machines can’t get bigger as the soil can’t support that
Robotics has huge potential, but the question is when
Safety is a key issue but manageable
There is huge investment in research at universities, but also in industry
It’s about rethinking the whole system not replacing the driver
There are many technologies available, but probably not the mix you want for your application.
After identifying areas within paddocks that had yields limited by different probable causes, we conceived the idea of Management Action Zones (MAZs).
Some areas showed that yield was limited by plant number: establishment was poor. Others had the expected population, but low biomass: the plants were small due to some other limiting factor.
If we can identify zones easily, and determine the causes, we should be able to target a management response accordingly. So for this season, we set out a revised research aim.
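The zone logic above can be sketched as a simple classifier on the two limiting factors, population and canopy development. The targets and the 80% tolerance below are invented for illustration; real expectations come from the crop model and paddock data.

```python
# Sketch of Management Action Zone logic: classify a plot by whether
# population (plants/m2) and canopy cover (%) meet expectations.
# Targets and the 80% tolerance are invented for this example.

def classify_zone(population, cover, pop_target=60, cover_target=40):
    low_pop = population < 0.8 * pop_target    # poor establishment
    low_cover = cover < 0.8 * cover_target     # poor canopy development
    if low_pop and low_cover:
        return "limited by establishment and development"
    if low_pop:
        return "limited by establishment"
    if low_cover:
        return "limited by development"
    return "on track for potential yield"

print(classify_zone(62, 45))  # on track for potential yield
print(classify_zone(40, 45))  # limited by establishment
```

Each class suggests a different management response, which is the point of separating the two causes.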
What we want to know:
Can we successfully determine a management action zone in a field?
Why do we need to know this?
Develop a tool to increase uniformity and yield outcomes
Develop a tool to evaluate management practices and crop productivity
If we want to successfully determine a management action zone in a field then there are two main steps to achieve in this year’s work:
Confirm the relationship between digital data and crop model parameters
Does the relationship stay constant over time and sites?
How early in growth can a difference be detected?
Can the relationship be used to show a growth map across a field?
Develop an approach to gather information and ways to input and display results, initially using a website approach.
Can we integrate plant count and yield information to start developing a management action zone?
How should this be put together in a way growers can start to use to gather information about their crops?
At the MicroFarm, we established six research zones based on paddock history and excessive wetness at establishment.
We have three paddock histories: two years of onion production with autumn cover crops of Caliente mustard, two years of onion production with autumn cover crops of oats, and no previous onion crops, planted after summer sweetcorn and autumn-sown ryegrass. In each of these areas, we deliberately created sub-zones by applying about 45mm of spray irrigation as a “large rain event”.
The impact of the artificial rainstorm is evident on images taken at the end of November.
The Precision Agriculture Association NZ is presenting workshops focused on technologies available to help reduce nitrogen leaching. There are two North Island workshops being offered at:
Massey University on Thursday 1st September 2016 [PDF here]
Ellwood Centre, Hastings on Friday 2nd September 2016 [PDF here]
The ‘Technology to Reduce N Leaching’ workshops are similar to the well-received programme conducted in Ashburton in March 2016, and will address where we are and what we can do about nitrate leaching limits in a North Island context, utilising a range of technologies and farm systems options.
The particular areas of focus for the programme are:
Variable rate technologies and systems
Precision spreading systems and services
Soil moisture monitoring, sensors, metering
Nutrient budgeting and environmental monitoring
A Q&A slot in the afternoon session is devoted to attendees interacting with members and presenters to share learnings and understandings about the issues. This will also be possible over the lunch break on both days, with one and a half hours devoted to this.
Offer to PAANZ Members
As part of the Hastings programme only, on 2nd September, PAANZ members are offered the opportunity to participate as trade/sector participants with technologies and products as may be appropriate to support the programme.
Unfortunately, PAANZ is not able to offer trade/sector stand space at the Palmerston North venue due to space restrictions, so only the Hastings venue will be able to accommodate this option for members.
If you would like to participate please advise Jim Grennell, E-mail: firstname.lastname@example.org
Mobile: 021 330 626. Places are limited to ten organisations for the Hastings workshop, so trade/sector participation will be on a first-come basis.
The cost of participation will be $100.00 plus GST per stand, with an additional attendance fee of $100.00 per person.
As these are indoor workshops with a technology focus, and space at the Hastings venue is limited, no large equipment or hardware can be accommodated.
Confirmation from members wishing to take up this opportunity is required by Monday 22nd August 2016, after which time the opportunity to participate will be made available to non-members.
Effective and reliable sensing for the performance of robotic tasks, such as manipulation in the outdoor environment, remains a challenging problem.
While commercially available solutions such as ASA-LIFT exist for specific tasks and crops, and for operation in specific conditions, the systems are either not cost-effective or physically unsuitable for particular farming conditions and practices.
This research proposed to develop a mobile robot system with the flexibility to adapt and the intelligence to cope with natural variability, through a two-fold aim: utilising vision for both navigation and manipulation. This talk discussed some of the recent developments on these aspects.
In particular, the talk focused on a novel approach that analyses point cloud information from a time-of-flight (ToF) camera to identify the location of the foremost spring onions along the crop bed, with the intention of robotic manipulation. The process uses a combination of 2D image processing on the amplitude data and 3D spatial analysis extracted from the camera to locate the desired object.
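One way such a pipeline might combine the two data sources, as a rough sketch only (not the published method): threshold the amplitude image to find plant pixels, then take the candidate 3D point that lies foremost along the bed. The axis convention and threshold are assumptions.

```python
import numpy as np

# Rough sketch, not the published pipeline: combine a ToF amplitude image
# with its point cloud to locate the foremost plant along the crop bed.
# Plant pixels are selected with a simple amplitude threshold; the foremost
# one is the valid point with the smallest along-bed coordinate (assumed y).

def foremost_plant_point(amplitude, points, amp_thresh=0.5):
    """amplitude: HxW floats; points: HxWx3 (x, y, z) in metres."""
    mask = amplitude > amp_thresh                   # candidate plant pixels
    candidates = points[mask]                       # Nx3 array of 3D points
    if candidates.size == 0:
        return None                                 # nothing detected
    return candidates[np.argmin(candidates[:, 1])]  # smallest along-bed y

# Toy 2x2 frame: left column is plant (high amplitude), right is background.
amp = np.array([[0.9, 0.1],
                [0.8, 0.2]])
pts = np.array([[[0.0, 1.2, 0.4], [0.1, 0.5, 0.4]],
                [[0.0, 0.7, 0.4], [0.1, 0.3, 0.4]]])
print(foremost_plant_point(amp, pts))  # the nearer of the two plant points
```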
Whilst the experimental results demonstrated the robustness of this approach, further testing was required to determine the ability of a system to cope with different scenarios that exist in the naturally varying environment.
For validation, the vision system was integrated with a robotic manipulation system and initial results of the investigation were presented.
The industrial revolution gave us machines and agri-inputs that enabled us to farm at scale and speed. The green revolution began to unlock the potential of plant genes to increase yield. Now the digital revolution provides us with an opportunity to harness the power of ‘big data’ and technological innovation to radically re-engineer our horticultural production methods and supply chains.
Digitally informed decisions during production, harvesting, sorting, packing, storage and transit could be the basis for a step change to high profitability, high resource efficiency and low footprint horticultural value chains.
Identifying the research priorities that we need to realise this opportunity in New Zealand is a challenge in itself, given the pace of developments in sensing technology, robotics and the internet of things globally. Accordingly, Plant & Food Research assembled an expert panel from across its science teams, augmented with other specialists from New Zealand and Australia, to develop a digital horticulture research strategy.
The panel has taken a value chain approach to identifying research priorities, particularly in relation to production, harvesting, sorting and packaging, storage and transit. Future science needs are structured around the concepts of ‘sense, think, act’ for each part of the value chain and are linked by an ‘artery’ of data to feed forwards and backwards along the value chain.
Plant & Food Research looks forward to working with a wide range of partners to deliver this digital horticulture strategy for the benefit of New Zealand’s producers and exporters.
The OnionsNZ/SFF Project “Benchmarking Variability in Onion Crops” is investigating technologies to map onion crop development. The purpose is to better understand variability and to gather information to inform tactical and strategic decision making.
An AgriOptics survey provided a Soil EM map of the MicroFarm which was used as a base data layer and helped select positions for Plant & Food’s research plots.
As the crop developed, repeated canopy surveys used a GreenSeeker NDVI sensor and CoverMap, a smartphone application. Both were mounted side by side on a tractor fitted with sub-metre accuracy GPS. Altus UAS provided UAV survey data, including MicaSense imagery capturing five colour bands. A mid-season 0.5 m pixel NDVI satellite image was also captured.
Both ground based systems had difficulty recording very small plants. GreenSeeker data were dominated by soil effects until a significant canopy was present. Once plants could be seen in photographs, the CoverMap system was able to distinguish between plants and soil.
Direct photos of Plant & Food plots were processed to calculate apparent ground cover. A very strong relationship was found between these and actual plant measurements of fresh leaf weight and leaf area index – both strongly correlated to final crop size.
Attempts to directly correlate the map layers with Plant & Food field plot measurements were frustrated by inadequate or inaccurate image location. Onion crops have been found highly variable over small distances. The GreenSeeker only records a reading every four or five metres, and CoverMap about every 1.5 m. Compounded by errors of a metre or more, finding a measurement to match a 0.5 m bed plot was not possible. Similarly, the UAV and satellite images, while able to identify plots, did not initially show correlations.
Using ArcGIS, fishnets were constructed over the various canopy data layers and correlations between them found at 5 m and 10 m grids. The 10 m grid appears to collect enough data points even for the GreenSeeker to provide a reasonable if not strong correlation with other canopy layers. Similar processes are being used to compare soil and canopy data.
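The fishnet aggregation can be sketched without GIS software: snap each point reading to a grid cell, average per cell, then correlate the cell means of two layers. The data below are synthetic, built so the two layers share an underlying trend.

```python
import numpy as np

# Sketch of the "fishnet" comparison: snap point readings from two sensor
# layers to a common grid, average per cell, correlate cell means.
# All data here are synthetic, for illustration only.

def grid_means(xs, ys, values, cell=10.0):
    """Average point values within cell x cell metre grid squares."""
    cells = {}
    for x, y, v in zip(xs, ys, values):
        key = (int(x // cell), int(y // cell))
        cells.setdefault(key, []).append(v)
    return {k: float(np.mean(v)) for k, v in cells.items()}

rng = np.random.default_rng(0)
x = rng.uniform(0, 100, 500)                 # point positions in a 1 ha block
y = rng.uniform(0, 100, 500)
ndvi = x / 100 + rng.normal(0, 0.05, 500)    # layer 1: noisy east-west trend
cover = x / 100 + rng.normal(0, 0.05, 500)   # layer 2: same trend, own noise

a = grid_means(x, y, ndvi)                   # 10 m grid cell means
b = grid_means(x, y, cover)
common = sorted(set(a) & set(b))             # cells present in both layers
r = np.corrcoef([a[k] for k in common], [b[k] for k in common])[0, 1]
print(round(r, 2))  # strong positive correlation on the 10 m grid
```

Averaging over 10 m cells pools enough sparse readings (a GreenSeeker point every few metres) for the noise to cancel, which is why the coarser grid correlates where raw point-to-plot matching failed.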
After one season of capture, there appears to be merit in using an optical canopy cover assessment as plants develop. Once full canopy is achieved, the NDVI or a similar index may be better. Colour image analysis will be tested as a method of recording crop top-down as a measure of maturity and storage potential.
We were not successful in mapping yield directly, but did identify a process for creating a yield map based on earlier crop canopy data.