Category Archives: Sensing

LandWISE 2017: Are we ready for automation?

In 2017 our 15th Annual Conference focuses on automated tools for data collection, decision making and doing actual tasks on the farm (and beyond).

  • What do you want?
  • What’s on offer?
  • How will farms and management have to change?

We have a comprehensive programme. We’ve gone a bit outside the box to bring variety, including speakers from outside the horticultural and arable sectors. We find cross-pollination and hybrid vigour valuable!

So register, come along and listen to excellent presenters, discuss the ideas with colleagues and go away with new understanding and plans.

Thanks to Our Loyal Platinum Sponsors!
Many thanks to AGMARDT, sponsors of our international presenter, Thibault Delcroix, France

Hawke’s Bay Regional Council, John Deere and BASF Crop Protection are our Platinum Sponsors again in 2017. Many thanks to these loyal supporters who have backed the Conference for a number of years.

We also welcome our Gold Sponsors, meal sponsors and trade displays new and old. These are the organisations that make conferences like this possible and affordable.

Join them and us at the Havelock North Function Centre on 24-25 May 2017 to mix with leading practitioners, farmers, growers, researchers, technology developers and providers.

Register now – click here!

 

MicroFarm pH Mapping

GrowMaps’ pH testing equipment at a Papakura trial site

GrowMaps this week completed the first comprehensive soil pH mapping at the MicroFarm. GrowMaps will have a trade display at the LandWISE 2017 Conference and will be taking part in the Horizons Regional Council field session at the Centre for Land and Water.

GrowMaps principal Luke Posthuma completed the survey, and says his observations as the survey progressed suggest there is a reasonable spread of pH across our relatively small area.

As well as Veris sampling, Luke took a number of soil samples for verification and calibration checks.

The Veris equipment also maps soil electrical conductivity (EC) down to 60cm. Soil EC is a measurement of how much electrical current soil can conduct. It is often an effective way to map soil texture because smaller soil particles such as clay conduct more current than larger silt and sand particles.

Part of the Veris pH mapping is post-survey processing to create the most reliable result. We await the processed maps with considerable interest.

We previously had a similar soil conductivity map provided by AgriOptics and it will be interesting to compare the results.

Benchmarking Onion Variability 2016-17

Now in year two of our OnionsNZ SFF project, we have trials at the MicroFarm and monitoring sites at three commercial farms in Hawke’s Bay and three more in Pukekohe.

2015-16

A summary of Year 1 is on our website. A key aspect was testing a range of sensors and camera systems for assessing crop size and variability. Because onions are like needles poking from the ground, all sensors struggled, especially when plants were small. That is just when we want to know about the developing crop, as it is the time we make decisions and apply management.

By November our sensing was more satisfactory. At this stage we captured satellite, UAV, smartphone and GreenSeeker data and created a series of maps. 

We used the satellite image to create canopy maps and identify zones. We sampled within the zones at harvest, and used the relationship between November canopy and February yield to create yield maps and profit maps.
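As a rough illustration of the kind of calculation behind those maps, the sketch below fits a straight line to hypothetical harvest samples and applies it to a gridded canopy-cover layer to produce yield and gross-margin maps. The sample values, prices and costs are placeholders, not the project’s actual data or model.

```python
# Minimal sketch: turn a November canopy-cover map into yield and profit maps
# via a linear relationship fitted to harvest samples. All numbers are
# illustrative placeholders, not the project's actual data or coefficients.
import numpy as np

# Harvest samples: canopy cover (%) in November vs measured yield (t/ha) in February
canopy_samples = np.array([20.0, 35.0, 50.0, 65.0, 80.0])
yield_samples = np.array([28.0, 45.0, 62.0, 75.0, 90.0])

# Fit a simple straight line: yield = slope * canopy + intercept
slope, intercept = np.polyfit(canopy_samples, yield_samples, 1)

# Apply the fitted relationship to a gridded canopy-cover map (% cover per cell)
canopy_map = np.array([
    [25.0, 40.0, 55.0],
    [30.0, 60.0, 75.0],
])
yield_map = slope * canopy_map + intercept           # predicted t/ha per cell

# Convert to a gross-margin ("profit") map with placeholder economics
price_per_tonne = 500.0   # $/t, illustrative
costs_per_ha = 15000.0    # $/ha, illustrative
profit_map = yield_map * price_per_tonne - costs_per_ha

print(np.round(yield_map, 1))
print(np.round(profit_map, 0))
```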

Yield assessments show considerable variation, with limits imposed by population, growth of individual plants, or both

We also developed relationships between photographs of ground cover, laboratory measurements of fresh weight and leaf area, and the final crop yield.

In reviewing the season’s worth of MicroFarm plot measurements, we noticed there were areas where yield reached its potential, areas where yield was limited by population (establishment), some where yield was limited by canopy growth (development) and some by both population and development.

This observation helped us form a concept of Management Action Zones, based on population and canopy development assessments.

Management Action Zones – If population is low, work for better establishment next season. If plants are small, see if there is something that can be done this season
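The sketch below illustrates the Management Action Zone logic as a simple decision rule on plant population and canopy cover. The thresholds, targets and example plots are illustrative assumptions only, not values from the project.

```python
# Minimal sketch of the Management Action Zone idea: classify each sampling
# plot by comparing its plant count and canopy cover against target values.
# Thresholds and field data are illustrative assumptions, not project values.

def classify_zone(plants_per_m2: float, canopy_pct: float,
                  target_plants: float = 60.0, target_canopy: float = 50.0) -> str:
    """Return a management-action label for one plot."""
    low_population = plants_per_m2 < 0.8 * target_plants
    small_plants = canopy_pct < 0.8 * target_canopy
    if low_population and small_plants:
        return "low population AND small plants: review establishment and in-season management"
    if low_population:
        return "low population: work for better establishment next season"
    if small_plants:
        return "small plants: check if something can be done this season"
    return "on track"

# Example plots: (plants per m2, canopy cover % at assessment date)
plots = [(62, 55), (40, 52), (61, 30), (38, 25)]
for plants, canopy in plots:
    print(plants, canopy, "->", classify_zone(plants, canopy))
```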

2016-17

Our aims for Year 2 are on the website. We set out to confirm the relationships we found in Year 1.

This required developing population expectations and determining estimates of canopy development as the season progressed, against which field measurement could be compared.

We had to select our “zones” before the crop was established because we did a lot of baseline testing of the soil. So our zones were chosen based on paddock history and a fair bit of guesswork. Really, we need to be able to identify zones within an establishing or developing crop, then determine what is going on so we can try to fix it as quickly as possible.

In previous seasons we experimented with smartphone cameras and image processing to assess canopy size and relate that to final yields. We are very pleased that photographs of sampling plots processed using the “Canopeo” app compare very well with Leaf Area Index again this season.
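For readers wanting a feel for how ground cover can be estimated from a plot photograph, here is a minimal sketch using a simple excess-green threshold. It is a simplified stand-in for what an app like Canopeo does, not its actual algorithm; the file name and threshold value are assumptions for illustration.

```python
# Minimal sketch of estimating canopy ground cover from a plot photograph
# using an excess-green threshold. This is a simplified stand-in for the
# "Canopeo" approach, not its actual algorithm; the filename and threshold
# are illustrative assumptions.
import numpy as np
from PIL import Image

def canopy_cover_fraction(path: str, threshold: float = 20.0) -> float:
    """Fraction of pixels classified as green canopy (0..1)."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    excess_green = 2.0 * g - r - b          # high for green vegetation
    return float((excess_green > threshold).mean())

# Example: estimate cover from a downward-facing plot photo (hypothetical file)
cover = canopy_cover_fraction("plot_photo.jpg")
print(f"Estimated canopy cover: {cover * 100:.1f}%")
```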

Through the season we tracked crop development in the plots, using plant counts and canopy cover assessments to try to separate the effects of population (establishment) and soil or other management factors.

We built a web calculator to do the maths, aiming for a tool any grower or agronomist can use to aid decision making. The web calculator was used to test our theories about yield prediction and management zones.

ASL Software updated the “CoverMap” smartphone application and we obtained consistent results from it. The app calculates canopy ground cover and logs data against GPS position in real time. Because we have confidence that ground cover from image processing is closely related to Leaf Area Index we are working to turn our maps into predictions of final yields.

Maps of canopy cover created from the CoverMap smartphone application show significant variability across the paddock. Canopy increase is seen over time in two maps created a week apart

The current season’s MicroFarm crop is certainly variable. Some is deliberate: we sat the irrigator over some areas after planting to simulate heavy rain events, and we have a poorly irrigated strip. We know some relates to different soil and cover crop histories.

But some differences are unexpected and their causes so far unexplained.

Wide variation within the area new to onions does not follow artificial rain or topographic drainage patterns. This photo is of the area shown far right in the cover maps above.

Together with Plant and Food Research we have been taking additional soil samples to try and uncover the causes of patchiness.

We’ve determined one factor is our artificial rainstorm: some crop loss is probably runoff from that and some is historic compaction. We’ve even identified where a shift in our GPS AB line has left 300mm strips of low production where plants are on last year’s wheel tracks!

But there is a long way to go before this tricky crop gives up its secrets.

This project is in collaboration with Plant and Food Research and is funded by OnionsNZ and the MPI Sustainable Farming Fund.

We also appreciate the support of growers, seed companies and our MicroFarm sponsors Ballance AgriNutrients, BASF Crop Protection and the Centre for Land and Water.


 

In Search of Farm Robots: Ch 1

A version of this article previously appeared in The Grower

Dan Bloomer has been travelling in Australia and Europe asking, “How ready are robots for farmers and how ready are farmers for robots?”

Notable areas of active research and development globally are scouting, weeding and fruit picking. Success requires machines that can determine and follow a route across whatever terrain they meet, capture information, identify and selectively remove weeds, and identify, pick and transport fruit. They have to sense, analyse, plan and act.

Robotics is widespread in industries such as car manufacturing that have exactly the same task repeated over and over again. With the possible exception of robotic milking, farm operations are not like that. Virtually every case is unique, with unique responses needed.

Many groups around the world are looking at robotic weeding. There are many items needing attention. How do we tell weeds from crop plants? Can we do that fast enough and reliably enough to make a robot commercially viable on-farm? Once weeds are identified, how do we optimise robotic arm movement to best attack a patch of them?

The Australian Centre for Field Robotics (ACFR) at the University of Sydney is well known for its field robots such as the solar powered Ladybird. The new generation Ladybird is known as Rippa, and is currently undergoing endurance testing. Look on YouTube for ACFR videos and you’ll even see SwagBot moving around rolling hill country.

A key theme for Rob Fitch and colleagues is Active Perception: perception being what we can detect with what accuracy and confidence; active meaning in real time and including planning actions. They invest heavily in developing mathematics to get fast results. And they are succeeding.

Using Intel’s RealSense structured-light camera, it takes them less than half a second to identify and precisely locate groups of apples on a trellis. Within that time they also calculate exactly where to place the camera to get a second, confirming view.

Smart maths allow ACFR scientists to capture 3D images and identify and locate apples in less than half a second

Cheryl McCarthy and colleagues at the National Centre for Engineering in Agriculture (NCEA) are conducting a range of research projects that integrate autonomous sensing and control with on-farm operations to robotically manage inputs within a crop. Major projects include automation for weed spot spraying, adaptive control for irrigation optimisation, and remote crop surveillance using cameras and remotely piloted aircraft.

At LandWISE 2015, Cheryl reported on their machine vision and sensing system for weed detection, which uses depth and colour segmentation and a new processing technique to operate at commercial ground speeds of 10-15 km/h.

Now Cheryl is using UAVs to capture photos of crops, stitching the pictures to get a whole-paddock image, then splitting it up again to efficiently identify and locate individual plants and weeds. This is enabling her to create accurate maps that a weed-destroying robot can use.

Research at the University of Southern Queensland investigates UAVs to scout paddocks, combined with image stitching and analysis, to create maps of weeds for later treatment

SwarmFarm founders, Andrew and Jocie Bate grow cereals and pulses near Emerald. Spray-fallow is used to conserve water in this dryland environment and WeedSeeker® and Weedit® technologies reduce chemical use to a very small percentage of traditional broadcast application.

4WD SwarmFarm robots carrying WeedSeeker technology cover the paddock spraying only living weeds

With large areas, most growers move to bigger machinery to maximise labour efficiency. This has a number of adverse effects including significant soil damage and inability to work small areas or work efficiently around obstacles such as trees.

SwarmFarm chose robots as practical lightweight equipment. They reason that several small machines working together reduce soil impact and have the same work rate as one big machine. Andrew estimates that adopting 8 m booms instead of 34 m booms could increase the effective croppable area in Queensland by 2%.

Are these robots ready for farmers? Are farmers ready for these robots?

Only SwarmFarm has multiple machines currently working on farm in Australia. They are finalising a user interface that will allow people who are not graduate engineers (smart farmers) to manage the machines.

The question that remains is, “Why would I buy a specialised machine when I can put a driver on a cheaper conventional tractor or higher work rate sprayer and achieve the same?”

Is it the same?

Travel to Australia was supported by a Trimble Foundation Study Grant

In Search of Farm Robots: Ch 3 Switzerland, France and England

This article originally appeared in “The Grower”

A desire to reduce soil compaction and avoid high and inefficient use of chemicals and energy inspired Steve Tanner and Aurelien Demaurex to found ecoRobotix in Switzerland.

Their solution is a lightweight, fully solar-powered weeding robot: a two-wheel-drive machine with 2D camera vision and basic GPS. Two robotic arms position herbicide nozzles or a mechanical device for precision weed control.

Steve Tanner lab testing the ecoRobotix vision and robotic weed control system

The ecoRobotix design philosophy is simplicity and value: avoiding batteries cuts weight and technology requirements, and slashes capital costs. It is a step towards their vision of cheap autonomous machines swarming around the farm.

Naio Technologies’ Oz440 is a small French robot designed to mechanically weed between rows, bought mainly by small farms. The robots are left weeding while the farmer spends time on other jobs or serving customers. Larger machines for vegetable cropping and viticulture are in development.

Prototypes V1, V2 and V3, precursors to the Naio Oz440, show the steps in a robot’s development

Naio co-founder Gaetan Severac notes Oz440 has no GPS, relying instead on cameras and LiDAR range finders to identify rows and navigate. These are small machines with a total price similar to a conventional agricultural RTK-GPS system, so alternatives are essential. 

Tech companies have responded and several “RTK-GPS” systems are now available for under US$1,000. Their accuracy and reliability are not known!

Thorvald, an example of research collaboration: a Norwegian university robot being automated at the University of Lincoln shows the common design of four-wheel steer and four-wheel drive

Broccoli is one of the world’s largest vegetable crops and is almost entirely manually harvested, which is costly. Leader Tom Duckett says robotic equipment being developed at the University of Lincoln in England is as good as human pickers at detecting broccoli heads of the right size, especially as the robot can pick through the night. With identification in hand, development is now focused on mechanical cutting and collecting.

In 1996, Tillett and Hague Technologies demonstrated an autonomous roving machine selectively spraying individual cabbages.  Having done that, they determined that tractors were effective and concentrated on automating implements. They are experts in vision systems and integration with row and plant identification and machinery actuation, technology embedded in Garford row crop equipment. 

Parrish Farms has its own project adapting a Garford machine to strip spray between onion rows. Nick Parrish explained that black-grass control was difficult, and because available graminicides strip wax off onions, boom spraying prevents use of other products for up to two weeks.

Simon Blackmore is a global leader in farm robotics thinking at Harper Adams University. His effort to address robotic safety issues includes a seven-level system:

  1. Route planning to avoid hazards and known obstacles
  2. Laser range finder to sense objects and define them as obstacles
  3. Wide area safety curtain sensing ground objects at 2m
  4. Dead man’s handle possibly via smartphone
  5. Collapsible bumper as a physical soft barrier that activates Stop
  6. Big Red Buttons anyone close can see and use to stop the machine
  7. Machines that are small, slow and light minimise inertia

“Hands Free Hectare” is Harper Adams University’s attempt to grow a commercial crop using open source software and commercially available equipment in an area no-one enters.

Harper Adams research to develop a robotic strawberry harvester is notable for the integration of genetics for varieties with long stalks, a growing system that has plants off the ground, and the robotic technologies to identify, locate and assess the ripeness of individual berries and pick them touching only the peduncle (stalk).

So what have I learned about farm robotics?

  • People believe our food production systems have to change
  • Farm labour is in short supply throughout the western world
  • Machines can’t get bigger as the soil can’t support that
  • Robotics has huge potential, but when?
  • Safety is a key issue but manageable
  • There is huge investment in research at universities, but also in industry
  • It’s about rethinking the whole system not replacing the driver
  • There are many technologies available, but probably not the mix you want for your application.

As Simon Pearson at the National Centre for Food Manufacturing says, “It’s a Frankenstein thing, this agrobotics. There are all sorts of great bits available but you have to seek them out and stitch them together yourself to make the creature you want.”

Dan’s travel was supported by a Trimble Foundation Study Grant

Onion Crop Research Plan

After identifying areas within paddocks that had yields limited by different probable causes, we conceived the idea of Management Action Zones (MAZs).

Yield assessments show considerable variation, with limits imposed by population, growth of individual plants, or both

Some areas showed that yield was limited by plant number: establishment was poor. Others had the expected population, but low biomass: the plants were small due to some other limiting factor.

If we can identify zones easily, and determine the causes, we should be able to target a management response accordingly. So for this season, we set out a revised research aim.

What we want to know:

  • Can we successfully determine a management action zone in a field?

Why do we need to know this?

  • Develop a tool to increase uniformity and yield outcomes
  • Develop a tool to evaluate management practices and crop productivity

If we want to successfully determine a management action zone in a field then there are two main steps to achieve in this year’s work:

  • Confirm the relationship between digital data and crop model parameters (see the sketch after this list)
    • Does the relationship stay constant over time and sites?
    • How early in growth can a difference be detected?
    • Can the relationship be used to show a growth map across a field?
  • Develop an approach to gather information and ways to input and display results, initially using a website approach.
    • Can we integrate a plant count and yield information to start developing a management action zone?
    • How should this be put together in a way growers can start to use to gather information about their crops?
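The sketch below illustrates the first of those steps: checking whether the digital-data-to-crop-parameter relationship (here, ground cover to Leaf Area Index) stays constant across assessment dates or sites, by fitting a separate line per date and comparing slopes. All data values are illustrative placeholders, not project measurements.

```python
# Minimal sketch: fit the cover-to-LAI relationship separately for each
# assessment date (or site) and check whether the slope stays roughly
# constant. Data values are illustrative placeholders.
import numpy as np

# (date, ground cover %, measured LAI) for sampling plots - illustrative
records = [
    ("2016-11-10", 20, 0.45), ("2016-11-10", 35, 0.80), ("2016-11-10", 50, 1.15),
    ("2016-12-01", 30, 0.70), ("2016-12-01", 55, 1.25), ("2016-12-01", 70, 1.60),
]

dates = sorted({r[0] for r in records})
for date in dates:
    cover = np.array([r[1] for r in records if r[0] == date], dtype=float)
    lai = np.array([r[2] for r in records if r[0] == date], dtype=float)
    slope, intercept = np.polyfit(cover, lai, 1)
    print(f"{date}: LAI ~ {slope:.3f} * cover + {intercept:.2f}")

# Similar slopes across dates/sites would support using one relationship to
# turn canopy-cover maps into LAI (and ultimately yield) estimates.
```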

At the MicroFarm, we established six research zones based on paddock history and excessive wetness at establishment.

We have three paddock histories: two years of onion production with autumn cover crops of Caliente mustard, two years of onion production with autumn cover crops of oats, and no previous onions, planted after a summer sweetcorn crop and autumn-sown ryegrass. In each of these areas, we deliberately created sub-zones by applying about 45mm of spray irrigation as a “large rain event”.

Artificial heavy rain event applied after planting and before emergence

The impact of the artificial rainstorm is evident on images taken at the end of November.

The lasting effect of a heavy (artificial) rain event pre-emergence (right panel) shows low population and poor growth compared to areas without heavy rain (left panel)

Technology to Reduce N Leaching

The Precision Agriculture Association NZ is presenting workshops focused on technologies available to help reduce nitrogen leaching. There are two North Island workshops being offered at:

Massey University on Thursday 1st September 2016 [PDF here]

and

Ellwood Centre, Hastings on Friday 2nd September 2016 [PDF here]

Programme

The ‘Technology to Reduce N Leaching’ workshops are similar to the well-received programme conducted in Ashburton in March 2016 and will address where we are and what we can do about nitrate leaching limits in a North Island context, utilising a range of technologies and farm systems options.

The particular areas of focus for the programme are:

  • Variable rate technologies and systems
  • Precision irrigation
  • Precision spreading systems and services
  • Soil mapping
  • Soil moisture monitoring, sensors, metering
  • Nutrient budgeting and environmental monitoring

A Q&A time slot is set aside in the afternoon session for attendees to interact with members and presenters to share learnings and understandings about the issues. This will also be possible over the lunch break on both days, with one and a half hours devoted to this.


Offer to PAANZ Members

As part of the Hastings programme only, on 2nd September, PAANZ members are offered the opportunity to participate as trade/sector exhibitors with technologies and products appropriate to supporting the programme.

Unfortunately, PAANZ is not able to offer trade/sector stand space at the Palmerston North venue due to space restrictions, so only the Hastings venue will be able to accommodate this option for members.

If you would like to participate, please advise Jim Grennell (email: jim@paanz.co.nz, mobile: 021 330 626). Places are limited to ten organisations for the Hastings workshop, so trade/sector participation will be on a first-come basis.

The cost of participation will be $100.00 plus GST per stand, with an additional attendance fee of $100.00 per person.

As these are indoor workshops with a technology focus, and space at the Hastings venue is limited, no large equipment or hardware can be accommodated.

Confirmation of members wishing to take up this opportunity is required by Monday 22nd August 2016 after which time the opportunity to participate will be made available to non-members.

Vision System for Onion Crops

Effective Sensing for Robotic Tasks- Still a Challenge

Chee Kit Wong

Callaghan Innovation

 

Effective and reliable sensing for the performance of robotic tasks, such as manipulation in the outdoor environment, remains a challenging problem.

While commercial solutions such as ASA-LIFT are available for specific tasks and crops, and for operation in specific conditions, the systems are either not cost effective and/or physically unsuitable for particular farming conditions and practices.

This research proposed to develop a mobile robot system with the flexibility to adapt and the intelligence to cope with natural variability, through a two-fold aim of utilising vision for navigation and manipulation. This talk discussed some of the recent developments on these aspects.

In particular, the talk focused on a novel approach that analyses point cloud information from a time-of-flight (ToF) camera to identify the location of the foremost spring onions along the crop bed, with robotic manipulation in mind. The process combines 2D image processing of the amplitude data with 3D spatial analysis of data extracted from the camera to locate the desired object.

Whilst the experimental results demonstrated the robustness of this approach, further testing was required to determine the ability of a system to cope with different scenarios that exist in the naturally varying environment.

For validation, the vision system was integrated with a robotic manipulation system and initial results of the investigation were presented.

A Digital Horticulture Research Strategy

Value Chain Approach To Identifying Priorities

Roger Williams

New Zealand Institute for Plant & Food Research Limited (PFR)

 

The industrial revolution gave us machines and agri-inputs that enabled us to farm at scale and speed. The green revolution began to unlock the potential of plant genes to increase yield. Now the digital revolution provides us with an opportunity to harness the power of ‘big data’ and technological innovation to radically re-engineer our horticultural production methods and supply chains.

Digitally informed decisions during production, harvesting, sorting, packing, storage and transit could be the basis for a step change to high profitability, high resource efficiency and low footprint horticultural value chains.

Identifying the research priorities that we need to realise this opportunity in New Zealand is a challenge in itself, given the pace of developments in sensing technology, robotics and the internet of things globally. Accordingly, Plant & Food Research assembled an expert panel from across its science teams, augmented with other specialists from New Zealand and Australia, to develop a digital horticulture research strategy.

The panel has taken a value chain approach to identifying research priorities, particularly in relation to production, harvesting, sorting and packaging, storage and transit.  Future science needs are structured around the concepts of ‘sense, think, act’ for each part of the value chain and are linked by an ‘artery’ of data to feed forwards and backwards along the value chain.

Plant & Food Research looks forward to working with a wide range of partners to deliver this digital horticulture strategy for the benefit of New Zealand’s producers and exporters.

Mapping Onion Canopies

Investigating Technologies to Map Onion Crop Development


 

Dan Bloomer and Justin Pishief, Centre for Land and Water

 

The OnionsNZ/SFF Project “Benchmarking Variability in Onion Crops” is investigating technologies to map onion crop development. The purpose is to better understand variability and to gather information to inform tactical and strategic decision making.

An AgriOptics survey provided a Soil EM map of the MicroFarm which was used as a base data layer and helped select positions for Plant & Food’s research plots.

As the crop developed, repeated canopy surveys used a GreenSeeker NDVI sensor and CoverMap, a smartphone application. Both were mounted side by side on a tractor fitted with sub-metre accuracy GPS. Altus UAS provided UAV survey data, including MicaSense imagery with five colour bands. A mid-season 0.5 m pixel NDVI satellite image was also captured.

Both ground based systems had difficulty recording very small plants. GreenSeeker data were dominated by soil effects until a significant canopy was present. Once plants could be seen in photographs, the CoverMap system was able to distinguish between plants and soil.

Direct photos of Plant & Food plots were processed to calculate apparent ground cover. A very strong relationship was found between these and actual plant measurements of fresh leaf weight and leaf area index – both strongly correlated to final crop size.

Attempts to directly correlate the map layers with Plant & Food field plot measurements were frustrated by inadequate or inaccurate image location. Onion crops have been found to be highly variable over small distances. The GreenSeeker only records a reading every four or five metres, and CoverMap about every 1.5 m. Compounded by location errors of a metre or more, finding a measurement to match a 0.5 m bed plot was not possible. Similarly, the UAV and satellite images, while able to identify plots, did not initially show correlations.

Using ArcGIS, fishnets were constructed over the various canopy data layers and correlations between them found at 5 m and 10 m grids. The 10 m grid appears to collect enough data points even for the GreenSeeker to provide a reasonable if not strong correlation with other canopy layers.  Similar processes are being used to compare soil and canopy data.
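A minimal sketch of that fishnet comparison, done here with pandas rather than ArcGIS: bin two sets of georeferenced point readings into a common grid and correlate the cell means at 5 m and 10 m spacings. The coordinates and values are simulated purely for illustration.

```python
# Minimal sketch of the "fishnet" comparison: bin two sets of georeferenced
# point readings into the same grid and correlate the cell means. The ArcGIS
# workflow described in the text is mimicked with pandas; coordinates and
# values are simulated, illustrative data.
import numpy as np
import pandas as pd

def grid_means(df: pd.DataFrame, cell: float) -> pd.Series:
    """Mean value per grid cell; cells are indexed at `cell` metre spacing."""
    keyed = df.assign(cx=(df.x // cell).astype(int), cy=(df.y // cell).astype(int))
    return keyed.groupby(["cx", "cy"])["value"].mean()

# Two point layers over the same paddock (local metre coordinates)
rng = np.random.default_rng(0)
n = 500
x, y = rng.uniform(0, 100, n), rng.uniform(0, 200, n)
canopy = 30 + 0.2 * y + rng.normal(0, 5, n)        # CoverMap-style % cover
ndvi = 0.2 + 0.004 * y + rng.normal(0, 0.05, n)    # GreenSeeker-style NDVI

covermap = pd.DataFrame({"x": x, "y": y, "value": canopy})
greenseeker = pd.DataFrame({"x": x, "y": y, "value": ndvi})

for cell in (5.0, 10.0):
    merged = pd.concat(
        [grid_means(covermap, cell), grid_means(greenseeker, cell)],
        axis=1, keys=["cover", "ndvi"]
    ).dropna()
    r = merged["cover"].corr(merged["ndvi"])
    print(f"{cell:.0f} m grid: {len(merged)} cells, r = {r:.2f}")
```

Coarser cells average over more readings per layer, which is why a 10 m grid can reveal a correlation that is invisible when sparse, noisy points are compared directly.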

After one season of capture, there appears to be merit in using an optical canopy cover assessment as plants develop. Once full canopy is achieved, the NDVI or a similar index may be better. Colour image analysis will be tested as a method of recording crop top-down as a measure of maturity and storage potential.

We were not successful in mapping yield directly, but did identify a process for creating a yield map based on earlier crop canopy data.