Calibrate for Water Conservation

By Jeffrey Gilbert

An irrigation system is often one of the biggest investments made to maintain landscape plants. It is also customized to its particular site and is among the most heavily used ‘tools’, sometimes operating daily, especially in the desert Southwest. But unlike other tools, most of the irrigation system is buried and often runs at night, and is therefore out of sight.

Like any other equipment, the irrigation system requires regular maintenance. And, similar to spray and fertilizer equipment, calibration is essential to guarantee the desired application.

Learning how to audit and calibrate your sprinkler system can save water and money.

While precise calibration is very important when applying pesticides and fertilizer, so as not to waste product or do any harm, calibration of a sprinkler system does not seem to get the same regular attention. A fertilizer treatment may require calibration every time an application is made. Giving a sprinkler system that same attention would be impractical, since watering is sometimes required daily.

But because water resources are increasingly scarce and expensive, more frequent calibration of the irrigation system is warranted. An irrigation system should be tested and calibrated at least annually, and especially after any significant changes have been made to the system.

PRECIPITATION RATES

The precipitation rate is one of the most useful pieces of data collected during an audit. When determined through an audit, it represents ‘real-world’ values that account for actual sprinkler spacing, arcs as set, variable system pressure throughout a zone, wear on individual components, misaligned sprinklers, blocked patterns and any “customization” done since the system was originally installed.

Even with the greatest attention to detail, an original installation can vary from the design, so even a new system can differ from the intended design enough to affect overall performance.

System pressure can decrease, especially if surrounding areas have been ‘built up’ with new residential or commercial properties since the original installation. Normal wear can increase the orifice size of the nozzles and in turn increase overall demand within a zone and lower system pressure.

Modifications that change the nozzle sizes and/or brands of sprinklers within a zone can increase or decrease operating pressure. The combination of wear and repairs can result in the undesirable outcome of different precipitation rates within a single zone. The need to install one or more additional sprinklers within a zone to cover ‘dry spots’ where there was no problem in the past may be a symptom of decreasing system pressure.

Precipitation rates provided in manufacturers’ catalogs and included in some ‘smart’ controllers represent ‘gross precipitation rates’. Unlike values collected from an actual audit, they do not account for losses due to drift, off-target application or evaporation. These gross precipitation rates assume precise spacing, unchanging pressure on all heads within a zone and exact arc settings of, for example, 180 or 360 degrees.

There is an assumption that all of the water passing through the nozzle will reach the turf. Combined with throw distances measured under zero-wind conditions, these ‘gross’ precipitation rates can misrepresent real-world values. Using them to calculate runtimes can result in applying less water than intended.
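
For reference, a catalog-style gross precipitation rate comes from a simple calculation based on nozzle flow and head spacing. The minimal Python sketch below shows that calculation; the flow and spacing numbers are placeholders, not measurements from any particular system.

# Gross precipitation rate for a rectangular head layout (catalog-style value).
# The constant 96.25 converts gal/min spread over ft^2 into inches per hour
# (231 in^3 per gallon x 60 min/hr / 144 in^2 per ft^2).

def gross_precip_rate(nozzle_gpm, head_spacing_ft, row_spacing_ft):
    """Inches per hour, assuming every head delivers its rated flow to the turf."""
    return 96.25 * nozzle_gpm / (head_spacing_ft * row_spacing_ft)

# Example with placeholder values: 2.0 gpm nozzles on 30 ft x 30 ft spacing.
print(round(gross_precip_rate(2.0, 30.0, 30.0), 2), "in/hr")  # about 0.21 in/hr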

The increasing importance of conserving water makes knowing the precipitation rate of an irrigation system even more essential. Even with ‘smart’ technology, accurate precipitation rates are still needed to ensure minimal waste and acceptable distribution uniformity, or ‘DU’.

Determining precipitation rates improves overall distribution uniformity by balancing the application rate across multiple zones. For example, on large sports fields and golf course fairways where more than one zone is used to irrigate, knowing the precipitation rate of each individual zone balances the overall applied water and improves DU.
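
As a sketch of that balancing step, the short Python example below converts each zone's measured precipitation rate into a run time that applies the same target depth everywhere. The zone names, rates and target depth are placeholder values.

# Balance run times across zones so each applies the same depth of water,
# using the net precipitation rate measured for each zone.
zone_precip_in_per_hr = {"fairway_1": 0.45, "fairway_2": 0.38, "fairway_3": 0.52}
target_depth_in = 0.30  # depth to apply per irrigation event

for zone, rate in zone_precip_in_per_hr.items():
    runtime_min = target_depth_in / rate * 60  # minutes = depth / rate x 60
    print(f"{zone}: run {runtime_min:.0f} minutes")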

Older controllers without all of the new technology can be just as effective in applying irrigation, given the proper calibration of the system and good management practices. Irrigation management relies on accurate assessments of the precipitation rates of the system. After all, what really matters is the precise application of a pre-determined amount of water at the appropriate interval and time-of-day.

CALIBRATION

Performing a catch cup test is the best way to determine precipitation rates. Catch cups provide the most realistic values and account for the actual conditions at the site. The more catch cups used, the more accurate the final results will be. And, for the same number of cups, using catch cups with a larger surface area will increase accuracy, since a greater percentage of the lawn area can be represented.

The results from an actual catch can test represent the ‘net precipitation rate.’ These values can differ from the gross precipitation rates provided in a catalog by as much as 10-25 percent, sometimes more. This means that not all of the water that passes through a sprinkler nozzle actually reaches the desired turf area.
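
Assuming the catch volumes are recorded in milliliters along with the cup throat area and run time, a minimal Python sketch of the net precipitation rate and low-quarter DU calculation might look like the following; all of the numbers are placeholders.

# Net precipitation rate and low-quarter distribution uniformity (DU)
# from a catch cup test.
CUP_VOLUMES_ML = [22, 25, 19, 28, 24, 21, 17, 26, 23, 20, 27, 18]
THROAT_AREA_IN2 = 3.1   # opening area of each catch cup
RUNTIME_MIN = 20.0

# Convert each catch to a depth in inches (1 mL = 0.0610 in^3).
depths_in = [v * 0.0610 / THROAT_AREA_IN2 for v in CUP_VOLUMES_ML]

avg_depth = sum(depths_in) / len(depths_in)
net_pr_in_hr = avg_depth * 60.0 / RUNTIME_MIN

# Low-quarter DU: average of the lowest quarter of catches divided by the overall average.
low_quarter = sorted(depths_in)[: max(1, len(depths_in) // 4)]
du_lq = (sum(low_quarter) / len(low_quarter)) / avg_depth

print(f"Net precipitation rate: {net_pr_in_hr:.2f} in/hr")
print(f"Low-quarter DU: {du_lq:.2f}")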

To confirm how much of the water actually reaches the turf, record the meter readings before and after conducting a catch cup test. The average depth collected in the cups, divided by the depth applied (calculated from the metered gallons spread across the given lawn area), is the application efficiency of the system.
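
A minimal Python sketch of that efficiency calculation, using placeholder meter readings and a placeholder lawn area, might look like this:

# Application efficiency: average depth caught in the cups divided by the
# depth the meter says was applied over the irrigated area.
meter_start_gal = 104_250
meter_end_gal = 104_930
area_ft2 = 5_400             # irrigated turf area covered by the test
avg_catch_depth_in = 0.18    # average depth from the catch cup test

gallons_applied = meter_end_gal - meter_start_gal
# One gallon spread over 1 ft^2 is 231/144 = 1.604 inches deep.
depth_applied_in = gallons_applied * 1.604 / area_ft2

efficiency = avg_catch_depth_in / depth_applied_in
print(f"Depth applied: {depth_applied_in:.2f} in")
print(f"Application efficiency: {efficiency:.0%}")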

In addition to the determination of the precipitation rate and the DU, graphing the depth in inches collected in each cup as a ‘surface response’ can be very useful. By graphing the catch can data using a spreadsheet program, a visual representation of the overall distribution of coverage can be displayed. This ‘picture’ of the various depths of water looks similar to a topographical map and can be a much better way to diagnose problem areas than reliance on a single ‘grade’ as provided by DU.
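
A spreadsheet surface chart works well for this, and the same picture can also be produced with a short script. The sketch below uses Python with matplotlib and a placeholder grid of catch depths laid out to match the test pattern.

# Contour "surface response" of catch cup depths laid out on the test grid.
import matplotlib.pyplot as plt

# Placeholder 4 x 5 grid of catch depths in inches, one value per cup.
depths = [
    [0.18, 0.21, 0.19, 0.15, 0.17],
    [0.22, 0.25, 0.23, 0.18, 0.16],
    [0.20, 0.24, 0.22, 0.17, 0.14],
    [0.16, 0.19, 0.18, 0.13, 0.12],
]

fig, ax = plt.subplots()
contour = ax.contourf(depths, levels=10, cmap="Blues")
fig.colorbar(contour, label="Catch depth (in)")
ax.set_title("Catch cup surface response")
ax.set_xlabel("Cup column")
ax.set_ylabel("Cup row")
plt.show()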

Short of performing a catch can test, taking meter readings, recording the runtime and determining the landscape area can provide another means of assessing the application rate of a sprinkler system or zone. Measuring the working pressure and the spacing between sprinklers can also be used to adjust the precipitation rate given in a catalog, remembering that not all of the water will reach the turf.

If the spacing between sprinklers within a row and the distance between rows are measured in the field, a more accurate value can be obtained than with the fixed spacing listed in a catalog. As discussed earlier, these are still ‘gross’ precipitation rates and will vary from those determined by an actual catch cup test.
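
One way to make that adjustment, assuming nozzle flow scales roughly with the square root of operating pressure, is sketched below in Python; the catalog and field values are placeholders.

# Adjust a catalog ("gross") precipitation rate using field-measured spacing
# and operating pressure. Nozzle flow is corrected by sqrt(measured / catalog pressure).
import math

catalog_gpm = 3.0          # nozzle flow at the catalog pressure
catalog_psi = 45.0
measured_psi = 38.0        # working pressure measured in the field
measured_head_spacing_ft = 33.0
measured_row_spacing_ft = 31.0

adjusted_gpm = catalog_gpm * math.sqrt(measured_psi / catalog_psi)
gross_pr = 96.25 * adjusted_gpm / (measured_head_spacing_ft * measured_row_spacing_ft)

print(f"Adjusted gross precipitation rate: {gross_pr:.2f} in/hr")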

With more frequent testing and calibration, maximum performance of the irrigation system can be achieved, reducing water waste while maintaining optimum plant health.

Jeffrey Gilbert is a Senior Research Specialist at the School of Plant Sciences at the University of Arizona, Tucson.