Testing an NIR sample on farm

Farmers getting their forages analysed using NIRS (near infrared spectroscopy) could be getting a “poor deal” with investigations showing potential for major inaccuracies in some of the results being produced.

More worryingly, this could already have led to some farmers over-feeding expensive protein sources this year, as a result of labs underestimating true forage potential.

Independent consultant Dr Dave Davies of Silage Solutions sampled 177 grass silages in January 2017 using wet chemistry and NIRS analysis, as an add-on to an AHDB-funded study looking at silage losses on 20 English farms.

See also: Farmers face forage shortages and soaring feed costs

Speaking exclusively to Farmers Weekly, he said he found that while NIRS predicted the dry matter in line with wet chemistry analysis, it fell short when it came to sugar predictions.

In fact, NIRS under-predicted true sugar values by seven percentage points on average compared with wet chemistry, and the difference became greater the higher the silage sugar content.

Problems this causes on farm

In a farm situation, Dr Davies said this could equate to 60g more sugar being fed per day than the cow actually needs, which could lead to metabolic disease issues such as acidosis.

“There’s a lot of early-cut silage this year that’s not being rationed correctly,” he warned.

Dr Davies said the failing did not mean the NIRS technique was inferior, but was reflective of the databases labs used to predict the results.

These databases are based on historic chemical analysis and are then used to predict the quality of fresh silage samples submitted using near infrared spectroscopy techniques.

But, according to Dr Davies, unless these databases are kept up to date, it is difficult to predict silage quality accurately.

“Forages change year on year. This year, early and multi-cut grass silages have higher protein contents; the database should therefore change year on year too,” he added.

He warned:

  • He had come across one forage analysis in which pure wholecrop beans had been analysed as a wholecrop cereal silage, because there was no like-for-like data
  • Another grass silage analysis had an NDF of more than 100%, which he said was “impossible” and must have been due to the silage falling outside the database’s parameters

Dr Davies heavily criticised the datasets used by some UK labs for not keeping pace with the marketplace. He said silage quality on some farms had improved drastically in recent years, but databases had not.

“It’s not NIRS that’s the problem. NIRS is idiot-proof, providing you get the calibration right.”

He urged farmers to put pressure on companies carrying out testing by asking them more probing questions (see “What to look out for to ensure you’re getting a reliable result” below).

He added: “The more industry pressure that’s put on them the more they will have to come up with the goods.

“Things have to change because this technique should be good and it’s messing up rations.”

What to look out for to ensure you’re getting a reliable result

  1. If you’ve had your silage analysed and it has a negative value, the result is wrong. While a zero is possible, negative values are impossible, so discard the results.
  2. The percentage sum of NDF, ash, crude protein, oil, sugars/starch, lactic acid and volatile fatty acids should fall between 85% and 95%.
  3. Ask if the lab has a specific calibration for the forage you want analysing if it’s something other than maize or grass silage such as beans or mixed legume wholecrop. If not, there’s no point in getting it analysed.
  4. Find out how many UK samples they have in their dataset from the last two years, to check they are updating it regularly. Some databases are based on samples from other European countries, which won’t be as accurate because forage growing conditions differ.
  5. While the average data show higher silage proteins this year, still ask for protein analysis to be done by wet chemistry, because NIRS-predicted proteins are coming out lower than the true protein value may be. If the wet chemistry result is very different from the NIRS one, ask for your money back.
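Checks 1 and 2 above lend themselves to a quick automated sanity check. The sketch below is illustrative only: the function name and the component figures are hypothetical, not taken from any real lab report.

```python
def sanity_check(components):
    """Flag analysis results that fail the two basic checks above.

    `components` maps analyte names to values in % of dry matter.
    Returns a list of problem descriptions (empty if both checks pass).
    """
    problems = []
    # Check 1: negative values are impossible (zero is possible).
    for name, value in components.items():
        if value < 0:
            problems.append(f"{name} is negative ({value}%) - discard the result")
    # Check 2: the listed components should sum to roughly 85-95% of dry matter.
    total = sum(components.values())
    if not 85 <= total <= 95:
        problems.append(f"component sum {total:.1f}% falls outside 85-95%")
    return problems

# Illustrative grass silage figures (% of DM) - these sum to 85%, so both checks pass
report = {"NDF": 45.0, "ash": 8.5, "crude protein": 14.0, "oil": 3.5,
          "sugars": 3.0, "lactic acid": 9.0, "VFA": 2.0}
print(sanity_check(report))  # prints [] - no problems found
```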

Response 

The Forage Analysis Assurance Group (FAA), which represents the companies carrying out NIRS lab assessment, said it was “disappointed” by the trial, which it said cast doubt on the validity of NIRS in the UK and was being aired publicly, rather than in a scientific forum.

“We understand the data have neither been published nor submitted to critical peer review. Without access to the data and the statistical analysis we cannot offer an opinion on either the rigour of the method or the validity of the results and conclusions drawn.”
“With conserved forage representing a major proportion of dry matter intakes, accuracy of analysis is paramount to allow cost-effective diets. As a group, we are committed to enabling all of our members to undertake the most accurate analysis of the sample presented.

“Nevertheless, the biggest single factor affecting the accuracy of diet formulation on farm is the quality of the sample actually provided for analysis and the frequency of sampling. 

“Unless the sample is representative of the entire clamp face, it will never give a true reflection of the forage fed and, as such, accurate ration formulation will be difficult.

“Equally, the quality will vary throughout the clamp (different cuts, different fields, different swards, different weather) and so representative samples should be taken regularly and diets fine-tuned appropriately.”

The FAA said that numerous steps had been taken to ensure the validity of NIRS, including:

  • The establishment of a UK-specific database that has been consistently refined to allow accurate NIRS testing of fresh preserved forages
  • Adherence to ring-testing methodology, whereby common samples are analysed at participating laboratories, to help drive improved consistency and reduce variation between testing facilities
  • Many members also carry out their own additional QC procedures, comparing ISO 17025-accredited wet chemistry with NIRS outputs, to maximise the ongoing accuracy of routine silage analyses
  • Where samples analysed by NIRS do not fit the normal analytical range for the forage type under test, samples can be checked according to procedures which typically include wet chemistry

Wet chemistry and NIR explained

Wet chemistry relies on traditional laboratory-based approaches for each individual analyte.

To determine dry matter, a sample will be placed in an oven at between 80C and 100C until it is dry.

It will be weighed before placing in the oven and again after drying and the %DM calculated.
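The %DM arithmetic is simple enough to show directly. A minimal Python sketch, using illustrative weights rather than real lab figures:

```python
def percent_dry_matter(fresh_weight_g, dry_weight_g):
    """%DM = (weight after oven drying / fresh weight) x 100."""
    return dry_weight_g / fresh_weight_g * 100

# e.g. 100 g of fresh silage that weighs 28 g once oven-dried
print(percent_dry_matter(100.0, 28.0))  # 28.0, i.e. 28% DM
```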

For pH, a known weight of fresh silage sample will be added to a specific volume of water and mixed for a set amount of time, depending on the exact method, and the pH then measured with a pH electrode.

NIRS analysis involves shining near infrared light at the sample. The chemical bonds within the sample absorb light at characteristic wavelengths, and the light reflected back is measured by the spectrometer.

At each wavelength an intensity is measured to give a spectral profile. Then, using chemometrics (mathematical models), a prediction of the analytes is made from samples that have previously been measured by wet chemistry and scanned by NIRS to build the relevant database.

The more samples of a similar type in the database, the more accurate the prediction.

There need to be enough samples for each individual analyte being predicted, and they have to be representative of the samples assessed.

For example, if all the grass silages in the database range between 18% and 30% DM and a grass silage sample of 50% DM is assessed, the prediction will be less accurate than for a sample at 30% DM.
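The calibration-then-prediction workflow described above can be pictured with a deliberately simplified model. Real labs use more sophisticated chemometrics (typically partial least squares), but a plain least-squares fit in NumPy shows the principle; every number below is synthetic, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "database": 50 reference samples, each with a 10-wavelength NIR
# spectrum and a sugar value measured by wet chemistry.
n_samples, n_wavelengths = 50, 10
true_coef = rng.normal(size=n_wavelengths)  # unknown in real life
spectra = rng.normal(size=(n_samples, n_wavelengths))
sugar_wet_chem = spectra @ true_coef + rng.normal(scale=0.1, size=n_samples)

# Calibration: fit a linear model mapping spectrum -> analyte value.
coef, *_ = np.linalg.lstsq(spectra, sugar_wet_chem, rcond=None)

# Prediction for a fresh sample scanned by NIRS but never measured by
# wet chemistry - trustworthy only if the database resembles the sample.
new_spectrum = rng.normal(size=n_wavelengths)
predicted_sugar = new_spectrum @ coef
```

This also illustrates the extrapolation problem: a model calibrated only on 18-30% DM silages has never "seen" a 50% DM sample, so any prediction there sits outside the fitted range.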