I need to measure a concentration that is above the upper limit specified for my test. What can I do?
Add deionized or distilled water to the sample so that it is diluted to a concentration that can be measured. Run the test using the diluted sample, and then multiply the result by the dilution factor to find the concentration in the original sample (before dilution). Make sure reagents are added after the sample has been diluted to ensure that enough reagent is available to react with the parameter being measured.
Some parameters such as ozone, sulfide, ferrous iron, and oxygen scavengers are not suitable for dilution because they will be partially consumed by reactions with oxygen or lost to the atmosphere. In these cases, it is best to use a test method with a higher range. If there are impurities in the dilution water, disinfectants such as chlorine will be consumed, and results will be lower than actual.
What does it mean when a test is "over range"?
When a test produces an "over-range" result, it means that the concentration is above the upper limit specified for the test. The color that develops after reagents are added is darker than the kit or instrument can read.
When a sample is above the specified range, dilute a fresh portion of sample using deionized or distilled water. Run the test using the diluted sample, and then multiply the result by the dilution factor.
How do I dilute my sample to bring it within the test range?
The simplest dilution is a one-to-one or 2-fold dilution, where one part sample is mixed with one part deionized or distilled water. This reduces the concentration of anything in the sample by half. Follow the usual test procedure using the diluted sample, and then multiply the result by 2 to find the concentration of the sample before it was diluted.
- Example: Your sample reads off-scale on your kit or instrument display. Mix 5 mL sample with 5 mL deionized water and repeat the test. The result using this diluted sample is 1.9 mg/L. The concentration of the sample before dilution is (1.9 x 2) = 3.8 mg/L.
If the sample is still over-range after dilution, a larger dilution will need to be made. For example, mix one part sample to two parts deionized water for a 3-fold dilution (multiply result by 3), one part sample to three parts deionized water for a 4-fold dilution (multiply result by 4), and so on. If your test uses the sample as a blank, use the diluted sample as the blank.
It is very important to measure the amount of sample and deionized water accurately when making dilutions. An error in these measurements will lead to an error in the result, which will then be multiplied by the dilution factor. At minimum, measure the sample and deionized water in a graduated cylinder. For best results use a TenSette pipet for measuring the sample, add it to a Class A volumetric flask, and fill to the mark with deionized water. Mix thoroughly before testing.
You can adjust the measured volumes to accommodate the pipets and flasks available to you. For instance, if you need to make a 20-fold dilution, you can measure 1 mL of sample and add 19 mL of dilution water. Or you can measure 5 mL of sample and add 95 mL of water (or dilute to the mark on a 100-mL volumetric flask) to get the same dilution. If only 10 mL is required for running the test, use just 10 mL of this mixture. Use the extra for the blank or duplicate measurements if desired.
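The dilution arithmetic above can be sketched in a few lines of Python (the function names are illustrative, not part of any Hach software):

```python
def dilution_factor(sample_ml, water_ml):
    """Dilution factor = total volume / sample volume."""
    return (sample_ml + water_ml) / sample_ml

def original_concentration(diluted_result, sample_ml, water_ml):
    """Multiply the diluted reading by the dilution factor."""
    return diluted_result * dilution_factor(sample_ml, water_ml)

# 2-fold dilution from the example: 5 mL sample + 5 mL deionized water
print(dilution_factor(5, 5))              # 2.0
print(original_concentration(1.9, 5, 5))  # 3.8 mg/L

# 20-fold dilution: 5 mL sample diluted to 100 mL total
print(dilution_factor(5, 95))             # 20.0
```

The same two functions cover any of the dilutions described above; only the measured volumes change.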
I received a standard solution that needs to be diluted. How do I do this?
Some standard solutions are only available at concentrations well above the range of the test and require dilution before use. Instructions for making these dilutions can be found in the Accuracy Check section of most Hach procedures.
If no instructions are available, choose a concentration for the dilution that falls within the specified test range and decide how much standard to prepare. Then calculate how much of your standard to use from this formula:
mL of standard to use = (concentration needed (mg/L) x amount to prepare (mL)) / (concentration of standard needing dilution (mg/L))
Example: You purchased a 25-mg/L standard for zinc, but you need a concentration of 2 mg/L. To make 100 mL of a 2 mg/L standard:
(2 mg/L x 100 mL) / 25 mg/L = 8 mL
Use a pipet to measure 8 mL of the 25 mg/L standard into a clean 100-mL volumetric flask (a graduated cylinder can be used, but results will not be as accurate). Fill the flask to the 100-mL mark with deionized water. Mix well. Once a standard solution is prepared, use it right away to ensure accurate results.
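The dilution formula above is a rearrangement of C1V1 = C2V2 and is easy to check numerically (a minimal sketch; the function name is illustrative):

```python
def standard_volume_ml(conc_needed, volume_to_prepare_ml, stock_conc):
    """mL of stock standard to use = (concentration needed x volume to
    prepare) / concentration of the stock standard needing dilution."""
    return conc_needed * volume_to_prepare_ml / stock_conc

# Zinc example: prepare 100 mL of a 2 mg/L standard from a 25 mg/L stock
print(standard_volume_ml(2, 100, 25))  # 8.0 mL
```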
View Hach's Standard Solutions Guide
What is a standard solution?
A standard solution is a solution that contains a specified concentration of a parameter, such as chlorine or iron. When a standard solution is used in place of a sample and tested, the result should match the concentration of the standard. A matching result will give confidence that the test is working correctly. Any significant discrepancies (greater than 10%) indicate a problem that must be investigated. Hach carries many Standard Solutions at concentrations that fall within most test ranges.
Some standard solutions are so unstable that they are difficult to make and use and are therefore not available. Such standards include hydrogen sulfide, chlorine dioxide, dissolved oxygen, and ozone. Chlorine standards are available only at high concentrations and must be diluted with high-quality dilution water using glassware that has no chlorine demand.
Standard solutions can also be used to calibrate instruments such as colorimeters and spectrophotometers, as well as electrochemical meters such as pH and pH/ISE meters. Varying concentrations of a standard solution can be used to make a calibration curve that in turn can be used to find the concentration of samples.
How do I use a standard solution to check accuracy?
The Standard Solution method, or using a standard solution in place of the sample, is the simplest way to use a standard solution. Where a Hach test procedure says to measure a portion of sample into a sample cell, add the standard solution to the cell instead, and follow the rest of the steps in the procedure.
If the result is close to the specified standard concentration, you can be confident that your instrument and reagents are working correctly and that you are performing the test correctly. Specific instructions can be found in the Accuracy Check section of most Hach procedures.
What if I get no color, or the wrong color when testing a sample?
There are several possible reasons why you may not see any color develop in a colorimetric test:
- The procedure is not being followed correctly. Carefully review the step-by-step procedure for your test to be sure you are adding all reagents and following all instructions correctly.
- The pH of the sample may not be in the proper range after reagents were added. Check the Sampling and Storage section of your procedure to find the optimum pH range for the test. If you do not find this information, add the reagents to a sample of deionized water and measure the pH of this solution. If the pH of your sample with reagents is significantly different, adjust the pH of a fresh portion of sample to this range using an acid or base, and try the test again.
- The concentration of the parameter you are testing may be below the measurable limit for your test.
- The sample may need to be digested. Check the instructions for your test to see if digestion is mentioned.
- There may be an interference in your sample. Follow the standard additions instructions to find if your sample might have an interference.
If the color that develops in your sample is significantly different from what it should be:
- Check the pH of your sample as described above and adjust if necessary.
- If your procedure mentions sample pretreatment by digestion, make sure you are digesting the sample.
- Dilute the sample. This can often dilute an interfering component to the point at which it no longer interferes. If this does not help, you will need to use a different analysis method.
I have a standard solution to check accuracy with my colorimeter/spectrophotometer or test kit, and I keep getting zero. What is wrong?
Be sure to add reagents to the standard, just as if it were a sample. Color needs to develop in the standard solution the same way that it develops in the sample.
Follow the step-by-step procedure but substitute the standard solution for the sample. For example, add the standard to the sample cell, add reagents, wait any timed steps, and then read the standard in the kit or instrument.
What is the difference between a reagent blank and a sample blank?
A reagent blank refers to a small positive error in test results that comes from the reagents themselves. Hach Company makes every effort to manufacture reagents that have negligible blank values, and typically the values are so small that they do not affect test accuracy.
The reagent blank value is most important to measure and subtract from test results when measuring low concentrations. For example, subtracting a reagent blank value of 0.02 mg/L from a test result of 0.06 mg/L changes the result by more than 30 percent. On the other hand, subtracting a reagent blank value of 0.02 mg/L from a result of 1.23 mg/L changes the result by less than 2 percent.
To measure the reagent blank, use good quality deionized water in place of your sample and run the test as usual, adding reagents and waiting any timed steps. Then subtract this value from your sample results. The reagent blank value can change from one reagent lot to the next. Therefore, measure the reagent blank each time you use a different lot of reagent.
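The blank-subtraction arithmetic, and why it matters more at low concentrations, can be sketched as follows (function names are illustrative):

```python
def blank_corrected(result, reagent_blank):
    """Subtract the measured reagent-blank value from a test result."""
    return result - reagent_blank

def percent_effect(result, reagent_blank):
    """How much, in percent, subtracting the blank changes the result."""
    return 100 * reagent_blank / result

# Values from the text: a 0.02 mg/L blank matters at 0.06 mg/L,
# but barely at 1.23 mg/L.
print(round(blank_corrected(0.06, 0.02), 2))  # 0.04 mg/L
print(round(percent_effect(0.06, 0.02), 1))   # 33.3 percent
print(round(percent_effect(1.23, 0.02), 1))   # 1.6 percent
```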
A sample blank refers to using the sample for zeroing an instrument during a test procedure. A sample blank can correct for potential error from existing color or turbidity in the sample before reagents are added.
When zeroing the instrument on a sample blank, only the color that develops from reaction with the reagents is measured. Because background color and turbidity can vary from sample to sample, a sample blank is most commonly used to zero the instrument.
What are SpecCheck Standards and how are they used?
SpecCheck Standards are colored gel standards designed to check instrument response on Hach colorimeters and spectrophotometers. They provide a quick and simple way to verify that your instrument response has not changed.
SpecCheck Standards consist of one blank and 3 colored standards in round 10-mL sample cells, each set corresponding to a Hach test method such as the DPD chlorine test. Zero the instrument with the blank SpecCheck Cell, and record the concentration of each of the 3 colored standards. Then measure the concentrations periodically to check for any significant changes over time.
SpecCheck Standards are secondary standards and cannot be used for calibrating a spectrophotometer or colorimeter. They can be used only to check instrument performance; they cannot indicate whether reagents are working properly or whether the operator is performing the test correctly.
What are DR/Check Standards?
DR/Check Standards are secondary gel standards having varying shades of black. These standards are used to verify colorimeter or spectrophotometer response at any wavelength.
Use the absorbance mode on your instrument when measuring DR/Checks. Set the instrument to a particular wavelength, zero with the DR/Check blank, and read the absorbance of each of the 3 standards.
Measure the absorbance of these standards periodically to check for any significant changes in instrument response.
Can I use a colorimeter or spectrophotometer to measure turbidity?
Turbidity can be approximated in an instrument such as a colorimeter or spectrophotometer by measuring the decrease in transmitted light caused by turbidity in the sample. This type of measurement, however, is not considered valid by regulatory agencies and does not fit the definition of turbidity from the American Public Health Association.
Transmittance measurements are also susceptible to interferences such as light absorption from color or particle absorption. What's more, no correlation can be made between transmittance measurements (reported as FAU) and measurements from a turbidimeter (reported as NTU). Nevertheless, colorimeters and spectrophotometers are sometimes used for detecting large changes in the turbidity of a water system or for process control.
Any true turbidity measurement must be made in a turbidimeter. Hach Company has been designing and manufacturing turbidimeters for more than 50 years, and has a great deal of expertise in this area.
What does standard additions mean?
The standard additions method is a widely accepted technique for checking the accuracy of a test. This method is particularly useful for determining whether interferences are present in a sample.
A spike, or measured amount, of a standard solution is added to a sample, and the concentration is measured before and after the spike. The concentration after the spike should increase by a known amount. If the sample contains an interference, the increase in concentration may be less than or greater than the expected concentration.
You can find step-by-step instructions for running a standard additions test in the accuracy check section of most Hach procedures.
How can I perform standard additions?
Detailed instructions for performing standard additions can be found in the Accuracy Check section of most Hach procedures. You will find the concentration and exact volumes of standard solution to use. Many Hach instruments will record the concentrations and amount of standard used and calculate the difference between the expected and actual result.
If these instructions are not available, add standard solution to the sample three times in equal increments, for instance 0.1, 0.2, and 0.3 mL. For colorimetric tests, add the standard to fresh sample portions, before reagents are added. Run the test as usual and measure the concentration after each addition. Compare the actual results to the expected results to determine if there is a discrepancy.
To determine the expected concentration, multiply the volume of standard added by the standard concentration, and then divide by the volume of the sample (the volume of standard added should be included in the sample volume for best accuracy). For more information, see the Standard Additions Instruction Sheet in the Technical Reference Database section in the Download Documentation area.
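The expected-increase calculation above can be sketched as follows. The spike volumes match the 0.1, 0.2, and 0.3 mL increments suggested earlier; the 50 mg/L standard concentration and 25 mL sample volume are illustrative assumptions, not values from any specific Hach procedure:

```python
def expected_increase(std_conc_mg_l, std_vol_ml, sample_vol_ml):
    """Expected concentration increase from one spike: volume of standard
    times its concentration, divided by the total (spiked) volume."""
    return std_conc_mg_l * std_vol_ml / (sample_vol_ml + std_vol_ml)

# Illustrative numbers: 0.1, 0.2, and 0.3 mL of a 50 mg/L standard
# spiked into 25 mL portions of sample.
for spike_ml in (0.1, 0.2, 0.3):
    print(round(expected_increase(50, spike_ml, 25), 3))
# -> 0.199, 0.397, 0.593 mg/L expected increases
```

Note that the denominator includes the spike volume, per the "included in the sample volume for best accuracy" guidance above.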
What are interferences?
Interferences are substances in water that can interfere with the chemical reactions involved in a test.
Interferences can cause results that are higher or lower than the correct result.
How can I check for interferences in my sample?
The standard additions method is the best way to find whether your sample contains an interference. This method involves adding a spike, or known amount, of the parameter you are testing directly to the sample. If you measure the sample before and after the spike, you should see the result increase after the spike by an expected amount.
- Example: you test a sample for copper and get a result of 1.0 mg/L copper. Then you spike a fresh portion of sample with 0.5 mg/L copper. You expect to see a result of 1.5 mg/L after testing the spiked sample.
If the test on the spiked sample is significantly lower or higher than 1.5 mg/L, your sample may contain an interference with the copper test. Be sure that you are running the test correctly and can get a correct result using a standard solution.
Most tests require a specific pH range to work properly, and reagents include buffers to adjust the pH to that level. If your sample pH is extremely low or high, or has very high alkalinity, it is possible that the sample is not adjusted to the correct pH.
If your sample contains an interferent, the best thing to do is try diluting it to a level at which it does not interfere. If this does not help, you will need to use a different analysis method.
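One common way to express the spike check numerically is percent recovery; this sketch uses the copper example above (the 100% acceptance idea is a general convention, not a Hach-specific limit):

```python
def percent_recovery(unspiked_result, spiked_result, expected_increase):
    """Spike recovery: the measured increase as a percent of the expected
    increase. Recoveries far from 100% suggest an interference."""
    return 100 * (spiked_result - unspiked_result) / expected_increase

# Copper example from the text: 1.0 mg/L before a 0.5 mg/L spike.
print(round(percent_recovery(1.0, 1.5, 0.5), 1))  # 100.0 -> as expected
print(round(percent_recovery(1.0, 1.2, 0.5), 1))  # 40.0  -> possible interference
```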
My regulator requires me to perform a calibration on my colorimeter or spectrophotometer. Is this a good idea?
First make sure that your regulator requires a new calibration and not just a calibration check for the test you run. A new calibration is not only time-consuming but can also introduce error and produce erroneous data. Hach instruments are calibrated by experienced chemists using multiple lots of reagent and standard and multiple instruments, and are independently verified to ensure accuracy.
It is safer to use a standard solution to verify the existing calibration curve in an instrument rather than enter a new calibration into the instrument. If the correct result is obtained using a standard solution, a new calibration is not necessary.
If the resulting concentration is significantly different from that of the standard (more than 10-20 percent), make sure the procedure is followed exactly, that the correct program number on the instrument is used, that the correct sample cells and specified reagents are used, and that glassware has been cleaned thoroughly.
Be sure also that the chemical form of the standard solution matches that of the result. For example, if the label on a phosphorus standard solution reads as PO4, be sure the result from the instrument or kit is also displayed as PO4 and not as P.
If your regulator requires a new calibration, see the question on using standard solutions to prepare a calibration curve.
How do I prepare a calibration curve in my colorimeter or spectrophotometer?
Hach colorimeters and spectrophotometers are carefully calibrated during manufacture and do not require additional calibration. However, these instruments will allow calibrations for colorimetric tests if desired.
To calibrate an instrument for a particular test, put the instrument in absorbance mode, select the wavelength for the test, and prepare 3 to 10 different concentrations of standard solutions that span the full test range. Follow the Hach procedure (or your own procedure) to develop color in each standard, and then measure the absorbance of each standard in the instrument.
Most Hach instruments can record the absorbance and concentration of each standard, and then calculate a calibration curve. See your instrument manual for specific instructions on setting your instrument up to do this.
After your instrument is calibrated, it will calculate the concentration of unknown samples using your calibration curve.
More in-depth information on preparing calibration curves can be found in "An Introduction to Standards and Quality Control for the Laboratory" in the Hach Learning Library.
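The curve the instrument calculates is typically a least-squares line through the standard points (absorbance versus concentration, following Beer's law). This sketch shows the underlying math; the standard concentrations and absorbances are hypothetical:

```python
def linear_fit(concentrations, absorbances):
    """Least-squares line (slope, intercept) through the standard points;
    per Beer's law, absorbance rises linearly with concentration."""
    n = len(concentrations)
    sx, sy = sum(concentrations), sum(absorbances)
    sxx = sum(x * x for x in concentrations)
    sxy = sum(x * y for x, y in zip(concentrations, absorbances))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def read_concentration(absorbance, slope, intercept):
    """Invert the calibration line to get a sample concentration."""
    return (absorbance - intercept) / slope

# Hypothetical standards spanning a test range (mg/L vs. absorbance)
concs = [0.0, 1.0, 2.0, 3.0]
abss = [0.00, 0.25, 0.50, 0.75]
slope, intercept = linear_fit(concs, abss)
print(round(read_concentration(0.40, slope, intercept), 2))  # 1.6 mg/L
```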
How can I convert percent (%) to mg/L and vice versa?
Multiply percent by 10,000 to get mg/L (1% = 10,000 mg/L).
Multiply mg/L by 0.0001 to get percent.
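As a quick sketch of both conversions (function names are illustrative):

```python
def percent_to_mg_per_l(percent):
    """1% = 10,000 mg/L."""
    return percent * 10_000

def mg_per_l_to_percent(mg_per_l):
    """The inverse: divide mg/L by 10,000."""
    return mg_per_l / 10_000

print(percent_to_mg_per_l(1))    # 10000 mg/L
print(mg_per_l_to_percent(500))  # 0.05 %
```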
How can I convert mg/L to ppm?
1 mg/L = 1 ppm for dilute aqueous solutions. For example, a chlorine concentration of 1.8 mg/L chlorine is equivalent to 1.8 ppm chlorine.
How can I convert grains per gallon (gpg) to milligrams per liter (mg/L)?
Multiply gpg by 17.1 to get mg/L (1 gpg = 17.1 mg/L). For example, a hardness test result of 3 gpg as CaCO3 is equivalent to 51 mg/L as CaCO3.
How can I convert percent transmittance to absorbance, or absorbance to percent transmittance?
To convert a value from percent transmittance (%T) to absorbance, use the following equation:
- Absorbance = 2 - log(%T)
- Example: convert 56%T to absorbance:
- 2 - log(56) = 0.252 absorbance units
To convert a value from absorbance to percent transmittance, use the following equation:
- %T = antilog (2 - absorbance)
- Example: convert an absorbance of 0.505 to %T:
- antilog (2 - 0.505) = 31.3 %T
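Both conversions are easy to script with a base-10 logarithm; this sketch reproduces the two worked examples above:

```python
import math

def percent_t_to_absorbance(percent_t):
    """A = 2 - log10(%T)."""
    return 2 - math.log10(percent_t)

def absorbance_to_percent_t(absorbance):
    """%T = antilog(2 - A) = 10 ** (2 - A)."""
    return 10 ** (2 - absorbance)

print(round(percent_t_to_absorbance(56), 3))     # 0.252 absorbance units
print(round(absorbance_to_percent_t(0.505), 1))  # 31.3 %T
```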
What is a grain per gallon (gpg)?
A grain is a historic unit of weight, originally defined in England as the weight of a barleycorn. One grain is equivalent to 64.799 milligrams. A grain per gallon is the weight of a substance in one gallon of water, similar to a milligram per liter.
Converting grains per gallon to milligrams per liter:
- 1 grain = 64.799 milligrams
- 1 gallon = 3.785 liters
- 64.799 mg divided by 3.785 L = 17.1 mg/L
- 1 gpg = 17.1 mg/L
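The derivation above in code form (constants taken directly from the list):

```python
GRAIN_MG = 64.799  # 1 grain in milligrams
GALLON_L = 3.785   # 1 US gallon in liters

GPG_TO_MG_PER_L = GRAIN_MG / GALLON_L
print(round(GPG_TO_MG_PER_L, 1))   # 17.1

# Hardness example: 3 gpg as CaCO3
print(round(3 * GPG_TO_MG_PER_L))  # 51 mg/L as CaCO3
```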
What is the difference between accuracy and precision?
Accuracy refers to how close your result is to the true value. Precision refers to how well you can repeat your results using fresh portions of the same sample.
Although good precision suggests good accuracy, precise results can be inaccurate. For example, a result of 1.0 mg/L using a 1.0 mg/L standard solution is a very accurate result. Results of 1.73 mg/L, 1.74 mg/L, and 1.75 mg/L using a 1.0 mg/L standard solution show good precision but poor accuracy.
The best way to check accuracy and precision is to use a standard solution that has a known concentration. If your results using the standard are close to the standard concentration, your result is accurate. If you repeat the test several times with fresh portions of the standard and get similar results, your results are precise.
What is Standard Methods?
Standard Methods for the Examination of Water and Wastewater is a standard reference text used in water plants worldwide for testing all aspects of water quality. It contains hundreds of USEPA-approved laboratory procedures and represents the best current practice of water and wastewater analysis. It is published jointly by the American Public Health Association, Water Environment Federation, and American Water Works Association.
Does Hach have a test for potassium permanganate?
Hach does not have a direct method for potassium permanganate; however, Hach Method 8034, periodate oxidation for high-range manganese, can be adapted to measure permanganate.
To adapt this method for permanganate, do not add the sodium periodate reagent. Add the buffer reagent as usual. The reaction time is not necessary.
Multiply your manganese results by 2.16 to get results as permanganate (MnO4), or by 2.88 for results as potassium permanganate (KMnO4), or select the MnO4 or KMnO4 display form on your instrument. This is not a highly sensitive test and may not measure accurately below 1.5-2 mg/L KMnO4.
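The two conversion factors applied in code (the 1.5 mg/L manganese reading is illustrative only):

```python
MN_TO_MNO4 = 2.16   # manganese result -> permanganate (MnO4)
MN_TO_KMNO4 = 2.88  # manganese result -> potassium permanganate (KMnO4)

def as_mno4(mn_result):
    """Report a manganese result as permanganate."""
    return mn_result * MN_TO_MNO4

def as_kmno4(mn_result):
    """Report a manganese result as potassium permanganate."""
    return mn_result * MN_TO_KMNO4

# Illustrative manganese reading of 1.5 mg/L:
print(round(as_mno4(1.5), 2))   # 3.24 mg/L as MnO4
print(round(as_kmno4(1.5), 2))  # 4.32 mg/L as KMnO4
```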
Do you have tests especially for saltwater aquariums?
The Saltwater Master Test Kit, Catalog No. 2068600, measures ammonia, nitrate, nitrite, and pH in saltwater samples and is easy to use. Individual kits for these parameters are available from Aquarium Systems Inc., www.aquariumsystems.com.
Hach also has test kits for copper, alkalinity, phosphate, and hardness that can be used for testing saltwater.