History and Instrumentation of the Public Service Laboratory, Illinois State Water Survey

Public Service Laboratory


The Illinois State Water Survey (ISWS) was founded as a part of the University of Illinois Chemistry Department in 1895 to survey the waters of Illinois to trace the spread of water-borne diseases.  Typhoid epidemics had hit the nation in 1893, and, while it was difficult to conclusively identify the origin of these outbreaks, circumstantial evidence indicated that contaminated water supplies were the primary factor. 

Arthur Palmer
Arthur Palmer was the first Director of the Illinois State Water Survey, serving from 1895 to 1904.

Professor Stephen A. Forbes, a well-known and respected scientist and Director of the State Laboratory of Natural History, had a strong interest in aquatic biology, and, in 1894, opened a biological field station on the Illinois River at Havana.  Chemical analyses of water samples from the Havana station were performed at the University of Illinois by Professor Arthur W. Palmer, and through these studies, Professor Palmer began to collect a significant amount of data on surface water quality in Illinois. 

More comprehensive laboratory studies of the state's water supplies became possible after the Illinois legislature, in 1895, provided additional funds and improved laboratory facilities for the University and Professor Palmer.  Residents of Illinois could request chemical analysis of water samples, and these would be done free of charge.  Realizing the importance of clean drinking water and the relationship of water impurities to disease, Palmer set initial survey goals of determining sanitary conditions of the state's water supplies and setting local standards of purity. 

Noyes Lab
In 1902 the Water Survey moved to the new Noyes Laboratory on the campus of the University of Illinois in Urbana, Illinois.

During the first 15 months of the survey, 1787 samples were analyzed in Palmer's chemical laboratories.  A fire in 1896 destroyed the laboratory and most of the water survey records, but the study continued at the University.  Professor Palmer recommended that the scope of the work be expanded, and the Illinois General Assembly provided funding to establish the State Water Survey as an institution, operating within the University of Illinois.

Existing Water Survey water sampling records begin with sample number 1788, dating back to January 1897.

PSL logbook
Cover of the earliest known Public Service Laboratory sample logbook.
Earlier records were destroyed in the 1896 fire.
PSL logbook
The first page of the logbook shows sample 1788.  This sample dates back to January 1897 and marks the start of surviving records.

Most samples from 1896 were taken from homes in areas where typhoid or diphtheria was present, and Palmer found excessive nitrates in many of these water samples.  Routine Water Survey sample analysis included testing for turbidity, sediment, color, residue on evaporation, and several chemical parameters.   Interest in the Water Survey's program spread, and by the end of 1899, 6,597 samples had been analyzed.  By 1910, this number had grown to 21,715.  The ISWS continues to provide water testing services to Illinois residents, with the total number of samples analyzed standing at over 230,000.

Water Survey lab
Photograph showing a typical Water Survey laboratory in the 1930s.
Water Survey photo

Early Water Survey photographs, taken when the Water Survey was located in Noyes Laboratory.


Testing kit
Photograph showing a young man with a field testing kit in about 1917.
Water Survey lab
Photograph showing a typical Water Survey laboratory in the 1960s.
Evolution of Testing Methods


Turbidity

One of the early parameters tested was turbidity.  The normal procedure in 1912 used the turbidity standard adopted by the U. S. Geological Survey:  "a water which contains 100 parts of silica per million in such a state of fineness that a bright platinum wire one millimeter in diameter can just be seen when the center of the wire is 100 millimeters below the surface of the water and the eye of the observer is 1.2 meters above the wire."  A rod with a platinum wire on the end was calibrated by placing graduation marks on the rod, at various distances from the end, and this was lowered into the water as far as the wire could be seen.  The vanishing depth was compared to a table of known values to get the measured turbidity.

A suggested alternative to the platinum wire method was the use of a candle turbidimeter.  This consisted of a graduated glass tube with a flat polished bottom, enclosed in a metal case.  Observations were made looking vertically down through the tube at an image of an English standard candle.  Water samples were poured into the tube until the image of the candle disappeared, and this depth was compared to a table of known values to determine the turbidity.
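Both visual procedures reduced to the same arithmetic: read a depth, then interpolate a turbidity from a calibration table.  A minimal sketch of that lookup, using illustrative depth/turbidity pairs (not the historical USGS or candle tables):

```python
# Hypothetical (vanishing depth in mm, turbidity) calibration pairs.
# These values are illustrative only, not the historical tables;
# turbidity falls as the depth at which the target can still be seen grows.
CALIBRATION = [(20, 1000), (50, 400), (100, 100), (200, 40), (400, 10)]

def turbidity_from_depth(depth_mm):
    """Linearly interpolate a turbidity value from a vanishing-depth reading."""
    pts = sorted(CALIBRATION)
    if depth_mm <= pts[0][0]:
        return pts[0][1]
    for (d0, t0), (d1, t1) in zip(pts, pts[1:]):
        if depth_mm <= d1:
            return t0 + (depth_mm - d0) / (d1 - d0) * (t1 - t0)
    return pts[-1][1]
```

A reading between two table entries is interpolated; a reading off either end clamps to the nearest entry, much as an analyst would report a value beyond the limits of the printed table.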

By 1933, the Jackson candle turbidimeter became the standard for making turbidity measurements.  This instrument could not be used for samples with turbidity less than 25, however, and in these cases, turbidity measurements were made by other means.  For samples with turbidities between 5 and 25,  determinations were made by comparing samples to standards in clear glass bottles with a capacity of 1 liter or greater.  For samples between 0 and 2, a similar procedure involving long glass tubes and a Baylis turbidimeter was used.  

The Jackson candle turbidimeter is still referenced in today's methods, but modern turbidimeters measure the amount of light scattered by a sample, and this is compared to results from known formazin suspensions.  These nephelometric methods offer better precision, better sensitivity, and an increased operating range over older visual methods, and consequently the Jackson candle method is no longer listed in Standard Methods.  Turbidity today is expressed in nephelometric turbidity units (NTU).

Today's turbidimeters are more precise, have better sensitivity, and operate over larger ranges than older visual methods.


Nitrate

Early (around 1912) nitrate procedures--primarily a phenolsulfonic acid reaction and a reduction-to-ammonia reaction--varied depending on chloride content, and both were subject to considerable error.  The phenolsulfonic acid method, recommended for chloride less than 30 parts per million (ppm), called for evaporation of about 20 milliliters of sample, followed by reaction with phenolsulfonic acid (a mixture of phenol and sulfuric acid), then addition of a base to make the solution alkaline.  The mixture was transferred to a 100-milliliter Nessler tube, and the color was examined vertically through the tube and compared to prepared standards.  Nitrates caused a yellow color to form, and if samples contained high natural background color, they had to be decolorized (with a separate reaction) before the phenolsulfonic acid procedure.

Nessler tubes
Nessler tubes were commonly used in methods producing colored solutions.  The density of the color was viewed vertically through the tube, looking down onto a white surface.

Variations of the phenolsulfonic acid method continued to be used through the 1940s, with increased attention paid to eliminating potential interferences due to color, alkalinity, chloride, and nitrite.

By 1975, there were several potential nitrate methods:  ultraviolet spectrophotometry, nitrate electrodes, cadmium reduction to nitrite followed by derivatization and colorimetric measurement, reaction with brucine and colorimetric measurement, reaction with chromotropic acid and colorimetric measurement, and reduction with Devarda's alloy followed by ammonia analysis.  Because of the difficulties of the complex procedures, limited concentration ranges, and potential interferences, screening techniques were described to help chemists choose the best method.

Modern nitrate methods include improved variations of previous methods, along with a separation technique known as ion chromatography.  Today's instruments can also be combined with autosamplers and computers to provide faster and better results.

Ion chromatograph
Analyses for nitrate and other anions can be performed using an ion chromatograph.   

This instrument eliminates many potential interferences, and several analytes can be determined in a single run.

pH, Alkalinity, and Acidity

Alkalinity and acidity in 1912 were determined using various types of titrations and endpoint indicators.  Lacmoid and erythrosine solutions were used for alkalinity endpoints in titrations with sulfuric acid.  Other indicators, including methyl orange, were also used by some analysts, but the use of methyl orange was not recommended due to various complications and interferences. Phenolphthalein could also be used as a follow-up to the lacmoid or erythrosine procedures to help determine the specific type of alkalinity, whether due to carbonates, bicarbonates, or hydrates (caustic).  Similar titrations were done to determine acidity (where applicable), using sodium carbonate as a titrant and phenolphthalein or methyl orange as endpoint indicators.
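The arithmetic behind such a titration is straightforward: the alkalinity, expressed as calcium carbonate, follows from the volume and normality of the acid used.  A sketch using the conventional formula (the 0.02 N sulfuric acid and 100 mL sample below are illustrative values, not figures from the 1912 procedure):

```python
def alkalinity_as_caco3(acid_ml, acid_normality, sample_ml):
    """Alkalinity in mg/L as CaCO3 from a titration to the indicator endpoint.

    The factor 50,000 converts equivalents of acid to milligrams of CaCO3
    (equivalent weight of CaCO3 = 50 g, i.e. 50,000 mg per equivalent).
    """
    return acid_ml * acid_normality * 50000.0 / sample_ml

# 10.0 mL of 0.02 N H2SO4 to reach the endpoint of a 100 mL sample:
print(alkalinity_as_caco3(10.0, 0.02, 100.0))  # 100.0 mg/L as CaCO3
```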

Indicators were used in early water analysis methods and are still being used today.


Methods in 1933 were very similar, using phenolphthalein and methyl orange indicators, but there was more discussion of actual hydrogen ion concentrations and specific values at certain endpoints.  The hydrogen ion concentration was listed in methods separate from alkalinity or acidity, and the distinction was made between the "intensity" property of a hydrogen ion measurement and the "quantity" property of an alkalinity or acidity determination.  The acid or alkaline intensities could be expressed in terms of the hydrogen ion concentration in moles per liter, or by the more convenient pH scale, based on the negative logarithm of the hydrogen ion concentration.  Although the use of a hydrogen electrode and potentiometer was listed as a possible method for determining pH,  colorimetric/indicator methods were described as being "satisfactory for most control and research problems in water work."
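The relationship described above is direct: pH is the negative base-10 logarithm of the hydrogen ion concentration in moles per liter.  For example:

```python
import math

def ph_from_concentration(h_ion_molar):
    """pH = -log10 of the hydrogen ion concentration (mol/L)."""
    return -math.log10(h_ion_molar)

# Neutral water at 25 C has [H+] = 1e-7 mol/L:
print(round(ph_from_concentration(1e-7), 2))  # 7.0
```

A tenfold change in concentration shifts the pH by exactly one unit, which is why the logarithmic scale was considered the more convenient expression of acid or alkaline "intensity."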

Burets are used to dispense accurate quantities of reagents in a titration.  
An indicator is used to determine the endpoint (color change).

By 1946, the hydrogen electrode, which makes use of a platinum electrode and hydrogen gas, was recognized as the standard to which other methods and electrodes must be referred, but practical difficulties made this method inappropriate for daily use.  The glass electrode had emerged and was described as the most accurate of the existing methods for determining pH in a wide variety of solutions.  The Beckman Model G pH meter, using a glass electrode, reference electrode, and a current amplification system, was  a major revolution in pH measurement and in general scientific instrumentation.  Invented in 1934 by Arnold O. Beckman and produced until about 1950, this pH meter was easy to use, and it packaged its extensive electronics and electrodes in a compact design.  

pH meter
The Beckman Model G pH meter was initially made to measure the pH of lemons.  Its popularity increased as chemists realized its potential. It was compact, self-contained, and could quickly and easily measure the pH of almost any solution.

The instrument shown here was purchased in 1956. 

Although instruments and electrodes have changed and improved somewhat over time, they are still similar to their predecessor, the Beckman Model G.  Glass combination electrodes have replaced separate glass and reference electrodes, and meters are now smaller and better due to improved electronics.  Functionality has increased with computer interfaces and the addition of automatic sampling devices.

Modern instruments
Modern instruments monitor pH values, control titrations, run automated sample changers, and transfer data to computers.

Hardness, Calcium, Magnesium (and other Metals)

Hardness in water has long been recognized as being caused by the presence of certain minerals, primarily salts of calcium and magnesium.  These minerals cause precipitation of soap solutions, and consequently hard water requires excessive amounts of soap before a lather can be formed.  These observations were used in early methods (1912 and earlier) to test for hardness, which was defined as the soap-destroying power of water.  Soap solution was added to a water sample, followed by vigorous shaking, until a continuous lather was formed over the entire surface of the sample.  The amount of soap required was compared to results in a table of known values to determine the total hardness, expressed in parts per million as calcium carbonate.  Hardness could also be expressed in the English degrees of hardness:  grains of calcium carbonate per imperial gallon.
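The two units of hardness mentioned above are related by fixed constants: one English (Clark) degree is one grain of calcium carbonate per imperial gallon, which works out to roughly 14.25 parts per million.  A small conversion sketch:

```python
GRAIN_MG = 64.79891          # one grain, in milligrams
IMPERIAL_GALLON_L = 4.54609  # one imperial gallon, in litres

# mg/L of CaCO3 corresponding to one English (Clark) degree (about 14.25)
MG_PER_DEGREE = GRAIN_MG / IMPERIAL_GALLON_L

def ppm_to_english_degrees(ppm_caco3):
    """Convert hardness in ppm (mg/L) as CaCO3 to English degrees."""
    return ppm_caco3 / MG_PER_DEGREE
```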

Soap solutions
Soap solutions were used in early methods to determine the hardness of the water.
The harder the water, the more soap it takes to form a lather.

A more accurate method of total hardness determination was by computation from the results of separate calcium and magnesium analyses.  This procedure was listed in Standard Methods as a "mineral analysis" and also determined silica, iron, aluminum, and manganese.  The process involved a series of evaporations, treatments with chemicals (acids, bases and others), dissolutions, precipitations, filtrations, flame ignitions, and titrations, and it could last several hours.  This was not considered necessary for ordinary analytical work, but it was used as a control for the soap method.  
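The calculation method survives in modern editions of Standard Methods in a compact form: each metal's concentration is scaled by the ratio of the equivalent weight of calcium carbonate to that of the metal.  A sketch of that computation (the 40 and 10 mg/L inputs are illustrative values):

```python
# Ratios of the CaCO3 equivalent weight to those of calcium and magnesium
CA_FACTOR = 2.497  # 100.09 / 40.08
MG_FACTOR = 4.118  # 100.09 / 24.31

def total_hardness(ca_mg_per_l, mg_mg_per_l):
    """Total hardness, mg/L as CaCO3, from calcium and magnesium results."""
    return CA_FACTOR * ca_mg_per_l + MG_FACTOR * mg_mg_per_l

# e.g. 40 mg/L calcium and 10 mg/L magnesium:
print(round(total_hardness(40, 10), 2))  # 141.06
```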

Use of the soap method continued through at least 1946, with the calculation (from separate calcium and magnesium determinations) method considered the more accurate procedure.  Additional procedures had also been developed, including the palmitate method (titration and precipitation with potassium palmitate) and the soda reagent method (precipitation with a mixture of sodium hydroxide and sodium carbonate, followed by titration). 

By 1960, the soap procedure was no longer listed as a standard method.  The preferred, more accurate procedure was calculation, with an EDTA titration method available as an alternative for rapid analyses.  Accurate calcium determinations could be done using either a gravimetric method (lengthy) or a permanganate titrimetric method (for larger numbers of samples).  EDTA titration was available as a simple alternative, but care had to be taken to avoid magnesium reactions.  Magnesium could be determined gravimetrically after removal of calcium salts (using filtrate from calcium methods), or photometrically using a brilliant yellow dye and a simple spectrophotometer.

Major instrumental changes were in effect by 1975.  Atomic absorption spectrophotometry had been introduced and could be used for analysis of a variety of metals, including calcium and magnesium.  It was fast and required relatively little sample preparation.  

Atomic absorption spectrophotometers can be used to determine a variety of individual metals.

By 1986, Inductively Coupled Plasma instruments were in place.  These enabled the determination of multiple metals at the same time from a single solution.

ICP instruments
Inductively Coupled Plasma (ICP) instruments allow determinations of multiple metals simultaneously from a single sample solution.  These highly sensitive instruments can be configured with a variety of detectors, depending upon the desired application.  Classic, earlier models like this vacuum system were large and generally not very portable.

During the last 20 years, design improvements, particularly with detectors, have increased the analytical capabilities and decreased the size of ICP instruments.  Many of today's models collect data on small CCD or CID detectors, rather than an array of photomultiplier tubes.

ICP instruments
Advances in technology have reduced the size of ICP instruments.  Modern models can now fit on benchtops or carts.

Miscellaneous older laboratory equipment

Evelyn Galvanometer
Evelyn Galvanometer, purchased around 1949
Odor Detector
Odor Detector
Evelyn Photoelectric Colorimeter
Evelyn Photoelectric Colorimeter, purchased around 1949

American Public Health Association.  1912.  Standard Methods For the Examination of Water and Sewage.  2nd Edition.  APHA, New York, NY.

American Public Health Association.  1933.  Standard Methods For the Examination of Water and Sewage.  7th Edition.  APHA, New York, NY.

American Public Health Association.  1946.  Standard Methods For the Examination of Water and Sewage.  9th Edition.  APHA, New York, NY.

American Public Health Association.  1960.  Standard Methods For the Examination of Water and Wastewater Including Bottom Sediments and Sludges.  11th Edition.  APHA, New York, NY.

American Public Health Association.  1965.  Standard Methods For the Examination of Water and Wastewater Including Bottom Sediments and Sludges.  12th Edition.  APHA, New York, NY.

American Public Health Association.  1976.  Standard Methods For the Examination of Water and Wastewater.  14th Edition.  APHA, Washington, D. C.

American Public Health Association.  1992.  Standard Methods For the Examination of Water and Wastewater.  18th Edition.  APHA, Washington, D. C.

Hayes, R. G.  1980.  State Science in Illinois. The Scientific Surveys, 1850-1978.  Southern Illinois University Press, Carbondale and Edwardsville, IL.

Please send comments about this page to Dan Webb.

Illinois State Water Survey

2204 Griffith Dr
Champaign, IL 61820-7463


© 2018 University of Illinois Board of Trustees. All rights reserved.
For permissions information, contact the Illinois State Water Survey.