II. Fate of Wastewater Constituents in Soil
HERMAN BOUWER AND R. L. CHANEY
required to avoid a build-up of solids and, hence, a general decline in the
infiltration rates in the basins.
Biological clogging of the surface soil may also be caused by bacterial
action, including production of polysaccharides and other organic compounds, if the wastewater contains a high dissolved organic matter content.
Human urine, for example, with a chemical oxygen demand of 4000
mg/liter caused such clogging in fine-sand filters that the resulting infiltration rates were too low for practical application, and coarser sand had
to be used (California Institute of Technology, 1969).
Thomas et al. (1966) observed accelerated clogging of soil columns
flooded with septic-tank effluent when the soil became anaerobic. Clogging
was concentrated in the top centimeter. While sulfide accumulation could
be used as an indicator of anaerobiosis, it was not a direct cause of clogging. Drying the soil caused infiltration recovery equivalent to the decrease
in infiltration during anaerobic conditions. Since organic matter was the only
material that declined during drying, clogging was attributed to the accumulation of polysaccharides, polyuronides, and other organic compounds.
Nevo and Mitchell (1967) found that low redox potentials inhibited
degradation of polysaccharides in laboratory experiments, but had little
effect on the production of polysaccharides, indicating the need for regular
drying or resting periods of treatment fields to avoid declines in infiltration
rates. These workers also found that at temperatures below 20°C, decomposition of polysaccharides was inhibited but synthesis slowly continued. Between 20 and 30°C, production and degradation rates of polysaccharides
were approximately equal, and both rates increased with temperature. At
37°C, little polysaccharide was produced, but the decomposition rate continued to increase. This indicates that soil clogging caused by formation
of polysaccharides may be of greater concern in cool climates than in warm climates.
Regardless of climate, the optimum schedule of wastewater application
and drying or resting of the soil must be evaluated by local experimentation. For the Flushing Meadows Project (Bouwer, 1973a; Bouwer et al.,
1974a), maximum long-term infiltration rates were obtained with flooding
periods of about 18 days, rotated with drying periods of about 10 days
in the summer and 20 days in the winter. At the Whittier Narrows spreading
grounds (McMichael and McKee, 1965), basins are flooded for about 9
hours and then dried for about 15 hours. With this schedule, about 2 feet
per day infiltrated into the soil. Wastes containing very high solids contents
may be applied only a few hours each week to allow drying and decomposition of the solids layer. Bendixen et al. (1968) reported satisfactory performance of a ridge-and-furrow system in northern latitudes where secondary sewage effluent was applied on a 2-weeks-on, 2-weeks-off schedule.
Clogging from excessive accumulation of suspended solids on the surface
of the soil can be a problem in disposal fields with poor surface drainage.
Because of reduced infiltration rates, surface runoff will develop and water
will collect in the low places of the field, causing anaerobic conditions in
the solids layer and the underlying soil. This will reduce the rate of decomposition of the solids, and odor and insect problems may develop. It is
generally desirable to remove as much suspended material from the wastewater as possible before the water is applied to land.
Overland-flow systems can effectively remove suspended solids from wastewater. Thomas (1973b) reported a suspended solids reduction from an
average of 160 mg/liter (range 52-420) to 6-12 mg/liter for comminuted
raw sewage applied to vegetated plots that were 36 m long and had a slope
of 2-4%. The loading rates were from 7.4 to 9.8 cm/week, applied daily
(except Sundays) in 8-9 hours. Law et al. (1970) found that the suspended solids content of screened cannery waste was reduced from 245 to
16 mg/liter by vegetation filtration over a distance of 45-100 m at loading
rates of 0.9 cm/day applied in 6-8 hours. Other solids removal percentages are 95% for primary sewage effluent after 365 m of overland flow at
the Melbourne system (Kirby, 1971), a reduction of 56.4 to 15.0 mg/liter
for humus tank effluent at the high loading rate of 85 cm/day in an English study (Truesdale et al., 1964), and from 5215 to 63 mg/liter for sugar
beet waste in a Nebraska study (Porges and Hopkins, 1955; Hopkins
et al., 1956).
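The removal figures quoted above reduce to simple percent removals. A minimal sketch (the helper function is my own framing, not from the source):

```python
def percent_removal(influent_mg_per_l: float, effluent_mg_per_l: float) -> float:
    """Percent of a constituent removed between influent and effluent."""
    return 100.0 * (influent_mg_per_l - effluent_mg_per_l) / influent_mg_per_l

if __name__ == "__main__":
    # Cannery waste over vegetation (Law et al., 1970): 245 -> 16 mg/liter
    print(round(percent_removal(245.0, 16.0), 1))   # 93.5
    # Sugar beet waste (Porges and Hopkins, 1955): 5215 -> 63 mg/liter
    print(round(percent_removal(5215.0, 63.0), 1))  # 98.8
```

The computed removals of roughly 93-99% are consistent with the 95% figure quoted for the Melbourne system.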
Wastewater contains a variety of natural and synthetic organic compounds, usually not individually identified, but collectively expressed in
terms of the biochemical oxygen demand (BOD, determined normally after
5 days incubation), the chemical oxygen demand (COD, usually determined
with the dichromate technique), or the total organic carbon content (TOC,
determined as the difference between total and inorganic carbon). The
BOD and COD tests were developed primarily for oxygen regimes in
aquatic environments. For land treatment, however, TOC content may
be the most appropriate parameter. In addition to the carbonaceous oxygen
demand, wastewater contains a nitrogenous oxygen demand for oxidation of
organic or ammonia nitrogen to nitrate. The oxygen demands for other
constituents are negligible, except perhaps for certain special wastes containing large amounts of sulfide, reduced iron, or other reduced compounds.
Good-quality secondary sewage effluent may have a BOD of around
10-20 mg/liter, a COD of 30-60 mg/liter, and a TOC content of 10-30
mg/liter. The relation between TOC and COD was evaluated as
TOC = 0.25 COD
for secondary effluent (domestic and light industry) from the Phoenix
area (Bouwer et al., 1974b). This relationship also includes measurements
on renovated sewage water obtained by high-rate land treatment. The nitrogenous oxygen demand of secondary sewage effluent, where most of the nitrogen is in the ammonium form, may be in the range of 100-200 mg/liter.
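A back-of-envelope sketch of these two relations (the empirical TOC factor is from the text; the nitrification stoichiometry of about 4.57 mg O2 per mg of ammonium-N is a standard value I am assuming, not one stated by the source):

```python
def toc_from_cod(cod_mg_per_l: float) -> float:
    """Empirical TOC estimate for secondary effluent, TOC = 0.25 COD
    (Phoenix-area relation, Bouwer et al., 1974b)."""
    return 0.25 * cod_mg_per_l

def nitrogenous_oxygen_demand(ammonium_n_mg_per_l: float) -> float:
    """Oxygen required to nitrify ammonium-N to nitrate, assuming the
    standard stoichiometric factor of ~4.57 mg O2 per mg N."""
    return 4.57 * ammonium_n_mg_per_l

if __name__ == "__main__":
    print(toc_from_cod(60.0))               # 15.0 mg/liter TOC
    print(nitrogenous_oxygen_demand(30.0))  # ~137 mg/liter
```

A typical ammonium-N content of 20-40 mg/liter thus yields a nitrogenous demand of roughly 90-180 mg/liter, consistent with the 100-200 mg/liter range cited above.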
Wastes from vegetable or fruit processing plants may have a BOD of
several hundred to several tens of thousands of milligrams per liter
(W. G. Knibbe, personal communication, 1973; California State Water
Resources Control Board, 1968; Splittstoesser and Downing, 1969; Rose
et al., 1971; Colston and Smallwood, 1973). Splittstoesser and Downing (1969) reported a COD/BOD ratio of 1.4-2 for vegetable processing
effluents. Incompletely digested sewage sludge and liquid animal wastes
have BODs of several hundred to several tens of thousands of milligrams
per liter, depending on the density of the slurry or effluent (Loehr, 1968;
Erickson et al., 1972). The COD of animal wastes may be 2 to 3 times
as high as the BOD (Erickson et al., 1972).
The soil with its biomass is extremely versatile and effective in decomposing natural and synthetic organic compounds. The processes can be divided
into aerobic metabolisms, where CO₂, H₂O, microbial cells, NO₃⁻, and
SO₄²⁻ are the main end products, and anaerobic metabolisms. The latter
occur at a slower rate and are less complete, organic intermediates being
formed. These include acids, alcohols, amines, and mercaptans. The end
products of anaerobic decomposition consist of CH₄, H₂, NH₄⁺, and H₂S in
addition to CO₂ and H₂O (Miller, 1973). Organic carbon, whether supplied
to the soil by the wastewater or produced in the soil by autotrophic bacteria, is a main factor in denitrification, since it supplies the energy for
the denitrifying bacteria.
Theoretically, aerobic conditions should prevail in the soil as long as the
total oxygen demand (the sum of the carbonaceous, nitrogenous, and other
oxygen demands) of the waste load is balanced by the amount of oxygen
entering the soil. Oxygen enters the soil (1) as dissolved oxygen in the
wastewater applied (usually negligible), (2) as mass flow after the start
of a drying or resting period, when the soil drains and air replaces the
draining water in the soil, and (3) by diffusion from the atmosphere after
the soil has drained. The deeper the water table and the higher the drainable pore-space fraction of the soil, the more oxygen enters the soil as mass flow after infiltration stops. The longer the drying period, the more
oxygen will enter by diffusion in relation to that which has entered the soil
by mass flow.
The depth to which oxygen can penetrate the soil by diffusion is limited
and does not exceed a distance of about 1 meter in all but the most porous
soils (Pincince and McKee, 1968; Lance et al., 1973). Lance et al. (1973)
also found that the amount of oxygen entering by diffusion was 1.5 times
greater than the amount entering by mass flow when laboratory soil
columns were flooded with secondary sewage effluent on a 2-day wet, 5-day
dry cycle, but twice that amount with a 9-day wet, 5-day dry cycle. Most
of the oxygen was used to convert ammonium to nitrate and only a relatively small fraction was used to reduce COD. If wastewater is applied
with sprinklers, considerable amounts of oxygen may enter the soil during
the short periods between sprinkler revolutions, particularly on fast-draining soils.
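The mass-flow component of this reaeration can be bounded by the volume of air that replaces the drained water. A minimal sketch (my own illustration, not a calculation from the source; the oxygen content of air, roughly 0.28 kg O2 per m³ at 21% by volume, is an assumed standard figure):

```python
O2_PER_M3_AIR = 0.28  # kg O2 per m3 of air (approximate, sea level)

def mass_flow_oxygen_kg_per_ha(water_table_depth_m: float,
                               drainable_porosity: float) -> float:
    """Oxygen entering per hectare when drained pores refill with air
    after infiltration stops (one drying cycle)."""
    air_volume_m3_per_m2 = water_table_depth_m * drainable_porosity
    return air_volume_m3_per_m2 * O2_PER_M3_AIR * 10_000  # 10,000 m2 per ha

if __name__ == "__main__":
    # e.g., a 3-m water table and 10% drainable porosity (assumed values):
    print(mass_flow_oxygen_kg_per_ha(3.0, 0.10))  # 840 kg O2/ha per cycle
```

Under these assumed values the mass-flow supply per drying cycle is on the order of the daily oxygen demands discussed below, which illustrates why drying periods and diffusion are needed in addition to mass flow.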
Some organic compounds are easier to degrade and exert a higher initial
oxygen demand on the soil than others. The oxygen demand of secondary
sewage effluent is sufficiently small and mostly due to readily degradable
material. Thus, BOD is essentially completely removed as the effluent
moves through the soil, even for high-rate systems. In laboratory and field
studies, prolonged flooding and obvious depletion of oxygen did not seem
to affect the removal of BOD or COD (Bouwer et al., 1974b; Lance et
al., 1973). Thus, anaerobic processes were also effective for BOD removal.
This agrees with studies by Thomas and Bendixen (1969), who detected
little or no effect of loading rate, duration of dosing, and temperature, on
the organic carbon removal from septic-tank effluent passing through soil
columns. Small, frequent applications, such as the 3 to 6 times per day
rate recommended by Robeck et al. (1964) for best removal of COD,
may be necessary if the wastewater contains high concentrations of organic
compounds. Such schedules may increase the rate of biodegradation of
these compounds in the soil, as was demonstrated by Hallam and Bartholomew (1953) for plant residue.
The BOD loading and removal at the Flushing Meadows Project was
100 kg/ha per day during flooding (Bouwer et al., 1974b). At the Whittier
Narrows Project, complete BOD removal was obtained from secondary
sewage effluent at infiltration rates of about 0.6 m/day, or a BOD load
also of about 100 kg/ha per day. In this rapid-infiltration system, 9-hour
flooding periods were rotated with 15-hour drying periods. The sum of
the carbonaceous and nitrogenous oxygen demands was about 750-1000
kg/ha per day. Of this, about four-fifths was for nitrification of ammonium
(McMichael and McKee, 1965). About three-fourths of the carbonaceous
oxygen demand was removed in about 1.2 m of percolation of the effluent
through the soil. Erickson et al. (1972) reported BOD reductions from
about 1200 mg/liter to 5 mg/liter when dairy waste was applied to the
Barriered Landscape Wastewater Renovation System (BLWRS) at rates
of about 2 cm/day, or a BOD load of about 240 kg/ha per day.
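The areal BOD loads quoted in this section follow directly from the infiltration rate times the BOD concentration: 1 m/day over 1 ha delivers 10⁷ liters/day, so the load in kg/ha per day is rate (m/day) × BOD (mg/liter) × 10. A sketch (my own framing of this unit conversion):

```python
def bod_load_kg_per_ha_day(infiltration_m_per_day: float,
                           bod_mg_per_liter: float) -> float:
    """Areal BOD load from hydraulic loading rate and concentration.
    1 m/day over 1 ha = 1e7 liters/day; mg -> kg gives a factor of 10."""
    return infiltration_m_per_day * bod_mg_per_liter * 10.0

if __name__ == "__main__":
    # Whittier Narrows: ~0.6 m/day at roughly 17 mg/liter BOD (assumed value
    # within the 10-20 mg/liter range for good secondary effluent)
    print(bod_load_kg_per_ha_day(0.6, 17.0))     # ~100 kg/ha per day
    # BLWRS dairy waste: 2 cm/day at ~1200 mg/liter BOD
    print(bod_load_kg_per_ha_day(0.02, 1200.0))  # 240 kg/ha per day
```

Both results reproduce the loads cited above for the Whittier Narrows and BLWRS systems.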
Higher oxygen demands on the soil system and less complete removal
of BOD are possible with effluents from vegetable or fruit processing
plants, concentrated animal-waste slurries, or incompletely digested sewage
sludges, where the BOD levels may be in the tens of thousands of milligrams per liter and the organic compounds readily biodegradable. D. M.
Parmelee (personal communication, 1973) recommended that BOD loading rates not exceed 450 kg/ha per day for food processing plants. At
these rates, W. G. Knibbe (personal communication, 1973) found that the
COD of vegetable processing plant effluent was reduced from a range of
about 500 to 2000 mg/liter to about 25 mg/liter in the first 50 cm of
movement through soil (the COD of these effluents was about 1.7 times
as high as the BOD). Higher loadings produced higher COD levels in the percolate.
Where soils are heavily overloaded with organic compounds in liquid
wastes, solids in the wastewater and solids formed by bacterial activity
in the soil may build up under the anaerobic conditions caused by the
high oxygen demand. This will in turn cause a decrease in the infiltration
rate, and hence in the oxygen demand exerted on the soil. Thus, soil may
have some form of “self-defense” against excessive loadings of oxygen demand.
Overland flow systems can also be effective in removing oxygen demand,
provided the loading rate is sufficiently small and the land has been sufficiently
prepared to avoid channeling or short-circuiting. Thomas (1973b) reports
a BOD reduction from an average of 150 mg/liter to a range of 8 to 12
mg/liter by flowing comminuted raw sewage over vegetated soil. Truesdale
et al. (1964), using a much higher loading rate, found that the BOD of humus
tank effluent was reduced from a 16 to 24 mg/liter range to a 7 to 10
mg/liter range by overland flow in grassed plots. Wilson and Lehman
(1967) obtained a reduction of only about 20% in the COD of primary
effluent by flowing it through bermudagrass irrigation borders. For cannery
wastes, the BOD was reduced from 580 mg/liter to 9 mg/liter in a Texas
project (Law et al., 1970). A BOD reduction from 483 to 158 mg/liter
was obtained for sugarbeet wastes in a field not very well graded and showing considerable channeling (Porges and Hopkins, 1955).
Vela and Eubanks (1973) demonstrated that for land treatment of cannery wastes, soil bacteria, rather than enzymes or bacteria already present
in the plant effluent, were responsible for the decomposition of organic
matter. Thus, soil treatment can be expected to be more effective in reducing the BOD of wastewater than, for example, lagooning or other treatment
where the plant effluent will not be in contact with the soil. Only a small
fraction of the bacteria population in the soil (16 out of 100 species) contributed directly to the decomposition of organic matter, which consisted
of hydrolysis of the polymers followed by oxidation of the monomers. The
other bacterial species probably contributed indirectly to the mineralization
process. Because of this, the numbers of bacteria in the soil did not correlate with the oxidative capacity of the soil.
Shuval and Gruener’s (1973) statement that “. . . advanced wastewater
renovation technology still cannot reduce COD or TOC to an absolute
zero concentration . . .”, also applies to land treatment of wastewater.
For example, while BOD was completely removed and COD reduced to
the same level as that of the native groundwater at the Flushing Meadows
Project, TOC values of the renovated water averaged 5 mg/liter after 9
m of soil percolation (Bouwer et al., 1974b). The identity of this organic
carbon is not well known, and it therefore invites speculation regarding
toxicants, teratogens, mutagens, and carcinogens. Perhaps this TOC can be
reduced by treatment with a strong oxidant, such as ozone.
Wastewaters, and particularly sewage effluent from industrialized communities, may contain hydrocarbons, detergents, pesticides, phenolic
compounds, and other undesirable constituents. Usually, however, their
concentrations are so low that with adsorption and gradual biodegradation
generally occurring in the soil, few or no adverse effects are expected
(Miller, 1973). Special precautions need to be taken, however, with land
treatment of wastewaters containing unusually large concentrations of these
compounds, or where porous soils, fissured rock, or cavernous limestones
in the treatment fields offer little opportunity for appreciable renovation of the wastewater.
Until further research has demonstrated that the refractory organics and
other substances in renovated wastewater are harmless, direct use of such
water (particularly sewage water) for domestic purposes is not recommended as a general practice (Long and Bell, 1972; American Water
Works Association, Board of Directors, 1973; Ongerth et al., 1973).
Of the numerous microorganisms possibly present in the wastewater,
particularly in sewage effluents and sludges, the fate of pathogenic bacteria
and viruses when the water moves through the soil is of utmost concern.
The fecal coliform test is useful for indicating fecal pollution and, hence,
possible presence of pathogens in surface water. For land treatment systems, low fecal coliform densities in the percolate or renovated water
probably mean absence or low levels of pathogenic bacteria or viruses.
However, the absence of such organisms can be determined only by testing for specific microbial pathogens.
The pathogenic bacteria commonly found in sewage effluent include Salmonella, Shigella, Mycobacterium, and Vibrio comma (Foster and Engelbrecht, 1973). Viruses include the enteroviruses and adenoviruses. The
hepatitis virus is of great concern, but tests to detect its presence have
not yet been developed. Other pathogens include the protozoa, such as
Endamoeba histolytica, and helminth parasites, for example, ascaris and
Fortunately, the soil is an effective filter and many reports indicate
absence or very low levels of fecal coliforms or other organisms after water
has moved one to several meters through soil (Stone and Garber, 1952;
California State Water Pollution Control Board, 1953; Baars, 1964; McMichael and McKee, 1965; Drewry and Eliassen, 1968; Merrel and Ward,
1968; Romero, 1970; Young and Burbank, 1973; Bouwer et al., 1974b).
On the other hand, situations have also been reported where appreciable
numbers of microorganisms were detected in the renovated water after considerable distance of underground movement (Romero, 1970; Randall,
1970; Allen and Morrison, 1973). Such long underground travel distances
of microorganisms are usually associated with macropores, as may be
found in gravels, coarse-textured soils, structured clay soils, fractured rock,
cavernous limestones, etc.
The retention of microorganisms in the soil is largely due to physical
entrapment for the larger organisms and to adsorption to clay and organic
matter for viruses and other amphoteric organisms (McGauhey and Krone,
1967; Krone, 1968). Drewry and Eliassen (1968) found that virus adsorption was more rapid when the pH was below 7-7.5 than when the pH
was higher. An increase in the cation concentration of the liquid phase
in the soil also increased the adsorption of viruses. Young and Burbank
(1973) reported virus removal in soil as a pH-dependent adsorption process. Cookson (1967) found that the adsorption of viruses by activated
carbon could be described by a diffusion equation with a Langmuir adsorption boundary condition. Virus removal due to adsorption during phosphate precipitation was described by a pH-dependent Freundlich isotherm
by Brunner and Sproul (1970).
Microorganisms retained in the soil are subject to normal die-off, which
usually takes several weeks to several months (Van Donsel et al., 1967).
This is about the same as the die-off times in surface waters (Andre et al.,
1967). Much longer survival times in soil have also been reported, however, such as 6 months to 1 year for salmonella (Rudolfs et al., 1950)
and up to 4 years for Escherichia coli (Mallman and Mack, 1961). Miller
(1973) found that fecal streptococci from sewage sludge survived up to
6 months in a clay soil, but not as long in coarser soils. The die-off of
pathogens and other foreign microorganisms brought into the soil with the
wastewater is due to the “homeostatic” reaction of the existing microbiological community in the soil (Alexander, 1971). This rejection of foreign
organisms may result from production of toxins, lysis by enzymes, consumption by predatory protozoa, parasitic organisms, competition, and the
general hostility of the soil environment to pathogenic organisms that are
more at home in men and other warm-blooded creatures.
Normally, fecal coliform bacteria are essentially completely removed
after the water has traveled 1 m or at most 2 or 3 m through the soil.
However, Bouwer et al. (1974b) found much deeper penetration of fecal
coliforms below rapid-infiltration sewage basins after the basins were
flooded following an extended drying or resting period. This was probably
due to reduced entrapment of E. coli on the surface of the soil. The clogging layer of organic fines that had accumulated on the soil during flooding,
forming an effective filter, was dry and partially decomposed after drying,
thus yielding a more open surface of the soil and a less effective filter when
flooding was resumed. Also, the bacteria population in the soil undoubtedly
declined during drying because the nutrient supply was discontinued. Consequently, there was less competition from the native soil bacteria, and
hence greater survival of the fecal coliforms when flooding was resumed.
As flooding continued, however, fine suspended solids accumulated again
on the surface of the soil and the bacteria population also increased, both
resulting in increased retention of E. coli and return of the fecal coliform
levels to essentially zero in renovated water sampled from a depth of 9
m. Almost all the removal of the fecal coliforms took place in the first
1 m of soil.
Pathogenic and other foreign microorganisms may survive for some time
in the soil, but they do not multiply (Benarde, 1973). The same has been
observed for surface water (Deaner and Kerri, 1969). McMichael and
McKee (1965) observed increased coliform counts in the soil with depth
below spreading basins. They attributed this to a growth in Aerobacter
aerogenes, which is a common soil bacterium of the coliform group, rather
than to E. coli. However, Masinova and Cledova (1957) reported that
E. coli can change sufficiently in soil or water to give biochemical tests
more typical of the intermediate coliform types, including A. aerogenes.
Cohen and Shuval (1973) studied the survival of coliforms, fecal coliforms,
and fecal streptococci in surface water and sewage treatment plants. Fecal
streptococci were generally more resistant than the other indicator organisms. In two systems, the survival of fecal streptococci paralleled the survival of enteric viruses better than the survival of coliforms.
The best insurance against contamination of groundwater by pathogenic
microorganisms due to land treatment of wastewater is to allow sufficient
distance between the land treatment facility and the point where groundwater leaves the aquifer for human consumption. Recommendations for
this distance vary from about 10 m to 100 m (Romero, 1970; Drewry
and Eliassen, 1968), depending on the soil type. Very coarse soils, wellstructured soils, and fractured or cavernous rocks cannot be expected to
effectively retain microorganisms, and they should be avoided.
In addition to moving underground, pathogenic organisms can spread
from a land treatment site through the air, particularly if the wastewater
is applied by sprinklers. Adams and Spendlove (1970) found that trickling
filters of sewage plants emitted coliform bacteria into the air, and that E.
coli could be sampled from the air as far as 1.2 km downwind.
No matter what precautions are taken and how failsafe a land treatment
system may be, some contamination and some survival of microorganisms
may still take place. The simplest precaution against the possibility of infectious disease may be to chlorinate or otherwise disinfect all water for
human consumption that is pumped from wells within underground traveling distance from land treatment sites or other possible sources of groundwater contamination. Most waterborne disease outbreaks are due to consumption of undisinfected groundwater (Craun and McCabe, 1973). These
authors also recommend disinfection of groundwater as an easy and simple
means to reduce the incidence of waterborne disease.
Chlorination for virus control in wastewater is not effective if the water
has a high suspended solids content. Thus, virus survival in chlorinated
secondary sewage effluent is often observed (Mack, 1973). Culp et al.
(1973) reported that disinfection for virus removal is most effective in
water having a turbidity below 1 JTU (Jackson Turbidity Units) and as
near to 0.1 JTU as possible. Chlorination to a free residual of 1 mg/liter
with a contact time of 30 minutes is normally adequate to completely remove or inactivate all viruses. Since soil filtration of wastewater removes
essentially all suspended solids, chlorination of the percolate or renovated
water for virus and bacteria removal should be much more effective than
chlorination of the wastewater prior to land treatment.
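The disinfection conditions quoted above can be collected into a simple check. A sketch (the numeric criteria come from the text; the function name and structure are my own, not regulatory or source language):

```python
def meets_virus_guideline(turbidity_jtu: float,
                          free_residual_mg_per_l: float,
                          contact_minutes: float) -> bool:
    """Check the quoted conditions for effective virus inactivation:
    turbidity below 1 JTU, free chlorine residual of at least 1 mg/liter,
    and a contact time of at least 30 minutes."""
    return (turbidity_jtu < 1.0
            and free_residual_mg_per_l >= 1.0
            and contact_minutes >= 30.0)

if __name__ == "__main__":
    # Soil-filtered percolate: very low turbidity, normal chlorine dose
    print(meets_virus_guideline(0.2, 1.0, 30))  # True
    # Turbid secondary effluent: chlorination often ineffective
    print(meets_virus_guideline(8.0, 1.0, 30))  # False
```

The contrast between the two cases restates the point of the paragraph: soil filtration lowers turbidity enough that post-filtration chlorination can meet the guideline.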
In overland flow systems bacteria and viruses are removed primarily
by settling and entrapment of suspended solids harboring the microorganisms. Detention times in overland flow systems normally are too short
to reduce bacteria and viruses substantially by normal die-back, as usually
happens in ponds or streams (Andre et al., 1967; Cohen and Shuval,
1973). Seidel (1966) reports much faster die-back of fecal coliforms in
shallow impoundments where rushes (Scirpus lacustris and Spartina townsendii) were growing than in impoundments without such vegetation.
The removal of microorganisms in overland flow systems may possibly
be improved if a flocculant such as alum or lime is added to the wastewater
prior to land application. Viruses and other microorganisms may then
become attached to the flocs and be detained on the treatment field. Excellent virus reductions, for example, have been obtained by flocculation and
sand filtration of secondary sewage effluent (Berg et al., 1968). The addition of flocculants also helps to precipitate phosphates (Brunner and
Sproul, 1970), and hence may increase the phosphate removal in overland-flow systems.
Bacteria and viruses in the wastewater restrict the type of crop that
can be grown on the land treatment fields. While entry of certain viruses
into the plant through the root system has been observed (Murphy et al.,
1958; Murphy and Syverton, 1958), normally the main concern is with
pathogenic organisms that could collect on the surfaces of fruits and vegetables consumed raw (National Technical Advisory Committee, 1968).
This committee suggests an interim guideline of not more than 5000 total
coliform bacteria per 100 ml and not more than 1000 fecal coliforms per
100 ml, for irrigation water of crops where tops or roots are directly consumed by man or livestock. More conservative health guidelines were
presented by Krishnaswami (1971).
A number of states have adopted quality criteria for irrigation with sewage effluent, sometimes based on what is theoretically desirable and practically achievable while avoiding criteria that are so stringent that they could
not be met by normal irrigation water. As an example, the Arizona State
Health Department (1972) requires secondary treatment, or its equivalent,
if the sewage is used for irrigation of fibrous or forage crops not intended
for human consumption, or orchard crops where the water does not come
in contact with fruit or foliage. Secondary treatment and disinfection or
equivalent treatment to reduce the total coliform density to 5000 per 100
ml and the fecal coliform density to 1000 per 100 ml are required for
irrigation of food crops that are sufficiently processed to destroy pathogens,
for orchard crops where the irrigation water does come in contact with
fruit and foliage, or for golf courses, cemeteries, etc. Tertiary treatment to
produce a BOD and suspended solids content both of less than 10 mg/liter
and disinfection or equivalent treatment to reduce the fecal coliform count
to less than 200 per 100 ml are required if the effluent is to be used for
irrigation of food crops that are consumed raw by man, or of playgrounds,
lawns, parks, etc., where children can be expected to play.
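The three Arizona tiers described above can be summarized as a lookup. A sketch paraphrasing the requirements as stated in the text (the category keys and function are my own framing, not regulatory language):

```python
# Minimum treatment tiers per Arizona State Health Department (1972),
# as summarized in the text. Keys are illustrative category names.
REQUIREMENTS = {
    # Fibrous or forage crops not for human consumption; orchards where
    # the water does not contact fruit or foliage.
    "forage_or_fiber": "secondary treatment or equivalent",
    # Processed food crops; orchards where water contacts fruit/foliage;
    # golf courses, cemeteries, etc.
    # (total coliforms <= 5000/100 ml, fecal coliforms <= 1000/100 ml)
    "processed_food_or_landscape": "secondary treatment plus disinfection",
    # Food crops eaten raw; playgrounds, lawns, parks where children play.
    # (BOD and suspended solids < 10 mg/liter, fecal coliforms < 200/100 ml)
    "raw_food_or_play_area": "tertiary treatment plus disinfection",
}

def required_treatment(use_category: str) -> str:
    """Look up the minimum treatment tier for an intended reuse category."""
    return REQUIREMENTS[use_category]

if __name__ == "__main__":
    print(required_treatment("raw_food_or_play_area"))
    # tertiary treatment plus disinfection
```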
One of the biggest questions with respect to the health hazards of land
treatment of sewage effluent and other wastewaters is: What are acceptable
levels of microorganisms, and particularly pathogens, in the renovated
water or crops produced by such systems? Some persons may advocate
complete sterility, but this may not be necessary even if it were achievable.
The environment as a whole is not sterile. Bacterial pathogens have been
recovered from pristine mountain streams (Fair and Morrison, 1967).
While some people may be alarmed to hear that fecal coliforms and, hence,
possibly pathogenic bacteria, can travel through the air for long distances
around sewage treatment plants (Adams and Spendlove, 1970), sewage
treatment plant workers apparently do not have poorer health than people
in other occupation groups. As a matter of fact, sewage plant workers were
found to have the lowest absenteeism rate among a group of occupations
studied, and this was attributed to the fact that “sewage workers were regularly immunized by their exposure to small amounts of infected material”
(J. L. Melnick, as quoted by Benarde, 1973).
Benarde (1973) also states that “one must be chary of the type of
microbiological thinking that equates the presence of microbes with the
potential for illness. The fact is that illness is an unusually complex phenomenon that does not have a 1 to 1 relationship to microbes.” Little is
known about minimum infecting doses of pathogenic organisms and the
combination of factors necessary to produce illness (Dunlop, 1968;
Benarde, 1973). From a communicable disease standpoint, however, land
treatment is far less hazardous than disposal of sewage effluent and other
liquid wastes into rivers and streams (Benarde, 1973).
The nitrogen content of liquid waste may be as low as essentially zero
for some cannery wastes and as high as 700 mg/liter for slurries of fresh
swine waste (Erickson et al., 1972). Secondary sewage effluent generally
contains 20-40 mg of nitrogen per liter (California Department of Water
Resources, 1961) and sewage sludge 3-5% nitrogen (on a dry weight
basis). Winery wastewaters may have 4-10 times as much nitrogen as domestic sewage (Schmidt, 1972). Wet sewage sludge (95% water) generally contains 1500-2500 mg of nitrogen per liter (Hinesly, 1973; Peterson
et al., 1973).
For cannery wastes, where the organic material consists essentially of
cellulose and other carbonaceous materials, nonleguminous crops may
actually become nitrogen deficient at high waste loadings in the same
way that nitrogen deficiency may occur after application of crop residue
containing less than about 1.3% nitrogen. The C/N ratio of these materials
is usually about 35. For such wastes, release of significant amounts of nitrogen cannot be expected unless the nitrogen content exceeds about 1.8%
on a dry weight basis (Wets, 1973).
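The rule of thumb above can be stated as a simple threshold test. A sketch (the 1.8% figure is taken from the text; the rule itself is only an approximation, and the function is my own framing):

```python
RELEASE_THRESHOLD_PCT_N = 1.8  # % N, dry-weight basis, per the text

def net_nitrogen_release_expected(pct_n_dry_weight: float) -> bool:
    """True if net nitrogen mineralization (release) is expected from a
    carbonaceous waste; False if the waste is likely to immobilize soil
    nitrogen while microbes decompose the carbon."""
    return pct_n_dry_weight > RELEASE_THRESHOLD_PCT_N

if __name__ == "__main__":
    print(net_nitrogen_release_expected(1.0))  # False: cellulose-rich cannery waste
    print(net_nitrogen_release_expected(4.0))  # True: sewage sludge at 3-5% N
```

This is why nonleguminous crops can become nitrogen deficient under heavy loadings of low-nitrogen cannery wastes, while sludges at 3-5% nitrogen supply nitrogen in excess.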
For secondary sewage effluent and similar liquid wastes, a significant