III. Basic Factors Affecting Field Soil Loss
DWIGHT D. SMITH AND WALTER H. WISCHMEIER
minute intensities was also generally poor. Good correlation with maximum 30-minute intensity was found only on steep slopes or sandy loam.
At every location for which fallow-plot data were available, both runoff
and soil loss were more highly correlated with rainfall energy than with
rain amount or any short-period maximum intensity. Momentum rated
second, but was well below energy as a predictor of soil loss from fallow.
In further regression analyses of the data, Wischmeier (1959) found
that the rainstorm parameter most highly correlated with soil loss from
fallow was a product term, kinetic energy of the storm times maximum
30-minute intensity. He called this product the “rainfall-erosion index.”
Maximum 30-minute intensity was defined as twice the greatest amount
of rain falling in any 30-minute period. A break between storms was defined as a period of 6 consecutive hours with less than 0.05 inches of rainfall. This index was selected as the most appropriate rainfall parameter
for use in the soil loss prediction equation.
The rainfall-erosion index thus defined explained from 72 to 97 per
cent of the variation in individual-storm erosion from tilled continuous
fallow on each of six widely scattered soils. The percentage of the soil-loss variance explained by the index was greater than that explained by
any other of 42 factors investigated and greater than that explained by
rain amount and maximum 5, 15, and 30-minute intensities, all combined in a multiple regression equation.
The erosion index evaluates the interacting effect of total storm
energy and maximum sustained intensity. Thus it is an approximation of
the combined effects of impact energy and rate and turbulence of runoff.
Rainfall energy is a function of the specific combination of drop velocities
and rain amount. The maximum 30-minute intensity is an indication of
the excessive rainfall available for runoff.
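As a minimal sketch (not the authors' own computational procedure), the index defined above can be computed from a recording-gauge storm record. The 5-minute rainfall increments and the storm-energy value below are invented for illustration; the maximum 30-minute intensity follows the definition given above, twice the greatest rain amount in any 30-minute period.

```python
def max_30_minute_intensity(increments_in, interval_min=5):
    """Maximum 30-minute intensity (inches per hour), defined as twice
    the greatest amount of rain falling in any 30-minute period."""
    per_window = 30 // interval_min            # readings per 30-minute window
    best = 0.0
    for start in range(len(increments_in) - per_window + 1):
        best = max(best, sum(increments_in[start:start + per_window]))
    return 2.0 * best

def storm_erosion_index(storm_energy, increments_in, interval_min=5):
    """Rainfall-erosion index for one storm: kinetic energy of the storm
    times its maximum 30-minute intensity. Energy units are whatever the
    caller supplies; the product's units follow accordingly."""
    return storm_energy * max_30_minute_intensity(increments_in, interval_min)

# Hypothetical storm recorded in 5-minute increments (inches of rain);
# the greatest 30-minute amount here is 1.0 inch, so I30 = 2.0 in./hr.
storm = [0.1, 0.2, 0.3, 0.2, 0.1, 0.1, 0.05, 0.05]
i30 = max_30_minute_intensity(storm)
```

Because the soil-loss relationship is linear in this product, a yearly index value for a location is simply the sum of the storm products, with storms separated by a period of 6 consecutive hours receiving less than 0.05 inch of rain.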
The product terms, rain amount times 30-minute intensity and momentum times 30-minute intensity, were also more precise estimators of
soil loss than was energy alone, although less accurate than the energy-intensity product. This supports the conclusion that the erosive potential
of a rainstorm is primarily a function of the interacting effects of drop
velocity, rain amount, and maximum sustained intensity. In assembled
plot data, maximum 30-minute intensity was more effective than maximum 15- or 60-minute intensity as the second element of the interaction term.
The relationship of soil loss to the storm energy-intensity products is
linear. Therefore, the location erosion-index value for a year can be
computed by summing the storm energy-intensity products. The assembled
plot data showed that when all factors other than rainfall were constant,
specific-year soil losses from cultivated areas were directly proportional
to the yearly values of the index. Yearly or monthly values of the erosion
index can be computed on a locality basis from recording rain gauge
records. Return-period values can also be computed.
The rainfall-erosion index provides valid estimates of the effects of
rainfall patterns over long time periods. For prediction of losses from
specific storms, precision was improved by combining with the erosion
index the parameters rainfall energy, an antecedent moisture index,
and antecedent energy since cultivation (Wischmeier and Smith, 1958).
B. SOIL

Some soils erode more readily than others. Soil properties that influence soil erodibility by water may be grouped into two types: (1)
those properties that affect the infiltration rate and permeability; and
(2) those properties that resist the dispersion, splashing, abrasion, and
transporting forces of the rainfall and runoff.
Middleton (1930) was one of the first to try to obtain an index of soil
erodibility based on the physical properties of the soil. He considered
the dispersion ratio and what he called the erosion ratio to be the most
significant soil characteristics influencing soil erodibility. His erosion
ratio was the dispersion ratio divided by the ratio of colloid to moisture
equivalent. The dispersion ratio, expressed as a percentage, was the ratio
of the apparent total weight of silt and clay in the nondispersed sample
to the total silt and clay in the dispersed sample. He also suggested that
organic matter, the silica: sesquioxide ratio, and the total exchangeable
bases influenced the erosional behavior of soils. Middleton et al. (1932,
1934) grouped soils of the original ten erosion stations according to these criteria.
Peele et al. (1945) modified Middleton’s criteria in an analysis of
four major soils of South Carolina. Voznesensky and Artsruni (1940) developed a formula for an index of erodibility based on dispersion, water-retaining capacity, and aggregation.
O’Neal (1952) attempted to develop a key for evaluating soil permeability on the basis of certain field conditions. The first step in this procedure was to determine the type of structure. Then the class of permeability was estimated from four principal factors and one or more of
eight secondary factors. Principal factors used were: relative dimensions
(horizontally and vertically) of structural aggregates, amount and direction of overlap of aggregates, number of visible pores, and texture. Important secondary factors were: compaction, direction of natural breakage, silt content, cementation, type of clay minerals, character of coatings
on aggregates, degree of mottling, and certain features of climate.
Parr and Bertrand (1960) presented a comprehensive review of past
studies of water infiltration into soils. Instrumentation, procedures, and
results were briefly summarized.
To describe what they considered to be the most important soil properties affecting erodibility, Adams et al. (1958b) made the following field
and laboratory measurements on Iowa soils: runoff, infiltration, wash
erosion, splash erosion, water-stable aggregates < 0.10 mm., dispersion
ratio, per cent silt and clay, bulk density, pores drained by 60-cm.
water tension, and air permeability at field capacity. For soils containing
relatively large amounts of swelling type clay that crack upon drying, the
erodibility has been considered to be significantly influenced by moisture
content (Smith et al., 1953; Adams et al., 1958a).
Browning et al. (1947) developed a conservation guide for soils
mapped in Iowa, to be used in computing field soil loss. The soils were
divided into seven groups on the basis of what was considered to be their
relative erodibility. This procedure was extended to other soils of the
North Central and Northeastern States (U. S. Soil Conservation Service,
1956; Lloyd and Eley, 1952).
Many soil properties appear to influence erodibility. Their effects
may be interrelated. Some of them are influenced by cropping history,
past erosion and management practices. Subjective classifications of soil
into erodibility classes by relating observed erosion to soil survey data
have often been biased by confounding soil effects with those of rainfall
and management. The automatic confounding of rainfall and soil effects
has also complicated efforts to evaluate soil erodibility empirically from
field plots under natural rainfall. As a means of isolating soil effects,
Wischmeier et al. (1958) proposed use of the rainfall-erosion index to
adjust measured soil losses for differences in rainstorm characteristics
and use of slope factors to adjust for differences in topography.
For soil-loss prediction purposes, Wischmeier and Smith (1961) defined soil erodibility as soil loss in tons per acre per unit of rainfall-erosion index, measured from tilled continuous fallow with length and
per cent of slope at specified values. The soil erodibility factor thus becomes a quantitative factor. Empirical measurements for its evaluation
include combinations of all primary and interacting factor effects.
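Given the linearity of soil loss in the erosion index noted earlier, the erodibility factor so defined could be estimated from fallow-plot records as the no-intercept least-squares slope of soil loss on the index. This is an illustrative sketch only, not the authors' procedure, and the plot records below are invented.

```python
def erodibility_factor(ei_values, soil_losses):
    """Least-squares slope through the origin of soil loss (tons per acre)
    on the rainfall-erosion index, i.e., loss per unit of the index."""
    numerator = sum(ei * loss for ei, loss in zip(ei_values, soil_losses))
    denominator = sum(ei * ei for ei in ei_values)
    return numerator / denominator

# Invented fallow-plot records: storm erosion-index values and the
# corresponding measured soil losses (tons per acre).
ei = [50.0, 120.0, 200.0, 75.0]
loss = [1.6, 3.5, 6.2, 2.3]
k = erodibility_factor(ei, loss)   # roughly 0.031 tons/acre per index unit
```

The no-intercept form reflects the assumption, stated above for the index, that zero erosive rainfall produces zero loss from fallow.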
C. TOPOGRAPHY

Field soil loss is affected by the degree of slope, the length of slope,
and the curvature of the slope. Studies of the first two effects have been
conducted under both natural and simulated rainfall, but little has been
done to evaluate the third.
1. Per Cent Slope
The effect of per cent slope was studied on small plots under
sprinklers by Duley and Hays (1932), Neal (1938), Borst and Woodburn
(1940), and Zingg (1940). Water applied by sprinklers in these studies
did not simulate natural rainfall in drop size distribution, drop velocity,
or energy. Duley and Hays found that the increase in soil loss with each
unit increase in per cent slope became greater as the slope became
steeper. In a comparison of a silty clay loam with a sandy soil, the former
gave greater erosion loss on the flatter slopes and the latter on the
steeper slopes. Borst and Woodburn, using artificial rainfall at Zanesville, Ohio, found soil loss proportional to S^1.3, where S is per cent of
slope. Neal, working with Putnam soil, found soil loss proportional to
S^0.7 I^1.2, where I is intensity in inches per hour.
The first comprehensive study of the effect of slope on soil loss was
published by Zingg (1940). He concluded that soil loss varies as the 1.4
power of the per cent slope. He used data by Duley and Hays (1932),
Diseker and Yoder (1936) and from a series of studies he conducted.
For better description of the relationship on the flatter slopes of Midwest claypan soils, and using data of Neal (1938), Smith and Whitt
(1947) proposed the equation:
R = 0.10 + 0.21S

where R is relative soil loss in relation to unity loss from a 3 per cent
slope and S is per cent of slope.
The authors (Smith and Wischmeier, 1957) evaluated the per cent
slope-soil loss relationship on the basis of plot data under natural rainfall secured by several investigators. O. E. Hays, at the Upper Mississippi
Valley Conservation Experiment Station, Lacrosse, Wisconsin, obtained
data that covered 17 years of soil loss measurements from slopes of 3,
8, 13, and 18 per cent on Fayette soil. The plots were cropped to continuous barley for five years, followed by twelve years of corn-oats-meadow rotation with across-slope tillage. A second-degree polynomial
gave a better least-squares fit to these data than did the logarithmic
relationship suggested by the earlier investigators. The constants of
parabolic equations derived from Hays’ data and from Zingg’s rainfall
simulator data were nearly identical when the latter were adjusted for
cropping effect. Data of Van Doren and Gard (1950) and Borst et al.
(1945), each comparing two slopes, if adjusted to conform with conditions at Lacrosse, also fit the equation derived from Hays’ data. The
combined data for the four studies gave a very good least-squares fit
to the equation:
A = 0.43 + 0.30S + 0.043S^2

in which A is soil loss and S is per cent slope.
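As an illustration only (not the authors' procedure), the fitted parabola can be evaluated directly; A is in the relative units of the combined-study fit, and the function name is the writer's.

```python
def parabolic_soil_loss(slope_percent):
    """Soil loss A from the least-squares parabola fitted to the combined
    data of the four studies: A = 0.43 + 0.30*S + 0.043*S**2, where S is
    per cent slope and A is in the relative units of the original fit."""
    s = slope_percent
    return 0.43 + 0.30 * s + 0.043 * s * s

# Loss rises faster than linearly with steepness, as Duley and Hays
# observed: doubling the slope from 9 to 18 per cent roughly triples A.
ratio = parabolic_soil_loss(18) / parabolic_soil_loss(9)
```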
In both the Hays and Zingg studies, runoff increased significantly
with increase in per cent slope although the two soils were quite different, one being a deep loess that sealed under raindrop impact and
the other a loam over a clay subsoil.
2. Length of Slope
Zingg (1940) concluded that total soil loss varied as the 1.6 power
of slope length; and the loss per unit area, as the 0.6 power. His conclusion was based on data from Bethany, Missouri; Guthrie, Oklahoma;
Clarinda, Iowa; Lacrosse, Wisconsin; and Tyler, Texas. A group study
in 1946 under Musgrave (1947) proposed 0.35 as the average value for
the slope length exponent for soil loss per unit area.
In 1956, the results of statistical analysis of data for 532 plot years,
involving simultaneous measurements on two or more lengths of slope
under natural rainfall from 15 plot-study locations in 12 States, were
published by Wischmeier et al. (1958). The analysis showed that the
relationship of soil loss to slope length often varied more from year to
year on the same plot than it varied among locations. The magnitude
of the slope-length exponent appeared to be influenced by soil characteristics, rainfall pattern, steepness of slope, cover, and residue management. However, the data were not adequate to provide quantitative
evaluations of the factor-interaction effects.
Average values of the slope length exponent for the different locations
varied from 0 to 0.9. Magnitude of the exponent appeared definitely to
be related to the effect of slope length on runoff. At Hays, Kansas, and
Temple, Texas, runoff decreased significantly with slope length. From
these data the over-all average value of the exponent did not differ
significantly from zero. However, in the final 7-year period of the 15-year study at Temple, soil loss was proportional to L^0.3. At Guthrie,
Oklahoma, and for corn following bluegrass sod at Bethany, Missouri,
where runoff showed a significant increase with increased slope length,
soil loss varied as L^0.7 and L^0.9, respectively. At the other 11 locations
studied, slope length had no significant effect on runoff. In these studies,
the magnitude of the slope length exponent ranged from 0.27 to 0.60.
In seven of the studies cropping was continuous corn or cotton. In the
other four it was rotational, including row crops.
Average values of the length exponent computed for ten locations in
the Corn Belt and the Northeastern States did not differ significantly at
the 10 per cent level. The mean of the ten exponents was 0.46. A group
study at Purdue University in 1956, which included the authors (Smith
and Wischmeier, 1957), concluded that for field use the value of the
length exponent should be 0.5 ± 0.1.
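For field use, the group-study value of 0.5 for the length exponent implies that per-unit-area soil loss scales with the square root of slope length. A minimal sketch follows, assuming a reference slope length of 72.6 feet (a common plot length; this value is the writer's assumption, not from the text).

```python
def length_adjustment(length_ft, base_length_ft=72.6, exponent=0.5):
    """Ratio of per-unit-area soil loss on a slope of the given length to
    the loss on a reference length, taking loss proportional to
    L**exponent. The 72.6-ft. reference length is an assumption; the
    exponent 0.5 (within 0.5 +/- 0.1 for field use) is the group-study
    value quoted above."""
    return (length_ft / base_length_ft) ** exponent
```

Under this exponent, quadrupling the slope length to 290.4 ft. doubles the expected per-unit-area loss.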
D. COVER AND MANAGEMENT
The greatest deterrent to soil erosion is cover. Cover and management
influence both the infiltration rate and the susceptibility of the soil to
erosion. The most effective vegetative cover is a well-managed, dense sod.
Fields easily eroded are usually those in poorly managed, cultivated
crops. The severest erosion occurs when erosive rainstorms coincide with
periods in the rotation when the soil surface is essentially bare. Sod-based rotations have played a predominant role in runoff and erosion
control. Also, many new tillage practices have been very effective.
Baver (1956) classified the major effects of vegetation on runoff and
erosion into five distinct categories: (1) interception of rainfall by the
vegetative cover; (2) decrease in the velocity of runoff and the cutting
action of the water; (3) root effects in increasing granulation and
porosity; (4) biological activities associated with vegetative growth and
their influence on soil porosity; and (5) the transpiration of water
leading to subsequent drying out of the soil.
Bertoni et al. (1958) observed that final infiltration rates of a soil
varied with season of the year. They suggested that the higher infiltration rates during July were due to increased vegetal cover which protected the soil surface against sealing, to lowered surface moisture, and
to high soil and water temperatures. Higher infiltration rates during the
summer months than during other seasons also were observed by Beutner
et al. (1940), Horner and Lloyd (1940), and Borst et al. (1945). Woodward (1943) also found a linear relation between vegetal cover and infiltration.
In a study of the relation of plant cover to infiltration and erosion in
Ponderosa Pine forests of Colorado, Dortignac and Love (1960) concluded that large pore space of the upper 2 inches of soil and the
quantity of dead organic materials were the two properties that accounted
for most of the variation in infiltration rates among cover types. The ease
of soil dislodgement by rainfall impact varied with types of soil cover,
but soil origin and the amount of exposed bare soil were the main factors.
In 15 years of soil-loss measurements, Horner (1960) found that the
kind and amount of cover provided during the winter season was the
dominant factor affecting runoff and erosion on Palouse silt loam at
Pullman, Washington, where large summer storms with high intensities
were few. Land seeded to winter wheat was more vulnerable to erosion
than any other winter cover condition common to the area. Sod-based
rotations provided more effective erosion control and soil organic matter
maintenance than did cropping systems without the meadow. Summer
fallowing caused the largest erosion losses and the most rapid depletion
of organic matter. Melting snow on frozen soil contributed significantly
to erosion hazards in this area. Exceedingly high soil losses from a gentle
rain falling on soil thawed to a depth of 4 inches were also reported by
Bay et al. (1952) in Wisconsin.
Taylor and Hays (1960) found that a heavy mulch of chopped
cornstalks and manure provided excellent erosion control on corn following corn on Fayette silt loam of 16 per cent land slope. Whitaker
et al. (1961) found that fertilization adequate to produce high crop
yields and large quantities of plant residues greatly reduced the formerly
serious soil and water losses from sloping claypan soils. Seedbed
preparation by subtillage, which left shredded cornstalks on or near the
surface, significantly reduced erosion losses even from very high-intensity storms.
Shredding the cornstalks increases their wintertime erosion control
value. In studies on Warsaw loam and Russell silt loam with about 4
per cent slope, Meyer and Mannering (1961b) found that soil loss
associated with shredded cornstalks was slightly less than half that from
cornstalks as left by mechanical pickers when rainfall was applied
artificially at 2.4 inches per hour for 60 minutes. However, one trip over
the shredded stalks with a disk significantly increased the soil content
of the runoff.
The importance of crop residues, cover crops, and sod-based rotations
in control of runoff and erosion in the Southern Piedmont soil area has
been shown by studies at Watkinsville, Georgia (Carreker and Barnett,
1949; Barnett, 1959). Analyzing data from the blackland prairie of
central Texas, Adams et al. (1958a) found that both runoff and soil
loss from corn managed by subsurface tillage methods were significantly less when the corn followed fescue than when it followed another
year of corn.
Krall et al. (1958) found that stubble mulch fallow provided better
erosion control than did other methods of summer fallowing in the
semiarid areas. In Wyoming, Barnes and Bohmont (1955) found that
land in grass as commonly left after haying operations absorbed water
at a rate 25 per cent lower than did land with “trashy” fallow. Both
conditions absorbed from 30 to 75 per cent more water in an hour than
did bare fallow land. Raking and baling loose straw from a stubble field
reduced water intake rate by more than 30 per cent. Burning the
residue reduced water absorption by nearly 50 per cent. Duley (1960)
reported that in a three-year grain rotation, plowed land lost 2.6 times
as much water and 4.8 times as much soil in runoff as did stubble-mulched
land over a 20-year period. The amount of water stored in the soil
during summer fallow depended on the amount of residue present on the surface.
In a study by Mannering and Meyer (1961), 6¼ inches of simulated
rain were applied at 2½ inches per hour on various quantities of straw
mulch spread over freshly plowed and disked wheat stubble on Wea
silt loam with 5 per cent slope. With no mulch, soil loss was 12 tons
per acre, but with 2 tons of mulch per acre no runoff or erosion occurred.
With 1 ton of mulch per acre, soil loss was reduced to ¼ ton per acre;
with ½ ton of mulch, to about 1 ton per acre.
Minimum tillage practices in which the corn-planting operation
coincides with or immediately follows plowing with moldboard plows,
and fitting operations with disk and harrow are omitted, have gained in
popularity in recent years. Minimum tillage provides erosion and runoff
control because of larger aggregate or clod size and decreased compaction
(Free, 1960a). Infiltration is increased and erosion is reduced during
the highly vulnerable seedbed and crop establishment periods. Quantitative data on the magnitude of erosion-control benefits from minimumtillage practices are too limited to permit evaluating the interaction
effects of soils, slope, cover, and row direction.
Meyer and Mannering (1961a) compared plow-plant as a single
operation with planting on a seedbed fitted by two diskings with a
trailing harrow on Russell silt loam of 5 per cent slope that previously
had been in meadow. Runoff and soil losses were measured from three
5.2-inch applications of simulated rain at 2.6 inches per hour. In a test
about 2 weeks after seeding, losses from the minimum-tillage plots
averaged 63 per cent of the runoff and 52 per cent of the soil loss from
the fitted seedbeds. When the corn in both treatments was cultivated
to prevent surface crusting, reductions in both runoff and soil loss by
the minimum-tillage practice were still apparent even after corn harvest,
but the magnitude of the benefits decreased significantly with successive
cultivations and increased vegetative growth. When corn cultivations
were omitted on the minimum-tillage areas, both runoff and erosion
were greatly increased as a result of surface crusting. Soil loss from this
treatment was greater than that from the corn planted on fitted seedbed
and cultivated after emergence.
Swamy Rao et al. (1960) found that minimum tillage for corn
resulted in a higher rate of infiltration, less soil resistance to penetration,
lower bulk density, and less soil compaction due to tractor and implement
traffic. The data for fallow periods in rotations measured in other studies
(Wischmeier, 1960) show that the magnitude of the benefits attainable
with minimum-tillage practices may be expected to depend upon crop
sequence, quality of meadows in the rotation, and quantity of residues.
Hays (1961) reduced total soil and water losses from rotations by
spacing corn rows 60 inches apart and interseeding legumes in the
corn after the second cultivation. Rains soon after the interseeding caused
increased soil and water losses due to the smoothing and packing of the
soil by the seeder, but after the seeding became established losses
were significantly reduced. The quality of meadows established by this
procedure was good.
Effects of specific cover, sod crop sequences, tillage practices, and
residue managements on field soil loss have been investigated in
cooperative USDA and State Agricultural Experiment Station plot studies
under natural rainfall at more than 45 locations in 23 States and Puerto
Rico. The data have generally been analyzed and reported by the study locations.
Cover and management data have usually been reported either
on a crop-year basis or as rotation averages. Crop-year losses reflect the
effect of different crops or crop sequences, but do not show the cause of
favorable or unfavorable results. These can be more readily discerned
if the individual-storm data are analyzed on the basis of different stages
of crop growth.
To study the relations of cover, crop sequence, productivity level,
and residue management to soil loss, Wischmeier divided each crop row
into five crop stages, defined for relative uniformity of cover and residue
effects as follows: (1) rough fallow, turnplowing to seedbed preparation; (2) seedbed, first month after crop seeding; (3) establishment,
second month after crop seeding; (4) growing cover, from 2 months
after seeding until harvest; (5) stubble or residue, harvest to plowing
or new seedbed.
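The five stages can be sketched as a simple classifier. The 30-day month boundaries and the function's inputs are the writer's simplifications of the definitions above, not the authors' bookkeeping; stage 1 (rough fallow, plowing to seedbed preparation) is flagged here by passing None for days since seeding.

```python
def crop_stage(days_since_seeding, harvested=False):
    """Classify a point in the cropping calendar into the five crop
    stages defined above. Simplified sketch: months are taken as
    30-day blocks, and rough fallow is signaled with None."""
    if harvested:
        return 5                  # stubble or residue: harvest to plowing
    if days_since_seeding is None:
        return 1                  # rough fallow: plowing to seedbed preparation
    if days_since_seeding < 30:
        return 2                  # seedbed: first month after crop seeding
    if days_since_seeding < 60:
        return 3                  # establishment: second month after seeding
    return 4                      # growing cover: 2 months after seeding to harvest
```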
On these bases, soil losses from the cropped plots were compared with
losses from tilled continuous fallow under identical rainfall, soil, and
topographic conditions. Ratios of these losses, expressed as percentages,
were published in the form of a ready-reference table (Wischmeier,
1960). About a quarter million individual-storm soil loss measurements
were available for this study.
Highly significant inverse correlations between crop yields and
erosion losses were apparent in the data. In general, crop yields appeared
to provide a fair indication of the combined effects of such variables as
density of canopy, rate of water use by the growing crop, and quantity
of crop residues.
Specific-year erosion losses from corn after meadow ranged from
14 to 68 per cent of corresponding losses from adjacent continuous corn.
The effectiveness of grass and legume meadow sod plowed under
before corn in reducing soil loss from the corn was, in general, directly
proportional to meadow yields. Its erosion-control effectiveness was
greatest during the fallow and corn-seedbed periods. The residual effect
of grass and legume mixtures was greater than that of legumes alone.
Direct comparisons of corn following first, second, and third years of
meadow, though limited, indicated that second-year meadow, when
allowed to deteriorate, was less effective than one year of meadow.
When succeeding meadows were more productive than first-year, they
were usually more effective in reducing erosion from corn in the following year.
When the corn residues were removed at harvest time, soil losses
from corn after corn were high and yields were usually low. Soil
losses from the growing corn under these conditions were from 35 to
50 per cent of those from adjacent continuous fallow. Leaving the
cornstalks and plowing them under in spring significantly decreased
erosion during the following corn year as well as during the winter
period. Effectiveness of the corn residues turned under was directly
related to the quantity of residues available and was greatest during
the fallow and seedbed periods.
Erosion from clean-tilled cotton during the growing period appeared
to be about 50 per cent more than from corn under similar management,
soil, and rainfall. Soybean data were too limited to reveal a significant
difference in average annual erosion from beans in 42-inch rows as
compared with corn in comparable sequence. Erodibility of fallow soil
occurring for brief periods in crop rotations was influenced more by crop
sequence and the nature and quantity of residues turned under than
by the inherent characteristics of the soil itself.
The erosion control effectiveness of winter cover seedings depended
upon time and method of seeding, time of plowing, rainfall distribution,
type of cover seeded, and density of cover produced. Covers such as
vetch and ryegrass seeded early enough to attain good fall growth
and turned in April were effective in reducing erosion not only in the
winter months, but also in the following crop year (Uhland, 1958).
Small grain alone seeded in corn or cotton residues and plowed under
in the spring showed no residual erosion-reducing effect after the next
year’s corn or cotton planting.
E. EROSION CONTROL PRACTICES

Contour tillage and planting, strip cropping, terracing, waterways,
and gully control structures are generally included under erosion control
practices. Sometimes they are referred to as supporting practices.
Tillage practices, sod-based rotations, fertility treatments, and other
cropping-management practices discussed in the preceding section are
not included in this group, although it is recognized that they contribute
materially to erosion control and frequently provide the major control
in the farmer’s field. This discussion will be confined to the first three
of the listed practices.
Contour planting of crops has been in general an effective practice.
It functions, however, only to control runoff or scour erosion and then
only for those storms that are low to moderate in extent or until the
capacity of the rows to hold or conduct runoff is exceeded. In field
practice, key rows are either level or have a grade toward a waterway.
Because of land slope irregularities, row breakage is frequent with the
larger runoff storms. When this occurs losses may equal or exceed those
from up- and downslope planting (Smith et al., 1945; Moldenhauer and
The effectiveness of contour planting and tillage in erosion control has
varied with slope, crop, and soil (Smith and Whitt, 1947; Van Doren
et al., 1950; Van Doren and Bartelli, 1956; Tower and Gardner, 1953).
Its maximum effectiveness in relation to up- and downhill rows is on
medium slopes and on deep, permeable soils that are protected from
sealing. The relative effectiveness decreases as the land slopes become
either very flat or very steep. Row shapes as secured with listing increase
the channel capacity and, therefore, increase the average annual
effectiveness of farming on the contour. However, when row breakage
occurs, the results are disastrous because of the sudden release of large
quantities of impounded water. The ratio of soil loss with contouring
to that from up- and downhill rows is generally considered to be 0.5 for
slopes of from 2 to 7 per cent, 0.6 for slopes down to 1 per cent and up
to 12 per cent, 0.8 for 12 to 18 per cent, and 0.9 for 18 to 24 per cent.
With the development of land-forming machinery and techniques,
controlled row grades and shapes, designed to handle those storms
causing the bulk of runoff and erosion, became possible for those soils
amenable to management practices applied after reshaping.
Strip cropping-a practice in which contour strips of sod alternate
with strips of row crops-has proved to be a more effective practice
than contouring alone. The sod acts as filter strips when row breakage