Pacific Northwest Conservation
Tillage Handbook Series No. 19
Occurrence of the Rhizoctonia "Bare Patch" Disease in Diverse Direct-Seed Spring Cropping Systems in a Low-Precipitation Zone
Cook, WSU Endowed Chair in Wheat Research, Pullman
The percentage area of patches of wheat plants stunted by Rhizoctonia solani AG8 in years 3 and 4 of a direct-seed (no-till) cropping systems study conducted on the Ron Jirava farm 5 miles west of Ritzville, Washington, was the same for continuous spring wheat (no crop rotation), spring wheat after spring barley, and 1st- or 2nd-year spring wheat after consecutive crops of safflower and yellow mustard. A similar percentage area of patches occurred in plots sown to spring barley after spring wheat and in the safflower and mustard. Greenhouse studies confirmed that safflower and mustard, as well as several other broadleaf crops, are susceptible to the root rot caused by this same pathogen. Between years 3 and 4, some patches increased in size, some new patches formed, and a few patches present in year 3 were not present in year 4. A one-time application of zinc at 1.0 lb/ac at planting elicited no visible response from the stunted plants where the application passed through one side of a patch. The effect of crop rotation on grain yield of spring wheat was related to water supply, with lower yield after the broadleaf crops because they extract more water (leaving less for the next crop) than either wheat or barley. Considering that these broadleaf crops provide no apparent benefit for Rhizoctonia root rot control, and that they leave less soil water available for the ensuing one or two cereal crops, growers in low-precipitation areas of the inland PNW are probably better off planting continuous cereals.
Cropping in the low-precipitation (less than 12 inches annually) region of
the U.S. inland Pacific Northwest (PNW) is mostly a tillage-based winter
wheat-summer fallow system where only one crop (winter wheat) is produced
every 2 years. This dry region covers 3.5 million cropland acres in eastern
Washington and adjacent north-central Oregon. The main purpose of the
13-month fallow is to store in the soil a portion of the over-winter precipitation
to allow successful establishment of winter wheat, help ensure economic
grain yields and reduce the risk of crop failure from drought. However,
summer fallow in this region is only about 30% efficient (Leggett et al.
1974), i.e., about 70% of annual average precipitation received during
the fallow year is lost because of runoff and evaporation before the winter
wheat is planted. Moreover, intensive tillage during the fallow cycle
frequently buries surface residue, pulverizes soil clods, and reduces
surface roughness. Blowing dust from excessively tilled fallow fields
leads to major soil losses and causes concerns for air quality.
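The fallow-efficiency figure above amounts to a one-line water budget. A minimal sketch of that arithmetic (the 11 inches of fallow-period precipitation is a hypothetical round number; only the roughly 30% storage efficiency comes from the text):

```python
# Rough water budget for a 13-month summer-fallow cycle.
# The ~30% storage efficiency follows Leggett et al. (1974) as cited
# in the text; the precipitation amount is an illustrative placeholder.

def water_stored(precip_in, efficiency=0.30):
    """Inches of precipitation retained in the soil profile at planting."""
    return precip_in * efficiency

precip = 11.0                  # hypothetical fallow-year precipitation, inches
stored = water_stored(precip)  # water left for the winter wheat crop
lost = precip - stored         # runoff + evaporation losses

print(f"stored: {stored:.1f} in., lost: {lost:.1f} in. ({lost / precip:.0%})")
```

Under these assumed numbers, roughly 70% of the fallow-period precipitation never reaches the crop, which is the loss the text describes.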
Although precipitation in the region has historically been considered too low for annual cropping,
a 3-year winter wheat-spring cereal-fallow rotation is practiced on about
10% of the cropland. Schillinger et al. (1999) reported that both direct-seeded
(no-till) and conventionally-seeded spring cereals can be successfully
produced in this low-precipitation area when over-winter water recharge
occurs to a soil depth of 3 feet or more and 5 inches of plant-available
water are stored in the soil. Although continuous annual cropping is practiced
on less than 1% of the land, there is interest by both growers and researchers
in long-term annual cropping using direct seeding.
Replacement of the
winter wheat-fallow system with continuous spring cropping eliminates
several important diseases of wheat in the PNW, namely the snow molds,
Cephalosporium stripe, and Pseudocercosporella foot rot, all of which
occur on winter wheat but not spring cereals. In contrast, Rhizoctonia
root rot caused by Rhizoctonia solani AG8 is rare or nonexistent on wheat
following a 1-year break to fallow but occurs with continuous cereals,
especially if volunteer cereals and grass weeds are allowed to grow within
the stubble between crops (Smiley et al. 1992), and is exacerbated by
direct seeding (Pumphrey et al. 1987; Weller et al. 1986). When severe,
the disease occurs as patches of stunted plants and is referred to in
early literature of Australia as "bare patch." Because of its
wide host range, rotation of cereals with a broadleaf crop has not controlled
this disease. For example, Rovira and Venn (1985) reported that lupines used
in a 2-year rotation with wheat and direct seeding in South Australia
controlled take-all caused by Gaeumannomyces graminis var. tritici but
not Rhizoctonia root rot caused by R. solani AG8.
As part of a larger long-term study of the potential for continuous direct-seed spring cropping in the low-precipitation area of eastern Washington, we compared the incidence and severity of Rhizoctonia root rot in response to 4 years of (i) continuous wheat, (ii) a 2-year wheat-barley rotation, and (iii) a 4-year safflower-yellow mustard-wheat-wheat rotation. While primarily intended to evaluate the agronomic and economic potential of these three spring cropping systems, the 4-year rotation was designed specifically to test the effects of back-to-back broadleaf crops on the epidemiology of Rhizoctonia root rot. Thongbai et al. (1993a,b) reported that zinc applications to South Australian soils deficient in zinc provided some control of Rhizoctonia root rot. We, therefore, also tested the effects of zinc application in a year when patches of stunted plants caused by this disease were particularly severe.
MATERIALS AND METHODS
The experimental design was a randomized complete block with four replications. Each crop
in each rotation occurred each year in 60 × 500-ft plots, making a total
of 28 plots. During the first 3 years (1997, 1998, and 1999), all plots
were planted and fertilized in one-pass directly into the undisturbed
soil and residue left by the previous crop using Jirava's Flexi-Coil 6000
(Flexi-Coil Ltd., Saskatoon, Sask.) airseeder equipped with Barton II
dual-disk openers on 7.5-inch spacing for simultaneous and precision placement
of seed and fertilizer in the same row. In 2000, all plots were planted
and fertilized in one-pass using a custom-built drill equipped with Cross-slot
(Baker Mfg., Christchurch, NZ) notched-coulter openers on 8-inch spacing
for simultaneous and precision placement of seed and fertilizer in the
same row. Both openers are low-disturbance and place fertilizer beneath
and slightly to one side of the seed. Glyphosate (Roundup) was applied
2-4 weeks before planting in the spring at a rate of 16 oz./acre to control
weeds and limit the build-up of inoculum of R. solani AG8 on living hosts
between harvest and planting (Roget et al. 1987; Smiley et al. 1992).
Seeding rate averaged across years was 70, 70, 20, and 8 lb/ac for wheat,
barley, safflower, and mustard, respectively. Solution 32 provided the
base for liquid fertilizer to supply an average of 36 lb N, 10 lb P, and
15 lb S per acre. The quantity of available soil moisture and residual
N, P, and S was measured in all rotations each spring to determine fertilizer
needs based on a yield goal. Between the tillering and stem elongation
phase of growth of wheat and barley, in-crop broadleaf weeds were controlled
with 2,4-D LV-6 at 12 oz./ac plus 0.3 oz./ac Harmony Extra. In-crop herbicides
were not used in safflower or mustard plots as no legally-labeled broadleaf
weed herbicides were available for these crops in Washington.
All plots were harvested with the grower's combine, and grain yield was determined on site by auguring grain into a truck mounted on weigh pads. When post-harvest broadleaf weeds were present in cereals (1999 only) and broadleaf crops (all years), Surefire herbicide (paraquat + diuron) was applied at 24 oz./ac 7-10 days after harvest to prevent seed production and halt soil water use by these weeds.
Incidence of Rhizoctonia root rot and area as patches
Patches appeared in 1999 and 2000. A global positioning system (GPS) equipped with mapping software was used to determine the location, size, and area of these patches. These measurements were obtained in all wheat and barley plots in 1999 and 2000, and in safflower and mustard plots in 2000, by circling each clearly visible Rhizoctonia patch with the backpack-mounted GPS mapping unit in early June. The GPS unit was accurate to within 3 feet.
Tests for susceptibility of plant species to Rhizoctonia solani AG8
Greenhouse studies were conducted to determine the relative susceptibility of several broadleaf plant species to R. solani AG8. The plant species were canola, mustard, safflower, pea, lentil and chickpea, all potential rotation crops with wheat in the PNW. A virulent isolate [C-1; (Weller et al. 1986)] of the pathogen obtained originally from barley and grown on sterilized oat grains was used as the source of inoculum. Plastic containers were filled two-thirds full with a local silt loam soil that had been pasteurized with moist heat at 140°F for 30 minutes and then air-dried. Two Rhizoctonia-infested oat grains were placed on the soil surface and covered with a 1-inch-thick layer of the same pasteurized soil. The soil was then moistened with water applied from the top and incubated for 1 week; this allowed time for the pathogen to colonize the pasteurized soil from the oat-grain food base. Seeds of the different crop plant species were then planted, one seed per container for large-seeded plants and two seeds per container for small-seeded crops, and incubated on a greenhouse bench at 60°F with natural light supplemented with overhead mercury-halide lights. Pasteurized soil of the same quantity and incubated in the same fashion but without oat-grain inoculum of the pathogen served as checks. A total of seven containers with and seven without Rhizoctonia inoculum were used per plant species. The plants were watered as necessary and allowed to grow for 3-4 weeks, at which time the amount of disease was assessed based on appearance of the tops (plant stunting) and the roots (girdled or severed by Rhizoctonia lesions). The tests were done three times.
Tests of the effects of zinc application on the development of Rhizoctonia patches
At planting, one pass was made the full length of each 500-ft-long plot with the 8-ft-wide Cross-slot drill equipped to apply zinc at 1.0 lb/ac as zinc-chelate mixed with the liquid fertilizer. All other drill passes were made using the same drill and same fertilizer rate but without the zinc. Plant samples were taken when wheat was in the stem elongation phase (Feekes 7) from areas within each of the four replicate plots from (i) healthy plants where no zinc was applied, (ii) healthy plants where zinc was applied, and (iii) within patches (stunted plants) where zinc was applied. These samples were oven-dried prior to analysis for N, P, K, S, Ca, Mg, Mn, Cu, B, and Zn.
Analysis of data
RESULTS
Incidence and severity of Rhizoctonia root rot and patches
Patches of plants
stunted because of Rhizoctonia root rot were apparent in all wheat and
barley plots starting in year 3 (1999), and the area in patches was even
greater in year 4 (2000) of the study (Table 2; Figures 1 and 2). Poor
stands in both the mustard and safflower in 1999 made it unreliable to
quantify the patches for those crops that year. When all data on percentage
area of patches for a given year were analyzed together, there was no
significant difference in area of patches among any of the rotations,
including, in 2000, 1st-year wheat after two consecutive broadleaf
crops (Table 2). On the other hand, when the area of patches was analyzed
for wheat plots only, the 4.5% area as patches in 1st-year wheat after
the two consecutive broadleaf crops was significantly less (P < 0.1)
than the 11.9% area as patches in the continuous spring wheat plots (Table 2).
An overlay of patched areas in 2000 with that in 1999 showed that some new patches had appeared and others had disappeared in 2000 compared with 1999 (Figures 1 and 2). Averaged across treatments, 73% of the land area with patches in 2000 was in the same areas that were in patches in 1999; the remaining 27% of 2000 patches were in areas not stunted in 1999 (data not shown). Similarly, 25% of the total land area with patches in 1999 showed no symptoms of Rhizoctonia root rot in 2000. An example of these patterns is shown in Figure 2.
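The year-to-year overlay reduces to set arithmetic on the mapped patch areas. A minimal sketch, representing each year's GPS-mapped patches as sets of grid cells (the cell coordinates below are made up for illustration, not the actual field data):

```python
# Year-over-year comparison of Rhizoctonia patch areas, treating each
# year's GPS-mapped patches as a set of small grid cells (the mapped
# polygons would first be rasterized to such cells).

def patch_overlap(cells_y1, cells_y2):
    """Return (% of year-2 patch area that was also patched in year 1,
               % of year-1 patch area showing no patches in year 2)."""
    persisted = len(cells_y1 & cells_y2) / len(cells_y2) * 100
    recovered = len(cells_y1 - cells_y2) / len(cells_y1) * 100
    return persisted, recovered

# Hypothetical example: two partially overlapping blocks of cells.
patches_1999 = {(x, y) for x in range(4) for y in range(4)}
patches_2000 = {(x, y) for x in range(2, 8) for y in range(4)}

persisted, recovered = patch_overlap(patches_1999, patches_2000)
print(f"{persisted:.0f}% of 2000 patch area was patched in 1999; "
      f"{recovered:.0f}% of 1999 patch area recovered by 2000")
```

The 73%/27% and 25% figures reported above are these same two quantities computed from the actual GPS maps.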
Table 1. Incidence of Rhizoctonia root rot (caused by Rhizoctonia solani AG8) on spring wheat and spring barley in the second year (1998) of different planned crop rotations.
Table 2. Percentage
plot area with patches of stunted plants caused by Rhizoctonia solani
AG8 in the different cropping systems and years.
Fig. 1. Distribution of Rhizoctonia-incited patches in 1999 (A) and 2000 (B) as affected by cropping system and based on documentation by global positioning system (GPS). Yellow = patches in 1999; Blue = patches common in both 1999 and 2000; Red = new patches in 2000. Note: Rhizoctonia patches were not measured in treatments 1 and 2 in 1999.
Fig. 2. Magnified view of a portion of the experiment area showing Rhizoctonia-incited patches in 1999 only (yellow), patches common in both 1999 and 2000 (blue), and new patches in 2000 (red).
Susceptibility of plant species to R. solani AG8 in the greenhouse
Influence of zinc
applications on Rhizoctonia root rot
Laboratory analyses revealed only minor chemical differences between soil samples collected within and outside of Rhizoctonia-incited patches (Table 3). Soil pH and extractable potassium were both lower, and manganese was higher, in areas where patches were observed in 1999, but despite statistical significance, these differences were not agronomically meaningful. Nutrient concentrations within the plant tissues were all within the sufficiency range reported for small grains (Westfall et al. 1990). Tissue concentrations of zinc in response to zinc applications were not higher in healthy plants but were higher, along with manganese and boron, in diseased plants (Table 3).
Table 3. Soil pH
and plant nutrient concentrations in soil (A) before application of zinc
and in plants (B) after application of zinc with the nitrogen fertilizer
solution across areas with healthy plants and plants in patches caused
by Rhizoctonia solani AG8.
Grain yields during two years with severe Rhizoctonia root rot
Table 4. Spring crop
yields in three rotations in Adams county, Washington: a 4-year safflower-
mustard-wheat-wheat rotation; a 2-year wheat-barley rotation; and continuous
wheat. All crops were planted with a Flexi-Coil 6000 drill in 1997, 1998,
and 1999, and a Cross-slot drill in 2000.
DISCUSSION
Rhizoctonia root rot occurred at a low but uniform level on both wheat and barley in the
2nd year of the cropping sequence, occurred as patches starting in the
3rd year, and produced the largest area of patches in the 4th year in
all three cropping systems. The larger percentage area of patches in year
4 than in year 3 could have resulted from any one or a combination of
at least three factors: (i) the higher precipitation in 2000 compared
with 1999 could have provided more favorable conditions for the disease;
(ii) the use of a lower-disturbance drill in year 4 compared to years
1-3 could have favored the disease (MacNish, 1985); and/or (iii) the pattern
reflects a natural and progressive increase in disease with years of annual
cropping. Roget (1995), working
in South Australia similarly reported for direct-seed continuous wheat
and wheat alternated in 2-year rotations with volunteer pasture, peas,
or medic, all direct seeded, that the severity of Rhizoctonia root rot
based on a root rating increased during years 1 to 3 of the study and
the percentage area of patches caused by this disease increased during
years 1 to 4 of the study regardless of the crop rotation. Interestingly,
the severity rating and percentage area of patches declined progressively
in all four cropping systems tested by Roget (1995) in South Australia
after years 3 and 4, respectively, confirming the report of Lucas et al.
(1993) from Oregon of a decline in Rhizoctonia root rot with continuous
cropping of wheat. It is still too early in the cropping sequence to expect
Rhizoctonia root rot decline in the Jirava experiment.
The size, shape and
distribution of patches mapped by GPS in our study are remarkably similar
to the size, shape, and distribution of patches caused by Rhizoctonia
root rot in cereals and mapped by MacNish (1985) in Western Australia.
MacNish (1985) also observed that some years favored more or larger patches
than other years, and that with years, some patches expanded, some remained
the same in size and shape for at least two seasons, some disappeared
completely and new patches appeared. He proposed that a changing balance
of conducive and suppressive factors over space and time, interacting
with nonrandom distribution of primary inoculum, can explain the patterns
observed where these patches are mapped in the same field year after year.
The wide host range
of R. solani AG8 has been well documented (Rovira, 1996; Rovira and Venn,
1985). Nevertheless, different kinds of crops have diverse effects on
the soil environment, they have tap versus fibrous root systems, and they
produce various amounts of crop residue or the residue decomposes at different
rates when left on the soil surface in no-till systems. Depending on the
extent of these differences, the amount of disease in this low-precipitation
area could also differ, at least between cropping systems as dissimilar
as our 4-year rotation and continuous-wheat system. The 4-year rotation
in the Jirava study was designed to augment any benefit of broadleaf crops
for control of root disease by including two broadleaf crops back-to-back
before returning to wheat. Previous studies on rotational effects of broadleaf
crops have been limited to a single broadleaf crop as a break crop before
wheat (Roget, 1995; Rovira and Venn, 1985). In spite of the differences
in crops and rotations, the incidence and severity of Rhizoctonia root
rot was similar if not the same on wheat whether the cropping system was
continuous wheat, a 2-year barley-wheat rotation, or a 4-year safflower-mustard-wheat-wheat
rotation.
Smiley et al. (1996)
concluded that differences in severity of Rhizoctonia root rot of winter
wheat in 2-year, wheat-fallow and wheat-pea rotations at Pendleton, OR
(16 inch annual precipitation) could be explained by an inverse correlation
between microbial biomass and severity of Rhizoctonia root rot, with higher
microbial biomass and less Rhizoctonia root rot (presumably due to greater
disease suppression) in the wheat-pea than in the wheat-fallow rotation.
Microbial biomass at the site where our work was done, although not measured,
would be low in all treatments, in spite of being cropped every year,
because of low amounts of crop biomass produced under the dry conditions
of this site. This might account for the highly conducive nature of the
site to this disease.
It is highly unlikely
that the severe patches caused by R. solani AG8 in the first wheat crop
following two consecutive broadleaf crops were due to survival of the
pathogen in old wheat residue over the 2 years since wheat was last grown
in these plots. Such survival is possible with the pathogens responsible
for take-all, Cephalosporium stripe, and Pseudocercosporella foot rot,
all of which establish in wheat straw or stem bases, parts of the wheat
plant that are slow to decompose, especially under very dry conditions.
Rhizoctonia solani AG8, on the other hand, is limited almost exclusively
to root tissues, which are relatively quick to decompose once the plant
is dead. Even a short plant-free period, such as waiting an extra week between
spraying and seeding, can greatly reduce the severity of this Rhizoctonia
root rot (Roget et al. 1987; Smiley et al. 1992). It seems more likely
that, since all crops in these systems are hosts and since the period
of time from planting (mid March to early April) to harvesting (August
for mustard, barley, and wheat and September for safflower) was approximately
the same, the amount of primary inoculum for production of Rhizoctonia
root rot in the next crop was also then approximately the same.
There was no visual
plant response to zinc applied at 1.0 lb/ac below the seed at planting,
including where the application passed through just one side of the patches.
We expected no plant growth response in the apparently healthy areas,
since the zinc concentration in the soil was within the range considered
adequate for wheat growth, but did expect a response to zinc within the
patches if this nutrient has any potential for an ameliorating effect
on this disease. Since this study was done only once, on a site with no
apparent soil deficiency in zinc, it is possible that the same test on
a zinc-deficient site would show evidence of mitigation of Rhizoctonia
root rot. Conceivably some as yet unidentified factor but not zinc deficiency
is predisposing crops to severe Rhizoctonia root rot at this site.
Since the amount of Rhizoctonia root rot on wheat and barley was generally
the same in all cropping systems, the yields of these crops also were
generally the same. There were differences in soil water content, especially
in year 4 where plots planted to safflower and mustard the previous 2
years had about 1.5 inches less water available for wheat compared to
continuous cereals, and this was reflected in an average of 10% lower
yields of wheat in the 4-year rotation than in either the 2-year barley-wheat
rotation or continuous wheat (Table 4). Safflower and mustard are among
the few broadleaf crops available for use in crop rotations in this low-precipitation area.
Spring wheat and spring barley grain yields in the range of 40-45 bu/ac, averaged over the 4 years of continuous cereals, were about 65% of the grain yields achieved by farmers in neighboring fields during the study period growing one crop every other year in the 2-year winter wheat-fallow rotation.
Because a spring cereal crop is harvested every year, total production
over each 2-year period would be 130% of the yield of winter wheat alternated
with fallow. Grain yield in years 3 and 4, when Rhizoctonia root rot was
most severe, could have been 5-10% higher without the Rhizoctonia-incited
patches. These yields were produced using just four or five field operations
per year: A pre-plant application of glyphosate herbicide, one pass to
plant and fertilize, an in-crop broadleaf herbicide application, harvest
with a combine, and a post-harvest herbicide application (1999 only for
cereals). The winter wheat-fallow system, on the other hand, typically
uses eight or more tillage operations to prepare a seedbed, control weeds,
fertilize and plant in addition to in-crop herbicide application and harvest.
Excessively-tilled fallow is a major source of dust in the region and
also leaves the land vulnerable to water erosion when soils are frozen
in the winter months. In contrast, direct-seeded cropland is protected
from erosion by the standing stubble and other undisturbed residue of
the previous cereal crop.
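The two-year production comparison above works out as simple arithmetic. A sketch of the calculation, using the roughly 65% relative yield from the text (the 65 bu/ac winter wheat-fallow yield is a hypothetical round figure, not a measured value from this study):

```python
# Two-year grain production: continuous spring cereals vs. winter
# wheat-fallow. The 65% relative yield comes from the text; the
# absolute wheat-fallow yield is an assumed round number.

wf_yield = 65.0                  # bu/ac, winter wheat after fallow (hypothetical)
spring_yield = 0.65 * wf_yield   # spring cereal yield, ~65% of winter wheat

wf_2yr = wf_yield                # one crop every two years
spring_2yr = 2 * spring_yield    # a crop harvested every year

print(f"2-year production: wheat-fallow {wf_2yr:.1f} bu/ac, "
      f"continuous spring cereals {spring_2yr:.1f} bu/ac "
      f"({spring_2yr / wf_2yr:.0%} of wheat-fallow)")
```

Whatever absolute yield is assumed, two crops at 65% relative yield total 130% of one wheat-fallow crop, which is the figure cited above.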
There are reports that other deep-rooted broadleaf crops reduce disease pressure and enhance grain yield of the subsequent wheat crop (Angus et al. 1991). However, considering that these broadleaf crops provide no apparent benefit for Rhizoctonia root disease control and leave less soil water available for the ensuing one or two cereal crops, growers in low-precipitation areas of the inland PNW are probably better off planting continuous cereals.
We thank Steve Schofstoll, WSU Technical Assistant, and Ron Sloot, WSU Agricultural Research Technician, for their excellent support with these experiments.
Conservation Tillage Handbook Series publications are jointly produced
by University of Idaho Cooperative Extension System, Oregon State University
Extension Service and Washington State University Cooperative Extension.
Similar crops, climate, and topography create a natural geographic unit
that crosses state lines in this region. Joint writing, editing, and production
prevent duplication of effort, broaden the availability of faculty, and
substantially reduce costs for the participating states.
The Pacific Northwest
Conservation Tillage Handbook is a large, three-ring binder handbook that
is updated with new and revised Handbook Series publications. It was initiated
in 1989 as a PNW Extension publication in Idaho, Oregon and Washington.
Updates to the Handbook are provided when the updating card is returned.
By the end of 2001, 54 new PNW Conservation Tillage Handbook Series publications had
been added to the original 98. Copies of the complete Handbook are available
for $20 through county extension offices in the Northwest or ordered directly
by calling state extension publication offices: Idaho -- (208) 885-7982;
Oregon -- (541)-737-2513; Washington -- (509) 335-2999 (some shipping
and handling charges and sales tax may apply).
It's now accessible on the Internet! All of the PNW Conservation Tillage Handbook and Handbook Series are on the Internet home page (http://pnwsteep.wsu.edu), Pacific Northwest STEEP III Conservation Tillage Systems Information Source. The home page also contains recent issues of the PNW STEEP III Extension Conservation Tillage Update, listings of other conservation tillage information resources, coming events and much more. For more information on the Handbook or other Web site information, contact Roger Veseth, WSU/UI Conservation Tillage Specialist, Plant Soil and Entomological Sciences Department, University of Idaho, Moscow, ID 83844-2339, phone 208-885-6386, FAX 208-885-7760, e-mail (email@example.com).
Cooperative Extension programs and policies comply with federal and state laws and regulations on nondiscrimination regarding race, color, gender, national origin, religion, age, disability, and sexual orientation. The University of Idaho Cooperative Extension System, Oregon State University Extension Service and Washington State University Cooperative Extension are Equal Opportunity Employers.
Contact us: Hans Kok, (208) 885-5971