Occurrence of the Rhizoctonia “Bare Patch” Disease in Diverse Direct-Seed Spring Cropping Systems in a Low-Precipitation Zone

Pacific Northwest Conservation Tillage Handbook Series No. 19
Chapter 4 – Plant Disease Management Strategies, May 2002

Authors

R. James Cook, WSU Endowed Chair in Wheat Research, Pullman
William Schillinger, WSU Dryland Research Agronomist, Lind
Neil Christensen, OSU Extension Soil Scientist, Corvallis
Ron Jirava, Grower, Ritzville
Harry Schafer, WSU Research Technician, Lind

Abstract

The percentage area of patches of wheat plants stunted by Rhizoctonia solani AG8 in years 3 and 4 of a direct-seed (no-till) cropping systems study conducted on the Ron Jirava farm 5 miles west of Ritzville, Washington, was the same for continuous spring wheat (no crop rotation), spring wheat after spring barley, and 1st- or 2nd-year spring wheat after consecutive crops of safflower and yellow mustard. A similar percentage area of patches occurred in plots sown to spring barley after spring wheat and in the safflower and mustard. Greenhouse studies confirmed that safflower and mustard, as well as several other broadleaf crops, are susceptible to the root rot caused by this same pathogen. Between years 3 and 4, some patches increased in size, some new patches formed, and a few patches present in year 3 were absent in year 4. A one-time application of zinc at 1.0 lb/ac at planting produced no visual response in the stunted plants where the application passed through one side of a patch. The effect of crop rotation on grain yield of spring wheat was related to water supply, with lower yield after the broadleaf crops because they extract more water (leaving less water for the next crop) than either wheat or barley. Considering that these broadleaf crops provide no apparent benefit for Rhizoctonia root rot control, and that they leave less soil water available for the ensuing one or two cereal crops, growers in low-precipitation areas of the inland PNW are probably better off planting continuous cereals.

Introduction

Dryland (rainfed) cropping in the low-precipitation (less than 12 inches annual) region of the U.S. inland Pacific Northwest (PNW) is mostly a tillage-based winter wheat-summer fallow system in which only one crop (winter wheat) is produced every 2 years. This dry region covers 3.5 million cropland acres in eastern Washington and adjacent north-central Oregon. The main purpose of the 13-month fallow is to store in the soil a portion of the over-winter precipitation to allow successful establishment of winter wheat, help ensure economic grain yields, and reduce the risk of crop failure from drought. However, summer fallow in this region is only about 30% efficient (Leggett et al. 1974), i.e., about 70% of the average annual precipitation received during the fallow year is lost to runoff and evaporation before the winter wheat is planted. Moreover, intensive tillage during the fallow cycle frequently buries surface residue, pulverizes soil clods, and reduces surface roughness. Blowing dust from excessively tilled fallow fields leads to major soil losses and causes concerns for air quality.
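The storage-efficiency figures above reduce to simple arithmetic, sketched below. The 30% efficiency is from Leggett et al. (1974); applying it to the site's 11.5-inch average annual precipitation (reported later in this study) is an illustrative assumption, not a measurement from this experiment.

```python
def fallow_storage(precip_in, efficiency=0.30):
    """Inches of precipitation retained in the soil over a fallow year,
    given a storage efficiency (~30% per Leggett et al. 1974)."""
    return precip_in * efficiency

stored = fallow_storage(11.5)   # inches retained in the soil profile
lost = 11.5 - stored            # inches lost to runoff and evaporation
print(round(stored, 2), round(lost, 2))  # 3.45 8.05
```

Roughly 3.5 inches stored against 8 inches lost illustrates why fallow efficiency, not total precipitation, limits the wheat-fallow system.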

Although precipitation in the region has historically been considered too low for annual cropping, a 3-year winter wheat-spring cereal-fallow rotation is practiced on about 10% of the cropland. Schillinger et al. (1999) reported that both direct-seeded (no-till) and conventionally-seeded spring cereals can be successfully produced in this low-precipitation area when over-winter water recharge occurs to a soil depth of 3 feet or more and 5 inches of plant-available water are stored in the soil. Although continuous annual cropping is practiced on less than 1% of the land, there is interest by both growers and researchers in long-term annual cropping using direct seeding.

Replacement of the winter wheat-fallow system with continuous spring cropping eliminates several important diseases of wheat in the PNW, namely the snow molds, Cephalosporium stripe, and Pseudocercosporella foot rot, all of which occur on winter wheat but not spring cereals. In contrast, Rhizoctonia root rot caused by Rhizoctonia solani AG8 is rare or nonexistent on wheat following a 1-year break to fallow but occurs with continuous cereals, especially if volunteer cereals and grass weeds are allowed to grow within the stubble between crops (Smiley et al. 1992), and is exacerbated by direct seeding (Pumphrey et al. 1987; Weller et al. 1986). When severe, the disease occurs as patches of stunted plants and is referred to in the early Australian literature as “bare patch.” Because of its wide host range, rotation of cereals with a broadleaf crop has not controlled this disease. Thus, Rovira and Venn (1985) reported that lupines used in a 2-year rotation with wheat and direct seeding in South Australia controlled take-all caused by Gaeumannomyces graminis var. tritici but not Rhizoctonia root rot caused by R. solani AG8.

As part of a larger long-term study of the potential for continuous direct-seed spring cropping in the low-precipitation area of eastern Washington, we compared the incidence and severity of Rhizoctonia root rot in response to 4 years of (i) continuous wheat, (ii) a 2-year wheat-barley rotation, and (iii) a 4-year safflower-yellow mustard-wheat-wheat rotation. While primarily intended to evaluate the agronomic and economic potential of these three spring cropping systems, the 4-year rotation was designed specifically to test the effects of back-to-back broadleaf crops on the epidemiology of Rhizoctonia root rot. Thongbai et al. (1993a,b) reported that zinc applications to South Australian soils deficient in zinc provided some control of Rhizoctonia root rot. We, therefore, also tested the effects of zinc application in a year when patches of stunted plants caused by this disease were particularly severe.

Materials and Methods

Treatments and field layout

A study of direct-seed annual spring-cropping systems was established starting in 1997 on the Ron Jirava farm near Ritzville, Adams County, WA. The spring cropping systems were: (i) a 4-year safflower-yellow mustard-wheat-wheat rotation; (ii) a 2-year wheat-barley rotation and; (iii) continuous wheat. The spring wheat was soft white ‘Alpowa’ and the barley ‘Baronesse’ all four years. The experiment covered 20 acres and the soil is a Ritzville silt loam. The soil is more than 6 feet deep and the slope is less than 1%. The field where the experiment was established had been direct seeded to spring wheat in 1996 following decades of the traditional winter wheat-fallow system. Average annual precipitation at the site is 11.5 inches.

The experimental design was a randomized complete block with four replications. Each crop in each rotation occurred each year in 60 X 500 ft plots, making a total of 28 plots. During the first 3 years (1997, 1998, and 1999), all plots were planted and fertilized in one pass directly into the undisturbed soil and residue left by the previous crop using Jirava’s Flexi-Coil 6000 (Flexi-Coil Ltd., Saskatoon, Sask.) airseeder equipped with Barton II™ dual-disk openers on 7.5-inch spacing for simultaneous and precision placement of seed and fertilizer in the same row. In 2000, all plots were planted and fertilized in one pass using a custom-built drill equipped with Cross-slot™ (Baker Mfg., Christchurch, NZ) notched-coulter openers on 8-inch spacing for simultaneous and precision placement of seed and fertilizer in the same row. Both openers are low-disturbance and place fertilizer beneath and slightly to one side of the seed. Glyphosate (Round-up) was applied 2-4 weeks before planting in the spring at a rate of 16 oz./acre to control weeds and limit the build-up of inoculum of R. solani AG8 on living hosts between harvest and planting (Roget et al. 1987; Smiley et al. 1992). Seeding rate averaged across years was 70, 70, 20, and 8 lb/ac for wheat, barley, safflower, and mustard, respectively. Solution 32 provided the base for liquid fertilizer to supply an average of 36 lb N, 10 lb P, and 15 lb S per acre. The quantity of available soil moisture and residual N, P, and S was measured in all rotations each spring to determine fertilizer needs based on a yield goal. Between the tillering and stem elongation phases of growth of wheat and barley, in-crop broadleaf weeds were controlled with 2,4-D LV-6 at 12 oz./ac plus 0.3 oz./ac Harmony Extra. In-crop herbicides were not used in safflower or mustard plots as no legally-labeled broadleaf weed herbicides were available for these crops in Washington.

All plots were harvested with the grower’s combine, and grain yield was determined on site by augering grain into a truck mounted on weigh pads. When post-harvest broadleaf weeds were present in cereals (1999 only) and broadleaf crops (all years), Surefire herbicide (paraquat + diuron) was applied at 24 oz./ac 7-10 days after harvest to prevent seed production and halt soil water use by these weeds.

Quantification of Rhizoctonia root rot and area as patches

In 1998, before the appearance of patches of stunted plants, the number and percentage of seminal and crown roots with this disease were determined for 100 plants in the stem elongation stage of growth, dug on 5 June (Feekes 5) from each plot and transported under refrigeration to the laboratory for processing. The plants from each plot represented a larger bulked sample, in excess of 100 plants, selected from intermittent locations across the entire length and breadth of each plot. The roots and adhering soil were dug at least 15 cm deep as clusters of four to five plants per sample. The plants when freshly dug were shaken gently to dislodge some of the soil and then placed in a large new plastic bag. In the laboratory, the roots were washed free of soil, and the total number of seminal and crown roots and the number with Rhizoctonia root rot were counted while floating the roots in a thin layer of water against a white background. The 100 plants used for these assessments were selected randomly from the bulked samples.

Patches appeared in 1999 and 2000. A global positioning system (GPS) equipped with mapping software was used to determine the location, size, and area of these patches. These measurements were obtained in all wheat and barley plots in 1999 and 2000, and in safflower and mustard plots in 2000, by circling each clearly visible Rhizoctonia patch with the backpack-mounted GPS mapping unit in early June. The GPS unit had an accuracy of less than 3 feet.
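The mapping software computes the area enclosed by each circled patch boundary. A minimal sketch of the underlying computation is the shoelace formula, assuming the GPS coordinates have already been projected to a planar system in feet (the actual software used is not specified here):

```python
def polygon_area(points):
    """Shoelace formula: area of a simple polygon from (x, y) vertices
    listed in order around the boundary."""
    n = len(points)
    total = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the polygon
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

# Hypothetical 20 ft x 15 ft rectangular patch boundary -> 300 sq ft
patch = [(0, 0), (20, 0), (20, 15), (0, 15)]
print(polygon_area(patch))  # 300.0
```

Summing such areas over all patches in a plot, divided by the 60 x 500 ft plot area, gives the percentage plot area in patches reported in Table 2.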

Tests for susceptibility of plant species to Rhizoctonia solani AG8

Greenhouse studies were conducted to determine the relative susceptibility of several broadleaf plant species to R. solani AG8. The plant species were canola, mustard, safflower, pea, lentil and chickpea, all potential rotation crops with wheat in the PNW. A virulent isolate [C-1; (Weller et al. 1986)] of the pathogen obtained originally from barley and grown on sterilized oat grains was used as the source of inoculum. Plastic containers were filled two-thirds full with a local silt loam soil that had been pasteurized with moist heat at 140°F for 30 minutes and then air-dried. Two Rhizoctonia-infested oat grains were placed on the soil surface and covered with a 1-inch-thick layer of the same pasteurized soil. The soil was then moistened with water applied from the top and incubated for 1 week; this allowed time for the pathogen to colonize the pasteurized soil from the oat-grain food base. Seeds of the different crop plant species were then planted one seed per container for large-seeded plants and two seeds per container for small-seeded crops, and incubated on a greenhouse bench at 60°F with natural light supplemented with overhead mercury-halide lights. Pasteurized soil of the same quantity and incubated in the same fashion but without oat-grain inoculum of the pathogen served as checks. A total of seven containers with and seven without Rhizoctonia inoculum were used per plant species. The plants were watered as necessary and allowed to grow for 3-4 weeks, at which time the amount of disease was assessed based on appearance of the tops (plant stunting) and the roots (girdled or severed by Rhizoctonia lesions). The tests were done three times.

Evaluation of the effects of zinc application on the development of Rhizoctonia patches

An application of zinc was superimposed on the continuous spring wheat treatment in 2000 (year 4) to determine whether the patches caused by Rhizoctonia root rot could be abrogated by this plant nutrient. Two composite soil samples were collected to a depth of 6 inches from within each of the four replications of the continuous-wheat treatment in the spring just prior to planting. One sample was taken from within a 1999 Rhizoctonia-incited patch and one was taken from a nearby 1999 apparently healthy area. The soil samples were air-dried for 10 days and then submitted to the Central Analytical Laboratory at Oregon State University. The samples were analyzed for pH, phosphorus, potassium, calcium, magnesium, copper, manganese, zinc, sulfate-sulfur, and total nitrogen.

At planting, one pass was made the full length of each 500-ft-long plot with the 8-ft-wide Cross-slot drill equipped to apply zinc at 1.0 lb/ac as zinc-chelate mixed with the liquid fertilizer. All other drill passes were made using the same drill and same fertilizer rate but without the zinc. Plant samples were taken when wheat was in the stem elongation phase (Feekes 7) from areas within each of the four replicate plots from (i) healthy plants where no zinc was applied, (ii) healthy plants where zinc was applied, and (iii) within patches (stunted plants) where zinc was applied. These samples were oven-dried prior to analysis for N, P, K, S, Ca, Mg, Mn, Cu, B, and Zn.

Analysis of data

Grain yield, percentage plot area in patches of stunted plants (caused by Rhizoctonia root rot), and nutrient concentration data were subjected to analysis of variance, and means were compared using least significant differences (LSD) when the F test indicated significance at P < 0.05. Root infection data were compared using standard deviations from means for percentage infection.
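The LSD comparison above can be sketched as follows. The formula assumes equal replication (r = 4 blocks here); the MSE value and error degrees of freedom below are illustrative only and would come from the actual ANOVA, not from this study's tables.

```python
import math
from scipy.stats import t  # Student's t distribution

def lsd(mse, reps, df_error, alpha=0.05):
    """Least significant difference for comparing two treatment means
    in a balanced design: t(1 - alpha/2, df_error) * sqrt(2*MSE/r)."""
    t_crit = t.ppf(1 - alpha / 2, df_error)
    return t_crit * math.sqrt(2 * mse / reps)

# Illustrative values: MSE = 4.0 (bu/ac)^2, 4 replications, 12 error df
d = lsd(4.0, 4, 12)
# Two treatment means differing by more than d bu/ac are declared
# significantly different at P < 0.05.
```

The protected-LSD convention used in the paper applies this comparison only after a significant overall F test, which guards against excessive pairwise comparisons.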

Results

Incidence and severity of Rhizoctonia root rot and patches

A low amount of Rhizoctonia root rot occurred on both seminal and crown roots of both the wheat and barley in year 2 of the study (Table 1). The percentage of roots with lesions typical of Rhizoctonia root rot was three to four times higher for seminal roots than for crown roots. Both the lowest and the highest percentages of seminal and crown roots with Rhizoctonia root rot in 1998, significantly different at P < 0.05, occurred in treatments where wheat had been grown back-to-back in those first 2 years of the study, one dedicated to continuous wheat and the other prior to planting broadleaf crops in years 3 and 4 of the study.

Patches of plants stunted because of Rhizoctonia root rot were apparent in all wheat and barley plots starting in year 3 (1999), and the area in patches was even greater in year 4 (2000) of the study (Table 2; Figures 1 and 2). Poor stands in both the mustard and safflower in 1999 made it unreliable to quantify the patches for those crops that year. When all data on percentage area of patches for a given year were analyzed together, there was no significant difference in area of patches among any of the rotations, including in 2000 with 1st year wheat after two consecutive broadleaf crops (Table 2). On the other hand, when the area of patches was analyzed for wheat plots only, the 4.5% area as patches in 1st year wheat after the two consecutive broadleaf crops was significantly less (P < 0.1) than the 11.9% area as patches in the continuous spring wheat plots (Table 2).

An overlay of patched areas in 2000 with those in 1999 showed that some new patches had appeared and others had disappeared in 2000 compared with 1999 (Figures 1 and 2). Averaged across treatments, 73% of the land area with patches in 2000 was in areas that were in patches in 1999; the remaining 27% of 2000 patches were in areas not stunted in 1999 (data not shown). Similarly, 25% of the total land area with patches in 1999 showed no symptoms of Rhizoctonia root rot in 2000. An example of these patterns is shown in Figure 2.
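The overlay bookkeeping above can be expressed with set operations. This sketch treats each year's patches as sets of mapped grid cells; the cell identifiers and counts are hypothetical (the actual analysis overlaid GPS polygon areas), chosen only to show how the three percentages are derived.

```python
# Hypothetical grid cells falling inside mapped patches each year
patches_1999 = {"a", "b", "c", "d"}
patches_2000 = {"b", "c", "d", "e", "f"}

persisted = patches_1999 & patches_2000   # patch area common to both years
new_2000 = patches_2000 - patches_1999    # patch area that appeared in 2000
recovered = patches_1999 - patches_2000   # 1999 patch area healthy in 2000

pct_persisted = 100 * len(persisted) / len(patches_2000)   # share of 2000 patches
pct_new = 100 * len(new_2000) / len(patches_2000)          # share of 2000 patches
pct_recovered = 100 * len(recovered) / len(patches_1999)   # share of 1999 patches
print(pct_persisted, pct_new, pct_recovered)  # 60.0 40.0 25.0
```

Note the denominators differ: the persisted and new fractions are relative to the 2000 patch area, while the recovered fraction is relative to the 1999 patch area, matching how the 73%, 27%, and 25% figures are stated above.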

Table 1. Incidence of Rhizoctonia root rot (caused by Rhizoctonia solani AG8) on spring wheat and spring barley in the second year (1998) of different planned crop sequences.


Table 2. Percentage plot area with patches of stunted plants caused by Rhizoctonia solani AG8 in the different cropping systems and years.

Fig. 1. Distribution of Rhizoctonia-incited patches in 1999 (A) and 2000 (B) as affected by cropping system and based on documentation by global positioning system (GPS). Yellow = patches in 1999; Blue = patches common in both 1999 and 2000; Red = new patches in 2000. Note: Rhizoctonia patches were not measured in treatments 1 and 2 in 1999.
Fig. 2. Magnified view of a portion of the experiment area showing Rhizoctonia-incited patches in 1999 only (yellow), patches common in both 1999 and 2000 (blue), and new patches in 2000 (red).

Susceptibility of plant species to R. solani AG8 in the greenhouse
All broadleaf plant species exposed to inoculum of R. solani AG8 were stunted relative to the check plants grown in pasteurized soil without this pathogen. Inspection of the roots revealed them to be girdled and severed by the lesions diagnostic of this disease on wheat and barley.

Influence of zinc applications on Rhizoctonia root rot
Zinc applications had no visible effect on Rhizoctonia root rot as indicated by no response in any of the areas where the application passed through one side of a patch. Further, the soil test for zinc indicated that wheat was unlikely to respond to a zinc application, and an adequate supply of zinc for wheat was confirmed based on the concentrations in plant tissue (Table 3) and the absence of any visual response of the plants to the zinc application outside the patches.

Laboratory analyses revealed only minor chemical differences between soil samples collected within and outside of Rhizoctonia-incited patches (Table 3). Soil pH and extractable potassium were both lower, and manganese was higher, in areas where patches were observed in 1999, but despite statistical significance, these differences were not agronomically significant. Nutrient concentrations in the plant tissues were all within the sufficiency range reported for small grains (Westfall et al. 1990). Tissue concentrations of zinc in response to zinc applications were not higher in healthy plants but were higher, along with manganese and boron, in diseased plants (Table 3).

Table 3. Soil pH and plant nutrient concentrations in soil (A) before application of zinc and in plants (B) after application of zinc with the nitrogen fertilizer solution across areas with healthy plants and plants in patches caused by Rhizoctonia solani AG8.


Grain yields during two years with severe Rhizoctonia root rot
Wheat yields were 49-58% higher, and barley yields 71% higher, in year 4 (2000) than in year 3 (1999) of the 4-year study (Table 4). There was about 1.5 inches less water available in the 6-ft soil profile at the time of planting wheat where broadleaf crops had been grown in years 1 and 2 of the study (data not shown).

Table 4. Spring crop yields in three rotations in Adams County, Washington: a 4-year safflower-mustard-wheat-wheat rotation; a 2-year wheat-barley rotation; and continuous wheat. All crops were planted with a Flexi-Coil 6000 drill in 1997, 1998, and 1999, and a Cross-slot drill in 2000.


Discussion

Rhizoctonia root rot occurred at a low but uniform level on both wheat and barley in the 2nd year of the cropping sequence, occurred as patches starting in the 3rd year, and produced the largest area of patches in the 4th year in all three cropping systems. The larger percentage area of patches in year 4 than in year 3 could have resulted from any one or a combination of at least three factors: (i) the higher precipitation in 2000 compared with 1999 could have provided more favorable conditions for the disease; (ii) the use of a lower-disturbance drill in year 4 compared to years 1-3 could have favored the disease (MacNish, 1985); and/or (iii) the pattern reflects a natural and progressive increase in disease with years of annual direct-seed cropping.

Roget (1995), working in South Australia, similarly reported that for direct-seed continuous wheat and for wheat alternated in 2-year rotations with volunteer pasture, peas, or medic, all direct seeded, the severity of Rhizoctonia root rot based on a root rating increased during years 1 to 3 of the study, and the percentage area of patches caused by this disease increased during years 1 to 4, regardless of the crop rotation. Interestingly, the severity rating and percentage area of patches declined progressively in all four cropping systems tested by Roget (1995) after years 3 and 4, respectively, confirming the report of Lucas et al. (1993) from Oregon of a decline in Rhizoctonia root rot with continuous cropping of wheat. It is still too early in the cropping sequence to expect Rhizoctonia root rot decline in the Jirava experiment.

The size, shape and distribution of patches mapped by GPS in our study are remarkably similar to the size, shape, and distribution of patches caused by Rhizoctonia root rot in cereals and mapped by MacNish (1985) in Western Australia. MacNish (1985) also observed that some years favored more or larger patches than other years, and that with years, some patches expanded, some remained the same in size and shape for at least two seasons, some disappeared completely and new patches appeared. He proposed that a changing balance of conducive and suppressive factors over space and time, interacting with nonrandom distribution of primary inoculum, can explain the patterns observed where these patches are mapped in the same field year after year.

The wide host range of R. solani AG8 has been well documented (Rovira, 1996; Rovira and Venn, 1985). Nevertheless, different kinds of crops have diverse effects on the soil environment, they have tap versus fibrous root systems, and they produce various amounts of crop residue or the residue decomposes at different rates when left on the soil surface in no-till systems. Depending on the extent of these differences, the amount of disease in this low-precipitation area could also differ, at least between cropping systems as dissimilar as our 4-year rotation and continuous-wheat system. The 4-year rotation in the Jirava study was designed to augment any benefit of broadleaf crops for control of root disease by including two broadleaf crops back-to-back before returning to wheat. Previous studies on rotational effects of broadleaf crops have been limited to a single broadleaf crop as a break crop before wheat (Roget, 1995; Rovira and Venn, 1985). In spite of the differences in crops and rotations, the incidence and severity of Rhizoctonia root rot was similar if not the same on wheat whether the cropping system was continuous wheat, a 2-year barley-wheat rotation, or a 4-year safflower-mustard-wheat-wheat rotation.

Smiley et al. (1996) concluded that differences in severity of Rhizoctonia root rot of winter wheat in 2-year wheat-fallow and wheat-pea rotations at Pendleton, OR (16 inches annual precipitation) could be explained by an inverse correlation between microbial biomass and severity of Rhizoctonia root rot, with higher microbial biomass and less Rhizoctonia root rot (presumably due to greater disease suppression) in the wheat-pea than in the wheat-fallow rotation. Microbial biomass at the site where our work was done, although not measured, would be low in all treatments, in spite of the land being cropped every year, because of the low amounts of crop biomass produced under the dry conditions of this site. This might account for the highly conducive nature of the site to this disease.

It is highly unlikely that the severe patches caused by R. solani AG8 in the first wheat crop following two consecutive broadleaf crops were due to survival of the pathogen in old wheat residue over the 2 years since wheat was last grown in these plots. Such survival is possible with the pathogens responsible for take-all, Cephalosporium stripe, and Pseudocercosporella foot rot, all of which establish in wheat straw or stem bases, parts of the wheat plant that are slow to decompose, especially under very dry conditions. Rhizoctonia solani AG8, on the other hand, is limited almost exclusively to root tissues, which are relatively quick to decompose once the plant is dead. Even a short plant-free period, such as waiting an extra week between spraying and seeding, can greatly reduce the severity of this Rhizoctonia root rot (Roget et al. 1987; Smiley et al. 1992). It seems more likely that, since all crops in these systems are hosts and since the period of time from planting (mid March to early April) to harvesting (August for mustard, barley, and wheat and September for safflower) was approximately the same, the amount of primary inoculum for production of Rhizoctonia root rot in the next crop was also approximately the same.

There was no visual plant response to zinc applied at 1.0 lb/ac below the seed at planting, including where the application passed through just one side of the patches. We expected no plant growth response in the apparently healthy areas, since the zinc concentration in the soil was within the range considered adequate for wheat growth, but did expect a response to zinc within the patches if this nutrient has any potential for an ameliorating effect on this disease. Since this study was done only once, on a site with no apparent soil deficiency in zinc, it is possible that the same test on a zinc-deficient site would show evidence of mitigation of Rhizoctonia root rot. Conceivably some as yet unidentified factor but not zinc deficiency is predisposing crops to severe Rhizoctonia root rot at this site.

Not surprisingly, since the amount of Rhizoctonia root rot on wheat and barley was generally the same in all cropping systems, the yields of these crops also were generally the same. There were differences in soil water content, especially in year 4 where plots planted to safflower and mustard the previous 2 years had about 1.5 inches less water available for wheat compared to continuous cereals, and this was reflected in an average of 10% lower yields of wheat in the 4-year rotation than in either the 2-year barley-wheat rotation or continuous wheat (Table 4). Safflower and mustard are among the few broadleaf crops available for use in crop rotations in this low-precipitation area.

Spring wheat and spring barley grain yields, in the range of 40-45 bu/ac averaged over 4 years of continuous cereals, were about 65% of the grain yield achieved by farmers in neighboring fields during the study period growing one crop every other year in the 2-year winter wheat-fallow rotation. Because the spring cereals are produced every year, total production over each 2-year period would be 130% of the yield of winter wheat alternated with fallow. Grain yield in years 3 and 4, when Rhizoctonia root rot was most severe, could have been 5-10% higher without the Rhizoctonia-incited patches. These yields were produced using just four or five field operations per year: a pre-plant application of glyphosate herbicide, one pass to plant and fertilize, an in-crop broadleaf herbicide application, harvest with a combine, and a post-harvest herbicide application (1999 only for cereals). The winter wheat-fallow system, on the other hand, typically uses eight or more tillage operations to prepare a seedbed, control weeds, fertilize and plant, in addition to in-crop herbicide application and harvest. Excessively-tilled fallow is a major source of dust in the region and also leaves the land vulnerable to water erosion when soils are frozen in the winter months. In contrast, direct-seeded cropland is protected from erosion by the standing stubble and other undisturbed residue of the previous cereal crop.
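The two-year production comparison above is simple arithmetic, sketched here. The 65% per-crop yield ratio comes from the text; the absolute winter wheat yield used below is a hypothetical placeholder, since the ratio is what matters.

```python
winter_wheat_yield = 65.0  # bu/ac per crop; hypothetical neighbor-field value
spring_cereal_yield = 0.65 * winter_wheat_yield  # ~65% of winter wheat (per text)

# Production per 2-year cycle:
ww_fallow_total = winter_wheat_yield     # one crop every 2 years
annual_crop_total = 2 * spring_cereal_yield  # a crop every year

ratio = annual_crop_total / ww_fallow_total
print(round(ratio, 2))  # 1.3 -> 130% of the wheat-fallow system
```

Because the ratio cancels the absolute yield, any assumed winter wheat yield gives the same 130% result: 2 x 0.65 = 1.3.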

There are reports that other deep-rooted broadleaf crops reduce disease pressure and enhance grain yield of the subsequent wheat crop (Angus et al. 1991). However, considering that these broadleaf crops provide no apparent benefit for Rhizoctonia root disease control and leave less soil water available for the ensuing one or two cereal crops, growers in low-precipitation areas of the inland PNW are probably better off planting continuous cereals.

Acknowledgements

We thank Steve Schofstoll, WSU Technical Assistant, and Ron Sloot, WSU Agricultural Research Technician, for their excellent support with these experiments.

Literature Cited

  1. Angus, J.F., van Herwaarden A.F., and Howe, G.N. 1991. Productivity and break crop effects of winter-grown oilseeds. Australian J. Exp. Agric. 31:669-677.
  2. Beal, R.E., Phillion, D.P., Headrick, J.M., O’Reilly, P., and Cox, J. 1998. MON65500: A unique fungicide for control of take-all in wheat. Pages 343-350 in Proceedings: The 1998 Brighton Conference - Pests and Diseases.
  3. Cook, R. J. and Veseth, R.J. 1991. Wheat Health Management. APS Press, St. Paul, MN. 151 pp.
  4. Large, E.C. 1954. Growth stages in cereals. Plant Path. 3:128-129.
  5. Leggett, G.E., Ramig, R.E., Johnson, L.C., and Massee, T.W. 1974. Summer fallow in the Northwest. p. 110-135. In Summer fallow in the western United States. USDA-ARS Conserv. Res. Rep. no. 17. U.S. Gov. Print. Office, Washington, DC.
  6. Lucas, P., Smiley, W. and Collins, H.P. 1993. Decline of Rhizoctonia root rot on wheat in soils infested with Rhizoctonia solani AG-8. Phytopathology 83:260-263.
  7. MacNish, G.C. 1985. Mapping rhizoctonia patch in consecutive cereal crops in Western Australia. Plant Pathology 34:165-174.
  8. MacNish, G.C. 1985. Methods of reducing rhizoctonia patch of cereals in Western Australia. Plant Pathology 34:175-181.
  9. Pumphrey, F.V., Wilkins, D.E., Hane, D.C., and Smiley, R.W. 1987. Influence of tillage and nitrogen fertilizer on Rhizoctonia root rot (bare patch) of winter wheat. Plant Dis. 71:125-127.
  10. Ramsey, N.E., Cook, R.J., and Halsey, M.E. 2000. Prevalence of wheat take-all in the Pacific Northwest. Phytopathology 90:S63 (abstract).
  11. Roget, D.K. 1995. Decline in root rot (Rhizoctonia solani AG-8) in wheat in a tillage and rotation experiment at Avon, South Australia. Aust. J. Exp. Agric. 35:1009-1013.
  12. Roget, D.K., Venn, N.R. and Rovira, A.D. 1987. Reduction in rhizoctonia root rot of direct drilled wheat by short-term chemical fallow. Aust. J. Exp. Agric. 27:425-430.
  13. Rovira, A.D. 1986. Influence of crop rotation and tillage on Rhizoctonia bare patch of wheat. Phytopathology 76:669-673.
  14. Rovira, A.D. and Venn, N.R. 1985. Effect of rotation and tillage on take-all and Rhizoctonia root rot of wheat. In Ecology and Management of Soilborne Plant Diseases. Eds C.A. Parker, A.D. Rovira, K.J. Moore, P.T. Wong and J.F. Kollmorgen. pp. 255-258. American Phytopathological Society: St. Paul, MN, USA.
  15. Schillinger, W.F., Cook, R.J., and Papendick, R.I. 1999. Increased dryland cropping intensity with no-till barley. Agron. J. 91:744-752.
  16. Smiley, R.W., Ogg, Jr., A.G. and Cook, R.J. 1992. Influence of glyphosate on severity of Rhizoctonia root rot and growth and yield of barley. Plant Dis. 76: 937-942.
  17. Smiley, R.W., Collins, H.P., and Rasmussen, P.E. 1996. Diseases of wheat in long-term agronomic experiments at Pendleton, OR. Plant Dis. 80:813-820.
  18. Thongbai, P., Hannam, R.J., Graham, R.D. and Webb, M.J. 1993a. Interaction between zinc nutritional status of cereals and Rhizoctonia root rot severity. I. Field observations. Plant and Soil 153:207-214
  19. Thongbai, P., Graham, R.D., Neate, S.M. and Webb, M.J. 1993b. Interaction between zinc nutritional status of cereals and Rhizoctonia root rot severity. II. Effect of Zn on disease severity of wheat under controlled conditions. Plant and Soil 153:215-222.
  20. Weller, D.M., R.J. Cook, G. MacNish, E.N. Bassett, R.L. Powelson, R.R. Petersen. 1986. Rhizoctonia bare patch of small grains favored by reduced tillage in the Pacific Northwest. Plant Dis. 70:70-73.
  21. Westfall, D.G., Whitney, D.A., and Brandon, D.M. 1990. Plant analysis as an aid in fertilizing small grains. p. 495-515. In R.L. Westerman (ed.) Soil Testing and Plant Analysis, Third Edition. Soil Science Society of America, Madison, WI.

Pacific Northwest Conservation Tillage Handbook Series publications are jointly produced by University of Idaho Cooperative Extension System, Oregon State University Extension Service and Washington State University Cooperative Extension. Similar crops, climate, and topography create a natural geographic unit that crosses state lines in this region. Joint writing, editing, and production prevent duplication of effort, broaden the availability of faculty, and substantially reduce costs for the participating states.

Cooperative Extension programs and policies comply with federal and state laws and regulations on nondiscrimination regarding race, color, gender, national origin, religion, age, disability, and sexual orientation. The University of Idaho Cooperative Extension System, Oregon State University Extension Service and Washington State University Cooperative Extension are Equal Opportunity Employers.