Thursday, 13 July 2006
80-3

Rhizoctonia Control through Management of Disease Suppressive Activity in Soils.

David K. Roget, CSIRO Sustainable Ecosystems, Glen Osmond SA 5064, Australia and Gupta V.S.R. Vadakattu, CSIRO Entomology, PMB No. 2, Glen Osmond SA 5064, Australia.

Rhizoctonia solani Kühn AG-8 causes bare patch disease in a number of agricultural crops, including wheat, in southern Australia. The phenomenon of disease suppressiveness to soilborne plant pathogens has been reported to occur worldwide. The level of disease suppressive activity in soils against fungal diseases such as rhizoctonia bare patch is a function of the population, activity and composition of the soil microbial community. All soils have an inherent level of suppressive activity, but this level can be significantly modified by the management practices used within a farming system. In a long-term farming system trial at Avon in South Australia, disease suppression increased from a low to a high level over a period of 5-10 years following a change in management practices to intensive cropping, full stubble retention, limited grazing and higher nutrient inputs to meet crop demand (Roget 1995). The increase in suppression provided complete control of the soilborne diseases rhizoctonia bare patch and take-all. Soils with high levels of disease suppression have also been identified on commercial farms across South Australia and Victoria. The management factors consistently associated with improved disease suppression included intensive cropping, stubble retention, limited grazing, limited or no cultivation and above-average yields (high water use efficiency). These management practices increase biologically available carbon inputs and, over time, change the composition and activity of the soil microbial community (Gupta and Neate 1999). These changes result in greater competition for soil resources which, along with predation and inhibition of pathogens, leads to increased disease suppression.

The aim of this paper is to report on (i) the dynamics of rhizoctonia disease incidence at the long-term farming system experimental site and (ii) the levels of disease suppressive activity as influenced by management practices, including herbicide spraying, crop removal and increased inorganic N levels in the surface soil. The long-term farming system trial started in 1978 and is located at Avon, South Australia, on a calcic Xerosol (pH 8.4) that receives an annual rainfall of 330 mm. By the late 1990s, herbicide-resistant grass weeds had become such a problem that the plots were sprayed out at anthesis during the 2000 to 2002 cropping seasons to limit seed set. Crops were harvested as normal during the 2003 and 2004 seasons. Each year, soil samples were collected during the off-season and analysed for disease suppressive activity using a laboratory-based bioassay (Roget et al. 1999) and for mineral N levels in the soil profile prior to sowing the next crop. Disease incidence in the cereal crop following anthesis spraying was monitored. Pot-based bioassays using surface (0-10 cm) soil from the field experiment were also conducted to determine the effect of mineral N levels on disease suppressive activity.

Following the first year of anthesis spraying, a very low level of rhizoctonia (disease rating <1 on a 1 to 5 scale) was observed on the crop roots; this was the first measurable level of disease in the trial in over 10 years. Following the second year of spraying, small patches of rhizoctonia damage were observed. After the third year of spraying, the following crop (2003) was severely affected by both rhizoctonia (disease rating 3) and take-all. Results from pot bioassays using samples collected pre-sowing indicated a loss of suppressive activity.
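The abstract does not define how the bioassay root disease ratings are converted into a measure of suppressive activity. As a minimal illustrative sketch only (not the assay of Roget et al. 1999), a relative suppression index could be derived by comparing ratings from a test soil against a conducive control; the function, scale interpretation and example values below are assumptions for illustration.

import statistics

def suppression_index(test_ratings, conducive_ratings):
    """Hypothetical relative suppression index from bioassay root disease ratings.

    Ratings follow the 1-5 style scale referred to in the abstract
    (higher values = more severe rhizoctonia root damage). The index is
    0 when the test soil is as diseased as the conducive control and
    approaches 1 as disease is fully suppressed.
    """
    test_mean = statistics.mean(test_ratings)
    control_mean = statistics.mean(conducive_ratings)
    if control_mean <= 0:
        raise ValueError("conducive control must show measurable disease")
    return max(0.0, 1.0 - test_mean / control_mean)

# Illustrative values only, not data from the Avon trial:
# a suppressive soil vs. the same soil after suppression has broken down.
print(suppression_index([0.5, 1.0, 0.5], [3.5, 4.0, 4.5]))  # high suppression (~0.83)
print(suppression_index([3.0, 3.5, 2.5], [3.5, 4.0, 4.5]))  # low suppression (~0.25)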
In 2003, the level of mineral N in the topsoil at sowing had reached 70 kg/ha, but it declined to normal levels (10-15 kg/ha) after crops were harvested in 2003 and 2004. A significant reduction in disease incidence was observed in the 2004 and 2005 crops following the resumption of N export through crop harvest. Throughout this period the DNA level of R. solani (AG-8) in the soil remained at ~60 pg/g. These observations are supported by results from pot bioassays in which the addition of inorganic N along with a simple carbon source (sucrose) reduced the effectiveness of disease suppression compared with the C substrate alone. These results suggest that full suppressive activity was restored following the re-establishment of C and N turnover processes that prevent the accumulation of high levels of mineral N during non-crop periods (i.e. summer and early autumn) through crop harvesting, export of N and retention of cereal crop residues. It appears that the underlying suppressive activity is not lost, but is not expressed effectively in the presence of high mineral N levels.

References:
i. Gupta, V.V.S.R. and Neate, S.M. (1999) Root disease incidence: a simple phenomenon or a product of diverse microbial/biological interactions? In: Proceedings of the 1st Australasian Soilborne Disease Symposium, R.C. Magarey (Ed.), pp. 3-4, BSES, Brisbane, Australia.
ii. Roget, D.K., Coppi, J.A., Herdina and Gupta, V.V.S.R. (1999) Assessment of suppression to Rhizoctonia solani in a range of soils across SE Australia. In: Proceedings of the 1st Australasian Soilborne Disease Symposium, Gold Coast, Qld.
iii. Roget, D.K. (1995) Decline of root rot (Rhizoctonia solani AG-8) in wheat in a tillage and rotation experiment at Avon, South Australia. Australian Journal of Experimental Agriculture 35, 1009-1013.
