Managers reduce piñon (Pinus spp.) and juniper (Juniperus spp.) trees that are encroaching on sagebrush (Artemisia spp.) communities to lower fuel loads and increase cover of desirable understory species. All plant species in these communities depend on soil water held at > −1.5 MPa matric potential in the upper 0.3 m of soil for nutrient diffusion to roots and major growth in spring (resource growth pool). We measured soil water matric potentials and temperatures using gypsum blocks and thermocouples buried at 0.01–0.3 m on tree, shrub, and interspace microsites to characterize the seasonal soil climate of 13 tree-encroached sites across the Great Basin. We also tested the effects of initial tree infilling phase and tree control treatments of prescribed fire, tree cutting, and tree shredding on time of available water and soil temperature of the resource growth pool on nine sites. Both prescribed fire and mechanical tree reduction similarly increased the time that soil water was available (matric potential > −1.5 MPa) in spring, but this increase was greatest (up to 26 d) when treatments were applied at high tree dominance. As plant cover increased with time since treatment, the additional time of available water decreased. However, even in the fourth year after treatment, available water lasted 8.6 d and 18 d longer on treatments applied at mid and high tree dominance than on untreated plots, indicating ongoing water availability to support continued increases in residual plants or annual invaders in the future. To increase resistance to invasive annual grasses, managers should either treat at low or mid tree dominance, when there is still high cover of desirable residual vegetation, or seed desirable species to use the increased resources from tree reduction. This strategy is especially critical on warmer sites, which have high climate suitability for invasive species such as cheatgrass (Bromus tectorum L.).
Piñon–juniper (Pinus spp.–Juniperus spp.) tree encroachment and subsequent infilling in former sagebrush (Artemisia spp.) communities result in loss of understory cover, an increase in woody fuel loads, and greater risk for high-severity, large-scale wildfire (Miller and Tausch 2001). Increased runoff and erosion associated with bare and water-repellent soils (Pierson et al. 2010; Urgeghe et al. 2010; Madsen et al. 2011) and dominance by annual weeds such as cheatgrass (Bromus tectorum L.; Brooks et al. 2004) may follow. Land managers reduce live tree dominance by prescribed fire and various mechanical means such as manual or hydraulic cut-and-drop or by shredding standing trees. To restore or maintain resilient ecosystems, managers should treat infilling areas well in advance of a suspected ecological threshold of tree cover (Bates et al. 2013; Roundy et al. 2014). This threshold tree cover has conceptually been considered to be an upper ratio of tree to total perennial cover beyond which fuel loads are high and understory residual plants (e.g., shrubs and perennial herbaceous plants) and seed banks are so limited that invasive annuals are much more likely than desirable perennials to dominate after fire or fuel-control disturbances (Miller et al. 2005). Infilling phases based on cover of trees relative to cover of shrubs and herbs (Miller et al. 2005) are relevant ecologically because they represent relative competitive demand for soil water and nutrients.
The annual climatic pattern in the Great Basin consists of soil water recharge in fall, winter, and spring, and short spring periods when warm soil temperatures and water availability coincide to support rapid growth (Caldwell 1985; Smith and Nowak 1990; Leffler and Ryel 2012). Growth is dependent on soil water availability at relatively shallow soil depths (< 0.3–0.5 m) in what Ryel et al. (2008) and Leffler and Ryel (2012) have identified as the resource growth pool. The resource growth pool is defined by high enough soil water matric potentials (> −1.5 MPa) to support nutrient mass flow and diffusion to roots, and root uptake of nutrients in solution. Invasive annuals such as cheatgrass are highly dependent on the shallow resource growth pool for growth and seed production (Ryel et al. 2010). Residual perennials, especially perennial grasses with root systems that deplete the soil water resource growth pool, are important for resisting dominance by annual grasses such as cheatgrass and medusahead (Taeniatherum caput-medusae [L.] Nevski) and are a major determinant of community resilience after disturbance (Booth et al. 2003; Chambers et al. 2007; Davies 2008; Chambers et al. 2013, 2014).
Selecting the type and timing of treatments that enhance the response of desirable residual species that deplete soil water growth pools may be a key consideration in reducing fuels while increasing resistance to invasive weeds (Leffler and Ryel 2012). From a resource availability standpoint, controlling trees at an advanced phase of infilling (Phase III; Miller et al. 2005) could be considered most risky for invasive weed dominance (Bates et al. 2013). At this phase there are fewer desirable residual perennial shrubs, grasses, and forbs to deplete the resource growth pool and most likely a longer period of increased resource availability following disturbance. In turn, this increases the likelihood that annual weeds on site or from nearby seed sources could dominate (Davis et al. 2000).
Resource availability after fuel control treatments should mainly be a function of decreased resource use by the life forms that are most dominant before treatment and that are most reduced after treatment. Although broadcast prescribed fire generally reduces both tree and shrub canopies, cutting and shredding methods are generally implemented in the Great Basin to maintain shrubs as an important component of wildlife habitat. Tree control methods such as cutting and shredding have been shown to affect seasonal soil temperatures, increase time of soil water availability (Bates et al. 2000; Young et al. 2013b), and maintain higher soil water contents (Bates et al. 2002) than untreated areas. In addition to decreasing resource use by trees, fuel treatment residues could affect soil fertility, temperature, and both infiltration and availability of soil water (Breshears et al. 1998; Madsen et al. 2008; Davies et al. 2009; Cline et al. 2010; Archer et al. 2011; Leffler and Ryel 2012; Young 2012; Pierson et al. 2013; Young et al. 2013b).
We measured soil water matric potentials and temperatures on untreated, prescribed fire, and mechanically treated plots at three phases of infilling across the Great Basin. Our study is unique in its regional scope, focus on phase of infilling at time of treatment, and intensity of measurement of the resource growth pool. As part of this regional study, we used seasonal soil water availability and temperature data to ordinate the sites and characterize similarities and differences. We hypothesized that, even with site differences, treating at higher phases of tree infilling would result in the greatest increase in soil water availability. We also hypothesized that prescribed fire would result in a longer period of soil water availability than mechanical methods because it reduces shrubs, as well as trees. To determine canopy and residue-related influences on soil water and temperature in relation to treatments, we evaluated these variables on tree, shrub, and interspace microsites.
Study sites included four different cover types: five western juniper (Juniperus occidentalis Hook.) sites in California and Oregon, four singleleaf piñon (Pinus monophylla Torr. & Frém.)–Utah juniper (Juniperus osteosperma [Torr.] Little) sites in central Nevada (piñon–juniper), and two Utah juniper and two Utah juniper–Colorado piñon (Pinus edulis Engelm.) sites (juniper–piñon) in Utah (McIver et al. 2010; Miller et al. 2014). Sites were selected as wooded shrublands (Romme et al. 2009) or expansion woodlands (Miller et al. 2008; McIver et al. 2010) where trees have invaded sagebrush (Artemisia spp.) communities on loam soils with native species still present in the understory across a range of tree cover (Roundy et al. 2014). Sites represent a wide range in elevation, soil, and climatic conditions, but some regional characteristics are evident. Across the Great Basin from west to east, western juniper sites represent the lowest elevation, piñon–juniper sites in central Nevada have the highest elevation, and Utah juniper sites in Utah are intermediate (Table 1). On the northwestern Great Basin sites, soils are derived from basalt lava flows and the climate is Pacific maritime, with most precipitation falling between November and June (McIver et al. 2010; Rau et al. 2011; Miller et al. 2014). The central and eastern sites include igneous-, metamorphic-, and sedimentary-based soils, which are carbonatic. The climate is more continental, with lower precipitation between November and June, and highly variable summer precipitation mainly in July and August (McIver et al. 2010; Rau et al. 2011; Miller et al. 2014).
List of piñon–juniper sites and infilling phase measured for soil water and temperature for tree treatment comparisons and for ordination of untreated plots.
Treatments were applied across the network as a randomized complete block, with each site considered a block. We placed treatment plots at each site on the same ecological site (Miller et al. 2014). Plots were fenced where necessary to exclude cattle grazing. Throughout the network at each site, three 8–20-ha treatment plots were left as an untreated control plot or received either a broadcast burn or a cut-and-drop treatment. As an additional treatment on the four Utah sites, standing trees were masticated or shredded with a rotating toothed drum mounted on a rubber-tired or track vehicle. This procedure leaves tree residues ranging in size from small chips and shreds to twigs and branches (Cline et al. 2010). Prescribed fire ignition was by aerial ignition and drip torch, with the latter used on all sites to ensure that all tree and shrub canopies were fully ignited. Because plots could not all be burned in the same year (Miller et al. 2014), treatments were applied in 2006, 2007, and 2009 in a stagger-start design (Loughin 2006; Table 1). This design avoids the potential restricted inferences associated with implementing all treatments under the same set of climatic conditions. Prescribed fire plots were burned between August and October and trees on mechanically treated plots were cut or shredded from September through November. All trees > 0.5 m tall were cut or shredded and debris was left in place on the ground. Tree canopies were reduced to < 5% in the burn plots and < 1% in the mechanically treated plots. Prescribed burning reduced shrub cover to < 5% whereas mechanical treatments had no effect on shrub cover (Roundy et al. 2014).
Soil water and temperature measurement stations were located at three phases of tree dominance by observing relative tree, shrub, and herb cover to determine dominance of life forms. Dominance of shrubs and herbs with trees present constituted Phase I, codominance of trees with shrubs and herbs constituted Phase II, and dominance of trees with limited but present shrubs and herbs constituted Phase III (Miller et al. 2005). Lack of equipment to fully instrument all sites, different years of treatment implementation, wildfire after treatment on one site, and some micrologger operating system problems the first few years after installation all resulted in less than complete data for all treatments at all sites for 3–4 yr after treatment (Table 1). Only one station was installed at the Walker Butte, Five Creeks, and Seven Mile sites, and these were installed on untreated Phase III plots. Three stations were installed at Spruce Mountain on the untreated Phase I, II, and III plots. At the remaining sites, stations were installed on untreated, burn, cut, and shred plots at infilling Phases I, II, and III. Each of these sites had 9 stations (three phases by three treatments: untreated, burned, cut) or 12 stations (Utah sites only: three phases by four treatments: untreated, burned, cut, shred). We collected soil water matric potential and temperature data for characterizing 13 sites and for comparing treatments on 9 sites for a total of 19 site-yr, or 26 spring site-yr (Table 1).
Each of the 99 soil water and temperature stations installed across the 13 study sites was equipped with a Campbell Scientific, Inc. (Logan, UT) CR10X or CR1000 micrologger and multiplexer that measured 16 soil temperature and soil water matric potential sensors. At each station, thermocouples to measure temperature and gypsum blocks (Delmhorst, Inc., Towaco, NJ) to measure soil water matric potential were buried at 1–3 cm, 13–15 cm, 18–20 cm, and 28–30 cm deep at the east-side dripline of tree and shrub canopies and associated litter mounds and on two interspaces between shrubs or trees (4 depths by 4 microsites = 16 thermocouples or gypsum blocks at each station). Microloggers were programmed to read sensors every 60 s and to store hourly averages. We converted gypsum block resistance data to water potential using standard calibration curves (Campbell Scientific, Inc. 1983). Although some error may be introduced by not individually calibrating each gypsum block, blocks calibrated with standard equations have been shown to be relatively consistent and sensitive to soil drying in a growth chamber study (Taylor et al. 2007). We also measured air temperature and precipitation (1–1.5-m height) on one station at each site (untreated Phase III). Precipitation was measured with an electronic tipping bucket rain gage (Texas Electronics, Inc., Dallas, TX) and removable precipitation adapter for snowfall (Campbell Scientific, Inc.; Fig. 1). Air temperature was measured in a gill shield using a Campbell Scientific, Inc., model 107 temperature probe (Fig. 2).
Derived variables were calculated for four seasons and at each soil depth: spring (March–June), summer (July–August), fall (September–November), and winter (December–February). Derived variables included total number of wet days, used to indicate time of available soil water (total hours with hourly average soil water matric potential > −1.5 MPa, divided by 24), degree days (sum of hourly average soil temperatures for each hour that average soil temperature was > 0°C, divided by 24), wet degree days (degree days accumulated while soil water matric potential was > −1.5 MPa), and hourly average soil temperatures (Rawlins et al. 2012).
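A minimal sketch of these calculations, assuming paired hourly average matric potential (MPa) and soil temperature (°C) records for one sensor over one season; the function name and threshold constant are illustrative, not taken from the authors' processing code:

```python
import numpy as np

WET_THRESHOLD = -1.5  # MPa; soil water considered available above this potential

def soil_climate_summary(matric_potential, soil_temp):
    """Return (wet days, degree days, wet degree days) from paired hourly data."""
    mp = np.asarray(matric_potential, dtype=float)
    t = np.asarray(soil_temp, dtype=float)
    wet = mp > WET_THRESHOLD            # hours with available soil water
    warm = t > 0.0                      # hours contributing to degree days
    wet_days = wet.sum() / 24.0                    # total wet hours / 24
    degree_days = t[warm].sum() / 24.0             # sum of hourly temps > 0 C / 24
    wet_degree_days = t[warm & wet].sum() / 24.0   # degree days while soil is wet
    return wet_days, degree_days, wet_degree_days

# Example: 48 h of wet soil at a constant 12 C gives 2 wet days and
# 48 x 12 / 24 = 24 degree days, all of them wet degree days.
wd, dd, wdd = soil_climate_summary([-0.5] * 48, [12.0] * 48)
```

Seasonal totals then follow from passing each season's slice of the hourly record to the same function.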
To characterize the seasonal soil water and temperature of all sites, we conducted a number of iterations of principal components analysis (PCA) with PC-ORD version 6 (MJM Software Design, Gleneden Beach, OR; Peck 2010), using data from untreated plots for all 13 sites listed in Table 1. We used data from Phase I, II, and III stations where available. For each season, we calculated a number of derived variables, including mean wet days, degree days, wet degree days, and soil temperature means, maxima, and minima across the number of years that we had data for each site. Eleven sites had 5–6 yr of data (Table 1). We conducted a final iteration of PCA using one soil temperature and one available water-related variable for each season. We selected variables that were normally distributed as indicated by low skewness and kurtosis values and that had the highest correlation with PCA response axes in previous iterations. We then plotted station locations (Phases I–III) for each site and site centroids in relation to the first two PCA axes (Fig. 3).
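The ordination step can be sketched as a correlation-matrix PCA. The site-by-variable matrix below is synthetic and numpy stands in for PC-ORD; only the general procedure (standardize, eigendecompose, project onto the first two axes) mirrors the text:

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows = 13 sites; columns = one temperature and one water variable per season
X = rng.normal(size=(13, 8))

# Standardize each variable so the PCA runs on the correlation matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Eigendecomposition of the correlation matrix; eigh returns ascending order,
# so reorder eigenpairs by descending eigenvalue
corr = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Z @ eigvecs[:, :2]                     # site scores on axes 1 and 2
explained = eigvals[:2].sum() / eigvals.sum()   # variance captured by two axes
```

Plotting `scores` with site labels yields the kind of two-axis site ordination shown in Fig. 3.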
Mixed model analysis (Proc Glimmix, SAS v9.3, SAS Institute, Inc., Cary, NC) was used to test fixed effects of year since treatment (crossed factor), season (crossed factor), treatment (main plot), phase (subplot within treatment), microsite (sub-subplot within phase and treatment), and depth (sub-sub-subplot within phase, treatment, and microsite). Data were not transformed because examination of residual plots indicated that assumptions for analysis of variance were generally met. Site was considered random in these analyses. After initial testing of all effects, analyses were conducted on each year since treatment and season separately because different sites were represented in different years since treatment (Table 1), seasons had different numbers of total days, and interactions among year since treatment and season were usually significant (P < 0.05). After conducting analyses on the four measurement depths separately, we subsequently omitted the top depth (1–3 cm) and analyzed across the lower three depths for all responses. These depths (13–15 cm, 18–20 cm, 28–30 cm) are most indicative of plant water use rather than of evaporative water loss, which dominates at the 1–3-cm depth. Finally, we conducted analyses on the difference between untreated and treated responses for each infilling phase at each site to best adjust for differences in annual weather among sites. This also allowed us to determine the additional wet days, degree days, and wet degree days associated with tree reduction. Tukey tests were used to separate means when treatment or phase effects were significant.
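The site-level standardization (treated minus untreated within each site and infilling phase) amounts to a simple paired difference; the site labels and wet-day values in this sketch are hypothetical:

```python
def treatment_differences(records):
    """Difference each treated response from its site/phase untreated control.

    records: list of dicts with keys site, phase, treatment, wet_days.
    Returns {(site, phase, treatment): treated minus untreated wet days}.
    """
    untreated = {(r["site"], r["phase"]): r["wet_days"]
                 for r in records if r["treatment"] == "untreated"}
    return {(r["site"], r["phase"], r["treatment"]):
                r["wet_days"] - untreated[(r["site"], r["phase"])]
            for r in records if r["treatment"] != "untreated"}

data = [  # hypothetical spring wet days for one site at Phase III
    {"site": "A", "phase": "III", "treatment": "untreated", "wet_days": 70.0},
    {"site": "A", "phase": "III", "treatment": "burn", "wet_days": 96.0},
    {"site": "A", "phase": "III", "treatment": "cut", "wet_days": 90.0},
]
diffs = treatment_differences(data)  # e.g., burn adds 26 wet days at this site
```

Differencing against the within-site control removes site-to-site weather variation before treatments are compared.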
Precipitation during the study period varied from lows in 2007–2008 to exceptional highs in 2010–2011 (Fig. 1). Five Creeks had the highest measured precipitation (641 mm October 2010 through June 2011) and highest average precipitation (602.4 mm annual, 542 mm October through June; PRISM Climate Group 2012). However, measured precipitation data were only available in 2010–2011 for this site. Excluding Five Creeks, study sites with highest precipitation for the western juniper, piñon–juniper, and juniper–piñon cover types were Blue Mountain, South Ruby, and Stansbury. Drier sites for those cover types were Bridge Creek, Marking Corral, and Onaqui, although values were generally similar for the drier sites within a cover type. Excluding Five Creeks, precipitation averages for the study period were 45 mm less annually and 27 mm less from October through June than long-term averages (PRISM Climate Group 2012). Across cover types, piñon–juniper sites had cooler air temperatures, while juniper–piñon sites were warmest (Fig. 2).
The first two axes of the PCA seasonal ordination for untreated plots represented soil temperature and available soil water gradients (Fig. 3) and accounted for 84.7% of variation among sites. Western juniper and the Stansbury juniper–piñon study sites were generally wettest whereas most piñon–juniper and juniper–piñon sites were drier. Although western juniper sites were generally cooler than the other sites, Bridge Creek, a western juniper site, was exceptionally warm. Piñon–juniper sites in central Nevada were slightly drier and also cooler than juniper–piñon sites in Utah.
Year, Season, Treatment, and Soil Depth
Total number of wet days averaged across all treatments and 1–30-cm soil depths for the first, second, and third years since treatment were 155.5 d, 180.1 d, and 180.9 d, respectively. Data for the fourth year since treatment were available for spring only. The number of wet days in spring for 1–30 cm was 80.3 d, 86.2 d, 97.6 d, and 98.4 d for the first, second, third, and fourth years since treatment, respectively. Percentages of wet days out of total days within a season were (mean ± SE) 74.0 ± 4.3%, 5.3 ± 4.3%, 24.2 ± 4.3%, and 64.9 ± 4.3% for spring, summer, fall, and winter across all other factors. Because season categorizations had different numbers of days, and because different study sites were represented in different years since treatment, additional analysis was conducted separately for each season and year since treatment.
The treatment by depth interaction was significant (P < 0.05) for half of the 13 year-since-treatment by season cases for wet days and degree days. In general, wet days decreased with soil depth in fall (year 3, F(3, 647) = 140.9, P < 0.0001) and summer (year 3, F(3, 586) = 8.3, P < 0.0001), and increased with depth in spring (year 3, F(3, 586) = 62.9, P < 0.0001; Fig. 4). Cut plots generally had significantly more wet days than untreated plots in fall (year 3, F(2, 12) = 7.2, P = 0.0089) and spring (year 3, F(2, 12) = 5.4, P = 0.0212), whereas burn plots had more wet days than untreated plots in winter (year 3, F(2, 12) = 5.05, P = 0.0256; Fig. 4). Cutting trees conserved more water at deeper than at shallower depths in spring and summer. Degree days increased with soil depth in fall (year 3, F(3, 647) = 753.4, P < 0.0001) and winter (year 3, F(3, 629) = 429.0, P < 0.0001) and decreased with depth in spring (year 3, F(3, 586) = 781.5, P < 0.0001) and summer (year 3, F(3, 586) = 1128.4, P < 0.0001; Fig. 5). Burn plots generally had significantly more degree days than untreated and cut plots in fall (year 3, F(2, 12) = 6.5, P = 0.0122) and spring (year 3, F(2, 586) = 5.3, P = 0.0224; Fig. 5). Because the lower three depths were generally more similar in response than the surface depth of 1–3 cm (Figs. 4 and 5) and were also more indicative of plant water use than atmospheric water loss, responses of the lower three depths were averaged to represent the resource growth pool in subsequent analysis. Also, because soil water and temperature varied among untreated plots at different study sites for each year and season (Fig. 3), standardized effects of treatment were analyzed as the difference between treated and untreated plots for each study site.
Differences Between Treated and Untreated Plots: Treatment, Phase, and Microsite
Significant main effects included tree infilling phase for wet days (year 3, F(2, 18) = 4.1, P = 0.0265) and wet degree days (year 3, F(2, 18) = 6.4, P = 0.0079) in spring, and microsite for most soil climate responses (P < 0.05; wet days, degree days, and wet degree days) for most seasons and years since treatment. Burning or cutting trees increased wet days and wet degree days at the 13–30-cm soil depth most at infilling Phase III sites in spring, and this effect continued even 4 yr after treatment (Fig. 6). Additional wet days in spring after tree reduction varied somewhat in magnitude and pattern among sites and years after treatment (Fig. 7). Treatment differences between the burn and cut were significant only for degree days and average soil temperature in fall (F(1, 6) = 10.1, P = 0.019; F(1, 6) = 10.7, P = 0.017), spring (F(1, 5) = 8.3, P = 0.0346; F(1, 5) = 8.9, P = 0.0326), and summer (F(1, 6) = 9.1, P = 0.0234; F(1, 6) = 9.1, P = 0.0234) for 3 yr after treatment. Three years after treatment, burning increased degree days compared to untreated plots by 108.5, 106.9, and 98.4 in fall, spring, and summer, respectively, whereas the difference in degree days between cut and untreated plots was not significantly different (P > 0.05) from zero. Shredding was compared with cutting and burning on the Utah study sites only and, except for degree days in the summer of year 3, produced statistically similar (P > 0.05) soil climate responses to cutting and burning.
Burning or cutting generally increased soil wet days, wet degree days, and degree days most on tree microsites compared with shrub and interspace microsites (Table 2). Burning or cutting trees increased degree days on tree microsites relative to shrub and interspace microsites in spring and summer, but decreased degree days on tree microsites relative to shrub and interspace microsites in fall and winter. Soil temperatures were highly correlated with degree days and showed a similar response. The interaction of phase and microsite was significant (P < 0.05) for soil climate variables associated with soil temperature for most seasons and years (Fig. 8). Although cutting or burning at Phase III increased degree days in spring and summer for all microsites, these treatments at Phases I and II increased degree days most for tree, then shrub, then interspace microsites (Fig. 8). Treating at Phase III especially increased degree days for shrub and interspace microsites in fall, whereas treating at all phases increased degree days similarly for all microsites in winter. The interaction of treatment, phase, and microsite was significant (P < 0.05) for temperature-related variables for some years and seasons. For example, in fall of year 3, wet degree days were increased by both treatments, but cutting increased wet degree days most on tree microsites at Phase II and on interspace microsites at Phase III, whereas burning increased wet degree days most on tree microsites at Phase III.
Tree reduction increased the time of available water during spring in the resource growth pool, and treating at a higher phase of tree infilling resulted in a greater increase in time of available water, as we had hypothesized (Fig. 6). Bates et al. (2000) found that cutting western juniper with an average canopy cover of 23% increased the period of active growth for understory plants by up to 6 wk because of greater soil water availability. We found that burning or cutting increased the time of available water in the resource growth pool in spring by a maximum of 26 d, 20 d, 15 d, and 19 d in the first, second, third, and fourth years after treatment on Phase III woodlands (Fig. 6). This additional time of available water substantially extends the short spring window when warm soil temperatures and available water coincide to support growth. In contrast, controlling trees at Phase I increased the time of available water only in the first 2 yr after treatment, and only by 6.7 d and 3 d (Fig. 6). Lateral roots of western juniper trees are much more limited in the first 10 yr of growth than in later years (Krämer et al. 1996). Mechanically killing the smaller and fewer trees associated with Phase I infilling would be expected to have much less effect on time of available water because dominant shrubs and herbaceous plants are still the major water users.
Patterns of soil water availability generally followed treatment-induced changes in understory vegetation. When the community was treated at a higher phase of infilling, there was less residual understory cover (Roundy et al. 2014), presumably less plant water use, and therefore a longer time of available water after treatment. As plant cover increased with time since treatment (Miller et al. 2014), the additional time that water was available decreased. However, even in the fourth year after treatment, water was available 8.6 d and 18 d longer on treated than untreated plots at original infilling Phases II and III (Fig. 6), indicating ongoing water availability to support continued increases in residual plants or annual invaders in the future. Perennial herbaceous cover increased most after cutting at an initial infilling of mid–Phase I or higher (tree dominance index ≥ 0.2, Roundy et al. 2014) and was associated with longer times of available water when treatments were implemented at Phases II or III (Fig. 6).
We hypothesized that burning would result in greater time of available water than mechanical tree control because it would reduce both shrub and tree canopies and water use. However, there was no difference in time of available water between fire and mechanical treatments. Perennial herbaceous vegetation was initially decreased by burning but recovered to pretreatment cover by the second year and exceeded levels in the control by year 3 (Miller et al. 2014). Burning decreased shrub cover at all but the highest phase of infilling, where shrub cover was low initially (tree dominance index < 0.8; Roundy et al. 2014). Mechanical treatments of cutting and shredding were not targeted to reduce shrubs and did not decrease shrub cover (Roundy et al. 2014). Burned plots did have more degree days than cut plots in most seasons 3 yr after treatment (Fig. 5). Warmer soils on the burned areas may have contributed to greater soil water evaporation and thus to the lack of difference in time of available water between burned and mechanically treated plots. Wind may also have moved winter snow off burned areas. Warmer soils in the burn may likewise have contributed to the greater increase in exotic species compared with the control and the mechanical treatments (Miller et al. 2014).
Additional days of soil water availability and warmer soil temperatures after fire and mechanical treatments could lead to cheatgrass dominance through increased germination, growth, and seed production (Chambers et al. 2007; Roundy et al. 2007; Rawlins et al. 2012). Prescribed fire increased cheatgrass cover across all ranges of tree dominance on these sites, whereas cutting and shredding increased cheatgrass cover when implemented at Phase II or higher (Roundy et al. 2014). However, cheatgrass cover was < 9% and patchy on all but two sites, which had > 25% cheatgrass cover (Roundy et al. 2014). Soil water availability increased both on sites that became dominated by cheatgrass and on sites that did not (Fig. 7). This suggests that other site environmental conditions and propagule pressure (Colautti et al. 2006) were more important determinants of cheatgrass dominance on this network of sites than increased time of available soil water. On many sites it is typical for cheatgrass to increase in the first few years after tree control and then decrease as perennial grasses increase (Bates et al. 2005; Miller et al. 2014). Determining which environmental conditions lead to weed rather than perennial grass dominance is critical to management of tree-encroached sites. If managers knew which sites were most prone to cheatgrass dominance, they could plan to seed these sites in conjunction with tree control.
Cheatgrass was most dominant on the Stansbury and Scipio sites in our study. These sites grouped together as warmer than most other sites, and Stansbury was wetter than Scipio (Fig. 3). Although temperature and water requirements for germination of cheatgrass are generally satisfied across a wide range of soil temperatures in sagebrush ecosystems (Roundy et al. 2007), growth and reproduction are limited at higher elevations by cold soil temperatures (Chambers et al. 2007). Lower-elevation sites associated with greater degree days are less resistant to cheatgrass when perennial grass competitors are removed (Chambers et al. 2007, 2014). Risk of cheatgrass dominance is generally considered higher on drier and warmer Wyoming big sagebrush (Artemisia tridentata Nutt. subsp. wyomingensis Beetle & Young) ecological sites than on cooler and wetter mountain big sagebrush (A. tridentata Nutt. subsp. vaseyana [Rydb.] Beetle) ecological sites (Bates et al. 2013; Chambers et al. 2014). Nevertheless, Bates et al. (2013) found that cheatgrass dominated burned Phase III western juniper–mountain big sagebrush ecological sites with a frigid soil temperature regime.
Treatments, aspect, and microsite conditions that increase soil temperatures when soil water is available are associated with increased cheatgrass abundance (Link et al. 1994; Tausch et al. 1995; Condon et al. 2011). In our network study, burned plots had a greater time of available water in winter than did untreated plots and greater degree days than untreated or cut plots 3 yr after treatment (Figs. 4 and 5). Also, burning or cutting increased time of available water and wet degree days more on tree microsites than on shrub and interspace microsites for most seasons. Burned litter and woody residues from cutting or shredding may increase seasonal temperatures and time of available water and thereby support cheatgrass abundance, especially on cooler or drier sites. Tree shredding and the associated residues increased soil temperatures, time of water availability, and soil N supply rate on Phase III sites studied by Young (2012) and Young et al. (2013b), which supported both annual cheatgrass and perennial bluebunch wheatgrass (Pseudoroegneria spicata [Pursh] Á. Löve) seedling growth (Young et al. 2013a). Our regional study shows that treatments to reduce trees may modify environmental conditions to favor cheatgrass, but cheatgrass response may still not be highly predictable for some sites, as noted by Bagchi et al. (2013).
Managers remove trees to reduce fuel loads and to increase cover and density of desirable understory species. Tree removal by fire or mechanical means similarly increased the time of available water, which is associated with a longer period of nutrient diffusion to roots and growth of whichever residual growth forms are present (Leffler and Ryel 2012). Although perennial herbaceous cover is increased to varying degrees by tree removal at high phases of infilling or tree cover (Roundy et al. 2014), that residual cover is not sufficient to use all the soil water and limit growth potential of undesirable weeds. Removing trees at Phases I and II when enough desirable residuals remain to use the expanded resource growth pool, rather than at Phase III, should reduce risk of weed dominance, especially on sites with warmer temperatures and high climate suitability for cheatgrass. To reduce the risk of weed dominance, Phase III areas could be seeded in conjunction with tree control. Because lack of precipitation is the major limitation to seeding success (Hardegree et al. 2011), the increase in available soil water associated with tree control should support increased seedling establishment.
We thank Bureau of Land Management and US Forest Service personnel for implementing treatments. We also thank Brad Jessop, Jaime Ratchford, and Darrell Roundy for assistance installing measurement stations. We especially acknowledge Shad Roundy for assistance with data analysis and Virginia Roundy for assistance in maintaining stations. This is Contribution Number 86 of the Sagebrush Steppe Treatment Evaluation Project (SageSTEP), funded by the US Joint Fire Science Program, the Bureau of Land Management, the National Interagency Fire Center, and the Great Northern Landscape Conservation Cooperative.