Wildfire is increasing in frequency and size in the western United States as a result of climate change and invasive species such as cheatgrass. This increase is driving a growing need for restoration techniques, especially in low-elevation, arid shrublands. Sagebrush shrublands are home to the threatened Gunnison sage-grouse and can take decades, if not longer, to recover after fire. We investigated management-friendly restoration techniques aimed at increasing sagebrush cover in a sagebrush system important to Gunnison sage-grouse and impacted by fire in western Colorado. We tested several restoration techniques that could be replicated in management actions to mitigate stressors on sagebrush recruitment, specifically herbivory by large ungulates, water limitation, and competition with other plants. We found that sagebrush grew and survived better when planted as transplanted seedlings rather than seeds, when planted in areas where herbicide had been applied rather than where vegetation was removed by hand tools, and when caged to prevent herbivory rather than left uncaged. Surprisingly, providing supplementary water did not improve sagebrush transplant growth or survival over use of a microsite (a small structure made of wood collected from the burn scar). Constructed microsites were meant to provide protection from wind, retain moisture, and provide shade. Overall, our results indicate that if sagebrush seedlings are provided shelter and structure, survival can approach natural (not planted) rates and sagebrush can be successfully established in low-elevation sites.
The objective of this paper is to identify prospects for stakeholder cooperation for effective implementation of enhanced rangeland restoration techniques under different land tenure status in Tataouine Governorate of southern Tunisia, through the rest technique locally called “Gdel.” This technique consists of leaving a given rangeland at rest to reconstitute the plant cover. A stakeholder analysis was conducted using the MACTOR methodological framework to analyze stakeholders' strategies and their balance of power in terms of rangeland management decisions, specifically regarding the implementation of resting, which involves a high level of collective action. Data collection was based on two focus group discussions with the nine main stakeholders involved directly and indirectly in Tunisian rangeland management. Stakeholders' perceptions about resting are compared across private and collective land tenure systems. Findings show a wide diversity in stakeholder relationships, in terms of influences, dependencies, and balance of power, with differences between collective and private tenure systems. In private rangelands, equal levels of stakeholder influence and power lead to a much more stable and flexible rangeland restoration process, with more alliances and consensual objectives among almost all stakeholders. The situation in collective rangelands is very different because the majority of stakeholders have a weak influence in terms of management decisions, with fewer alliances and more conflictual objectives among them. Pathways for stakeholder cooperation and long-term empowerment are suggested for effective implementation of rangeland restoration techniques involving collective action.
Grazing by livestock, particularly cattle (Bos spp.), is the dominant land use across North American rangelands and often co-occurs in habitats used by wildlife. Deer (Odocoileus spp.) are an ecologically and economically important native wildlife species in North America. Sustainable management and profitable economic returns require an understanding of the factors driving cattle-deer compatibility. Cattle are compatible with deer when cattle grazing does not negatively impact deer or their habitat requirements (food, cover, and space). We reviewed 2,685 publications on cattle-deer interactions across North American ecosystems to assess the compatibility of these two important genera. We extracted data from 85 of these publications (published 1930–2015) that met criteria for quantifying cattle-deer diet overlap and cattle effects on deer food, cover, and space. We determined that cattle-deer compatibility across North American ecosystems is dictated mostly by geographic region, followed by cattle stocking rate and season, and marginally by soil texture. Cattle and deer were compatible across North American ecosystems when cattle stocking rate was less than 0.12–0.17 AUY ha–1. Cattle-deer diet overlap was lowest during summer and autumn. However, cattle had the greatest potential to decrease forbs in the northeastern forested ecoregion on clay soils during autumn. Cattle had little measurable effect on habitat variables important to deer in open North American ecoregions dominated by herbaceous vegetation. In contrast to rangelands, cattle had the greatest potential to adversely impact deer food, cover, and use of space in forest-dominated ecoregions in North America. However, observations in eastern forested ecoregions only represented 6–16% of our data sets.
Our review reveals a range of conservative cattle stocking rates (0.12–0.17 AUY ha–1) that will have minimal impact on deer using rangelands, and that stocking rates in forested ecoregions may need to be reduced further to minimize impacts to deer and their habitat requirements.
Improper management of cattle near streams can negatively affect the processes that support stream and riparian ecosystems. To judge the success of riparian management strategies, public land management agencies often evaluate two metrics of livestock disturbance, stubble height and streambank alteration. There are concerns associated with how well these disturbance metrics reflect livestock use and incorporate additional effects due to wild ungulates. We sought to address these questions in 39 riparian areas by using time-lapse cameras to estimate ungulate use and then measuring near-stream ungulate disturbance in these same reaches. We found daytime measures of livestock use were related to stubble height and streambank alteration. Explaining the variation in the relationship between stubble height and livestock use was improved by incorporating covariates, whereas covariates did little to improve our understanding of streambank alteration. This suggests greater flexibility in how different stream reaches are managed when stubble height is the guideline. As streambank alteration was explained solely by ungulate use, the simplest way to reduce disturbance was to reduce use. In most stream reaches, the additive presence of wild ungulates was minimal but sufficient to be included in the best models describing the effects of ungulate disturbance. On most days, no cattle were seen in the evaluated riparian reaches. Large groups of cattle (> 20 individuals) were occasionally observed within riparian areas, but cattle generally occurred in small groups of one to four individuals. Across the riparian reaches we evaluated, cattle presence and density were generally low enough that metrics of livestock disturbance suggested little risk to stream conditions important to aquatic biota.
However, there were some riparian areas where cattle stayed too long or occurred in large enough numbers that their use along the streambanks could negatively affect stream habitat conditions.
Shelemia Nyamuryekung'e, Andres F. Cibils, Richard E. Estell, Matthew McIntosh, Dawn VanLeeuwen, Caitriana Steele, Alfredo L. González, Sheri Spiegal, F. Guadalupe Continanza
We compared cow-calf contacts, as well as movement, activity, and pasture use patterns of heritage Raramuri Criollo (RC) and desert-adapted commercial Angus Hereford crossbred (AH) beef cattle grazing Chihuahuan Desert pastures during 4 wk in the summers of 2016 and 2017. Within each herd of 11 cow-calf pairs, a group of 7–9 randomly selected cows were fitted with Global Positioning System collars that recorded animal position at 10-min intervals. Proximity loggers configured to record contact events (< 1-m radius) were fitted on a subset of five cow-calf pairs of each breed. The effect of breed on cow-calf contacts, as well as the dams' movement, activity, and pasture use patterns were analyzed via mixed analysis of variance models. A higher number of RC cow-calf contacts occurred while the dam was grazing and traveling compared with AH counterparts (P ≤ 0.05). No breed-related differences were observed in the overall number and duration of cow-calf contact events. Compared with AH dams, RC cows traveled farther each day (RC: 7.51 vs. AH: 4.85 km, P < 0.01) at higher movement velocities (5.46 vs. 3.53 m · min–1, P < 0.01) and spent more time traveling (1.05 vs. 0.48 h, P < 0.01), more time grazing (9.37 vs. 7.45 h, P < 0.01), and less time resting (13.07 vs. 15.68 h, P < 0.01). RC cows explored almost three times more daily area than AH (152.30 vs. 57.69 ha, P = 0.01) but spent similar amounts of time within 200 m and 100 m of a drinker. RC calves explored larger daily areas than their AH counterparts (83.0 vs. 20.8 ha, P = 0.05), but no breed differences were detected in the number of contact events near drinkers. RC calves possibly impose fewer constraints on their dams' movement and activity patterns compared with commonly used British crossbreds when grazing the Chihuahuan Desert during summer.
Drought contingency planning is an increasingly common tool in the ranchers' climate adaptation toolboxes, but its effect on drought response has not yet been evaluated. We use cognitive models of protective action decision making and planning to explore the effects of having a drought plan on the use of drought early warning information and drought response (and timing). Results of a cross-sectional, probability-based survey of livestock producers affected by a 2016 flash-drought are used to describe the characteristics of operations with drought plans and provide evidence of whether having a plan predicts drought information use and response. While larger operations are more likely than others to have plans for drought, having a drought plan appears to play a unique role in ranchers' use of information and decision making regardless of operation size. Findings suggest that encouraging the use of drought contingency planning may improve ranchers' adaptive capacity. Increased use of planning may also increase the effectiveness of communicating risk and early warning information, by making such information more actionable by decision makers.
We compared animal and vegetation responses of a 13 600-ha area under holistic grazing management (HGM) with a similar area under continuous grazing (CGM) on a Patagonian station. Limitations were a dry 2012–2016 experimental period, poorer soils, and grazing by native guanacos (Lama guanicoe) in the HGM area. Forage standing crop in this area was lower before the experiment and remained so during the study (194 ± 31 kg dry matter · ha–1 HGM vs. 244 ± 33 kg dry matter · ha–1 CGM). Six monitoring sites showed similar and remarkable (though mostly nonsignificant) vegetation improvements in total cover (10.6% HGM vs. 10.9% CGM) and cover of short palatable grasses (21.4% vs. 23.9%, respectively). Species richness showed small changes (–1 vs. –6%), bare soil interpatches decreased (–11.9 vs. –5.4%), and land function indicators of Stability (5.4% vs. 9.9%), Infiltration (12.4% vs. 12.0%), and Nutrient recycling (4.2% vs. 20.6%) increased. Tussock cover changed significantly with grazing management, decreasing 6% (ns) in HGM and growing 42% (P = 0.03) under CGM, probably due to coarse tussock forage consumption in HGM. Sheep under HGM were 15% lighter (43.9 ± 0.5 kg · ewe–1 HGM vs. 51.7 ± 0.5 kg · ewe–1 CGM, P < 0.001), ewes scored 28% lower body condition (1.60 vs. 2.25, P < 0.001), and lambing rates were 36% lower (48.3 ± 2.1% vs. 74.2 ± 1.9%). Rotation ended in 2015 as a consequence of low lambing rates, and sheep body condition and reproductive rates recovered to similar values in both areas. Positive vegetation changes in both areas may be driven by residual effects of destocking 3 decades ago and show that improvement is possible using moderate stocking rates. Although it could be argued that the rest periods of HGM may be positive in the long term, their negative effects on animal production should be addressed, and fast regeneration using intense management in these severely restricted habitats should not be expected.
Slow, persistent progress under careful management seems achievable under both grazing systems.
Bement (1969) developed a stocking rate (SR) guide for yearling cattle grazing shortgrass steppe based on relationships among average daily weight gain (ADG, kg · d–1), beef production per hectare (BP, kg · ha–1), and stocking rate (animal unit days, AUD · ha–1) measured in long-term grazing experiments conducted from 1940 to 1963. These analyses identified an optimal biophysical SR of 13.5 AUD · ha–1. Here, we 1) examine modern era (2000–2018) SR results from these same long-term grazing experiments to determine if there has been a shift in the optimal biophysical SR and 2) assess the influence of drought (< 75% of normal precipitation) on the optimal biophysical SR. For all years in the modern era, the optimal SR occurred at 23.2 AUD · ha–1, 72% higher than the value reported by Bement (1969). For the 3 drought yr, the optimum SR was 14.2 AUD · ha–1, which still exceeded the optimal SR reported by Bement (1969). Our results show the capacity of this shortgrass steppe rangeland to produce livestock weight gains has increased substantially between the Bement and modern eras. This multidecadal directional shift to a higher optimum biophysical SR is likely driven by two nonmutually exclusive factors. First, the plant community changed from dominance by a C4 shortgrass (Bouteloua gracilis) in the Bement era to codominance with a more productive C3 midgrass (Pascopyrum smithii) in the modern era. This change has resulted in pasture-level forage production increasing notably between the two eras. Second, the entry weights and genetic growth potential of yearling steers increased over the 8 decades and may have influenced the efficiency of weight gain for a given amount of forage consumed. Our findings provide guidance for incorporating flexible optimum SR in nondrought and drought years for adaptive grazing management strategies.
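The optimum biophysical SR in a Bement-style analysis follows from a simple trade-off: ADG declines approximately linearly with stocking rate, so beef production per hectare, BP = SR × ADG, is quadratic in SR and peaks at an intermediate rate. A minimal sketch of that calculation, using hypothetical coefficients and observations rather than values fitted to the Bement or modern-era data:

```python
import numpy as np

# Linear gain-response model (coefficients are hypothetical, not fitted
# values from Bement 1969 or the modern-era experiments):
#   ADG(SR) = a - b * SR                        (kg per animal per day)
#   BP(SR)  = SR * ADG(SR) = a*SR - b*SR**2     (kg per ha)

def optimal_sr(a, b):
    """SR maximizing BP: d(BP)/dSR = a - 2*b*SR = 0  =>  SR* = a / (2*b)."""
    return a / (2.0 * b)

# Fit the linear ADG response from (SR, ADG) observations (hypothetical),
# then locate the biophysical optimum.
sr = np.array([5.0, 10.0, 15.0, 20.0, 25.0])    # AUD/ha
adg = np.array([0.95, 0.85, 0.73, 0.64, 0.52])  # kg/d
slope, a = np.polyfit(sr, adg, 1)               # slope is negative (= -b)
sr_star = optimal_sr(a, -slope)
```

The quadratic shape of BP is why the "optimum" is biophysical rather than economic: it maximizes gain per hectare regardless of prices, which an economic analysis would adjust further.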
Coal surface mining in the northern Great Plains of the United States often produces mining spoils with physical and chemical barriers to successful revegetation, and this has resulted in experiments on reclamation with salvaged soil materials. There is a need to determine changes in soil properties and plant community and productivity decades after reclamation. Experiments were initiated in the mid-1970s by placing wedge-shaped masses of Haplustoll soils over leveled, saline-sodic mine spoils forming 2–5% hillslopes approximately 50 m long at two sites near Zap and Stanton, North Dakota. Seeding treatments at Zap included alfalfa (Medicago sativa L.), crested wheatgrass (CWG, Agropyron cristatum [L.] Gaertn.), smooth bromegrass (SBG, Bromus inermis Leyss.), and Russian wildrye (Psathyrostachys juncea [Fisch.] Nevski); at Stanton, native grasses (two Bouteloua spp.) replaced P. juncea. We examined soil, plant community, and productivity changes 3 decades after the start of the experiments. Leaching of soluble salts in subsoils and mine spoils improved soil quality (SQ). SQ improved more in lower slope areas because more rootzone was occupied by low SQ mine spoil. Initial forage yield patterns showed dependency on hillslope position, reflecting both soil depth effects in lower slope parts of hillslopes and apparent water redistribution effects in middle and upper slope areas. Evidence of SQ improvement over time was inferred by substantial decreases in yield dependency on hillslope position at both sites. The Zap site was grazed more heavily than Stanton, and species composition at Zap was 46% cool-season grasses (CWG greatest) and 40% broadleaves. Stanton had 87% cool-season grasses (SBG greatest) with 3% broadleaves. Leaching of salts was threefold greater at Zap and may have resulted from an abundance of deep-rooted broadleaves.
Reclaiming mined land with salvaged soil and revegetation can improve SQ of mine spoil, which our results infer was driven by root growth and establishment of macroporosity.
Both exposure at weaning and supplementation can increase intake of redberry juniper (Juniperus pinchotii Sudw.). When recently weaned sheep or goats are individually penned and fed redberry juniper for 14 d, intake of the plant increases. Unfortunately, this approach is labor intensive and impractical for most livestock producers. The objectives of this study were to determine if goats would accept juniper when conditioned in groups and if supplementation would further increase intake. Recently weaned Boer-Spanish cross goats (n = 40) were placed into two treatments of 20 animals each and fed juniper for 14 d. All goats were unfamiliar with juniper before initiation of the study. Twenty goats in one treatment were placed in individual pens, while the other 20 were separated into four groups (n = 5). In addition, 10 individuals from each treatment were fed additional protein (37% crude protein). Juniper was fed for 30 min daily for 14 d, along with alfalfa pellets (2.5% of body weight) to meet maintenance requirements. After feeding juniper for 14 d in individual pens or groups of five, all goats were placed in individual pens for 7 d to measure intake on an individual animal basis following preconditioning procedures. All goats increased intake (P < 0.05) daily. Goats fed juniper in groups increased intake at a faster rate until d 11. Thereafter, intake declined, apparently because of overingestion and aversive postingestive feedback. In addition, supplementation did not affect juniper intake in this study. For conditioning an acceptance of redberry juniper as a dietary item, feeding goats in groups appears to be as effective as feeding in individual pens.
Both preconditioning and genetic selection can improve intake of redberry juniper (Juniperus pinchotii Sudw.). Preconditioning an acceptance of juniper has relied on feeding juniper to sheep or goats for 14 d at weaning. Successful genetic selection for juniper consumption by goats has relied on selection of both sires and dams for juniper intake. This study compared redberry juniper intake by kid goats from sires selectively bred for juniper consumption and sires chosen for production characteristics. We used five billies chosen for high juniper consumption and five billies chosen for other production characteristics and bred each to seven does (n = 70). The does had not been selected for juniper consumption. Kids were weaned at 90 d of age and placed in individual pens for testing. All kids were fed redberry juniper for 30 min each morning for 14 d. After any juniper refusals were collected, alfalfa pellets (2.5% body weight) were fed for the remainder of the day to meet maintenance requirements. Consumption of juniper was measured and compared between treatments and among sire groups. Once the 14-d juniper feeding trial was completed, all goats were placed on the same ad libitum ration for 30 d with body condition scores and weights taken and compared among treatments and sire groups after 30 d of feeding the complete ration. There were no differences in juniper consumption, body condition scores, and weights among treatments. Goats increased juniper consumption daily in individual pens. Sires' willingness to consume juniper did not appear to impact intake of the plant by their offspring in this study.
Free-roaming horses (Equus ferus caballus) occur throughout arid and semiarid regions of the western United States, where they can decrease plant biomass and diversity, impair water quality, and reduce forage available to native wildlife and domestic livestock. Management of free-roaming horses on Bureau of Land Management (BLM) and US Forest Service lands is determined by protections and population targets established by law, but these do not apply to other federal or Tribal Lands, where relatively little is known about the abundance and distribution of free-roaming horses. To address this information gap, we conducted the first comprehensive survey of free-roaming horses within the Navajo Nation, which is the largest Tribal Land holding in the contiguous United States and covers portions of the states of Arizona, New Mexico, and Utah. We used stratified random sampling and double-observer distance methods to produce estimates of horse abundance corrected for detection bias. During the summer of 2016, we used fixed-wing aircraft to survey 4 975 km of transects across our 67 089-km2 study area. We observed 4 290 horses distributed among 527 groups and estimated 38 223 horses lived within the study area during the survey period (standard error [SE]: 6 052, 90% confidence interval: 29 365–47 080), with 29 394 horses in open areas (SE: 5 511, 90% confidence interval [CI]: 21 328–37 460) and 8 829 horses in forested areas (SE: 2 331, 90% CI: 5 417–12 240). Overall density of 0.570 horses/km2 (SE: 0.090, 90% CI: 0.438–0.702) was 23% higher than density of horses and burros (Equus asinus) in all BLM herd management areas (HMAs) in 2016 and exceeded by 17% the density in Nevada, the only state with an HMA of comparable size to the Navajo Nation. Our results will inform management of a free-roaming horse population that this study has revealed to be among the largest in the United States.
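The estimate-with-SE reporting above is standard for design-based aerial surveys; a common way to form a two-sided 90% confidence interval is the normal approximation, estimate ± 1.645 × SE. A minimal sketch of that arithmetic follows (the published intervals may use a different construction, such as a log-normal interval, so this will not exactly reproduce them):

```python
def normal_ci(estimate, se, z=1.645):
    """Two-sided 90% normal-approximation confidence interval:
    (estimate - z*SE, estimate + z*SE)."""
    return estimate - z * se, estimate + z * se

# Total abundance estimate and SE as reported in the abstract
lo, hi = normal_ci(38_223, 6_052)
```

Because abundance cannot be negative and survey estimates are often right-skewed, distance-sampling software frequently reports log-normal rather than symmetric normal intervals, which would explain the asymmetry of the published bounds around the point estimate.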
A common argument for western juniper (Juniperus occidentalis) control in Oregon is the amount of water saved by reducing the number of trees. Yet water use by mature trees, and by sapling regrowth following mature juniper removal, has not been well documented. Such information is important to better assess how much water can be saved by juniper control. We used sap flow sensors to monitor water use by mature juniper in a control area and by saplings in an area where juniper control occurred in 2005. Sap flow data collected between May 2017 and Sep 2019 showed that the period of highest water use was in the summer, although this varied each year. In July 2017 (the wettest yr of the study), mature trees used 144 L/d, approximately twice the average for July 2018 and 2019 (dry yr). During the period of maximum water uptake, mature trees used between 45 and 69 times more water than saplings, depending on precipitation and, consequently, soil water availability. In summary, 1) juniper water use varies greatly with precipitation, and 2) because of the large difference between mature and sapling trees, juniper control results in considerable water savings, even after a 14-yr period of juniper regrowth.
In rangelands, monitoring spatial regime boundaries (i.e., boundaries between ecological states) could provide early warnings of state transitions, elucidate the spatial nature of state transitions, and quantify management outcomes. Here, we test the ability of established regime shift detection methods and traditional, local-scale rangeland monitoring data to identify spatial regime boundaries in a complex rangeland system. We collected plant community composition data via point-intercept sampling at 400 evenly spaced locations along a 4 000-m transect. We then applied three statistical regime shift detection methods to identify spatial regimes and compared outcomes of each statistical method. Statistical detection of spatial regimes performed well with traditional field monitoring data. Spatial regime locations matched historic plant communities in the study site going back 130 years, but we also detected a localized wildfire-driven state transition: a relict ponderosa pine (Pinus ponderosa) spatial regime transitioned to a bur oak (Quercus macrocarpa)–annual grass regime. The spatial regimes monitoring approach capitalizes on the existence of spatial boundaries between states to track ecological states as they move, expand, contract, or disappear. This is an advancement over traditional time series approaches to regime shift/state transition detection, which only detect state transitions if enough sites transition. Existing local-scale rangeland monitoring, used strategically, can complement current coarse, broad-scale applications of spatial regimes monitoring by detecting subtle, fine-scale boundaries that broad-scale monitoring cannot capture.
Woodland managers in northern Chile require efficient means of estimating fruit production to improve their ability to sustainably manage mesquite tree ecosystems. We assessed the accuracy of the indirect visual count (VC) method for quantifying fruit production of two endangered mesquite trees, tamarugo (Prosopis tamarugo) and algarrobo (Prosopis alba), by comparing it with the fruit trap (FT) method during three fruit dissemination periods. We studied two natural populations of trees of P. tamarugo and P. alba in the Pampa del Tamarugal ecosystem, Atacama Desert, northern Chile. We placed FTs beneath the crowns of the trees to measure the fruit crop during the mast dissemination period. The VC method was performed on the same trees used for FT. We used linear regressions and coefficients of variation to assess the relationship between the two methods. There was a positive correlation between both methodologies in all three periods for P. alba, but the relationship for P. tamarugo was not statistically significant. Analysis of the variability in fruit production estimates using the FT method showed significant interannual differences for P. tamarugo and significant differences among individuals for both species. The VC method had better results for fruit production in P. alba, whereas the use of the FT method is more suitable for P. tamarugo.
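A method comparison of this kind reduces to regressing one set of per-tree estimates on the other and comparing their coefficients of variation. A minimal sketch with illustrative numbers (hypothetical stand-ins, not the study's measurements):

```python
import numpy as np

# Hypothetical paired fruit-production estimates for six trees
# (illustrative values only, not data from the study).
visual_count = np.array([120, 340, 210, 450, 90, 300], dtype=float)   # VC method
fruit_trap   = np.array([110, 360, 190, 470, 100, 280], dtype=float)  # FT method

# Ordinary least-squares regression of trap counts on visual counts
slope, intercept = np.polyfit(visual_count, fruit_trap, 1)
r = np.corrcoef(visual_count, fruit_trap)[0, 1]   # Pearson correlation

# Coefficient of variation (SD / mean) of each method across trees
cv_vc = visual_count.std(ddof=1) / visual_count.mean()
cv_ft = fruit_trap.std(ddof=1) / fruit_trap.mean()
```

A slope near 1 with a strong positive correlation would indicate the cheaper VC method tracks the trap-based reference well, which is the pattern the study reports for P. alba but not P. tamarugo.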
Multipurpose trees and shrubs are vital components of arid and semiarid ecosystems. They offer both regulatory and production services, yet there is inadequate information on their actual use and factors affecting utilization of these plant resources. A study was carried out in three agroecological zones (counties) of southeastern Kenya: Machakos (subhumid zone), Kajiado (semiarid zone), and Kitui (arid zone). The research objective was to assess the local uses of multipurpose trees and shrubs and determine how socioeconomic factors (gender, type of occupation, and education level) influenced their use. A field survey was conducted along a 324-km transect across the three agroecological zones. Data were collected from 196 respondents using focus group discussions, key informants, and individual household questionnaires. A total of 86 plant species belonging to 47 plant families were recorded. The Mann-Whitney U-test revealed that women used a significantly higher number of different plant species than men (P < 0.05). The level of education and type of occupation also significantly affected plant use (P < 0.05), with subsistence farmers who had primary or no formal education reporting the highest number of species. Fifteen plant-use categories comprising production and service provision were identified. The plant species use-value index (UVI) revealed 10 indigenous and wild woody species of high value in the study area. Acacia tortilis (Forssk.) Hayne had the highest UVI of 0.33, followed by Commiphora africana (A. Rich.) Engl. (0.17), Terminalia brownii Fres. (0.13), and Zanthoxylum chalybeum Engl. (0.12). There was a significant positive correlation between the reporting frequency for woody species and the overall UVI (P < 0.05). Economic development, climate change, and land use changes jeopardize distribution, utilization, and knowledge preservation of multipurpose woody species.
There is a need for monitoring and adoption of gender-sensitive strategies for their sustainable utilization in order to safeguard these unique plant resources from degradation and overexploitation.
Contemporary methods of rangeland health (RH) assessment evaluate indicators designed to assess land use impacts on ecosystem function. These methods have not been tested relative to variation in specific grazing practices, including grazing period length and stocking rates during the growing season. We report on RH outcomes for three habitat types (native grassland, tame pasture, and forested pasture) across 97 pastures on 28 beef cattle ranches in Alberta, Canada. Pastures were distributed along a climatic gradient encompassing the grassland, parkland/foothill, and boreal regions. Surveys of ranchers were used to quantify typical grazing period length (1 May–31 October) and, if applicable, rotation length, along with corresponding stocking rates for each pasture over the previous 5 yr. Pastures were assessed for RH using indicators of vegetation composition and structure, litter abundance, soil stability, weed presence, and, within tame pastures, woody plant encroachment. An Akaike Information Criterion analysis compared the influence of aridity, grazing period length, and stocking rate on total range health scores (RHS), and ordination was used to identify associations between indicator scores and grazing metrics. Total RHS varied among habitat types, being greater in forests than native and tame grasslands (P < 0.05), and declined with increasing forage utilization, particularly in forests. Within tame pastures, total RHS varied primarily in response to regional climate, with RHS decreasing as moisture deficits increased and declining with longer grazing periods during summer. Native grasslands also decreased in RHS in response to longer grazing periods, with stocking rates having little impact on RHS. Select RH indicators, including low weed abundance and greater litter, were associated with improved health in native grasslands grazed for shorter periods.
Further studies are recommended to understand how, over and above climatic influences, variation in grazing practices alter the health of northern temperate grasslands.
Invasive plants may alter arbuscular mycorrhizal fungi (AMF) communities on which resident plants depend. To determine if the invader crested wheatgrass (Agropyron cristatum [L.] Gaertn.) associates with different AMF than resident plants, we compared AMF communities of six A. cristatum–dominated sites to five sagebrush steppe and three mixed-grass prairie sites in the Northern Great Plains. Consistent with findings on some other invaders, roots of A. cristatum were without AMF and supported fewer AMF operational taxonomic units than seven native species and two nonnative species. Restoring natives to A. cristatum sites is notoriously difficult. Our findings suggest a lack of soil mutualists may contribute to the difficulty.
Invasive plants can have significant negative effects on human and ecological communities, including reduced productivity and biodiversity and increased fire risk. Effective mitigation of invasive species likely requires action by heterogeneous actors who span jurisdictions, sectors, and levels of governance. While there has been significant research to develop targeted mitigation techniques that slow or halt the spread of specific invasive plants, there has been relatively little complementary work to develop knowledge about the implementation of these management techniques through effective governance systems. To address this gap, we interviewed and conducted archival research on land managers involved in the mitigation of buffelgrass (Pennisetum ciliare, syn: Cenchrus ciliaris) invasion in southern Arizona to investigate how existing and emerging governance arrangements encourage or undermine individual and collective action to manage invasive plants. Our results show that a key challenge of managing invasive species is identifying the mechanisms that will allow heterogeneous actors to overcome internal barriers to coordination with others and enable collective action. These internal barriers are multifaceted, involving laws and policies, cultural traditions and mandates, the availability of monetary and human resources, and information on causes and consequences of species invasion and effective approaches to mitigation.
Approaches to solving these problems must include improved knowledge of internal institutional structures and the opportunities and barriers they present to collective action, the preferences of heterogeneous actors when presented with information about future ecosystem conditions absent coordination, the factors that prevent individuals within different organizations from following through on commitments to participate in collective action institutions, and how each of these conditions affects the availability and persistence of resources for mitigation. Together, improved knowledge of the relationships between these factors may provide new approaches to proactive management of emerging resource management challenges, from invasive species to emerging diseases.