40 result(s) for "Luttbeg, Barney"
Re-examining the Causes and Meaning of the Risk Allocation Hypothesis
The risk allocation hypothesis has inspired numerous studies seeking to understand how temporal variation in predation risk affects prey foraging behavior, but there has been debate about its generality and causes. I examined how imperfect information affects its predictions and sought to clarify the causes of the predicted patterns. I first confirmed that my modeling approach—given a threshold or linear fitness function—produced the risk allocation prediction that prey increase their foraging efforts during low and high risk as the proportion of high-risk periods increases. However, the causes of this result and its robustness differed for the two fitness functions. When prey that had evolved to use perfect information received imperfect information, risk allocation was reduced. However, prey that evolved to use imperfect information in some cases reversed the risk allocation prediction. The model also showed that risk allocation occurs even when prey have no knowledge that the proportions of low- and high-risk periods have changed. I conclude that risk allocation is largely not driven by prey expectations about future states of the environment but rather by the prey’s current energetic state and time remaining. I discuss the consequences for experimental design and explanations for empirical results.
Stress hormone-mediated antipredator morphology improves escape performance in amphibian tadpoles
Complete functional descriptions of the induction sequences of phenotypically plastic traits (perception to physiological regulation to response to outcome) should help us to clarify how plastic responses develop and operate. Ranid tadpoles express several plastic antipredator traits mediated by the stress hormone corticosterone, but how they influence outcomes remains uncertain. We investigated how predator-induced changes in the tail morphology of wood frog (Rana sylvatica) tadpoles influenced their escape performance over a sequence of time points when attacked by larval dragonflies (Anax junius). Tadpoles were raised with no predator exposure, chemical cues of dragonflies added once per day, or constant exposure to caged dragonflies crossed with no exogenous hormone added (vehicle control only), exogenous corticosterone, or metyrapone (a corticosteroid synthesis inhibitor). During predation trials, we detected no differences after four days, but after eight days, tadpoles exposed to larval dragonflies and exogenous corticosterone had developed deeper tail muscles and exhibited improved escape performance compared to controls. Treatment with metyrapone blocked the development of a deeper tail muscle and resulted in no difference in escape success. Our findings further link the predator-induced physiological stress response of ranid tadpoles to the development of an antipredator tail morphology that confers performance benefits.
Revisiting the classics: considering nonconsumptive effects in textbook examples of predator-prey interactions
Predator effects on prey dynamics are conventionally studied by measuring changes in prey abundance attributed to consumption by predators. We revisit four classic examples of predator-prey systems often cited in textbooks and incorporate subsequent studies of nonconsumptive effects of predators (NCE), defined as changes in prey traits (e.g., behavior, growth, development) measured on an ecological time scale. Our review revealed that NCE were integral to explaining lynx-hare population dynamics in boreal forests, cascading effects of top predators in Wisconsin lakes, and cascading effects of killer whales and sea otters on kelp forests in nearshore marine habitats. The relative roles of consumption and NCE of wolves on moose, and the consequent indirect effects on plant communities of Isle Royale, depended on climate oscillations. Nonconsumptive effects have not been explicitly tested to explain the link between planktonic alewives and the size structure of the zooplankton, nor have they been invoked to attribute keystone predator status in intertidal communities or elsewhere. We argue that both consumption and intimidation contribute to the total effects of keystone predators, and that characteristics of keystone consumers may differ from those of predators having predominantly NCE. Nonconsumptive effects are often considered as an afterthought to explain observations inconsistent with consumption-based theory. Consequently, NCE with the same sign as consumptive effects may be overlooked, even though they can affect the magnitude, rate, or scale of a prey response to predation and can have important management or conservation implications. Nonconsumptive effects may underlie other classic paradigms in ecology, such as delayed density dependence and predator-mediated prey coexistence. Revisiting classic studies enriches our understanding of predator-prey dynamics and provides compelling rationale for ramping up efforts to consider how NCE affect traditional predator-prey models based on consumption, and to compare the relative magnitude of consumptive and NCE of predators.
Resource levels and prey state influence antipredator behavior and the strength of nonconsumptive predator effects
The risk of predation can drive trophic cascades by causing prey to engage in antipredator behavior (e.g. reduced feeding), but these behaviors can be energetically costly for prey. The effects of predation risk on prey (nonconsumptive effects, NCEs) and emergent indirect effects on basal resources should therefore depend on the ecological context (e.g. resource abundance, prey state) in which prey manage growth/predation risk tradeoffs. Despite an abundance of behavioral research and theory examining state-dependent responses to risk, there is a lack of empirical data on state-dependent NCEs and their impact on community-level processes. We used a rocky intertidal food chain to test model predictions for how resource levels and prey state (age/size) shape the magnitude of NCEs. Risk cues from predatory crabs Carcinus maenas caused juvenile and sub-adult snails Nucella lapillus to increase their use of refuge habitats and decrease their growth and per capita foraging rates on barnacles Semibalanus balanoides. Increasing resource levels (high barnacle density) and prey state (sub-adults) enhanced the strength of NCEs. Our results support predictions that NCEs will be stronger in resource-rich systems that enhance prey state and suggest that the demographic composition of prey populations will influence the role of NCEs in trophic cascades. Contrary to theory, however, we found that resources and prey state had little to no effect on snails in the presence of predation risk. Rather, increases in NCE strength arose because of the strong positive effects of resources and prey state on prey foraging rates in the absence of risk. Hence, a common approach to estimating NCE strength – integrating measurements of prey traits with and without predation risk into a single metric – may mask the underlying mechanisms driving variation in the strength and relative importance of NCEs in ecological communities.
Risk, resources and state-dependent adaptive behavioural syndromes
Many animals exhibit behavioural syndromes—consistent individual differences in behaviour across two or more contexts or situations. Here, we present adaptive, state-dependent mathematical models for analysing issues about behavioural syndromes. We find that asset protection (where individuals with more ‘assets’ tend to be more cautious) and starvation avoidance, two state-dependent mechanisms, can explain short-term behavioural consistency, but not long-term stable behavioural types (BTs). These negative-feedback mechanisms tend to produce convergence in state and behaviour over time. In contrast, a positive-feedback mechanism, state-dependent safety (where individuals with higher energy reserves, size, condition or vigour are better at coping with predators), can explain stable differences in personality over the long term. The relative importance of negative- and positive-feedback mechanisms in governing behavioural consistency depends on environmental conditions (predation risk and resource availability). Behavioural syndromes emerge more readily in conditions of intermediate ecological favourability (e.g. medium risk and medium resources, high risk and high resources, or low risk and low resources). Under these conditions, individuals with higher initial state maintain a tendency to be bolder than individuals that start with low initial state; i.e. later BT is determined by state during an early ‘developmental window’. In contrast, when conditions are highly favourable (low risk, high resources) or highly unfavourable (high risk, low resources), individuals converge to be all relatively bold or all relatively cautious, respectively. In those circumstances, initial differences in BT are not maintained over the long term, and there is no early developmental window where initial state governs later BT. The exact range of ecological conditions favouring behavioural syndromes depends also on the strength of state-dependent safety.
Freshwater snail responses to fish predation integrate phenotypic plasticity and local adaptation
Predators shape the phenotype of prey by means of local adaptation, within-generation plasticity, and transgenerational plasticity. Theory predicts that the influence of these three mechanisms on phenotype is determined by the number of generations exposed to the predator and the autocorrelation between parent and offspring predator environments. To test theoretical predictions about the relative effects of local adaptation, within-generation plasticity, and transgenerational plasticity on prey phenotype, we exposed two generations of freshwater snails (Physa acuta) from two populations—one from a large pond containing predatory fish and one from a set of small ponds that lacked predatory fish—to the cues from a predaceous bluegill sunfish (Lepomis macrochirus). Physa acuta exhibit both local adaptation and phenotypic plasticity in response to predation risk; thus, we predicted that snails from the population with fish and snails experimentally exposed to fish cues would produce more globose (or spherical), crush-resistant shells. In addition, given the prolonged exposure to fish predation, we predicted that snails from the population with fish would exhibit reduced phenotypic plasticity. We found evidence that snails from the population with fish were generally more globose and more crush resistant than snails from the fish-less population and exposure to fish cues led snails from both populations to develop more globose shells with increased crush resistance. However, contrary to our predictions, the offspring of snails from both populations exposed to fish cues had elongated shells and reduced crush resistance. This may reflect the negative effects of parental stress or variation in resource investment.
Effects of pyric herbivory on prairie-chicken (Tympanuchus spp.) habitat
The reduction and simplification of grasslands has led to the decline of numerous species of grassland fauna, particularly grassland-obligate birds. Prairie-chickens (Tympanuchus spp.) are an example of obligate grassland birds that have declined throughout most of their distribution and are species of conservation concern. Pyric herbivory has been suggested as a land management strategy for enhancing prairie-chicken habitat and stabilizing declining population trends. We assessed differences in vegetation structure created by pyric herbivory compared to fire-only treatments to determine whether pyric herbivory increased habitat heterogeneity for prairie-chickens, spatially or temporally. Our study was performed at four sites in the southern Great Plains, all within the current or historic distribution of either lesser (T. pallidicinctus), greater (T. cupido), or Attwater's (T. cupido attwateri) prairie-chickens. Key vegetation characteristics of grass cover and vegetation height in pyric herbivory and fire-only treatments were within the recommended range of values for prairie-chickens during their distinct life history stages. However, patches managed via pyric herbivory provided approximately 5% more forb cover than fire-only treatments for almost 30 months post-fire. Additionally, pyric herbivory extended the length of time bare ground was present after fires. Pyric herbivory also reduced vegetation height and biomass, with mean vegetation height in pyric herbivory treatments lagging behind fire-only treatments by approximately 15 months. Canopy cover in fire-only treatments exceeded levels recommended for prairie-chicken young within 12 months post-fire. However, canopy cover in pyric herbivory treatments never exceeded the maximum recommended levels. Overall, it appears that pyric herbivory improves vegetation characteristics reported as critical to prairie-chicken reproduction. Based on our results, we suggest pyric herbivory as a viable management technique to promote prairie-chicken habitat in the southern Great Plains, while still accommodating livestock production.
The effects of variable predation risk on foraging and growth: Less risk is not necessarily better
There is strong evidence that the way prey respond to predation risk can be fundamentally important to the structuring and functioning of natural ecosystems. The majority of work on such nonconsumptive predator effects (NCEs) has examined prey responses under constant risk or constant safety. Hence, the importance of temporal variation in predation risk, which is ubiquitous in natural systems, has received limited empirical attention. In addition, tests of theory (e.g., the risk allocation hypothesis) on how prey allocate risk have relied almost exclusively on the behavioral responses of prey to variation in risk. In this study, we examined how temporal variation in predation risk affected NCEs on prey foraging and growth. We found that high risk, when predictable, was just as energetically favorable to prey as safe environments that are occasionally pulsed by risk. This pattern emerged because even episodic pulses of risk in otherwise safe environments led to strong NCEs on both foraging and growth. However, NCEs more strongly affected growth than foraging, and we suggest that such effects on growth are most important to how prey ultimately allocate risk. Hence, exclusive focus on behavioral responses to risk will likely provide an incomplete understanding of how NCEs shape individual fitness and the dynamics of ecological communities.
Shaped by the past, acting in the present: transgenerational plasticity of anti-predatory traits
Phenotypic expression can be altered by direct perception of environmental cues (within-generation phenotypic plasticity) and by the environmental cues experienced by previous generations (transgenerational plasticity). Few studies, however, have investigated how the characteristics of phenotypic traits affect their propensity to exhibit plasticity within and across generations. We tested whether plasticity differed within and across generations between morphological and behavioral anti-predator traits of Physa acuta, a freshwater snail. We reared 18 maternal lineages of P. acuta snails over two generations using a full factorial design of exposure to predator or control cues and quantified adult F2 shell size, shape, crush resistance, and anti-predator behavior – all traits that potentially affect their ability to avoid or survive predation attempts. We found that most morphological traits exhibited transgenerational plasticity, with parental exposure to predator cues resulting in larger and more crush-resistant offspring, but shell shape demonstrated within-generation plasticity. In contrast, we found that anti-predator behavior expressed only within-generation plasticity such that offspring reared in predator cues responded less to the threat of predation than control offspring. We discuss the consequences of this variation in plasticity for trait evolution and ecological dynamics. Overall, our study suggests that further empirical and theoretical investigation is needed into what types of traits are more likely to be affected by within-generation and transgenerational plasticity.
Recoupling fire and grazing reduces wildland fuel loads on rangelands
Fire suppression and exclusion, the historically dominant paradigm of fire management, has resulted in major modifications of fire-dependent ecosystems worldwide. These changes are partially credited with a recent increase in wildfire number and extent, as well as more extreme fire behavior. Fire and herbivory historically interacted, and research has shown that the interaction creates a unique mosaic of vegetation heterogeneity that each disturbance alone does not create. Because fire and grazing have largely been decoupled in modern times, the degree to which the interaction affects fuels and fire regimes has not yet been quantified. We evaluated effects of fire-only and pyric herbivory on rangeland fuels and fire behavior simulated using BehavePlus at four sites across the southern Great Plains. We predicted patches managed via pyric herbivory would maintain lower fuel loads and less intense simulated fire behavior than fire alone. We found that time since fire was a significant predictor of fuel loads and simulated fire behavior characteristics at all sites. Fuel loads and simulated fire behavior characteristics (flame length and rate of spread) increased with increasing time since fire in all simulated weather scenarios. Pyric herbivory mediated fuel accumulation at all sites. Mean fuel loads in fire-only treatments exceeded 5000 kg/ha within 24 months, but pyric herbivory treatments remained below 5000 kg/ha for approximately 36 months. Simulated flame lengths in fire-only treatments were consistently higher (up to 3×) than in pyric herbivory treatments. Similarly, fire spread rates were higher in fire-only than in pyric herbivory treatments in all simulated weather conditions. Although all sites had potential to burn in the most extreme weather conditions, pyric herbivory reduced fuel accumulations, flame lengths, and rates of spread across all weather patterns simulated. These reductions extended the amount of time standard wildland firefighting techniques remain effective. Therefore, incorporating pyric herbivory into fuel management practices in areas of high herbaceous productivity increases the effectiveness of fuel treatments.