Parasitic lineages have acquired suites of new traits compared to their nearest free-living relatives. When and why did these traits arise? We can envision lineages evolving through multiple stable intermediate steps such as a series of increasingly exploitative species interactions. This view allows us to use non-parasitic species that approximate those intermediate steps to uncover the timing and original function of parasitic traits, knowledge critical to understanding the evolution of parasitism. The dauer hypothesis proposes that free-living nematode lineages evolved into parasites through two intermediate steps, phoresy and necromeny. Here we delve into the proposed steps of the dauer hypothesis by collecting and organizing data from genetic, behavioral, and ecological studies in a range of nematode species. We argue that hypotheses on the evolution of parasites will be strengthened by complementing comparative genomic studies with ecological studies on non-parasites that approximate intermediate steps.
Parasitism has evolved hundreds of times (Geary and Thompson, 2001; Johnson et al., 2004; Weinstein and Kuris, 2016). The most intimate types of parasitism in which the parasite lives on or in the host are mediated by gene family expansions (Tsai et al., 2013), horizontal gene transfers (Davis et al., 2000), new behaviors (Poinar, 1983; Chaisson and Hallem, 2012), and novel molecular defenses (Warburton and Zelmer, 2010; Haegeman et al., 2011). Within the large number of changes in highly derived parasite lineages is a subset of core traits that mediate the fundamental components of the parasitic lifestyle. Because these components are numerous (e.g., host finding, host attachment, immune system evasion), the parasitic lifestyle seems prohibitively irreducible (Dieterich and Sommer, 2009). If an entire suite of traits is required for successful parasitism, how does any individual trait increase in frequency in a free-living population?
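One way to frame this puzzle quantitatively (a back-of-envelope illustration of our own, not drawn from the studies cited above) is with the standard single-locus approximation for the per-generation change in the frequency p of an allele with selection coefficient s, \(\Delta p \approx s\,p(1-p)\). If the allele's only benefit would be realized in a parasitic lifestyle that does not yet exist, s is effectively 0 and the allele can spread only by drift; for it to increase deterministically in frequency, it must earn a positive s from some other, ancestral function.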
One hypothesis is that many “parasitism” traits arose well before the transition to parasitism and were co-opted from their ancestral functions. Specifically, lineages acquired parasitic traits by establishing and shifting between other types of species interactions like mutualisms or commensalisms (Poinar, 1983; West et al., 1996; Johnson et al., 1997; Weinstein and Kuris, 2016). Phylogenies support this hypothesis: parasitic lineages’ closest non-parasitic relatives often associate with similar host species in non-parasitic ways (Proctor and Owens, 2000; Kikuchi et al., 2011; Weinstein and Kuris, 2016).
This hypothesis suggests that studying parasitic lineages alone could overlook important aspects of their evolutionary history. When inferring when a given trait arose, we can be led astray by assuming its ancestral function was the same as its current one. For instance, plant-parasitic nematodes have acquired cellulase enzymes horizontally (Haegeman et al., 2011). These enzymes enable nematodes to burrow through host plant tissue, suggesting that their acquisition occurred concurrently with the transition to parasitism (Jones, 1981). Alternatively, the enzymes could have been acquired much earlier for a different purpose before being co-opted for parasitism. To investigate this possibility, we can ask whether any non-parasitic species have cellulase genes that serve non-parasitic functions. The free-living nematode Pristionchus pacificus has 7 cellulase genes (Mayer et al., 2011), challenging the assumption that cellulases are exclusive to plant parasites. Instead, cellulases may have been acquired well before the transition to parasitism, perhaps to break down bacterial biofilms (Romeo, 2008), as they may do in P. pacificus, and only later co-opted for invading plant tissue. In short, non-parasites demonstrate alternative uses for so-called “parasitism traits,” which helps us investigate their origins.
An ideal system for studying the evolution of parasitism should therefore be rich in interaction types, with readily available ecological, behavioral, genetic, and developmental data. Nematodes are popular for these reasons, but genetic and developmental data in this group (and many other parasitism study systems) have outpaced basic understanding of behavior and ecology (West et al., 1996; Whitfield, 1998; Proctor and Owens, 2000; Félix and Braendle, 2010; Adams et al., 2020). Comparative genomic studies have attempted to unlock the evolution of parasitism, for example by studying convergence in molecular pathways (Crook, 2014), but without an ecological understanding of species interactions, these models of parasitism evolution are limited. Here we argue that studying parasitism requires collecting seemingly disparate ecological data and reexamining them through the lens of parasitism evolution. We focus on 1 model, the so-called dauer hypothesis in nematodes, to update our knowledge of the hypothesis and to illustrate how an ecological approach can benefit any system used to study parasitism.
THE DAUER HYPOTHESIS
The dauer hypothesis developed from the observation that free-living and parasitic nematodes have remarkably conserved life cycles (Fig. 1). All nematodes proceed through 4 larval stages separated by molts. Some species have an extra component to the life cycle; instead of passing through one of the normal larval stages, they may enter an alternative life stage with unique properties and functions (Rogers and Sommerville, 1963). In some species, the alternative stage (called the “dauer”) facilitates dormancy, a halting of development until conditions are favorable (Poinar, 1983). In others, the alternative stage (also called the “dauer”) is used for dispersing between habitat patches via larger carrier animals (Kiontke and Sudhaus, 2006; Félix and Braendle, 2010). In parasitic species, the alternative stage (called the “infective juvenile”) facilitates transmission between hosts (Anderson, 1984). Though their functions differ, these alternative stages survive harsh external stressors (Ellenby, 1968; Klass and Hirsh, 1976; Stasiuk et al., 2012), tie development to external cue detection (Rogers and Sommerville, 1963; Cassada and Russell, 1975; Stasiuk et al., 2012), have similar morphological structure and neuronal regulation, and almost always split from the main life cycle at the L2–L3 molt (Crook, 2014). The dauer hypothesis suggests that these 3 alternative stages are homologous (Rogers and Sommerville, 1963) and that the infective juveniles of parasites are derived from the more ancestral dauer larva (Hotez et al., 1993; Crook, 2014).
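As a purely illustrative aid (our own minimal sketch, not a model taken from any of the cited studies), the shared branching structure described above can be written as a tiny state machine in which the only fork sits at the L2-L3 molt and exit from the alternative stage is gated by external cues; the stage names and the single conditions_favorable flag are deliberate simplifications.

from enum import Enum, auto

class Stage(Enum):
    L1 = auto()
    L2 = auto()
    L3 = auto()
    L4 = auto()
    ADULT = auto()
    DAUER = auto()                 # alternative stage in free-living lineages
    INFECTIVE_JUVENILE = auto()    # proposed homolog in parasitic lineages

def next_stage(current, conditions_favorable, is_parasitic=False):
    """Advance one molt; the only branch point is the L2-L3 molt."""
    if current is Stage.L2:
        if conditions_favorable:
            return Stage.L3  # continuous development through the normal cycle
        # otherwise enter the alternative stage (dauer or infective juvenile)
        return Stage.INFECTIVE_JUVENILE if is_parasitic else Stage.DAUER
    if current in (Stage.DAUER, Stage.INFECTIVE_JUVENILE):
        # development resumes only after the appropriate external cues are
        # detected (favorable conditions, or host cues for infective juveniles)
        return Stage.L4 if conditions_favorable else current
    order = [Stage.L1, Stage.L2, Stage.L3, Stage.L4, Stage.ADULT]
    return order[min(order.index(current) + 1, len(order) - 1)]

Framed this way, the dauer hypothesis amounts to the claim that the DAUER and INFECTIVE_JUVENILE branches are one homologous fork rather than independent inventions.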
This broad version of the dauer hypothesis applies to any of the 15 or more origins of parasitism in Nematoda, including transitions to both plant and animal parasitism (Blaxter and Koutsovoulos, 2015). Over the years, researchers have detailed more specific versions for the evolution of different types of parasites. The best-known version lays out a 4-step evolutionary pathway for transitions to animal parasitism. The proposed evolutionary sequence is 1) free-living ancestors that do not associate with any larger species, 2) phoretic relationships in which nematodes superficially attach to a larger animal for dispersal, 3) necromeny, in which nematodes may feed on their dead carrier animals without having directly contributed to the death, and 4) parasitism (Crook, 2014). Many extant nematode species approximate these 4 steps (Poinar, 1983), so studying them can reveal the origins of parasitism traits, their ancestral benefits, and when they were co-opted for new purposes.
STEP ONE: NON-ASSOCIATING
The dauer hypothesis proposes that the road to parasitism begins early: ecological conditions at the non-associating step profoundly shape whether and how lineages evolve toward parasitism.
Free-living nematodes live in a variety of habitats, such as marine sediments or terrestrial soil, and eat a range of diets, including bacteria, fungi, yeast, or micro-animals (Poinar, 1983). This variation imposes unique stressors and selects for different traits that can influence later developments. For instance, desiccation is a major selective force in terrestrial but not marine environments (Rebecchi et al., 2020). The dauer larva is thought to combat this stressor: it retains water well (Rogers and Sommerville, 1963; Ellenby, 1968; McSorley, 2003), remains dormant for long periods (Klass and Hirsh, 1976), and occurs only in terrestrial lineages (Anderson, 1988, 1996). If the dauer precedes parasitism, as the dauer hypothesis suggests, we might expect parasitism to have evolved only in terrestrial lineages. Consistent with this prediction, phylogenetic analyses show that parasitic lineages overwhelmingly derive from terrestrial ancestors; even nematodes that parasitize marine hosts like fish are more closely related to terrestrial nematode lineages than to marine ones (Anderson, 1984, 1996; Poulin, 2007), despite marine lineages being far older (Anderson, 1996; Blaxter and Koutsovoulos, 2015). In short, lineages have unequal probabilities of evolving parasitism depending on their ecological context.
Traits in the non-associating step may also predict what kind of parasitism evolves. Animal parasites are most closely related to bacteria-eating nematodes with narrow pharynxes (Poinar, 1983; Campbell and Lewis, 2002), whereas plant parasites derive from fungivorous nematodes with long hollow stylets (Poinar, 1983; Luc et al., 1987) (but see Holterman et al., 2009). The traits that facilitate each diet are more easily co-opted to parasitize one type of host over the other. Again, even the most basic early traits like diet can have far-reaching impacts on future developments.
STEP TWO: PHORESY
Under the dauer hypothesis, the next step in parasitism evolution is phoresy, a transport relationship between a small carried species, the phoront, and a more mobile species, the vector (Crook, 2014; White et al., 2017). Phoresy is advantageous for exploiting patchy resources as opposed to more continuous resources. For example, many lineages proliferate on dense blooms of microbes found on substrates like rotting fruit or carrion (Félix and Braendle, 2010). Such resources are far apart and short-lived, which selects for fast life cycles and phoresy (Young, 1961; Southwood, 1977; Bartlow and Agosta, 2021).
Phoresy as a step toward parasitism is not limited to the dauer hypothesis; it has also been proposed for other taxa such as mites (Houck and OConnor, 1991; Athias-Binche and Morand, 1993). Two aspects of phoresy contribute to its potential significance in the evolution of parasitism. Unlike species interactions that involve only passing contact between the 2 species, phoresy necessitates intimate and long-lasting physical contact between the phoront and its vector, which may lend itself to future exploitation. Phoresy also constrains the sizes of the participants; it only works if the phoront is much smaller than the vector. This same size difference characterizes host-parasite interactions (Lafferty and Kuris, 2002), so phoresy may be more likely to foster parasitism than interactions that do not constrain participant size.
We might predict that lineages of adept dispersers (or those with little need to disperse) would have little use for phoresy and would therefore never acquire the vector-finding and vector-attachment traits that parasitism later requires. The difference between marine and terrestrial lineages supports this prediction. Marine nematodes are poor directional swimmers (Sherman et al., 1983) but can easily ride currents to new locations (Heip et al., 1985) or attach to floating driftwood and algae (Gerlach, 1977; Highsmith, 1985). When marine sediments are disturbed in experimental plots, nematodes swiftly recolonize them via these modes of dispersal (Sherman and Coull, 1980; Chandler and Fleeger, 1983; Sherman et al., 1983). Marine nematodes also have less need to disperse since their habitats are continually replenished with organic material (Heip et al., 1985; Alldredge and Silver, 1988). Marine nematodes would therefore probably gain little from phoretic relationships, which may explain why such relationships are rare in marine habitats (Poulin, 2007). Even outside of Nematoda, marine phoronts are rarer than terrestrial phoronts, though under-sampling cannot be ruled out (Bartlow and Agosta, 2021). In terrestrial lineages, ephemeral habitats necessitate frequent dispersal via a vector (OConnor, 1982; Félix and Braendle, 2010); artificial habitats are rarely colonized by nematodes when vectors are excluded (M. Rockman, pers. comm.). Therefore, the spatial structure of terrestrial habitats can select for traits that increase dispersal ability through phoresy (Houck and OConnor, 1991).
Phoresy adaptations
Phoresy selects for traits that promote dispersal by increasing the frequency and success of transport; these same traits may subsequently promote host finding in parasitism. Phoronts can increase dispersal success in 4 ways: preparing for dispersal in the starting habitat, embarking on the vector, surviving during travel, and disembarking into the new habitat (Fig. 2).
How does a phoretic organism know when to stay in the starting habitat and when to disperse? Individuals that disperse too early miss out on resources and reproductive opportunities still available in the starting habitat. Individuals that fail to notice a swiftly degrading habitat may miss their chance to leave (Viney and Harvey, 2017). The most obvious solution is to tie dispersal to environmental cues. The dauer stage of non-associating nematodes is induced by poor environmental conditions (Cassada and Russell, 1975), so lineages that used the dauer stage for dispersal could immediately co-opt that sensory machinery to temporally limit dispersal attempts. Subsequently, this machinery could be modified in parasites to find their hosts.
Several traits make embarkment on the vector more predictable. Phoronts that sense and move toward vectors increase their chance of being picked up relative to phoronts that encounter vectors randomly (Haas, 2003). Potential cues include vector pheromones, metabolites, and temperature (Rogers and Sommerville, 1963; Granzer and Haas, 1991). These cues may be general, produced by a wide range of vector species, or specific to particular taxa. For instance, many phoretic and parasitic nematodes are attracted to carbon dioxide, a general cue given off by exhaling animals (Sciacca et al., 2002; Haas, 2003; Harbison et al., 2009). In some cases, vector seeking is temporally limited because only the dauer stage is attracted to carbon dioxide (Hallem et al., 2011). Some phoretic species even sense specific cues like pheromones or excreted compounds in vector waste (Haas, 2003; Hong and Sommer, 2006). More chemosensation and chemotaxis studies are needed to link the cues a nematode detects to the range of vectors it uses in the wild.
Phoronts also use structures or behaviors to maximize the chance of vector attachment (White et al., 2017). In parasitic species, this translates to an increased chance of infection after exposure to the host. Heritable intraspecific variation in such traits could allow selection to increase the likelihood of attachment and solidify the relationship between the phoront and vector (Durkin and Luong, 2018; Durkin et al., 2020). Unlike some taxa that require complex structures like hooks or claws (Bartlow and Agosta, 2021), most nematodes only need a film of water to stick to passing organisms and inanimate objects (Stiernagle, 2006). To ensure contact, dauers climb fungal stalks or other surface irregularities, hang on by their tails, and wave (Reed and Wallace, 1965; Croll and Matthews, 1977); this behavior (“nictation”) is hypothesized to reduce the dauer's contact with its substrate and improve attachment to passing vectors (Campbell and Gaugler, 1993). Lab studies show that in both non-parasitic and parasitic species, nictating individuals attach to vectors at significantly higher rates than non-nictating species or nictation-defective mutants of nictating species (Granzer and Haas, 1991; Campbell and Gaugler, 1993; Brown et al., 2011; Lee et al., 2012). Some parasitic species nictate faster when detecting carbon dioxide, presumably to increase the chance of embarkment when a host is nearby (Rogers and Sommerville, 1963; Granzer and Haas, 1991; Gang and Hallem, 2016). The nictation behavior itself may have been co-opted from the non-associating step, where a stand-and-wave technique would be useful for traversing wide pores in the soil matrix (Reed and Wallace, 1965; Griffin, 2012). Such behaviors are physically demanding. Normally, dauers that sit and wait for better environments retain their old cuticle to provide a double layer of desiccation resistance (Ellenby, 1968; Poinar, 1983). Phoretic species cast off their old cuticle immediately after the molt (Cassada and Russell, 1975; Kiontke and Sudhaus, 2006), possibly to facilitate the acrobatics of vector seeking, climbing, and nictation.
Once on the vector, the phoront must endure several environmental stressors, notably desiccation, starvation, and time. These same stressors affect parasitic species when they are searching for new hosts. Dauers live off stored nutrients for months (Klass and Hirsh, 1976; Wadsworth and Riddle, 1989), so they are not as food limited as other life stages would be. To limit desiccation, dauers seek out humid regions near the vector's core or around openings: on beetles, they line up under the elytra (Blaxter, 2003; Moser et al., 2005), and on isopods, in gaps between segments (Kiontke and Fitch, 2013). Though the association is physically intimate, phoronts are not thought to interfere with their vector's mobility.
When the vector reaches a new habitat, the phoront must disembark to resume development, feed, and reproduce. Disembarkment is less relevant to parasitism from a trait perspective and more relevant as an evolutionary constraint. Because phoronts resume feeding and reproduction only in new food patches, mutations that cause phoronts to avoid disembarking are quickly weeded out (Farish and Axtell, 1971; Houck and OConnor, 1991). The fitness of the phoront is aligned with the fitness of the vector during transit, stabilizing the species interaction and blocking exploitation as long as the phoront still needs to arrive at a new habitat for food and reproduction. If, however, the phoront finds a way to break the coupling between phoront and vector fitness, exploitation can occur.
STEP THREE: NECROMENY
Necromenic nematodes use a vector to disperse, as in phoresy, but add a feeding component: if the vector dies in transit, they may feed and develop on the microbes that proliferate on the cadaver (Crook, 2014). Necromeny thus builds on phoresy by selecting for traits that reinterpret the vector as a potential habitat, namely detecting vector health and tying development to those signals (Wirth, 2009). Parasitism can then evolve from this capacity for host detection and developmental control. Additionally, necromeny breaks the coupling between an organism's fitness and its vector's fitness, since the organism can survive to reproduce whether the vector lives or dies.
The transition from phoresy to necromeny can be facilitated or constrained by vector traits as well as phoront traits. While necromeny provides nematodes with an alternative food source and reproductive ground, it does not eliminate the need for further dispersal via another vector. In some vector species, death pheromones encourage conspecifics to aggregate around the cadaver or physically manipulate it, which can transfer necromenic organisms to a live vector, continuing the cycle (Wilson et al., 1958; Wirth, 2009; Yao et al., 2009). In contrast, some vector species are strongly repelled by conspecific death pheromones, which could bar the evolution of necromeny because the vector cadaver is effectively a dead end (Yao et al., 2009). Alternatively, avoidance of dead conspecifics could select for more generalist nematodes that rely on being picked up by other vector species.
Necromeny adaptations
Though necromeny has been noted for decades (Schulte, 1989), we know surprisingly little about it. Identifying necromenic species is challenging because many species appear necromenic in the laboratory but not in the wild (Blaxter, 2003; Félix et al., 2018). The context-specific nature of necromeny implies that the relevant changes lie in signaling and communication rather than in the capacity to use the resource. By this argument, necromeny selects for traits that reinterpret vector environments as sites of food and reproduction (Wirth, 2009). We focus on 4 major areas that could facilitate this shift (Fig. 3): detecting vector cues, detecting the vector microbiome, intraspecific communication between nematodes, and sensing the abiotic context.
Nematode-vector communication could contribute to necromenic behavior and prime the nematode for parasitism. In particular, nematodes may develop the ability to monitor vector health by detecting cues produced by sick or dead vectors. For instance, Caenorhabditis elegans (a suspected necromenic species) emerges from the dauer stage faster when detecting oleic acid (Kaul et al., 2014), a compound released by many arthropods after death (Wilson et al., 1958; Yao et al., 2009). Some cues are produced constitutively by healthy vectors, and their absence signals sickness or death. Pristionchus spp. are kept in the dauer stage by a constitutively produced beetle pheromone and resume development only once their vector dies and ceases pheromone production (Cinkornpumin et al., 2014). In theory, necromeny links nematode development to its vector. The linkage can strengthen over time, as demonstrated by the many parasitic species that cannot develop properly until sensing one or more host cues (Rogers and Sommerville, 1963).
Vectors are also defined by a surface microbiome during life and the necrobiome that proliferates after death (Schulte, 1989). The vector microbiome likely differs from the microbiome of the phoront's patchy habitat. Nematodes distinguish between bacteria with impressive resolution (Grewal and Wright, 1992), and dauers (at least in C. elegans) use this information when deciding whether to resume development (Bubrig et al., 2020). Non-necromenic species might confine development to environments with their preferred bacterial community and might not recognize vector bacteria as a legitimate food source. Shifts in food preference could arise from changes in odorant receptor genes, which are highly mutable (Quignon et al., 2005), or in their expression.
Nematodes regulate and coordinate development via the exchange of pheromones among conspecifics (Golden and Riddle, 1984; Ludewig et al., 2019). A nematode's dauer state is maintained by nearby dauers. Nematodes that develop to adulthood signal nearby dauers to resume development (Ludewig et al., 2019). Intraspecific communication can therefore reinforce habitat use by coordinating development as a group. The habitat shift seen in necromeny could result from adjusting the production of pheromone components, which are known to be highly genotype and environment specific in both free-living and parasitic species (Kaplan et al., 2011; Mayer and Sommer, 2011; Stasiuk et al., 2012; Diaz et al., 2014). Similar group coordination may increase infection success in parasitic lineages (Campbell et al., 1999).
Chemosensation in nematodes depends on the environmental context (Bargmann, 2006), so interpreting vector bacteria as “food” may require integration with cues from the preferred substrate. For instance, nematodes living on rotting fruit might scan for plant oils, sugars, or high moisture in addition to bacterial food signals. Integrating vector-specific environmental cues or lessening the importance of environmental cues could play a role in necromeny.
STEP FOUR: PARASITISM
Parasites differ from phoretic or necromenic species in that parasites feed and develop on living hosts. Necromeny may facilitate this exploitation by uncoupling vector and nematode fitness; the nematodes do not depend solely on their ancestral habitat. In some cases, parasites retain free-living stages that still feed in the environment, either facultatively, as in Pristionchus lheritieri (Geraert et al., 1989), or as part of an alternating life cycle, as in Strongyloides stercoralis (Yamada et al., 1991). In other cases, the facultative strategy may serve as a stepping stone to obligate parasitism and the abandonment of the ancestral habitat (Durkin and Luong, 2018, 2020; Luong and Mathot, 2019).
Parasitic nematodes use 2 main methods to feed on their host (direct feeding and indirect feeding), which suggests multiple ways a lineage could evolve host feeding. In indirect feeding, nematodes do not feed on host tissue. Instead, they weaponize their ancestral food source to kill the host. Heterorhabditis spp. harbor symbiotic bacteria that are harmless to them but highly pathogenic to their hosts. The bacteria kill the host and proliferate on the cadaver, after which the nematodes feed on the bacteria to grow and develop (Poinar, 1990). Traits of non-parasitic nematodes could facilitate the evolution of this strategy. Larvae developing into dauers plug both ends of their gut, which can trap bacteria inside (Riddle et al., 1981; Rae et al., 2008). If a trapped bacterial species is harmless to the nematode but pathogenic to the carrier species, dauers may inadvertently weaponize it when they expel the bacteria (Chantanao and Jensen, 1969). The relationship could be strengthened by selection for bacteria that are less pathogenic to the nematode and more pathogenic to the carrier species, more likely to survive the nematode gut, and more likely to reassociate with the nematode, for example if the bacterium regularly outcompetes other bacterial species (Ffrench-Constant et al., 2000).
In direct feeding, nematodes switch from their ancestral food source (e.g., bacteria) to the host's tissue, likely via digestive enzymes (Geary and Thompson, 2001; Poulin and Randhawa, 2015). Animal-parasitic nematodes co-opt endogenous digestive enzymes and secrete them into the environment rather than retain them internally (Blaxter, 2003). Plant-parasitic nematodes require additional enzymes like cellulases that are not endogenous (Danchin et al., 2010).
Regardless of the evolutionary pathway, some limited set of changes must facilitate basic host exploitation (Geary and Thompson, 2001). Once that threshold is crossed, newly evolved parasites encounter a new set of challenges like the host immune system, and many aspects of the ancestral lifestyle may no longer apply (Blaxter, 2003; Poulin and Randhawa, 2015). Over time, parasites might even target new tissues (Sukhdeo et al., 1997), jump to new hosts, or integrate new hosts into a complex life cycle (Sudhaus, 2018). The majority of parasitism genes we find in comparative genomic studies were acquired in the millions of years after the initial transition to parasitism, potentially disguising the core set of genes that enabled initial exploitation (Blaxter, 2003; Opperman et al., 2008; Kikuchi et al., 2011). The original genes may not even be present anymore, their functions having been transferred to and diluted among dozens of other genes (Viney, 2017). Therefore, the longer ago the transition to parasitism occurred, the harder it is to study. To address this, we can shift our attention from prototypical parasites to extant species that are facultatively parasitic (Luong and Mathot, 2019) or in transitional phases (Stevens et al., 2020). Such cases help us distinguish the changes critical for the transition to parasitism from subsequent adaptation.
CONCLUSION
The problem with models of parasitism evolution is that they cannot be tested directly short of experimentally evolving free-living species into parasites (Viney, 2017). We can, however, gain insight into the transition to parasitism by aggregating and interpreting data from related non-parasitic species. The process gives clues about when key parasitism traits likely arose and what their ancestral functions may have been. While this article focused on 1 nematode model, the approach can benefit any model, such as alternative dauer-based hypotheses in Nematoda (Sudhaus, 2018) or hypotheses in other taxa like mites (Houck and OConnor, 1991), provided the model's steps are well defined and we can find non-parasitic species that approximate them.
Our interpretations are only as good as the ecological data for non-parasitic species. In model nematodes, ecology and natural history have been vastly outpaced by genetics, genomics, and developmental biology (Kammenga et al., 2008; Félix and Braendle, 2010). This pattern is hardly a quirk of Nematoda (West et al., 1996; Whitfield, 1998; Proctor and Owens, 2000) and reflects a broader skew toward applied parasitology and away from evolutionary and ecological studies (Keymer and Read, 1990; Vickerman, 2009; Jackson, 2015). We can understand the evolution of parasitism only by paying closer attention to the basic ecology and natural history of parasitic species and their non-parasitic relatives.
ACKNOWLEDGMENTS
The authors would like to thank A. Gibson for helpful comments and guidance as well as M. Rockman for sharing unpublished results. LTB was supported by the Graduate Council Fellowship through the University of Alabama Graduate School. JLF was supported by the US National Science Foundation awards EF 1921585 and DEB 1941854.