Introduction

Many species are defended with toxins, unpleasant tastes, spines and/or other traits that make them unprofitable to consume by would-be predators (Ruxton et al. 2018). These secondary (post-contact) defences serve to protect the carrier, but they will not always be evident to the predator before capture. Consequently, many of these species have evolved conspicuous colour patterns which may serve as a salient signal to warn would-be predators to avoid them (Poulton 1890; Wallace 1889; White and Umbers 2021). Intriguingly, EB Poulton, who proposed the term “aposematism” to describe the combination of harmful defence and conspicuous colour patterns (Poulton 1890) also noted another important association that has so far received considerably less attention: “Great tenacity of life is usually possessed by animals with Warning Colours” (1908, pg. 316). In his seminal work on animal coloration, Cott (1940) similarly noted “Another characteristic shared by [aposematic] animals…is their savage disposition when attacked, their toughness, and their remarkable tenacity of life when injured.” (pg. 245). Likewise, in his influential review of animal defences Edmunds (1974) simply noted that “aposematic animals are also often tough” (pg. 63).

The relative difficulty of killing aposematic species, and in particular their physical robustness, has long been familiar to entomologists who regularly capture and pin specimens. The association is perhaps best known in Lepidoptera, which have given us many examples of warning signals and mimicry (Quicke 2017; Ruxton et al. 2018). A common technique for immobilising butterflies without damaging their wings is known as “pinching”, in which the insect’s thorax is held between the thumb and forefinger and squeezed until a soft “pop” is felt (e.g., Steppan 1996; Swynnerton 1926). Even before the recognition of aposematism as a defensive strategy, Trimen (1868) took notice when pinching the thorax of several species of African Danais and Acraea butterflies failed to have any obvious effect on the specimens, until a very high amount of force was used. Even after the butterflies’ apparent death, Trimen described, with surprise, how several of the seemingly dead specimens flew off “with perfect ease and apparent nonchalance” (Trimen 1868, pg. 499). Trimen credited the “tenacity for life” of these butterflies to their “remarkable elasticity”, and quickly made the connection to the putative unpalatability of these genera, proposing that naïve birds and other insectivores may sample the butterflies only to release them upon discovering their noxious taste, leaving the butterflies largely unharmed. Over the years many similar observations have been made by entomologists, all supporting the view that aposematic species are in general “tougher” than palatable, cryptic species (see Table S1, Supplementary Information). For example, Pinheiro and Campos (2013) noted “In contrast to palatable species (including Batesian mimics), which usually exhibit soft wings, aposematic butterflies have tough wings” (pg. 366).

Of course, the evaluation of pinching forces is clearly subjective and many of the above observations were simply made in passing. Some assertions are also not entirely independent, with authors simply echoing the beliefs of others. In this review, we have attempted to critically evaluate the available evidence for an association between the ability of species to survive being handled and whether they exhibit aposematism or crypsis. Note that warning signals can potentially evolve to indicate pre-capture defences such as high evasiveness, rather than harmful post-capture defences, such as spines or toxins (Ruxton et al. 2018) but it is the association of capture tolerance with post-capture (“secondary”) defences that we focus on here. If aposematic species are indeed able to tolerate capture better than alternative forms, then it raises two important questions. First, if warning signals function by preventing predation before physical contact, why have aposematic species evolved an ability to tolerate capture? Second, if aposematic species are so good at tolerating capture, why have warning signals evolved at all? We return to these questions at the end.

Successful predation involves the predator breaching not one but several stages of a prey's defences in an attack sequence which may include encounter, detection, identification, approach, subjugation and consumption (Endler 1991). Increasing attention is therefore being paid to understanding defences as inter-related suites of deterrents, with specific back-up defences being selected for when earlier defences fail (Britton et al. 2007; Kikuchi et al. 2023). Here we use this review to summarize the strength of evidence that warning signals, chemical defences and capture tolerance represent one such suite of defences (a syndrome), acting quasi-independently yet combining synergistically to enhance the survivorship of the prey (Winters et al. 2021). In so doing, we follow the lead of early researchers such as Carpenter (1929), who also viewed the three-way association between secondary defences, warning signals and capture tolerance as an integrated whole: “This resistance to injury is part and parcel of the process whereby an aposematic insect teaches an enemy that it is harmful or unpalatable. It almost invites attack, and if it is seized and handled, suffers little injury”.

We begin our review by justifying our use of the term “capture tolerance” to represent the ability of species to survive being seized and subjugated by a would-be predator. We then briefly describe the physical forces that prey may be subjected to when seized by their would-be predators, before describing the properties that would allow them to survive this treatment. Next, we describe the available evidence that aposematic species exhibit a greater degree of capture tolerance than cryptic palatable species, both from a morphological perspective and from an experimental perspective with real predators. We then consider why this association might arise before discussing its implications and highlighting areas for future work. To improve the focus of our review we have limited our examples to insects, although we note at the outset that a three-way association may arise in other taxonomic groups for similar reasons. For example, Cott (1940) highlighted the toughness of aposematic mammals, suggesting that it would be a selective advantage for a skunk, zorilla or porcupine to be constructed in such a way that it could survive any “ill-considered” blows before its defences are revealed.

Terminology

Here we use the term “capture tolerance” to reflect a species’ ability to survive being seized by a predator with little to no permanent damage, allowing it to go on to reproduce. Capture tolerance can, of course, be achieved in several ways, depending on the nature of the stresses imposed by the would-be predator. These stresses are generally some combination of crushing (compression), stretching (tensile) and tearing (shear) forces (Miller et al. 2009). While some species appear to have evolved weakened structures that break off under initial seizure, these traits appear to have evolved to avoid successful capture, rather than to tolerate capture per se, so we only consider them briefly here.

When discussing capture tolerance, it is helpful to have a specific predator in mind. Birds, spiders, other insects, fish, frogs, lizards, and mammals (amongst others) are all known to attack insects and use different strategies to do so (Sugiura 2020). Naturally, the body part that is first seized by a would-be predator will vary with the species of prey and predator. Butterflies for example are often seized by the wings (Kassarov 1999; Wourms and Wasserman 1985) while flies (order Diptera) have small wings and are typically seized by the abdomen or thorax. Some inter-related measurable traits of relevance to capture tolerance in insects therefore include stiffness (resistance to non-permanent deformation), hardness (resistance to permanent deformation), elasticity (ability to return to its original shape after the removal of the deforming force), resilience (ability to store strain energy via elastic deformation without breaking), strength (force per unit area required to initiate a crack) and toughness (energy per unit area required to propagate a crack), although the precise meaning of these terms varies from field to field.

A standard measure of the stress applied to a prey item by a would-be predator is its bite force, typically measured in newtons (N). Arguably of more relevance is the pressure the predator applies, i.e. the force per unit area over which it is distributed (typically measured in pascals, i.e. N/m²). Using pressure may be particularly informative when comparing the destructive capability of predators with different jaw sizes and shapes (being stepped on by a stiletto can do more damage than a flat-soled shoe). However, due to limitations in the instruments used to measure forces (e.g. transducer plates), the difficulty of quantifying bite area, and the desire for a standardized measure, it is generally the raw forces that are reported, rather than pressures (see Lederer 1975 for a notable exception).

Physical tests of the capture tolerance of aposematic and cryptic insects

Aposematic velvet ants (Hymenoptera: Mutillidae) are the models of several well-known mimicry rings. In their exploration of the defences of these species, Schmidt and Blum (1977) experimentally evaluated the hardness of several insect species by crushing the thoraxes of dry specimens using a Hanson Cook-O-Meter® weighing scale and a bar. The velvet ant, Dasymutilla occidentalis, which is protected by a powerful sting with algogenic venom, was able to withstand the highest absolute crushing force (27.8 N) before “skeletal collapse”, and more force relative to its mass (247 N/g) than any other species assayed. For context, Corbin et al. (2015) measured compressive (“bite”) forces of North American insectivorous birds and found them to range from 0.68 N (approximately 1.28 N/cm² given bill dimensions) in the yellow-rumped warbler (Dendroica coronata) to 6.31 N (approximately 2.14 N/cm² given bill dimensions) in blue jays (Cyanocitta cristata). Likewise, Aguirre et al. (2003) determined the bite forces of bats, which range from 1.27 N in the black myotis (Myotis nigricans) to 68 N in the greater spear-nosed bat (Phyllostomus hastatus), which feeds preferentially on beetles.
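To make the force-to-pressure conversion concrete, the implied contact areas can be back-calculated from the values reported above (a rough illustration of our own; these areas are inferred, not measured):

\[
P = \frac{F}{A} \;\Rightarrow\; A \approx \frac{F}{P}, \qquad
A_{\text{blue jay}} \approx \frac{6.31\ \text{N}}{2.14\ \text{N/cm}^2} \approx 2.9\ \text{cm}^2, \qquad
A_{\text{warbler}} \approx \frac{0.68\ \text{N}}{1.28\ \text{N/cm}^2} \approx 0.5\ \text{cm}^2.
\]

In other words, the blue jay's roughly ninefold greater bite force corresponds to less than a twofold difference in pressure, because that force is spread over a much larger area of bill.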

In perhaps one of the most comprehensive datasets assembled to date, Wang et al. (2018a, b) used the dissected jaws of a lizard predator (Japalura swinhonis) embedded in resin as the crushing tool to quantify the “hardness” of a number of different insect species, measured as the maximum force at exoskeleton failure. As with Schmidt and Blum (1977), their work focused on evaluating the hardness of a conspicuously coloured (and potentially aposematic) species, the weevil Pachyrhynchus sarcitis, but in so doing they compared the physical properties of this weevil species with a diverse selection of sympatric insects, including other Coleoptera and insects in the orders Odonata, Orthoptera, Hemiptera, Blattodea, Neuroptera, and Hymenoptera. Although aposematic and non-aposematic taxa were not explicitly compared, it was noted that the mature aposematic form of P. sarcitis took more force to crush on average (20.1–40.9 N) than any of the other sympatric weevils (Curculionidae) in the same habitat, the majority of which appear to have been cryptic.

Another set of studies (DeVries 2002, 2003) was specifically interested in evaluating the tensile wing “toughness” of butterflies that varied in palatability to predators. In his 2002 paper, DeVries tested five species of sympatric nymphalid butterflies: two species considered palatable (Junonia terea and Bicyclus safitza) and three unpalatable (Amauris niavius, Acraea insignis and Acraea johnstoni), measuring the “toughness” of their wings by attaching a small metal electrical clip to the hindwing margin and adding weight until the clip ripped free. He found that the wing “toughness” varied significantly among species, with the wings of the unpalatable species taking more weight to rip compared to those of palatable species. Even the unpalatable species varied in their wing toughness, leading DeVries to suggest that this variation may be due in part to inter-specific variation in their unpalatability, with more unpalatable species having “tougher” wings. To further test this idea, DeVries (2003) evaluated the rip-resistance of the wings of three different nymphalid species in the same manner. The butterfly Amauris albimaculata (Danainae) was considered an aposematic (unpalatable and conspicuous) model for the putatively palatable Batesian mimic Pseudacraea lucretia (Nymphalinae), while the butterfly Cymothoe herminia (Nymphalinae) was chosen as a closely-related palatable non-mimetic species. Intriguingly, DeVries (2003) again found that the aposematic species had the highest tear weight, the cryptic species the lowest, and the “putative” Batesian mimic a tear weight which was intermediate between the two.

It is worth noting that the species tested in the above experiments varied significantly in wing length, with the aposematic species being the largest and the cryptic species being the smallest, making size a potentially confounding variable, especially given the fixed size of the clip (Kassarov 2004). Moreover, while DeVries (2002, 2003) broke new ground in attempting to objectively compare the physical attributes of butterflies with different defences, the relationship between the weight required to rip the wing and the ability of an insect to survive capture is not straightforward. In particular, one might argue that if a wing tears more easily in the beak of a bird, then that butterfly is more likely to escape. So, the differences uncovered by DeVries (2002, 2003) may have been driven more by selection on palatable species to avoid capture than by selection on unpalatable species’ wings to resist tearing following capture (Kassarov 2004).

To help understand why the wings of different butterfly species vary in their ease of tearing, Hill and Vaca (2004) compared the critical hindwing tear weight of three Pierella species of butterfly (Satyrinae), all of which were considered palatable to birds but only one of which exhibits “deflective” patches. As predicted, the species with a conspicuous white hindwing patch (P. astyoche) had significantly lower tear weights than the two species lacking this patch (P. lamia and P. lena). In this way, the weakness of a butterfly wing (in terms of its ease of ripping) may be a strength. The counter-intuitive nature of inferences drawn from beak marks is reminiscent of the classic work by Abraham Wald on “survival bias” in WWII bombers (Mangel and Samaniego 1984; Wald 1943), which concluded that reinforcing armour should go where bullet holes are not observed, rather than where they are.

Surviving sampling by predators

The properties of insects that confer resistance to physical forces are of interest here primarily because they suggest an ability to survive capture. However, some of the studies described above have been complemented by trials with real predators. As noted earlier, many aposematic velvet ants (Hymenoptera: Mutillidae) are extremely hard to crush (Schmidt and Blum 1977). Schmidt and Blum (1977) offered the species Dasymutilla occidentalis to a variety of predators. Out of 122 trials, only two specimens were killed, one by a gerbil and one by a tarantula. Gall et al. (2018) subsequently staged interactions between seven species of velvet ant and ten different species of potential predators with similar results. Likewise, Wang et al. (2018a) also reported that, in behavioural trials, all 40 of the live, hard (mature) specimens of the potentially aposematic weevil Pachyrhynchus sarcitis were immediately spat out after being bitten by tree lizards Japalura swinhonis (Agamidae), in each case surviving the attacks.

Many other studies have observed that chemically defended insect prey frequently survive being captured by predators. For example, Järvi et al. (1981) presented wild-caught great tits, Parus major, with sequential choices of a piece of mealworm and an unpalatable swallowtail larva, Papilio machaon. The birds generally attacked the swallowtail larva at the first opportunity but soon learned to avoid attacking them after experiencing their distastefulness. Importantly, all the swallowtail larvae survived these initial attacks, whereas most of the mealworms were consumed. In an even more ambitious experiment, Wiklund and Järvi (1982) evaluated the response of hand-reared Japanese quail (Coturnix coturnix), starlings (Sturnus vulgaris), great tits (Parus major) and blue tits (Parus caeruleus) to five species of aposematic insect (swallowtail butterfly larvae Papilio machaon, large white butterfly larvae Pieris brassicae, burnet moths Zygaena filipendulae, ladybirds Coccinella septempunctata and firebugs Pyrrhocoris apterus) and found that all of the predators frequently released these distasteful prey unharmed after capture. Ohara et al. (1993) presented naïve domestic chicks (Gallus gallus) with green larvae of the butterfly Pieris rapae and black larvae of the sawfly Athalia rosae (conspicuous against green). They found that the number of P. rapae attacked by the chicks increased over successive trials, with 60/80 being eaten in the third trial. By contrast, although A. rosae larvae were pecked, they were seldom killed or injured, with only 1/80 being consumed in the third trial. They attributed the high survival of A. rosae to the elasticity of their bodies, and to the fact that their integument has such low mechanical resistance that even a slight peck causes the release of repellent haemolymph (“easy bleeding”, Boevé and Schaffner 2003).

While the above studies were laboratory based, field experiments have reported similar results. In a comprehensive series of studies in the neotropics, Chai (1996) found that naïve young jacamars readily attacked butterflies of any colour pattern. However, certain butterfly species were released after capture, and most survived this sampling. Chai (1996) put this high survivorship down to the fact that many of the unacceptable butterflies had a “tough” and flexible thorax. Thereafter, these unacceptable species tended to be sight rejected. By contrast, other species, notably erratically flying cryptic species, were frequently consumed when caught, creating a bimodal acceptability curve.

Rather than observe the reactions of hand-reared predators, Pinheiro and Campos (2019) observed the responses of wild adult jacamars to butterflies in the field. While the majority of aposematic butterflies (e.g. species of Heliconius and Parides) were sight rejected, all of the aposematic butterflies that were caught by the jacamars were released alive and without apparent injury. By contrast, cryptic butterflies (Hamadryas and Satyrini species) did not elicit sight rejections, and large numbers were successfully attacked and consumed.

Arthropod predators have also been known to release chemically defended butterfly species unharmed after capture. For example, Vasconcellos-Neto and Lewinsohn (1984) released live specimens of 27 butterfly species into the webs of the neotropical forest spider Nephila clavipes. Almost all prey items were initially touched by the spiders with their pedipalps and foreleg tarsi, allowing the spiders to feel and gently taste them. The vast majority of chemically defended ithomiine butterflies remained motionless in the web and were subsequently disentangled and released by the spiders. Curiously, Heliconius butterflies tended to struggle in the web and most specimens thrown into the webs were eaten. Despite the fact that heliconids are unpalatable to birds (Chouteau et al. 2019), one reason for this difference may be that the cyanogenic glycosides found in heliconids are not as repellent to other arthropods as the pyrrolizidine alkaloids found in ithomiines.

If the association between aposematism and capture tolerance exists, why might it arise?

So far, we have highlighted the widespread impression among entomologists that aposematic insect species tend to have relatively high capture tolerance compared to cryptic insect species. While a variety of observations support this view, much of the evidence remains anecdotal in that the studies were not specifically designed to compare the capture tolerance of aposematic and non-aposematic species. Clearly, much more remains to be done, including evaluating the capture tolerance of a broader range of insect species under standard conditions (see section “Future directions”). However, given the long-held belief that there is an association between aposematism and capture tolerance, it is natural to wonder why it might have evolved.

The evolution of a harmful secondary defence such as a toxin and associated warning signals (aposematism) has itself long been considered a puzzle (Caro 2023; Mappes et al. 2005; Ruxton et al. 2018). This is because the first conspicuous mutant of a chemically defended but cryptic species would face a “double whammy” of standing out, and yet not being recognized as defended. Faced with this problem, Fisher (1930) argued that aposematism might more readily evolve in species that are spatially or temporally aggregated, so that the trait may spread if it protects other individuals that have also inherited the chemical defence and associated signal. However, an ability to survive capture suggests that this form of selection (later known as “kin selection”, Maynard Smith 1964) is not the only solution, or even the most likely one. Fisher (1930) was also aware of this alternative possibility: “Professor Poulton has informed me that distasteful and warningly coloured insects, even butterflies, have such tough and flexible bodies that they can survive experimental tasting without serious injury. Without the weight of his authority I should not have dared to suppose that distasteful flavours in the body fluids could not have been evolved by the differential survival of individuals in such an ordeal”.

Unfortunately, Poulton’s observations referred to by Fisher still leave us with a conundrum, because a complete explanation of aposematism evolving by individual selection should also explain the high capture tolerance of aposematic insects. Indeed, as Endler (1991) noted, observing aposematic species that can survive attack “begs the question of how toughness evolves”. Here we consider various possibilities, with capture tolerance either at the start or end of a sequence of selection, or the product of independent selection (see Fig. 1a–d).

Fig. 1

A range of plausible evolutionary pathways by which an association between secondary defences, warning signals and capture tolerance might evolve. Naturally, outside mimicry, warning signals will tend to evolve only after the possession of a suitable secondary defence. While it seems reasonable to assume that the possession of a secondary defence will tend to select for a body that is sufficiently robust to survive for long enough to reveal the defence, there are several routes by which the association might arise

Capture tolerance generates selection for aposematism

One solution as to how the association arises is that high capture tolerance is an initially incidental trait that simply allows aposematism to evolve (Fig. 1a). As we have seen, many insect species are able to tolerate capture, and if this resilience arises for other reasons independent of anti-predator defence, then it may nevertheless set the stage for the evolution of defences and associated warning signals. The abiotic environment of an insect species inevitably imposes important selection pressures on the species’ physical properties, and hence (incidentally) its ability to tolerate capture by would-be predators. For example, large beetles living in arid environments have been found to be generally more resistant to crushing than those living in temperate areas, which may be a consequence, at least in part, of selection to avoid desiccation (Fisher and Dickman 1993a, b). Likewise, insects that frequently come into contact with abrasive substrates, such as those which burrow underground, must be able to resist excessive friction damage (Sun et al. 2008). The biotic environment, ranging from selection imposed by potential ectoparasites to intra-specific conflict over resources, may also select for body forms that happen to allow the species to survive capture by predators (Gullan and Cranston 2014). Once capture tolerance has arisen then, as Evans (1987) argued, it may allow “tough, harmless cryptics” to “evolve into tough, nasty aposematics” through individual selection, since it facilitates the evolution of secondary defences and associated warning signals.

Of course, capture tolerance may not be incidental at all, but represent an integral part of the insect’s secondary defence, making the prey unprofitable to consume from an energetic perspective (Cyriac and Kodandaramaiah 2019). For example, while it is tempting to consider conspicuous bumblebees as signalling their propensity to sting a would-be predator, the tough chitin and hairy bodies of these insects may be an even more important factor rendering them unprofitable (Gilbert 2005). In an ambitious series of aviary experiments, Mostler (1935) observed that bumblebees stung their would-be predators relatively infrequently and instead proposed that birds learned to avoid them due to their difficulty of handling. As Mostler (1935) reports, one bird (a whitethroat, Sylvia communis) took 18 min to beat, dismember and consume a bumblebee, after which it was exhausted. Similarly, Swynnerton (1926) detailed numerous accounts of hungry birds repeatedly failing to capture or consume live Charaxes butterflies and attributed this to the butterflies’ “hard” bodies and the violence with which they struggled upon being seized. Charaxes are generally considered palatable (Batesian) mimics of chemically defended models. However, if their “tough” integument and violent escape behaviours make them sufficiently unprofitable, then they could potentially be considered Müllerian mimics (Sherratt 2008), having evolved the same warning signal as other unprofitable species.

In short, there are multiple ways in which warning signals could evolve after capture tolerance, including the trivial solution that capture tolerance itself is the signalled defence. In this instance, rather than toughness being a “third component” of aposematism, it could just be one of several secondary defences contributing to unprofitability.

A secondary defence selects for capture tolerance and warning signals

If a species has evolved a form of post-capture defence (such as unpleasant chemicals derived from its host plant, or a painful sting) then it is likely that there will be strong selection on its capture tolerance, allowing the prey to survive long enough for its defence to be revealed (Fig. 1b). Many defended species secrete chemical compounds and odours on handling (Whitman et al. 1985); ladybirds, for example, release pyrazine odours when disturbed (Marples et al. 1994). Armed with a secondary defence, any preview of what is in store if subjugation continues should therefore elicit a quick release. Indeed, it appears that harmful deterrents are strategically allocated within the bodies of aposematic prey in a manner that rapidly teaches predators to avoid them and/or enhances prey survivorship (Brower and Glazier 1975). Evans et al. (1986) presented individuals of an aposematic (red and black) bug Caenocoris nerii (Hemiptera: Lygaeidae) that had been fed on oleander seeds (which contain cardiac glycosides) or sunflower seeds (cardenolide-free) to naïve common quail (Coturnix coturnix). Like many aposematic bugs, C. nerii has a rigid exoskeleton (Evans et al. 1986) and consequently about 85% of all adults survived the quails' attacks. However, the oleander seed-fed bugs were even less likely to be killed on attack, suggesting that the defence plays some role in facilitating survivorship.

Aposematism generates selection for capture tolerance

Aposematic species are not only highly conspicuous, but also tend to exhibit slow and predictable movement, rendering them relatively easy to catch (Dowdy and Conner 2016; Hatle and Faragher 1998; Marden and Chai 1991; Sherratt et al. 2004; Srygley 1994). Since many such species are initially attacked by naïve predators before they are consistently sight-rejected (Boyden 1976; Chai 1986), it stands to reason that the higher vulnerability of aposematic prey (at least initially) may further select for improved post-capture tolerance (Fig. 1c).

One might wonder why palatable species lacking a significant secondary defence have not likewise co-evolved a form of capture tolerance when their primary defence, notably crypsis or high evasiveness, fails. One reason may be that such species are so rarely caught (either because they are not detected, or because they are hard to catch) that there is weaker selection for a post-capture defence. However, another complementary reason is that palatable prey lacking a secondary defence have no means to repel the predator following capture, so may be less able to prevent their consumption.

To formalize the above set of arguments, we have modified the well-known model of Engen et al. (1986) (see Supplementary Information). Engen et al. (1986) assumed for convenience that the probability of a prey being eaten when seized (\(p_e\)) was a constant, reflecting the distastefulness of the prey (a small value of \(p_e\) corresponding to high distastefulness). Here we instead assume that prey can invest an amount r in their ability to survive capture, with the cost of this investment measured as a reduction in fecundity at the end of the season. The return on this investment is non-linear, with a maximum possible (high-investment) survival probability after being seized of \(m_U\) for unprofitable prey (i.e. with a secondary defence) and \(m_P\) for profitable prey (i.e. without a secondary defence), where \(m_P \le m_U\).

Figures 2a, b show the fitness of prey with and without a secondary defence for different mean numbers of encounters (\(\lambda\)) over their lifetime (reflecting their degree of crypsis) and different levels of investment in the ability to survive capture (r). The optimal investment in robustness that maximizes each prey type’s fitness under these conditions is highlighted in red. The key insights of this model are: (1) when prey are rarely encountered (\(\lambda\) low), the optimal level of investment in robustness is 0 for both prey types, since prey will not be selected to pay a cost for a trait that is rarely needed; (2) when unprofitable prey can honestly indicate their defences (maximal survivorship following capture, \(m_U > m_P\)), unprofitable prey get more “bang for their buck” and consequently face stronger selection to become robust; and (3) even profitable prey will ultimately face selection for a degree of robustness, but when \(m_P\) is low this is only worthwhile at high predator encounter rates.
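The full model is given in the Supplementary Information; purely for orientation, one simple fitness function with this qualitative structure (an illustrative stand-in of our own, not necessarily the published formulation) is

\[
W(r) = (1 - kr)\,\exp\!\left[-\lambda\,\bigl(1 - s(r)\bigr)\right], \qquad s(r) = m\left(1 - e^{-br}\right),
\]

where \(s(r)\) is the probability of surviving a seizure (saturating at \(m\), equal to \(m_U\) or \(m_P\)), \(1 - kr\) is the fecundity remaining after investing r, and the exponential term is the probability of surviving a Poisson-distributed number of encounters with mean \(\lambda\), each of which is fatal with probability \(1 - s(r)\).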

Fig. 2

a, b How conspicuousness (measured by the mean number of encounters of a prey type with a predator before reproduction, \(\lambda\)) and the investment in robustness after capture (r) combine to influence the fitness of (a) a profitable (without secondary defence) and (b) an unprofitable (with secondary defence) prey type. Fitness is measured by the number of surviving offspring at the end of the season. The optimal investment in robustness that maximizes the prey type’s fitness is shown in red. The model generating these figures closely follows Engen et al. (1986) but treats robustness as a trait that can be invested in, with parameters k = 0.1, b = 10, \(m_P\) = 0.2, \(m_U\) = 0.8 (see Supplementary Information for more details). When prey are rarely encountered (\(\lambda\) low) the optimal level of investment in robustness is negligible, because it is rarely needed. As the conspicuousness of the prey type rises, unprofitable prey face selection to become more robust, especially if they are able to gain release by displaying their secondary defence (\(m_U > m_P\)). Even profitable prey will ultimately face selection for a degree of robustness, but only when they are highly conspicuous
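For readers who wish to explore this trade-off numerically, the short script below locates the optimal investment by a grid search, using the illustrative fitness function sketched above and the parameter values listed in the caption (k = 0.1, b = 10, \(m_P\) = 0.2, \(m_U\) = 0.8). The functional forms are our own assumptions and will not reproduce Fig. 2 exactly; the published model is in the Supplementary Information.

```python
import numpy as np

# Illustrative sketch only: the saturating survival function and linear
# fecundity cost below are assumed forms, not the published model.

def capture_survival(r, m, b=10.0):
    """Probability of surviving a single seizure; saturates at m as investment r grows."""
    return m * (1.0 - np.exp(-b * r))

def fitness(r, lam, m, k=0.1, b=10.0):
    """End-of-season fitness: fecundity (1 - k*r) times the probability of
    surviving a Poisson(lam) number of encounters, each fatal with
    probability 1 - capture_survival(r)."""
    s = capture_survival(r, m, b)
    return (1.0 - k * r) * np.exp(-lam * (1.0 - s))

r_grid = np.linspace(0.0, 1.0, 1001)  # candidate investments in robustness
for lam in (0.01, 0.1, 1.0, 10.0):    # mean encounter rates (conspicuousness)
    for label, m in (("profitable (m_P = 0.2)", 0.2), ("unprofitable (m_U = 0.8)", 0.8)):
        w = fitness(r_grid, lam, m)
        r_star = r_grid[np.argmax(w)]
        print(f"lambda = {lam:5.2f}, {label}: optimal r = {r_star:.2f}")
```

With these assumed forms the qualitative pattern matches the insights above: optimal investment is zero at very low encounter rates, and the unprofitable type (with the higher ceiling \(m_U\)) is always selected to invest at least as much in robustness as the profitable type.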

One intriguing possibility not covered in the above model is that warning signals forewarn a would-be predator that the prey item is defended, thereby motivating the predator to handle it more carefully. In this way, a prey type’s capture tolerance is not entirely independent of its signal. For example, Sillén-Tullberg (1985) found that great tits tended to taste-reject both red and grey forms of an unpalatable bug (Lygaeus equestris) after attacking them, but that the red form was more likely to survive this sampling, possibly as a consequence of more cautious sampling. Likewise, Yamazaki et al. (2020) recently deployed artificial prey in the field with two levels of crypsis (cryptic/conspicuous) and two levels of defence (palatable/unpalatable). They found that while the aposematic prey and cryptic unpalatable prey were attacked at similar rates, the aposematic prey exhibited a higher level of relative partial predation, leading them to suggest that predators attack aposematic prey with less aggression than other prey types (see also Carroll and Sherratt 2013).

The above studies support the contention that instead of instructing would-be predators to avoid them entirely, warning signals might serve as “go-slow” signals (Guilford 1994), causing would-be predators to be more attentive to the possibility that a prey item is harmful to attack. If warning signals cause predators to “handle with care,” then it seems reasonable to suggest that this forewarning not only reduces the handling cost for the predator should the prey turn out to be unprofitable (so that the predator is not stung, or smeared with a noxious compound, for example), but also increases the survivorship of the signalling prey when it is handled, allowing further selection to act on capture tolerance.

Co-evolutionary pathways

Of course, we do not necessarily need a linear sequence of events for the three-way association between secondary defence, warning signals and capture tolerance to arise. In particular, selection for both secondary defences and capture tolerance may be mediated by another aspect of the species’ life-history (Fig. 1d). For example, chemically defended insects tend to be relatively large (Pasteels et al. 1983), possibly because it is harder for these species to evolve effective crypsis (Prudic et al. 2007; Remmel and Tammaru 2009). However, larger insects also tend to be more resistant to compressive forces than smaller insects (Fisher and Dickman 1993a; Freeman and Lemen 2007; Herrel et al. 2001) at least in part due to the structural needs of a larger body (Williams et al. 2012). So, at least some associations between capture tolerance and aposematism may be simply driven by a “confounder”.

Summary

There are many plausible reasons why aposematic species might tend to survive capture more often than non-aposematic species. For example, physical “toughness” could facilitate aposematism. Alternatively, aposematism may facilitate selection for toughness. However, as we have stressed, physical toughness is not equivalent to capture tolerance. Indeed, the simplest explanation for the association is that predators simply let go of aposematic prey once they discover their secondary defences. Alternatively, or in addition, a warning signal could motivate predators to capture and handle aposematic insects more cautiously, which again does not require them to be “tougher”.

Implications of the association

Being able to survive for long enough to indicate the possession of secondary defences during capture has important implications for the evolution of anti-predator traits. Most notably, if conspicuous mutants with secondary defences can survive capture, then aposematism can readily evolve via individual selection (Fisher 1930; Wiklund and Järvi 1982). However, there are also other implications: following physical evaluation, if those variants displaying a harmful defence are more likely to be subsequently released unharmed, then automimics (variants that do not carry the defence) and Batesian (interspecific) mimics will tend to be selected against, or at least not gain quite as much. The whole process, analogous to checking for forgeries, would not be possible if all prey were simply killed on capture as is widely assumed in models of mimicry evolution (e.g. Aubier and Sherratt 2015; Speed 1993).

Naturally, if unpalatable or otherwise protected species tend to have high capture tolerance because of their mechanical properties (coupled with a preview of the defences themselves) then it is possible that a predator might use tactile stimuli to recognize these prey and release them. Birds are known to have various types of sensorimotor neurons in their beaks, jaws, and oral cavities which allow them to feel and manipulate the objects and food items they grasp (Kuenzel 1989; Schneider et al. 2014; Soliman and Madkour 2017). If prey with harmful secondary defences are sometimes rejected by predators simply on the basis of their “feel” then these tactile cues may be open to exploitation or reinforcement by other species (Batesian or Müllerian mimicry respectively). Tactile cues are perhaps the least-explored sensory component of prey evaluation and are likely to be of particular importance for nocturnal predators and those living in visually restricted environments. While it has long been recognised that predators can make exploratory contact with a prey item without consuming it, this has been largely evaluated in the context of gustatory taste-rejection and is rarely considered to be part of tactile exploration per se, although this possibility has been raised in several papers (DeVries 2003; Hogan-Warburg and Hogan 1981; Kassarov 1999; Swynnerton 1926).

In a comprehensive review, Gilbert (2005) drew attention to the wide variation in the extent to which palatable hoverfly species (Diptera: Syrphidae) resemble their stinging hymenopteran models. Some hoverfly species (the “high fidelity” mimics) appear to match their hymenopteran models very closely, not just in appearance but also in behaviour, whereas other mimetic species only bear a crude resemblance. Gilbert (2005) proposed that the high-fidelity mimics were on average “harder” and “more durable” than non-mimetic dipterans and low-fidelity hoverfly mimics, citing rounded, emarginate abdomens, a punctate cuticle, and strong joints between overlapping tergites as structural traits that are especially common in high-fidelity mimics. Naturally, being harder and more durable would not only allow the flies to withstand the compressive and shear forces of their would-be predators, but also bring them closer to resembling Hymenoptera in texture (Matthews and Matthews 2010). Gilbert’s (2005) conjecture appears entirely plausible given that some of the higher fidelity hoverfly mimics also engage in mock-stinging (Penney et al. 2014), a behaviour that is only likely to be exhibited post capture. Unfortunately, however, the above proposals remain largely untested.

In his review of mimicry in insects, Rettenmeyer (1970) included mimics in his list of species that have resilient bodies: “…aposematic insects, models, and mimics often have harder, more durable bodies than other insects”. Rothschild (1971) argued the opposite, however: “…unlike the model, with its tough cuticle and other disagreeable qualities, the mimic cannot afford to be examined at close quarters”, and we have already quoted Pinheiro and Campos (2013), who suggested that palatable mimetic butterflies tend to have soft wings. Indeed, the robustness of the insect has occasionally been used by entomologists to distinguish models from their mimics, suggesting that, if mimics do ever evolve to resemble their models in structural texture, then the resemblance is not always close. In an 1883 letter, referenced by both Trimen and Bowker (1887) and Haase (1896), Bowker commented on two species of South African butterflies which he could only distinguish once he had pinched their thoraxes. Reportedly, the Batesian mimics of the genus Pseudacraea were “brittle” and would die at once, while the aposematic Acraea were “leathery” and could be squeezed “as long and as hard as you like without effect”. Thus, while there are some suggestions in the literature that certain Batesian mimics may have higher capture tolerance than non-mimetic species, and plausible explanations as to why this might arise, the best that can be said is that it remains uncertain whether predators use tactile cues to discriminate prey and hence whether there is selection on mimics to evolve tactile properties that are similar to those of their models.

Future directions

Blest (1963) noted the “familiar fact” that aposematic insects are generally tougher and more heavily sclerotized than their non-aposematic forms. He also referred to the widespread belief that this would allow them to survive sampling by inexperienced predators. However, he also cautioned that “These effects do not seem to have been searched for critically, and their confirmation would be of some interest”. It is fair to say that little has changed in the time since Blest’s cautionary note, in that there remains a dearth of systematic experiments to quantitatively compare the capture tolerance of insects.

As noted in section “If the association between aposematism and capture tolerance exists, why might it arise?”, the majority of studies to date have simply highlighted the remarkably high tolerance of certain aposematic species, but the flimsiness of species that lack post-contact defences may have been considered insufficiently newsworthy to merit attention. One might wonder whether non-aposematic prey, when subjected to the same duration of handling as a similar aposematic prey, would have a lower survival rate. Clearly the first avenue of future research should be aimed at broadening our pool of data, building on comparative studies such as Wang et al. (2018b) and Herrel et al. (2001), in which the capture tolerance of a wide variety of species is quantified. As we have seen, objectively quantifying an insect species’ capture tolerance—whether by physical assay or with real predators—is challenging, because it can be achieved through a variety of physical traits including hardness and elasticity, and because different predators may exert different compressive, tensile and shear forces. Standardization of the test procedures would be desirable, but the most important criterion for these tests is that they should be replicable and conducted in a manner that is relevant to the organism’s natural predators (Kassarov 2004). This is not a small undertaking; there are hundreds of examples of aposematic species in the literature, of which only a scant few have been examined in the context of their capture tolerance.

When comparing capture tolerance among insects, phylogenetic relationships are important to consider, as certain groups of organisms, such as beetles, tend to be more resistant to deformation than others, such as grasshoppers, with members of each group sharing physical properties in part because of their phylogenetic relatedness. As more comparative data are assembled, they can be used to address a number of questions which have received some provisional early answers (e.g. DeVries 2002, 2003; Gilbert 2005) but clearly require more work. For example, controlling for phylogeny and important confounders such as body size, do species with significant secondary defences tend to have higher capture tolerance than closely related species which lack these defences, as has been so widely conjectured? Likewise, are aposematic species that signal their evasiveness (for example, butterflies of the genus Morpho, Young 1971) less able to survive capture than related aposematic species that signal a secondary defence such as unpalatability? Are Batesian mimics generally more resistant to handling than comparable non-mimics? A phylogenetic analysis of the potential evolutionary transitions between species with different combinations of capture tolerance, secondary defences and warning signals may help reveal the order in which these traits have evolved (e.g. Loeffler-Henry et al. 2023), but until we have the baseline data, the temporal sequence will remain conjecture.

We began our review by asking: if warning signals work so well, why have aposematic insects evolved an ability to tolerate capture? To answer this question, we need a change in mindset: when researchers investigate animal defences, it is natural to highlight conditions under which they work, rather than fail. However, we have argued that when warning signals fail to prevent pursuit, the possession of the signalled defence will facilitate selection on the prey to survive for long enough for its defence to be shown. We also asked: if aposematic species are so good at tolerating capture, why have warning signals evolved at all? Once again, it is all too easy to fall into the trap of believing an anti-predator strategy works perfectly: despite a degree of capture tolerance, if some individuals die or get injured during the handling process, then there will be continued selection on signalling to reduce the rate at which they are pursued following detection. In this way, a secondary defence may facilitate the evolution of both a warning signal (“plan A”) to deter pursuit, and an ability to survive capture (“plan B”) when plan A fails.

Is capture tolerance a neglected third component of aposematism? We hope we have convinced readers that capture tolerance has largely been neglected in the field of anti-predator defence. However, despite numerous anecdotal remarks, papers highlighting the exceptional robustness of certain aposematic species, and many plausible reasons why capture tolerance should be associated with aposematism, we cannot yet be confident that it is a third component of aposematism. Of course, a high degree of capture tolerance in aposematic species will come “for free” if the signalled defence is one of toughness, or if the warning signals motivate the predator to attack cautiously. More generally, however, we need more experimental work evaluating the robustness of non-aposematic species to fully characterize the nature of the association, while being careful to distinguish physical traits that enhance capture tolerance from those that simply prevent capture.