Analysis of the survival time of the SIRS process via expansion

We study the SIRS process, a continuous-time Markov chain modeling the spread of infections on graphs. In this model, vertices are either susceptible, infected, or recovered. Each infected vertex becomes recovered at rate 1 and infects each of its susceptible neighbors independently at rate $\lambda$, and each recovered vertex becomes susceptible at a rate $\varrho$, which we assume to be independent of the graph size. A central quantity of the SIRS process is the time until no vertex is infected, known as the survival time. Surprisingly though, rigorous theoretical results so far exist only for the related SIS model. We address this imbalance by conducting theoretical analyses of the SIRS process via the expansion properties of the underlying graphs. We prove that the expected survival time of the SIRS process on stars is at most polynomial in the graph size for any value of $\lambda$. This behavior is fundamentally different from the SIS process, where the expected survival time is exponential already for small infection rates. Our main result is an exponential lower bound on the expected survival time of the SIRS process on expander graphs. Specifically, we show that on expander graphs $G$ with $n$ vertices, degree close to $d$, and sufficiently small spectral expansion, the SIRS process has expected survival time at least exponential in $n$ when $\lambda \geq c/d$ for a constant $c>1$. Previous results on the SIS process show that this bound is almost tight. Additionally, our result holds even if $G$ is a subgraph of the host graph. Notably, our result implies an almost-tight threshold for Erdős-Rényi graphs and a regime of exponential survival time for hyperbolic random graphs. The proof of our main result draws inspiration from Lyapunov functions used in mean-field theory to devise a two-dimensional potential function and applies a negative-drift theorem to show that the expected survival time is exponential.


Introduction
In the domain of modeling infectious diseases, a vast body of literature studying various stochastic processes on graphs exists (see, for example, the extensive survey by Pastor-Satorras, Castellano, Van Mieghem, and Vespignani [PCM+15]). In this article, we focus on the SIRS process, a continuous-time Markov chain where each vertex is either susceptible, infected, or recovered. Each infected vertex becomes recovered at rate 1 and infects each of its susceptible neighbors independently at an infection rate $\lambda$, while each recovered vertex becomes susceptible at a deimmunization rate $\varrho$.
A question central to understanding the SIRS process is how long it takes until no vertex in the graph is infected, known as the survival time of the process. Due to the relevance of the SIRS process, its survival time has been studied extensively. This includes empirical results [FSP16; KA01; WCA+17], mean-field approaches [BP10], and results that consider deterministic variants of the process [Sai19]. However, surprisingly, to the best of our knowledge, no rigorous theoretical results exist for the SIRS process in the literature.
This lack of theoretical results for the SIRS process stands in stark contrast to the plethora of theoretical results for a similar but slightly simpler process, known as the SIS process. In the SIS process, each vertex is either susceptible or infected. Each infected vertex becomes susceptible at rate 1 and infects each of its neighbors independently at an infection rate $\lambda$. Thus, with a grain of salt, the SIS process can be viewed as a special case of the SIRS process in which recovered vertices immediately turn susceptible (that is, the deimmunization rate is infinite). The survival time of the SIS process is well understood on a variety of graphs. Early results on the SIS process consider its survival time on $\mathbb{Z}$ [Har74] and on infinite $d$-regular trees [Lig96; Pem92; Sta96], while recent breakthroughs characterize the survival time on Galton-Watson trees [BNN+21; HD20; NNS22]. On finite structures, the results of Nam, Nguyen, and Sly [NNS22] consider Erdős-Rényi graphs, while the SIS process has also been studied on scale-free graphs [BBC+05; BCG+10]. These results rely on the survival time on simple subgraphs, such as stars. Further, Ganesh, Massoulié, and Towsley [GMT05] connect the survival time to the spectral radius and the isoperimetric constant of the host graph, which immediately translates to a variety of simple graphs.
We note that, for the same graph, the survival time (which is a random variable) of a SIS process is an upper bound on the survival time of a SIRS process when starting with identical configurations, as the two processes can be coupled such that an infected vertex in the latter is always also infected in the former. This allows carrying over some results from the SIS to the SIRS process. However, our knowledge about the SIRS process remains in a very unsatisfactory state for multiple reasons. First, we only have upper bounds on the survival time for the SIRS process, which raises the question of how tight they are. And second, far more importantly, the survival time of the SIS process on a graph $G$ is a lower bound for that on any graph containing $G$ as a subgraph, as adding more vertices does not reduce the number of infected vertices at any point in time. In contrast, it is not known whether the SIRS process also has this property. Adding more vertices to a graph in the SIRS process can lead to some vertices being infected earlier and thus potentially recovering earlier, which in turn can block an infection that would have occurred otherwise. Thus, it is not straightforward to generalize results for the SIRS process to supergraphs.
Our contribution. We conduct the first rigorous, theoretical study of the expected survival time of the SIRS process on a large variety of graph classes, most prominently expanders. In all of our results, we assume that the deimmunization rate $\varrho$ is independent of the graph size and that the process starts with at least one infected vertex and no recovered vertices. Our results showcase the similarities and the differences between the SIS and the SIRS process, highlighting the impact of the recovered state. Furthermore, for our lower bounds, we prove that our results carry over to supergraphs of the graphs we analyze. This makes our results applicable to a great number of different graph classes.
More specifically, in Section 3, we show that the expected survival time of the SIRS process on stars is polynomial, regardless of the infection rate (Theorem 3.4). This strongly contrasts the SIS process, where the survival time is superpolynomial already for very small infection rates. This shows that recovered vertices can have a huge impact on the survival time. The reason for this drastic difference in the expected survival time between the two processes is that the star is only connected through a single, central vertex. Thus, if the center is recovered, the infection only survives if not all leaves recover during this time interval. The latter event does not have a sufficiently high probability of occurring for the infection to survive superpolynomially long.
In Section 4, we complement these findings by proving that the expected survival time of the SIRS process on expanders is at least exponential if the infection rate is greater than the inverse of the expander's average degree (Theorem 4.11). A very similar result holds for the SIS process [GMT05]. In contrast to stars, expanders have many edges between arbitrary subsets of vertices. Thus, if the number of infected vertices is sufficiently high, there exist enough edges between the susceptible and the infected vertices, regardless of the number of (remaining) recovered vertices. These edges give the process a high probability of not decreasing the number of infected vertices, which leads to the overall long expected survival time.
Since we prove that our result for expanders carries over to supergraphs, it implies respective expected survival times for other well-known graph classes, such as Erdős-Rényi graphs (Corollary 5.2) and hyperbolic random graphs (Corollary 5.6), which we discuss in Section 5. Combined, our results emphasize that while the SIRS and the SIS process behave very differently on some of their subgraphs (namely stars), they behave similarly if the graph is sufficiently connected. In the following, we discuss our results in more detail.

Expected survival time on stars
For stars, we prove the following upper bound on the expected survival time of the SIRS process.

◮ Theorem 3.4. Let $G$ be a star with $n \in \mathbb{N}_{>0}$ leaves, and let $X$ be a SIRS process on $G$ with infection rate $\lambda$ and with deimmunization rate $\varrho$. Let $T$ be the survival time of $X$. Then for sufficiently large $n$, it holds that $\mathrm{E}[T] \leq (\ln(n) + 2)(4 n^{\varrho} + 1) \in O(n^{\varrho} \ln(n))$. ◭

Note that this bound is independent of $\lambda$ and that it results in a polynomial expected survival time as long as $\varrho$ is at most constant with respect to $n$. Although we only prove an upper bound, our bound matches, up to a logarithmic factor, empirical investigations of the star [FSP16], suggesting that our bound is almost tight. Note that these experimental results consider the infection rate to be constant in terms of $n$, while our results apply for any $\lambda$. Our results also show a behavior similar to the deterministic variant of the process considered by Saif [Sai19].
The analysis mainly relies on investigating independent phases in which the center is not infected and bounding the probability of the infection process dying out during that time, as is common [BBC+05; BCG+10]. A phase lasts at most until all leaves have triggered their recovery at least once, which occurs in expectation after a time of about $\ln(n)$. Thus, if the center just recovered, it needs to become susceptible more quickly than that bound, as otherwise all leaves are recovered. Since deimmunization triggers at rate $\varrho$, the probability that the center does not become susceptible in this time interval is about $\mathrm{e}^{-\varrho \ln(n)} = n^{-\varrho}$, which is thus roughly the probability that the infection dies out during a phase. Since these phases are independent, the infection process survives, in expectation, about $n^{\varrho}$ of these trials, each lasting about $\ln(n)$ time in expectation.
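The phase heuristic above can be checked numerically: per phase, the infection dies out roughly when the center's deimmunization time (exponential with rate $\varrho$) exceeds the time until all $n$ leaves have recovered (the maximum of $n$ rate-1 exponentials). The following Monte Carlo sketch is ours, purely for illustration; the parameter choices are arbitrary.

```python
import random

rng = random.Random(0)
n, rho, trials = 50, 1.0, 50_000

hits = 0
for _ in range(trials):
    # Time until all n leaves have recovered: max of n Exp(1) clocks.
    leaves_done = max(rng.expovariate(1.0) for _ in range(n))
    # Time until the recovered center becomes susceptible again: Exp(rho).
    center_back = rng.expovariate(rho)
    if center_back > leaves_done:  # the phase ends with extinction
        hits += 1

estimate = hits / trials
print(estimate, n ** -rho)  # the estimate is close to n**-rho = 0.02
```

For $\varrho = 1$, the exact value of this probability is $1/(n+1)$, in line with the heuristic $n^{-\varrho} = 1/n$.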
Note that the deimmunization rate and the recovered state are important for this argument to hold. Without this additional state, that is, in the SIS process, it is quite likely that the center becomes quickly reinfected before all leaves are uninfected, which leads to an exponential expected survival time once $\lambda \geq n^{-1/2+\varepsilon}$ in this setting [GMT05], for all positive constants $\varepsilon$.

Expected survival time on expanders
Before we state our main result, we formally introduce the notion of expansion we use for our results. To this end, let $G = (V, E)$ be a graph with vertices $\{v_i\}_{i=1}^{n}$, and let $L$ be its normalized Laplacian, which is defined entrywise by
$$L_{ij} = \begin{cases} 1 & \text{if } i = j, \\ -\frac{1}{\sqrt{\deg(v_i)\deg(v_j)}} & \text{if there is an edge between } v_i \text{ and } v_j, \\ 0 & \text{otherwise.} \end{cases}$$
Let $L$ have eigenvalues $\mu_1 \leq \ldots \leq \mu_n$. The spectral expansion $\sigma$ of $G$ is defined as $\sigma = \max_{i \in \{2, \ldots, n\}} |1 - \mu_i|$. We call $G$ an $(n, (1 \pm \gamma)d, \sigma)$-expander if and only if it has $n$ vertices, a spectral expansion of $\sigma$, and only vertices with degree between $(1 - \gamma)d$ and $(1 + \gamma)d$. As noted above, in contrast to stars, expanders feature many edges between arbitrary subsets of vertices. The key property we require for our results from $(n, (1 \pm \gamma)d, \sigma)$-expanders is that the number of edges between any two vertex sets $S$ and $T$ is close to $\frac{d}{n}|S||T|$.
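For intuition, the spectral expansion can be computed from the known normalized-Laplacian eigenvalues of two extreme examples: the complete graph (an excellent expander) and the cycle (a poor one). The helper names below are ours for illustration; we write $\mu$ for eigenvalues to avoid clashing with the infection rate $\lambda$.

```python
import math

def expansion_complete(n):
    # K_n: normalized-Laplacian eigenvalues are 0 and n/(n-1) (multiplicity
    # n-1), so the spectral expansion is |1 - n/(n-1)| = 1/(n-1).
    return abs(1 - n / (n - 1))

def expansion_cycle(n):
    # C_n is 2-regular, so its normalized Laplacian is I - A/2 with
    # eigenvalues 1 - cos(2*pi*k/n) for k = 0, ..., n-1.
    return max(abs(math.cos(2 * math.pi * k / n)) for k in range(1, n))

print(expansion_complete(100))  # 0.0101...: an excellent expander
print(expansion_cycle(100))     # 1.0: even cycles are bipartite and expand poorly
```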
Our results hold for any expander $G'$ that is a subgraph of a graph $G$ on which we analyze the SIRS process $X$. More formally, we define the projection $X'$ of $X$ onto $G'$ to be the process on $G'$ such that, at each point in time, each vertex of $G'$ is in the same state in $X'$ as it is in $X$. The survival time of a projected process is the first point in time at which the projected process has no infected vertices. Given these definitions, our main result follows.
◮ Theorem 4.11. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \gamma)d, \sigma)$-expander. Let $d \to \infty$ and $\gamma, \sigma \to 0$ as $n \to \infty$. Let $X$ be the SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Further, let $X$ start with at least one infected vertex in $G'$ and no recovered vertices in $G'$. Last, let $X'$ be the projection of $X$ onto $G'$, and let $T$ be the survival time of $X'$. If $\lambda \geq c/d$ for a constant $c \in \mathbb{R}_{>1}$, then for sufficiently large $n$, it holds that $\mathrm{E}[T] = 2^{\Omega(n)}$. ◭

We note that Theorem 4.11 is almost tight with respect to the range of $\lambda$. Ganesh, Massoulié, and Towsley [GMT05, Theorem 3.1] show that the survival time of the SIS process is at most logarithmic in $n$ when the spectral radius of the graph is less than $1/\lambda$. Note that the spectral radius of a graph is upper bounded by the maximum degree of the graph. This results in a logarithmic expected survival time of the process on $(n, (1 \pm \gamma)d, \sigma)$-expanders when $\lambda \leq c/d$ for some constant $c \in (0, 1)$. Recall our discussion earlier in the introduction that the expected survival time of the SIS process is an upper bound on the expected survival time of the SIRS process. Hence, the expected survival time of the SIRS process for $\lambda \leq c/d$ is at most logarithmic in $n$ on $(n, (1 \pm \gamma)d, \sigma)$-expanders.
The proof of Theorem 4.11 consists of two main parts. First, we prove that a linear number of vertices in $G'$ becomes infected. Then, we show that the number of infected vertices stays linear for an expected exponential amount of time. For both parts, we make use of potential functions, which map the configuration of the process to a single real number that allows us to quantify how likely the process is to die out. In order to get the result for the projection of the process, we use that the influence of $G \setminus G'$ only increases the rate at which vertices in $G'$ get infected. In the considered configurations, this rate increase only helps the process to get into the desired region of the potential.
In more detail, the first part shows that the process reaches a configuration with linearly many infected vertices with at least constant probability (Lemma 4.1). Note that if this event does not occur, then the process might die out fast. For bounding the probability of this event, we use a fairly simple potential expressing the number of infected vertices minus a constant times the number of recovered vertices. We show that this potential is a submartingale. Applying the optional-stopping theorem to it concludes this first part of the proof. In the second part, we define a more advanced potential function (Definition 4.3), which gets large when the number of infected vertices gets small. We show that there is a region of the potential in which the process is a strict supermartingale with a constant negative drift (Lemma 4.6), and that in this region, higher infection rates decrease the drift (Lemma 4.5). We then use the expansion properties of the base graph, which guarantee that the infected vertices always have enough susceptible neighbors such that new vertices get infected and the potential decreases in expectation. This allows us to apply a concentration bound by Oliveto and Witt [OW11] (Theorem 2.2) for strict supermartingales, known as the negative-drift theorem, based on an intricate theorem by Hajek [Haj82]. The negative-drift theorem yields the exponential lower bound on the expected survival time.
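The flavor of the negative-drift argument can be seen in a toy experiment: a process with a constant drift away from a barrier essentially never crosses it within any subexponential time budget. The reflected random walk below is only our illustration of the phenomenon, not the potential used in the proof, and all parameters are arbitrary.

```python
import random

rng = random.Random(7)
barrier, steps = 60, 200_000

x, highest = 0, 0
for _ in range(steps):
    # Step +1 with probability 0.4 and -1 with probability 0.6 (drift -0.2),
    # reflected at 0. Reaching `barrier` plays the role of the potential
    # entering the extinction region.
    x = max(0, x + (1 if rng.random() < 0.4 else -1))
    highest = max(highest, x)

print(highest)  # stays far below the barrier even after 200,000 steps
```

The maximum reached grows only logarithmically in the number of steps, so the hitting time of the barrier is exponential in its height, mirroring the conclusion of the negative-drift theorem.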
Our definition of this potential is based on a Lyapunov function used by Korobeinikov and Wake [KW02], which they utilize in order to derive results on the global stability of the SIRS process via mean-field theory. The mean-field theory assumes a fully mixed graph, which roughly corresponds to a clique for our process. In order to show global stability, the authors show a negative drift towards an equilibrium point. However, this drift is 0 for some configurations in our setting, which is not small enough to apply the drift theorem. We adjust their function appropriately to create a region of the potential that has a sufficiently large negative drift. We also alter the analysis of the function to work for the stochastic process and on expander graphs.

Expected survival time on special graph classes
The generality of Theorem 4.11 makes it applicable to various other interesting graph classes. The only requirement is that they contain an expander as a subgraph. We illustrate this generality for two important random-graph models, namely, Erdős-Rényi and hyperbolic random graphs.

Erdős-Rényi graphs
The first random-graph model we are interested in is $G_{n,p}$, the classical random-graph model of Erdős and Rényi [ER59]. The expansion properties of this model have been studied previously in the literature. As Coja-Oghlan [Coj07, Theorem 1.2] shows, Erdős-Rényi graphs have a very small spectral expansion. Furthermore, due to Chernoff bounds, the vertex degrees in Erdős-Rényi graphs are tightly concentrated around their expected degree $p(n-1)$. Therefore, Erdős-Rényi graphs fulfill, with high probability, our definition of an $(n, (1 \pm \gamma)d, \sigma)$-expander. This leads to the following corollary of Theorem 4.11.

◮ Corollary 5.2. Let $G \sim G_{n,p}$ be an Erdős-Rényi graph with $p(n-1) \in \omega(\ln n)$. Consider the SIRS process on $G$ with constant deimmunization rate $\varrho$, and let $T$ be the survival time when the process starts with at least one infected vertex. If $\lambda \geq c/(p(n-1))$ for a constant $c \in \mathbb{R}_{>1}$, then $\mathrm{E}[T] = 2^{\Omega(n)}$ asymptotically almost surely with respect to $G$. If $\lambda \leq c/(p(n-1))$ for a constant $c \in (0, 1)$, then $\mathrm{E}[T] \in O(\log n)$ asymptotically almost surely with respect to $G$. ◭

Comparing Corollary 5.2 with the respective result for the SIS process (cf. [GMT05, Theorem 5.5]) shows that the two processes, SIS and SIRS, behave similarly on Erdős-Rényi graphs.
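The degree concentration invoked here is easy to check empirically: sample $G_{n,p}$ with expected degree well above $\ln n$ and measure the largest relative deviation of a degree from $p(n-1)$. The sketch below is ours, with illustrative parameters.

```python
import random

rng = random.Random(3)
n, p = 1000, 0.3
deg = [0] * n
for u in range(n):
    for v in range(u + 1, n):
        if rng.random() < p:  # include edge {u, v} independently
            deg[u] += 1
            deg[v] += 1

expected = p * (n - 1)
worst = max(abs(d - expected) / expected for d in deg)
print(worst)  # largest relative deviation from the expected degree; small
```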

Hyperbolic random graphs
Many properties of complex real-world networks, such as the internet and social networks, are captured by hyperbolic random graphs [BPK10; VS16]. For this reason, since their introduction [KPK+10], hyperbolic random graphs have been a very popular model in network theory that has been studied extensively (e.g., [BFM15; GPP12; MS19]). Therefore, hyperbolic random graphs provide a highly relevant structure for studying the survival time of the SIRS process. The exact definition of the model is not required to understand our results; hence, we refer the reader to the work by Krioukov, Papadopoulos, Kitsak, Vahdat, and Boguñá [KPK+10] for a formal definition.
The key parameter of a hyperbolic random graph controls the power-law exponent $\beta$ that the degree distribution follows. The interesting parameter range is $\beta \in (2, 3)$; beyond this range, the graphs generated from this model lose key properties present in real-world networks. As the model commonly generates some very small disconnected components of a few vertices, the usual approach in the literature is to focus on the giant component of the graph. Two properties of these graphs are key for our results: the existence of a polynomial-sized clique as a subgraph and a polylogarithmic bound on the diameter of the giant component. Using these two properties, we identify the following parameter regime for an exponential expected survival time of the SIRS process on hyperbolic random graphs.
◮ Corollary 5.6. Let $G$ be a hyperbolic random graph with $n$ vertices that follows a power-law degree distribution with exponent $\beta \in (2, 3)$, and let $X$ be the SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Further, let $X$ start with at least one infected vertex in the giant component and no recovered vertices, and let $T$ be the survival time of $X$. Then there exists a constant $c \in \mathbb{R}_{>0}$ such that if

Outlook
Although our results already cover a great range of interesting graph classes, this article is just a first step towards understanding the SIRS process more thoroughly. Our analyses pose exciting new challenges for different scenarios, which we briefly delineate in the following.
Our upper bound on the expected survival time on stars (Theorem 3.4) is off from empirical results [FSP16] by a logarithmic factor. This shows that there is potential for improvement in the analysis. Ideally, proving a matching lower bound would settle the exact expected survival time.
Combined, our results for stars (Theorem 3.4) and expanders (Theorem 4.11) show that adding edges to a graph leads, eventually, from a polynomial expected survival time to an exponential one. However, it is not clear so far when this transition happens. An interesting next step is to look into connected stars instead of single stars. Connected stars appear as subgraphs in important real-world network models, such as the Chung-Lu [CL03] or the preferential-attachment model [BA99], motivating this research question.
With respect to expanders with vertex degrees concentrated around $d$, our result (Theorem 4.11) implies that $1/d$ is the threshold for the infection rate at which the expected survival time transitions from logarithmic to exponential. However, our bounds require $\lambda$ to be bounded away from $1/d$ by a constant factor. It is not clear, given a value $\varepsilon \in o(1)$, what happens if $\lambda = (1 \pm \varepsilon)/d$. A more detailed analysis could provide insights into how rapidly the transition at the threshold occurs.
A different extension of our results is to consider deimmunization rates that depend on the graph size. Comparing the behavior of the SIS and the SIRS process on stars suggests that an increased deimmunization rate leads to far longer expected survival times. Thus, an interesting question is whether the survival time exhibits a threshold behavior with respect to the deimmunization rate.
Multi-dimensional potentials, such as the one we use for the SIRS process on expanders, are rare in the analysis of stopping times of stochastic processes. Our approach draws inspiration from Lyapunov stability to devise a potential function for the stochastic process under study and then applies drift theory to convert this into a rigorous proof. Lyapunov functions are used in mean-field theory to show stable points of dynamical systems [Lya92], and epidemic processes constitute only a glimpse of their applicability. We believe that our approach might inspire further rigorous results determining stopping times of other stochastic processes, not limited to epidemic models.

Preliminaries
We study the SIRS process, which is a continuous-time Markov chain on graphs in which the vertices change between different states, following events triggered by Poisson processes. We analyze how this process behaves asymptotically in the number $n$ of vertices of the graph. In particular, when we use big-O notation or refer to variables as constants, this is with respect to $n$. When we use big-O notation inside a term in a relation, this means that there exists a function from the big-O expression such that the relation holds; for example, the equation $T = 2^{\Omega(n)}$ means that there exists a function $f \in \Omega(n)$ such that $T = 2^{f(n)}$ holds. If not stated otherwise, all variables we consider may depend on $n$. Whenever we talk about Poisson processes, we refer to one-dimensional Poisson point processes that output a random subset of the non-negative real numbers.
We first define our infection models and some related terms that we use throughout the paper.We then state the probabilistic tools we use in the proofs.

Let $G = (V, E)$ be a graph with vertex set $V$ and edge set $E$. Further, let $\lambda, \varrho \in \mathbb{R}_{>0}$. In the SIRS process, for each edge $e \in E$, we define a Poisson process $C_e$ with parameter $\lambda$, and for each vertex $v \in V$, we define the two Poisson processes $C_v^{\mathrm{rec}}$ with parameter 1 and $C_v^{\mathrm{sus}}$ with parameter $\varrho$. We refer to these processes as clocks, and when an event occurs in one of them, we say that the relevant clock triggers. We use $C$ to denote the set of all of these clocks, that is, $C = \bigcup_{e \in E} \{C_e\} \cup \bigcup_{v \in V} \{C_v^{\mathrm{rec}}, C_v^{\mathrm{sus}}\}$. Let $P$ denote the stochastic process in which all of the clocks in $C$ evolve simultaneously and independently, starting at time 0. Note that almost surely there is no point in time at which two clocks trigger at once. There are almost surely countably infinitely many trigger times in $P$, which we index by the increasing sequence $\{t_i\}_{i \in \mathbb{N}}$, where $t_0 = 0$. A SIRS process $X = (X_t)_{t \in \mathbb{R}_{\geq 0}}$ has an underlying graph $G = (V, E)$, an infection rate $\lambda$, a deimmunization rate $\varrho$, and an initial partition of $V$ into susceptible, infected, and recovered vertices with the respective sets $S'_0$, $I'_0$, and $R'_0$. At every time $t \in \mathbb{R}_{\geq 0}$, the configuration $X_t$ is a partition of $V$ into $S'_t$, $I'_t$, and $R'_t$. The configuration only changes at times in $P$. Let $i \in \mathbb{N}_{>0}$. We consider the following configuration transitions in $X$ at time $t_i$:
• If the clock $C_e$ of an edge $e = \{u, v\}$ with $u \in I'_{t_{i-1}}$ and $v \in S'_{t_{i-1}}$ triggers at $t_i$, then $v \in I'_{t_i}$. We say that $v$ gets infected at time point $t_i$ by $u$.
• If the clock $C_v^{\mathrm{rec}}$ of a vertex $v \in I'_{t_{i-1}}$ triggers at $t_i$, then $v \in R'_{t_i}$. We say that $v$ recovers at time point $t_i$.
• If the clock $C_v^{\mathrm{sus}}$ of a vertex $v \in R'_{t_{i-1}}$ triggers at $t_i$, then $v \in S'_{t_i}$. We say that $v$ gets susceptible at time point $t_i$.
If none of the above three cases occurs, the configuration of $X$ at $t_i$ is the same as the configuration of $X$ at $t_{i-1}$. Note that at all times between $t_{i-1}$ and $t_i$, the process $X$ retains the same configuration as at $t_{i-1}$.
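To make the dynamics concrete, the clock construction above can be simulated with a standard Gillespie-style algorithm: the time to the next trigger in the superposition of all active clocks is exponentially distributed with the total rate, and the triggering clock is chosen proportionally to its rate. The following minimal Python sketch is ours for illustration; the names (`simulate_sirs`, `adj`, and so on) are not notation from the text.

```python
import random

def simulate_sirs(adj, lam, rho, infected, rng, t_max=float("inf")):
    """Run the SIRS process until extinction (or t_max); return the final time.

    adj maps each vertex to a list of its neighbors; `infected` lists the
    initially infected vertices, and all other vertices start susceptible.
    """
    state = {v: "S" for v in adj}
    for v in infected:
        state[v] = "I"
    t = 0.0
    while any(s == "I" for s in state.values()) and t < t_max:
        events = []  # (rate, vertex, new state)
        for v, s in state.items():
            if s == "I":
                events.append((1.0, v, "R"))   # recovery clock, rate 1
            elif s == "R":
                events.append((rho, v, "S"))   # deimmunization clock, rate rho
            else:                              # susceptible vertex
                k = sum(1 for u in adj[v] if state[u] == "I")
                if k > 0:
                    events.append((lam * k, v, "I"))
        # Next trigger time of the superposition of all active clocks.
        total = sum(r for r, _, _ in events)
        t += rng.expovariate(total)
        # The triggering clock is chosen proportionally to its rate.
        x = rng.random() * total
        for r, v, new in events:
            x -= r
            if x <= 0:
                state[v] = new
                break
    return t

# Example: a star with 20 leaves (center 0), center initially infected.
rng = random.Random(42)
star = {0: list(range(1, 21)), **{i: [0] for i in range(1, 21)}}
T = simulate_sirs(star, lam=2.0, rho=1.0, infected=[0], rng=rng)
```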
In our proofs, we only consider the points in time at which the configuration changes. To this end, let $P'$ be the set of all times in $P$ at which the configuration of $X$ changes. We index the times in $P'$ by the increasing sequence $\{\tau_i\}_{i \in \mathbb{N}}$. For all $i \in \mathbb{N}$, we call $\tau_i$ the $i$-th step of the process.
If at any point in time no vertex is infected, then from that point onward, no vertex is infected. We say that the infection dies out or goes extinct at the first (random) time $T$ with $I'_T = \emptyset$. We call $T$ the survival time of the SIRS process.
We only keep track of the number of vertices in each of the sets. To this end, we define for all $t \in \mathbb{R}_{\geq 0}$ the random variables $S_t = |S'_t|$, $I_t = |I'_t|$, and $R_t = |R'_t|$. These random variables change depending on the clocks in $C$. We say that an event happens at a rate of $r \in \mathbb{R}_{>0}$ if and only if the set of clocks that cause this event when they trigger has a sum of rates equal to $r$.
We define the projection $X'$ of $X$ onto a subgraph $G'$ as the process on $G'$ such that, at each point in time, each vertex of $G'$ is in the same state in $X'$ as it is in $X$. When considering such a projection, we use $S_t$, $I_t$, and $R_t$ to only count the vertices of $G'$ in the corresponding state. Also, $P'$ then only contains times at which the state of a vertex in $G'$ changes. The survival time of a projected process is the first point in time at which the projected process has no infected vertices. Note that the survival time $T'$ of $X'$ is a lower bound for the survival time $T$ of $X$, as all infected vertices of $X'$ are also infected in $X$.
We use stochastic domination to transfer results from one random variable to another. We say that a random variable $(Y_t)_{t \in \mathbb{R}_{\geq 0}}$ dominates another random variable $(Z_t)_{t \in \mathbb{R}_{\geq 0}}$ if and only if there exists a coupling $(\tilde{Y}_t, \tilde{Z}_t)_{t \in \mathbb{R}_{\geq 0}}$ of the two such that for all $t \in \mathbb{R}_{\geq 0}$ we have $\tilde{Y}_t \geq \tilde{Z}_t$.

Probabilistic Tools
We use general concepts from probability theory (see, for example, [Fel68; MU17]). In addition, we use the following theorems.
We use the optional-stopping theorem for submartingales to bound the probability of reaching a specific configuration. For an event $A$, the symbol $\mathbb{1}_A$ denotes the indicator random variable that is 1 if $A$ is true and 0 otherwise.

◮ Theorem 2.1 (Optional stopping [MU17, Theorem 13.2]). Let $(Z_i)_{i \in \mathbb{N}}$ be a submartingale and $T$ a stopping time, both with respect to a filtration $(\mathcal{F}_i)_{i \in \mathbb{N}}$. Assume that the following two conditions hold: $\mathrm{E}[T] < \infty$, and there is a constant $c \in \mathbb{R}_{>0}$ such that, for all $i \in \mathbb{N}$, it holds that $\mathrm{E}[|Z_{i+1} - Z_i| \mid \mathcal{F}_i] \leq c$. Then $\mathrm{E}[Z_T] \geq \mathrm{E}[Z_0]$. ◭

We use the following theorem in order to show an exponential expected survival time for the SIRS process. We state it in a fashion that better suits our purposes.
◮ Theorem 2.2 (Negative drift [OW11]). Let $(Z_i)_{i \in \mathbb{N}}$ be a random process over $\mathbb{R}$, adapted to a filtration $(\mathcal{F}_i)_{i \in \mathbb{N}}$. Let there be an interval $[a, b] \subseteq \mathbb{R}$, two constants $\delta, \varepsilon \in \mathbb{R}_{>0}$ and, possibly depending on $\ell := b - a$, a function $r(\ell)$ with $1 \leq r(\ell) \in o(\ell / \log \ell)$. Suppose that for all $i \in \mathbb{N}$ the following two conditions hold:
1. $\mathrm{E}[Z_{i+1} - Z_i \mid \mathcal{F}_i, \, a < Z_i < b] \leq -\varepsilon$, and
2. for all $j \in \mathbb{N}$, it holds that $\Pr[|Z_{i+1} - Z_i| \geq j \mid \mathcal{F}_i, \, Z_i < b] \leq \frac{r(\ell)}{(1 + \delta)^j}$.
Then, for the first point in time $T := \min\{i \in \mathbb{N} \mid Z_i \geq b\}$, there is a constant $c \in \mathbb{R}_{>0}$ such that $\Pr[T \leq 2^{c \ell / r(\ell)} \mid Z_0 \leq a] = 2^{-\Omega(\ell / r(\ell))}$. ◭

The following theorem bounds the expected value of the maximum of exponentially distributed random variables.

◮ Theorem 2.3 ([MU17, Lemma 2.10]). Let $n \in \mathbb{N}_{>0}$, and let $\{Y_i\}_{i \in [n]}$ be independent random variables that are each exponentially distributed with parameter $\theta \in \mathbb{R}_{>0}$. Let $Y = \max_{i \in [n]} Y_i$, and let $H_n$ be the $n$-th harmonic number. Then $\mathrm{E}[Y] = H_n / \theta$. ◭

We use the following version of Wald's equation, which does not require the addends to be independent.
◮ Theorem 2.4 (Generalized Wald's equation [DK22, Theorem 5]). Let $c \in \mathbb{R}$, and let $(Y_i)_{i \in \mathbb{N}}$ be a random process over $\mathbb{R}_{\geq 0}$ such that each $Y_i$ has a finite expectation. Furthermore, let $(\mathcal{F}_i)_{i \in \mathbb{N}}$ be a filtration, and let $T$ be a stopping time with respect to $(\mathcal{F}_i)_{i \in \mathbb{N}}$ with $\mathrm{E}[T] < \infty$ such that, for all $i \in \mathbb{N}$, it holds that $\mathrm{E}[Y_{i+1} \mid \mathcal{F}_i] \cdot \mathbb{1}_{T > i} \leq c \cdot \mathbb{1}_{T > i}$. Then $\mathrm{E}\big[\sum_{i=0}^{T} Y_i\big] \leq c \cdot (\mathrm{E}[T] + 1)$. ◭
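As a quick sanity check of Theorem 2.3, the empirical mean of the maximum of $n$ independent exponentials matches $H_n / \theta$ closely. The sketch below is ours, with illustrative parameters.

```python
import random

rng = random.Random(5)
n, theta, trials = 100, 1.0, 20_000

# Average the maximum of n Exp(theta) samples over many trials.
mean_max = sum(
    max(rng.expovariate(theta) for _ in range(n)) for _ in range(trials)
) / trials
harmonic = sum(1 / k for k in range(1, n + 1))  # H_100 ~ 5.187
print(mean_max, harmonic / theta)  # the two values are close
```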

Expander graphs
There are many notions of how to define expander graphs. We use algebraic expanders, in which all but one of the eigenvalues of the normalized Laplacian of the graph are very close to 1. These graphs have some nice properties that let us bound the number of edges between infected and susceptible vertices. Formally, let $G = (V, E)$ be a graph with vertices $\{v_i\}_{i=1}^{n}$, and let $L$ be its normalized Laplacian, which is defined entrywise by
$$L_{ij} = \begin{cases} 1 & \text{if } i = j, \\ -\frac{1}{\sqrt{\deg(v_i)\deg(v_j)}} & \text{if there is an edge between } v_i \text{ and } v_j, \\ 0 & \text{otherwise.} \end{cases}$$
Let $L$ have eigenvalues $\mu_1 \leq \ldots \leq \mu_n$, and let $\sigma = \max_{i \in \{2, \ldots, n\}} |1 - \mu_i|$ be the spectral expansion of $G$. We call $G$ an $(n, (1 \pm \gamma)d, \sigma)$-expander if and only if it has $n$ vertices, a spectral expansion of $\sigma$, and only vertices with degree between $(1 - \gamma)d$ and $(1 + \gamma)d$. For two vertex sets $S, T \subseteq V$, let $E(S, T)$ denote the number of edges between $S$ and $T$, and let $\mathrm{vol}(S)$ denote the sum of the degrees of the vertices in $S$. Using this notation, we have the following theorem.

◮ Theorem 2.5 ([Chu97, Theorem 5.2]). Let $G = (V, E)$ be a graph with spectral expansion $\sigma$, and let $S, T \subseteq V$. Then $\big| E(S, T) - \frac{\mathrm{vol}(S)\,\mathrm{vol}(T)}{\mathrm{vol}(G)} \big| \leq \sigma \sqrt{\mathrm{vol}(S)\,\mathrm{vol}(T)}$. ◭

Applying Theorem 2.5 to expanders, we get the following two corollaries.

◮ Corollary 2.6. Let $G = (V, E)$ be an $(n, (1 \pm \gamma)d, \sigma)$-expander, and let $S, T \subseteq V$. Then $E(S, T)$ is close to $\frac{d}{n}|S||T|$, up to an error term determined by $\gamma$ and $\sigma$. ◭

Proof. Because the vertex degrees of all vertices in $G$ are bounded, we know that for each $S \subseteq V$ it holds that $(1 - \gamma)d|S| \leq \mathrm{vol}(S) \leq (1 + \gamma)d|S|$. Plugging these bounds into the result of Theorem 2.5 yields two inequalities. We solve them for $E(S, T)$ and bound them separately, using that $\sigma \leq 1/5$.
Subtracting $\frac{d}{n}|S| \cdot |T|$ from both inequalities and combining them proves the corollary.
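The edge-count property of Corollary 2.6 can be sanity-checked on the complete graph $K_n$, which has spectral expansion $\frac{1}{n-1}$ and degree exactly $d = n - 1$ (so $\gamma = 0$): for disjoint vertex sets, the exact number of edges between them is within a factor $\frac{n}{n-1}$ of the prediction $\frac{d}{n}|S||T|$. The sketch is ours, for illustration only.

```python
n = 100
d = n - 1  # every vertex of K_n has degree n - 1
S = set(range(0, 30))
T = set(range(30, 80))

# In K_n, every pair of distinct vertices is an edge, so for disjoint S and T
# the number of edges between them is exactly |S| * |T|.
edges_between = sum(1 for u in S for v in T)
prediction = d / n * len(S) * len(T)
print(edges_between, prediction)  # 1500 vs 1485.0: within a factor n/(n-1)
```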

SIRS on Stars
We show that the expected survival time of the SIRS process on stars is bounded from above by a polynomial in the number of vertices that is independent of the infection rate (Theorem 3.4). To this end, we bound the number of times that the center gets infected and the time between two infections of the center. We use that while the center is not infected, no leaf gets infected. Hence, if all of the leaves recover before the center gets susceptible after having recovered, the infection dies out. We first bound the expected time that it takes for all of the leaves to recover. We refer to the clock at a vertex whose rate is the recovery rate (of 1) as its recovery clock.
◮ Lemma 3.1. Let $G$ be a star with $n \in \mathbb{N}_{>0}$ leaves, and let $X$ be a SIRS process on $G$ with infection rate $\lambda$ and with deimmunization rate $\varrho$. Let $Y$ be the time that it takes for all recovery clocks of the leaves to trigger at least once. Then $\mathrm{E}[Y] \leq \ln(n) + 1$. ◭

Proof. The star has $n$ leaves, which all have a clock that recovers them at a rate of 1. For each of the clocks, the time until the first trigger is exponentially distributed with parameter 1. Hence, $Y$ is the maximum of $n$ independent exponentially distributed random variables with parameter 1. By Theorem 2.3, $\mathrm{E}[Y] = H_n \leq \ln(n) + 1$.
We now use Lemma 3.1 to bound the time it takes from one infection of the center until it gets infected again or until the infection dies out.

◮ Lemma 3.2. Let $G$ be a star with $n \in \mathbb{N}_{>0}$ leaves, and let $X$ be a SIRS process on $G$ with infection rate $\lambda$ and with deimmunization rate $\varrho$. Let $t_0 \in \mathbb{R}_{\geq 0}$ be a time at which the infection has not died out yet. Further, let $t \in \mathbb{R}_{\geq 0}$ be the first time after $t_0$ at which either the center gets infected after being susceptible or the infection dies out. Then $\mathrm{E}[t - t_0] \leq \ln(n) + 2$. ◭

Proof. If the center starts infected, then in order for either the center to get infected again after being susceptible or the infection to die out, the center has to recover first. Let $t' \in \mathbb{R}$ be the first time after $t_0$ at which the center recovers. As all vertices recover at a rate of 1, the random variable $t' - t_0$ is exponentially distributed with parameter 1.
Between ′ and , no leaf gets infected, as the center is not infected and all edges are incident to the center.Hence, when all recovery clocks of the leaves trigger in this time interval at least once, the infection dies out.Therefore, the last point in time after ′ at which any of these recovery clocks trigger is an upper bound for .By Lemma 3.1, the expected time for this last trigger to happen is at most ln( ) + 1.That gives us Next, we bound the probability from below that when starting with an infected center, the infection dies out before the center gets infected again.We use this later to get an upper bound on the number of times that the center gets infected in total.◮ Lemma 3.3.Let be a star with ∈ N >0 leaves, and let be a SIRS process on with infection rate and with deimmunization rate .Let 0 ∈ R ≥0 be a time at which the center is infected.Further, let 0 be the event that the infection dies out after 0 before the center gets infected again (after being recovered in between).Then for sufficiently large , it holds that Pr[ 0 ] ≥ 1 4 − .◭ Proof.In order for either the center to get infected again after being susceptible or for the infection to die out, the center has to recover first.Let 1 ∈ R be the first time after 0 at which the center recovers.As long as the center is in the recovered state, no vertex gets infected, as all edges of the graph are incident to the center.If all leaves recover before the center gets susceptible, the infection dies out.In order to bound the probability of this event, we consider the first time ∈ R after 1 at which the center gets susceptible, and we also consider the first time ′ ∈ R after 1 at which all of the recovery clocks of the leaves trigger at least once in the interval ( 1 , ′ ].In particular, we use that all leaves recover before the center gets susceptible if ′ − 1 < ln( ) and − 1 ≥ ln( ).Each vertex recovers after a time that is exponentially distributed with parameter 1.As ′ is the first time after 1 at which 
all of the recovery clocks of the leaves have triggered at least once in the interval $(t_1, \tau']$, the difference $\tau' - t_1$ is the maximum of $n$ independent exponentially distributed random variables. In order for $\tau' - t_1 < \ln(n)$ to hold, all of those random variables have to be smaller than $\ln(n)$. As all of them are independent, we get that, for sufficiently large $n$, $\Pr[\tau' - t_1 < \ln(n)] = (1 - \mathrm{e}^{-\ln(n)})^n = (1 - n^{-1})^n \ge \frac{1}{4}$. All vertices lose their immunity at rate $\varrho$. Hence, $\sigma - t_1$ is exponentially distributed with parameter $\varrho$. Using the exponential probability distribution, we get $\Pr[\sigma - t_1 \ge \ln(n)] = \mathrm{e}^{-\varrho \ln(n)} = n^{-\varrho}$. Now using the fact that the infection dies out when all leaves recover before the center gets susceptible, and that $\sigma - t_1$ and $\tau' - t_1$ are independent, we get $\Pr[D_{t_0}] \ge \Pr[\tau' - t_1 < \ln(n)] \cdot \Pr[\sigma - t_1 \ge \ln(n)] \ge \frac{1}{4} n^{-\varrho}$. ◭

Using the previous bounds, we now derive an upper bound on the expected survival time of a SIRS process on a star.

◮ Theorem 3.4. Let $G$ be a star with $n \in \mathbb{N}_{>0}$ leaves, and let $C$ be a SIRS process on $G$ with infection rate $\lambda$ and with deimmunization rate $\varrho$. Let $T$ be the survival time of $C$. Then for sufficiently large $n$, it holds that $\mathrm{E}[T] \le (\ln(n) + 2)(4 n^{\varrho} + 1) \in O(n^{\varrho} \ln(n))$. ◭

Proof. Let $N$ be the random variable that counts the number of times that the center gets infected before the infection dies out. For all $i \in \mathbb{N}_{\le N + 1}$, let $T_i$ be the $i$-th time at which either the center gets infected or the infection dies out (and let $T_0 = 0$). It then holds that $T = T_{N+1} = \sum_{i=0}^{N} (T_{i+1} - T_i)$. We aim to bound the expectation of this value using the generalized Wald's equation (Theorem 2.4).
Let $(\mathcal{F}_t)_{t \in \mathbb{R}_{\ge 0}}$ be the natural filtration of $C$. By Lemma 3.2, it holds for all $i \in \mathbb{N}_{\le N}$ that $0 \le \mathrm{E}[T_{i+1} - T_i \mid \mathcal{F}_{T_i}] \le \ln(n) + 2$. Hence, the expectations of all of the summed random variables are bounded. By Lemma 3.3, for all $i \in \mathbb{N}_{\ge 1}$, the $i$-th infection of the center has a probability of at least $\frac{1}{4} n^{-\varrho}$ to be the last one, given that there is an $i$-th infection of the center. Therefore, $N$ is dominated by a geometrically distributed random variable $X \sim \mathrm{Geom}(\frac{1}{4} n^{-\varrho})$. Hence, $\sum_{i=0}^{N} (T_{i+1} - T_i)$ is integrable. By Theorem 2.4, we get $\mathrm{E}[T] \le (\ln(n) + 2) \cdot \mathrm{E}[N + 1] \le (\ln(n) + 2)(4 n^{\varrho} + 1)$. ◭
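The polynomial bound of Theorem 3.4 can be checked empirically by simulating the process directly with Gillespie's algorithm. The following sketch is our own illustration (the function name and the cap `t_max` are ad hoc, not part of the paper); it samples one survival time on a star with $n$ leaves, started with an infected center.

```python
import random

def sirs_star_survival(n, lam, rho, seed=0, t_max=10**6):
    """Gillespie simulation sketch of the SIRS process on a star with n leaves.
    States: 0 = susceptible, 1 = infected, 2 = recovered; vertex n is the center.
    Returns the survival time (time until no vertex is infected), capped at t_max."""
    rng = random.Random(seed)
    state = [0] * (n + 1)
    state[n] = 1  # start with the center infected
    t = 0.0
    while t < t_max:
        infected = [v for v in range(n + 1) if state[v] == 1]
        if not infected:
            return t  # the infection has died out
        recovered = [v for v in range(n + 1) if state[v] == 2]
        # Every edge is incident to the center, so the infected-susceptible
        # edges go center -> susceptible leaves or infected leaves -> center.
        if state[n] == 1:
            targets = [v for v in range(n) if state[v] == 0]
        elif state[n] == 0:
            targets = [n] * len(infected)  # each infected leaf attacks the center
        else:
            targets = []
        total = len(infected) + rho * len(recovered) + lam * len(targets)
        t += rng.expovariate(total)
        u = rng.random() * total
        if u < len(infected):  # recovery of a uniform infected vertex at rate 1
            state[infected[int(u)]] = 2
        elif u < len(infected) + rho * len(recovered):  # deimmunization at rate rho
            state[recovered[int((u - len(infected)) / rho)]] = 0
        else:  # infection along a uniform infected-susceptible edge at rate lam
            state[targets[rng.randrange(len(targets))]] = 1
    return t_max
```

Averaging `sirs_star_survival` over many seeds for growing $n$ should stay polynomial in $n$, in line with the $O(n^{\varrho} \ln(n))$ bound.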

SIRS on Expanders
We consider the SIRS process on graphs that have expanders as subgraphs. In particular, we show an exponential expected survival time for the projection of the SIRS process onto the expander when the deimmunization rate is constant and the infection rate is sufficiently high (Theorem 4.11). Note that the exponential expected survival time and the required infection rate depend only on the size and vertex degrees of the expander. In Section 4.1, we begin by analyzing basic properties of the process, such as the transition rates between all of the states. In Section 4.2, we show that the expected survival time of the considered SIRS processes is exponential if $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$. We first prove that the process reaches a configuration with linearly many infected vertices with sufficiently high probability. We then provide a lower bound for the expected survival time when starting at such a configuration. To this end, we define a potential over the configuration space that, in a specific region, has a constant negative drift away from the configuration with no infected vertices. We then translate this region into bounds on the potential, allowing us to apply the negative-drift theorem (Theorem 2.2) to obtain an exponential expected survival time.

The SIRS Process
Let $G = (V, E)$ be a graph, and let $G' = (V', E')$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $C$ be a SIRS process on $G$ with infection rate $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$ and deimmunization rate $\varrho$. Consider the projection $C'$ of $C$ onto $G'$. We define for all $t \in \mathbb{N}$ the random variables $I_t$, $R_t$, and $S_t$ as the numbers of infected, recovered, and susceptible vertices of $C'$ at step $t$, respectively. We further define the value $i^*$ as the number of infected vertices in an equilibrium configuration of a SIRS process on a clique with $n$ vertices and an infection rate of $\lambda$. We use this value because a clique and the expanders we consider behave very similarly; thus, $i^*$ is a good estimate for the number of infected vertices that $C'$ tends to have on $G'$.
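For intuition about $i^*$, the following sketch computes the fixed point of the mean-field SIRS dynamics on a clique, $\mathrm{d}i/\mathrm{d}t = \lambda i s - i$ and $\mathrm{d}r/\mathrm{d}t = i - \varrho r$ with $s = n - i - r$. The closed form below is our own derivation from these ODEs, stated as an assumption for illustration, not a quotation of the paper's definition of $i^*$.

```python
def clique_equilibrium(n, lam, rho):
    """Mean-field equilibrium of the SIRS process on a clique with n vertices.
    Setting di/dt = lam*i*s - i and dr/dt = i - rho*r to zero gives
    s* = 1/lam, r* = i*/rho, and i* = rho/(1 + rho) * (n - 1/lam)."""
    i_star = rho / (1 + rho) * (n - 1 / lam)
    r_star = i_star / rho
    s_star = n - i_star - r_star
    return i_star, r_star, s_star
```

Note that $i^*$ is linear in $n$ whenever $\lambda n$ is bounded away from $1$, matching the intuition that a supercritical infection stabilizes at linearly many infected vertices.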

Exponential survival time
We now show that the infection becomes epidemic if $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$. We start by proving that, when starting with one infected vertex inside of the expander, the infection reaches a configuration with linearly many infected vertices with sufficiently large probability.
◮ Lemma 4.1. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $d \to \infty$ and $\varepsilon, \eta \to 0$ as $n \to \infty$. Let $C$ be a SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Also, let $C$ start with at least one infected vertex in $G'$ and no recovered vertices in $G'$. Consider the projection $C'$ of $C$ onto $G'$. If $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$, then there exists an $\alpha \in \mathbb{R}_{>0}$ such that for sufficiently large $n$, the probability that there exists a $t \in \mathbb{N}$ with $I_t \ge \alpha n$ is at least $\frac{1}{n+2}$. ◭

Proof. Let $c' = c - 1$. Note that $c'$ is positive because $c > 1$. Let $a, \gamma \in \mathbb{R}_{>0}$ be constants that we specify later. We define for all $t \in \mathbb{N}$ the potential $\varphi_t = \varphi(I_t, R_t) = I_t - a R_t$. Additionally, we define the stopping time $\tau = \inf\{t \in \mathbb{N} \mid \varphi_t \le 0 \vee S_t < (1 - \gamma)n\}$ and the natural filtration $(\mathcal{F}_t)_{t \in \mathbb{R}_{\ge 0}}$ of $C$. We aim to show that $(\varphi_t)_{t \in \mathbb{N}}$ is a submartingale until $\tau$. This allows us to apply the optional-stopping theorem (Theorem 2.1) to bound $\mathrm{E}[\varphi_\tau]$ from below. The law of total expectation then yields a lower bound of $\frac{1}{n+2}$ for $\Pr[\varphi_\tau > 0]$. We conclude the proof by showing that if $\varphi_\tau > 0$, then $I_\tau \ge \alpha n$. We first bound the number of infected–susceptible edges, using Corollary 2.6, for all times $t < \tau$. We then bound for all $t \in \mathbb{N}$ the drift $\mathrm{E}[(\varphi_{t+1} - \varphi_t) \cdot \mathbb{1}_{t < \tau} \mid \mathcal{F}_t]$. To improve readability, we omit the multiplicative $\mathbb{1}_{t < \tau}$ in all of the terms.
The last inequality holds by first choosing $a < c'$ and then choosing $\gamma$ small enough. Then, for sufficiently small $\varepsilon$ and $\eta$, both of the summands are positive.
Note that $\mathrm{E}[\tau] < \infty$ because in each step $t \in \mathbb{N}_{<\tau}$, there is a non-zero probability (independent of $t$) to recover a vertex; hence, there is always a non-zero probability to recover all vertices within the next $n$ steps, which stops the process. Therefore, by applying the optional-stopping theorem (Theorem 2.1), we get $\mathrm{E}[\varphi_\tau] \ge \varphi_0$. Because of the definition of $\tau$ and the fact that $\varphi$ changes by at most $1 + a \le 2$ in one step, we get that $\varphi_\tau \ge -2$. We also know that $\varphi_\tau \le n + 2$, as $\varphi_{\tau - 1} \le I_{\tau - 1} \le n$. By the definition of $C$, it holds that $\varphi_0 \ge 1$. By the law of total expectation, we thus get that $\Pr[\varphi_\tau > 0] \ge \frac{1}{n + 2}$.
Now assume $\varphi_\tau > 0$. By the definition of $\tau$, it then holds that $S_\tau < (1 - \gamma)n$. Therefore, $I_\tau + R_\tau > \gamma n$. With $\varphi_\tau > 0$, we then get $I_\tau > a R_\tau$, which implies $I_\tau > \frac{a}{1 + a} \gamma n$. Choosing $\alpha$ accordingly concludes the proof. ◭
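The optional-stopping step in the proof above can be illustrated on the simplest possible case, a symmetric $\pm 1$ random walk: it is a martingale, so stopping at the boundaries $0$ and $n$ and applying $\mathrm{E}[X_\tau] = X_0$ forces $\Pr[\text{hit } n \text{ before } 0] = X_0/n$. The following Monte-Carlo sketch (our own illustration; the function name is ad hoc) checks this.

```python
import random

def hit_top_frequency(start, n, runs=20000, seed=0):
    """Monte-Carlo estimate of Pr[a symmetric +-1 random walk started at
    `start` hits n before 0]. By the optional-stopping theorem applied to
    the martingale X_t, this probability equals start/n exactly -- the same
    style of argument as in the proof of Lemma 4.1."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(runs):
        x = start
        while 0 < x < n:
            x += 1 if rng.random() < 0.5 else -1
        hits += (x == n)
    return hits / runs
```

The estimate for `hit_top_frequency(1, 10)` concentrates around $1/10$, mirroring the $\frac{1}{n+2}$-type lower bound that the submartingale argument extracts for the SIRS potential.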
To show that the infection survives long from that point onward, we define a potential function that assigns a real number to each configuration of the process, and we analyze its drift. The potential function is an adjusted version of the Lyapunov function of Korobeinikov and Wake [KW02]. We first define a helper function $f$.
◮ Definition 4.2. Let $f \colon (\mathbb{R}_{>0})^2 \to \mathbb{R}$ be such that, for all $x, x^* \in \mathbb{R}_{>0}$, we have $f(x^*, x) = x - x^* - x^* \ln(x / x^*)$. ◭

Hence, for a given $x^* \in \mathbb{R}_{>0}$, the value $x = x^*$ is the only local optimum of $f(x^*, \cdot)$, and it is a global minimum. We now define the potential function that we use in the following lemmas.
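The stated properties of $f$ can be checked numerically. The sketch below assumes the normalized Korobeinikov–Wake form $f(x^*, x) = x - x^* - x^* \ln(x/x^*)$; treating this explicit formula as an assumption, the code verifies that $f$ vanishes at $x = x^*$, is positive elsewhere, and decreases in $x$ for $x < x^*$.

```python
import math

def f(x_star, x):
    """Helper function in the assumed normalized Korobeinikov-Wake form:
    f(x*, x) = x - x* - x* * ln(x / x*).
    It is nonnegative, zero exactly at x = x*, and decreasing for x < x*."""
    return x - x_star - x_star * math.log(x / x_star)
```

A quick sanity check with $x^* = 3$: the function is $0$ at $x = 3$, positive on both sides, and larger at $x = 1$ than at $x = 2$, consistent with the monotonicity used in Lemma 4.8.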
◮ Definition 4.3. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $C$ be a SIRS process on $G$ with infection rate $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$ and with deimmunization rate $\varrho$. Consider the projection $C'$ of $C$ onto $G'$. Let $\varrho' = 1 + \varrho$. For all $t \in \mathbb{N}$, we define the potential $\psi_t$ in terms of $f$, the numbers $I_t$ and $R_t$, and the equilibrium value $i^*$. Further, let $(\mathcal{F}_t)_{t \in \mathbb{R}_{\ge 0}}$ be the natural filtration of $C$. We define for all $t \in \mathbb{N}$ the drift as $\Delta_t = \mathrm{E}[\psi_{t+1} - \psi_t \mid \mathcal{F}_t]$. ◭

The potential becomes very large when the infection is close to dying out. We aim to show that the process tends to drift away from that high-potential region when we ignore the impact of the vertices outside of the considered subgraph, and that there is a region in which the extra vertices only enlarge that drift. To calculate the differences of the potential values that appear in the drift, we first take a closer look at $f$.

◮ Lemma 4.4. For all $x^* \in \mathbb{R}_{>0}$ and all $x \in \mathbb{R}_{>1}$, it holds that $f(x^*, x + 1) - f(x^*, x) \le 1 - \frac{x^*}{x + 1}$ and $f(x^*, x - 1) - f(x^*, x) \le -1 + \frac{x^*}{x - 1}$. ◭

Proof. We use that for all $x \in \mathbb{R}_{>1}$, it holds that $\frac{1}{x + 1} \le \ln(1 + \frac{1}{x}) \le \frac{1}{x}$. Together with the definition of $f$, we have $f(x^*, x + 1) - f(x^*, x) = 1 - x^* \ln(\frac{x + 1}{x}) \le 1 - \frac{x^*}{x + 1}$. For the second part, we get $f(x^*, x - 1) - f(x^*, x) = -1 + x^* \ln(\frac{x}{x - 1}) \le -1 + \frac{x^*}{x - 1}$, which concludes the proof. ◭

To bound the drift, we first show that there is a region in which the drift is maximized when the influence of the vertices outside of $G'$ is $0$.

◮ Lemma 4.5. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $C$ be a SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$.
Consider the projection $C'$ of $C$ onto $G'$. Let $M(I_t, S_t)$ be the number of edges between the infected and the susceptible vertices at time $t$, and let $W_t = \lambda M(I_t, S_t) + I_t + \varrho R_t$. If $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$, then there exists a constant $\gamma \in \mathbb{R}_{>0}$ such that, for all $t \in \mathbb{N}$ and sufficiently large $n$, if $2 \le I_t \le \gamma n$, then $\Delta_t$ is bounded from above by the drift of the process with all influence of the vertices outside of $G'$ removed. ◭

Proof. We know that $M(I_t, S_t)$ consists of the infected–susceptible edges with both endpoints in $G'$ plus a non-negative number of edges with one endpoint outside of $G'$. By the definition of $\Delta_t$ and the fact that $W_t = \lambda M(I_t, S_t) + I_t + \varrho R_t$, the drift splits accordingly into an inner and an outer contribution. As the number of outside edges is non-negative, to prove the lemma, it is sufficient to show that the outer contribution is at most $0$. By Lemma 4.4, we obtain upper bounds on the involved differences of $f$ for all $x^* \in \mathbb{R}_{>0}$ and $x \in \mathbb{R}_{>2}$. Note that $W_t$ is bounded from below by $I_t$; therefore, $I_t W_t^{-1}$ is bounded from above by a constant. Assume that $I_t + 1 \le \gamma n$. Using the bounds from above, the outer contribution is at most a non-negative factor times a term that becomes negative once $\gamma$ is small enough. Hence, we can choose $\gamma$ accordingly such that the outer contribution is always at most $0$, which concludes the proof. ◭
We now show that the drift is bounded from above by a negative constant for configurations in which the number of infected vertices is very small but still linear in $n$.
◮ Lemma 4.6. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $d \to \infty$ and $\varepsilon, \eta \to 0$ as $n \to \infty$. Let $C$ be a SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Consider the projection $C'$ of $C$ onto $G'$. Let $t \in \mathbb{N}$, and let $\gamma_0, \gamma_1 \in (0, 1)$ be sufficiently small constants. Assume that $\gamma_0 n \ge I_t \ge \gamma_1 n$. If $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$, then there exists a constant $\kappa \in \mathbb{R}_{>0}$ such that $\Delta_t \le -\kappa$ for sufficiently large $n$. ◭

Proof. Let $M(I_t, S_t)$ be the number of edges between the infected and the susceptible vertices at time $t$, and let $W_t = \lambda M(I_t, S_t) + I_t + \varrho R_t$. For this proof, we first use the law of total expectation and Lemma 4.4 to get a large formula as an upper bound for $W_t \Delta_t$. We split this bound into multiple parts and bound each part separately. We show that, under the given conditions, one of the parts is bounded from above by $-\kappa' n$ for a constant $\kappa' \in \mathbb{R}_{>0}$ and that the other part is in $o(n)$, so it is asymptotically much smaller in absolute value than the first part. We conclude the proof by bounding $W_t$ from above and dividing the obtained bound for $W_t \Delta_t$ by it. Using Lemma 4.5 and Lemma 4.4, we obtain such an upper bound on $W_t \Delta_t$.
Note that under the given conditions, all of $I_t$, $S_t$, $W_t$, and $i^*$ are in $\Theta(n)$, and all of $\lambda M(I_t, S_t)$, $I_t$, and $\varrho R_t$ are in $O(n)$. Therefore, the terms in the second row of the last sum are in $O(1)$; thus, we only need to bound the first part.
We know the exact values of $I_t$ and $R_t$. However, the value of $M(I_t, S_t)$ depends on which vertices are infected. We use the expander properties of $G'$ and Corollary 2.7 to bound this number. Note that both of the leading factors, involving $i^*$ and $\varrho'$, are negative and bounded from below by a constant. We get for sufficiently large $n$ a bound whose last part is in $O((\varepsilon + \eta)n)$, as the remaining factor is in $\Theta(1)$. As $\varepsilon + \eta$ goes towards $0$, this part is asymptotically smaller than the rest of the drift, which we bound now.
We aim to bound this remaining term from above. To this end, we use that, viewed as a function of $I_t$, it has exactly one maximum for positive $I_t$, and we additionally bound $I_t \le \gamma_0 n$ from above. The expression in the brackets of the resulting bound is a constant, and we aim to show that it is negative for sufficiently small $\gamma_0$. We achieve this by showing that the part without the $\gamma_0$ is negative and then choosing $\gamma_0$ small enough. As both $\varrho$ and $c - 1$ are positive, this part is indeed negative; the last step holds because $c > 1$. Taking everything together, we get that $W_t \Delta_t$ is bounded from above by the sum of a constant, a term in $\Theta((\varepsilon + \eta)n)$, and $-\kappa' n$, where $\kappa'$ is a positive constant for sufficiently small $\gamma_0$. Dividing by $W_t \in \Theta(n)$ yields the claimed constant $\kappa \in \mathbb{R}_{>0}$ with $\Delta_t \le -\kappa$. ◭
We aim to apply the negative-drift theorem (Theorem 2.2) to bound the expected survival time of the infection. In Lemma 4.6, we showed a constant negative drift of the potential in a region of the configuration space. To apply the drift theorem, we first transform the restrictions on the configuration space into restrictions on the value of the potential. The first lemma shows that if there is at least a constant fraction of infected vertices, the potential does not get too large.
◮ Lemma 4.7. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $d \to \infty$ and $\varepsilon, \eta \to 0$ as $n \to \infty$. Let $C$ be a SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Consider the projection $C'$ of $C$ onto $G'$. Let $t \in \mathbb{N}$, and let $\gamma \in (0, 1)$ be a constant such that $I_t \ge \gamma n$. If $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$, then $\psi_t \in O(n)$. ◭

Proof. We aim to bound $\psi_t$ from above by writing it as a sum and bounding the individual summands. To this end, we first bound the terms that appear in the summands. By the definition of our random variables and the fact that there are only $n$ vertices, all quantities appearing in $\psi_t$ are at most $\varrho' n$, and the relevant ones are bounded from below by a constant fraction of $n$, as $I_t \ge \gamma n$.
Applying these bounds to $\psi_t$ results in $\psi_t \in O(n)$. ◭
The next lemma shows that when the number of infected vertices becomes small, the potential gets rather large. Together with the previous lemma, this shows that having few infected vertices and having a high potential are roughly equivalent.

◮ Lemma 4.8. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $d \to \infty$ and $\varepsilon, \eta \to 0$ as $n \to \infty$. Let $C$ be a SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Consider the projection $C'$ of $C$ onto $G'$. Let $t \in \mathbb{N}$, and let $\gamma \in (0, i^*/n)$ be a constant such that $1 \le I_t \le \gamma n$. If $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$, then $\psi_t$ is bounded from below by a term that grows as $\gamma$ shrinks. ◭

Proof. We aim to bound $\psi_t$ from below by bounding the values in its definition. Recall that for a given $x^* \in \mathbb{R}_{>0}$, the function $f(x^*, \cdot)$ is minimized at $x = x^*$, which is the only local extremum for $x \in \mathbb{R}_{>0}$. Therefore, we get for all $x, x^* \in \mathbb{R}_{>0}$ that $f(x^*, x) \ge f(x^*, x^*) = 0$.
Using $1 \le I_t \le \gamma n$ and the fact that for all $x^* \in \mathbb{R}_{>0}$, the function $f(x^*, \cdot)$ decreases monotonically in $x$ while $x < x^*$, we conclude the claimed lower bound. ◭

The next lemma shows that while the process has at least a constant fraction of vertices in the infected state, each potential next step only changes the potential by at most a constant.

◮ Lemma 4.9. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $d \to \infty$ and $\varepsilon, \eta \to 0$ as $n \to \infty$. Let $C$ be a SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Consider the projection $C'$ of $C$ onto $G'$. Let $t \in \mathbb{N}$, and let $\gamma \in (0, 1)$ be a constant. Assume that $I_t \ge \gamma n$. Further, let $\delta_I, \delta_R \in \{-1, 0, 1\}$ describe a possible one-step change of $(I_t, R_t)$. If $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$, then for sufficiently large $n$, the corresponding change of the potential is at most a constant in absolute value. ◭

Proof. We aim to use the triangle inequality to bound the absolute change in the $\psi$-values from above by the sum of the absolute changes in the $f$-values. We use that for all $x \in \mathbb{R}_{>1}$, it holds that $\ln(1 + \frac{1}{x}) \le \frac{1}{x}$, which bounds $|f(x^*, x + \delta) - f(x^*, x)|$ for all $x, x^* \in \mathbb{R}_{>2}$ and $\delta \in \{-1, 0, 1\}$. We apply this inequality to bound the absolute change in potential from above, noting that for sufficiently large $n$, the arguments of $f$ are at least $\gamma n / 2$. This yields the claimed constant bound. ◭

We now have the tools to apply the negative-drift theorem (Theorem 2.2) to bound the survival time of the infection.

◮ Lemma 4.10. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $d \to \infty$ and $\varepsilon, \eta \to 0$ as $n \to \infty$. Let $C$ be a SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Consider the projection $C'$ of $C$ onto $G'$. Let $\gamma_0 \in (0, 1)$ be a constant, and let $A_{\gamma_0}$ be the event that there exists a $t_0 \in \mathbb{N}$ such that $I_{t_0} \ge \gamma_0 n$. Let $T$ be the first time after $t_0$ with $I_T = 0$. If $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$, then $\mathrm{E}[T \mid A_{\gamma_0}] = 2^{\Omega(n)}$. ◭

Proof. We assume that $A_{\gamma_0}$ occurs. Let $(\mathcal{F}_t)_{t \in \mathbb{R}_{\ge 0}}$ be the natural filtration of $C$, and let $t_0 \in \mathbb{N}$ be such that $I_{t_0} \ge \gamma_0 n$. We aim to apply the negative-drift theorem (Theorem 2.2) to get the desired bound. To this end, we define a stopping time that is dominated by the number of steps until $T$, and we use the previous lemmas to show that all of the conditions for the drift theorem are satisfied. Note that
we shift the time to start at $t_0$ instead of $0$. We then translate the bound on the number of steps into a bound on the survival time.
As $I_{t_0} \ge \gamma_0 n$, by Lemma 4.7, there exists a constant $b_0 \in \mathbb{R}_{>0}$ such that $\psi_{t_0} \le b_0 n$. Let $\gamma$ be the minimum of the constants from Lemmas 4.5 and 4.6. By the contraposition of Lemma 4.7, there exists a constant $b_1 \in \mathbb{R}_{>0}$ such that $\psi_t \ge b_1 n$ implies $I_t \le \gamma n$. We define $b = \max(b_0, b_1)$. We first show that for all $t \in \mathbb{N}$ with $t_0 \le t < T$, the number of infected vertices is large enough for Lemma 4.9 to be applicable. Let $\gamma_1 \in (0, i^*/n)$ be a constant small enough that the lower bound on the potential from Lemma 4.8 with parameter $\gamma_1$ exceeds $2bn$. Such a $\gamma_1$ exists, as the involved constants are positive. Then, by the contraposition of Lemma 4.8, for all $t \in \mathbb{N}$, it follows that $\psi_t \le 2bn$ implies $I_t \ge \gamma_1 n$.
To show that condition 2 of Theorem 2.2 is satisfied, recall that for all $t \in \mathbb{N}$ with $t_0 \le t < T$, it holds that $\psi_t \le 2bn$ and therefore $I_t \ge \gamma_1 n$. Hence, by Lemma 4.9, each step changes the potential by at most a constant. We now show that condition 1 is satisfied as well. We already showed that for all $t \in \mathbb{N}$ with $t_0 \le t < T$, it holds that $\gamma_1 n \le I_t \le \gamma n$. Hence, the conditions of Lemma 4.6 are satisfied, and we get that there exists a constant $\kappa \in \mathbb{R}_{>0}$ such that for all these $t$, it holds that $\Delta_t \le -\kappa$. Now all of the conditions of Theorem 2.2 are satisfied, and we get that the probability that the potential reaches $2bn$ within $2^{\Omega(n)}$ steps goes towards $0$ as $n$ goes towards infinity. We showed that for all $t \in \mathbb{N}$ with $t_0 \le t < T$, it holds that $I_t \ge \gamma_1 n > 0$; thus, the number of steps until $T$ dominates this number of steps. Note that the clocks in $C$ may trigger at an arbitrarily high rate, as we do not have an upper bound on the number of edges of $G$. However, the numbers of recovery triggers, infection triggers, and deimmunization triggers that occur until $T$ pairwise differ by at most $n$, and each inter-trigger time has an exponential distribution. As we only consider recovery clocks, they trigger at a rate of at most $n$, and the expected time between two triggers is thus at least $\frac{1}{n}$. By Wald's equation (Theorem 2.4), we get that $\mathrm{E}[T \mid A_{\gamma_0}] = 2^{\Omega(n)}$. ◭

We now prove our main result.
◮ Theorem 4.11. Let $G$ be a graph, and let $G'$ be a subgraph of $G$ that is an $(n, (1 \pm \varepsilon)d, \eta)$-expander. Let $d \to \infty$ and $\varepsilon, \eta \to 0$ as $n \to \infty$. Let $C$ be the SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Further, let $C$ start with at least one infected vertex in $G'$ and no recovered vertices in $G'$. Last, let $C'$ be the projection of $C$ onto $G'$, and let $T$ be the survival time of $C'$. If $\lambda \ge c/d$ for a constant $c \in \mathbb{R}_{>1}$, then for sufficiently large $n$, it holds that $\mathrm{E}[T] = 2^{\Omega(n)}$. ◭

Proof. For all $\gamma \in (0, 1)$, let $A_\gamma$ be the event that there exists a $t \in \mathbb{N}$ such that $I_t \ge \gamma n$. By Lemma 4.1, there exists an $\alpha \in \mathbb{R}_{>0}$ such that for sufficiently large $n$, it holds that $\Pr[A_\alpha] \ge \frac{1}{n+2}$. By Lemma 4.10, it holds that $\mathrm{E}[T \mid A_\alpha] = 2^{\Omega(n)}$. Using the law of total expectation, we get $\mathrm{E}[T] \ge \Pr[A_\alpha] \cdot \mathrm{E}[T \mid A_\alpha] = 2^{\Omega(n)}$. ◭
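The negative-drift mechanism behind the main result can be illustrated with the simplest process it applies to: a reflected random walk that moves up with probability $p < 1/2$ has a hitting time of level $L$ that grows exponentially in $L$, roughly like $(q/p)^L$ with $q = 1 - p$. The sketch below is our own illustration, not the SIRS process itself.

```python
import random

def hitting_time(L, p_up=0.4, seed=0, cap=10**7):
    """Number of steps until a random walk with constant negative drift
    (up with probability p_up < 1/2, down otherwise, reflected at 0)
    first reaches level L, starting from 0. Illustrates the exponential
    lower bound delivered by the negative-drift theorem."""
    rng = random.Random(seed)
    x, steps = 0, 0
    while x < L and steps < cap:
        steps += 1
        x += 1 if rng.random() < p_up else -1
        x = max(x, 0)  # reflecting barrier at 0
    return steps
```

With the same seed, the walk toward level $10$ retraces the walk toward level $5$ and then continues, so hitting times are strictly increasing in $L$; averaged over seeds, they grow roughly like $1.5^L$ for $p = 0.4$.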

Special graph classes
We present the implications of Theorem 4.11 for special graph classes. We focus our attention on two random graph models, namely Erdős–Rényi graphs and hyperbolic random graphs.

Erdős-Rényi graphs
In order to apply Theorem 4.11 to Erdős–Rényi graphs, we make use of the following result.

Hyperbolic random graphs
For the formal definition of a hyperbolic random graph, we refer the reader to the article by Krioukov, Papadopoulos, Kitsak, Vahdat, and Boguñá [KPK+10]. The two key properties we require for our main result to be applicable to hyperbolic random graphs are the following.
◮ Theorem 5.3 ([FK18, Theorem 1]). Let $G$ be a hyperbolic random graph with $n$ vertices that follows a power-law degree distribution with exponent $\beta \in (2, 3)$. Then the diameter of the giant component of $G$ is $O((\log n)^{2/(3-\beta)})$ with probability $1 - O(n^{-3/2})$. ◭

◮ Theorem 5.4 ([FK15]). Let $G$ be a hyperbolic random graph with $n$ vertices that follows a power-law degree distribution with exponent $\beta \in (2, 3)$. Then the size of the largest clique of $G$ is in $\Theta(n^{(3-\beta)/2})$ with high probability. ◭

We first use the poly-logarithmic diameter to show that the infection reaches the largest clique with a sufficient probability when the process starts with at least one infected vertex.

◮ Lemma 5.5. Let $G$ be a hyperbolic random graph with $n$ vertices that follows a power-law degree distribution with exponent $\beta \in (2, 3)$, and let $C$ be an SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Further, let $C$ start with at least one infected vertex in the giant component and no recovered vertices in the giant component. If $\lambda \ge c n^{(\beta-3)/2}$ for a constant $c \in \mathbb{R}_{>0}$, then the probability that the infection reaches a state in which a vertex in the largest clique is infected is at least $\exp(-(\ln n)^{3/(3-\beta)})$ for sufficiently large $n$. ◭

Proof. Let $v$ be a vertex that starts infected, and let $\ell$ be the shortest distance from $v$ to any vertex of the largest clique. Note that $\ell$ is bounded from above by the diameter of the giant component. Therefore, by Theorem 5.3, there exists a constant $b \in \mathbb{R}_{>0}$ such that for sufficiently large $n$, with a probability of at least $\frac{1}{2}$, it holds that $\ell \le b (\ln n)^{2/(3-\beta)}$. For all $k \in \mathbb{N}$, let $B_k$ be the event that $C$ reaches a state with an infected vertex that has a distance of $k$ to the largest clique. Consider for all $k \in \mathbb{N}_{<\ell}$ the probability $\Pr[B_k \mid B_{k+1}]$. Each vertex with a distance of $k + 1$ to the largest clique has a neighbor that has a distance of $k$ to the clique. With a probability of $\frac{\lambda}{1+\lambda}$, an infected vertex infects a specific neighbor before recovering. Therefore, $\Pr[B_k \mid B_{k+1}] \ge \frac{\lambda}{1+\lambda} \ge \frac{c}{2} n^{(\beta-3)/2}$ for sufficiently large $n$. With a probability of
at least $\frac{1}{2}$, it holds that $\ell \le b (\ln n)^{2/(3-\beta)}$. This yields for sufficiently large $n$ the claimed bound of $\exp(-(\ln n)^{3/(3-\beta)})$. ◭

When the infection reaches the largest clique of a hyperbolic random graph, Theorem 4.11 yields an exponential expected survival time for a sufficiently large infection rate.

◮ Corollary 5.6. Let $G$ be a hyperbolic random graph with $n$ vertices that follows a power-law degree distribution with exponent $\beta \in (2, 3)$, and let $C$ be the SIRS process on $G$ with infection rate $\lambda$ and with constant deimmunization rate $\varrho$. Further, let $C$ start with at least one infected vertex in the giant component and no recovered vertices, and let $T$ be the survival time of $C$. Then there exists a constant $c \in \mathbb{R}_{>0}$ such that if $\lambda \ge c n^{(\beta-3)/2}$, then $\mathrm{E}[T] = 2^{\Omega(n^{(3-\beta)/2})}$. ◭

Proof. Let $k$ be the size of the largest clique of $G$. By Theorem 5.4, there exists a constant $b \in \mathbb{R}_{>0}$ such that with high probability, it holds that $k \ge b n^{(3-\beta)/2}$. Let $c = b^{-1} + 1$, such that with high probability, it holds that $\lambda \ge \frac{1 + b}{k}$. Let $B$ be the event that $C$ reaches a configuration in which a vertex in the largest clique of $G$ is infected. By Lemma 5.5, it holds that $\Pr[B] \ge \exp(-(\ln n)^{3/(3-\beta)})$ for sufficiently large $n$. Note that a clique with $k$ vertices is a $(k, (1 \pm k^{-1})k, (k-1)^{-1})$-expander. Hence, by Theorem 4.11, it holds that $\mathrm{E}[T \mid B] = 2^{\Omega(k)}$, as the infection survives that long on the clique alone after its first vertex gets infected. By the law of total expectation and the fact that with high probability $k \ge b n^{(3-\beta)/2}$, we conclude $\mathrm{E}[T] \ge \mathrm{e}^{-(\ln n)^{3/(3-\beta)}} \cdot 2^{\Omega(n^{(3-\beta)/2})} = 2^{\Omega(n^{(3-\beta)/2})}$. ◭
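The relay bound $\frac{\lambda}{1+\lambda}$ used in Lemma 5.5 is simply the probability that an exponential clock of rate $\lambda$ rings before an independent clock of rate $1$; chaining it along a path of $k$ edges gives the bound below (a sketch for illustration; the function name is ad hoc).

```python
def path_relay_probability(lam, k):
    """Lower bound in the style of Lemma 5.5: an infected vertex infects one
    fixed neighbor before recovering with probability lam / (1 + lam)
    (a race of two independent exponential clocks), so the infection relays
    along a path of k edges with probability at least (lam / (1 + lam))**k."""
    return (lam / (1 + lam)) ** k
```

With $\lambda \ge c n^{(\beta-3)/2}$ and $k = O((\ln n)^{2/(3-\beta)})$, this product stays above the $\exp(-(\ln n)^{3/(3-\beta)})$ bound of Lemma 5.5.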