Externalities in queues as stochastic processes: The case of FCFS M/G/1

Externalities are the costs that a user of a common resource imposes on others. For example, consider a FCFS M/G/1 queue and a customer with service demand of $x\geq0$ minutes who arrived into the system when the workload level was $v\geq0$ minutes. Let $E_v(x)$ be the total waiting time which could be saved if this customer gave up on their service demand. In this work, we analyze the \textit{externalities process} $E_v(\cdot)=\left\{E_v(x):x\geq0\right\}$. It is shown that this process can be represented by an integral of a (shifted in time by $v$ minutes) compound Poisson process with positive discrete jump distribution, so that $E_v(\cdot)$ is convex. Furthermore, we compute the LST of the finite-dimensional distributions of $E_v(\cdot)$ as well as its mean and auto-covariance functions. We also identify conditions under which a sequence of normalized externalities processes admits a weak convergence on $\mathcal{D}[0,\infty)$, equipped with the uniform metric, to an integral of a (shifted in time by $v$ minutes) standard Wiener process. Finally, we consider the extended framework in which $v$ is a general nonnegative random variable that is independent of the arrival process and the service demands. This leads to a generalization of an existing result from previous work of Haviv and Ritov (1998).


Introduction
Consider a conventional M/G/1 queueing system that is served according to the first-come, first-served (FCFS) discipline, with arrival rate λ > 0 and with the service distribution given by B(•). Assume that the queue is stable, and let the workload level at time t = 0 be v ≥ 0 (say) minutes. Denote the workload at time t ≥ 0 by W_v(t), and let T_i be the arrival time of the i-th customer. The main objective of this paper is to analyze the aggregate effect of an additional customer, who has arrived at time t = 0 with a service requirement of size x ≥ 0, on the waiting times of all other customers. In other words, we are interested in the distribution of the externality E(x, v), which is to be interpreted as the total waiting time that could be saved if the additional customer reduced their service requirement from x to zero. To the best of our knowledge, [9] is the only existing paper that analyzes E(x, v). In [9] it was shown that if (i) v is a random variable which is independent of the arrival process and the service requirements of the customers, and (ii) v is distributed according to the stationary distribution of the workload process, then the mean of E(x, v) is given by an explicit expression, referred to as (2) in the sequel, in terms of λ and the moments of B(•), where µ_i (i = 1, 2, . . .) is the i-th moment pertaining to B(•) and ρ ≡ λµ_1.
Whereas [9] focused on computing the mean externality (under the specific condition mentioned above), we have managed to develop a full probabilistic analysis of E(x, v). In this context it is important to notice that {E(x, v) : x, v ≥ 0} can be seen as a collection of random variables which are all defined on the same probability space. By considering v as a fixed parameter while x is given the role of a time index, we analyze the stochastic process E_v(•) ≡ E(•, v). To underline the natural interpretation of this process, let the additional customer arrive to the queue when the existing workload is v, and assume that this customer has two tasks that they want the server to perform: a first one of size x_1 ≥ 0 and a second one of size x_2 ≥ 0. Then, E_v(x_1 + x_2) − E_v(x_1) is equal to the total waiting time that could be saved by the other customers if the customer gave up on their second task but insisted on completing the first one.
The main contribution of this work lies in an extensive analysis of E_v(•), which in the sequel we refer to as the externalities process. Specific open questions which we have managed to solve in the current paper are: 1. What can be said about the distribution of the externalities in a nonstationary FCFS M/G/1 queue? As it turns out, the externalities process E_v(•) can be represented by an integral of a compound Poisson process that is shifted in time by an amount v. Importantly, this compound Poisson process is defined on the same probability space as the one on which our model is defined.
2. Observe that the expected value in (2) is convex in x, which indicates that the marginal effect of extra workload on the customer population is increasing. Is it possible to extend this result by showing convexity of the externalities process? The answer is affirmative; we also provide an explicit representation of the corresponding right-derivative.
3. Is there a systematic way to evaluate the moments of the externality E_v(x)? To this end, we derive the Laplace-Stieltjes transform (LST) of the finite-dimensional distributions of the externalities process E_v(•), from which the moments follow. In particular, we provide closed-form formulae for the auto-covariance and auto-correlation functions of E_v(•). Remarkably, it is shown that when v is fixed, the auto-correlation does not depend on the stochastic ingredients of the model, i.e., the arrival rate and the service distribution.
4. Is it possible to approximate the distribution of the externalities in some asymptotic regime? We show that, under an appropriate scaling, E_v(•) converges to a specific Gaussian limiting process. The convergence takes place as the arrival rate tends to infinity while the service distribution is 'well behaved', e.g., the service times become small in an appropriate way.

Motivation
We proceed by discussing the relevance of our results and their applicability in an operational context. We do so by distinguishing three strands of application domains.
Choice of a management scheme. In the introduction of their paper, Haviv and Ritov [9] discuss various applications of the externalities setup that they analyze: airplanes taking off from a runway, commuters crossing a bridge, jobs sharing a common CPU, and messages being routed through a common data network. Their motivation for studying the distribution of externalities is as follows. In the first place, they argue that "a zero profit operator who charges users for the use of a common facility usually likes to do so in accordance with the congestion costs that they impose on others". This aligns with results in e.g. [5, 8, 10, 11, 15], where various relations between optimal queue regulation schemes and externalities are revealed. Then, they point out that there are various policies for managing a queueing system (e.g., by implementing different service disciplines). Correspondingly, different management policies may result in different amounts of externalities imposed by the same user. This leads them to the conclusion that "the resulting pricing mechanism can serve as an additional criterion for deciding which management scheme to adopt". A general account of externalities in a queueing context is given in [6], as well as various other connections between queueing and game theory.
Queues with discretionary services. Recently, there has been a growing interest in queueing models with customers who themselves choose their service durations (see, e.g., [4, 15] and the references therein). When considering single-server queues with a non-preemptive service discipline, the customer who receives service does not care about the increasing costs of the waiting customers behind them, thus yielding a resource allocation which is inefficient from a social point of view. In order to restore social efficiency, a social planner may want to impose some sort of regulation. For example, the planner may decide on a price function which tells every customer how much they are going to pay for every service duration to be purchased. A price function is optimal if it makes the customers behave according to the socially optimal resource allocation. A reasonable price mechanism amounts to requiring every customer to pay for the expected cost which their service requirement imposes on the others. The earlier paper [15] considered a model of a single-server queue with customers who arrive according to a Poisson process and dynamically choose their service durations, showing that when the social planner is restricted to choosing a price function which is determined by the service requirement only, the optimal price function internalizes the (expected) externalities. It is an open problem [14] whether a similar phenomenon occurs when the social planner may choose a price function which depends on the state of the queue. If the answer to this question is affirmative and the social planner observes the workload level at the onset of every service duration, then the optimal price function is equal to E[E_v(x)], with v the initial workload at the start of the service and x the corresponding service requirement. As is shown in the present paper, this would reduce the search for the optimal price function to the parametric family of quadratic functions in x which are also linear in v.
Similarly, in another possible scenario a social planner observes the number of waiting customers at the start of every service but does not see the customers' service requirements (see also [8, Section 3]). In this case, the conjectured optimal price function is the conditional expectation of E_v(x) given the available information at the start of the service. Once more, our results imply that this conditional expectation is quadratic in x and linear in the number of waiting customers at the time of the start of the service.
Queues with a proactive service discipline. Consider an emergency room with a single specific bed which is reserved for patients with special needs, e.g., those who arrive because of strokes, heart attacks, etc. We refer to these patients as 'urgent', while the patients who arrive due to other reasons are called 'regular'. Note that the special bed might be useful also for regular patients, while the urgent ones can be treated only in the special bed. Hence a non-trivial question is: if there are many regular patients and no urgent patients, should the regular patients be allowed to use the special bed? Doing so is evidently beneficial to the regular patients, but it is also possible that immediately after allocating a regular patient to the special bed, a batch of urgent patients arrives whose treatments will be delayed. Now, assume that the urgent patients arrive according to a Poisson process with rate λ and that their service requirements are iid random variables with a distribution function B(•), independent of the arrival process. Then, observe that E_0(x) is equal to the total damage which is caused to the urgent patients due to an allocation of a regular patient to the special bed for x minutes once it is empty. Clearly, the decision maker could benefit from the distributional properties of E_0(x) that we establish in the present paper.
The above example connects our work with server-allocation problems in multiclass queues.Recent progress in this direction can be found in, e.g., [1,12,13,21].

Organization of the paper
The organization of this work is as follows. Section 2 starts with a brief discussion of a known result, extensively used in this paper: a fixed-point relation which is satisfied by the LST of the distribution of the number of customers who arrive to a queue during a busy period. Besides this fixed-point relation, all results presented are novel contributions. Then, Section 3 includes a representation of the externalities process E_v(•) in terms of a compound Poisson process, yielding two insightful decompositions. The first decomposition shows that E_v(•) is equal to an integral of a compound Poisson process which is shifted in time by v; the rate of this process is equal to λ and its jumps have the distribution identified in Section 2. Section 5 provides a compact analysis of the crossing times of the right-derivative of E_v(•). An important application of this decomposition can be found in Section 6, where we derive a functional central limit theorem for the externalities process.
The second decomposition shows that the distribution of an increment of E_v(•) is the same as the distribution of a sum of independent random variables. This helps in Section 4, where we derive the LST of the finite-dimensional distributions pertaining to the process E_v(•). Moreover, this decomposition plays an important role in the derivations in Section 7, where we consider the more general framework in which v is a nonnegative random variable, independent of the arrival process and the service requirements of the customers. In particular, the results of this part include a generalization of (2) to the case where v is not necessarily distributed according to the stationary distribution of the workload process.
Section 8 concludes by discussing some related open problems which lead to several directions for future research. In order to optimize the flow of the paper, all proofs are given in Section 9.

Number of customers during a busy period
This section discusses a few results concerning the number of customers who arrive to a stable FCFS M/G/1 queue during a single busy period, needed in the upcoming sections. Proposition 1 is standard [2, Chapter II.4.4], while all the other results in this section are essentially direct consequences. However, since we did not find a reference for Propositions 2-3, we decided to include their proofs. For additional work on the distribution of the number of customers who arrive during a busy period, see [22] and the references therein.
As before, we consider the setting of an M/G/1 queue with arrival rate λ and a service distribution B(•), but now the system starts empty at time t = 0. In addition, denote the LST of B(•) by b(s) ≡ ∫_0^∞ e^{−st} dB(t), s > 0, and, for any n ≥ 1, denote the n-th moment of B(•) by µ_n. Throughout this paper we assume that ρ ≡ λµ_1 ∈ (0, 1) to ensure stability. Let N(s) be the probability that exactly s customers received service during the first busy period, and denote the associated k-th moment by η_k.

Proposition 1 For every z ∈ (0, 1), the following fixed-point equation in y has a unique solution y_z which belongs to (0, 1):

y = z b(λ(1 − y)). (6)

Furthermore, y_z equals the generating function of N(•) evaluated at z.

Remark 1 Notice that 0 < z b(λ) < z b(0) < 1.
Thus, since both sides of (6) are continuous in y, for every z ∈ (0, 1) it is possible to find y_z efficiently by a standard line-search algorithm.
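As an illustration of such a line search, the sketch below solves the fixed-point equation y = z b(λ(1 − y)) by bisection, assuming (purely for concreteness) exponential service with rate µ, so that b(s) = µ/(µ + s); the function names are ours, not from the paper.

```python
# Solve the fixed-point equation y = z * b(lam * (1 - y)) for y_z in (0, 1)
# by bisection.  The LST b(.) below corresponds to Exp(mu) service; any
# other service-time LST could be plugged in instead.

def lst_exponential(s, mu):
    """LST of an Exp(mu) service-time distribution: b(s) = mu / (mu + s)."""
    return mu / (mu + s)

def solve_yz(z, lam, mu, tol=1e-12):
    """Unique root in (0, 1) of f(y) = y - z * b(lam * (1 - y))."""
    f = lambda y: y - z * lst_exponential(lam * (1.0 - y), mu)
    lo, hi = 0.0, 1.0      # f(0) = -z*b(lam) < 0 and f(1) = 1 - z > 0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: lam = 1, mu = 2 (so rho = 0.5) and z = 0.5; here y_z solves
# y * (3 - y) = 1, i.e. y_z = (3 - sqrt(5)) / 2.
y = solve_yz(0.5, 1.0, 2.0)
```

For this particular choice the answer can be verified by hand against the quadratic equation noted in the comment, which is one way to sanity-check any implementation of the line search.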
In particular, for every α > 0, we can insert z = e^{−α} into (6). This yields the following fixed-point relation for the LST Ñ(α) ≡ Σ_{s≥1} e^{−αs} N(s):

Ñ(α) = e^{−α} b(λ(1 − Ñ(α))). (9)

Therefore, we can differentiate both sides of (9) at zero in order to get a recursive formula for the moments η_n, n ≥ 1. In the sequel, for any pair of integers m and k such that 1 ≤ m ≤ k, we denote by B_{k,m} the corresponding incomplete Bell polynomial, in which the summation runs over all non-negative integers j_1, j_2, . . ., j_{k−m+1} which satisfy the two conditions j_1 + j_2 + · · · + j_{k−m+1} = m and j_1 + 2j_2 + · · · + (k − m + 1)j_{k−m+1} = k. In addition, for any pair of integers m and k such that 1 ≤ m ≤ k, we introduce a compact notation for the corresponding moment combinations.

Proposition 2 For every positive integer n, the moment η_n satisfies a recursion in terms of η_1, . . ., η_{n−1} and the moments of B(•).

The following corollary, providing explicit expressions for the first three moments in terms of the moments of B(•), is an immediate consequence of Proposition 2. The first moment η_1 also follows from the well-known result that the expected length of the busy period is µ_1/(1 − ρ), in combination with Little's law.

Corollary 1
The first three moments η_1, η_2, η_3 are given by explicit expressions in the moments of B(•); in particular, η_1 = 1/(1 − ρ). In a similar fashion, a combinatorial formula for the probability mass function N(s), s = 1, 2, . . ., may be derived by repeatedly differentiating the generating function at zero. Using the compact notation introduced above, we arrive at the following recursion.
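The first moment η₁ = 1/(1 − ρ) can also be checked numerically: differentiating the generating function y_z at z = 1 yields η₁. For M/M/1 the fixed point solves a quadratic and is available in closed form, which the sketch below exploits (the M/M/1 choice and all parameter values are purely illustrative):

```python
import math

# For M/M/1 the fixed point y_z of y = z*mu/(mu + lam*(1 - y)) solves
# lam*y^2 - (mu + lam)*y + z*mu = 0; the relevant root lies in (0, 1].
# A finite difference of y_z at z = 1 then recovers eta_1 = 1/(1 - rho).

def yz_mm1(z, lam, mu):
    """Smaller root of lam*y^2 - (mu + lam)*y + z*mu = 0."""
    disc = (mu + lam) ** 2 - 4.0 * lam * mu * z
    return ((mu + lam) - math.sqrt(disc)) / (2.0 * lam)

lam, mu = 1.0, 2.0                      # rho = lam / mu = 0.5
h = 1e-6
eta1_numeric = (yz_mm1(1.0, lam, mu) - yz_mm1(1.0 - h, lam, mu)) / h
eta1_exact = 1.0 / (1.0 - lam / mu)     # = 2 for these parameters
```

Note that yz_mm1(1, λ, µ) = 1, as it should be: at z = 1 the generating function of N(•) equals one since the busy period is almost surely finite under ρ < 1.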

Decompositions of externalities
This section first introduces the notation that will be used throughout the paper, and provides a detailed model description.Then we state our decomposition results.

Model description
With λ and B(•) as defined before, let {J(t) : t ≥ 0} be a compound Poisson process with rate λ ∈ (0, ∞) and a nonnegative jump distribution B(•). In addition, for each i ≥ 1, let T_i be the time of the i-th jump of the process J(•). Consider two processes X_1(•) and X_2(•) which are given by X_1(t) = v + J(t) − t and X_2(t) = v + x + J(t) − t, for two parameters x, v ≥ 0. Then, for each i = 1, 2, let Y_i(•) be the reflection of X_i(•) at the origin; this reflection, formally defined in e.g. [3, Section 2.4], can be thought of as a mechanism preventing the 'free processes' X_i(•) from becoming negative. Then, define, for a given initial workload v and service requirement x, the externality via E(x, v) ≡ Σ_{i≥1} [Y_2(T_i^−) − Y_1(T_i^−)]. Notice that Y_2(t) ≥ Y_1(t) for all t ≥ 0, and that the stability condition ρ < 1 implies that the hitting time of Y_2(•) at the origin is an almost surely finite random variable.
Denote this random variable by ζ and notice that this makes E(x, v) an almost surely finite random variable. Observe that from time ζ on, the processes Y_1(•) and Y_2(•) are coupled (in that they coincide). Importantly, this definition of E(x, v) is consistent with the externality introduced in the beginning of Section 1. Therefore, the quantity E(x, v) represents the externality which is due to the arrival of a customer with a service demand of x when the processing time of the existing workload is v. More generally, fixing the initial workload v ≥ 0, we can consider a stochastic process E_v(x) ≡ E(x, v) indexed by x ∈ [0, ∞), which in the sequel we refer to as the externalities process.
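The pathwise definition of E(x, v) translates directly into a Monte-Carlo sketch: run the workload path twice, from v and from v + x, driven by the same arrivals and service draws, and accumulate the waiting-time differences seen by arriving customers until the two paths couple. Exponential services and all parameter values below are illustrative assumptions; the benchmark mean λ(vx + x²/2)/(1 − ρ) is the one implied by the compound Poisson representation derived in the sequel.

```python
import random

# Simulate E(x, v) pathwise: two coupled workload levels (v and v + x)
# driven by the SAME Poisson arrivals and service requirements; each
# arriving customer's extra waiting time is the pre-arrival difference
# of the two workloads.  Coupling occurs when the larger path empties.

def externality(x, v, lam, mu, rng, horizon=10_000.0):
    w1, w2 = v, v + x
    t, total = 0.0, 0.0
    while w2 > 0.0 and t < horizon:
        gap = rng.expovariate(lam)      # time until the next arrival
        t += gap
        w1 = max(w1 - gap, 0.0)
        w2 = max(w2 - gap, 0.0)
        if w2 == 0.0:                   # both paths hit 0: coupled
            break
        total += w2 - w1                # extra waiting of this customer
        s = rng.expovariate(mu)         # common service requirement
        w1 += s
        w2 += s
    return total

rng = random.Random(12345)
lam, mu, v, x = 1.0, 2.0, 1.0, 1.0      # rho = 0.5
n = 20_000
est = sum(externality(x, v, lam, mu, rng) for _ in range(n)) / n
benchmark = lam * (v * x + x * x / 2.0) / (1.0 - lam / mu)   # = 3.0 here
```

With these parameters the Monte-Carlo estimate should land close to the benchmark value 3.0, up to the usual sampling error.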

Decomposition 1
For the analysis of the externalities process, the following notation and definitions are needed. Throughout, the initial workload v pertaining to Y_1(•) is held fixed. In the first place, let τ_0 be the end of the first busy period of Y_1(•). Also, let σ_1 be the time of the first jump of J(•) which occurs after τ_0. In addition, denote the first time after σ_1 at which Y_1(•) hits the origin by τ_1 (i.e., the end of the second busy period of Y_1(•)). Similarly, we define σ_2 as the time of the first jump of J(•) which occurs after τ_1, and τ_2 as the first time after σ_2 at which Y_1(•) hits the origin. We may continue recursively with this construction in the evident manner, thus yielding the two sequences (τ_k)_{k≥1} and (σ_k)_{k≥1}.
Also, for each k ≥ 1 denote I_k ≡ σ_k − τ_{k−1} and notice that I_1, I_2, . . . is a sequence of iid random variables which have an exponential distribution with rate λ. Furthermore, for each k ≥ 1, let N_k be the number of jumps of J(•) on (σ_k, τ_k], and notice that N_1, N_2, . . . is a sequence of iid random variables which are distributed according to N(•) (explicitly given in Proposition 3). In a similar fashion, denote the number of jumps of J(•) on (0, τ_0] by M and notice that M depends on v. Furthermore, it is important to notice that the random objects M, (I_k)_{k≥1} and (N_k)_{k≥1} are all independent.
The following identity, which directly follows from the pictorial illustration in Figure 1, is a key ingredient for the rest of our analysis.

Figure 1: The blue (resp. red) graph represents the workload process when the initial workload level is v (resp. v + x). Note that each jump which occurs during the interval [0, τ_0] contributes x to the externality. Similarly, each jump which occurs during the interval [σ_1, τ_1] contributes x − (σ_1 − τ_0) to the externality. In addition, each jump which occurs during the interval [σ_2, τ_2] contributes x − (σ_1 − τ_0) − (σ_2 − τ_1) to the externality. Finally, notice that all jumps which occur after the 'coupling time' ζ make no contribution to the externality.

Theorem 1 For every x ≥ 0, define a right-continuous nondecreasing stochastic process Ė_v(•) (in x) as the jump-counting process associated with the above construction. Then, for each x ≥ 0, the identity (21) holds, and hence E_v(•) is convex with a right-derivative which equals Ė_v(•).
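The contributions enumerated in the caption of Figure 1 suggest the following reconstruction of the key identity, with (a)⁺ ≡ max{a, 0} and with M, N_k, I_k as defined above:

```latex
E_v(x) \;=\; M\,x \;+\; \sum_{k=1}^{\infty} N_k \Bigl(x - \sum_{j=1}^{k} I_j\Bigr)^{\!+},
\qquad x \ge 0 ,
```

where the sum is almost surely finite because the k-th term vanishes as soon as I_1 + · · · + I_k ≥ x. Differentiating from the right in x then gives Ė_v(x) = M + Σ_{k≥1} N_k 1{I_1 + · · · + I_k ≤ x} and E_v(x) = ∫_0^x Ė_v(y) dy, in line with the convexity statement of Theorem 1.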
For more examples of convex stochastic processes which arise in different applications, see [16].
For each y ≥ 0, let S(y) be the number of jumps that J(•) makes until inf{t ≥ 0 : t − J(t) = y}. Notice that S(v + y) = Ė_v(y) for every y ≥ 0. Therefore, when replacing Ė_v(•) by S(v + •) in (21), this equation remains valid. Furthermore, the same technique which was applied in the proof of Proposition 1 can be used in order to show that S(•) is a compound Poisson process with rate λ and jump distribution N(•). As a result, we obtain the following compact representation of the externalities process.
Corollary 2 On the same probability space on which the model is defined, there is a compound Poisson process S(•) with rate λ and jump distribution N(•) such that, for every x ≥ 0, E_v(x) = ∫_0^x S(v + y) dy.

Decomposition 2
It is interesting to notice that Ė_v(•) equals the number of jumps of J(•) which cause an increase in the value of E_v(x). Consider some arbitrary x_1, x_2 ≥ 0 and denote ∆_v(x_1 + x_2, x_1) ≡ E_v(x_1 + x_2) − E_v(x_1). It is illustrated in Figure 2 that every jump of J(•) which causes an increase in the value of E_v(x_1) contributes x_2 to the value of ∆_v(x_1 + x_2, x_1). This means that we can write ∆_v(x_1 + x_2, x_1) as x_2 Ė_v(x_1) plus a residual sum. In particular, since the workload process is strong Markov, this residual sum is distributed as E_0(x_2) and is independent of Ė_v(x_1) (see also Figure 2). Furthermore, assume that ξ ∼ Poi(λx_2) and that U_1, U_2, . . . is an iid sequence of random variables which are distributed uniformly on [0, 1]. In particular, assume that ξ, (U_k)_{k≥1} and (N_k)_{k≥1} are independent. In addition, for each j ≥ 1 we use the notation U_{(1,j)} ≤ · · · ≤ U_{(j,j)}

Figure 2: The green (resp. red) graph describes a sample path of the workload process when a customer c with a service demand of x_1 + x_2 > 0 (resp. x_1) arrives at time zero and sees a system with existing workload level v > 0.
The blue graph describes a sample path of the workload of the same system once c reduces their service requirement to zero. Note that the jumps of the graphs are coordinated. In fact, each jump is associated with an arrival of a customer, and the size of the jump is the service demand of that customer.
Observe that every jump on (0, ζ) adds x_2 to ∆_v(x_1 + x_2, x_1). In addition, the value of the green graph at ζ equals x_2. Thus, a regenerative argument yields that ∆_v(x_1 + x_2, x_1) is distributed as x_2 multiplied by the number of jumps on (0, ζ), plus an independent random variable which is distributed like E_0(x_2).
in order to denote the order statistics of U_1, U_2, . . ., U_j. Then an application of known 'symmetry properties' of the Poisson process yields a distributional equality for ∆_v(x_1 + x_2, x_1). This argument can be applied recursively in order to derive the following theorem. As illustrated in Section 4, it provides us with a systematic approach to compute the moments of the finite-dimensional distributions of E_v(•).

2. U is an infinite array of iid random variables which are distributed uniformly on [0, 1].

4. N, U and (ξ_j)_{j≥1} are independent.

Moments of the finite-dimensional distributions

This section concentrates on the evaluation of moments corresponding to the finite-dimensional distributions of the externalities process E_v(•). We first present the mean and variance, then the auto-covariance and auto-correlation, after which we proceed with higher moments.

Mean and variance
Fix x > 0 and notice that an insertion of k = 1 into (28) yields a representation of E_v(x) in terms of compound Poisson random variables. Thus, by an application of the formula for the expectation of a compound Poisson random variable, we directly obtain the mean of E_v(x) and, as expected, the resulting expression is convex in x. Similarly, the formula for the variance of a compound Poisson random variable may be used in order to derive the variance of E_v(x).
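Using the representation of Corollary 2 together with E[S(t)] = λη₁t and Cov(S(s), S(t)) = λη₂ min{s, t}, the mean and variance can be reconstructed as follows (a sketch, consistent with η₁ = 1/(1 − ρ) from Corollary 1):

```latex
\mathbb{E}\,E_v(x) \;=\; \lambda\eta_1\!\int_0^x (v+y)\,\mathrm{d}y
\;=\; \frac{\lambda\bigl(vx + x^2/2\bigr)}{1-\rho},
\qquad
\operatorname{Var} E_v(x) \;=\; \lambda\eta_2\!\int_0^x\!\!\int_0^x \bigl(v+\min\{y,z\}\bigr)\,\mathrm{d}y\,\mathrm{d}z
\;=\; \lambda\eta_2\Bigl(vx^2 + \frac{x^3}{3}\Bigr).
```

Both expressions are indeed convex and increasing in x, in agreement with the convexity of E_v(•) established in Section 3.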

Auto-covariance and auto-correlation
Fix some x_1, x_2 > 0. Since the sums in the right-hand side of (29) are independent, we find the covariance between E_v(x_1) and the increment E_v(x_1 + x_2) − E_v(x_1). In addition, an insertion of (32) implies that the auto-covariance function of E_v(•) follows in closed form. As argued in the introduction, in the situation of a customer arriving at time 0 with two tasks (of size x_1 and x_2, respectively), E_v(x_1 + x_2) − E_v(x_1) represents the total waiting time that could be saved by the other customers if the customer gave up on their second task but insisted on completing the first one. The auto-covariance (35) provides insight into the effect of the additional x_2.
As a result, the auto-correlation function R_v(x_1, x_1 + x_2) is obtained. Surprisingly, the expression in (36) is invariant with respect to the service distribution and the arrival rate. At the same time, observe that R_v(x_1, x_1 + x_2) is positive. In addition, the expression of R_v(x_1, x_1 + x_2) shows that the externalities process is not wide-sense stationary (see the definition in [28, p. 15]).
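One way to see why the auto-correlation is distribution-free for fixed v (a sketch based on Corollary 2): since Cov(S(s), S(t)) = λη₂ min{s, t}, the factor λη₂ appears in the covariance and in both standard deviations, and therefore cancels in the ratio:

```latex
R_v(x_1, x_1+x_2)
\;=\;\frac{\displaystyle\int_0^{x_1}\!\!\int_0^{x_1+x_2}\bigl(v+\min\{y,z\}\bigr)\,\mathrm{d}y\,\mathrm{d}z}
{\sqrt{\displaystyle\int_0^{x_1}\!\!\int_0^{x_1}\bigl(v+\min\{y,z\}\bigr)\,\mathrm{d}y\,\mathrm{d}z
\int_0^{x_1+x_2}\!\!\int_0^{x_1+x_2}\bigl(v+\min\{y,z\}\bigr)\,\mathrm{d}y\,\mathrm{d}z}}\,,
```

which depends on v, x₁ and x₂ only, and not on λ or B(•).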
Remark 3 Later, in Section 7, we consider a setup in which v is a general nonnegative random variable, independent of the arrival process and the service requirements. There, it is shown that in this more general setup the auto-correlation function does depend on the arrival rate and the service distribution, unless v is a degenerate random variable.

Higher moments
Higher moments (including joint moments) of E_v(•) may be derived via differentiation of the LST formula which is given in the next theorem. This is a tedious derivation that we decided to leave out. The result below is particularly useful when analyzing a situation in which the customer arriving at time 0 has k tasks, of sizes x_1, . . ., x_k.

Crossing times of Ė_0(•)
The process Ė_v(•) is nondecreasing, with Ė_v(0) = M and Ė_v(x) ↑ ∞ as x → ∞. Therefore, it is natural to study the crossing times of the process Ė_v(•). Namely, fix some y > 0; the corresponding crossing time is x_v(y) ≡ inf{x ≥ 0 : Ė_v(x) ≥ y}. In the sequel we consider the special case v = 0, for which M = 0 and hence x(y) ≡ x_0(y) has a relatively tractable representation (but see Remark 5 below for some reflections on the case v > 0). In fact, Theorem 2 yields the representation (42), x(y) = Σ_{k=1}^{υ(y)} I_k, where υ(y) ≡ min{t ≥ 1 : Σ_{m=1}^t N_m ≥ y}. In particular, notice that υ(y) and I_1, I_2, . . . are independent. Furthermore, υ(y) can be described as the time until absorption in a Markov chain with a unique absorbing state. Specifically, this chain has state space {0, 1, 2, . . ., ⌈y⌉} with absorbing state ⌈y⌉ and initial state 0. In addition, the transition matrix P ≡ [p_{ij}] is determined by the distribution N(•). It is well known that the mean of υ(y) can be characterized via a linear system, where ψ_{⌈y⌉−1} = 1 and ψ_1, ψ_2, . . ., ψ_{⌈y⌉−2} are given recursively. Therefore, Wald's identity may be applied to (42) to deduce that E[x(y)] = E[υ(y)]/λ.

Remark 4 The second moment of υ(y) can be computed using a similar technique, thus also yielding Var[υ(y)]. Hence, it is possible to compute the variance of x(y) as well.

Remark 5 When v > 0, it makes sense to rely on a similar computation in which we condition and de-condition on M. In practice, we do not see how this computation leads to a tractable expression for the general case.
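The absorption-time characterization of υ(y) lends itself to a short backward-induction sketch; the pmf used below is a stand-in for N(•) (which in reality comes from Proposition 3), and E[x(y)] then follows from Wald's identity as E[υ(y)]/λ.

```python
# Expected number of steps until the cumulative sum of iid jumps
# N_1, N_2, ... first reaches level y.  Backward induction on the
# absorbing chain: e[s] = 1 + sum_n p(n) * e[min(s + n, y)], e[y] = 0.
# The pmf 'p' below is an illustrative stand-in for N(.).

def expected_absorption(y, pmf):
    """E[upsilon(y)] for integer y >= 1 and pmf {n: P(N = n)}, n >= 1."""
    e = [0.0] * (y + 1)                 # e[y] = 0: already absorbed
    for s in range(y - 1, -1, -1):      # states below the absorbing one
        e[s] = 1.0 + sum(p * e[min(s + n, y)] for n, p in pmf.items())
    return e[0]

lam = 1.0
pmf = {1: 0.5, 2: 0.5}                  # illustrative jump distribution
mean_steps = expected_absorption(2, pmf)    # = 1.5 by a short hand computation
mean_crossing = mean_steps / lam            # E[x(y)] via Wald's identity
```

As a sanity check, for the degenerate pmf P(N = 1) = 1 the expected absorption time at level y is exactly y, since every step increases the cumulative sum by one.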

Gaussian approximation of E v (•)
The main result of this section concerns a Gaussian approximation for the externalities process. In order to provide an accurate statement of this result, notice that the model described in Section 3 is characterized by the triplet (λ, B(•), v). Fix v ≥ 0 and consider a sequence of models such that the n-th model is associated with an arrival rate λ_n > 0 and a service distribution B_n(•). Correspondingly, for each n ≥ 1, we write µ_{k,n} for the k-th moment of B_n(•) and ρ_n ≡ λ_n µ_{1,n}. In addition, for each n ≥ 1, denote the externalities process associated with the n-th model by E_v^{(n)}(•). Also, let N_n(•) be the probability mass function of the number of customers who received service during a single busy period of a FCFS M/G/1 queue with arrival rate λ_n and service distribution B_n(•). Correspondingly, for each n, k ≥ 1, let η_{k,n} be the k-th moment of N_n(•), i.e., the analogue of η_k in the n-th model.

Functional central limit theorem
The main result of this section is stated in the next functional central limit theorem.
Theorem 4 Define, for a fixed v ≥ 0:

1. The limiting process ∫_0^x W(v + y) dy, x ≥ 0, where W(•) is a standard Wiener process.

2. A sequence (in n = 1, 2, . . .) of normalized externalities processes associated with E_v^{(n)}(•).

In addition, assume that the next conditions hold:

(ii) There is n' ≥ 1 such that ρ_n < 1 for every n ≥ n'.

(iii) The moment condition (53).

Then the sequence of normalized externalities processes converges weakly to the limiting process, where ⇒ denotes weak convergence on D[0, ∞) equipped with the uniform metric (on compacta).
Observe that checking Condition (iii) is not straightforward because it is phrased in terms of the moments of N_n(•). The following proposition presents two sets of sufficient conditions which are considerably easier to verify. Broadly speaking, the proof that these sets of conditions are sufficient relies on the expressions appearing in the statement of Corollary 1.

Proposition 4
1. Assume that Condition (i) and, in addition, the two conditions are all satisfied. Then, (53) is valid.
2. Assume that Condition (i), Condition (ii) and, in addition, the three conditions are all satisfied. Then, (53) is valid.
Remark 6 Note that the condition lim sup_{n→∞} ρ_n < 1 implies Condition (ii). In addition, under the first set of conditions in Proposition 4 a heavy-traffic regime (i.e., ρ_n ↑ 1 as n → ∞) cannot occur. Thus, the added value of the second set of conditions in Proposition 4 is that it could cover a heavy-traffic regime.
Remark 7 If (56) holds for some c ∈ (0, ∞), then (57) follows. At the same time, observe that (56) is not necessary for (57), even under the assumption that Condition (i) is satisfied.
The general idea of the proof of Theorem 4 is as follows. Condition (ii) allows us to apply Corollary 2, and hence for each n ≥ n' there is a compensated compound Poisson process S_n(•) with rate λ_n and jump distribution N_n(•) satisfying the analogue of the integral representation. The crucial part of the proof is then to establish (59); the rest follows from this benchmark via standard arguments. In the upcoming Section 6.2 we address a general result about the Gaussian approximation of compound Poisson processes, which will help in proving (59).
Remark 8 An extensive account of heavy-traffic approximations of queueing systems can be found in [27]. Notably, heavy-traffic approximations have been developed for various functionals of the queueing process (such as the number of customers and the waiting time), but to the best of our knowledge we are the first to do so for the externalities process. This means that, in the strict sense, we cannot compare our Theorem 4 with existing results. This being said, there is a vast literature on Gaussian approximations for sequences of compound Poisson processes, related to Theorem 5 below (which is heavily relied upon in our derivation of Theorem 4); we therefore include in Section 6.2 a comparison between Theorem 5 and related results.
We proceed by discussing an immediate implication of Theorem 4. To this end, fixing x ≥ 0, recall the well-known result (60) on the distribution of the integral of a Wiener process. Hence, under the conditions of Theorem 4, we conclude the corresponding convergence of the one-dimensional marginal distributions.

Remark 9 In fact, taking into account (32), the current analysis gives a new proof of (60) which is not based on stochastic calculus at all, but only on the approximation of a standard Wiener process by normalized compensated compound Poisson processes. Since (60) is known and the current proof is not simpler than the existing one, we mention this result only in passing.
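The one-dimensional limit law can be made explicit (a reconstruction from the covariance of the time-shifted Wiener process, using Cov(W(v + y), W(v + z)) = v + min{y, z} and Fubini's theorem):

```latex
\int_0^x W(v+y)\,\mathrm{d}y \;\sim\; \mathcal{N}\!\left(0,\; vx^2+\frac{x^3}{3}\right),
\qquad\text{since}\quad
\int_0^x\!\!\int_0^x \bigl(v+\min\{y,z\}\bigr)\,\mathrm{d}y\,\mathrm{d}z \;=\; vx^2+\frac{x^3}{3}.
```

For v = 0 this reduces to the classical N(0, x³/3) law of the integrated Wiener process.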

Comparison with the existing literature
Let J(•) be a compensated compound Poisson process with rate λ and a jump distribution with finite fourth moment. Denote the standard deviation of the jump distribution by γ. Then, [18, Corollary 3.7] states conditions under which the associated sequence (in n) of normalized processes converges weakly to a standard Wiener process in D[0, 1] equipped with the Skorohod topology. Some differences between Theorem 5 and [18, Corollary 3.7] are:

1. Consider the setup of Theorem 5 with λ_n = nλ and σ_n² ≡ γ²λ. Then the process considered in [18, Corollary 3.7] coincides with J_n, and hence in that sense the setup of Theorem 5 is more general.
2. Theorem 5 guarantees weak convergence in a different topological space.
3. [18, Corollary 3.7] requires that the fourth moment of the jump distribution is finite, while Theorem 5 imposes no conditions on the fourth moment of F_n (for any n ≥ 1).
4. In [18, Corollary 3.7], λ_n grows linearly in n, which implies Condition (I); obviously, Condition (I) might be satisfied in other asymptotic regimes as well.
5. In [18, Corollary 3.7], ν_n and σ_n remain fixed (in n) and hence, due to the linear growth of λ_n, Condition (II) is satisfied. Once again, it could obviously be satisfied in other asymptotic regimes as well.
Another result, [23, Theorem 1.1], concerns weak convergence in D[0, ∞) equipped with the Skorohod topology of a sequence of modulated compound Poisson processes. When all the processes in this sequence are compound Poisson processes (i.e., when the modulating Markov chains in the background are all degenerate), [23, Assumption 1] reduces to:
1. Linear growth of the sequence λ_n as n → ∞.
2. The sequences (in n) of the means and standard deviations of F n should both converge to constants.
We conclude that the comparison of Theorem 5 with [23, Theorem 1.1] strongly resembles the comparison of Theorem 5 with [18, Corollary 3.7].
Another strand of literature concerns the properties of a sequence of compound Poisson processes which weakly converges to a limiting process (see, e.g., [19,20,25]). This literature predominantly focuses on necessary conditions for weak convergence of such sequences, while Theorem 5 presents sufficient conditions.

When v is a random variable
In this part we revisit some results of the previous sections in the situation where v is a nonnegative random variable which is independent of the arrival process and the service requirements of the customers. The motivation for this extension of the existing framework lies in the fact that if v has the stationary distribution of an M/G/1 queue with arrival rate λ and service distribution B(•), then we recover the setup of Haviv and Ritov [9]. For simplicity of notation, denote the conditional expectation (resp. covariance) given v by E_v (resp. Cov_v).

Expressions for moments
To begin with, it is immediate that the decompositions of Section 3 remain true when the initial workload v is a general random variable. Similarly, Theorem 4 may be phrased in the extended setup. This is because, for every bounded uniformly continuous functional f, we may apply the law of total expectation and then the dominated convergence theorem together with Theorem 4 in order to deduce the needed result (see also [24, Corollary IV.9]).
A similar approach may be applied in order to derive the moments of the externalities process. For example, for every x ≥ 0,

Furthermore, for every x_1, x_2 ≥ 0, deduce that

Thus, the law of total covariance yields that

In particular, when x_1 = x and x_2 = 0, the variance is obtained:

Remark 11 Equations (69) and (70) imply that the correlation is invariant to the arrival rate and the service distribution if and only if Var(v) = 0 (equivalently, when v equals a constant with probability one).
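The decomposition invoked here is the standard law of total covariance; as a reminder (not a new result), in the present notation it reads:

```latex
\operatorname{Cov}\bigl(E(x_1,v),\,E(x_2,v)\bigr)
  = \mathbb{E}\Bigl[\operatorname{Cov}_v\bigl(E(x_1,v),\,E(x_2,v)\bigr)\Bigr]
  + \operatorname{Cov}\Bigl(\mathbb{E}_v\bigl[E(x_1,v)\bigr],\,
                            \mathbb{E}_v\bigl[E(x_2,v)\bigr]\Bigr).
```

The first term averages the conditional covariance over the law of v, while the second captures the extra covariance induced by the randomness of v itself; the latter vanishes exactly when v is deterministic, which is the source of the equivalence stated in Remark 11.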
For higher moments, it is possible to differentiate the LST formula, as given in the next corollary. Just as in Section 4, we do not include these computations here. The proof follows from conditioning and de-conditioning on v together with the result of Theorem 3.

Comparison with existing literature
Haviv and Ritov [9] considered the special case when v is distributed according to the stationary distribution of the corresponding M/G/1 queue with arrival rate λ and service distribution B(•). In this case, the expected value of v is given by

Observe that inserting this formula into (66) provides exactly the same expression as in [9, Eqn. (7)]. Thus, in that sense, the formulae in this section may be considered a natural generalization of that theorem, as in our framework v can have any distribution. Importantly, the proof in the current work stems from considerations other than those which appeared in the original proof of [9]. Moreover, Corollary 3 may be applied to the special case of v distributed according to the stationary distribution of the corresponding M/G/1 system; this yields a systematic approach to computing all externality moments in the model of [9].
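For concreteness, recall that in this stationary case the expected workload is given by the classical Pollaczek–Khinchine mean-value formula (in the notation of the introduction, with μ_2 the second moment of B(•) and ρ ≡ λμ_1):

```latex
\mathbb{E}[v] \;=\; \frac{\lambda \mu_2}{2\,(1-\rho)} .
```

This is the quantity to be substituted into (66) in order to recover [9, Eqn. (7)].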

Discussion and open problems
The main contributions of this work lie in the introduction of the notion of the externalities process and in the derivation of several of its properties in the case of a FCFS M/G/1 queue. The rest of this section includes a set of open problems related to the research presented in this paper.
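Before listing the open problems, we note that the basic object E_v(x) is straightforward to estimate by a coupling simulation. The following minimal sketch (our own illustration, not from the paper; function names and the Exp(1)/Poisson parameters are assumptions) computes the externality path-wise as the difference in total FCFS waiting time between two coupled systems, and illustrates the convexity of x ↦ E_v(x) established in this work:

```python
import random

def total_wait(arrivals, services, initial_work):
    """Total FCFS waiting time: each customer waits the workload found on arrival."""
    work, last_t, total = initial_work, 0.0, 0.0
    for t, s in zip(arrivals, services):
        work = max(work - (t - last_t), 0.0)  # workload drains at unit rate
        total += work                         # waiting time of this customer
        work += s                             # add this customer's demand
        last_t = t
    return total

def externality(v, x, arrivals, services):
    """Waiting time saved if the tagged customer (arriving at t = 0 on top of
    workload v) dropped a demand of size x; same sample path in both systems."""
    return total_wait(arrivals, services, v + x) - total_wait(arrivals, services, v)

rng = random.Random(7)
lam, n = 0.5, 20000  # Exp(1) services, so rho = 0.5 and the queue is stable
t, arrivals, services = 0.0, [], []
for _ in range(n):
    t += rng.expovariate(lam)
    arrivals.append(t)
    services.append(rng.expovariate(1.0))

v = 2.0
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ext = [externality(v, x, arrivals, services) for x in xs]
increments = [b - a for a, b in zip(ext, ext[1:])]
convex = all(d2 >= d1 - 1e-9 for d1, d2 in zip(increments, increments[1:]))
print(ext[0], convex)  # prints: 0.0 True
```

Path-wise convexity holds here exactly, since each Lindley iterate is a convex nondecreasing function of the initial workload, and the externality is a sum of such terms.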
1. The current analysis is sensitive to the service discipline, in that it is FCFS-specific. Thus, it might be interesting to analyze the externalities process corresponding to other service disciplines (e.g., preemptive ones) and examine the differences with respect to the results of the present paper. A particularly intriguing question concerns the characterization of the set of service disciplines for which the externalities process is convex.
2. One could consider the externalities processes of more complex queues, e.g., G/G/1, M_t/G/1, etc. It is anticipated that in such cases the analysis is considerably more involved.
3. Consider the following natural multi-server version of the externalities process. Assume that there are k servers and a Poisson arrival process of customers, where the service demands of the customers constitute a sequence of iid k-dimensional nonnegative random vectors which are independent of the arrival process. This defines k coupled FCFS M/G/1 queues. Then, define a k-dimensional process such that its i-th (1 ≤ i ≤ k) coordinate is the externalities process associated with the i-th queue. In this setup, too, one would like to describe the externalities process. A specific natural question is: are there non-trivial assumptions on the k-dimensional service distributions under which the externalities processes are asymptotically independent?
4. The Lévy-driven queue, as analyzed in [3], forms a class of storage models which can be seen as a natural generalization of the classic FCFS M/G/1 queue. A first question is: how should the externalities process be defined for such Lévy queues? In particular, it is interesting to analyze whether there is a definition for which the results of the current work may be generalized by relying on the machinery developed for Lévy processes.

Define a function and observe that, for each l ≥ 1, the chain rule implies that

In particular, when x = 1, we get that f^(l)(1) = λ^l μ_l. As a result, according to Faà di Bruno's formula, for each k ≥ 1

Thus, observe that differentiating both sides of (9) n times (at zero) with the general Leibniz rule yields that

Notice that η_n appears in the right-hand side only in the term

This immediately yields the required recursive formula.

Proof of Proposition 3
N(1) = b(λ) follows by differentiating both sides of (14) at zero. Now, consider some s ≥ 2; then the general Leibniz rule and Faà di Bruno's formula (recall f(•), defined in (74)) yield that
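For the reader's convenience, the form of Faà di Bruno's formula used throughout can be stated via the partial Bell polynomials B_{n,k} (a standard identity, not specific to this paper):

```latex
\frac{d^{n}}{dx^{n}} f\bigl(g(x)\bigr)
  \;=\; \sum_{k=1}^{n} f^{(k)}\bigl(g(x)\bigr)\,
        B_{n,k}\bigl(g'(x),\, g''(x),\, \ldots,\, g^{(n-k+1)}(x)\bigr).
```

Combined with the general Leibniz rule for the n-th derivative of a product, this is what turns the implicit LST identities into recursions for the moments.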

Proofs for Section 3
The proofs of Corollary 2 and Theorem 2 follow directly from the material presented in Section 3. Thus, we provide only the proof of Theorem 1.

Proof of Theorem 1
Observe that, by the definition of ξ(x), for each m ≥ 1 and y > 0,

As a result, for every x > 0 we have that:

With this identity at our disposal, the required result is a consequence of (18).

Proofs of Section 4
Proof of Corollary 3 With the notation used in Theorem 2, observe that (28) can be rephrased as follows:

Thus, we obtain that

Note that, given U, the sequences {N_{m,l} : m ≥ 1}, 1 ≤ l ≤ k + 1, are independent. As a consequence, the result follows by conditioning and de-conditioning on U with an application of the LST formula of a compound Poisson distribution.
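The LST formula of a compound Poisson distribution invoked in the last step is the standard one: if Y = X_1 + ... + X_N with N ~ Poisson(θ) independent of the iid X_i whose common LST is x̃(•), then

```latex
\mathbb{E}\,e^{-\alpha Y}
  \;=\; \exp\!\Bigl\{-\theta\bigl(1-\tilde{x}(\alpha)\bigr)\Bigr\},
  \qquad \alpha \ge 0 .
```

Conditioning on U, applying this formula to each of the independent compound Poisson pieces, and de-conditioning then yields the stated product form.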

Proofs of Section 6
Since the proof of Theorem 4 includes an application of Theorem 5, we start by providing the proof of Theorem 5.

Proof of Theorem 5
The following well-known bound is useful in the proof of Theorem 5:

With this bound in hand, we prove convergence of the finite-dimensional distributions, as stated in the next lemma. For the proof, it is convenient to denote

Lemma 1 The conditions of Theorem 5 imply that, for every

where Σ is a covariance matrix such that Σ_ij ≡ x_i ∧ x_j for every 1 ≤ i, j ≤ d.
Proof: To begin with, consider the special case d = 1 and assume that, for each n ≥ 1, W_n is a random variable distributed according to F_n(•).

Proof: Fix some δ > 0 and take some 0 ≤ s < t. Notice that J_n(•) is a process with stationary increments and J_n(t − s) has a continuous distribution function. Therefore, Lemma 1 yields that

Clearly, the probability on the right-hand side of (100) can be made sufficiently close to one by taking s and t close enough to each other, and hence the result follows.

Proof of Theorem 4
Let k > 0 and observe that Condition (i), Condition (ii) and Condition (iii) allow us to apply Theorem 5 with the sequence {S̄_n(•) : n ≥ 1} in order to deduce that

S̄_n(•)/√(λ_n η_{2,n}) ⇒ W(•) as n → ∞, (101)

where the convergence is in D[0, k + v] equipped with the uniform metric. Since the limit process is concentrated on C[0, k + v], according to the representation theorem [24, Theorem VI.13], there is a probability space on which this convergence holds uniformly with probability one; on it, for every x ∈ [0, k],

∫_0^x |S̄_n(v + y)/√(λ_n η_{2,n}) − W(v + y)| dy ≤ k sup_{0≤y≤k+v} |S̄_n(y)/√(λ_n η_{2,n}) − W(y)|.

Since the right-hand side converges to zero with probability one, deduce that the process x ↦ ∫_0^x S̄_n(v + y)/√(λ_n η_{2,n}) dy converges in D[0, k] equipped with the uniform metric to the process x ↦ ∫_0^x W(v + y) dy. Especially, since k is an arbitrary positive number and the limiting process is concentrated on C[0, ∞), this convergence can be extended to D[0, ∞) via [24, Theorem V.23]. Thus, the claim of Theorem 4 follows.

Proof of Proposition 4
Inserting the expressions which appear in the statement of Corollary 1 yields, for each n ≥ 1, the ratio appearing in (104). The stated assumptions imply that the numerator of (104) is O(1) as n → ∞. In addition, under the assumption (108), the denominator of (104) is bounded from below by (1 − ρ_n)√λ_n, which tends to ∞ as n → ∞. The proof of the first statement follows immediately from these results.

Due to the first statement, in order to prove the second statement it is enough to consider the case when ρ_n ↑ 1 as n → ∞. Notice that, under the lim inf assumption, the denominator of (104) admits the corresponding lower bound as n → ∞. In addition, due to (106) and (107), the numerator of (104) is O((1 − ρ_n)^{−4}) as n → ∞. Combining these results with the assumption that λ_n(1 − ρ_n) → ∞ as n → ∞ completes the proof.