
Comments on: Subsampling weakly dependent time series and application to extremes



Fig. 1
Fig. 2


Acknowledgement

Research partially supported by NSF grant number DMS 1007703 and NSA grant number H98230-11-1-0130.

Author information


Correspondence to S. N. Lahiri.

Additional information

This comment refers to the invited paper available at doi:10.1007/s11749-011-0269-8.

Appendix

Here we provide an outline of the proof of the Theorem. First, note that by the continuity of the limit law ℍ(⋅) and by a subsequence argument, it is enough to show that for each fixed x,

$$\hat{\mathbb{H}}_{m,n}(x) - \mathbb{H}(x)\rightarrow_p 0.$$
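The subsequence step can be spelled out (a standard argument, sketched here for completeness rather than quoted from the comment): for any subsequence there is a further subsequence along which the pointwise convergence holds almost surely for every rational x; since each \(\hat{\mathbb{H}}_{m,n}\) is a distribution function and ℍ is continuous, monotonicity then upgrades this to

$$\sup_{x\in\mathbb{R}}\bigl|\hat{\mathbb{H}}_{m,n}(x)-\mathbb{H}(x)\bigr|\rightarrow_{p}0,$$

which is the uniform version of the stated claim.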

Fix x∈ℝ. Write \(M_{i,b}=\max Y_{i,b}\) and \(M_{b}^{*}= \max\{X_{1}^{*},\ldots,X_{b}^{*}\}\), the maximum over a single resampled block. Also, let \(\mathbb{H}^{\dagger}_{b}(x) =P_{*}([M_{b}^{*}-{v}_{m}]/{u}_{m} \leq x)\). Then, by (1) and the independence of the resampled blocks, it is easy to check that

$$\hat{\mathbb{H}}_{m,n}(x) = \bigl\{\mathbb{H}^{\dagger}_{b}(x)\bigr\}^{k},$$

which, by Theorem 5 and Condition (i), converges to ℍ(x) in probability provided \(k^{2}\operatorname{Var}({\mathbb{H}}^{\dagger}_{b}(x)) \rightarrow0\). Note that, by stationarity, with \(w_{m}(x) = u_{m}^{-1} x+v_{m}\),

where \(C_{1},C_{2},\ldots\in(0,\infty)\) are constants. By Conditions (i) and (iii), it follows that \(I_{1n}(p)=O(m/n)=o(1)\) for every fixed \(p\geq 1\). And, by retracing the arguments in the proof of Theorem 5, one can show that for any \(j\geq pb\), \(p>1\), and \(\alpha_{jn}>0\),

Setting \(\alpha_{jn}=[m u_{m}\eta(jb)]^{1/2}\) and noting that \(m=kb\), we have

which, by Condition (iii), goes to zero by first letting n→∞ and then p→∞. This completes the proof of the theorem.
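The estimator appearing in the proof can be computed directly. Below is a minimal numerical sketch: the AR(1) series, the uniform block-sampling scheme, and the centering/scaling constants `u_m`, `v_m` are illustrative assumptions of ours, not choices made in the comment. It estimates \(\mathbb{H}^{\dagger}_{b}(x)\) by Monte Carlo and raises it to the k-th power, mirroring the identity used above.

```python
import numpy as np

rng = np.random.default_rng(0)

def subsample_max_cdf(series, b, k, x_grid, u_m, v_m, n_boot=2000):
    """Monte Carlo sketch: estimate H_b^dagger(x) = P_*([M_b^* - v_m]/u_m <= x),
    the conditional law of the maximum of one resampled length-b block, then
    raise it to the k-th power -- independence of the k resampled blocks gives
    the estimator for the maximum over m = k*b observations."""
    n = len(series)
    starts = rng.integers(0, n - b + 1, size=n_boot)        # block start points
    block_max = np.array([series[s:s + b].max() for s in starts])
    z = (block_max - v_m) / u_m                             # normalized block maxima
    h_dagger = np.array([(z <= x).mean() for x in x_grid])  # empirical H_b^dagger
    return h_dagger ** k

# Illustrative weakly dependent series: a Gaussian AR(1) with coefficient 0.5
n = 2000
eps = rng.standard_normal(n)
series = np.empty(n)
series[0] = eps[0]
for t in range(1, n):
    series[t] = 0.5 * series[t - 1] + eps[t]

x_grid = np.linspace(-2.0, 6.0, 9)
est = subsample_max_cdf(series, b=50, k=8, x_grid=x_grid, u_m=1.0, v_m=2.0)
```

Since `h_dagger` is an empirical distribution function evaluated on an increasing grid, the returned estimate is automatically a nondecreasing function of x with values in [0, 1].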


Cite this article

Lahiri, S.N., Mukhopadhyay, S. Comments on: Subsampling weakly dependent time series and application to extremes. TEST 20, 491–496 (2011). https://doi.org/10.1007/s11749-011-0273-z
