Acknowledgement
Research partially supported by NSF grant number DMS 1007703 and NSA grant number H98230-11-1-0130.
This comment refers to the invited paper available at doi:10.1007/s11749-011-0269-8.
Appendix
Here we provide an outline of the proof of the Theorem. First, note that by the continuity of the limit law $\mathbb{H}(\cdot)$ and by a subsequence argument, it is enough to show that, for each fixed $x$,
Fix $x\in\mathbb{R}$. Write $M_{i,b}=\max Y_{i,b}$ and $M_{b}^{*}=\max\{X_{1}^{*},\ldots,X_{b}^{*}\}$, the maximum over a single resampled block. Also, let $\mathbb{H}^{\dagger}_{b}(x)=P_{*}\bigl([M_{b}^{*}-v_{m}]/u_{m}\leq x\bigr)$. Then, by (1) and the independence of the resampled blocks, it is easy to check that
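Assuming, as is standard for the block bootstrap, that the resampled series is the concatenation of $k$ conditionally i.i.d. blocks of length $b$ (so that $m=kb$), the identity alluded to takes a product form; the display below is a hedged sketch under that assumption, and the notation $M_{m}^{*}$ for the maximum of all $m$ resampled values is introduced here for illustration, not taken from the original:

```latex
% Sketch (assumption): the maximum over the m = kb resampled values is
% the maximum of k conditionally i.i.d. block maxima, each distributed
% as M_b^* under the resampling probability P_*.
\[
P_{*}\!\left(\frac{M_{m}^{*}-v_{m}}{u_{m}}\leq x\right)
  =\prod_{i=1}^{k}P_{*}\!\left(\frac{M_{b}^{*}-v_{m}}{u_{m}}\leq x\right)
  =\bigl[\mathbb{H}^{\dagger}_{b}(x)\bigr]^{k},
\]
```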
which, by Theorem 5 and Condition (i), converges to $\mathbb{H}(x)$ in probability provided $k^{2}\operatorname{Var}(\mathbb{H}^{\dagger}_{b}(x))\rightarrow 0$. Note that, by stationarity, with $w_{m}(x)=u_{m}^{-1}x+v_{m}$,
where $C_{1},C_{2},\ldots\in(0,\infty)$ are constants. By Conditions (i) and (iii), it follows that $I_{1n}(p)=O(m/n)=o(1)$ for every fixed $p\geq 1$. Moreover, by retracing the arguments in the proof of Theorem 5, one can show that for any $j\geq pb$, $p>1$, and $\alpha_{jn}>0$,
Setting $\alpha_{jn}=[m u_{m}\eta(j-b)]^{1/2}$ and noting that $m=kb$, we have
which, by Condition (iii), goes to zero by first letting $n\rightarrow\infty$ and then $p\rightarrow\infty$. This completes the proof of the Theorem.
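The sufficiency of the variance condition $k^{2}\operatorname{Var}(\mathbb{H}^{\dagger}_{b}(x))\rightarrow 0$ used above can be checked directly. A minimal sketch, under the assumption that the bootstrap estimator equals the $k$-th power $[\mathbb{H}^{\dagger}_{b}(x)]^{k}$:

```latex
% For a, b in [0,1], |a^k - b^k| <= k|a - b| (Lipschitz bound for t -> t^k),
% and Var(Y) <= E(Y - c)^2 for any constant c; take c = [E H_b^dagger(x)]^k.
\[
\operatorname{Var}\bigl(\bigl[\mathbb{H}^{\dagger}_{b}(x)\bigr]^{k}\bigr)
\;\leq\; E\Bigl(\bigl[\mathbb{H}^{\dagger}_{b}(x)\bigr]^{k}
      -\bigl[E\,\mathbb{H}^{\dagger}_{b}(x)\bigr]^{k}\Bigr)^{2}
\;\leq\; k^{2}\operatorname{Var}\bigl(\mathbb{H}^{\dagger}_{b}(x)\bigr)
\;\rightarrow\; 0,
\]
```

so by Chebyshev's inequality the estimator concentrates around $[E\,\mathbb{H}^{\dagger}_{b}(x)]^{k}$, whose limit $\mathbb{H}(x)$ is supplied by Theorem 5 and Condition (i).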
Lahiri, S.N., Mukhopadhyay, S. Comments on: Subsampling weakly dependent time series and application to extremes. TEST 20, 491–496 (2011). https://doi.org/10.1007/s11749-011-0273-z