The present paper is devoted to strong laws of large numbers under moment conditions near those of the law of the iterated logarithm (LIL) for i.i.d. sequences. More precisely, we wish to investigate possible limit theorems under moment conditions that are stronger than E|X|^p < ∞ for every p < 2, in which case we know that there is a.s. convergence to 0, and weaker than EX² < ∞, in which case the LIL holds.
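For orientation, the two endpoints of this moment scale can be stated in their standard formulations (recorded here for the reader's convenience): the Marcinkiewicz–Zygmund strong law and the Hartman–Wintner LIL.

```latex
% Marcinkiewicz--Zygmund strong law (1 \le p < 2):
% E|X|^p < \infty and E X = 0 imply
\frac{S_n}{n^{1/p}} \xrightarrow{\text{a.s.}} 0 .
% Hartman--Wintner law of the iterated logarithm:
% E X = 0 and E X^2 = \sigma^2 < \infty imply
\limsup_{n \to \infty} \frac{S_n}{\sqrt{2n \log\log n}} = \sigma \quad \text{a.s.}
```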
One-dimensional random variables are introduced when the object of interest is a one-dimensional function of the events (in the probability space (Ω, F, P)); recall Section 4 of the Introduction. In an analogous manner we now define multivariate random variables, or random vectors, as multivariate functions.
Hsu and Robbins (Proc. Nat. Acad. Sci. USA 33, 25–31, 1947) introduced the concept of complete convergence as a complement to the Kolmogorov strong law, in that they proved that $${\sum}_{n=1}^{\infty} P(|S_{n}|>n\varepsilon)<\infty$$ provided the mean of the summands is zero and that the variance is finite. Later, Erdős proved...
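As a quick illustration (not from the paper), here is a minimal Monte Carlo sketch with symmetric ±1 summands — the function name `tail_prob` and all parameters are our own illustrative choices — showing how fast the probabilities P(|S_n| > nε) fall off when the mean is zero and the variance is finite, which is what makes the series summable:

```python
import random

def tail_prob(n, eps, trials=20000, rng=random.Random(1)):
    """Estimate P(|S_n| > n*eps) where S_n is a sum of n i.i.d. +/-1 steps."""
    hits = 0
    for _ in range(trials):
        s = sum(rng.choice((-1, 1)) for _ in range(n))
        if abs(s) > n * eps:
            hits += 1
    return hits / trials

# The tail probabilities shrink very rapidly in n, which is what makes
# the series sum_n P(|S_n| > n*eps) converge (complete convergence).
print(tail_prob(10, 0.5), tail_prob(100, 0.5))
```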
In Chapter 1 we learned how to handle transformations in order to find the distribution of new (constructed) random variables. Since the arithmetic mean or average of a set of (independent) random variables is a very important object in probability theory as well as in statistics, we focus in this chapter on sums of independent random variables (from which one easily finds corresponding results for...
The classical central limit theorem was generalized to a functional central limit theorem by Donsker (1951) (see Theorem A.3.2). In words the result means that one considers the partial sums {S0, S1, ..., Sn} of i.i.d. variables jointly for each n and shows that if the mean and variance are finite then the (polygonal) process obtained by normalization...
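To make the construction concrete, here is a minimal sketch (with illustrative ±1 steps; the helper names `partial_sum_path` and `w_n` are ours, not the book's) of the normalized polygonal process, whose value at t = 1 is approximately standard normal for large n:

```python
import random

def partial_sum_path(n, rng):
    """Partial sums S_0, S_1, ..., S_n of i.i.d. +/-1 steps."""
    s, path = 0, [0]
    for _ in range(n):
        s += rng.choice((-1, 1))
        path.append(s)
    return path

def w_n(t, path):
    """Polygonal process W_n(t) = S_{nt}/sqrt(n), linearly interpolated."""
    n = len(path) - 1
    x = t * n
    i = int(x)
    if i >= n:
        return path[n] / n ** 0.5
    frac = x - i
    return (path[i] + frac * (path[i + 1] - path[i])) / n ** 0.5

rng = random.Random(0)
# By the functional CLT, W_n(1) is approximately N(0, 1) for large n:
endpoints = [w_n(1.0, partial_sum_path(100, rng)) for _ in range(2000)]
var = sum(x * x for x in endpoints) / len(endpoints)
print(round(var, 2))  # close to 1
```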
The standard assumptions in shock models are that the failure (of a system) is related either to the cumulative effect of a (large) number of shocks or that failure is caused by a shock that exceeds a certain critical level. An extension is to consider a mixture, that is, a system breaks down either because of a large shock or as a result of many smaller ones, whichever appears first. In this chapter...
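A minimal simulation sketch of such a mixed model (the Exp(1) shock distribution and the function `failure_index` are illustrative choices of ours, not taken from the text): the system fails at the first shock that either exceeds the critical level or pushes the cumulative damage past its threshold.

```python
import random

def failure_index(level, critical, rng):
    """Number of shocks until failure in a mixed shock model: the system
    fails when cumulative damage exceeds `level` or a single shock
    exceeds `critical`, whichever comes first.  Shocks are Exp(1) here
    (an illustrative choice)."""
    total, k = 0.0, 0
    while True:
        k += 1
        shock = rng.expovariate(1.0)
        total += shock
        if shock > critical or total > level:
            return k

rng = random.Random(0)
# A low critical level makes the "extreme shock" mechanism dominate,
# so failure tends to occur after far fewer shocks:
mean_low = sum(failure_index(10.0, 0.5, rng) for _ in range(2000)) / 2000
mean_high = sum(failure_index(10.0, 100.0, rng) for _ in range(2000)) / 2000
print(mean_low, mean_high)
```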
Throughout this chapter (Ω, $$\mathfrak{F}$$, P) is a probability space on which everything is defined, {Sn, n ≥ 0} is a random walk with positive drift, that is, S0 = 0, $$S_n=\sum\nolimits^n_{k=1}X_k, \,\, n \geq 1$$, where {Xk, k ≥ 1} is a sequence of i.i.d. random variables, and ...
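As a toy illustration of a random walk with positive drift (the Uniform(0, 1) step distribution, with drift E[X] = 1/2, is our choice, not the text's), the strong law makes S_n/n settle near the drift:

```python
import random

def drifted_walk_mean(n, rng):
    """S_n / n for a random walk with i.i.d. steps uniform on (0, 1),
    hence positive drift E[X] = 1/2 (an illustrative choice)."""
    return sum(rng.random() for _ in range(n)) / n

rng = random.Random(0)
# By the strong law of large numbers, S_n/n approaches the drift 1/2:
for n in (100, 10000):
    print(n, round(drifted_walk_mean(n, rng), 3))
```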
Suppose that an event E may occur at any point in time and that the number of occurrences of E during disjoint time intervals are independent. As examples we might think of the arrivals of customers to a store (where E means that a customer arrives), calls to a telephone switchboard, the emission of particles from a radioactive source, and accidents at a street crossing. The common feature in all...
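A minimal sketch of this setup (assuming exponentially distributed waiting times between occurrences, which yields the standard Poisson-process construction; the function name `poisson_count` is ours):

```python
import random

def poisson_count(rate, horizon, rng):
    """Number of occurrences of E in [0, horizon] when the waiting times
    between occurrences are i.i.d. Exp(rate) -- the standard
    construction of a Poisson process."""
    t, count = 0.0, 0
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return count
        count += 1

rng = random.Random(0)
# E.g. customers arriving at rate 2 per minute, watched for 5 minutes:
counts = [poisson_count(2.0, 5.0, rng) for _ in range(5000)]
print(sum(counts) / len(counts))  # mean near rate * horizon = 10
```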
Let X1, X2, . . . be a (random) sample from a distribution with distribution function F, and let X denote a generic random variable with this distribution. Very natural objects of interest are the largest observation, the smallest observation, the centermost observation (the median), among others. In this chapter we shall derive marginal as well as joint distributions of such objects.
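For the largest observation, independence immediately gives P(max ≤ x) = F(x)^n; a minimal empirical check for Uniform(0, 1) samples, where F(x)^n = x^n (the helper name and parameter values are illustrative):

```python
import random

def empirical_max_cdf(n, x, trials, rng):
    """Empirical P(max(X_1, ..., X_n) <= x) for i.i.d. Uniform(0,1)."""
    hits = sum(max(rng.random() for _ in range(n)) <= x
               for _ in range(trials))
    return hits / trials

rng = random.Random(0)
n, x = 5, 0.8
# Since the X_i are independent, P(max <= x) = F(x)^n = x^n here:
print(empirical_max_cdf(n, x, 10000, rng), x ** n)
```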
Renewal theory can be generalized in various ways. The first generalization is to leave the assumption of nonnegativity of the summands. This topic was covered in Chapter 3 and onwards. The next one was considered in Section 4.5, namely “time dependent” boundaries. Since the appearance of the first edition of this book in 1988 research in the area has moved on. In this chapter we present some of the...
Classical limit theorems such as the law of large numbers, the central limit theorem and the law of the iterated logarithm are statements concerning sums of independent and identically distributed random variables, and thus, statements concerning random walks. Frequently, however, one considers random walks evaluated after a random number of steps. In sequential analysis, for example, one considers...
Let A and B be events, and suppose that P(B) > 0. We recall from Section 3 of the Introduction that the conditional probability of A given B is defined as P(A | B) = P(A ∩ B)/P(B) and that P(A | B) = P(A) if A and B are independent. Now, let (X, Y) be a two-dimensional random variable whose components are discrete.
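A small worked instance of this definition, by exact enumeration (the two-dice events A and B are our illustration, not the book's):

```python
from fractions import Fraction
from itertools import product

# Two fair dice; A = "the sum is 8", B = "the first die is even".
outcomes = list(product(range(1, 7), repeat=2))
A = {w for w in outcomes if sum(w) == 8}
B = {w for w in outcomes if w[0] % 2 == 0}

P = lambda E: Fraction(len(E), len(outcomes))
# P(A | B) = P(A ∩ B) / P(B):
print(P(A & B) / P(B))   # prints 1/6
# Note P(A) = 5/36 differs from 1/6, so A and B are not independent.
```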
In Chapter 1 we studied how to handle (linear transformations of) random vectors, that is, vectors whose components are random variables. Since the normal distribution is (one of) the most important distribution(s) and since there are special properties, methods, and devices pertaining to this distribution, we devote this chapter to the study of the multivariate normal distribution, or, equivalently,...
Probability theory is, of course, much more than what one will find in this book (so far). In this chapter we provide an outlook on some extensions and further areas and concepts in probability theory. For more we refer to the more advanced literature cited in Appendix A. We begin, in the first section, by presenting some extensions of the classical limit theorems, that is, the law of large numbers...
In the first chapter we stated and proved various limit theorems for stopped random walks. These limit theorems shall, in subsequent chapters, be used in order to obtain results for random walks stopped according to specific stopping procedures as well as for the families of stopping times (random indices) themselves. However, before doing so we shall, in this chapter, survey some of the basic facts...