Let (ξ1, η1), (ξ2, η2), … be a sequence of i.i.d. two-dimensional random vectors. In the earlier article Iksanov and Pilipenko (2014), weak convergence in the J1-topology on the Skorokhod space of $n^{-1/2}\underset {0\leq k\leq [n\cdot ]}{\max }\,(\xi _{1}+\ldots +\xi _{k}+\eta _{k+1})$ was proved under...
If a random function is stochastically continuous (continuous with probability one, continuous in the $L_p$ sense) at every point of the parametric set... Note that sometimes, while dealing with continuity of the function X in the $L_p$ sense...
General statement of the problem of testing two hypotheses. Let the trajectory $$x(\cdot)$$ of a stochastic process $$\{X(t), t\in[0,T]\}$$ be observed.
Local perturbations of a Brownian motion are considered. As a limit we obtain a non-Markov process that behaves as a reflecting Brownian motion on the positive half line until its local time at zero reaches some exponential level, then changes sign and behaves as a reflecting Brownian motion on the negative half line until some stopping time, etc.
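The sign-flipping dynamics described above can be sketched by a crude discrete simulation. Everything below is a purely illustrative choice, not taken from the text: the time step, the proxy for "a visit to zero," the local-time increment per visit, and the exponential rate are all arbitrary.

```python
import math
import random

random.seed(1)

def sign_flipping_path(n_steps=100_000, rate=1.0):
    """Rough sketch: a reflected random walk that flips its sign each time
    an approximate local time at zero exceeds a fresh exponential level."""
    dt = 1.0 / n_steps
    x, sign = 0.0, +1
    level = random.expovariate(rate)      # exponential threshold for local time
    local_time = 0.0
    path = []
    for _ in range(n_steps):
        x = abs(x + random.gauss(0.0, math.sqrt(dt)))  # reflect at zero
        if x < math.sqrt(dt):             # crude proxy for a visit to zero
            local_time += math.sqrt(dt)   # local-time increment per such visit
        if local_time >= level:           # flip sign, draw a fresh level
            sign = -sign
            level = local_time + random.expovariate(rate)
        path.append(sign * x)
    return path

path = sign_flipping_path()
```

Between flips the simulated path stays on one half line, mimicking the alternating reflecting behavior of the limit process; this is only a visualization aid, not the construction used in the paper.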
The optimal stopping problem can be considered for nearly any stochastic process, and its formulation will be similar in each case. But its solution will be relatively simple only for a few processes. One class of such processes consists of discrete-time Markov chains.
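For a discrete-time Markov chain on a finite state space, the optimal stopping problem over a finite horizon is solved by backward induction: the value function satisfies $v_N = g$ and $v_k = \max(g, P v_{k+1})$, and it is optimal to stop where the stopping reward is at least the continuation value. A minimal sketch, with a hypothetical 3-state chain and reward chosen only for illustration:

```python
import numpy as np

# Hypothetical 3-state chain with transition matrix P and stopping reward g.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])
g = np.array([0.0, 1.0, 3.0])   # reward received upon stopping in each state
N = 20                          # horizon

# Backward induction: v_N = g, v_k = max(g, P v_{k+1}) componentwise.
v = g.copy()
for _ in range(N):
    v = np.maximum(g, P @ v)

# Optimal stopping region at time 0: stop where reward >= continuation value.
stop = g >= P @ v
```

Note that in the highest-reward state the stopping reward already dominates the continuation value, so that state belongs to the stopping region.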
In this chapter we study methods to find the projection of some element Xk on the space... The first case is called the prediction problem, when we need to approximate in the best way the "future" Xn via the "past", that is, via observations...
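The best linear prediction of the "future" value from finitely many past observations reduces to solving the normal equations $Ra = r$, where $R$ is the covariance matrix of the observations and $r$ the vector of covariances with the value being predicted. A small sketch, assuming a hypothetical stationary sequence with autocovariance $r(k)=\rho^{|k|}$ (an AR(1)-type covariance, chosen for illustration):

```python
import numpy as np

# Assumed autocovariance r(k) = rho**|k|; predict X_{n+1} from the
# last p observations X_n, ..., X_{n-p+1}.
rho, p = 0.6, 3
R = np.array([[rho ** abs(i - j) for j in range(p)] for i in range(p)])
r = np.array([rho ** (k + 1) for k in range(p)])

# Normal equations R a = r give the projection coefficients.
a = np.linalg.solve(R, r)
```

For this particular covariance the solution is $a=(\rho,0,0)$: the best linear predictor uses only the most recent observation, consistent with the Markov character of an AR(1) sequence.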
Consider a complete filtration $$\{{\mathcal{F}}_t, t\in[0,T]\}$$ and an m-dimensional Wiener process {W(t), t ∈ [0,T]} with respect to it. By definition, a stochastic differential equation (SDE) is an equation of the form $$dX(t)=b(t,X(t))\,dt+\sigma(t,X(t))\,dW(t),\quad t\in[0,T],$$ with X(0)=ξ, where ξ is an $${\mathcal{F}}_0$$ -measurable random vector, $$b=b(t,x): [0,T]\times \mathbb{R}^n\rightarrow \mathbb{R}^n$$...
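The simplest numerical scheme for an SDE of this form is the Euler–Maruyama discretization, which replaces dt by a small step and dW by a centered Gaussian increment of variance dt. A self-contained sketch; the Ornstein–Uhlenbeck-type coefficients $b(t,x)=-x$, $\sigma\equiv 1$ are an illustrative choice, not from the text:

```python
import math
import random

random.seed(0)

def euler_maruyama(b, sigma, x0, T=1.0, n=1000):
    """Euler-Maruyama scheme for dX = b(t,X) dt + sigma(t,X) dW, X(0) = x0."""
    dt = T / n
    t, x = 0.0, x0
    xs = [x]
    for _ in range(n):
        dW = random.gauss(0.0, math.sqrt(dt))   # Wiener increment ~ N(0, dt)
        x = x + b(t, x) * dt + sigma(t, x) * dW
        t += dt
        xs.append(x)
    return xs

# Illustrative coefficients: b(t,x) = -x, sigma = 1.
xs = euler_maruyama(lambda t, x: -x, lambda t, x: 1.0, x0=1.0)
```

Under the usual Lipschitz and linear-growth conditions on b and σ, this scheme converges to the strong solution as the step size tends to zero.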
Hereinafter the mapping X(t, ·) is denoted as X(t). According to commonly accepted terminology it is a random element taking values in X. The definition introduced above is obviously equivalent to the following one.
Below in this chapter we denote the discrete-time processes as Xn, Mn, and so on (with the lower index as time) and continuous-time processes, as before, are denoted as X(t), M(t), and so on (the time index is inside the parentheses).
Let the phase space $$\Bbb{X}$$ of a random sequence $$\{X_n,\,n\in\Bbb{Z}^+\}$$ be countable. The sequence $$\{X_n,\,n\in\Bbb{Z}^+\}$$ is called a Markov chain if $$\forall n\in\Bbb{N}\ \forall i_1,\ldots,i_n,i_{n+1}\in \Bbb{X} \ \forall t_1\leq\cdots\leq t_n\leq t_{n+1}\in\Bbb{Z}^+:$$ $$\mathsf{P}(X_{t_{n+1}}=i_{n+1} / X_{t_{1}}=i_{1},\ldots,X_{t_{n}}=i_{n})=\mathsf{P}(X_{t_{n+1}}=i_{n+1} / X_{t_{n}}=i_{n}).$$
The mathematical foundations of investigating the risk process in insurance were created by the Swedish mathematician Filip Lundberg in 1903–1909. For a long time this theory was developed mostly by Nordic mathematicians, such as Cramér, Segerdahl, Täcklind, and others. Later on, risk theory started to develop not only in connection with insurance but also as a method for solving various problems...
Let $$\{W(t),t\in \mathbb{R}^+\}$$ be a Wiener process, and $$\{g(t),{\mathcal{F}}_t^W,t\in \mathbb{R}^+\}$$ a stochastic process (recall that this notation means that g is adapted to the natural filtration $$\{{\mathcal{F}}_t^W\}$$ of the Wiener process). Let $${\mathcal{F}}^W=\sigma\{W_t, t\geq 0\}.$$ A process g is said to belong to the class $$\hat{\mathcal{L}}_2([a,b])$$...
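The Itô integral of an adapted process is constructed as an $L_2$ limit of left-endpoint sums, and the choice of the left endpoint matters: for the integrand $W$ itself, the sums converge to $(W(T)^2 - T)/2$ rather than the classical $W(T)^2/2$. A small numerical sketch of this (parameters are illustrative):

```python
import math
import random

random.seed(42)

# Left-endpoint (Ito) sums for the integral of W dW over [0, T]:
# sum of W(t_i) * (W(t_{i+1}) - W(t_i)) -> (W(T)**2 - T) / 2.
T, n = 1.0, 200_000
dt = T / n
w, ito_sum = 0.0, 0.0
for _ in range(n):
    dw = random.gauss(0.0, math.sqrt(dt))
    ito_sum += w * dw          # integrand evaluated at the LEFT endpoint
    w += dw

closed_form = (w * w - T) / 2.0
```

The discrepancy with ordinary calculus, the extra $-T/2$ term, is exactly the accumulated quadratic variation of the Wiener process.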
In this chapter, we consider random elements taking values in metric spaces and their distributions. The definition of a random element taking values in $${{\Bbb X}}$$ involves the predefined $$\sigma$$ -algebra $${\frak X}$$ of subsets of $${{\Bbb X}}$$ . The following statement shows that in a separable metric space, in fact, the unique natural choice for the $$\sigma$$ -algebra...
Let $${\Bbb T} \subset \mathbb{R}$$ , $$(\Omega, {\mathcal{F}}, \{{\mathcal{F}}_t\}_{t\in {\Bbb T}}, {\mathsf{P}})$$ be a probability space with a complete filtration. Let $$X = \{X(t), t\in{\Bbb T}\}$$ be an adapted stochastic process taking values in some metric space $$({\Bbb X}, {\frak X})$$ , which sometimes is called the phase space of the process X.
The vector a in relation (6.1) is the mean vector of the random vector ξ, and the matrix B is the covariance matrix of ξ. The distribution of a Gaussian vector with the characteristic function (6.1) is called the Gaussian measure with mean a and covariance B, and is denoted...
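A sample from the Gaussian measure with mean a and covariance B can be produced from standard normal variables via a Cholesky factorization: if $Z\sim N(0,I)$ and $B=LL^{\top}$, then $a+LZ\sim N(a,B)$. A sketch with an illustrative two-dimensional a and B:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative mean vector and covariance matrix (B must be positive definite).
a = np.array([1.0, -2.0])
B = np.array([[2.0, 0.6],
              [0.6, 1.0]])

# If Z ~ N(0, I) and B = L L^T, then a + L Z ~ N(a, B).
L = np.linalg.cholesky(B)
Z = rng.standard_normal((100_000, 2))
X = a + Z @ L.T            # each row is one sample from N(a, B)
```

The sample mean and sample covariance of the rows of X approximate a and B, which is a quick way to check the construction.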
In the framework of an arbitrage-free market and the Black–Scholes model consider two European call options with the same strike price and on the same underlying asset. Is it true that the option with a longer time to maturity has a larger arbitrage-free price? What can you say in this connection concerning a European put option?
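A quick numerical check with the standard Black–Scholes formulas (all parameter values below are illustrative choices, not from the text): the call price grows with time to maturity, while for a put this monotonicity can fail, e.g. deep in the money, where a longer wait to collect the near-certain payoff costs discounting.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call on a non-dividend-paying asset."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def bs_put(S, K, r, sigma, T):
    """European put price via put-call parity."""
    return bs_call(S, K, r, sigma, T) - S + K * math.exp(-r * T)

# At-the-money calls, same strike and underlying, two maturities:
c_short = bs_call(100, 100, 0.05, 0.2, 0.5)
c_long  = bs_call(100, 100, 0.05, 0.2, 1.0)   # longer maturity: larger price

# Deep in-the-money puts: here the longer maturity is CHEAPER.
p_short = bs_put(50, 100, 0.05, 0.2, 0.5)
p_long  = bs_put(50, 100, 0.05, 0.2, 1.0)
```

This illustrates the asymmetry the exercise asks about: with a nonnegative interest rate the call value is nondecreasing in maturity, but no such general statement holds for the European put.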
The following theorem shows that all the finite-dimensional distributions of a process with independent increments...