## Joint Colloquium of the Stochastics Research Groups

## TU Darmstadt / Goethe-Universität Frankfurt / Gutenberg-Universität Mainz

**Dates in the summer semester 2017**

**Friday, 28.04.2017**

Universität Mainz, Institut für Mathematik, Staudingerweg 9, Room 05-432 (Hilbertraum):

**15:15: Leif Döring (Mannheim): Skorokhod Embedding Problem for Lévy Processes**

For a given probability distribution, the classical Skorokhod embedding problem consists of finding (if possible) a stopping time such that a Brownian motion evaluated at that stopping time has the prescribed distribution. Many solutions have been found and applied in different contexts. We discuss the analogous question for Lévy processes and derive necessary and sufficient conditions for the existence of a solution. An explicit construction is derived using time-change theory for Markov processes.
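
As a small numerical illustration of the classical Brownian setting described above (not of the Lévy-process results of the talk), the first exit time of Brownian motion from an interval is a well-known Skorokhod embedding of a centered two-point law. All numerical choices below (the points a = -1 and b = 2, the step size, the replication count) are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(4)

# For a centered two-point law on {a, b} with a < 0 < b, the first exit
# time T of Brownian motion from (a, b) is a classical Skorokhod
# embedding: P(B_T = b) = -a / (b - a), so B_T has the two-point law.
def embed_two_point(a=-1.0, b=2.0, dt=1e-3, reps=5000):
    w = np.zeros(reps)                    # one Brownian path per entry
    alive = (w > a) & (w < b)             # paths that have not exited yet
    while alive.any():
        w[alive] += np.sqrt(dt) * rng.standard_normal(alive.sum())
        alive &= (a < w) & (w < b)
    return np.where(w >= b, b, a)         # clamp the (small) overshoot

samples = embed_two_point()
# The empirical frequency of b should be close to -a/(b-a) = 1/3.
```

The Euler discretization introduces a small overshoot bias of order sqrt(dt), which the final clamp only masks, not removes; for the illustration this is negligible.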

**16:45: Leonid Mytnik (Technion, Haifa): On the zero set of super-Brownian motion**

------------------------------------------------------------------------------------------------------------------------------------------------------------

**Friday, 23.06.2017**

Universität Frankfurt, Institut für Mathematik, Robert-Mayer-Str. 10, Room 711 (large), 7th floor:

**15:15: Carsten Jentsch (Universität Mannheim): Statistical inference on party positions from texts: statistical modeling, bootstrap and adjusting for time effects**

One central task in comparative politics is to locate party positions in a given political space. For this purpose, several empirical methods have been proposed that use text as a data source. In general, extracting information from texts is a difficult task: the data structure is complex, and political texts usually contain so many words that a simultaneous analysis of word counts becomes challenging. In this paper, we consider Poisson models for all word counts simultaneously and provide a statistical analysis suitable for political text data. In particular, we allow for multi-dimensional party positions and develop a data-driven way of determining the dimension of the positions. Allowing for multi-dimensional political positions gives new insights into the evolution of party positions and helps our understanding of a political system. Additionally, we consider a novel model that allows the political lexicon to change over time and develop an estimation procedure based on LASSO and fused-LASSO penalization techniques, which addresses high-dimensionality via significant dimension reduction. This model extension gives further insight into the potentially changing use of words by left- and right-wing parties over time. Furthermore, the procedure is capable of automatically identifying words that discriminate between party positions. To address the potential dependence structure of the word counts over time, we include integer-valued time series processes in our modeling approach and implement a suitable bootstrap method to construct confidence intervals for the model parameters. We apply our approach to party manifesto data from German parties over all seven federal elections since German reunification. The approach is simple to implement, as it requires neither a priori information (from external sources) nor expert knowledge to process the data. The data studies confirm that our procedure is robust, runs stably, and leads to meaningful and interpretable results.
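
The Poisson word-count idea can be sketched in a few lines. This is a deliberately simplified one-dimensional toy version in the spirit of wordfish-type scaling models, not the multi-dimensional, LASSO-penalized model of the talk; all parameter values are invented, and the word parameters are held fixed at their simulated true values purely so that the positions can be recovered by a simple grid-search MLE:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-dimensional Poisson scaling model:
# count of word j in manifesto i ~ Poisson(exp(psi_j + beta_j * theta_i)),
# where theta_i is the party position, psi_j a baseline word frequency and
# beta_j the word's discrimination parameter.
n_words = 50
positions = np.array([-1.0, -0.2, 0.3, 1.1])       # "true" party positions
psi = rng.normal(3.0, 0.3, n_words)
beta = rng.normal(0.0, 0.8, n_words)
lam = np.exp(psi[None, :] + np.outer(positions, beta))
counts = rng.poisson(lam)                          # simulated word counts

# Recover each position by grid-search Poisson MLE; psi and beta are
# treated as known only to keep the illustration short.
grid = np.linspace(-2.0, 2.0, 401)
log_lam = psi[None, :] + np.outer(grid, beta)      # shape (grid, words)

def fit_position(row):
    loglik = (row[None, :] * log_lam - np.exp(log_lam)).sum(axis=1)
    return grid[np.argmax(loglik)]

estimates = np.array([fit_position(c) for c in counts])
```

In the full model of the talk, the word parameters would of course be estimated jointly with the positions and penalized to handle high-dimensionality.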

**16:45: Claudia Kirch (Universität Magdeburg): Frequency domain likelihood approximations for time series bootstrapping and Bayesian nonparametrics**

A large class of time series methods is based on Fourier analysis, which can be viewed as a whitening of the data and gives rise, for example, to the famous Whittle likelihood. In particular, frequency domain bootstrap methods have been applied successfully in a wide range of situations. In this talk, we first review existing frequency domain bootstrap methodology for stationary time series before generalizing it to locally stationary time series. To this end, we introduce a moving Fourier transformation that captures the time-varying spectral density in a similar manner as the classical Fourier transform does for stationary time series. We obtain consistent estimators for the local spectral densities and show that the corresponding bootstrap time series correctly mimics the covariance behavior of the original time series. The approach is illustrated by means of simulations and an application to a wind data set.

All time series bootstrap methods implicitly use a likelihood approximation, which could be used explicitly in a Bayesian nonparametric framework for time series. So far, only the Whittle likelihood has been used in this context, to obtain a nonparametric Bayesian estimate of the spectral density of a stationary time series. In the second part of this talk, we generalize this approach based on the implicit likelihood from the autoregressive-aided periodogram bootstrap introduced by Kreiss and Paparoditis (2003). This likelihood combines a parametric approximation with a nonparametric correction, making it particularly attractive for Bayesian applications. Some theoretical results about this likelihood approximation, including posterior consistency in the Gaussian case, are given. The performance is illustrated in simulations and an application to LIGO gravitational wave data.
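
The Whittle likelihood mentioned above is easy to write down concretely. A minimal sketch for a stationary AR(1) process, with the innovation variance treated as known and all numerical settings invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a stationary AR(1) process X_t = phi * X_{t-1} + eps_t.
n, phi_true, sigma2 = 2048, 0.6, 1.0
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal(0.0, np.sqrt(sigma2))

# Periodogram at the positive Fourier frequencies.
freqs = 2.0 * np.pi * np.arange(1, n // 2) / n
I = np.abs(np.fft.fft(x)[1:n // 2]) ** 2 / (2.0 * np.pi * n)

# AR(1) spectral density f(w) = sigma2 / (2*pi*|1 - phi*exp(-iw)|^2),
# with sigma2 assumed known to keep the example one-dimensional.
def spec(phi, w):
    return sigma2 / (2.0 * np.pi * np.abs(1.0 - phi * np.exp(-1j * w)) ** 2)

# Whittle log-likelihood: -sum_j ( log f(w_j) + I(w_j) / f(w_j) ),
# maximized here by a simple grid search over phi.
grid = np.linspace(-0.95, 0.95, 381)
whittle = [-(np.log(spec(p, freqs)) + I / spec(p, freqs)).sum() for p in grid]
phi_hat = grid[np.argmax(whittle)]                 # close to phi_true
```

The frequency domain bootstrap methods of the talk resample (approximately independent) periodogram ordinates under exactly this kind of whitening approximation.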

------------------------------------------------------------------------------------------------------------------------------------------------------------

**Friday, 07.07.2017**

TU Darmstadt, Fachbereich Mathematik, Schlossgartenstr. 7, Room: tba:

**15:15: Mikhail Lifshits (St. Petersburg): Energy saving approximation of random processes**

The classical linear prediction problem for a wide-sense stationary process consists of finding an element of the linear span of the past values that provides the best possible mean-square approximation to the current and future values of the process. In this talk we investigate this and similar problems in which, in addition to prediction quality, the optimization takes into account other features of the objects we search for. One of the most motivating examples of this kind is the approximation of a stationary process by a stationary *differentiable* process, taking into account the kinetic energy that the latter spends in its approximation efforts. We also provide appropriate extensions of the classical Kolmogorov and Krein prediction singularity criteria and of Kolmogorov's criterion for error-free interpolation.
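
The classical linear prediction problem referred to above can be illustrated numerically: for an AR(p) process, the best linear one-step predictor coefficients solve the Yule-Walker equations. A small sketch with invented AR(2) coefficients, illustrating only the classical problem and not the energy-constrained variants of the talk:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an AR(2) process with invented (stationary) coefficients.
n = 5000
a1, a2 = 0.5, -0.3
x = np.zeros(n)
for t in range(2, n):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.standard_normal()

# Best linear one-step predictor from the past p values: the coefficient
# vector c solves the Yule-Walker system Gamma c = gamma, where Gamma is
# the autocovariance matrix of the past and gamma the cross-covariances.
p = 2
acov = np.array([x[: n - k] @ x[k:] / n for k in range(p + 1)])
Gamma = np.array([[acov[abs(i - j)] for j in range(p)] for i in range(p)])
coef = np.linalg.solve(Gamma, acov[1 : p + 1])     # approx (a1, a2)

# One-step predictions and their empirical mean-square error.
pred = coef[0] * x[p - 1 : n - 1] + coef[1] * x[p - 2 : n - 2]
mse = np.mean((x[p:] - pred) ** 2)                 # approx Var(eps) = 1
```

For a true AR(2) process the Yule-Walker predictor recovers the autoregressive coefficients, and the prediction error variance approaches the innovation variance.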

**16:45: Vitali Wachtel (Augsburg): First-passage times over moving boundaries for random walks with non-identically distributed increments**

We consider a random walk $S_n$ with independent but not necessarily identically distributed increments. Assuming that the increments satisfy the Lindeberg condition, we investigate the tail behaviour of the time $T_g=\min\{n : x+S_n\leq g_n\}$ for a large class of boundaries $g_n$. We also prove limit theorems for $S_n$ conditioned on $T_g>n$.
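
The first-passage time $T_g$ can be explored by simulation. A minimal Monte Carlo sketch with standard normal increments and an arbitrarily chosen square-root boundary (both choices are ours, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo estimate of P(T_g > n) for T_g = min{k : x + S_k <= g_k},
# here with standard normal increments (which satisfy the Lindeberg
# condition) and the illustrative moving boundary g_k = -c * sqrt(k).
def survival_prob(x=1.0, c=0.5, n=200, reps=20000):
    steps = rng.standard_normal((reps, n))
    walks = x + np.cumsum(steps, axis=1)
    boundary = -c * np.sqrt(np.arange(1, n + 1))
    crossed = (walks <= boundary).any(axis=1)      # paths with T_g <= n
    return 1.0 - crossed.mean()

p_surv = survival_prob()
```

Plotting `survival_prob` against n on a log-log scale would reveal the polynomial decay of the tail $P(T_g > n)$ that results of this type quantify.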

------------------------------------------------------------------------------------------------------------------------------------------------------------

### Directions

Directions to Universität Mainz can be found at https://www.mathematik.uni-mainz.de/anfahrt/, to Universität Frankfurt at https://www.uni-frankfurt.de/38074653/campus_bockenheim and https://www.uni-frankfurt.de/38093742/Campus_Bockenheim-pdf.pdf, and to TU Darmstadt at http://www3.mathematik.tu-darmstadt.de/fb/mathe/wir-ueber-uns/adresse-und-lageplan/anreise.html

### Dates from previous semesters can be found here.

Last update: 21.06.2017, S. Grün, gruen@mathematik.uni-mainz.de