## Contributions to robustness theory

### La Vecchia, Davide; Ronchetti, Elvezio (supervisor); Trojani, Fabio (co-supervisor)

### Doctoral thesis: Università della Svizzera italiana, 2011; 2011ECO004.


### Summary

The goal of this PhD thesis is the definition of new robust estimators, thereby extending the available theory and exploring new directions for applications in finance. The thesis contains three papers, which analyze three different types of estimators: M-, minimum-distance, and R-estimators. The focus is mainly on their infinitesimal robustness, but global robustness properties are also considered.

The first paper (Higher-order infinitesimal robustness) studies M-estimators and is joint work with Elvezio Ronchetti and Fabio Trojani. Using the higher-order von Mises expansion, we go beyond the influence function and extend Hampel's paradigm of robustness, introducing higher-order infinitesimally robust M-estimators. We show that a bounded estimating function that also has a bounded gradient with respect to the parameter ensures, at the same time, the stability of (i) the second-order approximated bias (B-robustness), (ii) the asymptotic variance (V-robustness), and (iii) the saddlepoint density approximation. An application in finance (static risk management) concludes the paper.

The second paper (On robust estimation via pseudo-additive information measures) is jointly written with Davide Ferrari and introduces a new class of minimum-divergence (MD) estimators. The theoretical contribution of the paper is to show that robustness is dual to information theory. Information theory plays a crucial role in statistical inference: maximum likelihood estimators are related to it through the minimization of Shannon entropy (namely, minimization of the Kullback-Leibler divergence). The fundamental axiom characterizing Shannon entropy is additivity. Relaxing this assumption, we obtain a generalized entropy (called q-entropy) which exploits the link between information theory and infinitesimal robustness. Minimizing the q-entropy, we define a new class of MD robust redescending estimators, featuring B- and V-robustness and also enjoying good global robustness properties in terms of a high breakdown point.

The third paper (Semi-parametric rank-based tests and estimators for Markov processes) contains the preliminary results of a working paper that I started in Princeton with Marc Hallin. The paper deals with R-estimators and rank-based tests. Precisely, combining the flexibility of the semi-parametric approach with the distribution-freeness of rank statistics, we define R-estimators and tests for stationary Markov processes. An application to inference and testing in stochastic volatility (SV) models concludes the paper.
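To illustrate the kind of bounded estimating function underlying the first paper's B-robustness condition, the following Python sketch implements a classical Huber M-estimator of location. This is a standard textbook construction, not the thesis's own higher-order estimators; the tuning constant c = 1.345 and the iteratively reweighted scheme are conventional choices. The point is that the score function is bounded in the residual, so a small fraction of gross outliers has a bounded effect on the estimate:

```python
import numpy as np

def huber_psi(r, c=1.345):
    """Huber score function: bounded in r, the textbook example of a
    B-robust (bounded-influence) estimating function."""
    return np.clip(r, -c, c)

def m_estimate_location(x, c=1.345, tol=1e-8, max_iter=200):
    """Huber M-estimator of location, solving sum_i psi((x_i - mu)/s) = 0
    by iteratively reweighted averaging; s is a fixed MAD scale estimate."""
    mu = np.median(x)                       # robust starting value
    s = np.median(np.abs(x - mu)) / 0.6745  # MAD, consistent at the normal
    for _ in range(max_iter):
        r = (x - mu) / s
        w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))  # = psi(r)/r
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            return mu_new
        mu = mu_new
    return mu

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 50.0)])  # 5% gross errors
print(m_estimate_location(x))  # stays close to the true location 0
print(np.mean(x))              # the sample mean is dragged to about 2.5
```

An unbounded score (here, psi(r) = r, i.e. the sample mean) gives an unbounded influence function, which is exactly what the B-robustness condition rules out.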
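The q-entropy idea of the second paper can be glimpsed by replacing the logarithm in the likelihood with a pseudo-additive generalized logarithm, in the spirit of maximum Lq-likelihood estimation. The sketch below is an illustration under simplifying assumptions (normal location with known scale), not the paper's actual MD estimators: for q < 1 the first-order condition weights each observation by f(x_i; mu)^(1-q), so the effective score redescends and far-out observations receive nearly zero weight:

```python
import numpy as np

def lq(u, q):
    """Generalized logarithm (u**(1-q) - 1)/(1-q); recovers log(u) as q -> 1."""
    return (u ** (1.0 - q) - 1.0) / (1.0 - q)

def mlq_normal_location(x, q=0.9, sigma=1.0, tol=1e-8, max_iter=500):
    """Lq-likelihood estimate of a normal location with known scale sigma.
    Setting the derivative of sum_i lq(f(x_i; mu)) to zero gives weights
    f(x_i; mu)**(1-q), computed on the log scale to avoid underflow; for
    q < 1, points far in the tails receive (near-)zero weight."""
    mu = np.median(x)
    for _ in range(max_iter):
        log_f = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))
        w = np.exp((1.0 - q) * log_f)       # f(x_i; mu)**(1-q)
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            return mu_new
        mu = mu_new
    return mu

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 50.0)])  # contaminated sample
print(mlq_normal_location(x, q=0.9))  # redescending weights: outliers ignored
print(np.mean(x))                     # non-robust benchmark, about 2.5
```

As q tends to 1 the generalized logarithm reduces to the ordinary one, the weights all tend to 1, and the estimator collapses back to maximum likelihood, illustrating how relaxing additivity trades a little efficiency at the model for robustness away from it.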