75 results for "Risk quantification"
This work presents a framework for constructing elicitable risk measures with properties like monotonicity, translation invariance, and convexity using multiplicative scoring functions. It defines necessary conditions for these properties and provides a method for developing new elicitable functionals, with applications in finance, statistics, and machine learning.
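For context, a functional T is elicitable when it is the minimizer of an expected score; the display below is the standard definition (the paper's multiplicative scoring construction itself is not reproduced here):
\[
T(F) \;=\; \operatorname*{arg\,min}_{x \in \mathbb{R}} \; \mathbb{E}_{Y \sim F}\big[S(x, Y)\big]
\]
for some scoring function \(S\); for example, \(S(x,y) = (x-y)^2\) elicits the mean, and the pinball loss \(S(x,y) = (\mathbf{1}\{y \le x\} - \alpha)(x - y)\) elicits the \(\alpha\)-quantile.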
This paper examines the Solvency II correlation matrix used in Solvency Capital Requirement (SCR) calculations. It warns against misinterpreting null correlations as independence and highlights the matrix's limitations without a well-defined probabilistic model. It also critiques the flawed practice of arbitrarily increasing correlations to inflate capital requirements conservatively.
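For reference, the standard-formula aggregation this correlation matrix enters is
\[
\mathrm{SCR} \;=\; \sqrt{\sum_{i,j} \rho_{ij}\, \mathrm{SCR}_i\, \mathrm{SCR}_j\,},
\]
so a null entry \(\rho_{ij} = 0\) merely drops a cross term; it does not assert independence of the corresponding risk modules, which is precisely the misreading the paper warns against.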
The paper explores Pareto optimality in decentralized peer-to-peer risk-sharing markets using robust distortion risk measures. It characterizes optimal risk allocations, influenced by agents' tail risk assessments. Using flood risk insurance as an example, the study compares decentralized and centralized market structures, highlighting benefits and drawbacks of decentralized insurance.
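As background (a standard definition, not the paper's specific robust variant), a distortion risk measure applies an increasing distortion function \(g\) with \(g(0)=0\) and \(g(1)=1\) to the survival function of the loss:
\[
\rho_g(X) \;=\; \int_0^{\infty} g\big(\mathbb{P}(X > x)\big)\,dx \;+\; \int_{-\infty}^{0} \Big[g\big(\mathbb{P}(X > x)\big) - 1\Big]\,dx,
\]
and the robust versions referenced above take, broadly, worst cases of such measures over uncertainty in the distortion function or in the loss distribution.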
Elicitable functionals and consistent scoring functions support optimal forecasting, but they presuppose a correctly specified distribution, which is rarely realistic. To address this, robust elicitable functionals account for small misspecifications via a Kullback-Leibler divergence constraint. These robust functionals retain the relevant statistical properties and are applied in reinsurance and robust regression settings.
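One way to write the idea down (a sketch of the general construction, not necessarily the paper's exact definition) is to replace the expected score under the nominal model \(P\) by its worst case over a Kullback-Leibler ball of radius \(\varepsilon\):
\[
T^{\varepsilon}(P) \;\in\; \operatorname*{arg\,min}_{x}\; \sup_{Q:\, D_{\mathrm{KL}}(Q \,\|\, P) \le \varepsilon} \mathbb{E}_{Q}\big[S(x, Y)\big].
\]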
The RNN-HAR model, integrating Recurrent Neural Networks with the heterogeneous autoregressive (HAR) model, is proposed for Value at Risk (VaR) forecasting. It effectively captures long memory and non-linear dynamics. Empirical analysis from 2000 to 2022 shows RNN-HAR outperforms traditional HAR models in one-step-ahead VaR forecasting across 31 market indices.
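For reference, the linear HAR backbone (Corsi's heterogeneous autoregression) regresses next-day realized variance on daily, weekly, and monthly averages,
\[
RV_{t+1} \;=\; \beta_0 + \beta_d\, RV_t + \beta_w\, \frac{1}{5}\sum_{i=0}^{4} RV_{t-i} + \beta_m\, \frac{1}{22}\sum_{i=0}^{21} RV_{t-i} + \varepsilon_{t+1},
\]
and, as summarized above, the RNN-HAR variant feeds these heterogeneous components through a recurrent network to capture long memory and non-linear dynamics.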
This report uses UK fire statistics to model insurance claims for a company next year. It estimates the total sum of claims by modeling both the number and size of fires as random variables from statistical distributions. Monte Carlo simulations in R are used to predict the probability distribution of total claim costs.
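A minimal sketch of this frequency/severity Monte Carlo approach (the report itself works in R; here in Python, with Poisson frequency, lognormal severity, and all parameter values as illustrative assumptions rather than the report's fitted estimates):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_total_claims(n_sims=50_000, freq_mean=120.0,
                          sev_mu=8.0, sev_sigma=1.2):
    """Monte Carlo for aggregate claims S = X_1 + ... + X_N.

    N ~ Poisson(freq_mean) is the number of fires next year; each claim
    X_i ~ LogNormal(sev_mu, sev_sigma). Parameters are placeholders.
    """
    counts = rng.poisson(freq_mean, size=n_sims)
    totals = np.array([rng.lognormal(sev_mu, sev_sigma, size=n).sum()
                       for n in counts])
    return totals

totals = simulate_total_claims()
print("mean total claims:", totals.mean())
print("99% quantile of total claims:", np.quantile(totals, 0.99))
```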
"We study the general properties of robust convex risk measures as worst-case values under uncertainty on random variables. We establish general concrete results regarding convex conjugates and sub-differentials. We refine some results for closed forms of worstcase law invariant convex risk measures under two concrete cases of uncertainty sets for random variables: based on the first two moments and Wasserstein balls."
The paper proposes a novel approach that uses Monte Carlo simulation to prioritize project risks quantitatively by their impact on project duration and cost. This addresses limitations of traditional risk matrices and lets project managers distinguish critical risks according to whether they chiefly threaten time or cost objectives.
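A toy sketch of the idea (illustrative only; the risk register, probabilities, and impact ranges below are made-up placeholders, not the paper's data): simulate each risk's occurrence, accumulate its schedule and cost impacts, and rank risks by their simulated contribution to each objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical risk register: (name, probability, duration impact in days, cost impact in k$)
risks = [
    ("permit delay",   0.30, (10, 30), (5, 20)),
    ("supplier issue", 0.20, (5, 15),  (20, 60)),
    ("design rework",  0.10, (20, 40), (10, 50)),
]

n_sims = 50_000
for name, p, (d_lo, d_hi), (c_lo, c_hi) in risks:
    occurs = rng.random(n_sims) < p                      # does the risk materialize?
    delay = occurs * rng.uniform(d_lo, d_hi, n_sims)     # schedule impact per scenario
    cost = occurs * rng.uniform(c_lo, c_hi, n_sims)      # cost impact per scenario
    # Rank risks by mean simulated impact on each objective separately.
    print(f"{name:15s} mean delay {delay.mean():5.1f} d   mean cost {cost.mean():5.1f} k$")
```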
"The risk measures contain some premium principles and shortfalls based on entropy. The shortfalls include the Gini shortfall, extended Gini shortfall, shortfall of cumulative residual entropy and shortfall of cumulative residual Tsallis entropy with order α."
New estimators for generalized tail distortion (GTD) risk measures are proposed, based on first-order asymptotic expansions, offering simplicity and performance comparable to or better than existing methods. A reinsurance premium principle based on the GTD risk measure is tested on car insurance claims data, suggesting it effectively embeds a safety loading in pricing to counter statistical uncertainty.