Published (Peer Reviewed)
[14] Quickest detection in practice in presence of seasonality: An illustration with call center data
N. El Karoui, S. Loisel, P.J. Laub, Y. Salhi
Book chapter to appear in Insurance data analytics: some case studies of advanced algorithms and applications (2020)
In this chapter, we explain how quickest detection algorithms can be useful for risk management in the presence of seasonality.
We investigate the problem of detecting, fast enough, cases in which a call center will need extra staff in the near future with high probability.
We illustrate our findings on real data provided by a French insurer.
We also discuss the relevance of the CUSUM rule and of some machine-learning-type competitors for this applied problem.
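For readers unfamiliar with the CUSUM rule discussed above, here is a minimal textbook sketch — Page's recursion for detecting an increase in a Poisson event rate. This is a generic illustration, not the chapter's actual algorithm (it ignores seasonality), and the rates, threshold and toy call counts are invented for the example.

```python
import math

def cusum_alarm(counts, lambda0, lambda1, threshold):
    """Textbook CUSUM for a shift in a Poisson rate from lambda0 to lambda1.

    Returns the first index at which the statistic crosses `threshold`,
    or None if no alarm is raised.
    """
    # Log-likelihood-ratio increment for a Poisson observation k:
    #   k * log(lambda1 / lambda0) - (lambda1 - lambda0)
    llr = lambda k: k * math.log(lambda1 / lambda0) - (lambda1 - lambda0)
    s = 0.0
    for t, k in enumerate(counts):
        s = max(0.0, s + llr(k))   # reflect at zero (Page's recursion)
        if s >= threshold:
            return t
    return None

# Toy call counts per hour: baseline rate ~5, jumping to ~10 halfway through.
calls = [5, 4, 6, 5, 5, 11, 9, 12, 10, 11]
print(cusum_alarm(calls, lambda0=5.0, lambda1=10.0, threshold=5.0))  # → 7
```

The statistic accumulates log-likelihood-ratio evidence and is reflected at zero, so long quiet periods cannot mask a later change; the threshold trades detection delay against the false-alarm rate.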
[13] Modelling net carrying amount of shares for market consistent valuation of life insurance liabilities
D. Dorobantu, Y. Salhi, P.-E. Thérond
Methodology and Computing in Applied Probability (2020) 22, 711-745
The attractiveness of insurance saving products is driven, among others, by dividend payments to policyholders and participation in profits. These are mainly constrained by regulatory measures on profit-sharing on the basis of statutory accounts. Moreover, since both prudential and financial reporting regulations require market consistent best estimate measurement of insurance liabilities, cash-flow projection models have to be used for such a purpose in order to derive the underlying financial incomes. Such models are based on Monte-Carlo techniques. The latter should simulate future accounting profit and losses needed for profit-sharing mechanisms.
In this paper we deal with impairment losses on equity securities for financial portfolios, which rely on instrument-by-instrument assessment (whereas projection models consider groups of shares). Our motivation is to describe the joint distribution of the market value and the impairment provision of a book of equity securities, with regard to the French accounting rules for depreciation. The results we obtain improve the ability of projection models to represent such an asymmetric mechanism. Formally, an impairment loss is recognized for an equity instrument if there has been a significant and prolonged decline in its market value below the carrying cost (acquisition value). Such constraints are formalized using an assumption about the dynamics of the equity, and lead to a complex option-like pay-off.
Using this formulation, we propose analytical formulas for some quantitative measurements related to the impairment losses of a financial equities book. These are derived from a general framework and some tractable examples are illustrated. We also investigate the operational implementation of these formulas and compare their computational time to a basic simulation approach.
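To make the option-like pay-off concrete, here is a hedged Monte-Carlo sketch of a "significant and prolonged decline" impairment trigger. The 20% threshold, six-month window and lognormal share dynamics are illustrative assumptions for the sketch, not the paper's analytical formulas nor the exact French accounting rule.

```python
import math
import random

def impairment_indicator(path, cost, drop=0.2, window=6):
    """Stylized 'significant and prolonged decline' trigger (parameters are
    hypothetical): impair when the market value stays below
    (1 - drop) * cost for `window` consecutive monthly observations."""
    run = 0
    for value in path:
        run = run + 1 if value < (1.0 - drop) * cost else 0
        if run >= window:
            return True
    return False

def mc_impairment_prob(cost, s0, mu, sigma, n_months, n_paths, rng):
    """Monte-Carlo probability of triggering an impairment when the share
    follows a lognormal (Black-Scholes-type) dynamic."""
    dt = 1.0 / 12.0
    hits = 0
    for _ in range(n_paths):
        s, path = s0, []
        for _ in range(n_months):
            z = rng.gauss(0.0, 1.0)
            s *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z)
            path.append(s)
        hits += impairment_indicator(path, cost)
    return hits / n_paths

rng = random.Random(11)
p = mc_impairment_prob(cost=100.0, s0=100.0, mu=0.02, sigma=0.25,
                       n_months=36, n_paths=2000, rng=rng)
```

The paper's contribution is precisely to replace this kind of brute-force simulation with closed-form expressions; the sketch only shows why the pay-off is path-dependent and asymmetric.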
[12] Le Prix du Risque de Longévité
N. El Karoui, C. Hillairet, S. Loisel, Y. Salhi
Revue d'Economie Financière, 133, 129-145 (2019)
In this article, we address the question of the price of longevity risk. We begin by describing longevity risk and its components, distinguishing the biometric, financial and regulatory aspects. We then explain the different valuation frameworks (actuarial, financial and regulatory), their common features and their divergences. We address the question of discounting and of modelling long-term interest rates for longevity risk management. We also give details on the subjective and pragmatic way in which certain components of longevity risk, notably the most extreme ones, are handled in the market.
[11] A Reduced-Form Model for A Life Insurance's Net Asset Value
A. Bignon, A. Ndjeng-Ndjeng, Y. Salhi, P.-E. Thérond
Bankers, Markets and Investors, 157 (2019)
In this paper we develop a closed-form model for the net asset value of a life insurance portfolio, aimed at simplifying the assessment and quantification of the impact of financial stress scenarios on the insurer's solvency. In fact, the current practice based on an internal model is time-consuming and is therefore not suitable for carrying out sensitivity studies that require rapid action from the management. Due to the nature of the stress scenarios, which are mostly related to financial market determinants, their impact on the market value of financial assets is quite straightforward. Therefore, in this paper, we focus on the distortion caused on the liability side and investigate a reduced-form model for the best estimate liabilities that is not only easily interpretable but also capable of anticipating the impact of market variations on the liabilities. The model is built on a dataset drawn from a French life insurer's projection model using single, double and triple shocks on the interest rate yield curve, equity market values and the profit-sharing provision. In order to capture as much information as possible from the dataset, several feasible regression specifications are used. The general form of the empirical model is specified as a linear combination of the risk factors, and its predictive ability is investigated using an out-of-sample analysis.
[10] A Model-Point Approach to Indifference Pricing of Life Insurance Portfolios with Dependent Lives
C. Blanchet-Scalliet, D. Dorobantu, Y. Salhi
Methodology and Computing in Applied Probability (2019) 21(2) 423-448
In this paper, we study the pricing of life insurance portfolios in the presence of dependent lives. We assume that an insurer with an initial exposure to n mortality-contingent contracts wants to acquire a second portfolio constituted of m individuals. The policyholders' lifetimes in these portfolios are correlated through a Farlie-Gumbel-Morgenstern (FGM) copula, which induces a dependency between the two portfolios. In this setting, we compute the indifference price charged by the insurer endowed with an exponential utility. The optimal price is characterized as a solution to a backward stochastic differential equation (BSDE). The latter can be decomposed into (n - 1)n! auxiliary BSDEs. In this general case, the derivation of the indifference price is computationally infeasible. Therefore, while focusing on the example of death benefit contracts, we develop a model-point based approach in order to ease the computation of the price. It consists of replacing each portfolio with a single policyholder that replicates some risk metrics of interest. In addition, the two representative agents should adequately reproduce the observed dependency between the initial portfolios.
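The FGM copula used here is simple enough to sample by conditional inversion. The sketch below is illustrative only (it is not taken from the paper, and the parameter values and exponential marginals are arbitrary): it draws dependent uniforms from the FGM copula and maps them to lifetimes.

```python
import math
import random

def fgm_pair(theta, rng):
    """Draw (U, V) from a Farlie-Gumbel-Morgenstern copula with parameter
    theta in [-1, 1] by inverting the conditional CDF
    C(v | u) = v * (1 + theta * (1 - v) * (1 - 2u))."""
    u, w = rng.random(), rng.random()
    a = theta * (1.0 - 2.0 * u)
    if abs(a) < 1e-12:          # independence at u = 1/2 (or theta = 0)
        return u, w
    v = ((1.0 + a) - math.sqrt((1.0 + a) ** 2 - 4.0 * a * w)) / (2.0 * a)
    return u, v

def exp_lifetime(u, rate):
    """Turn a uniform draw into an exponential lifetime (inverse CDF)."""
    return -math.log(1.0 - u) / rate

rng = random.Random(42)
pairs = [fgm_pair(0.8, rng) for _ in range(50000)]
# FGM copulas only allow mild dependence: corr(U, V) = theta / 3.
est_corr = 12.0 * (sum(u * v for u, v in pairs) / len(pairs) - 0.25)
# Dependent lifetimes for a couple with hypothetical hazard rates:
lifetimes = [(exp_lifetime(u, 0.05), exp_lifetime(v, 0.06)) for u, v in pairs[:5]]
```

The bounded correlation (|corr| ≤ 1/3) is the well-known limitation of the FGM family, which is why it is typically used for weak dependence between lives.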
[9] Age-Specific Adjustment of Graduated Mortality
Y. Salhi, P.-E. Thérond
ASTIN Bulletin (2018) 48(2) 543-569
Recently, there has been an increasing interest among life insurers in assessing their portfolios' own mortality risk. The new European prudential regulation, Solvency II, emphasizes the need to use mortality and life tables that best capture and reflect the experienced mortality, and thus the policyholders' proper risk profile, in order to adequately quantify the underlying risk. Therefore, building a mortality table based on the experience of the portfolio is highly recommended and, for this purpose, various approaches have been introduced in the literature. Although such approaches succeed in capturing the main features, it remains difficult to assess the mortality when the underlying portfolio lacks sufficient exposure.
In this paper, we propose to graduate the mortality curve using an adaptive procedure based on the local likelihood, which has the ability to model mortality patterns even in the presence of complex structures and avoids relying on expert opinion. However, such a technique fails to propose a consistent yet regular structure for portfolios with limited deaths. Although the technique borrows information from adjacent ages, this is sometimes not sufficient to produce robust life tables. In the presence of such a bias, we propose to adjust the corresponding curve, at the age level, based on a credibility approach. This consists of revising, as new observations arrive, the assumption on the mortality curve.
We derive the updating procedure and investigate, based on real datasets, the benefits of using the latter instead of a sole graduation. Moreover, we look at the divergences in the mortality forecasts generated by the classical credibility approaches, including the Hardy-Panjer, Poisson-Gamma and Makeham frameworks, on portfolios originating from various French insurance companies.
[8] Alarm System for Credit Losses Impairment
Y. Salhi, P.-E. Thérond
Bulletin Français d'Actuariat (2017) 17(1) 131-161
The recent financial crisis has led the IASB to settle new reporting standards for financial instruments. The extended ability to measure some debt instruments at amortized cost is associated with a new impairment loss mechanism: Expected Credit Losses. In this paper, after a brief description of the principles elaborated by the IASB for IFRS 9, we propose a methodology using credit default swap (CDS) market prices in order to monitor significant changes in the creditworthiness of financial instruments and subsequent credit loss impairment. This methodology is applied in detail to a real-world dataset. Numerical tests are carried out to assess the effectiveness of the procedure, especially in comparison with rating changes from credit rating agencies.
[7] A Class of Random Field Memory Models for Mortality Forecasting
P. Doukhan, J. Rynkiewicz, D. Pommeret, Y. Salhi
Insurance: Mathematics and Economics (2017) 77:97-110
This article proposes a parsimonious alternative approach for modeling the stochastic dynamics of mortality rates. Instead of the commonly used factor-based decomposition framework, we consider modeling mortality improvements using a random field specification with a given causal structure. Such a class of models introduces dependencies among adjacent cohorts, aiming at capturing, among others, cohort effects and cross-generation correlations. It also describes the conditional heteroskedasticity of mortality. The proposed model is a generalization of the now widely used AR-ARCH models for random processes. For such a class of models, we propose an estimation procedure for the parameters. Formally, we use the quasi-maximum likelihood estimator (QMLE) and show its statistical consistency and the asymptotic normality of the estimated parameters. The framework being general, we investigate and illustrate a simple variant, called the three-level memory model, in order to fully understand and assess the effectiveness of the approach for modeling mortality dynamics.
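As a down-to-earth illustration of the AR-ARCH family that the paper generalizes, here is a toy univariate AR(1)-ARCH(1) simulator. The paper's model is a random field indexed over ages and years; this one-dimensional sketch and its parameter values are purely illustrative.

```python
import math
import random

def simulate_ar_arch(n, phi, omega, alpha, rng):
    """Toy univariate AR(1)-ARCH(1): X_t = phi * X_{t-1} + e_t, where
    e_t = sigma_t * z_t and sigma_t^2 = omega + alpha * e_{t-1}^2.
    The random field models in the paper generalize this recursion to
    neighborhoods of (age, year) cells."""
    x, e = 0.0, 0.0
    path = []
    for _ in range(n):
        sigma2 = omega + alpha * e * e      # conditional variance (ARCH part)
        z = rng.gauss(0.0, 1.0)
        e = math.sqrt(sigma2) * z           # heteroskedastic innovation
        x = phi * x + e                     # autoregressive part
        path.append(x)
    return path

rng = random.Random(1)
path = simulate_ar_arch(500, phi=0.5, omega=0.1, alpha=0.3, rng=rng)
```

With |phi| < 1 and alpha < 1 the simulated series is stationary, which is the regime in which QMLE consistency results of this type are usually stated.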
[6] Minimax Optimality in Robust Detection of a Disorder Time in Poisson Rate
N. El Karoui, S. Loisel, Y. Salhi
The Annals of Applied Probability (2017) 27(4), 2515-2538
We consider the minimax quickest detection problem of an unobservable time of change in the rate of an inhomogeneous Poisson process. We seek a stopping rule that minimizes the robust Lorden (1971) criterion, formulated in terms of the number of events until detection, both for the worst-case delay and the false alarm constraint.
In the Wiener case, such a problem has been solved using the so-called cumulative sum (CUSUM) strategy by Shiryaev (1963, 2009) and Moustakides (2004), among others. In our setting, we derive the exact optimality of the CUSUM stopping rule by using finite variation calculus and elementary martingale properties to characterize the performance functions of the CUSUM stopping rule in terms of scale functions. These are solutions of some delayed differential equations that we solve elementarily. The case of detecting a decrease in the intensity is easy to study because the performance functions are continuous. In the case of an increase, where the performance functions are not continuous, martingale properties require using a discontinuous local time. Nevertheless, from an identity relating the scale functions, the optimality of the CUSUM rule still holds. Finally, some numerical illustrations are provided.
[5] Basis risk modeling: A co-integration based approach
Y. Salhi, S. Loisel
Statistics (2017) 51(1) 205-221
In this paper we propose a multivariate approach for forecasting pairwise mortality rates of related populations. The need for joint modeling of mortality rates is analyzed using a causality test. We show that, for the datasets considered, the inclusion of national mortality information enhances predictions for its sub-populations. The investigated approach links national population mortality to that of a subset population, using an econometric model that captures a long-term relationship between the two mortality dynamics. This model does not focus on the correlation between the mortality rates of the two populations, but rather on their long-term behavior, which implies that the two time series cannot wander off in opposite directions for long before mean reverting, consistent with biological reasoning. The model can additionally capture short-term adjustments in the mortality dynamics of the two populations. An empirical comparison of the forecasts of one-year death probabilities for policyholders is performed using both a classical factor-based model and the proposed approach. The robustness of the model is tested on mortality rate data for England and Wales, alongside the Continuous Mortality Investigation assured lives dataset, representing the sub-population.
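The co-integration idea can be illustrated with a minimal Engle-Granger-style two-step sketch on synthetic series sharing a common stochastic trend. This is a generic illustration with invented data, not the paper's model or mortality datasets: regress one series on the other, then check that the residuals mean-revert.

```python
import random

def ols_fit(x, y):
    """Ordinary least squares of y on x; returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    beta = sxy / sxx
    return beta, my - beta * mx

def residual_ar1(res):
    """Lag-1 autoregressive coefficient of the residuals: values well
    below 1 indicate mean reversion, the informal signature of
    co-integration (a formal test would use ADF critical values)."""
    num = sum(res[t] * res[t - 1] for t in range(1, len(res)))
    den = sum(r * r for r in res[:-1])
    return num / den

rng = random.Random(0)
# A shared random-walk trend (think: national log-mortality) plus stationary
# idiosyncratic noise makes the two series co-integrated by construction.
trend, x, y = 0.0, [], []
for _ in range(2000):
    trend += rng.gauss(0.0, 1.0)
    x.append(trend + rng.gauss(0.0, 0.5))
    y.append(0.8 * trend + rng.gauss(0.0, 0.5))

beta, alpha = ols_fit(x, y)
res = [b - (alpha + beta * a) for a, b in zip(x, y)]
```

The long-run coefficient recovered by OLS is close to the true 0.8, and the residual AR(1) coefficient sits far below 1, i.e. deviations between the two series are transient rather than persistent.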
[4] Lapse Risk in Life Insurance: Correlation and Contagion Effects Among Policyholders' Behaviors
F. Barsotti, X. Milhaud, Y. Salhi
Insurance: Mathematics and Economics (2016) 71:317-331
The present paper extends the existing literature on lapse risk by presenting a flexible way to model lapse decisions in a life insurance portfolio. Correlation and contagion effects among policyholders are embedded in the modeling, and risk margin estimates can be easily obtained under both stable regimes and stress scenarios. The proposed approach integrates the effects of policyholders' behaviors through the definition of a mathematical framework where the lapse intensity follows a dynamic contagion process: an external component, the shot-noise intensity, is added to the Hawkes-based one. Contrary to previous works, our shot-noise intensity is not constant and the resulting intensity process is not Markovian. We study the influence of interest rate dynamics on policyholders' behaviors and the resulting impact on lapse risk margins. Closed-form expressions for the moments of the lapse intensity are provided, showing how the lapse risk is affected by massive copycat behaviors. A sensitivity analysis studies the lapse risk metrics as a function of the model's parameters, while a simulation study compares our results with those obtained using standard practices. The numerical outputs highlight a potentially strong misestimation of lapses under extreme scenarios with classical stress-testing methodologies.
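A crude Euler-discretized sketch of a dynamic contagion intensity (self-excited Hawkes jumps plus externally-excited shot-noise jumps) conveys the mechanism. The time step, parameter values and exponential decay form are illustrative assumptions for the sketch, not the paper's specification, which in particular has a non-constant shot-noise component.

```python
import math
import random

def simulate_lapse_intensity(n_steps, dt, mu, delta, self_jump,
                             ext_rate, ext_jump, rng):
    """Euler-discretized dynamic contagion sketch: between jumps the lapse
    intensity decays at rate delta towards the baseline mu; each simulated
    lapse adds self_jump (the Hawkes part), and external shocks arriving
    at Poisson rate ext_rate add ext_jump (the shot-noise part)."""
    lam, lapses, path = mu, 0, []
    for _ in range(n_steps):
        if rng.random() < lam * dt:        # a policyholder lapses
            lapses += 1
            lam += self_jump               # contagion: copycat surrenders
        if rng.random() < ext_rate * dt:   # external shock, e.g. a rate move
            lam += ext_jump
        lam = mu + (lam - mu) * math.exp(-delta * dt)  # exponential decay
        path.append(lam)
    return lapses, path

rng = random.Random(7)
lapses, path = simulate_lapse_intensity(
    n_steps=1000, dt=0.01, mu=1.0, delta=2.0,
    self_jump=0.5, ext_rate=0.5, ext_jump=1.0, rng=rng)
```

Each lapse raises the intensity and hence the chance of further lapses, which is exactly the copycat clustering the paper quantifies through the moments of the intensity.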
[3] Partial splitting of longevity and financial risks: The life nominal chooser swaption
H. Bensusan, N. El Karoui, S. Loisel, Y. Salhi
Insurance: Mathematics and Economics (2016) 68:61-72
The previous attempts to launch liquid and standardized longevity derivatives in the market failed because banks do not seem to be ready to take longevity risk. Therefore, instead of trying to transfer longevity risk to investors, it could be interesting for financial institutions to propose interest rate protection adapted to longevity portfolios, in the spirit of liability driven investments.
In this paper, we introduce a new structured financial product: the so-called Life Nominal Chooser Swaption (LNCS). Thanks to such a contract, insurers could keep pure longevity risk and transfer a great part of the interest rate risk underlying annuity portfolios to financial markets. Before the issuance of the contract, the insurer determines a confidence band of survival curves for her portfolio. An interest rate hedge is set up, based on swaption mechanisms. The bank uses this band as well as an interest rate model to price the product. At the end of the first period (e.g. 8 to 10 years), the insurer has the right to enter into an interest rate swap with the bank, where the nominal is adjusted to her (re-forecasted) needs. She chooses (inside the band) the survival curve that best fits her anticipation of the future mortality of her portfolio (during 15 to 20 more years, say) given the information available at that time. This structure enables insurers and financial institutions to remain in their initial fields of expertise.
We use a population dynamics longevity model and a classical two-factor interest rate model to price this product. Numerical results show that the option offered to the insurer (in terms of choice of nominal) is not too expensive in many real-world cases. We also discuss the pros and the cons of the product and of our methodology.
[2] A Credibility Approach for the Makeham Mortality Law
Y. Salhi, P.-E. Thérond, J. Tomas
European Actuarial Journal (2016) 6(1) 61-96
The present article illustrates a credibility approach to mortality. Interest among life insurers in assessing their portfolios' mortality risk has considerably increased. The new regulation and norms, Solvency II, have heightened the need for life tables that best reflect the experience of insured portfolios in order to reliably quantify the underlying mortality risk. In this context, and following the work of Bühlmann and Gisler (2005) and Hardy and Panjer (1998), we propose a credibility approach which consists of revising, as new observations arrive, the parameters of a Makeham graduation model. Such an adjustment adds structure to the mortality pattern, which is useful when portfolios are of limited size, so as to ensure a good representation over the entire age band considered. We investigate the divergences in the mortality forecasts generated by the classical credibility approaches to mortality, including Hardy and Panjer (1998) and the Poisson-Gamma model, on portfolios originating from various French insurance companies.
[1] Understanding, modelling and managing longevity risk: key issues and main challenges
P. Barrieu, H. Bensusan, N. El Karoui, C. Hillairet, S. Loisel, C. Ravanelli, Y. Salhi
Scandinavian actuarial journal (2012) 2012(3) 203-231
This article investigates the latest developments in longevity risk modelling, and explores the key risk management challenges for both the financial and insurance industries. The article discusses key definitions that are crucial for the enhancement of the way longevity risk is understood, providing a global view of the practical issues for longevity-linked insurance and pension products that have evolved concurrently with the steady increase in life expectancy since the 1960s. In addition, the article frames the recent and forthcoming developments that are expected to drive industry-wide changes as more effective regulation, designed to better assess and efficiently manage inherited risks, is adopted. Simultaneously, the evolution of longevity is intensifying the need for capital markets to be used to manage and transfer the risk through what are known as Insurance-Linked Securities (ILS). Thus, the article examines the emerging scenarios, and finally highlights some important potential developments for longevity risk management from a financial perspective, with reference to the most relevant modelling and pricing practices in the banking industry.
Published (Non-peer Reviewed)
[3] Tables de mortalité best estimate: une approche de crédibilité pour les portefeuilles de petite taille
Y. Salhi
l'actuariel - № 31 - Jan. 2019
[2] De l'importance d'un bon suivi des hypothèses actuarielles
Y. Salhi
l'actuariel - № 22 - Oct. 2016
[1] Assurance : comment détecter une rupture dans la fréquence des sinistres ou l'intensité de mortalité ?
Y. Salhi
Les cahiers Louis Bachelier - № 19 - Nov. 2015
Submitted
Optimal Model Selection for AR-ARCH Random Fields with Application to Mortality
P. Doukhan, J. Rynkiewicz, Y. Salhi
Submitted (2020)
This article proposes an optimal and robust methodology for model selection. The model of interest is a parsimonious alternative framework for modeling the stochastic dynamics of mortality improvement rates introduced by Doukhan et al. (2017). The approach models mortality improvements using a random field specification with a given causal structure instead of the commonly used factor-based decomposition framework. It captures some well-documented stylized facts of mortality behavior: dependencies among adjacent cohorts, cohort effects, cross-generation correlations and the conditional heteroskedasticity of mortality. Such a class of models is a generalization of the now widely used AR-ARCH models for univariate processes. The framework being general, Doukhan et al. (2017) investigate and illustrate a simple variant, called the three-level memory model. However, it is not clear which parametrization is best for specific mortality applications. In this paper, we investigate the optimal model choice and parameter selection among potential candidate models. More formally, we propose a methodology well-suited to such a random field, able to select the best model in the sense that the model is not only correct but also the most economical among all the correct models. Formally, we show that a criterion based on a penalization of the log-likelihood, e.g. using the Bayesian Information Criterion, is consistent. Finally, we investigate the methodology based on Monte-Carlo experiments as well as real-world datasets.
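The penalized-likelihood selection the abstract refers to can be illustrated in a simple setting. The sketch below (illustrative only: a univariate AR with least-squares fitting, not the paper's random field or QMLE machinery) shows the BIC penalty ruling out both underfitted and overfitted orders on a simulated AR(1).

```python
import math
import random

def solve(A, b):
    """Gauss-Jordan elimination for the small normal-equations system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def bic_ar(x, p):
    """BIC of a least-squares AR(p) fit with Gaussian innovations:
    -2 * loglik + p * log(n)  (variance parameter ignored in the count)."""
    if p == 0:
        resid = x[:]
    else:
        rows = [x[t - p:t][::-1] for t in range(p, len(x))]  # lags 1..p
        y = x[p:]
        A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
        b = [sum(r[i] * yt for r, yt in zip(rows, y)) for i in range(p)]
        phi = solve(A, b)
        resid = [yt - sum(c * v for c, v in zip(phi, r)) for r, yt in zip(rows, y)]
    n = len(resid)
    s2 = sum(e * e for e in resid) / n
    loglik = -0.5 * n * (math.log(2.0 * math.pi * s2) + 1.0)
    return -2.0 * loglik + p * math.log(n)

rng = random.Random(3)
# Simulate an AR(1) with phi = 0.6; BIC should favor small, correct orders.
x, prev = [], 0.0
for _ in range(1500):
    prev = 0.6 * prev + rng.gauss(0.0, 1.0)
    x.append(prev)
best = min(range(4), key=lambda p: bic_ar(x, p))
```

The log(n) penalty is what makes the criterion consistent: extra lags must improve the likelihood by more than the penalty, which spurious parameters asymptotically cannot do.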
Semiparametric two-sample mixture components comparison test
X. Milhaud, D. Pommeret, Y. Salhi, P. Vandekerckhove
Submitted (2020)
We consider in this paper two-component mixture distributions having one known component. This is the case when a gold-standard reference component is well known, and when a population contains such a component plus another one with different features. When two populations are drawn from such models, we propose a penalized chi-squared-type testing procedure able to compare the unknown components pairwise, i.e. to test the equality of their residual feature densities. An intensive numerical study is carried out over a large range of simulation setups to illustrate the asymptotic properties of our test. Moreover, the testing procedure is applied to two real cases: (i) mortality datasets, where results show that the test remains robust even in challenging situations where the unknown component represents only a small percentage of the global population; (ii) galaxy velocity datasets, where the velocities of stars mixed with those of the Milky Way are compared.
Credibility Adjustment of the Lee-Carter Longevity Model for Multiple Populations
J.-B. Coulomb, Y. Salhi, P.-E. Thérond
Submitted (2020)
In this paper we are interested in applying Bühlmann-Straub credibility to a multiple-population model, in particular in the presence of small-sized populations. We introduce a parsimonious extension of the classic Lee-Carter model in a similar manner to the joint-kappa model, which imposes a common factor on the considered populations. Hence, we propose an adjustment procedure that incorporates Bühlmann-Straub linear credibility into a Lee-Carter model in order to take into account the heterogeneity among the populations while allowing for a learning effect for each population. The updating mechanism is based on the credibility estimation of future mortality rates depending on past observations through a recursive credibility formula. By doing so, the forecasts weight the importance of the information stemming from the single population while taking into account the neighboring populations. The proposed methodology is applied to real-world datasets and comparisons to classic mortality models are proposed.
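The credibility shrinkage at the heart of this approach can be sketched in a few lines. Below is the basic Bühlmann-Straub estimator with made-up figures: the variance components, exposures and observed means are hypothetical, and the paper embeds this mechanism in a Lee-Carter model rather than using it raw.

```python
def credibility_estimates(obs_means, exposures, collective_mean, s2, tau2):
    """Buhlmann-Straub estimator: each population's observed mean is shrunk
    towards the collective mean with weight Z_i = m_i / (m_i + s2 / tau2),
    where m_i is the exposure, s2 the within-population variance component
    and tau2 the between-population variance component."""
    k = s2 / tau2
    return [
        (m / (m + k)) * xi + (k / (m + k)) * collective_mean
        for xi, m in zip(obs_means, exposures)
    ]

# Hypothetical mortality-improvement means for three portfolios of very
# different sizes; the smallest portfolio is shrunk hardest towards the
# collective mean, exactly the behavior wanted for small populations.
est = credibility_estimates(
    obs_means=[0.020, 0.015, 0.035],
    exposures=[50000, 5000, 500],
    collective_mean=0.018, s2=4.0, tau2=0.001)
```

With these numbers the credibility factors are roughly 0.93, 0.56 and 0.11, so the 500-life portfolio's outlying 3.5% improvement is pulled almost all the way back to the collective 1.8%.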
Dynamic Bivariate Mortality Modelling
Y. Jiao, Y. Salhi, S. Wang
Submitted (2020)
The dependence structure of life statuses plays an important role in the valuation of life insurance products involving multiple lives. Although the mortality of individuals is well studied in the literature, their dependence remains an open field. Indeed, future lifetimes within a group of people, such as married couples, can exhibit dependencies due to similar lifestyles or to the effect of exposure to common risk factors. In this paper, the main objective is to introduce a new approach for analyzing the mortality dependence between individuals in a couple. It is intended to describe, in a dynamic framework, the joint mortality of married couples in terms of marginal mortality rates. The proposed framework is general and aims to capture, by adjusting some parametric form, the desired effect. To this end, we use a well-suited multiplicative decomposition, which serves as a building block for the framework and is thus used to separate the dependence structure from the marginals. We make the link with the existing practice of affine mortality models. Moreover, we use the density approach recently introduced in the credit risk literature in order to characterize the broken-heart syndrome. Finally, given that the framework is general, we propose some illustrative examples and show how the underlying model captures the main stylized facts of bivariate mortality dynamics.
Working Papers
Semiparametric K-sample mixture components comparison test
X. Milhaud, D. Pommeret, Y. Salhi, P. Vandekerkhove
Work in progress (2020)
TBA.
Parsimonious predictive mortality modeling by regularization and cross-validation
K. Barigou, S. Loisel, Y. Salhi
Work in progress (2020)
TBA.
Bayesian Model Selection for AR-ARCH Random Fields with Application in Insurance
D. Pommeret, Y. Salhi
Work in progress (2020)
TBA.
Optimal Risk Sharing of Correlated Risks
N. Kazi-Tani, Y. Salhi
Work in progress (2020)
TBA.
Monitoring of Biometric Assumptions in Life Insurance
N. El Karoui, S. Loisel, Y. Salhi
Work in progress (2020)
TBA.