Research articles for 2019-12-31
arXiv
Repeated application of eigen-centric machine-learning methods to an evolving dataset reveals that eigenvectors calculated by well-established computer implementations are not stable along an evolving sequence. This is because the sign of any one eigenvector may point along either the positive or negative direction of its associated eigenaxis, and for any single eigen call the sign does not matter when calculating a solution. This work reports an algorithm that creates a consistently oriented basis of eigenvectors. The algorithm postprocesses any well-established eigen call and is therefore agnostic to the particular implementation of the latter. Once consistently oriented, directional statistics can be applied to the eigenvectors in order to track their motion and summarize their dispersion. When a consistently oriented eigensystem is applied to machine-learning methods, the time series of training weights becomes interpretable in the context of the machine-learning model. Ordinary linear regression is used to demonstrate such interpretability. A reference implementation of the algorithm reported herein has been written in Python and is freely available, both as source code and through the thucyd Python package.
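A minimal sketch of the sign-fixing idea in Python (illustrative only, not the thucyd algorithm itself; the convention for choosing the orientation below is an assumption):

```python
import numpy as np

def orient_eigenvectors(V, V_ref=None):
    """Return a copy of V with column signs fixed for consistency.

    If V_ref is given, each eigenvector is flipped so that it has a positive
    inner product with the corresponding reference vector; otherwise the entry
    of largest magnitude is made positive. Illustrative convention only,
    not the thucyd algorithm.
    """
    V = V.copy()
    for j in range(V.shape[1]):
        if V_ref is not None:
            flip = np.dot(V[:, j], V_ref[:, j]) < 0
        else:
            flip = V[np.argmax(np.abs(V[:, j])), j] < 0
        if flip:
            V[:, j] = -V[:, j]
    return V

# Example: a slowly evolving covariance sequence.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
cov = A @ A.T
prev = None
for t in range(3):
    cov = cov + 0.01 * t * np.eye(5)   # mild evolution of the dataset
    _, V = np.linalg.eigh(cov)          # signs are implementation-dependent
    V = orient_eigenvectors(V, prev)    # enforce a consistent orientation
    prev = V
```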
arXiv
We introduce a two-agent problem inspired by price asymmetry arising from funding differences. When two parties have different funding rates, they deduce different fair prices for derivative contracts even under the same pricing methodology and parameters. Thus, the two parties should enter the derivative contracts at a negotiated price, and we call the negotiation a risk-sharing problem. This framework defines the negotiation as a problem that maximizes the sum of the utilities of the two parties. Using the derived optimal price, we provide a theoretical analysis of how the price is determined between the two parties. In addition to the price, the risk-sharing framework produces an optimal amount of collateral. The derived optimal collateral can be used for contracts between financial firms and non-financial firms. However, inter-dealer markets are governed by regulations. As recommended in Basel III, it is a convention in inter-dealer contracts to pledge the full amount of a close-out price as collateral. In this case, using the optimal collateral, we interpret the conditions under which the full margin requirement is indeed optimal.
arXiv
This document is an ongoing review of the state of the art in clustering financial time series and the study of correlation and other interaction networks. This preliminary document is intended for researchers in the field so that they can provide feedback, allowing amendments, corrections and the addition of new material unknown to the authors of this review. The aim of the document is to gather in one place the relevant material that can help the researcher in the field to see the bigger picture, the quantitative researcher to experiment with this alternative modeling of financial time series, and the decision maker to leverage the insights obtained from these methods. We hope that this document will form a basis for the implementation of an open toolbox of standard tools to study correlations, hierarchies, networks and clustering in financial markets.
arXiv
We propose an algorithm that predicts each subsequent time step of an intractable short-rate model relative to the previous time step (after adjusting for drift and for the overall distribution of the previous percentile result), and we show that the method achieves superior outcomes to the unbiased estimate both on the training dataset and on separate validation data.
arXiv
It is generally understood that a given one-dimensional diffusion may be transformed by Cameron-Martin-Girsanov measure change into another one-dimensional diffusion with the same volatility but a different drift. But to achieve this we have to know that the change-of-measure local martingale that we write down is a true martingale; we provide a complete characterization of when this happens. This is then used to discuss absence of arbitrage in a generalized Heston model including the case where the Feller condition for the volatility process is violated.
arXiv
We explore the effect of past market movements on the instantaneous correlations between assets within the futures market. Quantifying this effect is of interest for estimating and managing the risk associated with portfolios of futures in a non-stationary context. We apply and extend a previously reported method called Principal Regression Analysis (PRA) to a universe of $84$ futures contracts between $2009$ and $2019$. We show that the past 10-day up (resp. down) trends of a novel predictor -- the eigen-factor -- tend to reduce (resp. increase) instantaneous correlations. We then carry out a multifactor PRA on sectorial predictors corresponding to the four futures sectors (indexes, commodities, bonds and currencies), and show that the effect of past market movements on the future variations of the instantaneous correlations can be decomposed into two significant components. The first component is due to the market movements within the index sector, while the second is due to the market movements within the bonds sector.
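The following is a much-simplified, hypothetical illustration of the underlying idea -- regressing subsequent changes of instantaneous correlations on a past 10-day trend of a predictor; the paper's eigen-factor construction and multifactor PRA are not reproduced:

```python
import numpy as np

# Simplified sketch: regress next changes of the average pairwise correlation
# on the past 10-day trend of an aggregate "market" factor. Illustration only;
# the paper's eigen-factor and multifactor setup differ.
rng = np.random.default_rng(1)
T, N, window = 1000, 10, 60
returns = rng.standard_normal((T, N)) * 0.01   # hypothetical futures returns

def avg_corr(r):
    c = np.corrcoef(r, rowvar=False)
    return c[np.triu_indices_from(c, k=1)].mean()

market = returns.mean(axis=1)                  # crude aggregate factor
rho = np.array([avg_corr(returns[t - window:t]) for t in range(window, T)])
trend = np.array([market[t - 10:t].sum() for t in range(window, T)])

# Does the past 10-day trend explain the next change in correlation?
d_rho = np.diff(rho)
slope, intercept = np.polyfit(trend[:-1], d_rho, 1)
print(f"regression slope: {slope:.4f}")
```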
arXiv
We consider discrete default-intensity-based and logit-type reduced-form models for conditional default probabilities of corporate loans, for which we develop simple closed-form approximations to the maximum likelihood estimator (MLE) when the underlying covariates follow a stationary Gaussian process. In a practically reasonable asymptotic regime where the default probabilities are small, say 1-3% annually, and the number of firms and the time period of available data are reasonably large, we rigorously show that the proposed estimator behaves similarly to, or only slightly worse than, the MLE when the underlying model is correctly specified. For the more realistic case of model misspecification, both estimators are seen to be equally good, or equally bad. Further, beyond a point, both are more or less insensitive to increases in data. These conclusions are validated on empirical and simulated data. The proposed approximations should also have applications outside finance, where logit-type models are used and the probabilities of interest are small.
arXiv
At the initial stages of this research, the assumption was that franchised businesses should perhaps not be affected much by recession, since multiple cash pools are available inherent to the franchised business model. However, the available data indicated otherwise: the stock price performance, as discussed, follows a different pattern. The stock price data are analyzed with an unconventional tool, the Weibull distribution, and the observations confirm either a trend in franchised businesses that is the reverse of what is observed for non-franchised ones, or franchised stocks simply following the large food suppliers. There is a layered ownership and cash flow in a franchised business model. The parent company, run by the franchiser, depends on the performance of the child companies run by franchisees. Both parent and child companies are run as independent businesses, but only the parent company is listed as a stock ticker on a stock exchange. Does this double layer of vertical operation, cash reserve, and cash flow protect them better in a recession? The data analyzed in this paper indicate that the recession effect can be more severe, and that if a franchised stock dives with the average market, a slower recovery of its price should be expected. This paper characterizes the differences and explains this natural experiment with available financial data.
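As a rough illustration of the analysis tool mentioned above, the sketch below fits a Weibull distribution to simulated price data with SciPy; the paper's actual data and procedure are not reproduced:

```python
import numpy as np
from scipy import stats

# Hypothetical daily closing prices standing in for a franchised-business ticker.
rng = np.random.default_rng(2)
prices = 40 * np.exp(np.cumsum(rng.normal(0, 0.02, size=500)))

# Fit a two-parameter Weibull (location fixed at 0) to the price levels.
shape, loc, scale = stats.weibull_min.fit(prices, floc=0)
print(f"Weibull shape={shape:.2f}, scale={scale:.2f}")

# A Kolmogorov-Smirnov check of the fitted distribution.
ks = stats.kstest(prices, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic={ks.statistic:.3f}, p-value={ks.pvalue:.3f}")
```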
arXiv
When stock prices are observed at high frequencies, more information can be utilized in the estimation of parameters of the price process. However, high-frequency data are contaminated by market microstructure noise, which causes significant bias in parameter estimation when not taken into account. We propose an estimator of the Ornstein-Uhlenbeck process based on maximum likelihood which is robust to the noise and utilizes irregularly spaced data. We also show that the Ornstein-Uhlenbeck process contaminated by independent Gaussian white noise and observed at discrete equidistant times follows an ARMA(1,1) process. To illustrate the benefits of the proposed noise-robust approach, we analyze an intraday pairs trading strategy based on mean-variance optimization. In an empirical study of 7 Big Oil companies, we show that the use of the proposed estimator of the Ornstein-Uhlenbeck process leads to an increase in the profitability of the pairs trading strategy.
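The stated ARMA(1,1) property can be checked numerically; the sketch below simulates an Ornstein-Uhlenbeck path, adds independent Gaussian noise, and fits an ARMA(1,1) model with statsmodels (all parameter values are illustrative assumptions):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulate an Ornstein-Uhlenbeck process observed at equidistant times,
# contaminate it with independent Gaussian white noise, and check that an
# ARMA(1,1) model fits the observations.
rng = np.random.default_rng(3)
theta, mu, sigma, dt, n = 2.0, 0.0, 0.3, 1 / 252, 5000

x = np.zeros(n)
for t in range(1, n):
    x[t] = x[t - 1] + theta * (mu - x[t - 1]) * dt \
        + sigma * np.sqrt(dt) * rng.standard_normal()

y = x + 0.01 * rng.standard_normal(n)   # microstructure-style noise

fit = ARIMA(y, order=(1, 0, 1)).fit()
print(fit.summary().tables[1])           # AR(1) coefficient close to exp(-theta * dt)
```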
arXiv
A general method is presented to compute the variance of a linear stochastic process through a matrix Lyapunov differential equation. This approach, adopted from control theory, offers an alternative that is easier than the classical arguments found in the quantitative finance literature and can be readily applied to high-dimensional models. Both analytical and numerical methods to solve the Lyapunov equation are discussed and compared in terms of computational efficiency. A practical application is presented, where numerical and analytical solutions for the variance of a two-factor mean-reverting model are embedded into the Black pricing framework and market calibration of the model parameters is performed. It is shown that the availability of an analytical formula for the variance makes the calibration 14 times faster, thus proving the practical value of tractable and general methods for deriving it.
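A minimal sketch of the Lyapunov-equation approach for an illustrative two-factor mean-reverting model (the parameters and the comparison with the algebraic steady-state solution are assumptions, not the paper's calibrated example):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import solve_continuous_lyapunov

# Variance of a linear process dX = A X dt + B dW via the matrix Lyapunov
# differential equation  dP/dt = A P + P A^T + B B^T,  P(0) = 0.
A = np.array([[-1.5, 0.3],
              [0.0, -0.8]])
B = np.array([[0.2, 0.0],
              [0.1, 0.3]])
Q = B @ B.T

def lyapunov_rhs(t, p_flat):
    P = p_flat.reshape(2, 2)
    return (A @ P + P @ A.T + Q).ravel()

sol = solve_ivp(lyapunov_rhs, (0.0, 10.0), np.zeros(4), rtol=1e-9, atol=1e-12)
P_T = sol.y[:, -1].reshape(2, 2)

# For a stable A the solution converges to the algebraic Lyapunov solution.
P_inf = solve_continuous_lyapunov(A, -Q)
print(np.round(P_T, 6))
print(np.round(P_inf, 6))
```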
arXiv
We propose a new method for conducting Bayesian prediction that delivers accurate predictions without correctly specifying the unknown true data generating process. A prior is defined over a class of plausible predictive models. After observing data, we update the prior to a posterior over these models via a criterion that captures a user-specified measure of predictive accuracy. Under regularity conditions, this update yields posterior concentration onto the element of the predictive class that maximizes the expectation of the accuracy measure. In a series of simulation experiments and empirical examples we find notable gains in predictive accuracy relative to conventional likelihood-based prediction.
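Schematically, the update can be pictured as re-weighting a prior over a small class of predictive models by an exponentiated accuracy criterion; the toy sketch below uses the log score and hypothetical models and is not the paper's procedure:

```python
import numpy as np
from scipy import stats

# Toy sketch: a prior over a small class of predictive models is updated with
# weights proportional to exp(cumulative predictive-accuracy criterion).
# Here the criterion is the log score; other accuracy measures could be used.
rng = np.random.default_rng(4)
data = rng.standard_t(df=5, size=300)   # true process unknown to the models

models = {
    "normal(0,1)": stats.norm(0, 1),
    "normal(0,1.3)": stats.norm(0, 1.3),
    "laplace(0,1)": stats.laplace(0, 1),
}
prior = np.full(len(models), 1 / len(models))

criterion = np.array([m.logpdf(data).sum() for m in models.values()])
log_post = np.log(prior) + criterion
post = np.exp(log_post - log_post.max())
post /= post.sum()

for name, w in zip(models, post):
    print(f"{name}: posterior weight {w:.3f}")
```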
arXiv
Markov Chain Monte Carlo (MCMC) methods have become increasingly popular in applied mathematics as a tool for numerical integration with respect to complex and high-dimensional distributions. However, the application of MCMC methods to heavy-tailed distributions and distributions with analytically intractable densities turns out to be rather problematic. In this paper, we propose a novel approach to the use of MCMC algorithms for distributions with analytically known Fourier transforms and, in particular, heavy-tailed distributions. The main idea of the proposed approach is to use MCMC methods in the Fourier domain to sample from a density proportional to the absolute value of the underlying characteristic function. A subsequent application of Parseval's formula leads to an efficient algorithm for the computation of integrals with respect to the underlying density. We show that the resulting Markov chain in the Fourier domain may be geometrically ergodic even in the case of heavy-tailed original distributions. We illustrate our approach by several numerical examples including multivariate elliptically contoured stable distributions.
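A minimal sketch of the first step of this approach: a random-walk Metropolis chain targeting a density proportional to the absolute characteristic function of a symmetric alpha-stable law (the Parseval step that turns the samples into integral estimates is omitted; step size and parameters are assumptions):

```python
import numpy as np

# Run a random-walk Metropolis chain whose target is proportional to |phi(u)|,
# the absolute value of the characteristic function. For a symmetric
# alpha-stable law, phi(u) = exp(-|u|**alpha), so the Fourier-domain target is
# light-tailed even though the original distribution is heavy-tailed.
alpha = 1.2
log_target = lambda u: -np.abs(u) ** alpha

rng = np.random.default_rng(5)
n_steps, step = 20000, 2.0
chain = np.empty(n_steps)
u, accepted = 0.0, 0
for i in range(n_steps):
    prop = u + step * rng.standard_normal()
    if np.log(rng.uniform()) < log_target(prop) - log_target(u):
        u, accepted = prop, accepted + 1
    chain[i] = u

print(f"acceptance rate: {accepted / n_steps:.2f}")
```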
arXiv
Using a purely probabilistic argument, we prove the global well-posedness of multidimensional superquadratic backward stochastic differential equations (BSDEs) without a Markovian assumption. The key technique is the interplay between the local well-posedness of fully coupled path-dependent forward-backward stochastic differential equations (FBSDEs) and backward iterations of the superquadratic BSDE. The superquadratic BSDEs studied in this article include quadratic BSDEs that appear in stochastic differential games and price impact models. We also study the well-posedness of superquadratic FBSDEs using the corresponding BSDE results. Our result also provides the well-posedness of a system of path-dependent quasilinear partial differential equations where the nonlinearity has superquadratic growth in the gradient of the solution.
arXiv
Quantifying the indirect transactions between sectors of an economic system has been a long-standing open problem. There have been numerous attempts to define and mathematically formulate this concept in various other scientific fields as well. The existing indirect-effects formulations, however, cannot quantify the indirect transactions between any two sectors of an economic system. Consequently, although the direct and total requirements matrices are formulated and used for economic system analysis, the indirect requirements matrix has never been formulated before. Based on system decomposition theory, the indirect transactions and the corresponding indirect requirements matrix are introduced in the present article for the first time. This novel concept of indirect transactions is also compared with some existing indirect-effect formulations, and the theoretical advancement brought by the proposed methodology is discussed. It is shown, theoretically and through illustrative examples, that the proposed indirect transactions accurately describe and quantify the indirect interactions and relationships, unlike the current indirect-effects formulations. The indirect requirements matrices for the US economy, computed from aggregated input-output tables for multiple years, are also presented and briefly analyzed.
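For orientation, the sketch below computes the classical direct and total requirements matrices and the textbook indirect-effects quantity that the article argues against, on a hypothetical three-sector table; the proposed decomposition-based indirect requirements matrix is not reproduced:

```python
import numpy as np

# Classical input-output quantities for a toy 3-sector economy:
# direct requirements A, total requirements L = (I - A)^{-1} (Leontief
# inverse), and the textbook indirect-effects formulation L - I - A.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])   # direct requirements (hypothetical)

I = np.eye(3)
L = np.linalg.inv(I - A)             # total requirements
indirect_classical = L - I - A       # classical "indirect effects"

print(np.round(L, 3))
print(np.round(indirect_classical, 3))
```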
arXiv
We examine problems of "intermediated implementation," in which a single principal can only regulate limited aspects of the consumption bundles traded between intermediaries and agents with hidden characteristics. An example is sales, whereby retailers compete by offering consumption bundles to customers with hidden tastes, whereas a manufacturer with a potentially different goal than the retailers' is limited by legal barriers to regulating the goods sold but not the prices charged. We study how the principal can implement through intermediaries any social choice rule that is incentive compatible and individually rational for agents. We demonstrate the effectiveness of per-unit fee schedules and distribution regulation, which hinges on whether intermediaries have private or interdependent values. We give further applications to healthcare regulation and income redistribution.
arXiv
What do binary (or probabilistic) forecasting abilities have to do with overall performance? We map the difference between (univariate) binary predictions, bets and "beliefs" (expressed as whether a specific "event" will or will not happen) and real-world continuous payoffs (numerical benefits or harm from an event) and show the effect of their conflation and mischaracterization in the decision-science literature. We also examine the differences under thin and fat tails. The effects are:
A- Spuriousness of many psychological results, particularly those documenting that humans overestimate tail probabilities and rare events, or that they overreact to fears of market crashes, ecological calamities, etc. Many perceived "biases" are just mischaracterizations by psychologists. There is also a misuse of Hayekian arguments in promoting prediction markets.
We quantify such conflations with a metric for "pseudo-overestimation".
B- Being a "good forecaster" in binary space doesn't lead to having a good actual performance}, and vice versa, especially under nonlinearities. A binary forecasting record is likely to be a reverse indicator under some classes of distributions. Deeper uncertainty or more complicated and realistic probability distribution worsen the conflation .
C- Machine Learning: Some nonlinear payoff functions, while not lending themselves to verbalistic expressions and "forecasts", are well captured by ML or expressed in option contracts.
D- Fattailedness: The difference is exacerbated in the power law classes of probability distributions.
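The conflation in point B can be illustrated numerically: under a fat-tailed distribution the binary "event" frequency stays small while the continuous payoff is dominated by exactly those events. The sketch below uses simulated thin- and fat-tailed samples (illustrative choices, not the paper's metric):

```python
import numpy as np

# Under fat tails, "the event" (a large move) is rare in frequency space, yet
# the continuous payoffs are dominated by exactly those moves.
rng = np.random.default_rng(6)
n = 1_000_000
thin = rng.standard_normal(n)            # thin-tailed benchmark
fat = rng.standard_t(df=2.5, size=n)     # fat-tailed (power-law class)

for name, x in [("thin (normal)", thin), ("fat (Student-t, df=2.5)", fat)]:
    k = np.quantile(np.abs(x), 0.99)     # "event" = top 1% of |moves|
    tail = np.abs(x) > k
    share = np.abs(x[tail]).sum() / np.abs(x).sum()
    print(f"{name}: event frequency {tail.mean():.3f}, "
          f"share of total absolute payoff {share:.3f}")
```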
arXiv
We provide sufficient conditions on the coefficients of a stochastic evolution equation on a Hilbert space of functions driven by a cylindrical Wiener process ensuring that its mild solution is positive if the initial datum is positive. As an application, we discuss the positivity of forward rates in the Heath-Jarrow-Morton model via Musiela's stochastic PDE.
arXiv
An open market is a subset of an entire equity market composed of a certain fixed number of top-capitalization stocks. Though the number of stocks in the open market is fixed, the constituents of the market change over time as each company's rank by market capitalization fluctuates. When one is allowed to invest also in the money market, the open market resembles the entire 'closed' equity market in the sense that the equivalence of market viability (lack of arbitrage) and the existence of a numeraire portfolio (a portfolio which cannot be outperformed) holds. When access to the money market is prohibited, some topics such as the Capital Asset Pricing Model (CAPM), the construction of functionally generated portfolios, and the concept of the universal portfolio are presented in the open market setting.
arXiv
We consider the problem of portfolio optimization with a correlation constraint. The framework is a multiperiod stochastic financial market setting with one tradable stock, stochastic income and a non-tradable index. The correlation constraint is imposed on the portfolio and the non-tradable index at some benchmark time horizon. The goal is to maximize the portfolio's expected exponential utility subject to the correlation constraint. Two types of optimal portfolio strategies are considered: the subgame-perfect and the precommitment ones. We find analytical expressions for the constrained subgame-perfect (CSGP) and the constrained precommitment (CPC) portfolio strategies. Both portfolio strategies yield significantly lower risk when compared to the unconstrained setting, at the cost of a small utility loss. The performance of the CSGP and CPC portfolio strategies is similar.
arXiv
We prove a maximum principle for mild solutions to stochastic evolution equations with (locally) Lipschitz coefficients and Wiener noise on weighted $L^2$ spaces. As an application, we provide sufficient conditions for the positivity of forward rates in the Heath-Jarrow-Morton model, considering the associated Musiela SPDE on a homogeneous weighted Sobolev space.
arXiv
We investigate heterogeneous employment effects of Flemish training programmes. Based on administrative individual data, we analyse programme effects at various aggregation levels using Modified Causal Forests (MCF), a causal machine learning estimator for multiple programmes. While all programmes have positive effects after the lock-in period, we find substantial heterogeneity across programmes and types of unemployed. Simulations show that assigning the unemployed to programmes that maximise individual gains, as identified in our estimation, can considerably improve effectiveness. Simplified rules, such as one giving priority to unemployed with low employability, mostly recent migrants, lead to about half of the gains obtained by more sophisticated rules.
arXiv
This paper aims to analyze the relationship between the yield curve -- the line of interest rates at various maturities at a given time -- and GDP growth in Turkey. The paper focuses on analyzing the yield curve in relation to its predictive power for Turkish macroeconomic dynamics using the linear regression model. To do so, the interest rate spreads of different maturities are used as a proxy for the yield curve. The findings of the OLS regression are similar to those found in the literature and support the positive relation between the slope of the yield curve and GDP growth in Turkey. Moreover, the values of GDP growth predicted from the interest rate spread closely follow actual GDP growth in Turkey, indicating the spread's predictive power for economic activity.
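A minimal sketch of the regression described above, using simulated placeholder data rather than the Turkish series (statsmodels OLS of GDP growth on the term spread):

```python
import numpy as np
import statsmodels.api as sm

# GDP growth regressed on the interest-rate spread (long minus short maturity)
# as a yield-curve proxy. The data below are simulated placeholders.
rng = np.random.default_rng(7)
n = 80                                          # e.g. quarterly observations
spread = rng.normal(1.0, 0.8, size=n)           # term spread, percentage points
gdp_growth = 2.0 + 0.9 * spread + rng.normal(0, 1.5, size=n)

X = sm.add_constant(spread)
ols = sm.OLS(gdp_growth, X).fit()
print(ols.summary())

# Fitted values tracked against actual growth.
predicted = ols.predict(X)
print(np.corrcoef(predicted, gdp_growth)[0, 1])
```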
arXiv
Pairs trading is carried out in financial markets to earn profits from a known equilibrium relation between pairs of stocks. In financial markets, stock pairs are seldom correlated without some lead or lag. This lead-lag relationship has been empirically studied in various financial markets. Earlier research works have suggested various measures for identifying the best pairs for pairs trading, but they do not consider this lead-lag effect. The present study proposes a new distance measure that incorporates the lead-lag relationship between the stocks while selecting the best pairs for pairs trading. Further, the lead-lag value between the stocks is allowed to vary continuously over time. The importance of the proposed measure is showcased through experiments on two different datasets, one corresponding to Indian companies and another corresponding to American companies. When the proposed measure is combined with the SSD measure, i.e., when pairs are identified by optimising both measures, the selected pairs consistently generate the best profit compared to all other measures. Finally, possible generalisations and extensions of the proposed distance measure are discussed.
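A minimal fixed-lag sketch of a lead-lag-aware, SSD-type distance is given below; the function names and the integer-lag search are illustrative assumptions, whereas the paper's measure lets the lead-lag value vary continuously over time:

```python
import numpy as np

def ssd(x, y):
    """Sum of squared differences between two normalized price paths."""
    return float(np.sum((x - y) ** 2))

def lead_lag_distance(x, y, max_lag=10):
    """Minimum per-point SSD over integer shifts of y relative to x.

    A fixed-lag sketch; the paper allows a continuously time-varying lag.
    """
    n = len(x)
    best = np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[:n - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:n + lag]
        best = min(best, ssd(a, b) / len(a))   # normalize by overlap length
    return best

# Hypothetical normalized price paths where y lags x by 3 days.
rng = np.random.default_rng(8)
x = np.cumsum(rng.normal(0, 0.01, 300))
y = np.roll(x, 3) + rng.normal(0, 0.002, 300)
print(lead_lag_distance(x, y), ssd(x, y) / len(x))
```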
arXiv
In life-cycle economics the Samuelson paradigm (Samuelson, 1969) states that the optimal investment is in constant proportions out of lifetime wealth composed of current savings and the present value of future income. It is well known that in the presence of credit constraints this paradigm no longer applies. Instead, optimal life-cycle investment gives rise to so-called stochastic lifestyling (Cairns et al., 2006), whereby for low levels of accumulated capital it is optimal to invest fully in stocks and then gradually switch to safer assets as the level of savings increases. In stochastic lifestyling not only does the ratio between risky and safe assets change but also the mix of risky assets varies over time. While the existing literature relies on complex numerical algorithms to quantify optimal lifestyling, the present paper provides a simple formula that captures the main essence of the lifestyling effect with remarkable accuracy.
arXiv
Systemic liquidity risk, defined by the IMF as "the risk of simultaneous liquidity difficulties at multiple financial institutions", is a key topic in macroprudential policy and financial stress analysis. Specialized models to simulate funding liquidity risk and contagion are available, but they require not only banks' bilateral exposure data but also balance sheet data with sufficient granularity, which are hardly available. Alternatively, risk analyses on interbank networks have been done via centrality measures of the underlying graph, capturing the most interconnected banks and hence those most prone to spreading risk. In this paper, we propose a model that relies on an epidemic framework to simulate contagion on the interbank market, using the funding liquidity shortage mechanism as the contagion process. The model is enriched with country and bank risk features that take into account the heterogeneity of the interbank market. The proposed model is particularly useful when the full set of data necessary to run specialized models is not available. Since the interbank network is not fully available, an economically driven reconstruction method is also proposed to retrieve the interbank network by constraining the standard reconstruction methodology to real financial indicators. We show that the contagion model is able to reproduce systemic liquidity risk across different years and countries. This result suggests that the proposed model can be successfully used as a valid alternative to more complex ones.
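A minimal sketch of an epidemic-style funding-shortage cascade on a hypothetical interbank exposure matrix is given below; it omits the paper's country and bank risk features and its network-reconstruction step:

```python
import numpy as np

# Minimal funding-shortage cascade on a hypothetical interbank network.
# exposures[i, j] = funding that bank j receives from bank i. When a lender
# becomes distressed it withdraws funding; a bank whose withdrawn funding
# exceeds its liquidity buffer becomes distressed in turn.
rng = np.random.default_rng(9)
n_banks = 20
exposures = rng.uniform(0, 1, (n_banks, n_banks)) \
    * (rng.random((n_banks, n_banks)) < 0.2)
np.fill_diagonal(exposures, 0.0)
buffers = rng.uniform(0.5, 2.0, n_banks)          # liquidity buffers

distressed = np.zeros(n_banks, dtype=bool)
distressed[0] = True                               # initial shock to one bank
while True:
    withdrawn = exposures[distressed].sum(axis=0)  # funding lost by each bank
    newly = (~distressed) & (withdrawn > buffers)
    if not newly.any():
        break
    distressed |= newly

print(f"{distressed.sum()} of {n_banks} banks distressed")
```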
arXiv
This paper aims to investigate the role of gold as a hedge and/or safe haven against oil price and currency market movements for medium (calm period) and large (extreme movement) fluctuations. In revisiting the role of gold, our study proposes new insights into the literature. First, our empirical design relaxes the assumption of homogeneous investors in favour of agents with different horizons. Second, we develop a new measure of correlation based on the fractal approach, called the q-detrending moving average cross-correlation coefficient. This allows us to measure the dependence for calm and extreme movements. The proposed measure is both time-varying and time-scale varying, taking into account the complex pattern of commodity and financial time series (chaotic, non-stationary, etc.). Using intraday data from May 2017 to March 2019, comprising 35608 observations for each variable, our results are as follows. First, we show a negative and significant average and tail dependence for all time scales between gold and USD exchange rates, which is consistent with gold's role as an effective hedge and safe-haven asset. Second, this study finds average independence and positive, significant tail dependence between gold and oil, indicating that gold can be used by investors as a weak hedge but cannot be used as an effective safe-haven asset under exceptional market circumstances at any time scale. Third, we examine the hedging and stabilising benefits of gold over calm and turmoil periods for gold-oil futures and gold-currency portfolios by estimating the optimal portfolio weights and the optimal hedge ratio. We confirm the usefulness of gold for hedging and as a safe haven at different investment horizons, which favours the inclusion of gold futures in oil futures and currency portfolios for risk management purposes.
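The sketch below computes the standard (q = 2) detrending moving-average cross-correlation coefficient on simulated series; the q-dependent extension used in the paper for calm versus extreme movements is not reproduced:

```python
import numpy as np
import pandas as pd

def dmca_coefficient(x, y, window):
    """Detrending moving-average cross-correlation coefficient (q = 2 case).

    Integrate the demeaned series, detrend each with a centered moving
    average, and correlate the residual fluctuations.
    """
    cx = np.cumsum(x - np.mean(x))
    cy = np.cumsum(y - np.mean(y))
    ma = lambda s: pd.Series(s).rolling(window, center=True).mean().to_numpy()
    rx, ry = cx - ma(cx), cy - ma(cy)
    mask = ~np.isnan(rx) & ~np.isnan(ry)
    f_xy = np.mean(rx[mask] * ry[mask])
    f_xx = np.mean(rx[mask] ** 2)
    f_yy = np.mean(ry[mask] ** 2)
    return f_xy / np.sqrt(f_xx * f_yy)

# Hypothetical gold and USD return series sharing a negatively loaded factor.
rng = np.random.default_rng(10)
common = rng.standard_normal(5000)
gold = 0.6 * common + rng.standard_normal(5000)
usd = -0.6 * common + rng.standard_normal(5000)
print(dmca_coefficient(gold, usd, window=50))   # should be clearly negative
```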
arXiv
Birth rates have dramatically decreased and, with continuous improvements in life expectancy, pension expenditure is on an irreversibly increasing path. This raises serious concerns about the sustainability of public pension systems, which are usually financed on a pay-as-you-go (PAYG) basis where current contributions cover current pension expenditure. With this in mind, the aim of this paper is to propose a mixed pension system that consists of a combination of a classical PAYG scheme and an increase in the contribution rate that is invested in a funded scheme. The investment of the funded part is designed so that the PAYG pension system is financially sustainable at a particular level of probability while at the same time providing some gains to individuals. In this sense, we make individuals an active part of facing the demographic risks inherent in the PAYG scheme and of re-establishing its financial sustainability.