# Research articles for 2019-06-16

arXiv

Designing a financial market that works well is essential for developing and maintaining an advanced economy, but it is not easy: changing detailed rules, even ones that seem trivial, sometimes causes unexpectedly large impacts and side effects. A computer simulation using an agent-based model can directly treat and clearly explain such complex systems, in which micro processes and macro phenomena interact. Many effective agent-based models of human behavior have already been developed. Recently, artificial market models, which are agent-based models of financial markets, have started to contribute to discussions on the rules and regulations of actual financial markets. I introduce an artificial market model for designing financial markets that work well and describe a previous study investigating tick size reduction. I hope that more artificial market models will contribute to designing well-functioning financial markets and thereby help to further develop and maintain advanced economies.
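As an illustration of how an artificial market model can be set up, here is a minimal zero-intelligence sketch (not the model from the paper): traders post limit orders near the last traded price, crossing orders execute immediately, and the tick size enters as an explicit design parameter. All names and parameter values are illustrative assumptions.

```python
import random

def simulate_market(n_steps=1000, tick=0.1, p0=100.0, seed=7):
    """Zero-intelligence artificial market: each step one trader posts a
    limit order around the last price; crossing orders trade immediately."""
    random.seed(seed)
    bids, asks = [], []          # resting limit order prices
    price = p0
    prices = [price]
    for _ in range(n_steps):
        # quote a price near the last trade, rounded to the tick size
        quote = round((price + random.gauss(0, 0.5)) / tick) * tick
        if random.random() < 0.5:                    # buy order
            if asks and quote >= min(asks):          # crosses the best ask
                price = min(asks); asks.remove(price)
            else:
                bids.append(quote)
        else:                                        # sell order
            if bids and quote <= max(bids):          # crosses the best bid
                price = max(bids); bids.remove(price)
            else:
                asks.append(quote)
        prices.append(price)
    return prices

prices = simulate_market()
```

Because every quote is rounded to the tick grid, the effect of a tick size change can be studied simply by rerunning the simulation with a different `tick` argument.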

arXiv

Over-the-counter markets are at the center of the post-crisis global reform of the financial system. We show how the size and structure of such markets can undergo rapid and extensive changes when participants engage in portfolio compression, a post-trade netting technology. Tightly knit and concentrated trading structures, as featured by many large over-the-counter markets, are especially susceptible to the reductions of notional and reconfigurations of network structure that result from compression activities. Using transaction-level data on credit-default-swap markets, we estimate reduction levels consistent with the historical development observed in these markets since the Global Financial Crisis. Finally, we study the effect of a mandate to centrally clear over-the-counter markets. When participants engage in both central clearing and portfolio compression, we find large netting failures if clearinghouses proliferate. Allowing for compression across clearinghouses by and large offsets this adverse effect.
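To make the mechanics of portfolio compression concrete, here is a toy multilateral netting sketch under simplifying assumptions (not the paper's method or data): the bilateral notionals are replaced by a smaller set of trades that preserves every participant's net position, so gross notional falls while net exposures are unchanged. The function and node names are hypothetical.

```python
def compress(trades):
    """Toy multilateral compression: replace a set of bilateral notionals
    with a smaller set of trades preserving every node's net position.
    trades: dict {(payer, receiver): notional}."""
    net = {}
    for (a, b), x in trades.items():
        net[a] = net.get(a, 0.0) - x      # a owes x
        net[b] = net.get(b, 0.0) + x      # b is owed x
    payers = sorted((n, -v) for n, v in net.items() if v < 0)
    receivers = sorted((n, v) for n, v in net.items() if v > 0)
    # greedily match net payers to net receivers
    new_trades, i, j = {}, 0, 0
    while i < len(payers) and j < len(receivers):
        p, owe = payers[i]
        r, due = receivers[j]
        x = min(owe, due)
        new_trades[(p, r)] = new_trades.get((p, r), 0.0) + x
        owe -= x; due -= x
        payers[i] = (p, owe); receivers[j] = (r, due)
        if owe == 0: i += 1
        if due == 0: j += 1
    return new_trades

# a 3-node cycle of exposures: gross notional 60 before compression
old = {("A", "B"): 30.0, ("B", "C"): 20.0, ("C", "A"): 10.0}
new = compress(old)
```

On this toy cycle the gross notional drops from 60 to 20 while each node's net position is unchanged, which is the essential compression effect the abstract describes at market scale.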

arXiv

This paper presents a framework for developing neural networks that predict implied volatility surfaces. Conventional financial conditions and empirical evidence related to implied volatility, including the absence of static arbitrage, boundary conditions, asymptotic slopes, and the volatility smile, are incorporated into the neural network architecture design and model training. These conditions are also satisfied empirically by option data on the S&P 500 index over twenty years. The developed neural network model and its simplified variations outperform the widely used surface stochastic volatility inspired (SSVI) model on mean absolute percentage error in both in-sample and out-of-sample datasets. This study makes two main methodological contributions. First, an accurate deep learning prediction model is developed and tailored to implied volatility surfaces. Second, the framework, which seamlessly combines data-driven models with financial theory, can be extended and applied to other related business problems.
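Two of the conditions named above can be checked directly on a discrete volatility slice. The following is a minimal sketch under stated assumptions, not the paper's architecture: the calendar condition requires total implied variance to be nondecreasing in maturity, and the butterfly condition requires call prices to be convex in strike (checked here with equally spaced strikes).

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, sigma, r=0.0):
    """Black-Scholes call price (used only to test convexity in strike)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def no_calendar_arbitrage(iv, strikes, maturities):
    """Total implied variance w = sigma^2 * T must be nondecreasing in T."""
    return all(
        iv[(K, maturities[i])]**2 * maturities[i]
        <= iv[(K, maturities[i + 1])]**2 * maturities[i + 1] + 1e-12
        for K in strikes for i in range(len(maturities) - 1)
    )

def no_butterfly_arbitrage(S, T, iv, strikes):
    """Call prices must be convex in (equally spaced) strikes:
    discrete second difference >= 0."""
    c = [bs_call(S, K, T, iv[(K, T)]) for K in strikes]
    return all(c[i - 1] - 2 * c[i] + c[i + 1] >= -1e-12
               for i in range(1, len(c) - 1))

# a flat 20% surface is trivially free of both arbitrages
strikes, mats = [90.0, 100.0, 110.0], [0.5, 1.0]
iv = {(K, T): 0.20 for K in strikes for T in mats}
```

In the paper's setting such conditions are built into the network and its training rather than checked after the fact; the sketch only shows what the constraints mean on a grid.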

arXiv

In this paper, we establish sample-path large and moderate deviation principles for log-price processes in Gaussian stochastic volatility models, and we study the asymptotic behavior of exit probabilities, call pricing functions, and implied volatility. In addition, we prove that if the volatility function in an uncorrelated Gaussian model grows faster than linearly, then all moments of the asset price process of order greater than one are infinite. Similar moment explosion results are obtained for correlated models.

arXiv

This paper employs machine learning algorithms to forecast German electricity spot market prices. The forecasts use, in particular, bid and ask order book data from the spot market, but also fundamental market data such as renewable infeed and expected demand. An appropriate feature extraction for the order book data is developed. Using cross-validation to optimise hyperparameters, neural networks and random forests are proposed and compared to statistical reference models; the machine learning models outperform the traditional approaches.
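The hyperparameter-selection step can be illustrated with a deliberately simple stand-in model, not the networks or forests from the paper: walk-forward cross-validation picks the window length of a moving-average forecaster by mean absolute one-step error. The synthetic series and all names are assumptions for illustration.

```python
import random

def ma_forecast(series, w, t):
    """One-step-ahead forecast at time t: mean of the previous w values."""
    return sum(series[t - w:t]) / w

def cv_select_window(series, candidates, n_folds=3):
    """Walk-forward cross-validation: score each window length by its
    mean absolute one-step error over successive held-out segments,
    then keep the window with the lowest average error."""
    fold = len(series) // (n_folds + 1)
    best, best_err = None, float("inf")
    for w in candidates:
        errs = []
        for k in range(1, n_folds + 1):
            seg = range(fold * k, fold * (k + 1))   # held-out segment
            errs.append(sum(abs(ma_forecast(series, w, t) - series[t])
                            for t in seg) / len(seg))
        err = sum(errs) / len(errs)
        if err < best_err:
            best, best_err = w, err
    return best

# synthetic price series with slow mean reversion and heavy noise
random.seed(0)
prices, p = [], 40.0
for _ in range(200):
    p += 0.2 * (40.0 - p) + random.gauss(0, 2.0)
    prices.append(p)
best_w = cv_select_window(prices, candidates=[1, 2, 5, 20])
```

With real order book features the same evaluation scheme would score network or forest hyperparameters instead of a window length, but the held-out, time-ordered structure of the validation is the point of the sketch.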

arXiv

The issue of model risk in default modeling has been known since the inception of the academic literature in the field. However, a rigorous treatment requires a description of all the possible models and a measure of the distance between a single model and the alternatives that is consistent with the applications. This is the purpose of the current paper. We first analytically describe all possible joint models for default in the class of finite sequences of exchangeable Bernoulli random variables. We then measure how the model risk of choosing or calibrating one of them affects the portfolio loss from default, using two popular and economically sensible metrics: Value-at-Risk (VaR) and Expected Shortfall (ES).
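The two metrics can be made concrete in the simplest exchangeable case, where the defaults are additionally independent and the portfolio loss count is binomial. This is an illustrative sketch, not the paper's general construction; the ES here is computed as the conditional tail expectation, a common discrete variant.

```python
from math import comb

def binom_pmf(n, p, k):
    """P(exactly k defaults) when the n exchangeable defaults are i.i.d."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def var_es(loss_pmf, alpha):
    """VaR: smallest loss level whose cdf reaches alpha.  ES here is the
    conditional tail expectation E[L | L >= VaR] (a discrete variant)."""
    cdf, var = 0.0, len(loss_pmf) - 1
    for k, pk in enumerate(loss_pmf):
        cdf += pk
        if cdf >= alpha:
            var = k
            break
    tail = sum(loss_pmf[var:])
    es = sum(j * pk for j, pk in enumerate(loss_pmf[var:], start=var)) / tail
    return var, es

# 50 obligors, 2% marginal default probability: loss count is Binomial(50, 0.02)
n, p = 50, 0.02
pmf = [binom_pmf(n, p, k) for k in range(n + 1)]
var99, es99 = var_es(pmf, 0.99)
```

In the paper's setting the interesting question is how far VaR and ES can move as the joint (exchangeable but dependent) model varies while the marginals stay fixed; the independent case above is just one point in that model class.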

arXiv

In this article we consider a large structural market model of defaultable assets, where the asset value processes are modelled using stochastic volatility models with default upon hitting a lower boundary. The volatility processes are drawn from a class of general mean-reverting diffusions satisfying certain regularity assumptions. The value processes and the volatility processes are all correlated through systemic Brownian motions. We prove that our system converges as the portfolio becomes large, and that the limit of the empirical measure process has a density which is the unique solution to an SPDE in the two-dimensional half-space with a Dirichlet boundary condition. We use Malliavin calculus to establish the existence of a regular density for the volatility component of the solution to our stochastic initial-boundary value problem, and we improve an existing kernel smoothing technique to obtain higher regularity and uniqueness results.

arXiv

We introduce here for the first time the long-term swap rate, characterised as the fair rate of an overnight indexed swap with infinitely many exchanges. Furthermore, we analyse the relationship between the long-term swap rate, the long-term yield (see Biagini et al. [2018], Biagini and Härtel [2014], and El Karoui et al. [1997]), and the long-term simple rate, considered in Brody and Hughston [2016] as a long-term discounting rate. Finally, we investigate the existence of these long-term rates in two term structure methodologies, the Flesaker-Hughston model and the linear-rational model. A numerical example illustrates how our results can be used to estimate the non-optional component of a CoCo bond.

arXiv

Quantifying the importance and power of individual nodes depending on their position in socio-economic networks is a problem that arises across a variety of applications. Examples include the reach of individuals in (online) social networks, the importance of individual banks or loans in financial networks, the relevance of individual companies in supply networks, and the role of traffic hubs in transport networks. Which features characterize the importance of a node in a trade network during the emergence of a globalized, connected market? Here we analyze a model that maps the evolution of trade networks to a percolation problem. In particular, we focus on the influence of the topological features of a node within the trade network. Our results reveal that an advantageous position with respect to different length scales determines the success of a node at different stages of globalization, depending on the speed of globalization.
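The percolation mapping can be illustrated with a toy computation, not the paper's model: trade links are activated one by one (in random order here) and the size of the largest connected trading cluster is tracked with a union-find structure, giving the standard percolation observable.

```python
import random

class DSU:
    """Union-find (disjoint set union) with path halving and union by size."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

def largest_cluster_growth(n=100, seed=3):
    """Activate trade links one by one and record the size of the largest
    connected component after each activation (a percolation curve)."""
    random.seed(seed)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
    random.shuffle(edges)
    dsu, growth = DSU(n), []
    for a, b in edges:
        dsu.union(a, b)
        growth.append(max(dsu.size[dsu.find(v)] for v in range(n)))
    return growth

g = largest_cluster_growth()
```

In the paper's setting the activation order would be driven by economic gains rather than chosen at random, and the node-level question is where in this growth curve a given node joins the giant cluster.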