# Research articles for 2021-07-01

A structural approach to default modelling with pure jump processes
Jean-Philippe Aguilar, Nicolas Pesci, Victor James
arXiv

We present a general framework for the estimation of corporate default based on a firm's capital structure, when its assets are assumed to follow a pure jump Lévy process; this setup provides a natural extension to the usual default metrics defined in diffusion (log-normal) models, and makes it possible to capture extreme market events such as sudden drops in asset prices, which are closely linked to default occurrence. Within this framework, we introduce several pure jump processes featuring negative jumps only and derive practical closed formulas for equity prices, which enable us to use a moment-based algorithm to calibrate the parameters from real market data and to estimate the associated default metrics. A notable feature of these models is the redistribution of credit risk towards shorter maturities: this constitutes an interesting improvement over diffusion models, which are known to underestimate short-term default probabilities. We also provide extensions to a model featuring both positive and negative jumps and discuss qualitative and quantitative features of the results. For the reader's convenience, practical tools for model implementation and GitHub links are also included.

Choice of a Mentor: A Subjective Evaluation of Expectations, Experiences and Feedbacks
Kaibalyapati Mishra
arXiv

Recent trends in academics show an increase in enrollment levels in higher education, predominantly in doctoral programmes, where individual scholars, institutes, and supervisors play the key roles. The human factor at the receiving end of academic excellence is the scholar, with the supervisor at the facilitating end. In this paper I try to establish the role of different factors, and of the availability of information about them, in forming the basic choice set in a scholar's mind. After studying three different groups of individuals who were subjected to substitutive choices, we found that scholars prefer an approachable, moderately intervening and frequently interacting professor as their guide.

Consistent Recalibration Models and Deep Calibration
Matteo Gambara, Josef Teichmann
arXiv

Consistent recalibration (CRC) models have been introduced to capture, in the necessary generality, the dynamic features of term structures of derivatives' prices. Several approaches have been suggested to tackle this problem, but all of them, including CRC models, suffered from numerical intractability, mainly due to the presence of complicated drift terms or consistency conditions. We overcome this problem with machine learning techniques, which allow us to store the crucial drift-term information in neural-network-type functions. This yields, for the first time, dynamic term structure models that can be efficiently simulated.

Crude oil price forecasting incorporating news text
Yun Bai, Xixi Li, Hao Yu, Suling Jia
arXiv

Sparse and short news headlines can be arbitrary, noisy, and ambiguous, making it difficult for LDA (latent Dirichlet allocation), a classic topic model designed for long texts, to discover knowledge from them. Nonetheless, some existing research on text-based crude oil forecasting employs LDA to explore topics from news headlines, resulting in a mismatch between the short text and the topic model that further degrades forecasting performance. Exploiting advanced and appropriate methods to construct high-quality features from news headlines thus becomes crucial in crude oil forecasting. To tackle this issue, this paper introduces two novel indicators, of topic and of sentiment, for short and sparse text data. Empirical experiments show that AdaBoost.RT with our proposed text indicators, which give a more comprehensive view and characterization of the short and sparse text data, outperforms the other benchmarks. Another significant merit is that our method also yields good forecasting performance when applied to other futures commodities.

Feasible Implied Correlation Matrices from Factor Structures
arXiv

Forward-looking correlations are of interest in various financial applications, including factor-based asset pricing, forecasting stock-price movements, and pricing index options. With a focus on non-FX markets, this paper defines necessary conditions for option-implied correlation matrices to be mathematically and economically feasible, and argues that existing models are typically not capable of guaranteeing this. To overcome this difficulty, the problem is addressed from the underlying factor structure, and two approaches are introduced to solve it. Under the quantitative approach, the puzzle is reformulated into a nearest-correlation-matrix problem, which can be used either as a stand-alone estimate or to re-establish positive semi-definiteness of any other model's estimate. Under the economic approach, it is discussed how expected correlations between stocks and risk factors (as in the CAPM or Fama-French models) can be translated into a feasible implied correlation matrix. Empirical experiments are carried out on monthly option data on the S&P 100 and S&P 500 indices (1996-2020).

Forecasting directional movements of stock prices for intraday trading using LSTM and random forests
Pushpendu Ghosh, Ariel Neufeld, Jajati Keshari Sahoo
arXiv

We employ both random forests and LSTM networks (more precisely, CuDNNLSTM) as training methodologies to analyze their effectiveness in forecasting out-of-sample directional movements of constituent stocks of the S&P 500, from January 1993 to December 2018, for intraday trading. We introduce a multi-feature setting consisting not only of returns with respect to closing prices, but also of returns with respect to opening prices and intraday returns. As trading strategy, we use the approach of Krauss et al. (2017) and Fischer & Krauss (2018) as benchmark. On each trading day, we buy the 10 stocks with the highest probability of outperforming the market in terms of intraday returns, and sell short the 10 stocks with the lowest such probability, all with equal monetary weight. Our empirical results show that the multi-feature setting provides a daily return, prior to transaction costs, of 0.64% using LSTM networks and 0.54% using random forests. Hence we outperform the single-feature setting of Fischer & Krauss (2018) and Krauss et al. (2017), which consists only of daily returns with respect to closing prices and yields corresponding daily returns of 0.41% and 0.39% for LSTM networks and random forests, respectively.
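The selection rule described in the abstract (long the 10 stocks with the highest predicted probability of beating the market, short the 10 with the lowest, all with equal monetary weight) can be sketched as follows; the function name and toy tickers are illustrative, not from the paper:

```python
# Minimal sketch of the daily long-short selection rule described above.
# `probs` maps each ticker to a model's predicted probability that the
# stock outperforms the market intraday; weights are +1/k for longs and
# -1/k for shorts, so the portfolio is dollar-neutral by construction.

def build_portfolio(probs, k=10):
    """Return a dict ticker -> weight: long the top k by predicted
    probability, short the bottom k, equal monetary weight."""
    ranked = sorted(probs, key=probs.get, reverse=True)
    longs, shorts = ranked[:k], ranked[-k:]
    w = 1.0 / k
    weights = {t: w for t in longs}
    weights.update({t: -w for t in shorts})
    return weights

# Toy example with k=2 (illustrative probabilities)
probs = {"AAA": 0.9, "BBB": 0.7, "CCC": 0.5, "DDD": 0.3, "EEE": 0.1}
portfolio = build_portfolio(probs, k=2)
```

Stocks in the middle of the ranking (here "CCC") receive no position, mirroring the paper's focus on the extremes of the cross-sectional prediction.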

Governance in Systems Based on Distributed Ledger Technology (DLT): A Comparative Study
Naudts, Ellen; Aerts, Timothy; Franken, Leonard; Pieterse, Aimo
SSRN
The (experimental) use of DLT is growing, including in the financial sector. DLT systems are promoted on the grounds that they use (consensus) algorithms and cryptography to create a leaderless, horizontal system. How can it be ensured that decisions in these DLT systems serve the interests of all stakeholders and the public interest, including supervisory authorities? Given the popularity of the Bitcoin and Ethereum blockchains, for example, it is not inconceivable that part of the FMI landscape will be based on a public DLT system in the future. It is therefore important to gain a better understanding of the governance of such DLT systems and of the risks financial institutions face when using DLT. This paper compares the governance of financial institutions (FIs) and financial market infrastructures (FMIs) on the one hand with the governance of systems based on distributed ledger technology (DLT systems) on the other, to discover how they differ and where potential risks and benefits lie for their future use, including in the financial markets. We answer two questions: (i) What are the differences in governance between traditional FIs, FMIs, and DLT systems? and (ii) What are the consequences of decentralized governance of DLT systems for supervisory authorities?

Improving Liquidity in Emission Trading Schemes
Park, Kwangwoo; Kim, Jihun
SSRN
This paper constructs a model of an Emission Trading Scheme (ETS) market using bid-ask spreads. We show that when such a market is dominated by a small number of traders with substantial market power, they tend to maximize their profits by widening bid-ask spreads, thereby reducing market liquidity. We argue that adding more market participants, including derivatives traders, can alleviate this illiquidity problem. Policy changes in the European Union's ETS illustrate our theory: market liquidity increased significantly after liquidity-provision policies were enacted to attract more participants during the transition from Phase 1 to Phase 2.

Market regime classification with signatures
Paul Bilokon, Antoine Jacquier, Conor McIndoe
arXiv

We provide a data-driven algorithm to classify market regimes for time series. We utilise the path signature, encoding time series into easy-to-describe objects, and provide a metric structure which establishes a connection between separation of regimes and clustering of points.
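As a rough illustration of the idea (an assumed sketch, not the authors' code), the truncated signature of a piecewise-linear path can be computed directly, and a metric on signature space then gives a notion of distance between regimes:

```python
# Assumed illustration: truncated (level-2) path signature of a
# piecewise-linear path. Level-1 terms are the total increments; level-2
# terms are the iterated integrals, accumulated segment by segment.
from itertools import product

def signature_level2(path):
    """path: list of d-dimensional points. Returns (lvl1, lvl2), where
    lvl1[i] is the total increment of coordinate i and lvl2[(i, j)] is
    the iterated integral of coordinate i against coordinate j."""
    d = len(path[0])
    x0 = path[0]
    lvl1 = [path[-1][i] - x0[i] for i in range(d)]
    lvl2 = {ij: 0.0 for ij in product(range(d), repeat=2)}
    for a, b in zip(path, path[1:]):
        inc = [b[i] - a[i] for i in range(d)]
        run = [a[i] - x0[i] for i in range(d)]  # increment so far
        for i, j in lvl2:
            lvl2[(i, j)] += run[i] * inc[j] + 0.5 * inc[i] * inc[j]
    return lvl1, lvl2

def sig_distance(p, q):
    """Euclidean distance between the truncated signatures of two paths."""
    (a1, a2), (b1, b2) = signature_level2(p), signature_level2(q)
    sq = sum((x - y) ** 2 for x, y in zip(a1, b1))
    sq += sum((a2[k] - b2[k]) ** 2 for k in a2)
    return sq ** 0.5

# A 1-d time series lifted to a 2-d path by prepending time
series = [1.0, 1.5, 1.2, 2.0]
path = [(float(t), v) for t, v in enumerate(series)]
```

Clustering points under such a metric is one simple way to realise the "separation of regimes as clustering" connection the abstract describes; production code would typically use a dedicated signature library and higher truncation levels.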

Political and legal aspects of the COVID-19 pandemic impact on world transport systems
Alexey Gubin, Valeri Lipunov, Mattia Masolletti
arXiv

The authors of the article analyze the impact of the global COVID-19 pandemic on the transport and logistics sector. The research is interdisciplinary in nature. The purpose of the study is to identify and briefly characterize new trends in the field of transport and cargo transportation in post-COVID conditions.

Price change prediction of ultra high frequency financial data based on temporal convolutional network
Wei Dai, Yuan An, Wen Long
arXiv

Through in-depth analysis of ultra-high-frequency (UHF) stock price change data, more reasonable discrete dynamic distribution models are constructed in this paper. First, we classify price changes into several categories. Then, a temporal convolutional network (TCN) is used to predict the conditional probability of each category. Furthermore, an attention mechanism is added to the TCN architecture to model the time-varying distribution of stock price change data. Empirical research on constituent stocks of the Shenzhen Stock Exchange 100 Index (SZSE 100) finds that the TCN framework and the TCN-with-attention framework describe the dynamics of UHF stock price change sequences better than GARCH-family models and the long short-term memory (LSTM) framework. In addition, the dataset contains nearly 10 million observations; to the best of our knowledge, there has been no previous attempt to apply TCNs to such a large-scale UHF transaction price dataset in the Chinese stock market.

Quantifying Long-Term Market Impact
Harvey, Campbell R.; Ledford, Anthony; Sciulli, Emidio; Ustinov, Philipp; Zohren, Stefan
SSRN
Impact costs occur when large buy or sell orders move market prices. Measuring these costs is crucial for evaluating potential trading strategies and for successfully executing systematic investment strategies. However, common approaches suffer from a type of myopia: impact is measured only for the current transaction. In many cases, orders are correlated, and the impact of the first order affects the execution of future orders. We propose a new measure that quantifies the long-term effects of market impact: Expected Future Flow Shortfall (EFFS). Our method is both intuitive and straightforward to implement. Importantly, the EFFS method performs competitively with far more complex and data-hungry approaches. It should be useful both for evaluating execution methods and for sizing orders.

Relative arbitrage: sharp time horizons and motion by curvature
RePEC
We characterize the minimal time horizon over which any equity market with d ≥ 2 stocks and sufficient intrinsic volatility admits relative arbitrage. If d ∈ {2, 3}, the minimal time horizon can be computed explicitly, its value being zero if d = 2 and √3/(2π) if d = 3. If d ≥ 4, the minimal time horizon can be characterized via the arrival time function of a geometric flow of the unit simplex in R^d that we call the minimum curvature flow.

Review Essay: Institutional Shareholders, Short-Termism and the Odds of a Coincidence
Tingle, Bryce, QC
SSRN
The essay reviews two recent book-length treatments of the rise of institutional shareholder power in the corporate governance arrangements of Western corporations, the internal incentives of those shareholders, and the economic consequences. The books reviewed are: Corporate Governance and Investment Management: The Promises and Limitations of the New Financial Economy, by Roger M. Barker and Iris H.-Y. Chiu (Cheltenham, U.K.; Northampton, Massachusetts: Edward Elgar, 2017); and Corporate Law and Economic Stagnation: How Shareholder Value and Short-Termism Contribute to the Decline of Western Economies, by Pavlos E. Masouros (The Hague, Netherlands: Eleven International Publishing, 2013).

Robust Replication of Volatility and Hybrid Derivatives on Jump Diffusions
Peter Carr, Roger Lee, Matthew Lorig
arXiv

We price and replicate a variety of claims written on the log price $X$ and quadratic variation $[X]$ of a risky asset, modeled as a positive semimartingale, subject to stochastic volatility and jumps. The pricing and hedging formulas do not depend on the dynamics of the volatility process, aside from integrability and independence assumptions; in particular, the volatility process may be non-Markovian and exhibit jumps of unknown distribution. The jump risk may be driven by any finite activity Poisson random measure with bounded jump sizes. As hedging instruments, we use the underlying risky asset, a zero-coupon bond, and European calls and puts with the same maturity as the claim to be hedged. Examples of contracts that we price include variance swaps, volatility swaps, a claim that pays the realized Sharpe ratio, and a call on a leveraged exchange traded fund.
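For context, the hedging instruments listed in the abstract (bond, underlying, and vanilla calls and puts) are exactly the building blocks of the classical spanning identity of Carr and Madan, which statically replicates any twice-differentiable European payoff $f(F_T)$ around a chosen strike $\kappa$; the paper's jump-robust formulas operate in this spirit, though this identity itself is the textbook version, not the paper's result:

```latex
f(F_T) = f(\kappa) + f'(\kappa)\,(F_T - \kappa)
       + \int_0^{\kappa} f''(K)\,(K - F_T)^{+}\,\mathrm{d}K
       + \int_{\kappa}^{\infty} f''(K)\,(F_T - K)^{+}\,\mathrm{d}K
```

Taking $f = \log$ yields the log contract underlying standard variance swap replication in continuous models; the abstract's contribution is extending such replication robustly to jump diffusions.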

State-Owned Commercial Banks
Panizza, Ugo
RePEC
This paper builds a new dataset on bank ownership and reassesses the links between state-ownership of banks and each of financial development, economic growth, financial stability, bank performance, liquidity creation, and lending cyclicality. Using panel data to estimate the short- and medium-term relationship between state-ownership and financial depth, the paper shows that there is no robust correlation between these two variables. The paper also finds no evidence of a negative correlation between state-ownership of banks and economic growth (if anything, the relationship is positive but rarely statistically significant). Looking at financial instability, the paper finds that banking crises predict increases in state-ownership but that there is no evidence that high state-ownership predicts banking crises. Focusing on bank performance, the paper shows that data for the period 1995-2009 are consistent with existing evidence that state-owned banks are less profitable than their private counterparts in emerging and developing economies. However, more recent data show no difference between the profitability of private and public banks located in emerging and developing economies. The paper also corroborates the existing literature which shows that in emerging and developing economies lending by state-owned banks is less procyclical than private bank lending. Exploring the role of fiscal fundamentals, the paper does not find any difference in countercyclicality between high- and low-debt countries, but it finds that countercyclical lending by state-owned banks substitutes for, rather than complements, countercyclical fiscal policy. It also finds that lending by state-owned banks helps smooth production in labor-intensive industries and in industries with a large share of small firms.

Statistical Arbitrage for Multiple Co-Integrated Stocks
T. N. Li, A. Papanicolaou
arXiv

In this paper we construct and analyse a multi-asset model to be used for long-term statistical arbitrage strategies. A key feature of the model is that all assets are co-integrated, which, if sustained, allows for long-term positive profits with a low probability of losses. Optimal portfolios are found by solving a Hamilton-Jacobi-Bellman equation, into which we can introduce portfolio constraints such as market neutrality or dollar neutrality. Under specific parameter conditions, we prove that an optimal portfolio exhibits long-term stability with a stable growth rate. Historical prices of the S&P 500 constituents can be tested for co-integration and our model calibrated for analysis, from which we find that co-integration strategies require a terminal investment horizon sufficiently far in the future for the optimal portfolios to gain from co-integration. The data also demonstrate that statistical arbitrage portfolios have improved in-sample Sharpe ratios compared to multivariate Merton portfolios, and that statistical arbitrage portfolios are naturally immune to market fluctuations.

The Limit Order Book Recreation Model (LOBRM): An Extended Analysis
Zijian Shi, John Cartlidge
arXiv

The limit order book (LOB) depicts the fine-grained demand and supply relationship for financial assets and is widely used in market microstructure studies. Nevertheless, the limited availability and high cost of LOB data restrict its wider application. The LOB recreation model (LOBRM) was recently proposed to bridge this gap by synthesizing the LOB from trades and quotes (TAQ) data. However, the original LOBRM study had two limitations: (1) experiments were conducted on a relatively small dataset containing only one day of LOB data; and (2) training and testing were performed in a non-chronological fashion, which essentially re-frames the task as interpolation and potentially introduces lookahead bias. In this study, we extend the research on LOBRM and further validate its use in real-world application scenarios. We first advance the workflow of LOBRM by (1) adding a time-weighted z-score standardization for the LOB and (2) substituting an exponential decay kernel for the ordinary differential equation kernel to lower computational complexity. Experiments are conducted on the extended LOBSTER dataset in a chronological fashion, as it would be used in a real-world application. We find that (1) LOBRM with the decay kernel is superior to traditional non-linear models, and module ensembling is effective; (2) prediction accuracy is negatively related to the volatility of order volumes resting in the LOB; (3) the proposed sparse encoding method for TAQ data exhibits good generalization ability and can facilitate a variety of tasks; and (4) the influence of stochastic drift on prediction accuracy can be alleviated by increasing the number of historical samples.

The Role of Binance in Bitcoin Volatility Transmission
Carol Alexander, Daniel Heck, Andreas Kaeck
arXiv

We analyse high-frequency realised volatility dynamics and spillovers in the bitcoin market, focusing on two pairs: bitcoin against the US dollar (the main fiat-crypto pair) and bitcoin against tether (the main crypto-crypto pair). We find that the tether-margined perpetual contract on Binance is clearly the main source of volatility, continuously transmitting strong flows to all other instruments while receiving only little volatility in return. Moreover, we find that (i) during US trading hours, traders pay more attention and react more strongly to prevailing market conditions when updating their expectations, and (ii) the crypto market exhibits higher interconnectedness when traditional Western stock markets are open. Our results highlight that regulators should consider not only spot exchanges offering bitcoin-fiat trading but also the tether-margined derivatives products available on most unregulated exchanges, most importantly Binance.

The incremental information in the yield curve about future interest rate risk
Christensen, Bent Jesper; Kjær, Mads Markvart; Veliyev, Bezirgen
RePEC
Using high-frequency intraday futures prices to measure yield volatility at selected maturities, we find that daily yield curves carry incremental information about future interest rate risk at the long end, relative to that contained in the time series of historical volatilities. Some of the information in the yield curves is not captured by standard affine models. At the short end, time-series-based forecasts outperform yield-curve-based forecasts. Relative to a random walk, both provide utility to a risk-averse investor in longer-term instruments, but not in short-term instruments. Our results point to the existence of an unspanned volatility factor.

Unconventional Credit Policy in an Economy under Zero Lower Bound
Pozo, Jorge; Rojas, Youel
RePEC
In this paper we develop a simple two-period model that reconciles credit demand and supply frictions. In this stylized but realistic model, credit and deposit markets are interlinked, and credit demand and credit supply frictions amplify each other in such a way that, in equilibrium, they produce very low levels of credit and stronger reductions of real and nominal interest rates, so the economy is much closer to the zero lower bound (ZLB). However, an unconventional credit policy, consisting of central bank loans to firms that are guaranteed by the government, can partially undo the effects of the credit frictions and prevent the economy from reaching the ZLB. Because central bank loans are not subject to the moral hazard problem between bankers and depositors and are government-guaranteed, such credit market interventions raise aggregate credit supply and positively affect aggregate credit demand. However, once the economy is at the ZLB, the effect of the credit policy is reduced due to a relatively stronger reduction in inflation, which in turn reduces entrepreneurs' incentives to demand bank loans.