Research articles for 2020-04-19

COVID-19: $R_0$ is lower where outbreak is larger
Pietro Battiston, Simona Gamba
arXiv

We use daily data from Lombardy, the Italian region most affected by the COVID-19 outbreak, to calibrate a SIR model individually on each municipality. These are all covered by the same health system and, in the post-lockdown phase we focus on, all subject to the same social distancing regulations. We find that municipalities with a higher number of cases at the beginning of the period analyzed have a lower rate of diffusion, which cannot be imputed to herd immunity. In particular, there is a robust and strongly significant negative correlation between the estimated basic reproduction number ($R_0$) and the initial outbreak size, in contrast with the role of $R_0$ as a \emph{predictor} of outbreak size. We explore different possible explanations for this phenomenon and conclude that a higher number of cases induces behavioral changes, such as stricter adherence to social distancing measures among the population, that reduce the spread. This result calls for a transparent, real-time distribution of detailed epidemiological data, as such data affects the behavior of populations in areas affected by the outbreak.
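The per-municipality calibration can be sketched as a discrete-time SIR fit; this is a minimal illustrative version, not the authors' code, and all function names, grid ranges, and parameter values are assumptions:

```python
import numpy as np

def sir_step(s, i, r, beta, gamma):
    """One discrete-time SIR step: beta is the transmission rate, gamma the recovery rate."""
    new_inf = beta * s * i
    new_rec = gamma * i
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def simulate_sir(i0, beta, gamma, n_days):
    """Simulate the infected fraction over n_days, starting from initial fraction i0."""
    s, i, r = 1.0 - i0, i0, 0.0
    traj = [i]
    for _ in range(n_days - 1):
        s, i, r = sir_step(s, i, r, beta, gamma)
        traj.append(i)
    return np.array(traj)

def calibrate_beta(observed, i0, gamma, betas):
    """Grid-search beta minimizing squared error to the observed infected series;
    the basic reproduction number is then R0 = beta / gamma."""
    errors = [np.sum((simulate_sir(i0, b, gamma, len(observed)) - observed) ** 2)
              for b in betas]
    return betas[int(np.argmin(errors))]
```

Running this fit independently per municipality, as in the paper, yields one $R_0$ estimate per initial outbreak size, which can then be correlated across municipalities.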



Clustering Approaches for Global Minimum Variance Portfolio
Jinwoo Park
arXiv

The only input needed to obtain the portfolio weights of the global minimum variance portfolio (GMVP) is the covariance matrix of returns of the assets being considered for investment. Since the population covariance matrix is not known, investors use historical data to estimate it. Even though the sample covariance matrix is an unbiased estimator of the population covariance matrix, it carries a great amount of estimation error, especially when the number of observations is not much larger than the number of assets. As it is difficult to estimate a high-dimensional covariance matrix all at once, clustering stocks has been proposed to build the covariance matrix in two steps: first within each cluster and then between clusters. This decreases the estimation error by reducing the number of features in the data matrix. The motivation of this dissertation is that the estimation error can still remain high even after clustering if a large number of stocks are clustered together in a single group. This research proposes to utilize a bounded clustering method in order to limit the maximum cluster size. The experimental results show that not only does the gap between in-sample and out-of-sample volatility decrease, but the out-of-sample volatility itself is also reduced. This implies that a bounded clustering algorithm is needed so that the maximum cluster size can be precisely controlled to find the best portfolio performance.
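For reference, the GMVP weights follow in closed form from whatever covariance estimate is used; a minimal sketch (illustrative, not the dissertation's code):

```python
import numpy as np

def gmvp_weights(cov):
    """Global minimum variance portfolio: minimize wᵀΣw subject to the weights
    summing to one. Closed form: w = Σ⁻¹1 / (1ᵀΣ⁻¹1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

def sample_cov(returns):
    """Sample covariance of a T×N matrix of asset returns -- the noisy
    estimator that the clustering approach seeks to improve."""
    return np.cov(returns, rowvar=False)
```

Any improved estimator (e.g. the two-step clustered covariance described above) simply replaces `sample_cov` as the input to `gmvp_weights`.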



Designing a NISQ reservoir with maximal memory capacity for volatility forecasting
Samudra Dasgupta, Kathleen E. Hamilton, Pavel Lougovski, Arnab Banerjee
arXiv

Quantitative risk management, particularly volatility forecasting, is critically important to traders, portfolio managers, and policy makers. In this paper, we apply quantum reservoir computing to forecasting the VIX (the CBOE volatility index), a highly non-linear and memory-intensive 'real-life' signal that is driven by market dynamics and trader psychology and cannot be expressed by a deterministic equation. As a first step, we lay out the systematic design considerations for using a NISQ reservoir as a computing engine (which should be useful for practitioners). We then show how to experimentally evaluate the memory capacity of various reservoir topologies (using IBM-Q's Rochester device) to identify the configuration with maximum memory capacity. Once the optimal design is selected, the forecast is produced by a linear combination of the average spin of a 6-qubit quantum register trained using VIX and SPX data from 1990 onwards. We test the forecast performance over the sub-prime mortgage crisis period (Dec 2007 - Jun 2009). Our results show a remarkable ability to predict the volatility during the Great Recession using today's NISQ devices.
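The memory-capacity criterion used to select the reservoir can be illustrated classically; the sketch below uses a random echo-state (classical) reservoir as a stand-in for the quantum register, with a linear readout trained to reproduce delayed inputs. All names, sizes, and the choice of a classical reservoir are illustrative assumptions, not the paper's hardware setup:

```python
import numpy as np

def run_reservoir(u, W, Win):
    """Drive an echo-state reservoir with input sequence u; return the state trajectory."""
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = np.tanh(W @ x + Win * u_t)
        states.append(x.copy())
    return np.array(states)

def memory_capacity(u, states, max_lag=20):
    """MC = sum over lags k of the squared correlation between a linear readout
    trained to reproduce u(t - k) and the true delayed input."""
    mc = 0.0
    for k in range(1, max_lag + 1):
        X, y = states[k:], u[:-k]
        w, *_ = np.linalg.lstsq(X, y, rcond=None)
        mc += np.corrcoef(X @ w, y)[0, 1] ** 2
    return mc
```

Comparing this scalar across candidate topologies (as the paper does across qubit-coupling configurations) identifies the reservoir with the most usable memory.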



Empirical Study of Market Impact Conditional on Order-Flow Imbalance
Anastasia Bugaenko
arXiv

In this research we empirically investigate the key drivers affecting liquidity in equity markets. We illustrate how theoretical models of agents' interplay in financial markets, such as Kyle's model, align with the phenomena observed in publicly available trades and quotes data. Specifically, we confirm that for small signed order-flows, the price impact grows linearly with the order-flow imbalance. We further implement a machine learning algorithm to forecast market impact given a signed order-flow. Our findings suggest that machine learning models can be used in the estimation of financial variables, and that the predictive accuracy of such learning algorithms can surpass that of traditional statistical approaches.

Understanding the determinants of price impact is crucial for several reasons. From a theoretical stance, modelling the impact provides a statistical measure of liquidity. Practitioners adopt impact models as a pre-trade tool to estimate expected transaction costs and optimize the execution of their strategies. This further serves as a post-trade valuation benchmark, as suboptimal execution can significantly deteriorate portfolio performance.

More broadly, the price impact reflects the balance of liquidity across markets. This is of central importance to regulators, as it helps explain the relationship between market design and systemic risk, enabling regulators to design more stable and efficient markets.
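The linear small-imbalance regime can be estimated with a plain OLS fit of price changes on signed order-flow imbalance (the slope playing the role of Kyle's lambda); a minimal sketch with illustrative names, not the paper's estimator:

```python
import numpy as np

def fit_linear_impact(imbalance, price_change):
    """OLS regression of price change on signed order-flow imbalance;
    returns the slope, i.e. the linear impact coefficient (Kyle's lambda)."""
    x = np.asarray(imbalance, dtype=float)
    y = np.asarray(price_change, dtype=float)
    X = np.column_stack([np.ones_like(x), x])  # intercept + imbalance
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1]
```

In practice one would restrict the sample to small |imbalance| bins, where the paper confirms the linear relationship holds.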



Induced Innovation and Economic Environment
Tomáš Evan, Vladimír Holý
arXiv

The Hicks induced innovation hypothesis states that a price increase of a production factor is a spur to invention. We propose an alternative hypothesis restating that a spur to invention requires not only an increase in one factor but also a decrease in at least one other factor, to offset the companies' costs. We illustrate the need for our alternative hypothesis with a historical example of the industrial revolution in the United Kingdom. Furthermore, we econometrically evaluate both hypotheses in a case study of research and development (R&D) in 29 OECD countries from 2003 to 2017. Specifically, we investigate the dependence of R&D investment on the economic environment, represented by average wages and oil prices, using panel regression. We find that our alternative hypothesis is supported for R&D funded and/or performed by business enterprises, while the original Hicks hypothesis holds for R&D funded by the government and R&D performed by universities.
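The panel regression of R&D investment on wages and oil prices can be sketched with a within (fixed-effects) estimator; this is a generic illustration under assumed variable names, not the authors' specification:

```python
import numpy as np

def within_transform(y, groups):
    """Demean a series within each group (absorbing country fixed effects)."""
    out = np.asarray(y, dtype=float).copy()
    for g in np.unique(groups):
        m = groups == g
        out[m] -= out[m].mean()
    return out

def fe_panel_ols(y, X, groups):
    """Fixed-effects panel OLS: within-demean y and each regressor,
    then run OLS without an intercept on the demeaned data."""
    yd = within_transform(y, groups)
    Xd = np.column_stack([within_transform(X[:, j], groups)
                          for j in range(X.shape[1])])
    beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
    return beta
```

Here `groups` would index countries and the columns of `X` hold average wages and oil prices, so `beta` gives the within-country response of R&D to each factor.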



Machine learning for multiple yield curve markets: fast calibration in the Gaussian affine framework
Sandrine Gümbel, Thorsten Schmidt
arXiv

Calibration is a highly challenging task, in particular in multiple yield curve markets. This paper is a first attempt to study the chances and challenges of applying machine learning techniques to it. We employ Gaussian process regression, a machine learning methodology having many similarities with extended Kalman filtering - a technique which has been applied many times to interest rate markets and term structure models.

We find very good results for single-curve markets and many challenges for multi-curve markets in a Vasicek framework. The Gaussian process regression is implemented with the Adam optimizer and the non-linear conjugate gradient method, of which the latter performs best. We also point towards future research.
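Gaussian process regression in its basic form fits a curve through noisy observations via a kernel; a generic sketch using an RBF kernel on synthetic points (illustrative only, not the paper's multi-curve setup or its optimizer-based hyperparameter calibration):

```python
import numpy as np

def rbf_kernel(a, b, length=1.0, var=1.0):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-6, length=1.0):
    """Posterior mean of GP regression: K* (K + noise·I)^{-1} y."""
    K = rbf_kernel(x_train, x_train, length) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train, length)
    return Ks @ np.linalg.solve(K, y_train)
```

In the paper's setting the training inputs would be maturities with observed (multi-)curve quotes, and the kernel hyperparameters are what the Adam and conjugate-gradient optimizers tune.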



Minimizing the Ruin Probability under the Sparre Andersen Model
Linlin Tian, Lihua Bai
arXiv

In this paper, we consider the problem of minimizing the ruin probability of an insurance company in which the surplus process follows the Sparre Andersen model. Similar to Bai et al. \cite{bai2017optimal}, we recast this problem in a Markovian framework by adding another dimension representing the time elapsed since the last claim. After this Markovization, we investigate the regularity properties of the value function and state the dynamic programming principle. Furthermore, we show that the value function is the unique constrained viscosity solution of the associated Hamilton-Jacobi-Bellman equation. It should be noted that there is no discount factor in our setting, which makes the uniqueness tricky to prove. To overcome this difficulty, we construct a strict viscosity supersolution. Then, instead of comparing the usual viscosity supersolution and subsolution, we compare the strict supersolution and the subsolution. Eventually we show that every viscosity subsolution is dominated by the supersolution.



Modeling Institutional Credit Risk with Financial News
Tam Tran-The
arXiv

Credit risk management, the practice of mitigating losses by understanding the adequacy of a borrower's capital and loan loss reserves, has long been imperative to any financial institution's long-term sustainability and growth. MassMutual is no exception. The company is keen on effectively monitoring downgrade risk, i.e., the risk that a company's credit rating deteriorates. Current work in downgrade risk modeling depends on multiple variations of quantitative measures provided by third-party rating agencies and risk management consultancy companies. As these structured numerical data become increasingly commoditized among institutional investors, there has been a wide push into using alternative sources of data, such as financial news, earnings call transcripts, or social media content, to possibly gain a competitive edge in the industry. The volume of qualitative information and unstructured text data has exploded in the past decades and is now available for due diligence to supplement quantitative measures of credit risk. This paper proposes a predictive downgrade model using solely news data represented by neural network embeddings. The model standalone achieves an Area Under the Receiver Operating Characteristic Curve (AUC) of more than 80 percent. The output probability from this news model, used as an additional feature, improves the performance of our benchmark model using only quantitative measures by more than 5 percent in terms of both AUC and recall rate. A qualitative evaluation also indicates that news articles related to our predicted downgrade events are especially relevant and of high quality in our business context.



Optimal market making under partial information with general intensities
Diego Zabaljauregui, Luciano Campi
arXiv

Starting from the Avellaneda-Stoikov framework, we consider a market maker who wants to optimally set bid/ask quotes over a finite time horizon, to maximize her expected utility. The intensities of the orders she receives depend not only on the spreads she quotes, but also on unobservable factors modelled by a hidden Markov chain. We tackle this stochastic control problem under partial information with a model that unifies and generalizes many existing ones under full information, combining several risk metrics and constraints, and using general decreasing intensity functionals. We use stochastic filtering, stochastic control, and the theory of piecewise-deterministic Markov processes to reduce the dimensionality of the problem and characterize the reduced value function as the unique continuous viscosity solution of its dynamic programming equation. We then solve the analogous full information problem and compare the results numerically through a concrete example. We show that the optimal full information spreads are biased when the exact market regime is unknown, and that the market maker needs to adjust for additional regime uncertainty in terms of P&L sensitivity and observed order flow volatility. This effect grows with the waiting time between orders.



The End of Leisure and Retirement, Covid-19: Innovations, Jobs, pensions, and Keynes: Guaranteed Income or Future Poverty and Redundancy?
Caldararo, Niccolo Leo
SSRN
The history of the support of the aged by society is discussed in cross-cultural and historical context. Various cultural traditions are compared with the forms developed in complex societies from ancient Egypt, Greece and Rome, to China, the Aztec, Inca and Maya, to those of religious organizations, or those developed under different modern ideological systems like capitalism and communism as well as social democratic nations. It is found that the way a society values the aged and views their contribution to society largely determines its willingness to provide for their support. An increasing number of companies have gone bankrupt in recent years following the 2007 credit crisis and stock market collapse. More have raided their pension funds to stay afloat or have closed them and transferred liability to the federal Pension Benefit Guaranty Corporation. Major changes to federal law concerning pensions and the responsibility of corporations to fund them have been made under the Pension Protection Act of 2006. Worldwide, workers' retirement payments are under assault, as are investments by pension funds, due to laws governing priority of payment in different countries concerning stockholders vs. bondholders and liability for pension funds. The need for retirement of some kind in the post-Covid-19 world will require new forms as well as recovery of pre-Covid-19 savings and investments. Changes in the law are proposed to increase the stability of pensions and the reliability of pension payments to workers.

The socio-economic determinants of the coronavirus disease (COVID-19) pandemic
Viktor Stojkoski, Zoran Utkovski, Petar Jolakoski, Dragan Tevdovski, Ljupco Kocarev
arXiv

The magnitude of the coronavirus disease (COVID-19) pandemic has an enormous impact on social life and economic activity in almost every country in the world. Besides the biological and epidemiological factors, a multitude of social and economic criteria also govern the extent of the coronavirus disease spread in the population. Consequently, there is an active debate regarding the critical socio-economic determinants that contribute to the resulting pandemic. In this paper, we contribute towards the resolution of this debate by leveraging Bayesian model averaging techniques and country-level data to investigate the potential of 35 determinants, describing a diverse set of socio-economic characteristics, in explaining the coronavirus pandemic outcome.
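Bayesian model averaging over candidate determinants can be approximated with BIC-based model weights; a small self-contained sketch (illustrative and far simpler than the paper's 35-determinant setup):

```python
import numpy as np
from itertools import combinations

def bic(y, X):
    """Bayesian information criterion of an OLS fit with design matrix X."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

def bma_inclusion_probs(y, X):
    """Approximate posterior inclusion probability of each regressor by
    weighting every subset model (plus intercept) with exp(-BIC/2)."""
    n, p = X.shape
    models, bics = [], []
    for r in range(p + 1):
        for subset in combinations(range(p), r):
            Xm = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            models.append(set(subset))
            bics.append(bic(y, Xm))
    bics = np.array(bics)
    w = np.exp(-0.5 * (bics - bics.min()))
    w /= w.sum()
    return np.array([sum(w[i] for i, m in enumerate(models) if j in m)
                     for j in range(p)])
```

A determinant with inclusion probability near one is robustly supported across model specifications, which is the kind of evidence the paper reports for its socio-economic factors.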



This time is indeed different: A study on global market reactions to public health crisis
Schell, Daniel; Wang, Mei; Huynh, Toan Luu Duc
SSRN
This paper studies the differences in stock market reactions to the same kind of disease-related news by analyzing abnormal returns of global stock markets around Public Health Emergency of International Concern (PHEIC) announcements. Drawing on data from 26 stock market indices over the period from 22 April 2008 to 12 March 2020, we compare stock market reactions to all six PHEIC announcements made by the World Health Organization since 2009. Although the PHEIC announcements can be categorized as the same type of event, we find no consistent patterns in the market reactions to them. The markets did not show significant reactions in a 30-day event window, which suggests a relatively low economic impact of the diseases on a global scale, except for Covid-19. Among all diseases included in our study, only Covid-19 had a significant negative effect on stock markets lasting at least 30 days.
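The abnormal-return methodology can be sketched with a standard market-model event study; a minimal illustration with assumed window lengths, not the paper's exact design:

```python
import numpy as np

def market_model_car(stock, market, event_idx, est_len=100, window=30):
    """Estimate alpha/beta on a pre-event estimation window, then cumulate
    abnormal returns (actual minus model-predicted) over the event window."""
    est = slice(event_idx - est_len, event_idx)
    X = np.column_stack([np.ones(est_len), market[est]])
    (alpha, beta), *_ = np.linalg.lstsq(X, stock[est], rcond=None)
    ev = slice(event_idx, event_idx + window)
    abnormal = stock[ev] - (alpha + beta * market[ev])
    return abnormal.sum()
```

Applying this around each PHEIC announcement date, per index, gives the cumulative abnormal returns whose (in)significance the paper tests.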

What do Index Options Teach us About Covid-19?
Jackwerth, Jens Carsten
SSRN
The Covid-19 crisis shows up in option-implied risk-neutral distributions rather late. As of January 17, 2020, markets were still impervious to the crisis. February 21, 2020 saw a slight increase in volatility, and only on March 20, 2020, is the full impact visible. The shift is more moderate at short maturities (out to June 19 and December 18, 2020). It turns severe for the three-year maturity out to December 16, 2022, with a pronounced bimodality showing a sizeable crash scenario. The corresponding physical distribution instead features a symmetric high-volatility scenario.
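Recovering a risk-neutral density from index option prices rests on the Breeden-Litzenberger relation $q(K) = e^{rT}\,\partial^2 C/\partial K^2$; a minimal finite-difference sketch on synthetic prices (illustrative only, not the paper's estimator):

```python
import numpy as np

def risk_neutral_density(strikes, calls, r=0.0, T=1.0):
    """Breeden-Litzenberger: density = e^{rT} * second derivative of the call
    price in strike, via central differences on a uniform strike grid."""
    dk = strikes[1] - strikes[0]
    d2 = (calls[2:] - 2.0 * calls[1:-1] + calls[:-2]) / dk ** 2
    return strikes[1:-1], np.exp(r * T) * d2
```

Plotting this density at successive dates and maturities is how features like the bimodal crash scenario described above become visible.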