# Research articles for 2020-12-21

An Empirical Evaluation On The Effectiveness Of Medicaid Expansion Across 49 States
Tung Yu Marco Chan
arXiv

In 2014, the Medicaid expansion introduced by the Patient Protection and Affordable Care Act (ACA) took effect, allowing states to opt to expand eligibility for those in need of free health insurance. In this paper, we assess the effectiveness of Medicaid expansion on the health outcomes of state populations, using Difference-in-Difference (DD) regressions to identify the causal impact of expanding Medicaid on health outcomes in 49 states. We find that over the 2013 to 2016 time frame, Medicaid expansion appears to have had no significant impact on the health outcomes of the states that chose to expand.
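
The Difference-in-Differences idea behind the paper's identification strategy can be illustrated with a minimal numpy sketch on synthetic panel data (all numbers and effect sizes below are invented for illustration and are not from the paper): the treatment effect is recovered as the change in the treated group minus the change in the control group.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000  # observations per (group, period) cell

# Synthetic panel: outcome = baseline + group effect + period effect
# + treatment effect (here 2.0) for the treated group after expansion, plus noise.
effect = 2.0

def cell(group_shift, period_shift, treated_post):
    return 10.0 + group_shift + period_shift + effect * treated_post + rng.normal(0, 1, n)

treated_pre  = cell(1.5, 0.0, 0)
treated_post = cell(1.5, 0.7, 1)
control_pre  = cell(0.0, 0.0, 0)
control_post = cell(0.0, 0.7, 0)

# Difference-in-differences: change in the treated group minus change in controls.
# Group and period effects cancel, leaving only the treatment effect.
did = (treated_post.mean() - treated_pre.mean()) - (control_post.mean() - control_pre.mean())
print(round(did, 2))
```

Because the common period effect (0.7) and the fixed group gap (1.5) both difference out, the estimate lands near the true effect of 2.0 regardless of their values.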

Association between COVID-19 cases and international equity indices
Nick James, Max Menzies
arXiv

This paper analyzes the impact of COVID-19 on the populations and equity markets of 92 countries. We compare country-by-country equity market dynamics to cumulative COVID-19 case and death counts and to new case trajectories. First, we examine the multivariate time series of cumulative cases and deaths, particularly their changing structure over time. We reveal similarities between the case and death time series, and identify key dates on which the structure of the time series changed. Next, we classify new case time series, demonstrate five characteristic classes of trajectories, and quantify the discrepancy between them with respect to the behavior of waves of the disease. Finally, we show there is no relationship between countries' equity market performance and their success in managing COVID-19. Each country's equity index has been unresponsive to the domestic or global state of the pandemic. Instead, these indices have been highly uniform, with most movement in March.

Censored EM algorithm for Weibull mixtures: application to arrival times of market orders
Markus Kreer, Ayse Kizilersu, Anthony W. Thomas
arXiv

In a previous analysis the problem of "zero-inflated" time data (caused by high-frequency trading in the electronic order book) was handled by left-truncating the inter-arrival times. We demonstrated, using rigorous statistical methods, that the Weibull distribution describes the corresponding stochastic dynamics for all inter-arrival time differences except in the region near zero. However, since the truncated Weibull distribution was not able to describe the huge "zero-inflated" probability mass in the neighbourhood of zero (making up approximately 50% of the data for limit orders), it became clear that the entire probability distribution is a mixture distribution of which the Weibull distribution is a significant part. Here we use a censored EM algorithm to analyse data for the difference of the arrival times of market orders, which usually have a much lower percentage of zero inflation, for four selected stocks trading on the London Stock Exchange.
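
As a simplified, hypothetical stand-in for the paper's censored EM algorithm, the sketch below runs a plain (uncensored) EM on a two-component exponential mixture, an exponential being a Weibull with shape parameter 1, so that the M-step stays closed-form; the sample sizes and parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic inter-arrival times: two-component exponential mixture
# with true means 0.5 and 5.0 and true weights 0.6 and 0.4.
x = np.concatenate([rng.exponential(0.5, 6000), rng.exponential(5.0, 4000)])

# EM iterations for mixture weights w and component means mu.
w, mu = np.array([0.5, 0.5]), np.array([0.2, 2.0])
for _ in range(200):
    # E-step: posterior responsibility of each component for each observation.
    dens = (w / mu) * np.exp(-x[:, None] / mu)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: closed-form updates (average responsibilities, weighted means).
    w = r.mean(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)

print(np.round(w, 2), np.round(mu, 2))
```

The paper's setting additionally handles censoring in the E-step and fits full Weibull components (whose M-step needs a numerical solve); this sketch only conveys the alternating responsibility/update structure.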

Forecasting day-ahead electricity prices: A review of state-of-the-art algorithms, best practices and an open-access benchmark
Jesus Lago, Grzegorz Marcjasz, Bart De Schutter, Rafał Weron
arXiv

While the field of electricity price forecasting has benefited from plenty of contributions in the last two decades, it arguably lacks a rigorous approach to evaluating new predictive algorithms. The latter are often compared using unique, not publicly available datasets, and across test samples that are too short and limited to a single market. The proposed new methods are rarely benchmarked against well-established and well-performing simpler models, the accuracy metrics are sometimes inadequate, and the significance of differences in predictive performance is seldom tested. Consequently, it is not clear which methods perform well, nor what the best practices are when forecasting electricity prices. In this paper, we tackle these issues by performing a literature survey of state-of-the-art models, comparing state-of-the-art statistical and deep learning methods across multiple years and markets, and putting forward a set of best practices. In addition, we make available the considered datasets, the forecasts of the state-of-the-art models, and a specifically designed Python toolbox, so that new algorithms can be rigorously evaluated in future studies.
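
One standard way to test the significance of differences in predictive performance, which the abstract argues is too seldom done, is the Diebold-Mariano test; a minimal sketch on synthetic day-ahead prices (the series, forecasters, and noise levels are invented for illustration):

```python
import math
import numpy as np

rng = np.random.default_rng(2)
T = 730  # two years of daily day-ahead prices

# Synthetic prices with annual seasonality; two forecasters of different quality.
truth = 40 + 10 * np.sin(2 * np.pi * np.arange(T) / 365) + rng.normal(0, 3, T)
f_a = truth + rng.normal(0, 1.0, T)   # better model: smaller forecast noise
f_b = truth + rng.normal(0, 2.0, T)   # worse model

# Diebold-Mariano statistic on the loss differential (absolute errors):
# negative d favours model A; under the null of equal accuracy, DM ~ N(0, 1).
d = np.abs(truth - f_a) - np.abs(truth - f_b)
dm = d.mean() / math.sqrt(d.var(ddof=1) / T)
p = 2 * (1 - 0.5 * (1 + math.erf(abs(dm) / math.sqrt(2))))
print(f"DM = {dm:.2f}, p = {p:.4f}")
```

This plain version ignores autocorrelation in the loss differential; multi-step-ahead forecasts would call for a long-run variance estimate in the denominator.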

High-frequency dynamics of the implied volatility surface
Bastien Baldacci
arXiv

We present a Hawkes model of the volatility surface's high-frequency dynamics and show how the Hawkes kernel coefficients govern the surface's skew and convexity. We provide simple sufficient conditions on the coefficients that ensure the absence of arbitrage opportunities on the surface. Moreover, these conditions reduce the number of kernel parameters to estimate. Finally, we show that at the macroscopic level, the surface is driven by a sum of risk factors whose volatility processes are rough.
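
For readers unfamiliar with Hawkes processes, the sketch below simulates a univariate Hawkes process with an exponential kernel via Ogata's thinning algorithm; the parameters are invented for illustration, and the branching-ratio condition alpha/beta < 1 is only an analogue of the kind of restriction the paper places on its kernel coefficients (its actual no-arbitrage conditions are more involved).

```python
import numpy as np

rng = np.random.default_rng(3)
mu, alpha, beta = 1.0, 0.8, 2.0   # baseline intensity, excitation jump, decay rate
assert alpha / beta < 1           # branching ratio < 1: the process does not explode

# Ogata's thinning algorithm, using the Markov property of the exponential
# kernel to update the excited part of the intensity recursively.
t, T, events = 0.0, 2000.0, []
excess = 0.0                      # excited intensity just after time t
while True:
    lam_bar = mu + excess         # upper bound: intensity only decays between events
    w = rng.exponential(1.0 / lam_bar)
    excess *= np.exp(-beta * w)   # decay over the candidate waiting time
    t += w
    if t > T:
        break
    if rng.uniform() <= (mu + excess) / lam_bar:
        events.append(t)          # accepted event
        excess += alpha           # self-excitation kicks in

rate = len(events) / T
print(round(rate, 2))             # theory: mu / (1 - alpha/beta) ~ 1.67
```

The empirical event rate matches the stationary intensity mu / (1 - alpha/beta) implied by the branching-ratio condition, which is what fails when the kernel coefficients are too large.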

Instabilities in Multi-Asset and Multi-Agent Market Impact Games
Francesco Cordoni,Fabrizio Lillo
arXiv

We consider the general problem of a set of agents trading a portfolio of assets in the presence of transient price impact and additional quadratic transaction costs, and we study, with analytical and numerical methods, the resulting Nash equilibria. Significantly extending the framework of Schied and Zhang (2018) and Luo and Schied (2020), who considered the one-asset case, we focus on the conditions on the transaction cost under which the agents' trading profiles, and as a consequence the price trajectory, oscillate wildly and the market becomes unstable. We prove the existence and uniqueness of the corresponding Nash equilibria for the related mean-variance optimization problem. We find that the presence of more assets and a large number of agents make the market more prone to large oscillations and instability. When the number of assets is fixed, a more complex structure of the cross-impact matrix, i.e. the existence of multiple liquidity factors, makes the market less stable than when a single liquidity factor exists.

Kicking You When You're Already Down: The Multipronged Impact of Austerity on Crime
arXiv

The UK Welfare Reform Act 2012 imposed a series of welfare cuts, which disproportionately impacted ex-ante poorer areas. In this paper, we consider the impact of these austerity measures on two different but complementary elements of crime -- the crime rate and the less-studied concentration of crime -- over the period 2011-2015 in England and Wales, and document four new facts. First, areas more exposed to the welfare reforms experience increased levels of crime, an effect driven by a rise in violent crime. Second, both violent and property crime become more concentrated within an area due to the welfare reforms. Third, it is ex-ante more deprived neighborhoods that bear the brunt of the crime increases over this period. Fourth, we find no evidence that the welfare reforms increased recidivism, suggesting that the changes in crime we find are likely driven by new criminals. Combining these results, we document unambiguous evidence of a negative spillover of the welfare reforms at the heart of the UK government's austerity program on social welfare, which reinforced the direct inequality-worsening effect of this program. More deprived districts are more exposed to the welfare reforms, and it is these districts that then experience the further negative consequences of the reforms via increased crime. Our findings underscore the importance of considering both multiple dimensions of crime and different levels of spatial aggregation of crime data. Given that it is violent crime that responds to the (economically-based) welfare cuts, our work also highlights the need to develop better economic models of non-rational crime.

Levelling Down and the COVID-19 Lockdowns: Uneven Regional Recovery in UK Consumer Spending
John Gathergood, Fabian Gunzinger, Benedict Guttman-Kenney, Edika Quispe-Torreblanca, Neil Stewart
arXiv

We show that the recovery in consumer spending in the United Kingdom through the second half of 2020 is unevenly distributed across regions. We utilise Fable Data: a real-time source of consumption data that is a highly correlated, leading indicator of Bank of England and Office for National Statistics data. The UK's recovery is heavily weighted towards the "home counties" around outer London and the South. We observe a stark contrast between strong online spending growth and contracting offline spending. The strongest recovery is seen in online spending in the "commuter belt" areas in outer London and the surrounding localities, and also in areas of high second home ownership, where working from home (including working from second homes) has significantly displaced the location of spending. Year-on-year spending growth in November 2020 in localities facing the UK's new, tighter "Tier 3" restrictions (mostly the Midlands and northern areas) was 38.4% lower than in areas facing the less restrictive "Tier 2" (mostly London and the South). These patterns were further exacerbated during November 2020, when a second national lockdown was imposed. To prevent such COVID-19-driven regional inequalities from becoming persistent, we propose that governments introduce temporary, regionally-targeted interventions in 2021. The availability of real-time, regional data enables policymakers to efficiently decide when, where and how to implement such regional interventions, and to rapidly evaluate their effectiveness in order to decide whether to expand, modify or remove them.

National Accounts as a Stock-Flow Consistent System, Part 1: The Real Accounts
Matti Estola, Kristian Vepsäläinen
arXiv

The 2008 economic crisis could not be forecast with the macroeconomic models existing at that time; thus macroeconomics needs new tools. We introduce a model based on National Accounts that shows how macroeconomic sectors are interconnected. These connections explain the spread of business cycles from one industry to another and from the financial sector to the real economy; such linkages cannot be explained by General Equilibrium types of models. Our model describes the real part of the National Accounts (NA) of an economy. The accounts are presented in the form of a money-flow diagram between the following macro-sectors: non-financial firms, financial firms, households, government, and the rest of the world. The model contains all the main items in NA, and the corresponding simulation model creates time paths for 59 key macroeconomic quantities for an unlimited future. Finnish NA data from the period 1975-2012 are used to calibrate the parameters of the model, and the model follows the historical data with sufficient accuracy. Our study serves as a basis for systems-analytic macro-models that can explain the positive and negative feedbacks in the production system of an economy. These feedbacks arise from interactions between economic units and between real and financial markets. JEL E01, E10.

Key words: Stock-Flow Models, National Accounts, Simulation model.

On the difference between the volatility swap strike and the zero vanna implied volatility
Elisa Alos, Frido Rolloos, Kenichiro Shiraya
arXiv

In this paper, Malliavin calculus is applied to arrive at exact formulas for the difference between the volatility swap strike and the zero vanna implied volatility for volatilities driven by fractional noise. To the best of our knowledge, this is the first rigorous derivation of the relationship between the zero vanna implied volatility and the volatility swap strike. In particular, we will see that the zero vanna implied volatility is a better approximation of the volatility swap strike than the at-the-money implied volatility (ATMI).

Optimal ratcheting of dividends in a Brownian risk model
Hansjoerg Albrecher, Pablo Azcue, Nora Muler
arXiv

We study the problem of optimal dividend payout from a surplus process governed by Brownian motion with drift under the additional constraint of ratcheting, i.e. the dividend rate can never decrease. We solve the resulting two-dimensional optimal control problem, identifying the value function as the unique viscosity solution of the corresponding Hamilton-Jacobi-Bellman equation. For finitely many admissible dividend rates we prove that threshold strategies are optimal, and for any finite continuum of admissible dividend rates we establish the $\varepsilon$-optimality of curve strategies. This work is a counterpart of Albrecher et al. (2020), where the ratcheting problem was studied for a compound Poisson surplus process with drift. In the present Brownian setup, calculus-of-variations techniques allow us to obtain a much more explicit analysis and description of the optimal dividend strategies. We also give some numerical illustrations of the optimality results.

Pension Benefits, Retirement and Human Capital Depreciation in Late Adulthood
arXiv

Economists have mainly focused on human capital accumulation and considerably less on the causes and consequences of human capital depreciation in late adulthood. Yet human capital depreciation over the life cycle has powerful economic consequences for decision-making in old age. Using data from China, we examine how a new retirement program affects cognitive performance. We find large negative effects of pension benefits on cognitive functioning among the elderly, and we detect the most substantial impact of the program on delayed recall, a significant predictor of the onset of dementia. We show suggestive evidence that the program leads to larger negative impacts among women. We demonstrate that retirement and access to a retirement pension plan play a significant role in explaining cognitive decline at older ages.

Power mixture forward performance processes
Levon Avanesyan, Ronnie Sircar
arXiv

We consider the forward investment problem in market models where the stock prices are continuous semimartingales adapted to a Brownian filtration. We construct a broad class of forward performance processes with initial conditions of power mixture type, $u(x) = \int_{\mathbb{I}} \frac{x^{1-\gamma}}{1-\gamma }\nu(\mathrm{d} \gamma)$. We proceed to define and fully characterize two-power mixture forward performance processes with constant risk aversion coefficients in the interval $(0,1)$, and derive properties of two-power mixture forward performance processes when the risk aversion coefficients are continuous stochastic processes. Finally, we discuss the problem of managing an investment pool of two investors, whose respective preferences evolve as power forward performance processes.

Principal Component Analysis and Factor Analysis for Feature Selection in Credit Rating
Shenghuan Yang, Ionut Florescu, Md Tariqul Islam
arXiv

A credit rating is an evaluation of a company's credit risk that assesses its ability to pay back debt and predicts the likelihood of the debtor defaulting. Many features influence a credit rating, so it is essential to select the substantive ones to explore the main drivers of credit rating changes. To address this issue, this paper exploited Principal Component Analysis and Factor Analysis as feature selection algorithms to select important features, group similar features together, and obtain a minimum set of features for four sectors: Financial, Energy, Health Care, and Consumer Discretionary. The paper used two data sets, Financial Ratio and Balance Sheet, with two mappings, Detailed Mapping and Coarse Mapping, converting the target variable (credit rating) into a categorical variable. To test the accuracy of credit rating prediction, a Random Forest Classifier was trained and tested on the feature sets. The results showed that the accuracy of the Financial Ratio feature sets was higher than that of the Balance Sheet feature sets. In addition, Factor Analysis can reduce the number of features significantly while obtaining almost the same accuracy, dramatically decreasing the time spent analyzing the data; using Factor Analysis, we also summarized seven and ten dominant factors affecting credit rating changes in the Financial Ratio and Balance Sheet data sets, respectively, which better explain the reasons for credit rating changes.
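
The dimensionality-reduction step can be sketched with a numpy-only PCA on synthetic data; the feature counts, latent-factor structure, and 95% variance threshold below are invented for illustration and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
n, latent, observed = 500, 3, 12

# Synthetic "financial ratio" matrix: 12 observed features driven by 3 latent
# factors plus small noise, mimicking the redundancy PCA is meant to compress.
F = rng.normal(size=(n, latent))
loadings = rng.normal(size=(latent, observed))
X = F @ loadings + 0.1 * rng.normal(size=(n, observed))

# PCA via SVD of the centred data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()

# Keep the smallest number of components reaching 95% explained variance,
# then project the data onto them to get the reduced feature set.
k = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)
scores = Xc @ Vt[:k].T
print(k, scores.shape)
```

Because only 3 latent factors generate the 12 observed features, a handful of components capture nearly all the variance, which is the effect the paper exploits to shrink its feature sets.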

Stability of martingale optimal transport and weak optimal transport
Julio Backhoff-Veraguas, Gudmund Pammer
arXiv

Under mild regularity assumptions, the transport problem is stable in the following sense: if a sequence of optimal transport plans $\pi_1, \pi_2, \ldots$ converges weakly to a transport plan $\pi$, then $\pi$ is also optimal (between its marginals).

Alfonsi, Corbetta and Jourdain asked whether the same property is true for the martingale transport problem. This question seems particularly pressing since martingale transport is motivated by robust finance, where data is naturally noisy. On a technical level, stability in the martingale case appears more intricate than for classical transport since optimal transport plans $\pi$ are not characterized by a "monotonicity" property of their support.

In this paper we give a positive answer and establish stability of the martingale transport problem. As a particular case, this recovers the stability of the left curtain coupling established by Juillet. An important auxiliary tool is an unconventional topology which takes the temporal structure of martingales into account. Our techniques also apply to the weak transport problem introduced by Gozlan, Roberto, Samson and Tetali.

Tail Risks, Investment Horizons, and Asset Prices
Jozef Baruník, Matěj Nevrla
arXiv

We show that two important sources of risk -- market tail risk and extreme market volatility risk -- are priced in the cross-section of asset returns heterogeneously across horizons. Specifically, we find that tail risk is a short-term phenomenon, whereas extreme volatility risk is priced by investors in the long term. These risks stem from dependence structures in the joint distribution of the stochastic discount factor and asset returns at various investment horizons that are more general than those usually assumed by traditional covariance-based measures. The risk premium we document suggests that investors care about transitory as well as persistent shocks.

The Solution of the Equity Premium Puzzle
Atilla Aras
arXiv

In this paper, a solution to the equity premium puzzle is given. First, the Arrow-Pratt measure of relative risk aversion as a means of detecting the risk behavior of investors is questioned, and a new tool is developed to study investors' risk behavior. This new tool, embedded in the newly formulated model, is then tested as a solution to the equity premium puzzle. The calculations with this model show that the value of the coefficient of relative risk aversion is 1.033526, assuming a subjective time discount factor of 0.99. Since these values are compatible with existing empirical studies, they confirm the validity of the newly derived model that provides a solution to the equity premium puzzle.