# Research articles for 2021-01-17

arXiv

A new realized conditional autoregressive Value-at-Risk (VaR) framework is proposed by incorporating a measurement equation into the original quantile regression model. The framework is further extended by employing various Expected Shortfall (ES) components to jointly estimate and forecast VaR and ES. The measurement equation models the contemporaneous dependence between the realized measure (i.e., Realized Variance and Realized Range) and the latent conditional ES. An adaptive Bayesian Markov Chain Monte Carlo method is employed for estimation and forecasting; its properties are assessed and compared with maximum likelihood in a simulation study. In a comprehensive forecasting study at the 1% and 2.5% quantile levels, the proposed models are compared with a range of parametric, non-parametric and semi-parametric models on 7 market indices and 7 individual assets. One-day-ahead VaR and ES forecasting results favor the proposed models, especially when the sub-sampled Realized Variance and the sub-sampled Realized Range are incorporated in the model.
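
As background on how such conditional quantile models are fitted, the VaR component minimizes the quantile check (pinball) loss. The sketch below is a generic numpy illustration of that loss on simulated returns, not the paper's realized framework or its adaptive Bayesian estimator.

```python
import numpy as np

def pinball_loss(returns, var_forecasts, alpha):
    """Quantile (check) loss minimized by quantile-regression VaR models.
    `alpha` is the quantile level, e.g. 0.01 for 1% VaR."""
    u = returns - var_forecasts              # forecast errors
    return float(np.mean((alpha - (u < 0)) * u))

rng = np.random.default_rng(0)
r = rng.standard_normal(10_000)              # toy return series
alpha = 0.01
# The loss is minimized, in expectation, at the true alpha-quantile:
good = pinball_loss(r, np.full_like(r, np.quantile(r, alpha)), alpha)
bad = pinball_loss(r, np.zeros_like(r), alpha)
print(good < bad)                            # → True
```

Because the loss penalizes exceedances asymmetrically (weight `1 - alpha` below the forecast, `alpha` above), comparing it across models is also how the forecasting study above ranks VaR forecasts.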

arXiv

This paper studies a nonzero-sum Dynkin game in discrete time under non-exponential discounting. For both players, there are two levels of game-theoretic reasoning intertwined. First, each player looks for an intra-personal equilibrium among her current and future selves, so as to resolve time inconsistency triggered by non-exponential discounting. Next, given the other player's chosen stopping policy, each player selects a best response among her intra-personal equilibria. A resulting inter-personal equilibrium is then a Nash equilibrium between the two players, each of whom employs her best intra-personal equilibrium with respect to the other player's stopping policy. Under appropriate conditions, we show that an inter-personal equilibrium exists, based on concrete iterative procedures along with Zorn's lemma. To illustrate our theoretical results, we investigate a two-player real options valuation problem: two firms negotiate a deal of cooperation to initiate a project jointly. By deriving inter-personal equilibria explicitly, we find that coercive power in negotiation depends crucially on the impatience levels of the two firms.

SSRN

We exploit geographic variation in the exposure of US banks to COVID-19 and lockdown policies to document the impact of the pandemic and consequent economic crisis on banks. Combining county-level data on COVID-19 and lockdown policies with bank-level data on loan performance and lending growth, and syndicated loan data, we document that banks geographically more exposed to the pandemic and lockdown policies show (i) an increase in loan loss provisions and non-performing loans, (ii) an increase in lending to small businesses, but not in other lending categories, and (iii) an increase in interest spreads and decrease in loan maturities. These findings show that banks have already seen the negative impact of the pandemic and have reacted to higher lending risk with an adjustment in loan conditionality, but have also responded to higher loan demand and government support programmes.

arXiv

We investigate the optimal portfolio deleveraging (OPD) problem with permanent and temporary price impacts, where the objective is to maximize equity while meeting a prescribed debt/equity requirement. We take into account the realistic situation of cross-impact among different assets. The resulting problem is, however, a non-convex quadratic program with a quadratic constraint and a box constraint, which is known to be NP-hard. In this paper, we first develop a successive convex optimization (SCO) approach for solving the OPD problem and show that the SCO algorithm converges to a KKT point of its transformed problem. Second, we propose an effective global algorithm for the OPD problem, which integrates the SCO method, simple convex relaxation and a branch-and-bound framework, to identify a globally optimal solution to the OPD problem within a pre-specified $\epsilon$-tolerance. We establish the global convergence of our algorithm and estimate its complexity. We also conduct numerical experiments to demonstrate the effectiveness of the proposed algorithms on both real data and randomly generated medium- and large-scale OPD problem instances.

arXiv

In the context of traditional life insurance, the future discretionary benefits ($FDB$), a central item in Solvency II reporting, are generally calculated by computationally expensive Monte Carlo algorithms. We derive analytic formulas for lower and upper bounds on the $FDB$. This yields an estimation interval for the $FDB$, and the average of the lower and upper bounds is a simple estimator. These formulas are designed for real-world applications, and we compare the results to publicly available reporting data.

arXiv

This paper applies a recurrent neural network (RNN) method to forecast cotton and oil prices. We show how these new tools from machine learning, particularly Long Short-Term Memory (LSTM) models, complement traditional methods. Our results show that machine learning methods fit the data reasonably well but do not systematically outperform classical methods such as Autoregressive Integrated Moving Average (ARIMA) models in terms of out-of-sample forecasts. However, averaging the forecasts from the two types of models provides better results than either method alone. Compared to the ARIMA and the LSTM, the Root Mean Squared Error (RMSE) of the average forecast was 0.21 and 21.49 percent lower, respectively, for cotton. For oil, forecast averaging does not provide improvements in terms of RMSE. We suggest using a forecast averaging method and extending our analysis to a wide range of commodity prices.
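
The benefit of averaging two forecasters with roughly independent errors can be seen in a toy sketch (hypothetical forecasts standing in for ARIMA and LSTM; none of the paper's data or fitted models):

```python
import numpy as np

def rmse(actual, forecast):
    return float(np.sqrt(np.mean((actual - forecast) ** 2)))

rng = np.random.default_rng(42)
actual = np.cumsum(rng.standard_normal(200))     # toy price path
# Two hypothetical forecasters with independent, equally sized errors:
f_a = actual + rng.normal(0.0, 1.0, 200)         # stand-in for ARIMA
f_b = actual + rng.normal(0.0, 1.0, 200)         # stand-in for LSTM
f_avg = 0.5 * (f_a + f_b)                        # simple forecast average
print(rmse(actual, f_avg) < min(rmse(actual, f_a), rmse(actual, f_b)))
```

With independent errors the averaged error has half the variance of either individual error, so the combined RMSE is roughly 30% lower here; when the two models' errors are strongly correlated (plausibly the oil case above), the gain shrinks or vanishes.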

arXiv

The aim of this paper is to quantify and manage systemic risk caused by default contagion in the interbank market. We model the market as a random directed network, where the vertices represent financial institutions and the weighted edges monetary exposures between them. Our model captures the strong degree of heterogeneity observed in empirical data, and its parameters can easily be fitted to real data sets. One of our main results allows us to determine the impact of local shocks, in which some banks initially default, on the entire system and the wider economy. Here the impact is measured by an index of the total systemic importance of all eventually defaulted institutions. As a central application, we characterize resilient and non-resilient cases. In particular, for the prominent case where the network has a degree sequence without a second moment, we show that a small number of initially defaulted banks can trigger a substantial default cascade. Our results complement and significantly extend earlier findings derived in the configuration model, where the existence of a second moment of the degree distribution is assumed. As a second main contribution, paralleling regulatory discussions, we determine minimal capital requirements for financial institutions sufficient to make the network resilient to small shocks. An appealing feature of these capital requirements is that they can be determined locally by each institution without knowing the complete network structure, since they essentially depend only on the institution's exposures to its counterparties.
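
The threshold cascade the abstract refers to can be sketched on a fixed exposure matrix (a hypothetical four-bank example, not the paper's random-graph model or its systemic-importance index):

```python
import numpy as np

def default_cascade(exposures, capital, initially_defaulted):
    """Iterate a default cascade: a bank defaults once its losses from
    defaulted counterparties exceed its capital. exposures[i, j] is the
    amount bank i loses if bank j defaults."""
    defaulted = np.zeros(len(capital), dtype=bool)
    defaulted[initially_defaulted] = True
    while True:
        losses = exposures @ defaulted           # each bank's current loss
        newly = (losses > capital) & ~defaulted
        if not newly.any():
            return defaulted
        defaulted |= newly

# Hypothetical example: bank 0 fails, dragging down banks 1 and 2,
# while the diversified, better-capitalized bank 3 survives.
E = np.array([[0, 0, 0, 0],
              [5, 0, 0, 0],    # bank 1 concentrated on bank 0
              [0, 4, 0, 0],    # bank 2 concentrated on bank 1
              [1, 1, 1, 0]])   # bank 3 diversified
c = np.array([2.0, 2.0, 2.0, 5.0])
print(default_cascade(E, c, [0]))    # → [ True  True  True False]
```

Roughly speaking, a "local" capital requirement in the spirit of the abstract amounts to choosing each `capital[i]` as a function of row `i` of the exposure matrix alone, so that such cascades die out.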

SSRN

This paper proposes a general statistical framework for systemic financial stress indexes. Several existing index designs can be represented as special cases. We introduce a daily variant of the ECB's CISS for the euro area and the US. The CISS aggregates a representative set of stress indicators using their time-varying cross-correlations as systemic weights, much as portfolio risk is computed from the risk characteristics of individual assets. A bootstrap algorithm delivers test statistics. A linear VAR shows that the Great Recession was mainly driven by CISS shocks, while their contribution to the COVID-19 crisis appears limited. A quantile VAR suggests particularly strong real effects of financial stress in the worst states of the economy.
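
The portfolio analogy in the aggregation step can be sketched as follows (schematic weights, stress levels, and correlation matrices; not the ECB's actual CISS inputs or its recursive correlation estimates):

```python
import numpy as np

def stress_index(indicators, weights, corr):
    """Portfolio-style aggregation: weighted stress indicators combined
    through their cross-correlation matrix, analogous to computing
    portfolio risk from individual asset risks."""
    s = weights * indicators
    return float(s @ corr @ s)

w = np.full(3, 1 / 3)                      # equal subindex weights
x = np.array([0.8, 0.7, 0.9])              # hypothetical stress levels in [0, 1]
calm = np.eye(3)                           # market segments uncorrelated
crisis = np.full((3, 3), 0.9)
np.fill_diagonal(crisis, 1.0)              # stress co-moves across segments
print(stress_index(x, w, calm) < stress_index(x, w, crisis))   # → True
```

The same indicator levels thus produce a much higher index reading when cross-correlations rise, which is what makes the index "systemic".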

arXiv

Using representative migration survey data from the Indian state of Kerala, this paper assesses the impact of transnational migration on social signaling through the consumption of visible goods. Using plausibly exogenous variation in migration networks in the neighborhood and religious communities to account for potential endogeneity, we find significant positive effects on conspicuous consumption. In terms of mechanisms, we put forward three possible channels. While we are unable to rule out associated changes in preferences driving up spending on status goods, we observe only modest effects of peer-group spending due to higher status competition. A key channel, which we propose through a theoretical framework, is a potential information gap among permanent residents about the income levels of a migrant. This, we argue, can be leveraged by migrants to increase visible consumption and gain higher status in society.

arXiv

Understanding the emergence of universal features such as the stylized facts in markets is a long-standing challenge that has drawn much attention from economists and physicists. Most existing models, such as stochastic volatility models, focus mainly on price changes, neglecting the complex trading dynamics. Recently, order books have attracted increasing attention, thanks to the availability of large-scale trading datasets, with the aim of understanding the underlying mechanisms governing market dynamics. In this paper, we collect order-book datasets of Bitcoin platforms across three countries, covering millions of users and billions in daily turnover. We find that a (1+1)-dimensional field theory, governed by a set of KPZ-like stochastic equations, precisely predicts the order-book dynamics observed in empirical data. Despite the microscopic differences among markets, we argue that the proposed effective field theory captures the correct universality class of market dynamics. We also show that the model agrees with existing stochastic volatility models in the long-wavelength limit.
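
The paper's exact field equations are not reproduced here, but the flavor of a KPZ-like (1+1)D stochastic equation can be conveyed by a standard explicit discretization (generic parameters; an illustrative sketch, not the calibrated order-book model):

```python
import numpy as np

def kpz_step(h, dt, dx, nu, lam, D, rng):
    """One Euler-Maruyama step of the (1+1)D KPZ equation
    dh/dt = nu * h_xx + (lam / 2) * (h_x)**2 + noise, periodic boundaries."""
    h_x = (np.roll(h, -1) - np.roll(h, 1)) / (2 * dx)
    h_xx = (np.roll(h, -1) - 2 * h + np.roll(h, 1)) / dx**2
    noise = np.sqrt(2 * D * dt / dx) * rng.standard_normal(h.size)
    return h + dt * (nu * h_xx + 0.5 * lam * h_x**2) + noise

rng = np.random.default_rng(1)
h = np.zeros(256)                  # flat initial interface
for _ in range(500):
    h = kpz_step(h, dt=0.01, dx=1.0, nu=1.0, lam=1.0, D=0.5, rng=rng)
print(h.std())                     # interface roughness grows from the flat start
```

Universality-class claims like the one above are about the scaling exponents of exactly this kind of roughening interface, independent of the microscopic parameter values.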

arXiv

We analyze recently proposed mortgage contracts that aim to eliminate selective borrower default when the loan balance exceeds the house price (the "underwater" effect). We show that contracts which automatically reduce the outstanding balance in the event of a house price decline remove the default incentive, but may induce prepayment in low price states. However, low-state prepayments vanish if the benefit from home ownership is sufficiently high. We also show that capital gain sharing features, such as prepayment penalties in high house price states, are ineffective as they virtually eliminate prepayment. For observed foreclosure costs, we find that contracts with automatic balance adjustments become preferable to traditional fixed-rate contracts at mortgage rate spreads of 50-100 basis points. Results are obtained using an American options pricing method in a continuous-time model with diffusive home prices. Here, we associate the contracts' values and optimal decision rules with free boundary problems. We provide explicit solutions in the long contract maturity limit.

arXiv

Value-at-risk (VaR) is extensively used by researchers and practitioners for measuring and managing uncertainty in financial markets. Although VaR is a common risk-control instrument, its performance has drawn criticism. One such case, studied in this research, is VaR underestimation during times of crisis. In these periods, the non-Gaussian behavior of markets intensifies, and the VaR estimated by normal models is lower than the real value. In fact, during times of crisis the probability density of extreme values in financial return series increases, and this heavy-tailed behavior reduces the accuracy of normal VaR estimation models. A potential approach for describing the non-Gaussian behavior of return series is the Tsallis entropy framework and non-extensive statistical methods. In this paper, we use a non-extensive value-at-risk model to analyze the behavior of financial markets during times of crisis. Applying a q-Gaussian probability density function yields better VaR estimates than normal models, especially during times of crisis. We show that the q-Gaussian model estimates VaR better than the normal model. We also find that in mature markets the difference in VaR between the normal and non-extensive approaches increases by more than one standard deviation during times of crisis, whereas in emerging markets no specific pattern is observed.
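
The crisis-time underestimation can be demonstrated on a heavy-tailed sample. As an illustrative assumption (not the paper's fitted model), we use Student-t returns; a Student-t with ν degrees of freedom corresponds, up to rescaling, to a q-Gaussian with q = (ν + 3)/(ν + 1):

```python
import numpy as np

rng = np.random.default_rng(7)
alpha = 0.01
# Heavy-tailed 'crisis' returns: Student-t with 3 degrees of freedom,
# i.e. a q-Gaussian with q = 1.5 up to rescaling.
r = rng.standard_t(df=3, size=100_000)

# Normal-model VaR: fitted mean/std and the Gaussian 1% quantile
# (about -2.326 standard deviations).
var_normal = r.mean() - 2.326 * r.std()
# 'True' VaR from the empirical alpha-quantile of the heavy-tailed sample:
var_true = np.quantile(r, alpha)
print(var_true < var_normal)    # → True: the normal model understates the loss
```

Heavier tails (smaller ν, larger q) widen the gap between the two estimates, consistent with the underestimation worsening during crises.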

arXiv

I propose a new tool to characterize the resolution of uncertainty around FOMC press conferences. It relies on the construction of a measure capturing the level of discussion complexity between the Fed Chair and reporters during the Q&A sessions. I show that complex discussions are associated with higher equity returns and a drop in realized volatility. The method creates an attention score by quantifying how much the Chair needs to rely on reading internal documents to be able to answer a question. This is accomplished by building a novel dataset of video images of the press conferences and leveraging recent deep learning algorithms from computer vision. This alternative data provides new information on nonverbal communication that cannot be extracted from the widely analyzed FOMC transcripts. This paper can be seen as a proof of concept that certain videos contain valuable information for the study of financial markets.

arXiv

In repeated-game applications where both the collusive and non-collusive outcomes can be supported as equilibria, researchers must resolve underlying selection questions if theory is to be used to understand counterfactual policies. One guide to selection, based on clear theoretical underpinnings, has shown promise in predicting when collusive outcomes will emerge in controlled repeated-game experiments. In this paper we both expand upon and experimentally test this model of selection, and its underlying mechanism: strategic uncertainty. Adding an additional source of strategic uncertainty (the number of players) to the more-standard payoff sources, we stress test the model. Our results affirm the model as a tool for predicting when tacit collusion is likely/unlikely to be successful. Extending the analysis, we corroborate the mechanism of the model. When we remove strategic uncertainty through an explicit coordination device, the model no longer predicts the selected equilibrium.

arXiv

To manage computational complexity, models of macro-energy systems commonly deploy reduced sets of time-series data. This paper evaluates the adequacy of time-series reduction when modelling energy systems with fully renewable generation and a consequent dependency on storage. The analysis covers various methods to derive reduced time series and to implement them in models, either as time slices (also referred to as representative days) or as continuous time steps. All methods are tested with regard to unmet demand and the accuracy of estimated system costs, using a simple capacity expansion model of the power sector within a renewable energy system. Some methods achieve little unmet demand, but their results regarding storage are biased, favouring seasonal at the expense of short-term storage. We conclude that renewable energy systems limit the adequacy of time-series reduction and that future research should focus on alternative methods to reduce computational complexity.
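
A common way to derive such reduced time series is to cluster daily profiles into representative days; the sketch below uses a minimal k-means on synthetic demand data (a generic illustration, not one of the specific reduction methods the paper tests):

```python
import numpy as np

def representative_days(profiles, k, iters=50, seed=0):
    """Cluster daily profiles (days x hours) with a minimal k-means and
    return one representative profile per cluster plus the number of
    days each representative stands for."""
    rng = np.random.default_rng(seed)
    centers = profiles[rng.choice(len(profiles), size=k, replace=False)]
    for _ in range(iters):
        dist = np.linalg.norm(profiles[:, None, :] - centers[None], axis=2)
        labels = dist.argmin(axis=1)
        centers = np.array([profiles[labels == j].mean(axis=0)
                            if (labels == j).any() else centers[j]
                            for j in range(k)])
    return centers, np.bincount(labels, minlength=k)

# Synthetic year: a summer-like and a winter-like daily demand shape.
rng = np.random.default_rng(3)
hours = np.arange(24)
summer = 1.0 + 0.5 * np.sin(2 * np.pi * hours / 24)
winter = 2.0 + 0.3 * np.sin(2 * np.pi * hours / 24)
days = np.vstack([summer + 0.05 * rng.standard_normal((180, 24)),
                  winter + 0.05 * rng.standard_normal((185, 24))])
centers, weights = representative_days(days, k=2)
print(weights)     # days represented by each cluster (sums to 365)
```

Seasonal storage links the representative days to one another across the year, which is exactly the coupling this kind of reduction discards and the bias the paper flags.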

arXiv

This paper develops a new method for identifying econometric models with partially latent covariates. Such data structures arise naturally in industrial organization and labor economics settings where data are collected using an "input-based sampling" strategy, e.g., if the sampling unit is one of multiple labor input factors. We show that the latent covariates can be nonparametrically identified, if they are functions of a common shock satisfying some plausible monotonicity assumptions. With the latent covariates identified, semiparametric estimation of the outcome equation proceeds within a standard IV framework that accounts for the endogeneity of the covariates. We illustrate the usefulness of our method using two applications. The first focuses on pharmacies: we find that production function differences between chains and independent pharmacies may partially explain the observed transformation of the industry structure. Our second application investigates education achievement functions and illustrates important differences in child investments between married and divorced couples.