# Research articles for 2021-07-08

arXiv

We demonstrate the use of Adaptive Stress Testing to detect and address potential vulnerabilities in a financial environment. We develop a simplified model for credit card fraud detection that utilizes a linear regression classifier based on historical payment transaction data coupled with business rules. We then apply the reinforcement learning model known as Adaptive Stress Testing to train an agent, which can be thought of as a potential fraudster, to find the most likely path to system failure -- successfully defrauding the system. We show the connection between this most likely failure path and the limits of the classifier and discuss how the fraud detection system's business rules can be further augmented to mitigate these failure modes.
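The failure-path search can be illustrated with a toy stand-in: a hypothetical linear fraud score plus an amount-cap business rule, with a greedy perturbation loop in place of the paper's reinforcement learning agent. All weights, thresholds, and feature names below are invented for illustration, not taken from the paper.

```python
import random

# Hypothetical linear fraud score over transaction features, plus a
# business rule (hard block above an amount cap). Illustrative numbers only.
WEIGHTS = {"amount": 0.004, "foreign": 0.8, "night": 0.5}
THRESHOLD = 1.0       # classifier flags fraud when score >= THRESHOLD
AMOUNT_CAP = 500.0    # business rule: block any transaction above this

def flagged(tx):
    score = sum(WEIGHTS[k] * tx[k] for k in WEIGHTS)
    return score >= THRESHOLD or tx["amount"] > AMOUNT_CAP

def most_likely_failure(start, steps=2000, seed=0):
    """Greedy stand-in for the RL search: perturb the transaction so it
    stays unflagged while maximising the amount defrauded."""
    rng = random.Random(seed)
    best = dict(start)
    for _ in range(steps):
        cand = dict(best)
        cand["amount"] = max(cand["amount"] + rng.uniform(-5, 5), 0.0)
        if not flagged(cand) and cand["amount"] > best["amount"]:
            best = cand
    return best

path = most_likely_failure({"amount": 100.0, "foreign": 0, "night": 1})
# The undetected fraud settles just below the tighter of the two limits:
# here the score limit, amount < (1.0 - 0.5) / 0.004 = 125, vs the cap 500.
```

The "most likely failure path" found this way exposes which constraint binds first, which is exactly the information used to augment the business rules.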

RePEc

We present a simple and operational yet rigorous framework that combines current methods of bank solvency stress tests with a description of fire sales. We demonstrate the applicability of our framework to the EBA stress testing exercise. Fire sales are described by an equilibrium model which balances leverage improvements and drops in security prices. The differences in bank losses caused by fire sales are significant and go beyond the trivial fact that deleveraging produces bigger losses. Ignoring potential deleveraging effects can thus make institutions that are in fact fragile appear resilient, creating a false sense of resilience.
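A minimal version of the fire-sale mechanism can be written as a fixed-point iteration: banks above a leverage cap sell assets, sales depress the price through an impact function, and the loop repeats until price and sales are mutually consistent. The balance sheets, leverage cap, and impact coefficient below are illustrative, not the EBA calibration.

```python
# Toy fire-sale equilibrium: after a price shock, each bank sells enough
# assets to restore a leverage cap; total sales move the price through a
# linear impact function; iterate price/sales to a fixed point.
def fire_sale_equilibrium(assets, equity, leverage_cap=10.0,
                          impact=5e-5, tol=1e-10):
    price = 1.0
    for _ in range(10_000):
        total_sold = 0.0
        for a, e in zip(assets, equity):
            a_marked = a * price
            e_marked = e - a * (1.0 - price)   # mark-to-market losses hit equity
            if e_marked <= 0:                   # insolvent: liquidate everything
                total_sold += a
                continue
            excess = a_marked - leverage_cap * e_marked
            if excess > 0:                      # sell just enough to meet the cap
                total_sold += excess / price
        new_price = max(1.0 - impact * total_sold, 0.0)
        if abs(new_price - price) < tol:
            return new_price, total_sold
        price = new_price
    return price, total_sold

price, sold = fire_sale_equilibrium(assets=[1000.0, 800.0],
                                    equity=[80.0, 90.0])
```

Note how the second bank, solvent and within its cap at the initial price, is dragged into selling by the first bank's deleveraging, which is the amplification channel the abstract refers to.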

arXiv

The novel coronavirus (COVID-19) has suddenly and abruptly changed the world as we knew it at the start of the third decade of the 21st century. In particular, the COVID-19 pandemic has negatively affected financial econometrics and stock markets across the globe. Artificial Intelligence (AI) and Machine Learning (ML)-based prediction models, especially Deep Neural Network (DNN) architectures, have the potential to act as a key enabling factor in reducing the adverse effects of the COVID-19 pandemic and possible future ones on financial markets. In this regard, this paper first introduces a unique COVID-19 related PRIce MOvement prediction (COVID19 PRIMO) dataset, which incorporates the effects of COVID-19-related social media trends on stock market price movements. Afterwards, a novel hybrid and parallel DNN-based framework is proposed that integrates different and diversified learning architectures. Referred to as the COVID-19 adopted Hybrid and Parallel deep fusion framework for Stock price Movement Prediction (COVID19-HPSMP), the framework uses innovative fusion strategies to combine scattered social media news related to COVID-19 with historical market data. The proposed COVID19-HPSMP consists of two parallel paths (hence hybrid): one based on a Convolutional Neural Network (CNN) with Local/Global Attention modules, and one integrated CNN and Bi-directional Long Short-Term Memory (BLSTM) path. The two parallel paths are followed by a multilayer fusion layer acting as a fusion centre that combines localized features. Performance evaluations based on the introduced COVID19 PRIMO dataset illustrate the superior performance of the proposed framework.

arXiv

In this paper, we study analytical properties of the solutions to the generalised delay Ait-Sahalia-type interest rate model with Poisson-driven jumps. Since this model does not have an explicit solution, we employ several new truncated Euler-Maruyama (EM) techniques to investigate the finite-time strong convergence theory of the numerical solutions under the local Lipschitz condition plus the Khasminskii-type condition. We justify the strong convergence result for Monte Carlo calibration and valuation of some debt and derivative instruments.
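The truncation idea (evaluate the superlinearly growing drift and diffusion coefficients only on a bounded region, so the explicit Euler step stays well behaved) can be sketched on a jump- and delay-free special case of such a model. The parameters and truncation bounds below are illustrative, not the paper's scheme in full.

```python
import math, random

# Truncated Euler-Maruyama sketch for an Ait-Sahalia-type short rate
#   dX = (a_{-1}/X - a_0 + a_1 X - a_2 X^r) dt + sigma X^rho dW.
# Delay and jump terms are omitted for brevity; all parameters are
# illustrative.
def truncated_em_path(x0=0.05, T=1.0, n=1000, seed=1,
                      am1=0.0002, a0=0.02, a1=0.4, a2=2.0,
                      r=2.0, rho=1.5, sigma=0.3,
                      lo=1e-4, hi=5.0):
    rng = random.Random(seed)
    dt = T / n
    x = x0
    path = [x]
    for _ in range(n):
        xt = min(max(x, lo), hi)   # truncate before evaluating coefficients
        drift = am1 / xt - a0 + a1 * xt - a2 * xt ** r
        diff = sigma * xt ** rho
        x = x + drift * dt + diff * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

path = truncated_em_path()
```

Without the truncation, the `1/X` and `X^r` terms can blow up the explicit scheme for paths near zero or far from the mean; truncating only the coefficient evaluation is what preserves strong convergence in this family of methods.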

arXiv

We present a lattice gas model of financial markets to explain previous empirical observations of the interplay of trends and reversion. The shares of an asset are modeled by gas molecules that are distributed across a hidden social network of investors. Neighbors in the network tend to align their positions due to herding behavior. The model is equivalent to the Ising model on this network, with the magnetization in the role of the deviation of the asset price from its value. For N independent assets, it generalizes to an O(N) vector model.

In efficient markets, the system is driven to its critical temperature. There, it is characterized by long-range correlations and universal critical exponents, in analogy with the second-order phase transition between water and steam. Using the renormalization group, we show that these critical exponents imply predictions for the auto-correlations of financial market returns. For a simple network topology, consistency with observation implies a fractal dimension of the network of 3.3 and a correlation time of 10 years.

While the simple model agrees well with market data on long time scales, it cannot explain the observed market trends over time horizons from one month to one year. In a next step, the approach should therefore be extended to other models of critical dynamics and to general network topologies. It opens the door for indirectly measuring universal properties of the hidden social network of investors from the observable interplay of trends and reversion.
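The Ising picture above can be sketched with a Metropolis simulation in which spins are investors' long/short positions, the herding coupling aligns them, and the magnetization plays the role of the price's deviation from value. A complete graph stands in here for the paper's hidden social network, and the sizes, coupling, and temperatures are illustrative choices, not the paper's calibration.

```python
import math, random

# Mean-field Ising sketch of the herding model: each of n investors holds a
# position +/-1; flipping costs energy 2*(J/n)*s_i*(sum of the others);
# temperature T sets how noisy (inefficient) the alignment is. With coupling
# J/n the critical temperature is T_c = J.
def magnetization(n=200, J=1.0, T=3.0, sweeps=300, seed=3):
    rng = random.Random(seed)
    spin = [rng.choice((-1, 1)) for _ in range(n)]
    total = sum(spin)
    for _ in range(sweeps * n):
        i = rng.randrange(n)
        dE = 2.0 * (J / n) * spin[i] * (total - spin[i])
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            total -= 2 * spin[i]
            spin[i] = -spin[i]
    return total / n   # magnetization, i.e. price deviation from value

m_hot = magnetization(T=3.0)    # above criticality: no net trend
m_cold = magnetization(T=0.2)   # deep in the herding phase: a trend locks in
```

Efficient markets would sit near the critical point between these two regimes, which is where the long-range correlations and universal exponents discussed above appear.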

RePEc

In this paper, we focus on the managerial characteristics of micro and small-sized firms. Using linked employer-employee data on the Portuguese economy for the 2010-2018 period, we estimate the impact of management teams' human capital on the probability of firms becoming financially distressed and their subsequent recovery. Our estimates show that the relevance of management teams' formal education on the probability of firms becoming financially distressed depends on firms' size and the type of education. We show that management teams' formal education and tenure reduce the probability of micro and small-sized firms becoming financially distressed and increase the probability of their subsequent recovery. The estimates also suggest that those impacts are stronger for micro and small-sized firms. Additionally, our results show that functional experience previously acquired in other firms, namely in foreign-owned and in exporting firms and in the area of finance, may reduce the probability of micro firms becoming financially distressed. On the other hand, previous functional experience in other firms seems to have a strong and highly significant impact on increasing the odds of recovery of financially distressed firms. We conclude that policies that induce an improvement in the managerial human capital of micro and small-sized firms have significant scope to improve their financial condition, enhancing the economy's resilience against shocks.

arXiv

This paper develops likelihood-based methods for estimation, inference, model selection, and forecasting of continuous-time integer-valued trawl processes. The full likelihood of integer-valued trawl processes is, in general, highly intractable, motivating the use of composite likelihood methods, where we consider the pairwise likelihood in lieu of the full likelihood. Maximizing the pairwise likelihood of the data yields an estimator of the parameter vector of the model, and we prove consistency and asymptotic normality of this estimator. The same methods allow us to develop probabilistic forecasting methods, which can be used to construct the predictive distribution of integer-valued time series. In a simulation study, we document good finite sample performance of the likelihood-based estimator and the associated model selection procedure. Lastly, the methods are illustrated in an application to modelling and forecasting financial bid-ask spread data, where we find that it is beneficial to carefully model both the marginal distribution and the autocorrelation structure of the data. We argue that integer-valued trawl processes are especially well-suited in such situations.
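The composite-likelihood construction can be illustrated on a toy integer-valued series: sum the log bivariate pmfs over consecutive pairs and maximize over the parameter. For brevity we use an iid Poisson model, whose pairwise pmf factorises, so the pairwise estimator reduces to the MLE; for trawl processes the bivariate pmf would carry the autocorrelation structure instead. Everything below is illustrative.

```python
import math, random

def pois_logpmf(k, lam):
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def pairwise_loglik(data, lam):
    # composite likelihood: sum of log bivariate pmfs over consecutive pairs
    # (here the pair pmf factorises because the toy model is iid)
    return sum(pois_logpmf(a, lam) + pois_logpmf(b, lam)
               for a, b in zip(data, data[1:]))

rng = random.Random(4)
def rpois(lam):
    # Knuth's multiplication method for Poisson sampling
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p < L:
            return k
        k += 1

true_lam = 3.0
data = [rpois(true_lam) for _ in range(2000)]
grid = [x / 100 for x in range(100, 601)]   # candidate lambda in [1, 6]
lam_hat = max(grid, key=lambda l: pairwise_loglik(data, l))
```

Replacing the factorised pair pmf with the true bivariate pmf of a dependent model is the only change needed to turn this into a genuine pairwise-likelihood estimator of serial dependence.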

arXiv

Models of economic decision makers often include idealized assumptions, such as rationality, perfect foresight, and access to all relevant pieces of information. These assumptions often assure the models' internal validity, but, at the same time, might limit the models' power to explain empirical phenomena. This paper is particularly concerned with the model of the hidden action problem, which proposes an optimal performance-based sharing rule for situations in which a principal assigns a task to an agent, and the action taken to carry out this task is not observable by the principal. We follow the agentization approach and introduce an agent-based version of the hidden action problem, in which some of the idealized assumptions about the principal and the agent are relaxed so that they only have limited information access, are endowed with the ability to gain information, and store it in and retrieve it from their (limited) memory. We follow an evolutionary approach and analyze how the principal's and the agent's decisions affect the sharing rule, task performance, and their utility over time. The results indicate that the optimal sharing rule does not emerge. The principal's utility is relatively robust to variations in intelligence, while the agent's utility is highly sensitive to limitations in intelligence. The principal's behavior appears to be driven by opportunism, as she withholds a premium from the agent to assure the optimal utility for herself.

arXiv

Forecasting stock returns is a challenging problem due to the highly stochastic nature of the market and the vast array of factors and events that can influence trading volume and prices. Nevertheless, it has proven to be an attractive target for machine learning research because of the potential for even modest levels of prediction accuracy to deliver significant benefits. In this paper, we describe a case-based reasoning approach to predicting stock market returns using only historical pricing data. We argue that one of the impediments for case-based stock prediction has been the lack of a suitable similarity metric when it comes to identifying similar pricing histories as the basis for a future prediction -- traditional Euclidean and correlation-based approaches are not effective for a variety of reasons -- and in this regard, a key contribution of this work is the development of a novel similarity metric for comparing historical pricing data. We demonstrate the benefits of this metric and the case-based approach in a real-world application in comparison to a variety of conventional benchmarks.
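The retrieve-and-predict skeleton of case-based return prediction can be sketched as follows: find the k historical windows most similar to the current one and average their subsequent returns. The similarity used below is a plain Euclidean placeholder on return windows, precisely the kind of metric the paper argues is inadequate; its novel metric is not reproduced here.

```python
# Case-based reasoning skeleton for next-period return prediction.
# The distance function is a deliberately naive placeholder.
def predict_next_return(prices, window=5, k=3):
    rets = [(b - a) / a for a, b in zip(prices, prices[1:])]
    target = rets[-window:]
    cases = []
    # every earlier window with a known next-step outcome is a case
    for i in range(len(rets) - window):
        past = rets[i:i + window]
        dist = sum((x - y) ** 2 for x, y in zip(past, target)) ** 0.5
        cases.append((dist, rets[i + window]))
    cases.sort(key=lambda c: c[0])
    top = cases[:k]
    return sum(outcome for _, outcome in top) / len(top)

# Sanity check on a steadily trending series: every historical window looks
# like the target, so the prediction is just the constant per-step return.
trend = [100.0]
for _ in range(30):
    trend.append(trend[-1] * 1.01)
prediction = predict_next_return(trend)
```

Swapping `dist` for a better-behaved similarity is the single point where the paper's contribution would plug in.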

arXiv

While the original Ait-Sahalia interest rate model has found considerable use as a model for describing the time-series evolution of interest rates, it may not possess adequate specifications to explain, collectively, responses of interest rates to empirical phenomena such as volatility 'skews' and 'smiles', jump behaviour, market regulatory lapses, economic crises, and financial crashes. The aim of this paper is to propose a modified version of this model by incorporating additional features to describe these empirical phenomena adequately. Moreover, due to the lack of a closed-form solution to the proposed model, we employ several new truncated EM techniques to study this model numerically and justify the scheme within a Monte Carlo framework to compute some financial quantities such as a bond and a path-dependent barrier option.

arXiv

We consider a real options model for the optimal irreversible investment problem of a profit maximizing company. The company has the opportunity to invest into a production plant capable of producing two products, of which the prices follow two independent geometric Brownian motions. After paying a constant sunk investment cost, the company sells the products on the market and thus receives a continuous stochastic revenue-flow. This investment problem is set as a two-dimensional optimal stopping problem. We find that the optimal investment decision is triggered by a convex curve, which we characterize as the unique continuous solution to a nonlinear integral equation. Furthermore, we provide analytical and numerical comparative statics results of the dependency of the project's value and investment decision with respect to the model's parameters.

arXiv

Accurate modeling of operational risk is important for a bank and the finance industry as a whole to prepare for potentially catastrophic losses. One approach to modeling operational risk is the loss distribution approach, which requires a bank to group operational losses into risk categories and select a loss frequency and severity distribution for each category. This approach estimates the annual operational loss distribution, and a bank must set aside capital, called regulatory capital, equal to the 0.999 quantile of this estimated distribution. In practice, this approach may produce unstable regulatory capital calculations from year to year as the selected loss severity distribution families change. This paper presents truncation probability estimates for loss severity data and a consistent quantile scoring function on annual loss data as useful severity distribution selection criteria that may lead to more stable regulatory capital. Additionally, the Sinh-arcSinh distribution is another flexible candidate family for modeling loss severities that can be easily estimated using the maximum likelihood approach. Finally, we recommend that loss frequencies below the minimum reporting threshold be collected so that loss severity data can be treated as censored data.
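The loss distribution approach itself is easy to sketch by Monte Carlo: simulate the annual loss as a Poisson-many sum of severities and read off the 0.999 quantile as regulatory capital. The Poisson frequency and lognormal severity parameters below are illustrative, not calibrated to any bank.

```python
import math, random

def poisson(rng, lam):
    # Knuth's multiplication method for Poisson sampling
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p < L:
            return k
        k += 1

def regulatory_capital(lam=25.0, mu=10.0, sigma=2.0,
                       years=20000, q=0.999, seed=5):
    """Annual loss = sum of N lognormal severities, N ~ Poisson(lam);
    capital = empirical q-quantile of the simulated annual losses."""
    rng = random.Random(seed)
    annual = sorted(
        sum(math.exp(rng.gauss(mu, sigma)) for _ in range(poisson(rng, lam)))
        for _ in range(years)
    )
    return annual[min(int(q * years), years - 1)]

capital = regulatory_capital()
```

Because the 0.999 quantile of such a heavy-tailed distribution sits far beyond the mean annual loss and is dominated by single extreme severities, small changes in the fitted severity family move it a lot, which is exactly the instability the paper targets.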

arXiv

This paper introduces a formulation of the optimal network compression problem for financial systems. This general formulation is presented for different levels of network compression or rerouting allowed from the initial interbank network. We prove that this problem is, generically, NP-hard. We focus on objective functions generated by systemic risk measures under systematic shocks to the financial network. We conclude by studying the optimal compression problem for specific networks; this permits us to study the so-called robust fragility of certain network topologies more generally as well as the potential benefits and costs of network compression. In particular, under systematic shocks and heterogeneous financial networks the typical heuristics of robust fragility no longer hold generally.

arXiv

We study the link between lobbying and industrial concentration. Using data for the past 20 years in the US, we show how lobbying increases when an industry becomes more concentrated, using mergers as shocks to concentration. This holds true both for expenditures on federal lobbying as well as expenditures on campaign contributions. Results are in line with the predictions of a model where lobbying is akin to a public good for incumbents, and thus typically underprovided, while a merger solves the coordination problem.

arXiv

Plastic pollution is one of the most challenging problems affecting the marine environment of our time. Based on a unique dataset covering four European seas and eight European countries, this paper adds to the limited empirical evidence base related to the societal welfare effects of marine litter management. We use a discrete choice experiment to elicit public willingness-to-pay (WTP) for macro and micro plastic removal to achieve Good Environmental Status across European seas as required by the European Marine Strategy Framework Directive. Using a common valuation design and following best-practice guidelines, we draw meaningful comparisons between countries, seas and policy contexts. European citizens have strong preferences to improve the environmental status of the marine environment by removing both micro and macro plastic litter favouring a pan-European approach. However, public WTP estimates differ significantly across European countries and seas. We explain why and discuss implications for policymaking.
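In a discrete choice experiment, marginal willingness-to-pay for an attribute is computed as the ratio of its conditional-logit utility coefficient to the negative of the cost coefficient. The coefficients below are invented purely to show the computation; they are not estimates from the paper.

```python
# Hypothetical conditional-logit coefficients (illustrative only):
beta_macro_removal = 0.9    # utility weight on macro-plastic removal
beta_micro_removal = 0.6    # utility weight on micro-plastic removal
beta_cost = -0.03           # utility weight on the annual payment

# marginal WTP = -(attribute coefficient) / (cost coefficient)
wtp_macro = -beta_macro_removal / beta_cost   # currency units per year
wtp_micro = -beta_micro_removal / beta_cost
```

Cross-country WTP comparisons like those in the paper amount to comparing these ratios across samples estimated under a common design.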

arXiv

We develop a network reconstruction model based on entropy maximization that takes the sparsity of networks into account. Using this model, we reconstruct the interbank network in Japan from 2000 to 2016 from financial data in individual banks' balance sheets, and the observed sparsity of the interbank network is successfully reproduced. We examine the characteristics of the reconstructed interbank network by calculating important network attributes and obtain characteristics consistent with previously known stylized facts. Notably, although we introduce no mechanism to generate a core and peripheral structure, only a sparsity constraint forbidding transactions within the same bank category except for major commercial banks, the core and peripheral structure emerges spontaneously. We identify major nodes in each community using PageRank values and degrees to examine the changing role of each bank category. The observed change in the roles of banks is considered a result of the quantitative and qualitative monetary easing policy started by the Bank of Japan in April 2013.
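The core of entropy-maximising reconstruction from balance-sheet marginals can be sketched with an RAS / iterative-proportional-fitting step: start from a uniform prior with a zero diagonal (no self-lending) and alternately rescale rows and columns until they match each bank's total interbank assets and liabilities. The paper's category-level sparsity constraints are not reproduced here, and the figures are illustrative.

```python
# RAS / iterative proportional fitting sketch of interbank network
# reconstruction: w[i][j] is bank i's lending to bank j.
def reconstruct(assets, liabilities, iters=1000):
    n = len(assets)
    # maximum-entropy prior: uniform off the diagonal, no self-lending
    w = [[0.0 if i == j else 1.0 for j in range(n)] for i in range(n)]
    for _ in range(iters):
        for i in range(n):                      # match row sums (assets)
            s = sum(w[i])
            if s > 0:
                w[i] = [x * assets[i] / s for x in w[i]]
        for j in range(n):                      # match column sums (liabilities)
            s = sum(w[i][j] for i in range(n))
            if s > 0:
                for i in range(n):
                    w[i][j] *= liabilities[j] / s
    return w

assets = [30.0, 20.0, 10.0]        # each bank's total interbank assets
liabilities = [25.0, 20.0, 15.0]   # must total the same as assets
W = reconstruct(assets, liabilities)
```

Adding further structural zeros to the prior (e.g. forbidding within-category links) is what turns this dense maximum-entropy solution into a sparse reconstruction of the kind the paper develops.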