# Research articles for 2020-12-22

arXiv

A geometric method to analyze nonlinear oscillations is discussed. We consider a nonlinear oscillation modeled by a second-order ordinary differential equation without specifying the functional form. By transforming the differential equation into a system of first-order ordinary differential equations, the trajectory is embedded in $R^3$ as a curve, so that the time evolution of the original state can be translated into the behavior of the curve in $R^3$, or the vector field along the curve. We analyze this vector field to investigate the dynamic properties of a nonlinear oscillation. Although the functional form of the model is unspecified, the vector fields and their associated quantities can be estimated by a nonparametric filtering method. We apply the proposed analysis to the time series of the Japanese stock price index. The application shows that the vector field and its derivative can be used as tools for picking up various signals that aid understanding of the dynamic properties of the stock price index.
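
As a minimal illustration (not the paper's nonparametric estimator), a second-order ODE $\ddot{x} = f(x, \dot{x})$ can be rewritten as a first-order system and its trajectory embedded in $R^3$ as the curve $(t, x, \dot{x})$; the van der Pol oscillator below is a hypothetical choice of $f$:

```python
import numpy as np

def embed_trajectory(f, x0, v0, t_max, dt=1e-3):
    """Integrate x'' = f(x, x') with forward Euler and return the
    trajectory embedded in R^3 as points (t, x, x')."""
    n = round(t_max / dt)
    curve = np.empty((n, 3))
    x, v = x0, v0
    for i in range(n):
        curve[i] = (i * dt, x, v)
        x, v = x + dt * v, v + dt * f(x, v)
    return curve

# Hypothetical oscillator: van der Pol, x'' = mu*(1 - x^2)*x' - x.
mu = 1.0
curve = embed_trajectory(lambda x, v: mu * (1 - x**2) * v - x,
                         x0=2.0, v0=0.0, t_max=10.0)

# The vector field along the curve (finite-difference tangents):
tangents = np.gradient(curve, axis=0)
```

The tangent field (and its derivative along the curve) is the kind of object the abstract proposes to estimate from data rather than from a known $f$.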

arXiv

The aim of this paper is to study the optimal investment problem using coherent acceptability indices (CAIs) as a tool to measure portfolio performance. We call this problem the acceptability maximization. First, we study the one-period (static) case and propose a numerical algorithm that approximates the original problem by a sequence of risk minimization problems. The results are applied to several important CAIs, such as the gain-to-loss ratio, the risk-adjusted return on capital and the tail-value-at-risk based CAI. In the second part of the paper we investigate acceptability maximization in a discrete-time dynamic setup. Using robust representations of CAIs in terms of a family of dynamic coherent risk measures (DCRMs), we establish an intriguing dichotomy: if the corresponding family of DCRMs is recursive (i.e. strongly time consistent), and assuming some recursive structure of the market model, then the acceptability maximization problem reduces to just a one-period problem and the maximal acceptability is constant across all states and times. On the other hand, if the family of DCRMs is not recursive, which is often the case, then the acceptability maximization problem is typically a time-inconsistent stochastic control problem, similar to the classical mean-variance criterion. To overcome this form of time-inconsistency, we adapt to our setup the set-valued Bellman principle recently proposed in \cite{KovacovaRudloff2019} and apply it to two particular dynamic CAIs: the dynamic risk-adjusted return on capital and the dynamic gain-to-loss ratio. The obtained theoretical results are illustrated via numerical examples that include, in particular, the computation of intermediate mean-risk efficient frontiers.
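
For illustration (not the paper's algorithm), one common static definition of the gain-to-loss ratio is $E[X^+]/E[X^-]$, which can be computed directly from a return sample:

```python
import numpy as np

def gain_to_loss_ratio(returns):
    """Gain-to-loss ratio E[X+]/E[X-], one common static
    acceptability index (illustrative definition)."""
    x = np.asarray(returns, dtype=float)
    gains = np.maximum(x, 0.0).mean()
    losses = np.maximum(-x, 0.0).mean()
    return np.inf if losses == 0 else gains / losses

r = [0.02, -0.01, 0.03, -0.02, 0.01]
print(gain_to_loss_ratio(r))  # ≈ 2.0 (mean gains 0.012 vs mean losses 0.006)
```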

arXiv

As several COVID-19 vaccine candidates approach approval for human use, governments around the world are preparing comprehensive standards for vaccine distribution and monitoring to avoid long-term consequences that may result from a rush to market. In this early draft article, we identify challenges for vaccine distribution in four core areas - logistics, health outcomes, user-centric impact, and communication. Each of these challenges is analyzed against five critical consequences impacting disease spread, individual behaviour, society, the economy, and data privacy. Disparities in equitable distribution, vaccine efficacy, duration of immunity, multi-dose adherence, and privacy-focused record keeping are among the most critical difficulties that must be addressed. While many of these challenges have been previously identified and planned for, some have not been examined from a comprehensive perspective that accounts for unprecedented repercussions in specific subsets of the population.

arXiv

What is the best market-neutral implementation of classical Equity Factors? Should one use the specific predictability of the short-leg to build a zero-beta Long-Short portfolio, in spite of the specific costs associated with shorting, or is it preferable to ban the shorts and hedge the long-leg with -- say -- an index future? We revisit this question by focusing on the relative predictability of the two legs, the issue of diversification, and various sources of costs. Our conclusion is that, using the same Factors, a Long-Short implementation yields superior risk-adjusted returns to its Hedged Long-Only counterpart, at least when Assets Under Management are not too large.

arXiv

We quantify the significance and magnitude of the effect of measurement error in satellite weather data on the modeling of agricultural production, agricultural productivity, and resilience outcomes. To provide rigor to our approach, we combine geospatial weather data from a variety of satellite sources with geo-referenced household survey data from four sub-Saharan African countries that are part of the World Bank Living Standards Measurement Study - Integrated Surveys on Agriculture (LSMS-ISA) initiative. Our goal is to provide systematic evidence on which weather metrics have strong predictive power over a large set of crops and countries and which metrics are useful only in highly specific settings.

arXiv

Energy companies need efficient procedures to perform market calibration of stochastic models for commodities. If the Black framework is chosen for option pricing, the bottleneck of the market calibration is the computation of the variance of the asset. Energy commodities are commonly represented by multi-factor linear models, whose variance obeys a matrix Lyapunov differential equation. In this paper, analytical and numerical methods to derive the variance are discussed: the Lyapunov approach is shown to be more straightforward than the ad-hoc derivations found in the literature and can be readily extended to higher-dimensional models. A case study is presented in which the variance of a two-factor mean-reverting model is embedded into the Black formulae and the model parameters are calibrated against listed options. The analytical and numerical methods are compared, showing that the former makes the calibration 14 times faster. A Python implementation of the proposed methods is available as open-source software on GitHub.
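
As a minimal sketch (not the paper's implementation), for a linear model $dX = AX\,dt + B\,dW$ the state covariance $V(t)$ solves the matrix Lyapunov ODE $\dot{V} = AV + VA^\top + BB^\top$; it can be integrated numerically and checked against the one-factor mean-reverting closed form:

```python
import numpy as np

def lyapunov_variance(A, Q, t_max, dt=1e-4):
    """Integrate the matrix Lyapunov ODE dV/dt = A V + V A^T + Q
    (with V(0) = 0) by forward Euler; V(t) is the state covariance of
    the linear model dX = A X dt + B dW, where Q = B B^T."""
    V = np.zeros_like(A, dtype=float)
    for _ in range(round(t_max / dt)):
        V = V + dt * (A @ V + V @ A.T + Q)
    return V

# Check on a one-factor mean-reverting (OU) model, whose closed form is
# V(t) = sigma^2 / (2 kappa) * (1 - exp(-2 kappa t)).
kappa, sigma, t = 0.5, 0.3, 2.0
V_num = lyapunov_variance(np.array([[-kappa]]), np.array([[sigma**2]]), t)
V_exact = sigma**2 / (2 * kappa) * (1 - np.exp(-2 * kappa * t))
print(abs(V_num[0, 0] - V_exact))  # small Euler discretization error
```

The same routine applies unchanged to a two-factor model by passing 2x2 matrices for `A` and `Q`, which is the sense in which the Lyapunov approach extends to higher dimensions.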

arXiv

Social distancing has been one of the core public policy measures used to mitigate the economic and health impacts of the COVID-19 pandemic. Such widespread adoption of social distancing measures is wholly unprecedented, and governments have implemented a variety of policies to encourage compliance. These typically rely on financial penalties (fines) and/or informational messages (nudges). There is, however, a lack of evidence on the impact of these policies. We examine the effectiveness of fines and nudges in promoting social distancing in a web-based interactive experiment. The study involves a near-representative sample of 400 participants from the US population, and it was conducted in May 2020 at the height of the pandemic. Fines significantly promote distancing, but nudges only have a marginal impact. Political ideology also has a causal impact -- progressives are more likely to practice distancing, and are marginally more responsive to fines. Further, individuals do more social distancing when they know they may be a superspreader. Our results highlight the crucial role of web-based interactive experiments in informing governments on the causal impact of policies at a time when lab and/or field-based experimental research is not feasible.

arXiv

"Code is law" is the founding principle of cryptocurrencies. The security, transferability, availability and other properties of a crypto-asset are determined by the code through which it is created. If the code is open source, as is the case for most cryptocurrencies, this principle would prevent manipulations and grant transparency to users and traders. However, this approach treats cryptocurrencies as isolated entities, neglecting possible connections between them. Here, we show that 4% of developers contribute to the code of more than one cryptocurrency and that the market reflects these cross-asset dependencies. In particular, we reveal that the first coding event linking two cryptocurrencies through a common developer leads to the synchronisation of their returns in the following months. Our results identify a clear link between the collaborative development of cryptocurrencies and their market behaviour. More broadly, our work reveals a so-far overlooked systemic dimension for the transparency of code-based ecosystems, and we anticipate it will be of interest to researchers, investors and regulators.

arXiv

Against the widely held belief that diversification at banking institutions contributes to the stability of the financial system, Wagner (2010) found that diversification actually makes systemic crises more likely. While it is true, as Wagner asserts, that the probability of joint default of the diversified portfolios is larger, we contend that, as is common practice, the effect of diversification should be examined with respect to a risk measure such as VaR. We find that when banks use VaR, diversification does reduce individual and systemic risk. This, in turn, generates a different set of incentives for banks and regulators.
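
A minimal sketch of the VaR point, with hypothetical i.i.d. returns rather than the paper's model: the historical VaR of an equally-weighted portfolio of two banks is lower than the standalone VaR of either bank:

```python
import numpy as np

def hist_var(returns, alpha=0.95):
    """Historical Value-at-Risk at level alpha (loss quantile,
    reported as a positive number)."""
    return -np.quantile(returns, 1 - alpha)

rng = np.random.default_rng(0)
# Two hypothetical banks with i.i.d. normal returns (illustrative only).
r_a = rng.normal(0.0, 0.02, 100_000)
r_b = rng.normal(0.0, 0.02, 100_000)
r_div = 0.5 * r_a + 0.5 * r_b  # equally-weighted, diversified portfolio

print(hist_var(r_a), hist_var(r_div))  # diversified VaR is lower
```

Under these assumptions the diversified portfolio's standard deviation shrinks by a factor of roughly $\sqrt{2}$, which is exactly the risk-measure perspective the abstract argues for.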

arXiv

Using recent data from voluntary mass testing, I provide credible bounds on the prevalence of SARS-CoV-2 in Austrian counties in early December 2020. When estimating prevalence, a natural missing data problem arises: no test results are generated for non-tested people. In addition, tests are not perfectly predictive of the underlying infection. This is particularly relevant for mass SARS-CoV-2 testing, as these campaigns are conducted with rapid antigen tests, which are known to be somewhat imprecise. Using insights from the literature on partial identification, I propose a framework addressing both issues at once. I use the framework to study different selection assumptions for the Austrian data. Whereas weak monotone selection assumptions provide limited identification power, reasonably stronger assumptions reduce the uncertainty about prevalence significantly.
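
A minimal sketch of such worst-case (no-assumption) bounds, with hypothetical numbers rather than the Austrian data: correct the observed positive rate for test sensitivity and specificity, then bound the non-tested population between "all healthy" and "all infected":

```python
def prevalence_bounds(tested_share, pos_rate, sensitivity, specificity):
    """Worst-case bounds on population prevalence when only a fraction
    of people is tested and the test is imperfect. Among the tested,
    the true infection rate theta solves
    pos_rate = theta * sensitivity + (1 - theta) * (1 - specificity)."""
    theta = (pos_rate - (1 - specificity)) / (sensitivity - (1 - specificity))
    theta = min(max(theta, 0.0), 1.0)
    lower = tested_share * theta                       # all non-tested healthy
    upper = tested_share * theta + (1 - tested_share)  # all non-tested infected
    return lower, upper

# Hypothetical inputs: 60% tested, 1.2% positive, antigen test with
# 85% sensitivity and 99% specificity.
lo, hi = prevalence_bounds(0.60, 0.012, 0.85, 0.99)
print(lo, hi)
```

The wide gap between `lo` and `hi` is exactly the limited identification power of assumption-free bounds; selection assumptions on who gets tested tighten the interval.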

arXiv

After the 2007/2008 financial crisis, the UK government decided that a change in regulation was required to remedy the poor control of financial markets. The Financial Services Act 2012 was developed as a result, in order to give more control and authority to the regulators of financial markets. Thus, the Financial Conduct Authority (FCA) succeeded the Financial Services Authority (FSA). One area requiring improved regulation was insider trading. Our study examines the effectiveness of the FCA in its duty of regulating insider trading by using the event study methodology to assess abnormal returns in the run-up to the first announcement of mergers. Samples of abnormal returns are examined over periods regulated either by the FSA or by the FCA. Specifically, stock price data on the London Stock Exchange from 2008-2012 and 2015-2019 are investigated. The results from this study show that abnormal returns are reduced after the implementation of the Financial Services Act 2012; prices are also found to be noisier in the period before the 2012 Act. Insignificant abnormal returns are found in the run-up to the first announcement of mergers in the 2015-2019 period. We conclude that the FCA is effective in regulating insider trading.
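
The event-study logic can be sketched on synthetic data with a market-model specification (the paper's exact model may differ): fit expected returns on an estimation window, then measure abnormal returns in the pre-announcement window:

```python
import numpy as np

def abnormal_returns(stock, market, est_n):
    """Market-model event study: fit r_stock = alpha + beta * r_market
    on the first est_n days (estimation window), then return abnormal
    returns AR_t = r_stock - (alpha + beta * r_market) on the remaining
    event window, plus their cumulative sum (CAR)."""
    beta, alpha = np.polyfit(market[:est_n], stock[:est_n], 1)
    expected = alpha + beta * market[est_n:]
    ar = stock[est_n:] - expected
    return ar, ar.cumsum()

# Synthetic example with a drift injected in the run-up window (illustrative).
rng = np.random.default_rng(1)
mkt = rng.normal(0.0, 0.01, 130)
stk = 0.0002 + 1.2 * mkt + rng.normal(0.0, 0.005, 130)
stk[120:] += 0.004  # pre-announcement run-up, as if informed trading occurred
ar, car = abnormal_returns(stk, mkt, est_n=120)
print(car[-1])  # CAR over the 10-day run-up window
```

A significantly positive CAR in the run-up window is the standard signature of insider trading that the study tests for in each regulatory regime.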

arXiv

The existence of involuntary unemployment, advocated by J. M. Keynes, is a very important problem in modern economic theory. Using a three-generation overlapping generations model, we show that the existence of involuntary unemployment is due to the instability of the economy. Instability of the economy here means instability of the difference equation for the equilibrium price around the full-employment equilibrium: a fall in the nominal wage rate caused by the presence of involuntary unemployment further reduces employment. This instability is due to the negative real balance effect that occurs when consumers' net savings (the difference between savings and pensions) are smaller than their debt multiplied by the marginal propensity to consume from childhood consumption.

arXiv

The dynamic portfolio optimization problem in finance frequently requires learning policies that adhere to various constraints, driven by investor preferences and risk. We motivate this problem of finding an allocation policy within a sequential decision making framework and study the effects of: (a) using data collected under previously employed policies, which may be sub-optimal and constraint-violating, and (b) imposing desired constraints while computing near-optimal policies with this data. Our framework relies on solving a minimax objective, where one player evaluates policies via off-policy estimators, and the opponent uses an online learning strategy to control constraint violations. We extensively investigate various choices for off-policy estimation and their corresponding optimization sub-routines, and quantify their impact on computing constraint-aware allocation policies. Our study shows promising results for constructing such policies when back-tested on historical equities data, under various regimes of operation, dimensionality and constraints.
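
One standard off-policy estimator in this spirit is importance sampling, which re-weights data logged under a previously employed policy; a toy bandit sketch (illustrative, not the paper's estimator):

```python
import numpy as np

def importance_sampling_value(rewards, behavior_probs, target_probs):
    """Off-policy value estimate of a target policy from data logged
    under a behavior policy, via importance-sampling weights
    w_i = pi_target(a_i) / pi_behavior(a_i)."""
    w = target_probs / behavior_probs
    return np.mean(w * rewards)

# Toy bandit example: two actions, logged under a uniform policy.
rng = np.random.default_rng(2)
actions = rng.integers(0, 2, 10_000)
rewards = np.where(actions == 1, 1.0, 0.0)      # action 1 always pays 1
behavior = np.full(actions.shape, 0.5)          # uniform logging policy
target = np.where(actions == 1, 0.9, 0.1)       # target plays action 1 w.p. 0.9

v_hat = importance_sampling_value(rewards, behavior, target)
print(v_hat)  # close to the true target value 0.9
```

The same re-weighting idea lets constraint violations of a candidate allocation policy be evaluated from historical data without ever deploying the policy.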

arXiv

We propose methods for constructing regularized mixtures of density forecasts. We explore a variety of objectives and regularization penalties, and we use them in a substantive exploration of Eurozone inflation and real interest rate density forecasts. All individual inflation forecasters (even the ex post best forecaster) are outperformed by our regularized mixtures. The log scores of the Simplex and Best-Average mixtures, for example, are approximately 7% better than that of the ex post best individual forecaster, and 15% better than that of the median forecaster. From the Great Recession onward, the optimal regularization tends to move density forecasts' probability mass from the centers to the tails, correcting for overconfidence.
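
A minimal sketch of choosing a simplex-constrained mixture weight by maximizing the average log score, on synthetic data with two Gaussian density forecasters (illustrative only; the paper's regularized objectives are richer):

```python
import numpy as np

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def best_simplex_weight(dens_a, dens_b, grid=101):
    """Pick the mixture weight w in [0, 1] (the 2-forecaster simplex)
    maximizing the average log score of the mixture w*f_a + (1-w)*f_b."""
    ws = np.linspace(0.0, 1.0, grid)
    scores = [np.mean(np.log(w * dens_a + (1 - w) * dens_b)) for w in ws]
    return ws[int(np.argmax(scores))]

# Synthetic realizations; forecaster A is calibrated, B is overconfident.
rng = np.random.default_rng(3)
y = rng.normal(0.0, 1.0, 5_000)
w_star = best_simplex_weight(norm_pdf(y, 0.0, 1.0), norm_pdf(y, 0.0, 0.4))
print(w_star)  # most weight goes to the calibrated forecaster
```

The overconfident forecaster is punished heavily by the log score whenever a realization lands in its thin tails, which is the mechanism behind the tail-shifting regularization described in the abstract.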

arXiv

This paper considers the Split-Then-Combine (STC) approach (Arroyo and de Juan, 2014) to combining forecasts inside the simplex space, the sample space of positive weights adding up to one. As it turns out, the simplicial statistic given by the center of the simplex compares favorably against the fixed-weight, average forecast. We also develop a Combine-After-Selection (CAS) method to discard redundant forecasters. We apply these two approaches to make out-of-sample, one-step-ahead combinations and subcombinations of forecasts for several economic variables. This methodology is particularly useful when the sample size is smaller than the number of forecasts, a case in which other methods (e.g., Least Squares (LS) or Principal Component Analysis (PCA)) are not applicable.
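
One natural choice for the center of a set of weight vectors in the simplex is the component-wise geometric mean renormalized to sum to one, the compositional mean from compositional data analysis (a sketch; the paper's exact statistic may differ):

```python
import numpy as np

def simplex_center(weight_vectors):
    """Center of a set of points in the simplex: component-wise
    geometric mean, renormalized to sum to one (compositional mean)."""
    g = np.exp(np.mean(np.log(weight_vectors), axis=0))
    return g / g.sum()

# Three hypothetical weight vectors over four forecasters.
W = np.array([
    [0.40, 0.30, 0.20, 0.10],
    [0.25, 0.25, 0.25, 0.25],
    [0.50, 0.20, 0.20, 0.10],
])
c = simplex_center(W)
print(c, c.sum())  # a valid weight vector: positive entries summing to 1
```

Unlike the arithmetic mean, this statistic respects the geometry of the simplex, which is the motivation for working inside the simplex space in the first place.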

arXiv

Many still rightly wonder whether accounting numbers affect business value. The basic questions are why and how. I aim to promote an objective choice of the most suitable valuation method under a value-based management framework, through some performance measurement systems. First, I present a comprehensive review of valuation methods. Three valuation methods, (i) the Free Cash Flow Valuation Model (FCFVM), (ii) the Residual Earning Valuation Model (REVM) and (iii) the Abnormal Earning Growth Model (AEGM), are presented, and I point out their advantages and limitations. As applications, the findings are illustrated on three case studies: Marks & Spencer's business pattern (size and growth prospects), which had a recently advertised valuation problem, and two comparable companies, Tesco and Sainsbury's, all three chosen for multiple-based valuation. For this purpose, two value drivers are chosen: EnV/EBIT (entity value/earnings before interest and taxes) and the corresponding EnV/Sales. Thus, the question of whether accounting numbers, through models based on mathematical economics, truly affect business value has an answer: maybe, yes.
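
The multiple-based step can be sketched with purely hypothetical figures (not the actual Marks & Spencer, Tesco or Sainsbury's accounts): apply the median peer EnV/EBIT multiple to the target's EBIT to get an implied enterprise value:

```python
def multiple_valuation(target_ebit, peer_env, peer_ebit):
    """Comparables valuation: apply the median peer EnV/EBIT multiple
    to the target's EBIT (all figures hypothetical)."""
    multiples = sorted(ev / ebit for ev, ebit in zip(peer_env, peer_ebit))
    mid = len(multiples) // 2
    median = (multiples[mid] if len(multiples) % 2
              else 0.5 * (multiples[mid - 1] + multiples[mid]))
    return median * target_ebit

# Two hypothetical peers, both trading at 10x EBIT.
env_implied = multiple_valuation(
    target_ebit=600.0,
    peer_env=[25_000.0, 14_000.0],
    peer_ebit=[2_500.0, 1_400.0],
)
print(env_implied)  # 10x * 600 = 6000.0
```

The same function works for EnV/Sales by passing peer sales instead of EBIT, which is how the second value driver in the abstract would be used.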