Research articles for 2020-09-27
arXiv
Cryptocurrency markets have many of the characteristics of 20th-century commodities markets, making them an attractive candidate for trend-following strategies. We present a decade of evidence from the infancy of bitcoin, showcasing the potential investor returns of trend following in cryptocurrency: 255% walk-forward annualised returns. We find that cryptocurrencies offer return characteristics similar to those of commodities, with comparable risk-adjusted returns and strong bear-market diversification against traditional equities. Code available at https://github.com/Globe-Research/bittrends.
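A minimal sketch of what a trend-following rule looks like in practice, using a moving-average crossover on a simulated price series; the windows, signal, and data below are illustrative assumptions, not the authors' strategy (that code lives in the linked repository).

```python
# Hypothetical moving-average crossover backtest on simulated prices;
# illustrative only, not the paper's strategy.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
prices = pd.Series(np.exp(np.cumsum(rng.normal(0.001, 0.04, 2500))))

fast = prices.rolling(20).mean()
slow = prices.rolling(100).mean()
position = (fast > slow).astype(float).shift(1)  # lagged to avoid lookahead

daily_ret = prices.pct_change()
strat_ret = (position * daily_ret).dropna()
annualised = (1 + strat_ret).prod() ** (252 / len(strat_ret)) - 1
print(f"annualised return: {annualised:.1%}")
```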
arXiv
Cryptocurrencies' values often respond aggressively to major policy changes, but none of the existing indices captures the market risks associated with regulatory changes. In this paper, we quantify the risks originating from new regulations on FinTech and cryptocurrencies (CCs) and analyse their impact on market dynamics. Specifically, a Cryptocurrency Regulatory Risk IndeX (CRRIX) is constructed based on the frequency of policy-related news coverage. The unlabeled news data are collected from the top online CC news platforms and classified using a Latent Dirichlet Allocation model and the Hellinger distance. Our results show that the machine-learning-based CRRIX successfully captures major policy-changing moments. The movements of the VCRIX, a market volatility index, and the CRRIX are synchronous, meaning that the CRRIX could be helpful for all participants in the cryptocurrency market. The algorithms and Python code are available for research purposes on www.quantlet.de.
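For readers unfamiliar with the two tools the abstract names, here is a minimal sketch of LDA topic modelling plus a Hellinger distance between the fitted topic distributions; the toy corpus and the choice of two topics are assumptions for illustration, not the CRRIX pipeline.

```python
# LDA on a toy news corpus, then Hellinger distance between the
# per-document topic distributions. Corpus and topic count are made up.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "regulator bans crypto exchange trading",
    "new bitcoin ETF approved by commission",
    "central bank issues stablecoin guidance",
    "exchange halts withdrawals after hack",
]
X = CountVectorizer().fit_transform(docs)
theta = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(X)

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

print(hellinger(theta[0], theta[1]))  # dissimilarity of the first two articles
```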
arXiv
In order to price contingent claims on the CRIX index family, one first needs to understand the dynamics of these indices. Here we provide a first econometric analysis of the CRIX family within a time-series framework. The key steps of our analysis include model selection, estimation and testing. Linear dependence is removed by an ARIMA model; diagnostic checking resulted in an ARIMA(2,0,2) model for the available sample period from Aug 1st, 2014 to April 6th, 2016. The model residuals showed the well-known phenomenon of volatility clustering, so a further refinement led us to an ARIMA(2,0,2)-t-GARCH(1,1) process. This specification conveniently takes care of the fat-tail properties that are typical of financial markets. Multivariate GARCH models are then implemented on the CRIX index family to explore the interactions between the indices.
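A minimal sketch of the named specification, fitted in two steps for brevity (an ARIMA(2,0,2) mean model, then a Student-t GARCH(1,1) on its residuals) rather than jointly; the simulated return series is a placeholder for CRIX data.

```python
# Two-step ARIMA(2,0,2) + t-GARCH(1,1) fit on placeholder returns.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=600) * 0.02  # stand-in for index log returns

arima_fit = ARIMA(returns, order=(2, 0, 2)).fit()
garch_fit = arch_model(arima_fit.resid, vol="GARCH", p=1, q=1, dist="t").fit(disp="off")
print(garch_fit.params)
```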
arXiv
A standard quantitative method to assess credit risk employs a factor model based on joint multivariate normal distribution properties. Extending a one-factor Gaussian copula model to produce more accurate default forecasts, this paper proposes to incorporate a state-dependent recovery rate into the conditional factor loading and to model default and recovery jointly through a single shared common factor. The common factor governs the default rate and the recovery rate simultaneously, implicitly creating their association. In line with Basel III, this paper shows that the tendency to default is governed more by systematic risk than by idiosyncratic risk during hectic periods. Among the models considered, the one with a random factor loading and a state-dependent recovery rate turns out to be the best at default prediction.
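A Monte Carlo sketch of the core construction: a one-factor Gaussian copula in which the same common factor drives both defaults and a state-dependent recovery rate, so recoveries fall exactly when defaults cluster. All parameter values are assumptions for illustration.

```python
# One-factor Gaussian copula with a recovery rate tied to the common factor.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n_obligors, n_sims = 500, 5000
rho, pd_uncond = 0.2, 0.02
threshold = norm.ppf(pd_uncond)                 # default barrier for a 2% PD

Z = rng.standard_normal((n_sims, 1))            # common (systematic) factor
eps = rng.standard_normal((n_sims, n_obligors)) # idiosyncratic factors
assets = np.sqrt(rho) * Z + np.sqrt(1 - rho) * eps
default = assets < threshold

recovery = np.clip(0.4 + 0.2 * Z, 0.0, 1.0)     # lower recovery in bad states
portfolio_loss = (default * (1 - recovery)).mean(axis=1)
print("99.9% loss quantile:", np.quantile(portfolio_loss, 0.999))
```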
SSRN
A ranking over a set of alternatives is an aggregation of experts' opinions (AEO) if it depends only on the experts' assessments. We study both rankings that result from pooling Bayesian experts and rankings that result from pooling possibly non-Bayesian experts. In the non-Bayesian case, we allow for the simultaneous presence of experts who may display very different attitudes toward uncertainty. We show that a single axiom, along with a mild regularity condition, fully characterizes those AEO rankings that are "generalized averages" of experts' opinions, in the sense that the average is obtained by using a capacity rather than a probability measure. We call these rankings non-linear pools. We consider a number of special cases such as linear pools (Stone (1961)), concave/convex pools (Cres, Gilboa, and Vieille (2011)), quantiles, and pools of equally reliable experts. We then apply our results to the theory of risk measures. Our application can be viewed as a generalization of the robust approach to risk measurement (Glasserman and Xu (2013)) in that it allows both for a more general notion of "model" and for more general aggregation rules. We show that a wide class of risk measures can be regarded as non-linear pools. Not only does this class include all coherent risk measures (Artzner et al. (1999)), but also measures like Value-at-Risk, which fail subadditivity. We also briefly discuss the possibility of extending our findings to the convex risk measures of Follmer and Schied (2002) as well as their non-subadditive extensions.
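To make the "generalized average" concrete: a Choquet integral aggregates expert assessments with respect to a capacity rather than a probability measure. Below is a small sketch with a toy concave distortion capacity; it illustrates the mechanics only and is not taken from the paper.

```python
# Choquet integral of expert assessments w.r.t. a (non-additive) capacity.
def choquet(values, capacity):
    """values: {expert: assessment}; capacity: frozenset -> [0, 1], monotone."""
    order = sorted(values, key=values.get, reverse=True)  # highest value first
    total, prev = 0.0, 0.0
    for i in range(len(order)):
        level = capacity(frozenset(order[: i + 1]))
        total += values[order[i]] * (level - prev)
        prev = level
    return total

experts = ["a", "b", "c"]
capacity = lambda S: (len(S) / len(experts)) ** 0.5  # toy concave distortion

assessments = {"a": 1.0, "b": 0.5, "c": 0.0}
print(choquet(assessments, capacity))  # ~0.70; a linear (uniform) pool gives 0.50
```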
SSRN
We study an economy with a CEO who trades off the incentive to divert funds, which leads to underinvestment, against the incentive to overinvest based on his optimism. In equilibrium, we see overinvestment relative to what the shareholder or a social planner would implement but underinvestment relative to what the optimistic CEO would implement if there were no feedback between real investment and asset prices. For large wealth shares, the CEO's welfare is higher under a social planner where no funds can be diverted. For small wealth shares, overinvestment peaks and the real short-rate and Tobin's q decline.
SSRN
This paper provides a theoretical and empirical analysis of alternative discount rate concepts for computing LGDs using historical bank workout data. It benchmarks five discount rate concepts for workout recovery cash flows used to derive observed loss rates given default (LGDs) in terms of economic robustness and empirical implications: the contract rate at origination, the loan-weighted average cost of capital, the return on equity, the market return on defaulted debt, and the market equilibrium return. The paper develops guiding principles for LGD discount rates and argues that the weighted average cost of capital (WACC) and the market equilibrium return dominate the popular contract rate method. The empirical analysis of data provided by Global Credit Data (GCD) shows that declining risk-free rates are in part offset by increasing market risk premiums. Common empirical discount rates lie between the risk-free rate and the return on equity. The variation of empirical LGDs across the discount rate approaches is moderate. Furthermore, a simple correction technique for resolution bias is developed; it increases observed LGDs in all periods, particularly recent ones.
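The quantity being benchmarked can be written down directly: a workout LGD is one minus the present value of recovery cash flows over the exposure at default, so the choice of discount rate moves the observed LGD. A minimal sketch with made-up cash flows and rates (not GCD data):

```python
# Workout LGD under different hypothetical discount rate concepts.
ead = 100.0
recoveries = [(0.5, 20.0), (1.5, 30.0), (3.0, 25.0)]  # (years after default, cash flow)

def workout_lgd(rate):
    pv = sum(cf / (1 + rate) ** t for t, cf in recoveries)
    return 1 - pv / ead

for name, rate in [("risk-free", 0.01), ("contract rate", 0.05), ("WACC", 0.08)]:
    print(f"{name:>13}: LGD = {workout_lgd(rate):.1%}")
```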
SSRN
The London Interbank Offered Rate (LIBOR), based on inputs from banks, is plausibly the most important set of reference interest rates in the world. Following the LIBOR rigging scandal and the post-2008 decline in the interbank lending underpinning LIBOR, banks and regulators have agreed to sustain LIBOR only through the end of 2021, after which LIBOR updates are likely to cease. In this paper, we use an event-based design to study the implications of the LIBOR scandal and phaseout (LSP) for capital markets, focusing on firms with U.S.-traded public debt. Using LSP events as exogenous shocks to the optimality of existing contracts, we provide causal evidence on the effects of the LSP on firms and their stakeholders. Our findings suggest that the consequences of the LSP, as measured by stock and bond returns, are immaterial for the average firm, but negative for firms with fewer outside options to renegotiate or repurchase debt (i.e., with credit ratings below investment grade or lower interest coverage ratios). These results suggest that the negative consequences of the LSP have mostly been borne by the shareholders and bondholders of borrowers with limited options to renegotiate or repurchase debt.
arXiv
Ergodicity describes an equivalence between the expectation value and the time average of observables. Applied to human behaviour, ergodic theories of decision-making reveal how individuals should tolerate risk in different environments. To optimise wealth over time, agents should adapt their utility function to the dynamical setting they face: linear utility is optimal for additive dynamics, whereas logarithmic utility is optimal for multiplicative dynamics. Whether humans approximate time-optimal behaviour across different dynamics is unknown. Here we compare the effects of additive versus multiplicative gamble dynamics on risky choice. We show that utility functions are modulated by gamble dynamics in ways not explained by prevailing economic theory. Instead, as predicted by time optimality, risk aversion increases under multiplicative dynamics, with estimates distributed close to the values that maximise the time-average growth of wealth. We suggest that our findings motivate grounding theories of decision-making explicitly on ergodic considerations.
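The additive/multiplicative distinction is easy to reproduce numerically. The sketch below uses the standard 1.5x/0.6x coin-toss gamble (an illustration, not the paper's experimental design): its expected growth factor exceeds one, yet its time-average growth factor is below one, so a log-utility (time-optimal) agent declines a gamble that an expected-wealth maximiser accepts.

```python
# Ensemble vs time-average growth for a multiplicative coin-toss gamble.
import numpy as np

rng = np.random.default_rng(3)
coin = rng.integers(0, 2, 100_000)                # fair coin per round
factors = np.where(coin == 1, 1.5, 0.6)           # wealth multipliers

expected_factor = 0.5 * 1.5 + 0.5 * 0.6           # 1.05 > 1 (looks attractive)
time_avg_growth = np.exp(np.log(factors).mean())  # ~sqrt(1.5*0.6) ≈ 0.949 < 1

print(f"ensemble (expected) growth factor: {expected_factor:.3f}")
print(f"time-average growth factor:        {time_avg_growth:.3f}")
```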
arXiv
We compare the Malliavin-Mancino and Cuchiero-Teichmann Fourier instantaneous estimators to investigate the impact of the Epps effect arising from asynchrony on the instantaneous estimates. We demonstrate the instantaneous Epps effect in a simulation setting and provide a simple method to ameliorate it. We find that using previous-tick interpolation in the Cuchiero-Teichmann estimator results in unstable estimates when dealing with asynchrony, while the Malliavin-Mancino estimator's ability to bypass the time domain allows it to produce stable estimates, making it better suited for ultra-high-frequency finance. An empirical analysis using Trade and Quote data from the Johannesburg Stock Exchange illustrates the instantaneous Epps effect and how intraday correlation dynamics can vary between days for the same equity pair.
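A simulation sketch of the underlying (integrated) Epps effect: two correlated diffusions observed asynchronously and synchronised with previous-tick interpolation show measured correlations that decay as the sampling interval shrinks. The sampling scheme and parameters below are assumptions, and this is deliberately simpler than the instantaneous Fourier estimators the paper compares.

```python
# Epps effect from asynchrony under previous-tick interpolation.
import numpy as np

rng = np.random.default_rng(4)
n, rho = 100_000, 0.7
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
p1, p2 = np.cumsum(z1), np.cumsum(z2)        # latent synchronous log prices

# each asset trades only at its own random times (asynchrony)
t1 = np.sort(rng.choice(n, size=n // 20, replace=False))
t2 = np.sort(rng.choice(n, size=n // 20, replace=False))

def prev_tick(path, obs_times, grid):
    """Sample path at grid times using the last observation at or before each."""
    idx = np.searchsorted(obs_times, grid, side="right") - 1
    return path[obs_times[np.clip(idx, 0, None)]]

for dt in (10, 100, 1000):                   # sampling interval in ticks
    grid = np.arange(dt, n, dt)
    r1 = np.diff(prev_tick(p1, t1, grid))
    r2 = np.diff(prev_tick(p2, t2, grid))
    print(f"dt={dt:5d}: measured corr = {np.corrcoef(r1, r2)[0, 1]:.3f}")
```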
SSRN
This paper studies the heterogeneous effects of the COVID-19 outbreak on stock prices in China and the underlying mechanisms from multiple perspectives. First, based on an event study and panel data regressions, we find that the spread of the epidemic has a significant negative impact on stock market returns. However, this effect is heterogeneous across industries. In particular, stocks in the pharmaceutical manufacturing industry and its upstream industry, the chemical industry, even benefit from the epidemic. Second, we construct a fear sentiment index using the search volume of COVID-19-related words and find that the panic in the early stages also significantly reduced stock market returns. Third, we examine the underlying mechanism through four firm characteristics. The results show that companies with high asset intensity, low labor intensity, high inventory-to-revenue ratios, and small market value are more negatively affected. We argue that labor employment in state-owned enterprises is less flexible, so labor-intensive state-owned firms perform worse because of higher idle labor costs; our evidence strongly supports this hypothesis. Finally, we create a new index based on the WIOD input-output database to measure the relative position of an industry in the supply chain. Our findings show that companies located downstream are more vulnerable to the COVID-19 outbreak.
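For reference, the first method the abstract names can be sketched in a few lines: a market-model event study estimates alpha and beta in a pre-event window and cumulates abnormal returns over the event window. The simulated returns and window choices below are placeholders.

```python
# Market-model event study computing a cumulative abnormal return (CAR).
import numpy as np

rng = np.random.default_rng(5)
market = rng.normal(0, 0.01, 160)
stock = 0.0002 + 1.2 * market + rng.normal(0, 0.01, 160)
stock[130:] -= 0.004                         # stylised post-outbreak drag

est, event = slice(0, 120), slice(130, 160)  # estimation / event windows
beta, alpha = np.polyfit(market[est], stock[est], 1)
abnormal = stock[event] - (alpha + beta * market[event])
print(f"CAR over event window: {abnormal.sum():.2%}")
```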
SSRN
Institutional investors' role in shareholder voting is among the most hotly debated subjects in corporate governance. Some argue that institutions lack adequate incentives to effectively monitor managers; others contend that the largest institutions have developed analytical resources that produce informed votes. But little attention has been paid to the tradeoff these institutions face between voting their shares and earning profits, both for themselves and for the ultimate beneficiary of institutional funds, by lending those shares. Using a unique dataset and a recent change in SEC rules as an empirical setting, we document a substantial increase in the degree to which large institutions lend shares rather than cast votes in corporate elections. We show that, after the SEC clarified funds' power to lend shares rather than vote them at shareholder meetings, institutions supplied 58% more shares for lending immediately prior to those meetings. The change is concentrated in stocks with high index fund ownership; a difference-in-differences approach shows that supply increases from 15.6% to 22.3% at those stocks. Even when it comes to proxy fights, we show, stocks with high index ownership see a marked increase in shares available for lending immediately prior to the meeting. Overall, we show that loosening the legal constraints on institutional share lending has had significant implications for how index funds balance the lending-voting tradeoff.
arXiv
This research develops a model-based LAtent Causal Socioeconomic Health (LACSH) index at the national level. We build upon the latent health factor index (LHFI) approach that has been used to assess unobservable ecological/ecosystem health. This framework integratively models the relationship between metrics, the latent health, and the covariates that drive the notion of health. In this paper, the LHFI structure is integrated with spatial modeling and statistical causal modeling so as to evaluate the impact of a continuous policy variable (in our two case studies, mandatory maternity leave days and government expenditure on healthcare) on a nation's socioeconomic health, while formally accounting for spatial dependency among nations. A novel visualization technique for evaluating covariate balance is also introduced for the case of a continuous policy (treatment) variable. We apply our LACSH model to countries around the world, using data on various metrics and potential covariates pertaining to different aspects of societal health. The approach is structured in a Bayesian hierarchical framework, and results are obtained by Markov chain Monte Carlo techniques.
arXiv
Savage (1972) lays down the foundation of Bayesian decision theory but asserts that it is not applicable in big worlds, where the environment is complex. Using the theory of finite automata to model belief formation, this paper studies the characteristics of optimal learning behavior in small and big worlds, where the complexity of the environment is, respectively, low and high relative to the cognitive ability of the decision maker. Confirming Savage's claim, optimal learning behavior is close to Bayesian in small worlds but significantly different in big worlds. In addition, I show that in big worlds the optimal learning behavior can exhibit a wide range of well-documented non-Bayesian learning behaviors, including the use of heuristics, correlation neglect, persistent over-confidence, inattentive learning, and other behaviors of model simplification or misspecification. These results establish a clear and testable relationship between the prominence of non-Bayesian learning behavior, complexity, and cognitive ability.
SSRN
We propose a novel measure of workers' financial incentives based on within-establishment wage differences among similar workers from the same occupation. This measure captures all forms of incentive pay that lead to worker-employer-specific pay premiums, including explicit (e.g., bonuses) and implicit (e.g., tournaments) forms. We estimate the measure using a linked worker-establishment-firm dataset that covers 31 million workers in Germany. For validation, we exploit survey-based information on performance pay and variation in monitoring costs due to occupational characteristics, establishment size, and task complexity, and show that the measure behaves as theoretically predicted. Applying the measure yields evidence that workers' incentives positively correlate with firms' performance and innovativeness, which supports a positive relationship between incentives and effort.
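A minimal sketch of how such a measure can be computed: within-establishment wage dispersion among workers in the same occupation cell, here via a pandas groupby on a hypothetical toy panel (the column names and values are assumptions).

```python
# Within establishment-occupation cell wage dispersion on toy data.
import pandas as pd

df = pd.DataFrame({
    "establishment": [1, 1, 1, 2, 2, 2],
    "occupation":    ["A", "A", "A", "A", "A", "A"],
    "wage":          [3000, 3400, 3100, 2900, 2950, 2905],
})
incentive = (
    df.groupby(["establishment", "occupation"])["wage"]
      .std()
      .rename("within_cell_wage_dispersion")
)
print(incentive)
```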
arXiv
The complexity of financial markets arises from the strategic interactions among agents trading stocks, which manifest in the form of vibrant correlation patterns among stock prices. Over the past few decades, complex financial markets have often been represented as networks whose interacting pairs of nodes are stocks, connected by edges that signify the correlation strengths. However, many interactions occur in groups of three or more nodes; these cannot be described by pairwise interactions alone, and the relations between such interactions must also be taken into account. Only recently have researchers started devoting attention to the higher-order architecture of complex financial systems, which can significantly enhance our ability to estimate systemic risk as well as measure the robustness of financial systems in terms of market efficiency. Geometry-inspired network measures, such as the Ollivier-Ricci curvature and Forman-Ricci curvature, can be used to capture network fragility and continuously monitor financial dynamics. Here, we explore the utility of such discrete Ricci-type curvatures in characterizing the structure of financial systems and evaluate them as generic indicators of market instability. For this purpose, we examine the daily returns of a set of stocks comprising the USA S&P-500 and the Japanese Nikkei-225 over a 32-year period and monitor the changes in the edge-centric network curvatures. We find that the different geometric measures capture the system-level features of the market well, allowing us to distinguish between normal or 'business-as-usual' periods and all the major market crashes. This can be very useful in the strategic design of financial systems and in regulating markets to tackle financial instabilities.
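In its simplest (unweighted) form, the Forman-Ricci curvature of an edge e = (u, v) in a network reduces to F(e) = 4 - deg(u) - deg(v). The sketch below applies this to a thresholded correlation network built from simulated returns; the threshold and data are assumptions, and the paper's weighted variants refine this formula.

```python
# Combinatorial Forman-Ricci curvature on a thresholded correlation network.
import numpy as np
import networkx as nx

rng = np.random.default_rng(6)
returns = rng.standard_normal((250, 30))     # stand-in for daily stock returns
corr = np.corrcoef(returns.T)

G = nx.Graph()
n = corr.shape[0]
for i in range(n):
    for j in range(i + 1, n):
        if corr[i, j] > 0.1:                 # illustrative threshold
            G.add_edge(i, j)

forman = {(u, v): 4 - G.degree(u) - G.degree(v) for u, v in G.edges()}
print("mean edge curvature:", np.mean(list(forman.values())))
```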
arXiv
This work studies how including a wider range of participants in the strategy-making process affects the performance of organizations operating in moderately or highly complex environments. Agent-based simulation demonstrates that the increased number of ideas generated by larger, diverse crowds, together with subsequent preference aggregation, leads to the rapid discovery of higher peaks in the organization's performance landscape. However, this is not the case when the expansion in the number of participants is small. The results confirm the most frequently mentioned benefit in the Open Strategy literature: the discovery of better-performing strategies.
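The claimed mechanism can be reduced to a toy landscape search: larger crowds propose more candidate strategies, and aggregation (here, simply adopting the best-performing proposal) discovers higher peaks. This is a deliberately simplified stand-in for the paper's agent-based model.

```python
# More proposals from a larger crowd find higher peaks on a random landscape.
import numpy as np

rng = np.random.default_rng(7)
landscape = rng.random(10_000)               # performance of each candidate strategy

for crowd in (2, 10, 100):
    ideas = rng.choice(len(landscape), size=crowd, replace=False)
    print(f"crowd of {crowd:3d}: best discovered performance = {landscape[ideas].max():.3f}")
```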
SSRN
Stablecoins, usually represented by and associated with the dominant Tether (USDT) token, have evolved into an important mechanism of the cryptoassets system. Their main objective is to enable easy transactions between cryptoasset exchanges at a stable exchange rate, mostly through pegging to, and being backed by, a fiat currency or a physical asset. This backing has itself become a controversial topic for the most dominant stablecoin, and its role in the skyrocketing cryptoasset prices of 2017 has attracted much speculation in the community. However, actual research interest in stablecoins and their role in cryptoasset price dynamics has been rather scarce. Herein, we provide a detailed analysis of the interactions and dynamics between a set of 28 stablecoins, Bitcoin as the most dominant cryptoasset, and altcoins to examine whether stablecoins in fact induce price movements. We provide evidence that stablecoins mostly reflect an increasing demand for investing in cryptoassets rather than serving as a boosting mechanism for periods of extreme appreciation. We further discuss some specificities of 2017, even though the general dynamic patterns remain very similar to the overall behavior. In sum, we do not find support for various scandalous claims about stablecoins.
arXiv
Here we propose two alternatives to Black 76 for valuing European options on futures contracts whose underlying market prices can be negative or mean-reverting. The two proposed models are the Ornstein-Uhlenbeck (OU) process and continuous-time GARCH (generalized autoregressive conditional heteroscedasticity). We then analyse the resulting values and compare them with Black 76, the most commonly used model, when the underlying market prices are positive.
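A sketch contrasting the two approaches: the Black 76 closed form for a call on a futures price, against a Monte Carlo price under Ornstein-Uhlenbeck dynamics dF = kappa*(theta - F)dt + sigma*dW, which permits negative, mean-reverting prices. All parameter values are illustrative, and the continuous-time GARCH alternative is omitted.

```python
# Black 76 closed-form call vs Monte Carlo call under OU futures dynamics.
import numpy as np
from scipy.stats import norm

F0, K, T, r = 50.0, 50.0, 1.0, 0.02

# Black 76 (lognormal futures; sigma_ln is a proportional volatility)
sigma_ln = 0.3
d1 = (np.log(F0 / K) + 0.5 * sigma_ln**2 * T) / (sigma_ln * np.sqrt(T))
d2 = d1 - sigma_ln * np.sqrt(T)
black76 = np.exp(-r * T) * (F0 * norm.cdf(d1) - K * norm.cdf(d2))

# OU: F_T is Gaussian, so it can be sampled exactly (prices may go negative)
kappa, theta, sigma_ou = 1.0, 50.0, 15.0     # sigma_ou in price units
rng = np.random.default_rng(8)
mean_T = theta + (F0 - theta) * np.exp(-kappa * T)
std_T = sigma_ou * np.sqrt((1 - np.exp(-2 * kappa * T)) / (2 * kappa))
F_T = mean_T + std_T * rng.standard_normal(1_000_000)
ou_call = np.exp(-r * T) * np.maximum(F_T - K, 0).mean()

print(f"Black 76 call: {black76:.3f}   OU call: {ou_call:.3f}")
```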
SSRN
This paper examines Epstein-Zin recursive utility with quasi-hyperbolic discounting in continuous time. I define the utility process directly and, as an application, consider a Merton-type optimal consumption-investment problem. I show that a solution to the Hamilton-Jacobi-Bellman equation is the value function. Numerical comparative statics and mathematical analysis show that, unlike in the constant relative risk aversion utility, present bias in the Epstein-Zin utility causes economically significant overconsumption while maintaining a plausible attitude toward risk. Additionally, I show that the sophisticated agent's preproperation occurs if and only if his or her elasticity of intertemporal substitution in consumption is larger than one.
arXiv
We consider the optimal reinsurance problem from the point of view of a direct insurer owning several dependent risks, assuming a maximal expected utility criterion and independent negotiation of reinsurance for each risk. Without any particular hypothesis on the dependency structure, we show that optimal treaties exist in a class of independent randomized contracts. We derive optimality conditions and show that under mild assumptions the optimal contracts are of classical (non-randomized) type. A specific form of the optimality conditions applies in that case. We illustrate the results with some numerical examples.
arXiv
In this paper we investigate a utility maximization problem with drift uncertainty in a multivariate continuous-time Black-Scholes type financial market which may be incomplete. We impose a constraint on the admissible strategies that prevents a pure bond investment, and we include uncertainty by means of ellipsoidal uncertainty sets for the drift. Our main results are, first, an explicit representation of the optimal strategy and the worst-case parameter; second, a minimax theorem that connects our robust utility maximization problem with the corresponding dual problem; and third, a proof that, as the degree of model uncertainty increases, the optimal strategy converges to a generalized uniform diversification strategy.
arXiv
The paper develops multiplicative compensation for complex-valued semimartingales and studies some of its consequences. It is shown that the stochastic exponential of any complex-valued semimartingale with independent increments becomes a true martingale after multiplicative compensation, where such compensation is meaningful. This generalization of the Lévy-Khintchine formula fills an existing gap in the literature. We further report Girsanov-type results based on non-negative multiplicatively compensated semimartingales. In particular, we obtain a simplified expression for the multiplicative compensator under the new measure.
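For background (textbook material, not a result of the paper): the stochastic exponential being compensated is the Doléans-Dade exponential, which for a semimartingale X with continuous local martingale part X^c takes the form

```latex
\mathcal{E}(X)_t \;=\; \exp\!\Big(X_t - X_0 - \tfrac{1}{2}\langle X^c \rangle_t\Big)
\prod_{0 < s \le t} \big(1 + \Delta X_s\big)\, e^{-\Delta X_s}.
```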
arXiv
We develop a stochastic calculus that makes it easy to capture a variety of predictable transformations of semimartingales such as changes of variables, stochastic integrals, and their compositions. The framework offers a unified treatment of real-valued and complex-valued semimartingales. The proposed calculus is a blueprint for the derivation of new relationships among stochastic processes, with specific examples provided in the paper.
arXiv
In many labor markets, workers and firms are connected via affiliative relationships. A management consulting firm wishes both to hire the best new workers and to place its current affiliated workers at strong firms. Similarly, a research university wishes to hire strong job market candidates while also placing its own candidates at strong peer universities. We model this affiliate matching problem as a generalization of the classic stable marriage setting by permitting firms to state preferences not only over which workers they are matched with, but also over the firms to which their affiliated workers are matched. Based on results from a human survey, we find that participants (acting as firms) give preference to their own affiliated workers in surprising ways that violate some assumptions of the classical stable marriage problem. This motivates a nuanced discussion of how stability should be defined in affiliate matching problems; we give an example of a marketplace that admits a stable match under one natural definition of stability but not under a different, yet still natural, definition. We conclude by setting a research agenda toward the creation of a centralized clearing mechanism in this general setting.
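As a baseline for the setting the paper generalizes, here is a minimal worker-proposing deferred-acceptance (Gale-Shapley) sketch for the classical problem; the paper's affiliate preferences are deliberately not modelled, and the toy preference lists are assumptions.

```python
# Worker-proposing deferred acceptance for the classical stable marriage setting.
def deferred_acceptance(worker_prefs, firm_prefs):
    """Each prefs dict maps an agent to its ranked list of the other side."""
    rank = {f: {w: i for i, w in enumerate(ws)} for f, ws in firm_prefs.items()}
    free = list(worker_prefs)                # workers not yet (tentatively) matched
    nxt = {w: 0 for w in worker_prefs}       # index of each worker's next proposal
    match = {}                               # firm -> tentatively held worker
    while free:
        w = free.pop()
        f = worker_prefs[w][nxt[w]]
        nxt[w] += 1
        if f not in match:
            match[f] = w
        elif rank[f][w] < rank[f][match[f]]: # firm prefers the new proposer
            free.append(match[f])
            match[f] = w
        else:
            free.append(w)
    return {w: f for f, w in match.items()}

workers = {"w1": ["f1", "f2"], "w2": ["f1", "f2"]}
firms = {"f1": ["w2", "w1"], "f2": ["w1", "w2"]}
print(deferred_acceptance(workers, firms))   # {'w2': 'f1', 'w1': 'f2'}
```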
SSRN
The purpose of this study is to present a unique database on commercialized patents and to illustrate how it can be used to analyze the commercialization process of patents. The dataset is based on a survey, with a remarkably high response rate of 80 percent, of Swedish patents owned by inventors and small firms. It contains key variables on commercialization not found anywhere else, including whether, when and how (acquisition, licensing, existing or new firm) patents were commercialized, as well as whether this commercialization was profitable. Thus, this patent database measures technological innovation. The dataset is complemented with indicators of patent quality (patent renewal, forward citations, and patent family) from archival sources. Basic statistics for the key variables are described. Finally, the scientific output in terms of articles published in peer-reviewed journals shows how this database can be used to analyze the commercialization process of patents. The dataset has, for instance, been used to 1) evaluate government loan programs for inventors; 2) analyze the different roles of the inventor and the Schumpeterian entrepreneur during commercialization; 3) estimate the transfer of tacit knowledge when patents are sold or licensed; and 4) analyze the entry strategies of inventors in oligopolistic markets.