# Research articles for 2019-06-23

arXiv

Blockchain technology, and more specifically Bitcoin (one of its foremost applications), have been receiving increasing attention in the scientific community. The first publications with Bitcoin as a topic can be traced back to 2012. In spite of this short time span, the volume of production (1,162 papers) warrants a bibliometric study to identify research clusters, emerging topics, and leading scholars. Our paper studies the scientific production around Bitcoin alone, excluding other blockchain applications. We therefore restricted our search to papers indexed in the Web of Science Core Collection whose topic is "bitcoin". This database covers disciplines as diverse as economics, engineering, mathematics, and computer science. The resulting bibliometric study maps the current state and trends of Bitcoin-related research across scientific disciplines.

arXiv

Traditional sentiment construction in finance relies heavily on dictionary-based approaches, with a few exceptions using simple machine-learning techniques such as the Naive Bayes classifier. Since the current literature has not yet exploited recent rapid advances in natural language processing, we construct a textual sentiment index using BERT, a model recently developed by Google, for three actively traded individual stocks in the Hong Kong market that are heavily discussed on Weibo.com. On the one hand, we demonstrate a significant improvement from applying BERT to sentiment analysis compared with existing models. On the other hand, by combining it with two other methods commonly used to build sentiment indices in the financial literature, namely option-implied and market-implied approaches, we propose a more general and comprehensive framework for financial sentiment analysis. We further provide convincing evidence of the predictability of individual stock returns for the three stocks above using an LSTM (which provides a nonlinear mapping), in contrast to the dominant econometric methods in sentiment-influence analysis, which are all variants of linear regression.
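The Naive Bayes baseline that the abstract contrasts with BERT can be illustrated with a minimal classifier; the tiny labelled corpus and whitespace tokenization below are illustrative assumptions, not the paper's Weibo data or its actual pipeline.

```python
from collections import Counter
from math import log

# Toy labelled corpus (illustrative only -- the paper uses Weibo posts).
train = [
    ("strong earnings beat expectations", "pos"),
    ("stock rallies on upbeat guidance", "pos"),
    ("shares plunge after profit warning", "neg"),
    ("weak demand hurts quarterly results", "neg"),
]

# Per-class word frequencies, for add-one (Laplace) smoothing below.
class_docs = Counter(label for _, label in train)
word_counts = {"pos": Counter(), "neg": Counter()}
for text, label in train:
    word_counts[label].update(text.split())
vocab = {w for counts in word_counts.values() for w in counts}

def classify(text):
    """Return the class with the highest log-posterior."""
    scores = {}
    for label in ("pos", "neg"):
        total = sum(word_counts[label].values())
        score = log(class_docs[label] / len(train))  # log prior
        for w in text.split():
            # Laplace-smoothed log likelihood of each token.
            score += log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("profit warning hurts shares"))
```

A BERT-based index replaces this bag-of-words likelihood with contextual embeddings, which is where the abstract reports its accuracy gains.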

arXiv

Cost surfaces are a quantitative means of assigning the social, environmental, and engineering costs that impact movement across landscapes. They are a crucial component of route optimization and least-cost-path (LCP) calculations and are used in a wide range of disciplines, including computer science, landscape ecology, and energy infrastructure modeling. Linear features expose a key weakness of traditional routing calculations along cost surfaces: such calculations cannot determine whether moving from a cell to an adjacent neighbor crosses a linear barrier (increased cost) or follows a corridor (reduced cost). Following or avoiding linear features can drastically change predicted routes. In this paper, we introduce an approach to this "adjacency" issue using a search kernel that identifies these critical barriers and corridors. We have built this approach into a new Java-based open-source software package called CostMAP (cost surface multi-layer aggregation program), which calculates cost surfaces and cost networks using the search kernel. Beyond the new adjacency capability, CostMAP is a versatile multi-platform package that allows users to input multiple GIS data layers and to set weights and rules for developing a weighted cost network. We compare CostMAP with traditional cost surface approaches and show significant performance gains, both in following corridors and in avoiding barriers, using examples from movement ecology and from pipeline routing for carbon capture and storage (CCS). We also demonstrate that the software can straightforwardly calculate cost surfaces at a national scale.
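The adjacency issue can be sketched with a toy least-cost-path search on a grid. Here barrier crossings are modelled as a set of penalized cell-pair edges checked during each move; this is an illustrative simplification of the idea behind CostMAP's search kernel, not its actual algorithm (CostMAP itself is Java-based).

```python
import heapq

def least_cost_path(cost, start, goal, barriers, barrier_penalty=10.0):
    """Dijkstra on a 4-connected grid. 'barriers' holds frozensets of
    cell pairs whose shared edge crosses a linear feature; stepping
    across one multiplies the move cost by 'barrier_penalty'."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = (cost[r][c] + cost[nr][nc]) / 2  # edge cost
                if frozenset([(r, c), (nr, nc)]) in barriers:
                    step *= barrier_penalty  # crossing a linear barrier
                nd = d + step
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return float("inf")

grid = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
# Penalized edges along part of the middle row force a detour.
barriers = {frozenset([(0, 1), (1, 1)]), frozenset([(0, 0), (1, 0)])}
print(least_cost_path(grid, (0, 0), (2, 0), barriers))
```

A routing algorithm without the edge check would walk straight through the barrier at face-value cost; with it, the detour around the linear feature becomes the least-cost route.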

arXiv

In this paper, we present a discrete-type approximation scheme to solve continuous-time optimal stopping problems based on fully non-Markovian continuous processes adapted to the Brownian motion filtration. The approximations satisfy suitable variational inequalities which allow us to construct $\epsilon$-optimal stopping times and optimal values in full generality. Explicit rates of convergence are presented for optimal values based on reward functionals of path-dependent SDEs driven by fractional Brownian motion. In particular, the methodology allows us to design concrete Monte-Carlo schemes for non-Markovian optimal stopping time problems as demonstrated in the companion paper by Bezerra, Ohashi and Russo.

arXiv

We map stock market interactions to spin models to recover their hierarchical structure using a simulated-annealing-based Super-Paramagnetic Clustering (SPC) algorithm. This is compared directly with a modified maximum-likelihood implementation that we call Fast Super-Paramagnetic Clustering (f-SPC). The methods are first applied to standard toy test-case problems and then to a data set of 447 stocks traded on the New York Stock Exchange (NYSE) over 1249 days. The signal-to-noise ratio of stock market correlation matrices is briefly considered. Our results recover clusters approximately representative of standard economic sectors, as well as mixed clusters whose dynamics shed light on the adaptive nature of financial markets and raise concerns about the effectiveness of static, industry-based financial market classification in a world of real-time data analytics. A key result is that f-SPC maximum-likelihood solutions converge to those found within the super-paramagnetic phase, where the entropy is maximal, and those solutions are qualitatively better for high-dimensionality data sets.
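A common preprocessing step in correlation-based clustering of stocks (a standard transform in this literature, not the SPC algorithm itself) converts the correlation matrix into a metric distance, $d_{ij} = \sqrt{2(1-\rho_{ij})}$, so that highly correlated stocks sit close together. The 3x3 matrix below is illustrative, not the paper's NYSE data.

```python
from math import sqrt

# Illustrative correlation matrix for three hypothetical stocks.
rho = [
    [1.0, 0.8, 0.1],
    [0.8, 1.0, 0.2],
    [0.1, 0.2, 1.0],
]

def corr_to_dist(rho):
    """Map correlations to the metric distance d = sqrt(2 * (1 - rho))."""
    n = len(rho)
    return [[sqrt(2.0 * (1.0 - rho[i][j])) for j in range(n)] for i in range(n)]

d = corr_to_dist(rho)
# Highly correlated pairs get small distances; weakly correlated, large ones.
print(round(d[0][1], 3), round(d[0][2], 3))
```

Any hierarchical or spin-model clustering then operates on such a distance (or directly on the correlations), which is also where the signal-to-noise concerns mentioned in the abstract enter.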

SSRN

This study shows how primary market supply influences the secondary market liquidity of outstanding bonds. Liquidity is higher around new bond issuance by the same issuer and in the same maturity segment. It rises once the new issue is priced and remains elevated for several days. The effect is mostly attributable to switch trades between old and new bonds. It increases with the volume issued and decreases with the amount of similar paper outstanding. The liquidity surge is positively linked to the new bond's attractiveness and is stronger during times of positive market sentiment.

arXiv

The investment risk minimization problem with budget and return constraints has been studied using replica analysis, but the extant literature has shortcomings. In light of Tobin's separation theorem and the capital asset pricing model, it is necessary to investigate the implications of a risk-free asset and examine its influence on the optimal portfolio. Accordingly, in this work we explore the investment risk minimization problem in the presence of a risk-free asset with budget and return constraints. Moreover, we discuss opportunity loss, the Pythagorean theorem of the Sharpe ratio, and Tobin's separation theorem.
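Tobin's separation theorem, referenced above, can be stated in its standard mean-variance textbook form (this is the classical result, not the paper's replica-analysis derivation): with a risk-free rate $r_f$, every optimal portfolio mixes the risk-free asset with the same tangency portfolio of risky assets.

```latex
% Mean vector \mu, covariance matrix \Sigma of the risky assets:
w^{*} \;\propto\; \Sigma^{-1}\left(\mu - r_f \mathbf{1}\right),
\qquad
S_p \;=\; \frac{\mu_p - r_f}{\sigma_p}\quad\text{(Sharpe ratio)}.
```

Only the split between the risk-free asset and $w^{*}$ varies with risk aversion, which is why the presence of a risk-free asset changes the structure of the optimal portfolio.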

arXiv

Statistical and multiscaling characteristics of WTI crude oil prices, expressed in US dollars relative to the most traded currencies as well as to gold futures and E-mini S$\&$P500 futures prices, are studied using 5-minute intraday recordings over the period January 2012 - December 2017. It is shown that in most cases the tails of the return distributions of the considered financial instruments follow the inverse cubic power law. The only exception is the Russian ruble, for which the distribution tail is heavier and scales with an exponent close to 2. From the multiscaling perspective, the analysed time series reveal a multifractal organization with left-sided asymmetry of the corresponding singularity spectra. Moreover, all the considered financial instruments appear to be multifractally cross-correlated with oil, especially at the level of medium-size fluctuations, as shown by multifractal cross-correlation analysis (MFCCA) and the detrended cross-correlation coefficient $\rho_q$. The degree of such cross-correlations varies among the financial instruments, with the strongest ties to oil characterizing the currencies of the oil-extracting countries. The strength of this multifractal coupling also appears to depend on the oil market trend: over the analysed period, the level of cross-correlations systematically increased during the bear phase of the oil market and saturated after the trend reversal in the first half of 2016. The same methodology is also applied to identify possible causal relations between the considered observables. A search for related asymmetry in the information flow mediating the cross-correlations indicates that, over the period considered, it was the oil price that led the Russian ruble rather than vice versa.
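The inverse cubic power law invoked above is conventionally expressed through the cumulative distribution of absolute returns (standard notation, not reproduced from the paper):

```latex
P\left(|r| > x\right) \;\sim\; x^{-\gamma},
\qquad \gamma \approx 3 ,
```

with the Russian ruble instead scaling with $\gamma \approx 2$, i.e. a heavier tail than the other instruments considered.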

arXiv

We characterize the asymptotic small-time and large-time implied volatility smile for the rough Heston model introduced by El Euch, Jaisson and Rosenbaum. We show that the asymptotic short-maturity smile scales in qualitatively the same way as for a general rough stochastic volatility model, and is characterized by the Fenchel-Legendre transform of the solution of a Volterra integral equation (VIE). The solution of this VIE satisfies a space-time scaling property which simplifies its computation. We corroborate our results numerically with Monte Carlo simulations. We also compute a power series in the log-moneyness variable for the asymptotic implied volatility, which yields tractable expressions for the vol skew and convexity and is thus useful for calibration purposes. We further derive formal asymptotics for the small-time moderate deviations regime and a formal saddlepoint approximation for call options in the large deviations regime. This goes to higher order than previous works on rough models and, in particular, captures the effect of the mean-reversion term. In the large-maturity case, the limiting asymptotic smile turns out to be the same as for the standard Heston model, for which there is a well-known closed-form formula in terms of the SVI parametrization.
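For reference, the SVI parametrization mentioned above is usually written in Gatheral's raw form, giving the total implied variance $w$ as a function of log-moneyness $k$ (the standard formulation, with parameters $a$, $b$, $\rho$, $m$, $\sigma$):

```latex
w(k) \;=\; a + b\left(\rho\,(k - m) + \sqrt{(k - m)^{2} + \sigma^{2}}\right).
```

It is this five-parameter shape that the large-maturity Heston smile matches in closed form.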

SSRN

The paper shows that the probability that an inverted yield curve predicts a recession is less than 3.9%, and is not even statistically significant. Why, then, do investors still see a linkage between an inverted curve and recession? Research in behavioral psychology demonstrates that, for the majority of people, bad events (such as the 2007 crisis) register more strongly and for longer than good events, and remain vivid in investors' memory. Finally, we show that the strongest and best predictor of recession is current GDP growth.

SSRN

We propose that sustainable finance become mainstream finance. Embedding ESG principles as a new normal for finance will facilitate its alignment with the broader objectives of society; the engagement of the private sector in sustainability discussions and action plans, including Agenda 2030; and, by systematically pursuing risk reduction and the provision of stable long-term returns, the achievement of the G20's goal of "solid, balanced, sustainable and inclusive growth". We propose that the G20 support this process by providing a comprehensive conceptual vision, an evolving roadmap, operational coordination, and forward guidance to the multiple actors involved, as well as the problem-solving capacity to tackle the numerous obstacles that will emerge in this transition.

SSRN

We estimate the value of equity analyst research, motivated by regulatory changes such as MiFID II, which unbundles equity research and trading functions. We find that changes in target prices (CTPs) of equity analysts, even as early as 120 days before a rating change, can accurately predict actual credit rating changes by all rating agencies in the United States and Europe, even during a financial crisis. The accuracy of CTPs as a predictor of credit rating actions is as high as 78%, even after controlling for actual stock price changes, and this finding is robust to outlook and watch-list effects. For our sample of firms that had a credit event, we estimate the value of analyst research to average about $15.5 billion for a downgrade and about $5.2 billion for an upgrade. Unconditionally, we estimate the value of analyst research at about $1.8 billion when the CTP predicts a future decline in stock prices and about $1.1 billion when the CTP predicts a future increase. We conclude that the economic value of analyst research is indeed sizable, contrary to the conclusions of most of the extant literature.