# Research articles for 2019-12-01

arXiv

We propose to study electricity capacity remuneration mechanism design through a Principal-Agent approach. The Principal represents the aggregation of electricity consumers (or a representative entity), subject to the physical risk of shortage, and the Agent represents the electricity capacity owners, who invest in capacity and produce electricity to satisfy consumers' demand, and are subject to financial risks. Following the methodology of Cvitanic et al. (2017), we propose an optimal contract, from the consumers' perspective, which complements the revenue capacity owners achieve from the spot energy market and incentivizes both parties to perform an optimal level of investment while sharing the physical and financial risks. Numerical results provide insights into the necessity of a capacity remuneration mechanism and show that this necessity becomes especially pronounced when the level of uncertainty on the demand or production side increases.

arXiv

Issues such as urban sprawl, congestion, oil dependence, climate change and public health are prompting urban and transportation planners to turn to land use and urban design to rein in automobile use. One of the implicit beliefs in this effort is that the right land-use policies will, in fact, help to reduce automobile use and increase the use of alternative modes of transportation. Thus, planners and transport engineers are increasingly viewing land-use policies and lifestyle patterns as a way to manage transportation demand. While a substantial body of work has looked at the relationship between the built environment and travel behaviour, as well as the influence of lifestyles and lifestyle-related decisions on travel mode use and activity behaviour, limited work has been done on capturing these effects simultaneously, or on exploring the effect of intra-household interaction on individual attitudes and beliefs towards travel and activity behaviour and their subsequent influence on lifestyles and modality styles. We therefore propose a framework that captures the concurrent influence of lifestyles and modality styles on both household-level decisions, such as neighbourhood location, and individual-level decisions, such as travel mode choice, using a hierarchical Latent Class Choice Model.

arXiv

This study constructs an integrated early warning system (EWS) that identifies and predicts stock market turbulence. Based on switching ARCH (SWARCH) filtering probabilities of the high-volatility regime, the proposed EWS first classifies stock market crises according to an indicator function with thresholds dynamically selected by the two-peak method. A hybrid algorithm is then developed in the framework of a long short-term memory (LSTM) network to make daily predictions that signal turmoil. In an empirical evaluation based on ten years of Chinese stock data, the proposed EWS yields satisfying results, with a test-set accuracy of $96.6\%$ and an average forewarning period of $2.4$ days. The model's stability and practical value in real-time decision-making are also confirmed by cross-validation and back-testing.
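The crisis-classification step can be sketched as follows; `label_crises` and the fixed threshold of 0.5 are illustrative stand-ins (the paper selects thresholds dynamically via the two-peak method, and the probabilities would come from a SWARCH filter):

```python
import numpy as np

def label_crises(high_vol_prob, threshold):
    """Binary crisis indicator from regime-filtering probabilities.

    `high_vol_prob` holds filtered probabilities of the high-volatility
    regime; a day is labeled a crisis when that probability reaches
    `threshold`.
    """
    return (np.asarray(high_vol_prob) >= threshold).astype(int)

probs = [0.05, 0.12, 0.81, 0.93, 0.40, 0.88]
print(label_crises(probs, threshold=0.5).tolist())  # → [0, 0, 1, 1, 0, 1]
```

These daily labels would then serve as the targets for the LSTM-based prediction stage.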

arXiv

Portfolio traders strive to identify dynamic portfolio allocation schemes so that their total budgets are efficiently allocated over the investment horizon. This study proposes a novel portfolio trading strategy in which an intelligent agent is trained to identify an optimal trading action using deep Q-learning. We formulate a Markov decision process model for the portfolio trading process, and the model adopts a discrete combinatorial action space, determining the trading direction at a prespecified trading size for each asset, to ensure practical applicability. Our novel portfolio trading strategy takes advantage of three features to outperform in real-world trading. First, a mapping function is devised to handle and transform an initially found but infeasible action into a feasible action closest to the originally proposed ideal action. Second, by overcoming the dimensionality problem, this study establishes models of the agent and the Q-network for deriving a multi-asset trading strategy in the predefined action space. Last, this study introduces a technique that derives a well-fitted multi-asset trading strategy by designing an agent that simulates all feasible actions in each state. To validate our approach, we conduct backtests for two representative portfolios and demonstrate superior results over the benchmark strategies.
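The first feature, mapping an infeasible proposed action to the nearest feasible one, can be sketched for a tiny discrete action space; the Hamming-distance metric and the budget/short-sale feasibility rules below are assumptions for illustration, not the paper's exact definitions:

```python
import itertools

def nearest_feasible(action, prices, cash, holdings, trade_size=1):
    """Map a proposed discrete trading action to the closest feasible one.

    `action` is a tuple in {-1, 0, +1}^n: sell / hold / buy one
    `trade_size` lot of each asset.  An action is feasible if it neither
    sells assets we do not hold nor spends more cash than we have.
    Among feasible actions, return the one closest to the proposal in
    Hamming distance (ties broken by enumeration order).
    """
    def feasible(a):
        cost = sum(d * trade_size * p for d, p in zip(a, prices))
        ok_cash = cost <= cash
        ok_hold = all(h + d * trade_size >= 0 for d, h in zip(a, holdings))
        return ok_cash and ok_hold

    candidates = [a for a in itertools.product((-1, 0, 1), repeat=len(action))
                  if feasible(a)]
    return min(candidates, key=lambda a: sum(x != y for x, y in zip(a, action)))

# Proposal buys both assets, but cash only covers one lot of asset 0.
print(nearest_feasible((1, 1), prices=[10.0, 50.0], cash=15.0, holdings=[0, 0]))
# → (1, 0)
```

Exhaustive enumeration only works for a handful of assets; the paper's contribution is precisely making this idea workable in larger action spaces.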

arXiv

The sampling efficiency of MCMC methods in Bayesian inference for stochastic volatility (SV) models is known to depend strongly on the actual parameter values, and the effectiveness of samplers based on different parameterizations varies significantly. We derive novel algorithms for the centered and the non-centered parameterizations of the practically highly relevant SV model with leverage, where the return process and the innovations of the volatility process are allowed to correlate. Moreover, based on the idea of ancillarity-sufficiency interweaving (ASIS), we combine the resulting samplers in order to guarantee stable sampling efficiency irrespective of the baseline parameterization. We carry out an extensive comparison to existing sampling methods for this model using simulated as well as real-world data.

SSRN

Equity crowdfunding is an increasingly international form of digital platform-based entrepreneurial finance. With an unprecedented number of available international investment opportunities, it becomes relevant to ask what leads cross-border investors to invest in certain campaigns and not others. Building on attention literature, we theorize how investor attention drives cross-border investors' investment choices. We test our hypotheses using a unique dataset of cross-border equity crowdfunding investments. We find that investor attention, as proxied by campaign visibility in investor countries and entrepreneur-investor co-nationality, has strong positive effects on cross-border investment tie formation, whereas national distances lose their importance in this digital setting.

SSRN

This paper examines the role of banking sector foreign currency hedging demand in the foreign exchange market. First, the paper documents deviations from covered interest parity for a panel of emerging economies and tests whether resident bank foreign currency hedging needs affect these deviations. Next, I exploit data from Mexican regulatory filings on derivatives transactions and bank balance sheets to assess the impact of FX hedging demand from all resident banks, foreigners, and global banks operating in Mexico. These hedging demand measures are included in an econometric model of covered interest parity (under limits to arbitrage) with tenors from 1 month to 12 months, and then interacted with arbitrageur balance sheet constraint variables to test whether these amplify the impact of FX hedging demand. The main result of the paper is that bank hedging demand directly influences CIP deviations in the EM panel and the case of Mexico, while evidence of interaction effects is mixed. The direct effect of resident bank hedging demand is robust to including foreign exchange bid-ask spreads and arbitrage constraint variables in the regression model. In addition, global banks are the driver of this hedging effect. The results validate an important mechanism in the theoretical literature: that higher bank demand for foreign currency hedging, particularly from global banks, can directly increase the cost of hedging. This paper adds to the literature on CIP deviations by analyzing emerging market currencies, with the unique advantage of using regulatory data and observed FX derivatives transactions.
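As a minimal illustration of the dependent variable, a CIP deviation can be computed from spot, forward, and the two money-market rates; the simple-interest quoting convention below is an assumption for illustration, not necessarily the paper's exact definition:

```python
def cip_basis(r_dom, r_for, spot, forward, tenor_years):
    """Covered-interest-parity deviation (the cross-currency basis).

    Under CIP the forward-implied domestic rate equals the actual
    domestic rate; the basis is the gap, here annualized in simple
    terms with the spot/forward quoted as domestic units per foreign.
    """
    implied_dom = ((forward / spot) * (1 + r_for * tenor_years) - 1) / tenor_years
    return r_dom - implied_dom

# With the forward priced exactly at parity, the basis is zero.
spot, r_d, r_f, t = 19.0, 0.075, 0.02, 0.25
par_forward = spot * (1 + r_d * t) / (1 + r_f * t)
print(round(cip_basis(r_d, r_f, spot, par_forward, t), 10))  # → 0.0
```

A forward priced above `par_forward` would make the basis negative, i.e. hedging foreign-currency exposure costs more than frictionless CIP implies.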

SSRN

This project explored the Black-Litterman framework to construct a portfolio of stocks listed in the S&P 100 index, and tracked the performance of the efficient portfolio against the S&P 100 index. Thirteen technical indicators (viz. RSI, Bollinger Bands, MACD etc.) were used to incorporate the investor's personal views into the Bayesian framework. Performance of an equally-weighted linking matrix was compared against the performance of a distributed linking matrix in the training period of January 2009 to December 2013, wherein the constructed portfolios minimized Expected Shortfall under the said probabilistic model. Excluding transaction costs, the resulting optimized portfolio outperformed the benchmark during the testing period of January 2014 to December 2018 on both a nominal and a risk-adjusted basis, as measured by overall performance and Sharpe ratio.
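The Bayesian combination step can be sketched with the standard Black-Litterman posterior-mean formula; the numbers, the single view, and `tau` below are illustrative, and in the project the views `Q` would be derived from the thirteen technical indicators:

```python
import numpy as np

def bl_posterior(Sigma, pi, P, Q, Omega, tau=0.05):
    """Black-Litterman posterior expected returns.

    Combines equilibrium returns `pi` (prior, covariance tau*Sigma)
    with investor views Q = P @ mu + noise (view covariance `Omega`):
        mu = [(tau*Sigma)^-1 + P' Omega^-1 P]^-1
             [(tau*Sigma)^-1 pi + P' Omega^-1 Q]
    """
    tS_inv = np.linalg.inv(tau * Sigma)
    O_inv = np.linalg.inv(Omega)
    A = tS_inv + P.T @ O_inv @ P
    b = tS_inv @ pi + P.T @ O_inv @ Q
    return np.linalg.solve(A, b)

Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])
pi = np.array([0.05, 0.07])
P = np.array([[1.0, -1.0]])        # one view: asset 1 outperforms asset 2
Q = np.array([0.02])
Omega = np.array([[0.001]])
print(bl_posterior(Sigma, pi, P, Q, Omega))
```

As a sanity check, making `Omega` very large (an uninformative view) pulls the posterior back to the equilibrium prior `pi`.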

arXiv

We propose a model for price formation in financial markets based on clearing of a standard call auction with random orders, and verify its validity for prediction of the daily closing price distribution statistically. The model considers random buy and sell orders, placed following demand- and supply-side valuation distributions; an equilibrium equation then leads to a distribution for clearing price and transacted volume. Bid and ask volumes are left as free parameters, permitting possibly heavy-tailed or very skewed order flow conditions. In highly liquid auctions, the clearing price distribution converges to an asymptotically normal central limit, with mean and variance in terms of supply/demand-valuation distributions and order flow imbalance. By means of simulations, we illustrate the influence of variations in order flow and valuation distributions on price/volume, noting a distinction between high- and low-volume auction price variance. To verify the validity of the model statistically, we predict a year's worth of daily closing price distributions for 5 constituents of the Eurostoxx 50 index; Kolmogorov-Smirnov statistics and QQ-plots demonstrate with ample statistical significance that the model predicts closing price distributions accurately, and compares favourably with alternative methods of prediction.
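A minimal version of the clearing mechanism can be sketched with unit orders; the candidate-price grid and volume-maximization rule below are a simplified stand-in for the paper's equilibrium equation, and the Gaussian valuation distributions are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def clearing(buy_vals, sell_vals):
    """Clearing price and volume of a single call auction with unit orders.

    Buyers buy at any price up to their valuation; sellers sell at any
    price down to theirs.  Scan candidate prices (the pooled valuations)
    and pick the one maximizing transacted volume.
    """
    candidates = np.sort(np.concatenate([buy_vals, sell_vals]))
    best_p, best_v = None, -1
    for p in candidates:
        vol = min((buy_vals >= p).sum(), (sell_vals <= p).sum())
        if vol > best_v:
            best_p, best_v = p, vol
    return best_p, best_v

buys = rng.normal(101, 2, size=500)    # demand-side valuations
sells = rng.normal(99, 2, size=500)    # supply-side valuations
price, volume = clearing(buys, sells)
print(round(price, 2), volume)
```

Replacing the Gaussian draws with heavy-tailed or skewed distributions, or making the order counts unequal, reproduces the order-flow-imbalance experiments described in the abstract.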

SSRN

A portfolio of commodity index-ETF pairs was constructed and its performance was compared against the Bloomberg commodity index. The portfolio was trained on data from 2011 to 2014 to determine the optimal parameters for trade-execution thresholds. Using these parameters and the optimal exposures found from optimizing the risk-adjusted expected returns, the trading strategy was tested from 2015 to 2018 with the portfolio re-balanced at the beginning of each year. Comparing the returns of the portfolio against the Bloomberg Barclays Commodities Index as a benchmark, it was found that the portfolio outperformed the benchmark in total returns and Sharpe ratio for 2015, 2017, and 2018, but it under-performed on both metrics in 2016. Performing a paired t-test on the portfolio and benchmark returns showed that the portfolio does not significantly outperform the benchmark in any given year.

SSRN

Given the erosion in the public's trust in market mechanisms and globalization as a means for delivering inclusive prosperity for all, the creation of effective markets is a priority for policymakers. This paper looks at the creation of open application programming interface (API) standards in banking, and explores the competition problems that they address. It argues that by fundamentally changing the way that consumers buy and use banking services this represents the development of a more entrepreneurial approach to remedying malfunctioning markets. It also underlines the importance of competition authorities being able to investigate market failures on the demand side (and to take action to resolve those failures), and notes that these remedies may have consequences for other markets where consumers lack property rights over the data that is collected on their behaviour.

SSRN

Complementing a political motivation proposed by Stiglitz (2012), we identify a self-perpetuating channel of income inequality, motivated by neoclassical corporate finance (corporate financing in the pecking order): a significant positive impact of macro-level income inequality on wealth inequality. Our empirical analysis corroborates this; for 15,812 firms operating in 41 developed and developing countries, we find evidence of a significant positive impact of national income inequality on firm ownership concentration. We employ panel ordered logit estimations and show that the effect remains observable after controlling for other firm-, industry-, and country-level factors identified in the literature as affecting firm ownership concentration, and that it is robust to their potential endogeneity to the extent of employing temporally preceding observations. The effect is also robust to residual heteroskedasticity, alternative measures of income inequality and firm size, sample restrictions to high-income and OECD member economies, and accounting for the three-level hierarchical arrangement of firms operating in industries which have a unique operational presence in a country. This also contributes to the literature on firm ownership concentration by identifying a new determinant with theoretical motivation and empirical validation.

arXiv

Few assets in financial history have been as notoriously volatile as cryptocurrencies. While the long-term outlook for this asset class remains unclear, we are successful in making short-term price predictions for several major crypto assets. Using historical data from July 2015 to November 2019, we develop a large number of technical indicators to capture patterns in the cryptocurrency market. We then test various classification methods to forecast short-term future price movements based on these indicators. On both PPV and NPV metrics, our classifiers do well in identifying up and down market moves over the next 1 hour. Beyond evaluating classification accuracy, we also develop a strategy for translating 1-hour-ahead class predictions into trading decisions, along with a backtester that simulates trading in a realistic environment. We find that support vector machines yield the most profitable trading strategies, which outperform the market on average for Bitcoin, Ethereum and Litecoin over the past 22 months, since January 2018.
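The two evaluation metrics are the standard positive and negative predictive values; on hourly up/down labels they can be computed as:

```python
def ppv_npv(y_true, y_pred):
    """Positive and negative predictive value of up/down forecasts.

    PPV = TP/(TP+FP): of the hours predicted 'up' (1), the fraction
    that actually rose; NPV = TN/(TN+FN): of the hours predicted
    'down' (0), the fraction that actually fell.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if p == 0 and t == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if p == 0 and t == 1)
    return tp / (tp + fp), tn / (tn + fn)

truth = [1, 1, 0, 0, 1, 0, 1, 0]
preds = [1, 0, 0, 1, 1, 0, 1, 0]
print(ppv_npv(truth, preds))  # → (0.75, 0.75)
```

Unlike raw accuracy, these metrics condition on the prediction, which matters when a strategy only trades on the hours it flags.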

arXiv

Recently, there has been a surge of interest in the use of machine learning to help aid in the accurate predictions of financial markets. Despite the exciting advances in this cross-section of finance and AI, many of the current approaches are limited to using technical analysis to capture historical trends of each stock price and thus limited to certain experimental setups to obtain good prediction results. On the other hand, professional investors additionally use their rich knowledge of inter-market and inter-company relations to map the connectivity of companies and events, and use this map to make better market predictions. For instance, they would predict the movement of a certain company's stock price based not only on its former stock price trends but also on the performance of its suppliers or customers, the overall industry, macroeconomic factors and trade policies. This paper investigates the effectiveness of work at the intersection of market predictions and graph neural networks, which hold the potential to mimic the ways in which investors make decisions by incorporating company knowledge graphs directly into the predictive model. The main goal of this work is to test the validity of this approach across different markets and longer time horizons for backtesting using rolling window analysis. In this work, we concentrate on the prediction of individual stock prices in the Japanese Nikkei 225 market over a period of roughly 20 years. For the knowledge graph, we use the Nikkei Value Search data, which is a rich dataset showing mainly supplier relations among Japanese and foreign companies. Our preliminary results show a 29.5% increase and a 2.2-fold increase in the return ratio and Sharpe ratio, respectively, when compared to the market benchmark, as well as a 6.32% increase and 1.3-fold increase, respectively, compared to the baseline LSTM model.

arXiv

Financial time series forecasting is, without a doubt, a top choice of computational intelligence for finance researchers in both academia and the financial industry due to its broad implementation areas and substantial impact. Machine Learning (ML) researchers have come up with various models, and a vast number of studies have been published accordingly. As such, a significant number of surveys exist covering ML studies on financial time series forecasting. Lately, Deep Learning (DL) models have started appearing within the field, with results that significantly outperform their traditional ML counterparts. Even though there is a growing interest in developing models for financial time series forecasting research, there is a lack of review papers solely focused on DL for finance. Hence, our motivation in this paper is to provide a comprehensive literature review of DL studies on financial time series forecasting implementations. We not only categorize the studies according to their intended forecasting implementation areas, such as index, forex, and commodity forecasting, but also group them based on their DL model choices, such as Convolutional Neural Networks (CNNs), Deep Belief Networks (DBNs), and Long Short-Term Memory (LSTM) networks. We also try to envision the future of the field by highlighting possible setbacks and opportunities, so that interested researchers can benefit.

SSRN

We propose the first factor model that explains cross-sectional variation in optionable stock returns. Our model includes new factors based on option-implied volatility minus realized volatility, the call minus put implied volatility spread, and the difference between changes in call and put implied volatilities, along with the market factor. The model outperforms previously-proposed factor models at explaining the average returns of portfolios of optionable stocks formed by sorting on other option-based predictors, as well as a large number of other well-known predictors, of the cross section of future stock returns. The newly proposed model provides a benchmark for assessing whether portfolios of optionable stocks generate average returns that are not a manifestation of previously-documented phenomena.

arXiv

We investigate a solution for the problems related to the application of multivariate GARCH models to markets with a large number of stocks by restricting the form of the conditional covariance matrix. The model is a factor model and uses only six free GARCH parameters. One factor can be interpreted as the market component; the remaining factors are equal. This allows the analytical calculation of the inverse covariance matrix. The time-dependence of the factors enables the determination of dynamical beta coefficients. We compare the results from our model with the results of other GARCH models for the daily returns of the S\&P500 market and find that they are competitive. As applications, we use the daily values of the beta coefficients to confirm a transition of the market in 2006. Furthermore, we discuss the relationship of our model with the leverage effect.
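The beta coefficients relative to the market component can be illustrated on simulated one-factor returns; the static least-squares recovery below is a simplified stand-in for the time-dependent betas of the factor-GARCH model, and all numbers are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# One-factor return model: r_i = beta_i * r_m + eps_i, echoing the
# market component of the factor model (static betas for illustration).
T, n = 2000, 5
true_beta = np.array([0.8, 1.0, 1.2, 0.9, 1.1])
r_m = rng.normal(0, 0.01, size=T)             # market factor returns
eps = rng.normal(0, 0.005, size=(T, n))       # idiosyncratic noise
R = r_m[:, None] * true_beta + eps

# Recover betas as Cov(r_i, r_m) / Var(r_m).
r_m_c = r_m - r_m.mean()
beta_hat = (R - R.mean(0)).T @ r_m_c / (r_m_c ** 2).sum()
print(np.round(beta_hat, 2))
```

In the paper's setting the factor variances follow GARCH dynamics, so the same covariance ratio becomes a daily time series of betas rather than a single estimate.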

arXiv

The dynamics of financial markets are driven by the interactions between participants, as well as by the trading mechanisms and regulatory frameworks that govern these interactions. Decision-makers would rather not ignore the impact of other participants on these dynamics and should employ tools and models that take it into account. To this end, we demonstrate the efficacy of applying opponent-modeling in a number of simulated market settings. While our simulations are simplified representations of actual market dynamics, they provide an idealized "playground" in which our techniques can be demonstrated and tested. We present this work with the aim that our techniques could be refined and, with some effort, scaled up to the full complexity of real-world market scenarios. We hope that the results presented encourage practitioners to adopt opponent-modeling methods and apply them in online systems, in order to enable not only reactive but also proactive decisions to be made.

arXiv

Stochastic bridges are commonly used to impute missing data with a lower sampling rate to generate data with a higher sampling rate, while preserving key properties of the dynamics involved in an unbiased way. While the generation of Brownian bridges and Ornstein-Uhlenbeck bridges is well understood, unbiased generation of such stochastic bridges subject to a given extremum has been less explored in the literature. After a review of known results, we compare two algorithms for generating Brownian bridges constrained to a given extremum, one of which generalises to other diffusions. We further apply this to generate unbiased Ornstein-Uhlenbeck bridges and unconstrained processes, both constrained to a given extremum, along with more tractable numerical approximations of these algorithms. Finally, we consider the case of drift, and applications to geometric Brownian motions.

arXiv

In this paper we study the problem of stopping a Brownian bridge $X$ in order to maximise the expected value of an exponential gain function. In particular, we solve the stopping problem $$\sup_{0\le \tau\le 1}\mathsf{E}[\mathrm{e}^{X_\tau}]$$ which was posed by Ernst and Shepp in their paper [Commun. Stoch. Anal., 9 (3), 2015, pp. 419--423] and was motivated by bond selling with non-negative prices.

Due to the non-linear structure of the exponential gain, we cannot rely on methods used in the literature to find closed-form solutions to other problems involving the Brownian bridge. Instead, we develop techniques that use pathwise properties of the Brownian bridge and martingale methods of optimal stopping theory in order to find the optimal stopping rule and to show regularity of the value function.

arXiv

Relative advantage, or the degree to which a new technology is perceived to be better than the existing technology it supersedes, has a significant impact on individuals' decisions to adopt the new technology. This paper investigates the impact of electric vehicles' perceived advantage over conventional internal combustion engine vehicles, from the consumers' perspective, on their decision to select electric vehicles. Data is obtained from a stated preference survey of 1176 residents in New South Wales, Australia. The collected data is used to estimate an integrated choice and latent variable model of electric vehicle choice, which incorporates the perceived advantage of electric vehicles in the form of latent variables in the utility function. The design of the electric vehicle, its impact on the environment, and safety are the three advantages identified from the consumers' point of view. The model is used to simulate the effectiveness of various policies to promote electric vehicles across different cohorts. A rebate on the purchase price is found to be the most effective strategy to promote electric vehicle adoption.

arXiv

This paper discusses the short-maturity behavior of Asian option prices and hedging portfolios. We consider the risk-neutral valuation and the delta value of the Asian option having a H\"older continuous payoff function in a local volatility model. The main idea of this analysis is that the local volatility model can be approximated by a Gaussian process at short maturity $T.$ By combining this approximation argument with Malliavin calculus, we conclude that the short-maturity behaviors of Asian option prices and the delta values are approximately expressed as those of their European counterparts with volatility $$\sigma_{A}(T):=\sqrt{\frac{1}{T^3}\int_0^T\sigma^2(t,S_0)(T-t)^2\,dt}\,,$$ where $\sigma(\cdot,\cdot)$ is the local volatility function and $S_0$ is the initial value of the stock. In addition, we show that the convergence rate of the approximation is determined by the H\"older exponent of the payoff function. Finally, the short-maturity asymptotics of Asian call and put options are discussed from the viewpoint of the large deviation principle.
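The equivalent volatility $\sigma_A(T)$ can be checked numerically; for a constant local volatility $\sigma(t, S_0) \equiv \sigma$, the integral reduces to $\sigma/\sqrt{3}$, the familiar short-maturity Asian-to-European volatility ratio:

```python
import math

def sigma_A(local_vol, T, n=10_000):
    """Equivalent European volatility for a short-maturity Asian option,
    sigma_A(T) = sqrt( (1/T^3) * int_0^T sigma(t, S0)^2 (T-t)^2 dt ),
    with the integral evaluated by a midpoint rule.  `local_vol` is the
    function t -> sigma(t, S0)."""
    dt = T / n
    acc = sum(local_vol((i + 0.5) * dt) ** 2 * (T - (i + 0.5) * dt) ** 2
              for i in range(n)) * dt
    return math.sqrt(acc / T**3)

# Sanity check: constant sigma gives sigma / sqrt(3).
sigma = 0.2
print(round(sigma_A(lambda t: sigma, T=0.5), 6))  # ≈ 0.2/sqrt(3) ≈ 0.11547
print(round(sigma / math.sqrt(3), 6))
```

A time-dependent `local_vol` then gives the maturity-dependent $\sigma_A(T)$ used in the paper's approximations.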

arXiv

Listing on the Dow Jones Sustainability Index is seen as a gold standard, verifying to the market that a firm is fully engaged with a corporate social responsibility agenda. Robustly quantifying the impact of listing, and de-listing, against any industry-level shocks, as well as evolution in the competitive relationship between firms within the industry, provides a strength absent in existing works. It is shown that cumulative abnormal returns on stocks added to the index are significantly positive in the three trading weeks prior to the official announcement. The post-listing correction result posited to date is also demonstrated to hold; the proportion of periods with significant negative returns is low, however. Announcement dates, rather than effective dates, are critical to returns. The differential between these stages in the chronology is an important contribution of this paper. Most effects end before the membership changes become effective. Whilst there are considerable gains to be made, they come before the announcement date and require foresight to exploit. Investors must research likely new members to gain maximum return.

arXiv

It is known that quantum computers can speed up Monte Carlo simulation compared to classical counterparts. There are already some proposals for applying the quantum algorithm to practical problems, including quantitative finance. In many financial problems to which Monte Carlo simulation is applied, many random numbers are required to obtain one sample value of the integrand, since those problems are extremely high-dimensional integrations, for example, risk measurement of a credit portfolio. This leads to a situation in which the required qubit number is too large in the naive implementation, where a quantum register is allocated per random number. In this paper, we point out that we can reduce the number of qubits while keeping the quantum speedup if we perform a calculation similar to the classical one, that is, estimate the average of integrand values sampled by a pseudo-random number generator (PRNG) implemented on a quantum circuit. We present not only an overview of the idea but also a concrete implementation of the PRNG and an application to credit risk measurement. Admittedly, the reduction of qubits is a trade-off against an increase in circuit depth, so full reduction might be impractical; such a trade-off between speed and memory space will nevertheless be important in adjusting calculation settings to machine specs, if large-scale Monte Carlo simulation by quantum computers comes into operation in the future.

arXiv

A commonly used stochastic model for derivative and commodity market analysis is the Barndorff-Nielsen and Shephard (BN-S) model. Though this model is very efficient and analytically tractable, it suffers from the absence of long-range dependence, among other issues. For this paper, the analysis is restricted to crude oil price dynamics. A simple way of improving the BN-S model through the implementation of various machine learning algorithms is proposed. This refined BN-S model is more efficient and has fewer parameters than other models used in practice as improvements of the BN-S model. The procedure and the model show the application of data science for extracting a "deterministic component" out of processes that are usually considered to be completely stochastic. Empirical applications validate the efficacy of the proposed model for long-range dependence.

SSRN

This paper discusses linear regression analysis applied to cross sections, time series, and panel data. It also covers topics such as heteroscedastic errors and serially correlated errors with HC and HAC covariance matrices. For panel regressions it covers one- and two-way models, and fixed-effect, pooled, and random-effects models; the section on panel regressions also covers HC and HAC covariance matrices. Alongside the discussion, I implement the methods in R on accounting and stock market data.

arXiv

The possibility of re-switching of techniques in Piero Sraffa's intersectoral model, namely the return of capital-intensive techniques under monotonic changes in the profit rate, is traditionally considered a paradox putting at stake the viability of the neoclassical theory of production. It is argued here that this phenomenon can be rationalized within the neoclassical paradigm. Sectoral interdependencies can give rise to non-monotonic effects of progressive variations in income distribution on relative prices. The re-switching of techniques is, therefore, the result of cost-minimizing technical choices facing returning ranks of relative input prices, in full consistency with the neoclassical perspective.

arXiv

We provide a general probabilistic framework within which we establish scaling limits for a class of continuous-time stochastic volatility models with self-exciting jump dynamics. In the scaling limit, the joint dynamics of asset returns and volatility are driven by independent Gaussian white noises and two independent Poisson random measures that capture the arrival of exogenous shocks and the arrival of self-excited shocks, respectively. Various well-studied stochastic volatility models with and without self-exciting price/volatility co-jumps are obtained as special cases under different scaling regimes. We analyze the impact of external shocks on the market dynamics, especially their impact on jump cascades, and show in a mathematically rigorous manner that many small external shocks may trigger endogenous jump cascades in asset returns and stock price volatility.

arXiv

This study presents new analytic approximations of the stochastic-alpha-beta-rho (SABR) model. Unlike existing studies that focus on the equivalent Black-Scholes (BS) volatility, we instead derive the equivalent volatility under the constant-elasticity-of-variance (CEV) model, which is the limit of the SABR model when the volatility of volatility approaches 0. Numerical examples demonstrate the accuracy of the CEV volatility approximation for a wide range of parameters. Moreover, in our approach, arbitrage occurs at a lower strike price than in existing BS-based approximations.

RePEC

Hockett's "franchise view" argues, convincingly, that the capacity of banks or quasi-bank financial entities to create money rests on the regulations and guarantees of the state maintaining the legal and regulatory system under which they operate. Block suggests that this insight could be used as a beachhead from which to establish the legitimacy of locally embedded, non-profit lenders whose investments would be dedicated to public purposes. However, given the contemporary ideological, political, and economic context, this proposal on its own could prove counterproductive. To maximize the positive impact of the insight into the public character of money creation, the proposal for public-purpose banking should be fused to democratization of central banking. This could plausibly have ideological effects that would make the public character of private economic power easier to perceive, and to reshape. Subordination of central banks to elected officials would also bring an end to the dynamic whereby monetary easing provides political cover for damaging fiscal austerity, leading to more democratic decision-making about the appropriate combination of fiscal and monetary policy.

arXiv

The performance of financial market prediction systems depends heavily on the quality of the features they use. While researchers have used various techniques for enhancing stock-specific features, less attention has been paid to extracting features that represent the general mechanisms of financial markets. In this paper, we investigate the importance of extracting such general features for the stock market prediction domain and show how doing so can improve the performance of financial market prediction. We present a framework called U-CNNpred that uses a CNN-based structure. A base model is trained, in a specially designed layer-wise training procedure, over a pool of historical data from many financial markets in order to extract the common patterns across different markets. Our experiments, in which we used hundreds of stocks in the S\&P 500 as well as 14 well-known indices around the world, show that this model can outperform baseline algorithms when predicting the directional movement of the markets on which it has been trained. We also show that the base model can be fine-tuned for predicting new markets and achieve better performance than state-of-the-art baseline algorithms that construct market-specific models from scratch.

SSRN

We present a novel framework to decompose the drivers of trend-following performance into (i) the magnitude of market moves, (ii) the strategy's ability to profit from those market moves, and (iii) the degree of diversification across markets. This framework allows us to examine why trend performance has been below the strategy's long-term average return during the current decade. We find that the lower performance of the strategy is neither explained by (ii) nor (iii): trend following has continued to profit from market moves and benefit from diversification. Instead, the primary explanatory factor is (i), namely that the average size of global market moves has been more muted than usual in the current decade. The fact that trend-following strategies continue to translate market moves into profits in a diversified manner suggests that trend-following investing may see stronger performance in market environments characterized by more pronounced movements in markets going forward.