Research articles for 2020-09-21
arXiv
This report analyzes the focal points of India's commercial dynamism, covering the fundamental indicators of growth rate, trade balance, coverage rate, openness rate, and share of world trade, and then presents each of them in detail.
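As a quick reference for the indicators named in this abstract, here is a minimal Python sketch of their standard definitions; the input figures are hypothetical placeholders, not Indian trade statistics.

```python
# Standard trade-indicator formulas; all inputs are hypothetical numbers
# in the same currency unit for a single period.
def trade_indicators(exports, imports, gdp, world_trade):
    return {
        "trade_balance": exports - imports,
        "coverage_rate": 100.0 * exports / imports,          # exports as % of imports
        "openness_rate": 100.0 * (exports + imports) / gdp,  # trade as % of GDP
        "world_share": 100.0 * (exports + imports) / world_trade,
    }

print(trade_indicators(exports=330.0, imports=480.0, gdp=2870.0, world_trade=38000.0))
```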
arXiv
Consequences of the 2019 anti-extradition protests in Hong Kong have been studied in many facets, but one topic of interest that has not been explored is the impact on the immigration of Bangladeshis into the city. This paper explores the value Bangladeshis add to Hong Kong, how the protests affected their outlook and consequently their immigration, and the potentially longer-term detrimental effects on the city.
arXiv
In a discrete-time setting, we study arbitrage concepts in the presence of convex trading constraints. We show that solvability of portfolio optimization problems is equivalent to absence of arbitrage of the first kind, a condition weaker than classical absence of arbitrage opportunities. We center our analysis on this characterization of market viability and derive versions of the fundamental theorems of asset pricing based on portfolio optimization arguments. By considering specifically a discrete-time setup, we simplify existing results and proofs that rely on semimartingale theory, thus allowing for a clear understanding of the foundational economic concepts involved. We exemplify these concepts, as well as some unexpected situations, in the context of one-period factor models with arbitrage opportunities under borrowing constraints.
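The following toy example, with hypothetical numbers, illustrates the constrained setting the abstract describes: the stock's excess return is positive in every state, so an unconstrained arbitrage exists, yet the utility maximization problem remains solvable under a borrowing cap, consistent with the absence of arbitrage of the first kind.

```python
# Toy one-period market (hypothetical numbers): a classical arbitrage exists,
# but a borrowing cap keeps the log-utility problem solvable; the optimum
# simply sits at the leverage constraint.
import numpy as np
from scipy.optimize import minimize_scalar

r = 0.0                            # risk-free rate
R = np.array([0.10, 0.20])         # stock return in two equally likely states
p = np.array([0.5, 0.5])

def neg_expected_log_utility(pi):  # pi = fraction of wealth held in the stock
    wealth = 1.0 + r + pi * (R - r)
    return -np.dot(p, np.log(wealth))

# Borrowing constraint: at most 2x leverage.
res = minimize_scalar(neg_expected_log_utility, bounds=(0.0, 2.0), method="bounded")
print(f"optimal constrained position pi* = {res.x:.3f}")   # hits the cap at 2
```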
arXiv
The cryptocurrency market is unique on many levels: it is highly volatile, its market structure changes frequently, and cryptocurrencies emerge and vanish on a daily basis. Following its development has become a difficult task with the success of cryptocurrencies (CCs) other than Bitcoin. For fiat currency markets, the IMF offers the SDR index and, prior to the EUR, the ECU existed, an index representing the development of European currencies. Index providers decide on a fixed number of index constituents that will represent the market segment. It is a challenge to fix this number and develop rules for the constituents in view of market changes; in the frequently changing CC market, this challenge is even more severe. We propose a method relying on the AIC to react quickly to market changes and thereby create an index, referred to as CRIX, for the cryptocurrency market. CRIX is chosen by model selection such that it represents the market well, enabling any interested party to study economic questions in this market and to invest in it. The diversified nature of the CC market makes the inclusion of altcoins in the index critical for improving tracking performance. We show that assigning optimal weights to altcoins helps to reduce the tracking error of a CC portfolio, despite their market capitalization being much smaller than Bitcoin's. The codes used here are available via www.quantlet.de.
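For intuition, here is a minimal sketch, under simplified assumptions, of how an AIC-type criterion can pick the number of index constituents; the market capitalizations and returns are synthetic, and the full-market cap-weighted index serves as the benchmark being tracked.

```python
# AIC-guided choice of the number of index constituents (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
n_days, n_coins = 250, 30
returns = rng.normal(0.0, 0.05, size=(n_days, n_coins))
caps = np.sort(rng.lognormal(10, 2, size=n_coins))[::-1]   # coin 0 plays "Bitcoin"

w_full = caps / caps.sum()
bench = returns @ w_full                  # full-market benchmark index returns

def aic_for_k(k):
    w = np.zeros(n_coins)
    w[:k] = caps[:k] / caps[:k].sum()     # top-k cap-weighted index
    err = returns @ w - bench             # tracking error vs. the benchmark
    sse = np.sum(err ** 2)
    return n_days * np.log(sse / n_days) + 2 * k   # Gaussian-likelihood AIC

best_k = min(range(1, n_coins), key=aic_for_k)     # candidates below full universe
print("AIC-selected number of constituents:", best_k)
```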
arXiv
In this paper, the dynamics of an economic system with foreign financing, of integer or fractional order, are analyzed. The symmetry of the system determines the existence of two pairs of coexisting attractors. The integer-order version of the system proves to have several combinations of hidden attractors coexisting with self-excited attractors. Because one of the system variables represents the foreign capital inflow, the presence of hidden attractors could be of real interest in economic models. The fractional-order variant presents another interesting coexistence of attractors in the fractional-order space.
arXiv
Social systems are characterized by an enormous network of connections and factors that can influence their structure and dynamics. All financial markets, including the cryptocurrency market, belong to the economic sphere of human activity, which seems to be the most interrelated and complex. The complexity of the cryptocurrency market can be studied from different perspectives. First, the dynamics of the cryptocurrency exchange rates against other cryptocurrencies and fiat currencies can be studied and quantified by means of the multifractal formalism. Second, the coupling and decoupling of cryptocurrencies and conventional assets can be investigated with advanced cross-correlation analyses based on fractal analysis. Third, the internal structure of the cryptocurrency market can also be a subject of analysis that exploits, for example, a network representation of the market. We approach this subject from all three perspectives based on data recorded between January 2019 and June 2020. This period includes the Covid-19 pandemic, and we pay particular attention to this event, investigating how strong its impact on the structure and dynamics of the market was. In addition, the studied data cover a few other significant events, such as the double bull and bear phases in 2019. We show that, throughout the considered interval, the exchange-rate returns were multifractal, with intermittent signatures of bifractality that can be associated with the most volatile periods of the market dynamics, such as the bull-market onset in April 2019 and the Covid-19 outburst in March 2020. The topology of a minimal spanning tree representation of the market also altered during these events, from a distributed type without any dominant node to a highly centralized type with a dominating hub of USDT. However, the MST topology during the pandemic differs in some details from that of other volatile periods.
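As a sketch of the third (network) perspective, the snippet below builds a minimal spanning tree from return cross-correlations using the standard Mantegna distance d_ij = sqrt(2(1 - rho_ij)); the return series are synthetic placeholders, not the recorded CC data.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
names = [f"coin{i}" for i in range(10)]
returns = rng.normal(size=(500, len(names)))       # hypothetical return series
rho = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - rho))                  # Mantegna correlation distance

G = nx.Graph()
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        G.add_edge(names[i], names[j], weight=dist[i, j])

mst = nx.minimum_spanning_tree(G)
# A dominant hub (like USDT during the pandemic) appears as a high-degree node.
print(sorted(mst.degree, key=lambda kv: -kv[1])[:3])
```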
arXiv
We analyze the convergence of expected utility under the approximation of the Black-Scholes model by binomial models. In a recent paper, D. Kreps and W. Schachermayer gave a surprising and somewhat counter-intuitive example: such convergence may, in general, fail to hold. Their counterexample is based on a binomial model where the i.i.d. logarithmic one-step increments have strictly positive third moments, which is the case when the up-tick of the log-price is larger than the down-tick. Kreps and Schachermayer left open the question of how things behave when the down-tick is larger than the up-tick and, most importantly, in the case of the symmetric binomial model where the up-tick equals the down-tick. Is there a general positive result on the convergence of expected utility in this setting? In the present note we provide a positive answer to this question, based on some rather fine estimates of the convergence arising in the Central Limit Theorem.
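A quick numerical illustration of the question at stake, under a simple power utility U(x) = x^gamma: the exact expected utility under a symmetric binomial approximation is compared with its lognormal (Black-Scholes) closed form. All parameters are arbitrary choices for illustration.

```python
# Expected power utility under a symmetric binomial model vs. the lognormal
# closed form E[S_T^g] = S0^g * exp(g*(mu - s^2/2)*T + g^2*s^2*T/2).
import numpy as np
from scipy.stats import binom

S0, mu, sigma, T, gamma = 1.0, 0.05, 0.3, 1.0, 0.5
bs_value = S0**gamma * np.exp(gamma*(mu - sigma**2/2)*T + gamma**2*sigma**2*T/2)

def binomial_value(n):
    dt = T / n
    tick = sigma * np.sqrt(dt)                    # symmetric up/down ticks
    k = np.arange(n + 1)                          # number of up-moves
    logS = np.log(S0) + (mu - sigma**2/2)*T + (2*k - n)*tick
    return np.sum(binom.pmf(k, n, 0.5) * np.exp(gamma * logS))

for n in (10, 100, 1000):
    print(n, binomial_value(n), "->", bs_value)
```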
arXiv
I critique a recent analysis (Miles, Stedman & Heald, 2020) of COVID-19 lockdown costs and benefits, focussing on the United Kingdom (UK). Miles et al. (2020) argue that the March-June UK lockdown was more costly than the benefit of lives saved, evaluated using the NICE threshold of £30,000 for a quality-adjusted life year (QALY), and that the costs of a lockdown for 13 weeks from mid-June would be vastly greater than any plausible estimate of the benefits, even if easing produced a second infection wave causing over 7,000 deaths weekly by mid-September.
I note here two key problems that significantly affect their estimates and cast doubt on their conclusions. Firstly, their calculations arbitrarily cut off after 13 weeks, without costing the epidemic end state. That is, they assume indifference between mid-September states of 13 or 7,500 weekly deaths and the corresponding infection rates. This seems indefensible unless one assumes that (a) there is little chance of any effective vaccine or improved medical or social interventions in the foreseeable future, and (b) notwithstanding temporary lockdowns, COVID-19 will very likely propagate until herd immunity is reached. Even under these assumptions it is very questionable. Secondly, they ignore the costs of serious illness and the possible long-term lowering of life quality and expectancy for survivors. These are uncertain, but plausibly at least as large as the costs in deaths.
In summary, policy on tackling COVID-19 cannot be rationally made without estimating probabilities of future medical interventions and long-term illness costs. More work on modelling these uncertainties is urgently needed.
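For readers unfamiliar with the QALY framing used in this debate, the arithmetic looks as follows; everything except the £30,000 NICE threshold mentioned above is a hypothetical placeholder.

```python
# Illustrative QALY arithmetic only: deaths averted and QALYs per death are
# hypothetical placeholders; only the 30,000 GBP/QALY threshold is from the text.
NICE_THRESHOLD_GBP = 30_000

deaths_averted = 100_000              # hypothetical
qalys_per_death_averted = 6.0         # hypothetical average remaining QALYs

benefit_gbp = deaths_averted * qalys_per_death_averted * NICE_THRESHOLD_GBP
print(f"benefit of lives saved: {benefit_gbp / 1e9:.0f} bn GBP")
# The critique above: this omits the epidemic end state and the cost of
# serious illness among survivors, which may rival the mortality term.
```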
arXiv
This paper develops a model that incorporates the presence of stochastic arbitrage explicitly in the Black--Scholes equation. Here, the arbitrage is generated by a stochastic bubble, which generalizes the deterministic arbitrage model obtained in the literature. A generic stochastic dynamic is considered for the arbitrage bubble, and a generalized Black--Scholes equation is then derived. The resulting equation is similar to that of stochastic volatility models, but there are no undetermined parameters such as the market price of risk. The proposed theory has asymptotic behaviors associated with the weak and strong arbitrage bubble limits. For the case where the arbitrage bubble's volatility is zero (a deterministic bubble), the weak limit corresponds to the usual Black--Scholes model. The strong limit also gives a Black--Scholes model, but with the underlying asset's mean value replacing the interest rate. When the bubble is stochastic, the theory also has weak and strong asymptotic limits that give rise to option price dynamics similar to the Black--Scholes model. Explicit formulas are derived for Gaussian and lognormal stochastic bubbles. Consequently, the Black--Scholes model can be considered a "low energy" limit of a more general stochastic model.
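In standard Black--Scholes notation, the two deterministic-bubble limits described above can be sketched as follows; this is a hedged reconstruction from the abstract, not the paper's exact equations.

```latex
% Weak-bubble limit: the classical Black--Scholes equation.
\partial_t V + \tfrac{1}{2}\sigma^2 S^2 \partial_{SS} V + r S \,\partial_S V - r V = 0
% Strong-bubble limit: the same equation with the interest rate r replaced
% by the underlying asset's mean return mu, as stated in the abstract.
\partial_t V + \tfrac{1}{2}\sigma^2 S^2 \partial_{SS} V + \mu S \,\partial_S V - \mu V = 0
```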
arXiv
An explicit weak solution for the 3/2 stochastic volatility model is obtained and used to develop a simulation algorithm for option pricing purposes. The 3/2 model is a non-affine stochastic volatility model whose variance process is the inverse of a CIR process. This property is exploited here to obtain an explicit weak solution, similarly to Kouritzin (2018). A simulation algorithm based on this solution is proposed and tested via numerical examples. The performance of the resulting pricing algorithm is comparable to that of other popular simulation algorithms.
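The structural fact exploited above, that the 3/2 variance is the inverse of a CIR process, can be sketched as follows; the CIR parameters are hypothetical, and the full-truncation Euler scheme is a generic choice, not the paper's explicit weak solution.

```python
import numpy as np

rng = np.random.default_rng(42)
kappa, theta, xi = 2.0, 1.5, 0.4     # hypothetical CIR parameters
T, n = 1.0, 10_000
dt = T / n

Y = np.empty(n + 1); Y[0] = 1.0      # CIR process
for i in range(n):
    Yp = max(Y[i], 0.0)              # full truncation keeps the scheme stable
    Y[i+1] = Y[i] + kappa*(theta - Yp)*dt + xi*np.sqrt(Yp*dt)*rng.standard_normal()

V = 1.0 / np.maximum(Y, 1e-12)       # 3/2 variance path = inverse of CIR
print("mean variance level:", V.mean())
```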
arXiv
Standard approaches to the theory of financial markets are based on equilibrium and efficiency. Here we develop an alternative based on concepts and methods developed by biologists, in which the wealth invested in a financial strategy is like the population of a species. We study a toy model of a market consisting of value investors, trend followers and noise traders. We show that the average returns of strategies are strongly density dependent, i.e. they depend on the wealth invested in each strategy at any given time. In the absence of noise the market would slowly evolve toward an efficient equilibrium, but the large statistical uncertainty in profitability makes this noisy and uncertain. Even in the long term, the market spends extended periods of time far from perfect efficiency. We show how core concepts from ecology, such as the community matrix and food webs, apply to markets. The wealth dynamics of the market ecology explains how market inefficiencies spontaneously occur and gives insight into the origins of excess price volatility and deviations of prices from fundamental values.
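A minimal sketch of such a market ecology, with hypothetical demand rules and parameters standing in for the paper's model: value investors pull the price toward fundamentals, trend followers chase recent moves, and noise traders perturb it.

```python
import numpy as np

rng = np.random.default_rng(7)
T, noise_std = 2000, 0.1
p = np.zeros(T)                      # log price; log fundamental value is 0
for t in range(2, T):
    value_demand = -0.05 * p[t-1]                # buy undervalued, sell overvalued
    trend_demand = 0.04 * (p[t-1] - p[t-2])      # chase the recent move
    noise_demand = noise_std * rng.standard_normal()
    p[t] = p[t-1] + value_demand + trend_demand + noise_demand  # linear impact

# Trend following amplifies fluctuations relative to a noise-only market.
print("return volatility / noise-only volatility:", np.diff(p).std() / noise_std)
```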
arXiv
Equity basket correlation can be estimated both using the physical measure from stock prices and using the risk-neutral measure from option prices. The difference between the two estimates motivates a so-called "dispersion strategy". We study the performance of this strategy on the German market and propose several profitability improvement schemes based on implied correlation (IC) forecasts. Modelling IC poses several challenges. First, the number of correlation coefficients grows quadratically with the size of the basket. Second, IC is not constant over maturities and strikes. Finally, IC changes over time. We reduce the dimensionality of the problem by assuming equicorrelation. The IC surface (ICS) is then approximated from the implied volatilities of the stocks and the implied volatility of the basket. To analyze the dynamics of the ICS, we employ a dynamic semiparametric factor model.
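Under the equicorrelation assumption, the implied correlation can be backed out in one line from the basket and component implied volatilities, as in the sketch below (all inputs hypothetical).

```python
# Back out a single implied correlation rho from
# sigma_B^2 = sum_i w_i^2 s_i^2 + rho * sum_{i != j} w_i w_j s_i s_j.
import numpy as np

w = np.array([0.4, 0.35, 0.25])      # basket weights (hypothetical)
iv = np.array([0.25, 0.30, 0.22])    # component implied vols (hypothetical)
iv_basket = 0.20                     # basket implied vol (hypothetical)

own = np.sum((w * iv) ** 2)          # sum_i w_i^2 s_i^2
cross = np.sum(w * iv) ** 2 - own    # sum_{i != j} w_i w_j s_i s_j
rho = (iv_basket**2 - own) / cross
print(f"implied equicorrelation: {rho:.3f}")
```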
arXiv
Categorization is an essential component of how we understand the world for ourselves and communicate it collectively. It is therefore important to recognize that classification systems are not necessarily static, especially for economic systems, and even more so in urban areas where most innovation takes place and is implemented. Out-of-date classification systems can limit further understanding of the current economy because things constantly change. Here, we develop an occupation-based classification system for the US labor economy, called industrial topics, that satisfies adaptability and representability. By leveraging the distributions of occupations across US urban areas, we identify industrial topics: clusters of occupations based on their co-existence pattern. Industrial topics indicate the mechanisms underlying the systematic allocation of different occupations. Treating densely connected occupations as an industrial topic, our approach characterizes regional economies by their topical composition. Unlike existing survey-based top-down approaches, our method provides timely information about the underlying structure of the regional economy, which is critical for policymakers and business leaders, especially in our fast-changing economy.
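As a rough illustration of topic-style clustering of occupations from their co-location pattern, here is a sketch using non-negative matrix factorization as a stand-in technique; the city-by-occupation matrix is random, and the paper's actual method may differ.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
X = rng.poisson(5.0, size=(50, 200)).astype(float)  # cities x occupations (synthetic)

nmf = NMF(n_components=8, init="nndsvda", random_state=0, max_iter=500)
city_topics = nmf.fit_transform(X)   # each city's "industrial topic" mixture
topic_occ = nmf.components_          # occupations loading on each topic
print("top occupations of topic 0:", np.argsort(-topic_occ[0])[:5])
```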
arXiv
This study introduces an automated trading system for S&P 500 E-mini futures (ES) based on state-of-the-art machine learning. Concretely, we extract a set of scenarios from tick market data to train the model and then use its predictions to model trading. We define the scenarios from the local extrema of the price action. Price extrema are a commonly traded pattern; however, to the best of our knowledge, there is no study presenting a pipeline for automated classification and profitability evaluation. Our study fills this gap by presenting a broad evaluation of the approach, with a resulting average Sharpe ratio of 6.32. However, we do not take into account order execution queues, which of course affect the result in a live-trading setting. The obtained performance results give us confidence that this approach is worthwhile.
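A minimal sketch of the scenario-extraction step, labeling local price extrema on a synthetic path; the window size and labeling convention are hypothetical choices, not the paper's exact pipeline.

```python
import numpy as np
from scipy.signal import argrelextrema

rng = np.random.default_rng(5)
price = np.cumsum(rng.standard_normal(5_000)) + 1000.0   # synthetic tick prices

order = 50                                               # local window half-width
minima = argrelextrema(price, np.less_equal, order=order)[0]
maxima = argrelextrema(price, np.greater_equal, order=order)[0]

labels = np.zeros(price.size, dtype=int)
labels[minima], labels[maxima] = 1, -1    # e.g. buy near lows, sell near highs
print("labelled extrema:", len(minima) + len(maxima))
```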
arXiv
In this paper, we study the statistical properties of the moneyness scaling transformation of Leung and Sircar (2015). This transformation adjusts the moneyness coordinate of the implied volatility smile in an attempt to remove the discrepancy between the IV smiles of levered and unlevered ETF options. We construct bootstrap uniform confidence bands, which indicate that the implied volatility smiles remain statistically different after moneyness scaling has been performed. An empirical application shows that trading opportunities are possible in the LETF market. A statistical-arbitrage-type strategy based on a dynamic semiparametric factor model is presented. This strategy amounts to a statistical decision algorithm that generates trade recommendations based on a comparison of the model-implied and observed LETF implied volatility surfaces. It is shown to generate positive returns with high probability. Extensive econometric analysis of the LETF implied volatility process is performed, including out-of-sample forecasting based on a semiparametric factor model and a study of uniform confidence bands. It provides new insights into the latent dynamics of the implied volatility surface. We also incorporate Heston stochastic volatility into the moneyness scaling method for better tractability of the model.
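A minimal sketch of a bootstrap uniform confidence band for a volatility smile: resample residuals around a smoothed fit and take the 95% quantile of the sup-deviation. The smile data and the polynomial smoother are hypothetical stand-ins for the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(11)
k = np.linspace(-0.3, 0.3, 40)                            # log-moneyness grid
smile = 0.2 + 0.5 * k**2 + rng.normal(0, 0.01, k.size)    # noisy observed IVs

fit = np.polyval(np.polyfit(k, smile, 2), k)              # smoothed smile
resid = smile - fit

sup_devs = []
for _ in range(2_000):
    boot = fit + rng.choice(resid, size=resid.size, replace=True)
    boot_fit = np.polyval(np.polyfit(k, boot, 2), k)
    sup_devs.append(np.max(np.abs(boot_fit - fit)))

half_width = np.quantile(sup_devs, 0.95)                  # uniform (sup-norm) band
print(f"95% uniform band half-width: {half_width:.4f}")
# Two smiles are deemed statistically different if one exits the other's band.
```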
arXiv
In this paper, we propose a market model in which returns follow a multivariate normal tempered stable distribution, defined as a mixture of the multivariate normal distribution and the tempered stable subordinator. This distribution captures two stylized facts, fat tails and asymmetry, that have been empirically observed in asset return distributions. Within the new market model, we discuss a new portfolio optimization method, an extension of Markowitz's mean-variance optimization. The new method considers not only reward and dispersion but also asymmetry. The efficient frontier is accordingly extended to a curved surface in the three-dimensional space of reward, dispersion, and asymmetry. We also propose a new performance measure that extends the Sharpe ratio. Moreover, we derive closed-form solutions for two important measures used by portfolio managers in portfolio construction: the marginal Value-at-Risk (VaR) and the marginal Conditional VaR (CVaR). We illustrate the proposed model using the stocks comprising the Dow Jones Industrial Average: we first perform the new portfolio optimization and then demonstrate how the marginal VaR and marginal CVaR can be used for portfolio optimization under the model. Based on the empirical evidence presented in this paper, our framework offers realistic portfolio optimization and tractable methods for portfolio risk management.
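The marginal CVaR admits a simple Monte Carlo estimate via the Euler allocation principle: it is the expected asset loss conditional on the portfolio loss exceeding its VaR. The sketch below uses Student-t returns as a fat-tailed stand-in for the normal tempered stable model.

```python
import numpy as np

rng = np.random.default_rng(9)
n, alpha = 200_000, 0.99
w = np.array([0.5, 0.3, 0.2])
R = rng.standard_t(df=4, size=(n, 3)) * 0.02   # fat-tailed asset returns (synthetic)
L = -(R @ w)                                   # portfolio loss

var = np.quantile(L, alpha)
tail = L >= var
marginal_cvar = -R[tail].mean(axis=0)          # dCVaR/dw_i by Euler allocation
print("VaR:", var, "marginal CVaR:", marginal_cvar)
print("Euler check, w . mCVaR == CVaR:", w @ marginal_cvar, L[tail].mean())
```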
arXiv
This paper develops a methodology for tracking in real time the impact of the COVID-19 pandemic on economic activity by analyzing high-frequency electricity market data. The approach is validated by several robustness tests and by contrasting our estimates with the official statistics on the recession caused by COVID-19 in different European countries during the first two quarters of 2020. Compared with standard indicators, our results are much more chronologically disaggregated and up-to-date and can therefore inform the current debate on the appropriate policy response to the pandemic. Unsurprisingly, we find that the nations that experienced the most severe initial outbreaks also suffered the hardest economic recessions. However, we detect diffuse signs of recovery, with economic activity in most European countries returning to its pre-pandemic level by August 2020. Furthermore, we show that delaying intervention or pursuing 'herd immunity' is not a successful strategy, since both increase economic disruption and mortality. The most effective short-run strategy for minimizing the impact of the pandemic appears to be the introduction of early and relatively less stringent non-pharmaceutical interventions.
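The core idea, comparing observed electricity load with a counterfactual baseline fitted on pre-pandemic data, can be sketched as follows; the load series and the purely seasonal baseline are synthetic simplifications of the paper's methodology.

```python
import numpy as np

rng = np.random.default_rng(13)
days = np.arange(730)                               # two years of daily load
load = 100 + 10*np.sin(2*np.pi*days/365) + rng.normal(0, 2, days.size)
load[450:540] *= 0.85                               # synthetic "lockdown" dip

baseline = 100 + 10*np.sin(2*np.pi*days/365)        # counterfactual normal load
activity_gap = 100 * (load - baseline) / baseline   # % deviation from normal
print("max estimated activity drop: %.1f%%" % activity_gap.min())
```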
arXiv
In this paper, we propose a regularization approach for network modeling of the German power derivative market. To deal with the large portfolio, we combine high-dimensional variable selection techniques with dynamic network analysis. The estimated sparse interconnectedness of the full German power derivative market clearly identifies the significant channels of potential risk spillovers. Our empirical findings show the importance of interdependence between different contract types and identify the main risk contributors. We further observe strong pairwise interconnections between neighboring contracts, especially for spot contracts traded in peak hours; the implications for regulators and investors are also discussed. The network analysis of the full German power derivative market helps to complete the picture of systemic risk and provides a better understanding of how the German power market functions.
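One common way to obtain such a sparse spillover network, shown below as a hedged sketch with synthetic data, is to lasso-regress each contract's return on the lagged returns of all contracts and read nonzero coefficients as directed edges; the paper's exact estimator may differ.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(21)
T, p = 500, 40                                   # days, contracts (synthetic)
X = rng.normal(size=(T, p))
lagged, current = X[:-1], X[1:]

adjacency = np.zeros((p, p))
for j in range(p):
    fit = Lasso(alpha=0.05).fit(lagged, current[:, j])
    adjacency[j] = fit.coef_                     # row j: who spills over into j

print("edges retained:", np.count_nonzero(adjacency))
```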
arXiv
We investigate two different constructions of robust Orlicz spaces as a generalisation of robust $L^p$-spaces. Our first construction is top-down and considers the maximal domain of a worst-case Luxemburg norm. From an applied perspective, this approach can be justified by a uniform-boundedness-type result. In typical situations, the worst-case Orlicz space agrees with the intersection of the corresponding individual Orlicz spaces. Our second construction produces the closure of a space of test random variables with respect to the worst-case Luxemburg norm. We show that separability of such spaces or their subspaces has very strong implications in terms of dominatedness of the set of priors, and thus for applications in the field of robust finance. For instance, norm closures of bounded continuous functions as considered in the $G$-framework lead to spaces which are lattice-isomorphic to sublattices of a classical $L^1$-space lacking, however, Dedekind $\sigma$-completeness. We further show that the topological spanning power of options is always limited under nondominated uncertainty.
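For orientation, the worst-case Luxemburg norm behind the top-down construction can be written as follows, with $\mathcal{P}$ the set of priors and $\Phi$ an Orlicz function; this is a standard formulation, and the paper's precise definitions may differ.

```latex
\|X\|_{\Phi,\mathcal{P}}
  = \inf\Bigl\{\lambda > 0 :
      \sup_{P \in \mathcal{P}} \mathbb{E}_P\bigl[\Phi\bigl(|X|/\lambda\bigr)\bigr] \le 1 \Bigr\},
\qquad
L^{\Phi}(\mathcal{P}) = \bigl\{X : \|X\|_{\Phi,\mathcal{P}} < \infty \bigr\}.
```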
arXiv
We continue a series of papers devoted to the construction of semi-analytic solutions for barrier options. These options are written on an underlying following a simple one-factor diffusion model, but all the parameters of the model, as well as the barriers, are time-dependent. We have shown that these solutions are systematically more efficient for pricing and calibration than, e.g., the corresponding finite-difference solvers. In this paper we extend this technique to pricing double barrier options and present two approaches: the General Integral Transform method and the Heat Potential method. Our results confirm that for double barrier options these semi-analytic techniques are also more efficient than the traditional numerical methods used to solve this type of problem.
arXiv
This article explores the challenges that the adoption of scrubbers and low-sulfur fuels poses for ship manufacturers and shipping companies. Results show that ship manufacturers must finance the associated working capital and operating costs, which implies an increase in the prices of ships employing these new technologies. Shipping companies, on the other hand, must adopt the most appropriate technology according to the areas where their ships navigate, the scale economies of their trade routes, and a cost-benefit analysis of ship modernization.
arXiv
With the cost of implementation shrinking and the robot-to-worker ratio skyrocketing, the effects of automation on our economy and society are more palpable than ever. According to various studies, over half of our jobs could be fully executed by machines within the next decade or two, with severe impacts concentrated disproportionately on manufacturing-focused developing countries. In response to the threat of mass displacement of labour due to automation, economists, politicians, and even the business community have come to see Universal Basic Income (UBI) as the panacea. This paper argues against a UBI by examining its implementation costs and its efficiency in mitigating the impact of automation, drawing on quantitative evidence as well as the results of failed UBI-comparable programs across the world. The author instead advocates the continuation of existing means-tested welfare systems and further investment in education schemes for unskilled and low-skilled labour.
This paper was submitted to the "Young Economist of the Year 2019" essay competition hosted by the Financial Times and the Royal Economic Society, where it won a high commendation and was one of the 36 best papers shortlisted among 1,300 qualified submissions to be honoured on the website of the Royal Economic Society (2.7% acceptance rate). Due to the rules and policies of the Royal Economic Society, the author could only make this paper available to the public at least one year after the original date of submission.
arXiv
How should one construct a portfolio from multiple mean-reverting assets? Should one add an asset to the portfolio even if it has zero mean reversion? We consider a position management problem for an agent trading multiple mean-reverting assets. We solve an optimal control problem for an agent with power utility and present a semi-explicit solution. The nearly explicit nature of the solution allows us to study the effects of parameter mis-specification and to derive a number of properties of the optimal solution.
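To fix ideas, the sketch below simulates two OU-type assets, one with zero mean-reversion speed, the boundary case raised in the opening question; all parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(17)
kappa = np.array([5.0, 0.0])         # second asset has zero mean reversion
sigma = np.array([0.3, 0.3])
T, n = 1.0, 10_000
dt = T / n

X = np.zeros((n + 1, 2))             # dX = -kappa * X dt + sigma dW
for i in range(n):
    X[i+1] = X[i] - kappa * X[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal(2)

# The non-reverting asset is a driftless random walk; whether it deserves a
# position depends on its correlation with the reverting one (independent here).
print("terminal values:", X[-1])
```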
arXiv
This paper provides a non-robust interpretation of the distributionally robust optimization (DRO) problem by relating the distributional uncertainties to chance probabilities. Our analysis allows a decision-maker to interpret the size of the ambiguity set, which often lacks business meaning, through the chance parameters constraining the objective function. We first show that, for general $\phi$-divergences, a DRO problem is asymptotically equivalent to a class of mean-deviation problems. These mean-deviation problems are not subject to uncertain distributions, and the ambiguity radius in the original DRO problem now plays the role of controlling the risk preference of the decision-maker. We then demonstrate that a DRO problem can be cast as a chance-constrained optimization (CCO) problem when a boundedness constraint is added to the decision variables. Without the boundedness constraint, the CCO problem is shown to perform uniformly better than the DRO problem, irrespective of the radius of the ambiguity set, the choice of the divergence measure, or the tail heaviness of the center distribution. Thanks to our high-order expansion result, a notable feature of our analysis is that it applies to divergence measures that accommodate heavy-tailed distributions well, such as the Student $t$-distribution and the lognormal distribution, in addition to the widely used Kullback-Leibler (KL) divergence, which requires the distribution of the objective function to be exponentially bounded. Using the portfolio selection problem as an example, our comprehensive tests on multivariate heavy-tailed datasets, both synthetic and real-world, show that this business-interpretation approach is indeed useful and insightful.
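The first-order expansion behind the asymptotic mean-deviation equivalence is, for smooth $\phi$-divergences, a known result and can be sketched as follows ($Z$ is the loss of a fixed decision, $\rho$ the ambiguity radius, and $\mathrm{sd}_P$ the standard deviation under the center distribution $P$).

```latex
\sup_{Q:\, D_\phi(Q\|P) \le \rho} \mathbb{E}_Q[Z]
  = \mathbb{E}_P[Z] + \sqrt{\frac{2\rho}{\phi''(1)}}\;\mathrm{sd}_P(Z) + o(\sqrt{\rho}),
  \qquad \rho \downarrow 0,
% so the radius rho acts as a mean-deviation risk-preference weight.
```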
arXiv
While abundant empirical studies support the long-range dependence (LRD) of mortality rates, the corresponding impact on mortality securities is largely unknown due to the lack of appropriate tractable models for valuation and risk management purposes. We propose a novel class of Volterra mortality models that incorporate LRD into the actuarial valuation, retain tractability, and are consistent with existing continuous-time affine mortality models. We derive the survival probability in closed form, taking into account historical health records. The flexibility and tractability of the models make them useful in valuing mortality-related products such as death benefits, annuities, and longevity bonds, as well as in deriving optimal mean-variance mortality hedging rules. Numerical studies are conducted to examine the effect of incorporating LRD into mortality rates on various insurance products and on hedging efficiency.
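For intuition, a Volterra-type mortality intensity with a fractional kernel can be sketched as below; this notation is illustrative of how a convolution kernel injects long-range memory into an affine-style model and is not the paper's exact specification.

```latex
\mu_t = \mu_0 + \int_0^t K(t-s)\,\kappa(\theta - \mu_s)\,ds
            + \int_0^t K(t-s)\,\sigma\sqrt{\mu_s}\,dW_s,
\qquad
K(t) = \frac{t^{H-1/2}}{\Gamma(H+1/2)},
% the kernel exponent H governs the strength of the memory.
```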