Research articles for 2020-01-26
SSRN
We extend a recent result of Trybula and Zawisza [Mathematics of Operations Research, 44(3), 966-987, 2019], who investigate a continuous-time portfolio optimization problem under monotone mean-variance preferences. Their main finding is that the optimal strategies for monotone and classical mean-variance preferences coincide in a stochastic factor model for the financial market. We generalize this result to any model for the financial market where stock prices are continuous.
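As background (a standard formulation following Maccheroni, Marinacci and Rustichini, stated here as an assumption about the framework rather than quoted from the abstract), the two preference functionals being compared are the classical mean-variance criterion and its monotone hull:

```latex
U_\theta(X) = \mathbb{E}[X] - \frac{\theta}{2}\operatorname{Var}(X),
\qquad
V_\theta(X) = \min_{Q \ll P,\ dQ/dP \in L^2}
\left\{ \mathbb{E}_Q[X] + \frac{1}{2\theta}\,
\mathbb{E}\!\left[\left(\frac{dQ}{dP}\right)^{2} - 1\right] \right\}.
```

The result summarized above says that, when stock prices are continuous, the strategies optimal for V_theta coincide with those optimal for U_theta.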
SSRN
The relevant literature suggests that ownership structure is one of the main corporate governance mechanisms influencing financial performance. The aim of this study is to investigate the relationship between ownership structure and the financial performance of listed beverage, food and tobacco companies over the period 2010-2015. This study also examines the impact of ownership structure on financial performance. The sample consists of 10 listed beverage, food and tobacco companies in Sri Lanka. Data were collected from secondary sources, and the hypotheses are tested using Pearson's correlation and regression analysis. The results reveal that ownership concentration and foreign ownership are positively correlated with the financial performance of listed beverage, food and tobacco companies, while institutional ownership is not significantly correlated with financial performance. Foreign ownership is also found to have a significant impact on financial performance: the higher the foreign ownership in these companies, the higher the financial performance, which benefits shareholders and increases firm value.
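A minimal sketch of the correlation-and-regression workflow the abstract describes; the file name and column names (own_conc, foreign_own, inst_own, roa) are hypothetical, and the study's actual data and variable definitions are not reproduced here:

```python
import pandas as pd
import statsmodels.api as sm
from scipy.stats import pearsonr

# Hypothetical firm-year panel covering 2010-2015
df = pd.read_csv("panel.csv")

# Pearson correlation of each ownership variable with performance (ROA)
for col in ["own_conc", "foreign_own", "inst_own"]:
    r, p = pearsonr(df[col], df["roa"])
    print(f"{col}: r={r:.3f}, p-value={p:.3f}")

# Multiple regression of performance on the ownership structure variables
X = sm.add_constant(df[["own_conc", "foreign_own", "inst_own"]])
print(sm.OLS(df["roa"], X).fit().summary())
```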
arXiv
We address the so-called calibration problem, which consists of fitting a given model in a tractable way to a specified term structure, e.g., a yield or default-probability curve. Time-homogeneous jump-diffusions like Vasicek or Cox-Ingersoll-Ross (possibly coupled with compound Poisson jumps, JCIR) are tractable processes but have limited flexibility: they fail to replicate actual market curves. The deterministic shift extension of the latter (Hull-White or JCIR++) is a simple yet efficient solution that is widely used by both academics and practitioners. However, the shift approach is often not appropriate when positivity is required, a common constraint when dealing with credit spreads or default intensities. In this paper, we tackle this problem by adopting a time-change approach. On top of providing an elegant solution to the calibration problem under a positivity constraint, our model features additional interesting properties in terms of implied volatilities. It is compared to the shift extension on various credit-risk applications such as credit default swaps, credit default swaptions and credit valuation adjustment under wrong-way risk. The time-change approach is able to generate much larger volatility and covariance effects under the positivity constraint, and our model offers an appealing alternative to the shift in such cases.
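For context, the deterministic shift extension mentioned above (in the standard Brigo-Mercurio formulation, given here as background rather than taken from the paper) fits the term structure exactly by adding a deterministic function to a tractable base process:

```latex
x(t) = y(t) + \varphi(t), \qquad
\varphi(t) = f^{\mathrm{mkt}}(0,t) - f^{y}(0,t),
```

where y is the time-homogeneous process (e.g., CIR), f^mkt(0,t) is the market instantaneous forward curve and f^y(0,t) the one implied by y. Since nothing prevents the shift from going negative, x(t) can become negative even when y(t) cannot, which is precisely the positivity issue the time-change approach is designed to avoid.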
arXiv
Many studies have discussed the phenomenon and definition of the sharing economy, but an understanding of how the sharing economy reconstructs the world remains elusive. Drawing on big data, we illustrate in detail the mechanisms by which the sharing economy reconstructs society, time and space, users, and industry, as well as how it will reconstruct itself in the future; this matters for society to take full advantage of the reconstruction opportunity to upgrade our world through the sharing economy. On the one hand, we establish the mechanisms by which the sharing economy rebuilds society, industry, space-time, and users through qualitative analysis; on the other hand, we demonstrate the rationality of these mechanisms through quantitative analyses of big data.
arXiv
Proponents of behavioral finance have identified several "puzzles" in the market that are inconsistent with rational finance theory. One such puzzle is the "excess volatility puzzle": changes in equity prices are too large given the changes in fundamentals that are expected to drive them. In this paper, we offer a resolution of the excess volatility puzzle within the context of rational finance. We empirically show that the market inefficiency attributed to the volatility of excess returns across time is caused by fitting an improper distribution to historical returns. Our results indicate that the variation of gross excess returns is attributable to a poor fit in the tail of the return distribution, and that the puzzle disappears when a more appropriate distribution is employed for the return data. The new distribution we introduce in this paper, which better fits the historical return distribution of stocks, explains the excess volatility in the market and thereby resolves the volatility puzzle. Failing to estimate historical returns with the proper distribution is only one possible explanation for the volatility puzzle, but it offers statistical models within the rational finance framework that can be used, without relying on behavioral finance assumptions, when searching for an explanation.
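The abstract's point that a thin-tailed fit overstates "excess" variability can be illustrated with a generic experiment; the paper's proposed distribution is not specified here, so Student's t serves only as a heavy-tailed stand-in, and the data below are simulated:

```python
import numpy as np
from scipy import stats

# Simulated fat-tailed "returns" (stand-in for historical stock returns)
rng = np.random.default_rng(0)
returns = stats.t.rvs(df=3, scale=0.01, size=2500, random_state=rng)

# Compare a thin-tailed and a heavy-tailed fit by AIC
for name, dist in [("normal", stats.norm), ("student-t", stats.t)]:
    params = dist.fit(returns)
    loglik = dist.logpdf(returns, *params).sum()
    aic = 2 * len(params) - 2 * loglik
    print(f"{name}: AIC = {aic:.1f}")  # the heavy-tailed fit should win clearly
```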
arXiv
In this article we model a financial derivative price as an observable on the market state function. We apply geometric techniques to integrate the Heisenberg Equation of Motion, and illustrate how the non-commutative nature of the model introduces quantum interference effects that can act as either a drag or a boost on the resulting return. The ultimate objective is to investigate the nature of quantum drift in the Accardi-Boukas quantum Black-Scholes framework, which models the financial market as a quantum observable and introduces randomness through the Hudson-Parthasarathy quantum stochastic calculus. In particular, we aim to differentiate randomness that is introduced through external noise (quantum stochastic calculus) from randomness that is fundamental to a quantum system (Heisenberg Equation of Motion).
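For reference, the Heisenberg equation of motion invoked above is the standard one (background, not restated in the abstract):

```latex
\frac{dA_t}{dt} = \frac{i}{\hbar}\,[H, A_t],
```

for an observable A_t evolving under a Hamiltonian H; in the Accardi-Boukas setting the derivative price plays the role of such an observable, and the Hudson-Parthasarathy calculus adds external quantum noise terms on top of this intrinsic evolution.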
arXiv
Technological change is responsible for major changes in the labor market. One of its offspring is skill-biased technological change (SBTC), which many economists consider the leading cause of rising wage inequality. However, although technological change affected most developed countries in similar ways, the increase in wage inequality was not similar across them. Following the predictions of SBTC theory, the different levels of inequality could be due to varying degrees of skill inequality between economies, possibly caused by variations in the supply of skilled workers. However, recent research shows that this difference explains only a small share of the variation between countries. Most of the resulting inequality could therefore be due to the different ways in which higher skill levels are valued in each labor market. The position advocated in this article is that technological change is largely given for all countries, with little scope to reverse it. Therefore, to illustrate the changes in the structure of the wage distribution that cause wage inequality, we need to understand how technology affects labor market institutions. In this sense, the pay inequality caused by technological progress is not a phenomenon we must passively accept. On the contrary, by recognizing that the structure and functioning of labor market institutions are largely shaped by how institutions respond to technological change, we can understand and perhaps reverse this underlying wage inequality.
arXiv
This paper analyses how time series analysis techniques can be applied to capture the movement of an exchange-traded index in a stock market. Specifically, the Seasonal Auto-Regressive Integrated Moving Average (SARIMA) class of models is applied to capture the movement of the Nifty 50 index, one of the most actively traded exchange contracts globally [1]. A total of 729 model parameter combinations were evaluated, and the most appropriate was selected for the final forecast based on the AIC criterion [8]. NIFTY 50 can be used for a variety of purposes such as benchmarking fund portfolios and launching index funds, exchange-traded funds (ETFs) and structured products. The index tracks the behaviour of a portfolio of blue-chip companies, the largest and most liquid Indian securities, and can be regarded as a true reflection of the Indian stock market [2].
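A minimal sketch of the kind of AIC-based order search the abstract describes, using statsmodels; a grid with p, d, q, P, D, Q each in {0, 1, 2} gives 3^6 = 729 combinations, matching the count above, but the paper's exact grid, data file and seasonal period are assumptions here:

```python
import itertools
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical daily closing series for the Nifty 50 index
series = pd.read_csv("nifty50.csv", index_col=0, parse_dates=True)["close"]

best_aic, best_order = float("inf"), None
for p, d, q, P, D, Q in itertools.product(range(3), repeat=6):
    try:
        res = SARIMAX(series, order=(p, d, q),
                      seasonal_order=(P, D, Q, 12)).fit(disp=False)  # assumed seasonal period
        if res.aic < best_aic:
            best_aic, best_order = res.aic, (p, d, q, P, D, Q)
    except Exception:
        continue  # skip non-convergent or invalid specifications

print("best (p,d,q,P,D,Q):", best_order, "AIC:", best_aic)
```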
SSRN
The audit committee is one of the key elements in the corporate governance structure that helps to control and monitor management in the organization. The aim of this study is to investigate the impact of the audit committee on the organizational performance of listed hotels and travels in Sri Lanka. The sample consists of 15 listed hotels and travels in Sri Lanka. Data was collected from secondary sources, and the hypotheses are examined using Pearson's correlation and multiple regression analysis. The results reveal that audit committee attributes such as AC independence, AC expertise and AC meetings have a significant impact on the organizational performance of listed hotels and travels in Sri Lanka, while audit committee size is not found to have a significant impact. The findings could be useful to regulators in other jurisdictions who are looking for ways to enhance the effectiveness of ACs, overall firm governance and organizational performance.
arXiv
In commodity and energy markets, swing options allow the buyer to hedge against futures price fluctuations and to select a preferred delivery strategy within daily or periodic constraints, possibly fixed by observing quoted futures contracts. In this paper we focus on the natural gas market and present a dynamical model for commodity futures prices that calibrates to liquid market quotes and implies the volatility smile for futures contracts with different delivery periods. We solve the numerical problem by means of least-squares Monte Carlo simulation and investigate alternative approaches based on reinforcement learning algorithms.
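The least-squares Monte Carlo step mentioned above follows the Longstaff-Schwartz regression idea. A full swing option with volume constraints is considerably more involved, so the sketch below prices a plain Bermudan put under geometric Brownian motion (all parameters hypothetical) just to show the regression-based continuation-value estimate:

```python
import numpy as np

rng = np.random.default_rng(42)
S0, K, r, sigma, T, steps, paths = 100.0, 100.0, 0.05, 0.2, 1.0, 50, 100_000
dt = T / steps
disc = np.exp(-r * dt)

# Simulate GBM paths (S[:, t] is the price at time (t+1)*dt)
z = rng.standard_normal((paths, steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))

payoff = np.maximum(K - S[:, -1], 0.0)  # value if held to maturity
for t in range(steps - 2, -1, -1):
    payoff *= disc                       # discount one step back
    itm = K - S[:, t] > 0                # regress only on in-the-money paths
    if itm.sum() < 10:
        continue
    x = S[itm, t]
    coeffs = np.polyfit(x, payoff[itm], deg=2)   # polynomial basis for continuation value
    continuation = np.polyval(coeffs, x)
    exercise = K - x
    ex_now = exercise > continuation
    payoff[np.where(itm)[0][ex_now]] = exercise[ex_now]

print("Bermudan put LSMC price:", disc * payoff.mean())
```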
arXiv
We introduce a new methodology to clean the correlation matrix of single-stock returns, based on a constrained principal component analysis of financial data. We construct portfolios, called "Fundamental Maximum Variance Portfolios", to capture in an optimal way the risks defined by financial criteria ("Book", "Capitalization", etc.), and we then analyze the constrained eigenvectors of the correlation matrix, which are linear combinations of these portfolios. This methodology reveals several stylized patterns of the matrix: i) the growth of the first eigenvalue with the time scale, from 1 minute to several months, seems to follow the same two-regime law for all the significant eigenvalues; ii) a universal law seems to govern the weights of all the "Maximum Variance" portfolios, under which the optimal weights are proportional to the ranking based on the studied financial criteria; iii) the volatility of the volatility of the "Maximum Variance" portfolios, which are not orthogonal, could be enough to explain a large part of the diffusion of the correlation matrix; iv) the leverage effect (the increase of the first eigenvalue as the stock market declines) occurs only for the first mode and does not generalize to the other risk factors. The leverage effect on the beta, the sensitivity of stocks to the market mode, causes the weights of the first eigenvector to vary.
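The spectral objects the abstract refers to can be illustrated with a basic (unconstrained) eigen-analysis of a simulated return correlation matrix; the paper's constrained PCA and fundamental portfolios are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
n_stocks, n_obs = 50, 1000

# Simulated returns with a common market factor
market = rng.standard_normal(n_obs)
returns = 0.5 * market[:, None] + rng.standard_normal((n_obs, n_stocks))

corr = np.corrcoef(returns, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)     # eigenvalues in ascending order
print("largest eigenvalue:", eigvals[-1])   # dominated by the market mode
print("first-eigenvector weights (roughly uniform up to sign):", eigvecs[:, -1][:5])
```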
arXiv
We study dynamic optimal portfolio allocation for monotone mean-variance preferences in a general semimartingale model. Armed with new results in this area, we revisit the work of Cui, Li, Wang and Zhu (2012, MAFI) and fully characterize the circumstances under which one can set aside a non-negative cash flow while simultaneously improving the mean-variance efficiency of the left-over wealth. The paper analyzes, for the first time, the monotone hull of the Sharpe ratio and highlights its relevance to the problem at hand.
arXiv
We present a stochastic-local volatility model for derivative contracts on commodity futures that describes forward-curve and smile dynamics with fast calibration to liquid market quotes. A parsimonious parametrization is introduced to deal with the limited number of options quoted in the market. Cleared commodity markets for futures and options are analyzed in order to include specific trading clauses and margining procedures in the pricing framework. Numerical examples of calibration and pricing are provided for different commodity products.
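A generic stochastic-local volatility specification of the type referred to above (the paper's concrete parametrization is not given in the abstract) writes the futures dynamics under the pricing measure as:

```latex
\frac{dF_t}{F_t} = L(t, F_t)\,\sqrt{v_t}\,dW_t, \qquad
dv_t = \kappa(\theta - v_t)\,dt + \xi\sqrt{v_t}\,dZ_t, \qquad
d\langle W, Z\rangle_t = \rho\,dt,
```

where L is the local-volatility (leverage) function calibrated to market smiles and v_t is a stochastic variance factor; futures are driftless under the pricing measure.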
arXiv
The Social Cost of Carbon (SCC) is estimated by integrated assessment models and is widely used by government agencies to value the climate impacts of rulemakings. However, the core discussion around the SCC has so far focused on the validity of the numerical estimates obtained and the related uncertainties, while largely neglecting a deeper discussion of the limits of the SCC's applicability that stem from the calculation method. This work provides the conceptual mathematical background and economic interpretation behind the SCC calculation in three widely used integrated assessment models. Policy makers need to be aware of a subtle but decisive difference between the actual and the commonly implied meanings of the SCC, one that substantially limits its applicability as compared to current practice.
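For reference, the quantity computed by such models is the marginal welfare damage of an extra unit of emissions, converted into consumption units (standard background, not taken from the abstract):

```latex
\mathrm{SCC}(t) = -\,\frac{\partial W / \partial E_t}{\partial W / \partial C_t},
```

where W is intertemporal welfare, E_t emissions and C_t consumption at time t; the sign convention makes the SCC positive, and the interpretation of this ratio is exactly where the applicability limits discussed above arise.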
SSRN
I examine the liquidity effect of the adoption of ASC 606: Revenue from Contracts with Customers. Using a staggered difference-in-differences design, I find that the adoption of ASC 606 increases liquidity. Next, I examine the channels through which the adoption of the standard affects liquidity. Theory suggests that the adoption of standards can affect liquidity through either the precision channel, i.e., the change in the accounting report's ability to reflect economic events, the comparability channel, i.e., the increase in comparability across reporting entities because of the standardization of accounting standards, or both. I find that the adoption of the new revenue recognition standard is associated with increases in both precision and comparability, which are, in turn, associated with an increase in liquidity. I further show that firms that experience an increase in neither precision nor comparability do not experience an increase in liquidity.
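A generic staggered difference-in-differences specification of the kind described (the paper's exact controls, fixed effects and liquidity measure are not shown in the abstract) is:

```latex
\mathrm{Liquidity}_{it} = \alpha_i + \gamma_t + \beta\,\mathrm{ASC606}_{it} + \varepsilon_{it},
```

where alpha_i and gamma_t are firm and time fixed effects, ASC606_{it} indicates post-adoption firm-years, and beta captures the liquidity effect of the standard.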