# Research articles for 2019-10-15

arXiv

The informational context is regularly questioned in transitional economic regimes like those implemented in China or Vietnam. This article investigates this issue and the predictive power of fundamental analysis in such a context, more precisely the Chinese context, with an analysis of three different industries (media, power, and steel). Using three different kinds of correlation, we examine 25 financial determinants for 60 Chinese listed companies between 2011 and 2015. Our results show that fundamental analysis can effectively be used as an investment tool in a transitional economic context. In contrast with the EMH, under which accounting information is instantaneously integrated into financial information (stock prices), our study suggests that these two levels of information are not synchronized in China, thereby opening a door for fundamental-analysis-based prediction. Furthermore, our results also indicate that accounting information reflects economic reality quite well, since financial reports in each industry can disclose part of the stock-value information in line with the economic situation of the industry under consideration.
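The abstract does not name the three correlation measures used. A minimal sketch, assuming the common trio of Pearson, Spearman, and Kendall, and using synthetic data in place of the study's 60-company panel of fundamentals and returns:

```python
import numpy as np

def pearson(x, y):
    # Linear correlation between two samples
    return float(np.corrcoef(x, y)[0, 1])

def spearman(x, y):
    # Spearman rho = Pearson correlation of the ranks (assumes no ties)
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return pearson(rx, ry)

def kendall(x, y):
    # Kendall tau-a: (concordant - discordant pairs) / total pairs
    n = len(x)
    s = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            s += np.sign(x[i] - x[j]) * np.sign(y[i] - y[j])
    return s / (n * (n - 1) / 2)

rng = np.random.default_rng(0)
n = 60                                      # number of listed firms, as in the study
roe = rng.normal(0.10, 0.05, n)             # hypothetical fundamental (e.g. ROE)
ret = 0.5 * roe + rng.normal(0, 0.04, n)    # returns partly driven by fundamentals

print(round(pearson(roe, ret), 3),
      round(spearman(roe, ret), 3),
      round(kendall(roe, ret), 3))
```

A positive value on all three measures is the kind of evidence the paper reads as accounting information predicting stock value.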

arXiv

We study the problem of dynamically trading futures in a regime-switching market. Modeling the underlying asset price as a Markov-modulated diffusion process, we present a utility maximization approach to determine the optimal futures trading strategy. This leads to the analysis of the associated system of Hamilton-Jacobi-Bellman (HJB) equations, which are reduced to a system of linear ODEs. We apply our stochastic framework to two models, namely, the Regime-Switching Geometric Brownian Motion (RS-GBM) model and Regime-Switching Exponential Ornstein-Uhlenbeck (RS-XOU) model. Numerical examples are provided to illustrate the investor's optimal futures positions and portfolio value across market regimes.
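The RS-GBM dynamics can be illustrated with a small simulation. This is not the paper's HJB solution, only a sketch of the Markov-modulated price process it is built on; all parameters (a two-state bull/bear chain, its generator, and the per-regime drifts and volatilities) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical market regimes: 0 = bull, 1 = bear
mu    = np.array([0.08, -0.05])   # drift per regime
sigma = np.array([0.15,  0.35])   # volatility per regime
Q     = np.array([[-0.5,  0.5],   # generator matrix of the regime chain
                  [ 1.0, -1.0]])

T, n = 1.0, 252
dt = T / n
regime = 0
s = np.empty(n + 1)
s[0] = 100.0
path_regimes = np.empty(n + 1, dtype=int)
path_regimes[0] = regime

for k in range(n):
    # Euler discretization of the chain: switch with probability ~ -q_ii * dt
    if rng.random() < -Q[regime, regime] * dt:
        regime = 1 - regime
    # GBM step with the current regime's drift and volatility
    z = rng.standard_normal()
    s[k + 1] = s[k] * np.exp((mu[regime] - 0.5 * sigma[regime] ** 2) * dt
                             + sigma[regime] * np.sqrt(dt) * z)
    path_regimes[k + 1] = regime

print(f"terminal price: {s[-1]:.2f}, "
      f"fraction of time in bear regime: {path_regimes.mean():.0%}")
```

The optimal futures position in the paper then depends on which regime is currently active, which is what makes the associated HJB system regime-coupled.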

arXiv

We address a long-standing open problem in risk theory, namely the optimal strategy to pay out dividends from an insurance surplus process if the dividend rate can never be decreased. The optimality criterion here is to maximize the expected value of the aggregate discounted dividend payments up to the time of ruin. In the framework of the classical Cramér-Lundberg risk model, we solve the corresponding two-dimensional optimal control problem and show that the value function is the unique viscosity solution of the corresponding Hamilton-Jacobi-Bellman equation. We also show that the value function can be approximated arbitrarily closely by ratcheting strategies with only a finite number of possible dividend rates, and we identify the free boundary and the optimal strategies in several concrete examples. These implementations illustrate that the restriction of ratcheting does not lead to a large efficiency loss when compared to the classical unconstrained optimal dividend strategy.
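A ratcheting strategy with finitely many rates, of the kind the abstract says approximates the value function, is easy to simulate. The sketch below is not the paper's optimal strategy: it uses a hypothetical two-rate threshold rule (raise the rate once the Cramér-Lundberg surplus exceeds a level `b`, never lower it) with made-up parameters:

```python
import numpy as np

rng = np.random.default_rng(5)

# Cramer-Lundberg surplus with a two-rate ratcheting dividend strategy
c, lam, claim_mean = 2.0, 1.0, 1.5   # premium rate, claim intensity, mean claim size
d_lo, d_hi, b = 0.2, 0.8, 10.0       # low/high dividend rates, ratcheting threshold
x, T, dt = 5.0, 50.0, 0.01           # initial surplus, horizon, time step
discount = 0.04

t, surplus, rate = 0.0, x, d_lo
paid = 0.0                           # discounted dividends collected so far
while t < T and surplus >= 0:
    if surplus > b:
        rate = d_hi                  # ratchet up; the rate is never decreased again
    surplus += (c - rate) * dt       # premiums in, dividends out
    if rng.random() < lam * dt:      # a Poisson claim arrives
        surplus -= rng.exponential(claim_mean)
    paid += np.exp(-discount * t) * rate * dt
    t += dt

print(f"ruin: {'yes' if surplus < 0 else 'no'}, "
      f"discounted dividends: {paid:.2f}")
```

Averaging `paid` over many such paths estimates the objective the paper maximizes; the free boundary it identifies plays the role of the threshold `b` here.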

arXiv

Investment returns naturally reside on irregular domains; however, standard multivariate portfolio optimization methods are agnostic to data structure. To this end, we investigate ways for domain knowledge to be conveniently incorporated into the analysis by means of graphs. Next, to relax the assumption of the completeness of the graph topology and to equip the graph model with practically relevant physical intuition, we introduce the portfolio cut paradigm. This graph-theoretic portfolio partitioning technique is shown to allow the investor to devise robust and tractable asset allocation schemes, by virtue of a rigorous graph framework for considering smaller, computationally feasible, and economically meaningful clusters of assets based on graph cuts. In turn, this makes it possible to fully utilize the asset returns covariance matrix for constructing the portfolio, even without the requirement for its inversion. The advantages of the proposed framework over traditional methods are demonstrated through numerical simulations based on real-world price data.
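A minimal sketch of a two-way graph cut on an asset graph. This is a generic spectral (Fiedler-vector) cut of a correlation graph, not necessarily the cut criterion used in the paper, and the two-sector synthetic returns are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic returns: two latent sectors of 4 assets each (hypothetical data)
n_obs, n_assets = 500, 8
factor = rng.standard_normal((n_obs, 2))
load = np.zeros((n_assets, 2))
load[:4, 0] = 1.0                    # assets 0-3 follow factor 1
load[4:, 1] = 1.0                    # assets 4-7 follow factor 2
returns = factor @ load.T + 0.5 * rng.standard_normal((n_obs, n_assets))

# Graph edge weights: absolute correlations, zero diagonal
W = np.abs(np.corrcoef(returns.T))
np.fill_diagonal(W, 0.0)

# Normalized graph Laplacian and its Fiedler (second-smallest) eigenvector
d = W.sum(axis=1)
L = np.diag(d) - W
Lsym = L / np.sqrt(np.outer(d, d))
eigvals, eigvecs = np.linalg.eigh(Lsym)
fiedler = eigvecs[:, 1]

# The sign pattern of the Fiedler vector yields the two-way portfolio cut
cluster = (fiedler > 0).astype(int)
print(cluster)
```

Each resulting cluster can then be treated as a smaller, economically coherent sub-portfolio, which is the tractability argument the abstract makes.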

arXiv

This paper has the following objectives: to understand the concepts of environmental accounting in Brazil, and to offer criticisms and propositions anchored in the realities and demands of environmental accounting for the Paraense Amazon. The methodological strategy was a critical analysis of the books by Ferreira (2007), Ribeiro (2010), and Tinoco and Kraemer (2011), correlating them with the scientific production of authors discussing the Paraense Amazon, in addition to our own experience as researchers of this territory. As a result, we created three sections: one for understanding the current constructs of environmental accounting, one for criticism, and one for propositions.

arXiv

This paper studies a robust portfolio optimization problem under the multi-factor volatility model introduced by Christoffersen et al. (2009). The optimal strategy is derived analytically under the worst-case scenario, with or without derivative trading. To illustrate the effects of ambiguity, we compare our optimal robust strategy with strategies that ignore the information about uncertainty, and provide the corresponding welfare analysis. The effects of derivative trading on optimal portfolio selection are also discussed by considering alternative strategies. Our study is further extended to the cases with jump risks in the asset price and correlated volatility factors, respectively. Numerical experiments are provided to demonstrate the behavior of the optimal portfolio and the utility loss.

arXiv

We present an expansion for portfolio optimization in the presence of small, instantaneous, quadratic transaction costs. Specifically, the magnitude of transaction costs has a coefficient of order $\epsilon$, which leads to the optimization problem having an asymptotically-singular Hamilton-Jacobi-Bellman equation whose solution can be expanded in powers of $\sqrt\epsilon$. In this paper we derive explicit formulae for the first two terms of this expansion. Analysis and simulation are provided to show the behavior of this approximating solution.
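Schematically, an expansion in powers of $\sqrt\epsilon$ for the value function $V^\epsilon$ takes the generic form below; the explicit correction terms are the paper's contribution and are not reproduced here:

```latex
V^{\epsilon} = V^{(0)} + \sqrt{\epsilon}\, V^{(1)} + \epsilon\, V^{(2)} + O\!\left(\epsilon^{3/2}\right),
```

where $V^{(0)}$ solves the frictionless problem and the abstract's "first two terms" refer to the leading corrections in this ansatz.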

arXiv

We study the finite-sample properties of the Fourier estimator of the integrated leverage effect in the presence of microstructure noise contamination. Our estimation strategy is related to a measure of the contemporaneous correlation between financial returns and their volatility increments. We do not assume a priori that the aforementioned correlation is constant, as is mainly done in the literature. We instead consider it as a stochastic process. In this framework, we show that the Fourier estimator is asymptotically unbiased but its mean squared error diverges when noisy high-frequency data are in use. This drawback of the estimator is further analyzed in a simulation study, where a feasible estimation strategy is developed to tackle this problem. The paper concludes with an empirical study on the leverage-effect patterns estimated using high-frequency data for the S&P 500 futures between January 2007 and December 2008.
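The quantity being estimated, the contemporaneous correlation between returns and volatility increments, can be illustrated on simulated data. This sketch is not the paper's Fourier estimator: it simulates a stochastic-volatility model with leverage (hypothetical parameters) and computes the naive sample correlation between returns and variance increments, which the leverage effect drives negative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stochastic-volatility model with leverage:
#   dS/S = sigma dW1,  d(sigma^2) = kappa(theta - sigma^2) dt + xi sigma dW2,
# with corr(dW1, dW2) = rho < 0 producing the leverage effect.
n, dt = 20_000, 1 / 23_400              # ~1-second steps over a trading day
kappa, theta, xi, rho = 5.0, 0.04, 0.5, -0.7

v = np.empty(n + 1)                     # variance path sigma^2
v[0] = theta
r = np.empty(n)                         # log-return increments
for k in range(n):
    z1 = rng.standard_normal()
    z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal()
    r[k] = np.sqrt(v[k] * dt) * z1
    v[k + 1] = max(v[k] + kappa * (theta - v[k]) * dt
                   + xi * np.sqrt(v[k] * dt) * z2, 1e-10)

# Sample correlation between returns and variance increments:
# a crude proxy for what the Fourier leverage estimator targets
dv = np.diff(v)
leverage_corr = float(np.corrcoef(r, dv)[0, 1])
print(f"estimated leverage correlation: {leverage_corr:.2f}")
```

On noise-free data this proxy recovers a value near `rho`; the paper's point is that once microstructure noise contaminates the returns, naive approaches break down and the Fourier estimator's mean squared error diverges unless the strategy is adapted.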

arXiv

Charles Cobb and Paul Douglas in 1928 used data from the US manufacturing sector for 1899-1922 to introduce what is known today as the Cobb-Douglas production function, which has been widely used in economic theory for decades. We employ the R programming language to fit the formulas for the parameters of the Cobb-Douglas production function, recently derived by the authors via the bi-Hamiltonian approach, to the same data set utilized by Cobb and Douglas. We conclude that the formulas for the output elasticities and total factor productivity are compatible with the original 1928 data.
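The classical way to fit a Cobb-Douglas law $Q = A\,K^{\alpha}L^{\beta}$ is log-linear least squares, against which closed-form parameter formulas like the paper's can be checked. The paper works in R on the original 1899-1922 data; the sketch below uses Python and synthetic data generated from known elasticities, since the historical series is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic capital/labor/output data from a known Cobb-Douglas law
# Q = A * K^alpha * L^beta with multiplicative noise (parameters invented)
A_true, alpha_true, beta_true = 1.01, 0.25, 0.75
K = rng.uniform(100, 450, 24)            # 24 annual observations, as in 1899-1922
L = rng.uniform(100, 200, 24)
Q = A_true * K ** alpha_true * L ** beta_true * np.exp(rng.normal(0, 0.02, 24))

# Taking logs linearizes the model: log Q = log A + alpha log K + beta log L,
# so ordinary least squares recovers the elasticities and TFP
X = np.column_stack([np.ones(24), np.log(K), np.log(L)])
coef, *_ = np.linalg.lstsq(X, np.log(Q), rcond=None)
logA, alpha, beta = coef
print(f"A = {np.exp(logA):.3f}, alpha = {alpha:.3f}, beta = {beta:.3f}")
```

The fitted `alpha` and `beta` are the output elasticities of capital and labor, and `exp(logA)` plays the role of total factor productivity; Cobb and Douglas famously found elasticities near 0.25 and 0.75 on the historical data.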

arXiv

We extend the scope of the vol-of-vol expansion given in [27] from finite-dimensional stochastic volatility models to infinite-dimensional (rough) forward variance models, and provide new explicit representations of the push-down Malliavin weights that simplify the computations and provide new insights into their structure. This validates the Bergomi-Guyon expansion for a large class of forward variance models.