# Research articles for 2021-05-31

arXiv

Prediction of the future movement of stock prices has always been a challenging task for researchers. While advocates of the efficient market hypothesis (EMH) believe that it is impossible to design any predictive framework that can accurately predict the movement of stock prices, there are seminal works in the literature that clearly demonstrate that the seemingly random movement patterns in the time series of a stock price can be predicted with a high level of accuracy. The design of such predictive models requires the choice of appropriate variables, the right transformation methods for the variables, and tuning of the model parameters. In this work, we present a very robust and accurate framework for stock price prediction that consists of an agglomeration of statistical, machine learning, and deep learning models. We use the daily stock price data, collected at five-minute intervals, of a well-known company listed on the National Stock Exchange (NSE) of India. The granular data is aggregated into three slots per day, and the aggregated data is used for building and training the forecasting models. We contend that the agglomerative approach to model building, which uses a combination of statistical, machine learning, and deep learning approaches, can learn very effectively from the volatile and random movement patterns in stock price data. We build eight classification and eight regression models based on statistical and machine learning approaches. In addition to these models, a deep learning regression model using a long short-term memory (LSTM) network is also built. Extensive results are presented on the performance of these models, and the results are critically analyzed.
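As a minimal sketch of the slot-aggregation step described above: the abstract does not specify the slot boundaries, so the three intraday windows below (an NSE trading day runs 09:15-15:30 IST) are assumptions for illustration only. Five-minute ticks are rolled up into per-(date, slot) OHLC bars:

```python
from datetime import datetime, time

# Hypothetical slot boundaries (not specified in the abstract):
# morning, midday, and afternoon windows of an NSE trading day.
SLOTS = [(time(9, 15), time(11, 30)),
         (time(11, 30), time(13, 30)),
         (time(13, 30), time(15, 30))]

def slot_of(ts):
    """Return the index of the intraday slot containing timestamp ts."""
    for i, (start, end) in enumerate(SLOTS):
        if start <= ts.time() < end:
            return i
    return None  # outside trading hours

def aggregate(ticks):
    """Aggregate (timestamp, price) ticks into per-(date, slot) OHLC bars."""
    bars = {}
    for ts, price in ticks:
        slot = slot_of(ts)
        if slot is None:
            continue
        key = (ts.date(), slot)
        if key not in bars:
            bars[key] = {"open": price, "high": price,
                         "low": price, "close": price}
        else:
            b = bars[key]
            b["high"] = max(b["high"], price)
            b["low"] = min(b["low"], price)
            b["close"] = price  # ticks arrive in time order
    return bars
```

The resulting three bars per day form the coarser series on which the classification and regression models would then be trained.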

arXiv

We propose and develop an algebraic approach to revealed preference. Our approach dispenses with non-algebraic structure, such as topological assumptions. We provide algebraic axioms of revealed preference that subsume previous, classical revealed preference axioms, as well as generate new axioms for behavioral theories, and we show that a data set is rationalizable if and only if it is consistent with an algebraic axiom.

arXiv

Since the beginning of the 1990s, Brazil has introduced different policies for increasing agricultural production among family farms, such as the National Program for Strengthening Family Farming (Pronaf), the technical assistance and rural extension programmes (ATER), and seed distribution. Despite the importance of these policies for the development of family farming, there is a lack of empirical studies investigating their impact on the commercialisation of food products. Using household-level data from the 2014 Brazilian National Household Sample Survey, we apply propensity score matching techniques, accounting for the interaction effects between policies, to compare the commercialisation behaviour of recipients with non-recipients. We find that Pronaf has a significant positive impact on family farmers' propensity to engage in commercialisation, and this effect increases if farmers also have access to ATER. Receiving technical assistance alone has a positive effect, but this is mostly limited to smaller farms. In turn, seed distribution appears not to increase commercialisation significantly. A well-balanced policy mix could ensure that, in a country subject to the pressure of international food markets, increased commercialisation does not result in reduced food security for rural dwellers.
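The matching step can be sketched in a few lines. This is a generic one-to-one nearest-neighbour match on pre-computed propensity scores (the scores would come from a logit of programme participation on household covariates, not shown here); the function name and numbers are illustrative, not the paper's implementation:

```python
def nearest_neighbour_att(treated, controls):
    """One-to-one nearest-neighbour matching on the propensity score.

    treated / controls : lists of (propensity_score, outcome) pairs.
    Each treated unit is matched (with replacement) to the control
    whose score is closest; the average outcome gap estimates the
    average treatment effect on the treated (ATT).
    """
    diffs = []
    for score, outcome in treated:
        _, matched_outcome = min(controls, key=lambda c: abs(c[0] - score))
        diffs.append(outcome - matched_outcome)
    return sum(diffs) / len(diffs)
```

Interaction effects between policies, as in the abstract, would be handled by estimating separate scores for each policy combination before matching.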

arXiv

Generally, people exhibit two types of behaviour in social dilemmas: proself and prosocial. Inside social groups, however, people tend to choose prosocial alternatives due to in-group favoritism. The bioelectrical activity of the human brain shows that the differences between proself and prosocial individuals exist even outside a socialized group. Moreover, group socialization strengthens these differences. We used the "Neuron-Spectrum-4/EPM" EEG system (16 channels) to track the brain's bioelectrical activity during decision making in laboratory experiments with the Prisoner's Dilemma game and a short-term socialization stage. We compared the spatial distribution of the spectral density across the different parts of the experiment. Noncooperative decisions were characterized by increased spectral density of the beta rhythm in the orbital regions of the prefrontal cortex. The cooperative choice, on the contrary, was accompanied by theta-rhythm activation in the central cortical regions of both hemispheres and by the high-frequency alpha rhythm in the medial regions of the prefrontal cortex. People who increased their cooperation level after the socialization stage were initially different, in terms of bioelectrical activity, from those who decreased it. Well-socialized participants differed by increased spectral density in the theta band and decreased spectral density in the beta band in the middle part of the frontal lobe. People who decreased their cooperation level after the socialization stage were characterized by decreased spectral density of the alpha rhythm in the middle and posterior convex regions of both hemispheres.

arXiv

Consider a random variable X with a determined outcome (i.e., X = x0), so that p(x0) = 1. Let x0 have a discrete uniform distribution over the integer interval [1, s], where the size of the sample space is s = 1 in the initial state, such that p(x0) = 1. What are the probability of x0 and the associated information entropy (H) as s increases exponentially? If the sample space expands at an exponential rate (rate constant lambda) with time (t), then applying the time scaling T = lambda * t gives p(x0|T) = exp(-T) and H(T) = T. The characterization is also extended to exponential expansion by means of simultaneous, independent processes, as well as to the more general multi-exponential case. The methodology was applied to the expansion of the broad US$ money supply over the period 2001-2019 as a real-world example. At any given time, the information entropy is related to the rate at which the sample space is expanding. In the context of the expansion of the broad money supply, the information entropy could be considered to be related to the "velocity" of the expansion of the money supply.
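The two closed forms quoted above follow in one line from the uniform assumption; a short sketch (entropy in nats, i.e., using the natural logarithm):

```latex
p(x_0 \mid t) = \frac{1}{s(t)}, \qquad s(t) = e^{\lambda t}, \qquad T = \lambda t
\;\Longrightarrow\;
p(x_0 \mid T) = e^{-T}, \qquad
H(T) = -\ln p(x_0 \mid T) = T .
```

For expansion by $n$ simultaneous, independent exponential processes, $s(t) = \prod_{i=1}^{n} e^{\lambda_i t}$, so the entropies add: $H(t) = \sum_{i=1}^{n} \lambda_i t$.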

arXiv

We introduce the Climate Change Valuation Adjustment (CCVA) to capture climate change impacts on CVA+FVA that are currently invisible under typical market practice. To discuss such impacts on CVA+FVA arising from changes to instantaneous hazard rates, we introduce a flexible and expressive parameterization to capture the path of this impact up to climate change endpoints, as well as transient transition effects. Finally, we quantify examples of typical interest where there is risk of economic stress from sea level change up to 2101, and from transformations of business models. We find that even with the slowest possible uniform approach to a climate change impact in 2101, there can still be significant CVA+FVA impacts on interest rate swaps of 20 years or more in maturity. Transformation effects on CVA+FVA are strongly dependent on the timing and duration of the business model transformation. Using a parameterized approach enables discussion with stakeholders of economic impacts on CVA+FVA, whatever the details behind the climate impact.

arXiv

Complex systems in the real world are frequently characterized by entities connected by relationships of different kinds. Such systems can be appropriately described by multilayer networks, where the different relations between nodes can be conveniently expressed by structuring the network through different layers. Extending the classical network indicators proposed for monolayer networks to the multilayer context is an important issue. In this work we study the incidence of triangular patterns, expressed through the local clustering coefficient, in a weighted undirected multilayer network. We provide new local clustering coefficients for multilayer networks, looking at the network from different perspectives depending on the node's position, as well as a global clustering coefficient for the whole network. We also prove that all the well-known expressions for clustering coefficients in the literature, suitably extended to the multilayer framework, may be unified into our proposal, both in terms of tensors and of the supra-adjacency matrix. To show the effectiveness of the proposed measures, we study an application to a multilayer temporal financial network based on the returns of the S&P 100 assets.
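For context, one of the monolayer definitions that such multilayer coefficients generalize is the weighted local clustering coefficient of Onnela et al. (geometric mean of max-normalized triangle weights). A minimal single-layer sketch, not the paper's multilayer measure:

```python
def onnela_clustering(w, i):
    """Weighted local clustering of node i (Onnela et al. variant).

    w : dict mapping frozenset({u, v}) -> positive edge weight
        of an undirected weighted graph.
    C_i = (2 / (k_i (k_i - 1))) * sum over neighbour pairs (j, l) of
          (w_hat_ij * w_hat_jl * w_hat_il)^(1/3),
    where w_hat is the weight divided by the maximum weight.
    """
    w_max = max(w.values())
    nbrs = sorted({u for e in w for u in e if i in e and u != i})
    k = len(nbrs)
    if k < 2:
        return 0.0
    total = 0.0
    for a in range(k):
        for b in range(a + 1, k):
            j, l = nbrs[a], nbrs[b]
            e_jl = frozenset({j, l})
            if e_jl in w:  # triangle (i, j, l) is closed
                total += (w[frozenset({i, j})] * w[e_jl]
                          * w[frozenset({i, l})]) ** (1 / 3) / w_max
    return 2 * total / (k * (k - 1))
```

A fully connected triangle with equal weights returns 1.0, matching the unweighted coefficient, which is the kind of consistency the paper's unified multilayer proposal preserves across definitions.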

arXiv

We show that competitive equilibria in a range of models related to production networks can be recovered as solutions to dynamic programs. Although these programs fail to be contractive, we prove that they are tractable. As an illustration, we treat Coase's theory of the firm, equilibria in production chains with transaction costs, and equilibria in production networks with multiple partners. We then show how the same techniques extend to other equilibrium and decision problems, such as the distribution of management layers within firms and the spatial distribution of cities.

arXiv

This study investigates the relationship between bank profitability and a comprehensive list of bank-specific, industry-specific, and macroeconomic variables, using unique panel data from 23 Bangladeshi banks with large market shares from 2005 to 2019 and employing the pooled ordinary least squares (POLS) method for regression estimation. The random-effects model has been used to check for robustness. Three variables, namely Return on Assets (ROA), Return on Equity (ROE), and Net Interest Margin (NIM), have been used as profitability proxies. Non-interest income, the capital ratio, and GDP growth have been found to have a significant relationship with ROA. In addition to non-interest income, market share, bank size, and the real exchange rate are significant explanatory variables when profitability is measured as NIM. The only significant determinant of profitability measured by ROE is market share. The primary contribution of this study to the existing knowledge base is an extensive empirical analysis covering the entire gamut of independent variables (bank-specific, industry-related, and macroeconomic) to explain the profitability of banks in Bangladesh. It also covers an extensive and recent data set. Banking sector stakeholders may find great value in the outputs of this paper. Regulators and policymakers may find it useful in undertaking analyses when setting policy rates, assessing banking industry stability, and evaluating the impact of critical policy measures before and after enactment. Investors and bank management can use the findings of this paper in analyzing the real drivers of profitability of the banks they are contemplating investing in or managing on a daily basis.

arXiv

The Foreign Exchange (Forex) market is a large decentralized market in which trading analysis and algorithmic trading are popular. Research efforts have focused on proving the efficiency of certain technical indicators. We demonstrate, however, that the values of indicator functions are not reproducible and often reduce the number of trade opportunities compared to price-action trading.

In this work, we develop two dataset-agnostic Forex trading heuristic templates with a high rate of trading signals. To determine the optimal parameters for the given heuristic prototypes, we perform a machine learning simulation over 10 years of Forex price data on three low-margin instruments and six different OHLC granularities. As a result, we produce a specific and reproducible list of the optimal trade parameters found for each instrument-granularity pair, with 118 pips of average daily profit for the optimized configuration.
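The parameter search over an instrument-granularity pair can be as simple as exhaustive evaluation over a grid. The toy breakout heuristic and the bar fields below are hypothetical stand-ins (the paper's heuristic templates are not specified in the abstract); the sketch only illustrates selecting the best (stop-loss, take-profit) pair, in pips, by simulated profit:

```python
from itertools import product

def backtest(bars, stop_loss, take_profit):
    """Toy backtest of a hypothetical breakout heuristic: go long when a
    bar closes above the previous bar's high; the realised move after
    entry is capped by the assumed stop-loss / take-profit distances."""
    profit = 0.0
    for prev, cur in zip(bars, bars[1:]):
        if cur["close"] > prev["high"]:           # entry signal
            move = cur["next_move_pips"]          # move realised after entry
            profit += min(max(move, -stop_loss), take_profit)
    return profit

def grid_search(bars, stop_losses, take_profits):
    """Exhaustive search for the (stop, target) pair with highest profit."""
    return max(product(stop_losses, take_profits),
               key=lambda p: backtest(bars, *p))
```

Running this per instrument and per OHLC granularity yields the kind of reproducible parameter table the abstract reports.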

arXiv

Although recent studies have shown that electricity systems with shares of wind and solar above 80% can be affordable, economists have raised concerns about market integration. Correlated generation from variable renewable sources depresses market prices, which can cause wind and solar to cannibalise their own revenues and prevent them from covering their costs from the market. This cannibalisation appears to set limits on the integration of wind and solar, and thus to contradict studies that show that high shares are cost-effective. Here we show from theory and with simulation examples how market incentives interact with prices, revenue, and costs for renewable electricity systems. The decline in average revenue seen in some recent literature is due to an implicit policy assumption that technologies are forced into the system, whether with subsidies or quotas. This decline is mathematically guaranteed regardless of whether the subsidised technology is variable or not. If instead the driving policy is a carbon dioxide cap or tax, wind and solar shares can rise without cannibalising their own market revenue, even at penetrations above 80%. The strong dependence of market value on the policy regime means that market value must be used with caution as a measure of market integration. Declining market value is not necessarily a sign of integration problems, but rather a result of policy choices.

arXiv

This paper revisits the discussion on the determinants of budget balances and investigates the change in their effect in light of the COVID-19 crisis, utilizing data on 43 countries and a system generalized method of moments approach. The results show that the overall impact of the global pandemic led to a disproportionate increase in the magnitude of the estimated effects of the macroeconomic determinants on the budget balance. However, we also find that more developed economies were able to undertake larger stimulus packages for roughly the same level of primary balance. We believe that one of the factors behind this outcome is that they exhibit a higher government debt position denominated in domestic currency.

arXiv

Estimating the health benefits of reducing fossil fuel use through improved air quality provides an important rationale for carbon emissions abatement. Simulating pollution concentrations is a crucial step of the estimation, but traditional approaches often rely on complicated chemical transport models that require extensive expertise and computational resources. In this study, we develop a novel and succinct machine learning framework that provides precise and robust annual average fine particle (PM2.5) concentration estimates directly from a high-resolution fossil energy use data set. The accessibility and applicability of this framework show the great potential of machine learning approaches for integrated assessment studies. Applications of the framework with Chinese data reveal highly heterogeneous health benefits of reducing fossil fuel use across sectors and regions in China, with a mean of \$34/tCO2 and a standard deviation of \$84/tCO2. Reducing rural and residential coal use offers the highest co-benefits, with a mean of \$360/tCO2. Our findings prompt careful policy designs to maximize cost-effectiveness in the transition towards a carbon-neutral energy system.

arXiv

We show that in a financial market given by semimartingales, an arbitrage opportunity, provided it exists, can only be exploited through short selling. This finding provides a theoretical basis for differences in regulation between financial services providers that are allowed to go short and those that are not. The privilege of being allowed to short sell gives access to potential arbitrage opportunities, which by design creates a bankruptcy risk.

arXiv

This paper studies the impact of public and private investment on the economic growth of developing countries. The study uses panel data from 39 developing countries covering the period 1990-2019. It is based on neoclassical (exogenous) growth models, in which land, labor, capital accumulation, and technology are substantial for economic growth. The paper finds that public investment has a stronger positive impact on economic growth than private investment. Gross capital formation, labor growth, and government final consumption expenditure were found to be significant in explaining economic growth. Overall, both public and private investment are substantial for the economic growth and development of developing countries.

arXiv

Spectral risk measures (SRMs) belong to the family of coherent risk measures. A natural estimator for the class of SRMs has the form of an L-statistic. Various authors have studied and derived the asymptotic properties of the empirical estimator of an SRM. We propose a kernel-based estimator of SRMs. We investigate the large-sample properties of general L-statistics based on i.i.d. and dependent observations and apply them to our estimator. We prove that it is strongly consistent and asymptotically normal. We compare the finite-sample performance of our proposed kernel estimator with that of several existing estimators for different SRMs using Monte Carlo simulation, and observe that our proposed kernel estimator outperforms all of them. Based on our simulation study, we estimate the exponential SRM of four futures indices, namely the Nikkei 225, DAX, FTSE 100, and Hang Seng. We also perform a backtesting exercise for the SRM.
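For reference, the classical empirical estimator that the proposed kernel estimator refines is a simple L-statistic: a weighted sum of order statistics, with weights given by the mass the risk spectrum assigns to each quantile interval. A sketch, with the exponential spectrum written in one common parameterization (conventions for the spectrum vary across the literature, so treat the exact form as an assumption):

```python
import math

def empirical_srm(losses, Phi):
    """Empirical spectral risk measure as an L-statistic.

    losses : sample of losses (larger = worse)
    Phi    : antiderivative of the risk spectrum phi on [0, 1],
             with Phi(0) = 0 and Phi(1) = 1.
    Returns sum_i [Phi(i/n) - Phi((i-1)/n)] * X_(i),
    where X_(i) are the order statistics of the sample.
    """
    xs = sorted(losses)
    n = len(xs)
    return sum((Phi(i / n) - Phi((i - 1) / n)) * x
               for i, x in enumerate(xs, start=1))

def exp_Phi(k):
    """Antiderivative of the exponential spectrum
    phi(u) = k * e^(k u) / (e^k - 1), which weights larger losses more."""
    return lambda u: (math.exp(k * u) - 1) / (math.exp(k) - 1)
```

With a flat spectrum (Phi(u) = u) the estimator reduces to the sample mean; the kernel estimator proposed in the paper smooths these order-statistic weights.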

arXiv

We analyze recently proposed mortgage contracts that aim to eliminate selective borrower default when the loan balance exceeds the house price (the "underwater" effect). We show that contracts that automatically reduce the outstanding balance in the event of a house price decline remove the default incentive, but may induce prepayment in low price states. However, low-state prepayments vanish if the benefit from home ownership is sufficiently high. We also show that capital gain sharing features, such as prepayment penalties in high house price states, are ineffective, as they virtually eliminate prepayment. For observed foreclosure costs, we find that contracts with automatic balance adjustments become preferable to traditional fixed-rate contracts at mortgage rate spreads of 50-100 basis points. We obtain these results for perpetual versions of the contracts using American options pricing methodology, in a continuous-time model with diffusive home prices. The contracts' values and optimal decision rules are associated with free boundary problems, which admit semi-explicit solutions.

arXiv

This paper examines a class of barrier options: multi-step barrier options, which can have any finite number of barriers of any level. We obtain a general, explicit expression for option prices of this type under the Black-Scholes model. Multi-step barrier options are useful not only because they can handle barriers of different levels and time steps, but also because they can approximate options with arbitrary barriers. Moreover, they can be embedded in financial products such as deposit insurance based on jump models with simple barriers. Along the way, we derive a multi-step reflection principle, which generalizes the reflection principle of Brownian motion.
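As a sanity-check sketch of the classical result being generalized: for a simple +/-1 random walk, the reflection principle gives the exact identity #{max_{k<=n} S_k >= b} = 2 * #{S_n > b} + #{S_n = b} (for b >= 1), which can be verified by enumerating all paths. The code below is illustrative only and unrelated to the paper's multi-step derivation:

```python
from itertools import product

def walk_stats(n, b):
    """Enumerate all 2**n simple +/-1 random-walk paths of length n.
    Returns (#paths whose running maximum reaches b,
             #paths ending strictly above b,
             #paths ending exactly at b)."""
    hit = end_gt = end_eq = 0
    for steps in product((-1, 1), repeat=n):
        s = m = 0
        for d in steps:
            s += d
            m = max(m, s)
        hit += (m >= b)     # running maximum reached the barrier
        end_gt += (s > b)   # endpoint strictly above the barrier
        end_eq += (s == b)  # endpoint exactly at the barrier
    return hit, end_gt, end_eq
```

The continuous analogue, P(max W_t >= b) = 2 P(W_t > b) for Brownian motion, is the building block that the multi-step reflection principle extends to several barrier levels.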

arXiv

This paper considers the representation of energy storage in electricity sector capacity planning models. The incorporation of storage in long-term systems models of this type is increasingly relevant as the costs of storage technologies, particularly batteries, and of complementary variable renewable technologies decline. To value storage technologies appropriately, a representation of linkages between time periods is required, breaking the classical temporal aggregation strategies that greatly improve computation time. We appraise approaches to this problem, highlighting a common underlying structure, conditions for lossless aggregation, and challenges of aggregation at relevant geographical scales. We then investigate solutions to the modeling problem, including a decomposition scheme that avoids temporal aggregation at a parallelizable computational cost. These examples frame aspects of the problem ripe for contributions from the operational research community.

arXiv

We study the problem of active portfolio management in which an investor aims to outperform a benchmark strategy's risk profile while not deviating too far from it. Specifically, the investor considers alternative strategies whose terminal wealth lies within a Wasserstein ball surrounding the benchmark's (being distributionally close) and that have a specified dependence/copula (tying state-by-state outcomes) to it. The investor then chooses the alternative strategy that minimises a distortion risk measure of terminal wealth. In a general (complete) market model, we prove that an optimal dynamic strategy exists and provide its characterisation through the notion of isotonic projections.

We further propose a simulation approach to calculate the optimal strategy's terminal wealth, making our approach applicable to a wide range of market models. Finally, we illustrate how investors with different copula and risk preferences invest and improve upon the benchmark using the Tail Value-at-Risk, inverse S-shaped, and lower- and upper-tail distortion risk measures as examples. We find that investors' optimal terminal wealth distribution has larger probability masses in regions that reduce their risk measure relative to the benchmark while preserving the benchmark's structure.

arXiv

We study a utility maximization problem in a financial market with a stochastic drift process, combining a worst-case approach with filtering techniques. Drift processes are difficult to estimate from asset prices, and at the same time optimal strategies in portfolio optimization problems depend crucially on the drift. We approach this problem by setting up a worst-case optimization problem with a time-dependent uncertainty set for the drift. Investors assume that the worst possible drift process with values in the uncertainty set will occur. This leads to local optimization problems, and the resulting optimal strategy needs to be updated continuously in time. We prove a minimax theorem for the local optimization problems and derive the optimal strategy. Further, we show how an ellipsoidal uncertainty set can be defined based on filtering techniques and demonstrate that investors need to choose a robust strategy to be able to profit from additional information.

arXiv

What resources and technologies are strategic? This question is often the focus of policy and theoretical debates, where the label "strategic" designates those assets that warrant the attention of the highest levels of the state. But these conversations are plagued by analytical confusion, flawed heuristics, and the rhetorical use of "strategic" to advance particular agendas. We aim to improve these conversations through conceptual clarification, introducing a theory based on important rivalrous externalities for which socially optimal behavior will not be produced alone by markets or individual national security entities. We distill and theorize the most important three forms of these externalities, which involve cumulative-, infrastructure-, and dependency-strategic logics. We then employ these logics to clarify three important cases: the Avon 2 engine in the 1950s, the U.S.-Japan technology rivalry in the late 1980s, and contemporary conversations about artificial intelligence.