Research articles for 2019-09-23
SSRN
The design of Bitcoin is closely related to gold, which has led to the hypothesis that Bitcoin has gold-like features such as being a store of value and a safe haven. Given Bitcoin's extreme volatility, it is interesting to analyze how investors protect themselves from extreme negative price movements in Bitcoin. In other words, is there a safe haven for investors in Bitcoin? We analyze intra-day price changes of the largest stablecoins and find that they act as a safe haven, with Tether showing the strongest effect. These findings demonstrate that Bitcoin investors seek out stablecoins when Bitcoin price falls are extreme (even by Bitcoin standards), rendering stablecoins unstable.
arXiv
This study provides a consistent and efficient pricing method for both Standard & Poor's 500 Index (SPX) options and the Chicago Board Options Exchange's Volatility Index (VIX) options under a multiscale stochastic volatility model. To capture the multiscale volatility of the financial market, our model adds a fast scale factor to the well-known Heston volatility and we derive approximate analytic pricing formulas for the options under the model. The analytic tractability can greatly improve the efficiency of calibration compared to fitting procedures with the finite difference method or Monte Carlo simulation. Our experiment using options data from 2016 to 2018 shows that the model reduces the errors on the training sets of the SPX and VIX options by 9.9% and 13.2%, respectively, and decreases the errors on the test sets of the SPX and VIX options by 13.0% and 16.5%, respectively, compared to the single-scale model of Heston. The error reduction is possible because the additional factor reflects short-term impacts on the market, which is difficult to achieve with only one factor. It highlights the necessity of modeling multiscale volatility.
SSRN
We provide empirical evidence to support the calibration of a limit on household indebtedness levels, in the form of a cap on the debt-service-to-income (DSTI) ratio, in order to reduce the probability of borrower defaults in Romania. The analysis establishes two findings that are new to the literature. First, we show that the relationship between DSTI and probability of default is non-linear, with probability of default responding to increases in DSTI only after a certain threshold. Second, we establish that consumer loan defaults occur at lower levels of DSTI compared to mortgages. Our results support the recent regulation adopted by the National Bank of Romania, limiting the household DSTI at origination to 40 percent for new mortgages and consumer loans. Our counterfactual analysis indicates that had the limit been in place for all the loans in our sample, the probability of default (PD) would have been lower by 23 percent.
SSRN
The composite equity issuance (CEI) anomaly is one of the well-known stock anomalies. Kent Daniel and Sheridan Titman, in their 2006 paper "Market Reaction to Tangible and Intangible Information", broke down stock returns into tangible and intangible returns and argued that the intangible return can explain both the return reversal effect and the book-to-price effect. In their paper, composite equity issuance was used as a measure of intangible return. Despite wide citation of the CEI anomaly in various publications, we found little understanding of the fundamental drivers behind this anomaly. In this paper, we decompose the anomaly through a simple transformation and show that the efficacy of the CEI anomaly rests on two of the most powerful stock factors: share change and dividend yield.
arXiv
Deep Learning (DL) models can be used to tackle time series analysis tasks with great success. However, the performance of DL models can degenerate rapidly if the data are not appropriately normalized. This issue is even more apparent when DL is used for financial time series forecasting tasks, where the non-stationary and multimodal nature of the data pose significant challenges and severely affect the performance of DL models. In this work, a simple yet effective neural layer that is capable of adaptively normalizing the input time series, while taking into account the distribution of the data, is proposed. The proposed layer is trained in an end-to-end fashion using back-propagation and leads to significant performance improvements compared to other evaluated normalization schemes. The proposed method differs from traditional normalization methods since it learns how to perform normalization for a given task instead of using a fixed normalization scheme. At the same time, it can be directly applied to any new time series without requiring re-training. The effectiveness of the proposed method is demonstrated using a large-scale limit order book dataset, as well as a load forecasting dataset.
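For intuition, here is a minimal PyTorch sketch of an adaptive input-normalization layer in this spirit: shift and scale are produced by learned functions of each window's own summary statistics. The specific architecture below is an illustrative assumption, not the paper's exact design.

```python
import torch
import torch.nn as nn

class AdaptiveNorm(nn.Module):
    """Sketch of an adaptive normalization layer: the shift and scale
    applied to each input window come from learned linear maps of that
    window's own statistics (illustrative, not the paper's design)."""

    def __init__(self, n_features):
        super().__init__()
        self.shift = nn.Linear(n_features, n_features, bias=False)
        self.scale = nn.Linear(n_features, n_features, bias=False)

    def forward(self, x):
        # x: (batch, time, features)
        mean = x.mean(dim=1)                   # per-window means
        x = x - self.shift(mean).unsqueeze(1)  # learned, data-dependent shift
        std = x.pow(2).mean(dim=1).sqrt()      # per-window scales
        x = x / (self.scale(std).abs().unsqueeze(1) + 1e-8)  # learned rescale
        return x

out = AdaptiveNorm(4)(torch.randn(32, 100, 4))  # trains end-to-end by backprop
```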
arXiv
We investigate state-dependent effects of fiscal multipliers and allow for endogenous sample splitting to determine whether the US economy is in a slack state. When the endogenized slack state is estimated as periods with an unemployment rate above about 12 percent, the estimated cumulative multipliers are significantly larger during slack periods than non-slack periods and are above unity. We also examine the possibility of time-varying regimes of slackness and find that our empirical results are robust under a more flexible framework. Our estimation results point to the importance of the heterogeneous effects of fiscal policy.
SSRN
AI, automation and other digital technologies are thought to be transforming the economy, but empirical evidence on their diffusion and impact is scarce. This paper uses new firm-level administrative data from Germany to analyze the causes and consequences of firms' investment in new technology, namely digitization and automation. The main results characterize the relationship between technology and labor: (1) investment in technology is typically increased by labor scarcity; (2) new technologies typically reduce employment. Both results hide important heterogeneity across industries: aggregate substitution patterns are driven by, e.g., manufacturing and trade, while complementarity dominates in, e.g., IT and finance. For identification, in (1) I use variation in labor scarcity driven by population aging, and in (2) a difference-in-differences design across high- and low-adoption areas and industries. Additional results demonstrate that financial constraints impede technology adoption and that technology increases the skill level of the workforce as well as labor productivity. Overall, the effects of new technologies vary significantly across industries, firms and types of workers, highlighting the importance of considering a broad set of technologies and studying the patterns of their adoption by firms.
arXiv
We develop a dynamic version of the SSVI parameterisation for the total implied variance, ensuring that European vanilla option prices are martingales, hence preventing the occurrence of arbitrage, both static and dynamic. Insisting on the constraint that the total implied variance needs to be null at the maturity of the option, we show that no model--in our setting--allows for such behaviour. This naturally gives rise to the concept of implied volatility bubbles, whereby trading in an arbitrage-free way is only possible during part of the life of the contract, but not all the way until expiry.
arXiv
In the following paper, we analyse the ID$_3$-Price in the German Intraday Continuous electricity market using an econometric time series model. A multivariate approach is conducted for hourly and quarter-hourly products separately. We estimate the model using lasso and elastic net techniques and perform an out-of-sample, very short-term forecasting study. The model's performance is compared with benchmark models and is discussed in detail. Forecasting results provide new insights into the German Intraday Continuous electricity market, regarding both its efficiency and the behaviour of the ID$_3$-Price.
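As a concrete illustration of this kind of estimation, here is a minimal elastic-net autoregression on lagged prices, using scikit-learn rather than the authors' setup; the price series and the 24-lag design are stand-ins, not the paper's data or specification.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV

# Illustrative stand-in for an intraday price series (hourly products).
rng = np.random.default_rng(0)
price = np.cumsum(rng.normal(size=2000))

# Lagged design matrix: predict p_t from p_{t-1}, ..., p_{t-24}.
lags = 24
X = np.column_stack([price[lags - k - 1:-k - 1] for k in range(lags)])
y = price[lags:]

# Elastic net with cross-validated penalty; hold out the end of the
# sample for a simple out-of-sample check.
split = int(0.8 * len(y))
model = ElasticNetCV(l1_ratio=0.5, cv=5).fit(X[:split], y[:split])
print("nonzero lags:", np.sum(model.coef_ != 0))
print("OOS RMSE:", np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2)))
```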
arXiv
Sustainable development is a worldwide recognized social and political goal, discussed in both academic and political discourse, and much research relates sustainable development to higher education. Since mental models are formed more effectively at school age, we propose a new way of thinking that will help achieve this goal. This paper was written in the context of Russia, where the topic of sustainable development in education is poorly developed. The authors used classical case-analysis methodology, and the analysis and interpretation of the results were conducted within the framework of institutional theory. We present the case of Ural Federal University, where a student initiative group has been working for several years on a device for the purification of industrial sewer water. Schoolchildren recently joined the program, and such projects have been called university-to-school projects. Successful solutions of inventive tasks contribute to the formation of mental models. Analyzing this case in terms of institutionalism, the authors argue for the primacy of mental institutions over normative ones in the construction of a sustainable society. This case study is the first to analyze a partnership between a federal university and local schools regarding sustainable education, and it proposes a new way of thinking.
arXiv
We introduce new forecast encompassing tests for the risk measure Expected Shortfall (ES). The ES currently receives much attention owing to its introduction into the Basel III Accords, which stipulate its use as the primary market risk measure in international banking regulation. We utilize joint loss functions for the pair ES and Value at Risk to set up three ES encompassing test variants. The tests are built on misspecification-robust asymptotic theory, and we verify the finite-sample properties of the tests in an extensive simulation study. We use the encompassing tests to illustrate the potential of forecast combination methods for different financial assets.
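The joint loss functions referred to here belong to the Fissler-Ziegel family, under which the (VaR, ES) pair is jointly elicitable. The sketch below evaluates the commonly used zero-homogeneous member (often called FZ0) for a forecast sequence; the sign convention (returns, with negative VaR and ES forecasts) is an assumption of this illustration.

```python
import numpy as np

def fz0_loss(y, var, es, alpha=0.025):
    """FZ0 joint loss for (VaR, ES) forecasts (a member of the
    Fissler-Ziegel family). y: returns; var, es: forecasts, both
    negative with es <= var < 0 under this sign convention."""
    hit = (y <= var).astype(float)
    return (-hit * (var - y) / (alpha * es)
            + var / es + np.log(-es) - 1.0)

# A forecast that encompasses a competitor should not lose on average:
# compare mean FZ0 losses (the formal tests add the encompassing
# regression and robust inference on top of this).
y = np.random.default_rng(1).normal(size=1000) * 0.01
loss = fz0_loss(y, var=np.full(1000, -0.02), es=np.full(1000, -0.025))
print(loss.mean())
```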
SSRN
This paper studies whether bilateral international financial connection data help predict bilateral stock return comovement. It is shown that, when the United States is chosen as the benchmark, a larger U.S. portfolio investment asset position in the destination economy predicts a stronger stock return comovement between the two. For large economies such as the United States and Germany, the portfolio investment position is also the best predictor among the connection variables considered. Using a simple general equilibrium portfolio model, the paper shows that this empirical pattern is consistent with the behavior of index investors who trade in response to risk-on/risk-off shocks.
arXiv
Portfolio optimization emerged with the seminal paper of Markowitz (1952). The original mean-variance framework is appealing because it is very efficient from a computational point of view. However, it also has a well-established failing: it can lead to portfolios that are not optimal from a financial point of view. Nevertheless, very few models have succeeded in providing a real alternative to the Markowitz model. The main reason lies in the fact that most academic portfolio optimization models are intractable in real life, although they present solid theoretical properties. By intractable we mean that they can be implemented for an investment universe with a small number of assets, using a lot of computational resources and skills, but they are unable to manage a universe with dozens or hundreds of assets. However, the emergence and rapid development of robo-advisors mean that we need to rethink portfolio optimization and go beyond the traditional mean-variance optimization approach. Another industry has faced similar issues concerning large-scale optimization problems. Machine learning has long been associated with linear and logistic regression models. Again, the reason was the inability of optimization algorithms to solve high-dimensional industrial problems. Nevertheless, the end of the 1990s marked an important turning point with the development and the rediscovery of several methods that have since produced impressive results. The goal of this paper is to show how portfolio allocation can benefit from the development of these large-scale optimization algorithms. Not all of these algorithms are useful in our case, but four of them are essential when solving complex portfolio optimization problems. These four algorithms are coordinate descent, the alternating direction method of multipliers, the proximal gradient method and Dykstra's algorithm.
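For a flavor of these methods, here is a minimal proximal gradient sketch for a long-only minimum-variance portfolio, where the proximal step is Euclidean projection onto the simplex. This is a generic textbook illustration under assumed synthetic data, not the paper's implementation.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection onto {w >= 0, sum(w) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1 - css) / np.arange(1, len(v) + 1) > 0)[0][-1]
    theta = (1 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0)

def min_variance(sigma, n_iter=500):
    """Proximal (projected) gradient descent for min w'Σw over the
    long-only, fully invested set."""
    n = sigma.shape[0]
    w = np.full(n, 1.0 / n)
    step = 1.0 / np.linalg.eigvalsh(sigma)[-1]  # 1/λ_max(Σ) step size
    for _ in range(n_iter):
        w = project_simplex(w - step * sigma @ w)  # gradient + prox
    return w

rng = np.random.default_rng(0)
A = rng.normal(size=(60, 10))          # synthetic return panel
print(min_variance(A.T @ A / 60))      # long-only min-variance weights
```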
SSRN
We provide a review of macro-finance models featuring nonlinear dynamics. We survey the models developed recently in the literature, including models with amplification effects of financial constraints, models with households' leverage constraints, and models with financial networks. We also construct an illustrative model for those readers who are unfamiliar with the literature. Within this framework, we highlight several important limitations of local solution methods compared with global solution methods, including the fact that local-linearization approximations omit important nonlinear dynamics, yielding biased impulse-response analysis.
arXiv
Will a large economy be stable? Building on Robert May's original argument for large ecosystems, we conjecture that evolutionary and behavioural forces conspire to drive the economy towards marginal stability. We study networks of firms in which inputs for production are not easily substitutable, as in several real-world supply chains. Relying on results from Random Matrix Theory, we argue that such networks generically become dysfunctional when their size increases, when the heterogeneity between firms becomes too strong or when substitutability of their production inputs is reduced. At marginal stability and for large heterogeneities, we find that the distribution of firm sizes develops a power-law tail, as observed empirically. Crises can be triggered by small idiosyncratic shocks, which lead to "avalanches" of defaults characterized by a power-law distribution of total output losses. This scenario would naturally explain the well-known "small shocks, large business cycles" puzzle, as anticipated long ago by Bak, Chen, Scheinkman and Woodford.
arXiv
We define the concept of good trade execution and we construct explicit adapted good trade execution strategies in the framework of linear temporary market impact. Good trade execution strategies are dynamic, in the sense that they react to the actual realisation of the traded asset price path over the trading period; this is paramount in volatile regimes, where price trajectories can considerably deviate from their expected value. Remarkably, however, the implementation of our strategies does not require the full specification of an SDE evolution for the traded asset price, making them robust across different models. Moreover, rather than minimising the expected trading cost, good trade execution strategies minimise trading costs in a pathwise sense, a point of view not yet considered in the literature. The mathematical apparatus for such a pathwise minimisation hinges on certain random Young differential equations that correspond to the Euler-Lagrange equations of the classical Calculus of Variations. These Young differential equations characterise our good trade execution strategies in terms of an initial value problem that allows for easy implementations.
SSRN
In this study we propose a new determinant of non-performing loans for the case of the Greek banking sector. We employ aggregate yearly data for the period 1996-2016 and conduct a Principal Component Analysis of all the Worldwide Governance Indicators (WGI) for Greece, aiming to isolate the common component and thus create the GOVERNANCE indicator. We find that the GOVERNANCE indicator is a significant determinant of Greek banks' non-performing loans, indicating that both political and governance factors impact the level of Greek non-performing loans. An additional variable that also has a statistically significant impact on the level of Greek non-performing loans, when combined with the WGI in the dynamic specification of our model, is systemic liquidity risk. Our results could be of interest to policy makers and regulators as a macroprudential policy tool.
arXiv
This paper studies the bail-out optimal dividend problem with regime switching under the constraint that the cumulative dividend strategy is absolutely continuous. We confirm the optimality of the regime-modulated refraction-reflection strategy when the underlying risk model follows a general spectrally negative Markov additive process. To verify the conjecture of a barrier-type optimal control, we first introduce and study an auxiliary problem with a final payoff at an exponential terminal time and characterize the optimal threshold explicitly using fluctuation identities of the refracted-reflected Lévy process. Second, we transform the problem with regime switching into an equivalent local optimization problem with a final payoff up to the first regime-switching time. The refraction-reflection strategy with regime-modulated thresholds can be shown to be optimal by using the results of the first step and fixed-point arguments for auxiliary recursive iterations.
arXiv
Recent technology advances have enabled firms to flexibly process and analyze sophisticated employee performance data at a reduced and yet significant cost. We develop a theory of optimal incentive contracting where the monitoring technology that governs the above procedure is part of the designer's strategic planning. In otherwise standard principal-agent models with moral hazard, we allow the principal to partition agents' performance data into any finite categories and to pay for the amount of information that the output signal carries. Through analysis of the trade-off between giving incentives to agents and saving the monitoring cost, we obtain characterizations of optimal monitoring technologies such as information aggregation, strict MLRP, likelihood ratio-convex performance classification, group evaluation in response to rising monitoring costs, and assessing multiple task performances according to agents' endogenous tendencies to shirk. We examine the implications of these results for workforce management and firms' internal organizations.
SSRN
An asset bubble relaxes collateral constraints and increases borrowing by credit-constrained agents. At the same time, as the bubble deflates when constraints start binding, it amplifies downturns. We show analytically and quantitatively that macroprudential policy should optimally respond to building asset price bubbles non-monotonically, depending on the underlying level of indebtedness. If the level of debt is moderate, policy should accommodate the bubble to reduce the incidence of a binding collateral constraint. If debt is elevated, policy should lean against the bubble more aggressively to mitigate the pecuniary externalities from a deflating bubble when constraints bind.
arXiv
In this paper, we derive the price of a European call option on an asset following a normal process with stochastic volatility. The volatility is assumed to follow the Cox-Ingersoll-Ross (CIR) process. We then use the fast Fourier transform (FFT) to evaluate the option price, given that the characteristic function of the return is known analytically. We compare the FFT results with Monte Carlo simulation results for our process. Further, we present a numerical example to examine the normal implied volatility of the model.
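The pricing step follows the usual damped-transform logic of Carr-Madan, adapted to strikes rather than log-strikes because the model is normal: damp the call price, Fourier-transform, and invert. The sketch below uses a plain Gaussian characteristic function as a stand-in for the model's analytic one (which would embed the CIR volatility) and checks against the closed-form Bachelier price; the integration parameters are illustrative choices.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

def call_fourier(F, K, sigma, T, alpha=0.1, n=8192, umax=2.0):
    """European call on a normally distributed asset via Fourier
    inversion of the damped payoff transform (strike-space analogue
    of Carr-Madan). The Gaussian CF below is a stand-in for the
    model's analytic characteristic function."""
    phi = lambda v: np.exp(1j * v * F - 0.5 * sigma**2 * T * v**2)
    u = np.linspace(1e-8, umax, n)
    psi = phi(u - 1j * alpha) / (alpha + 1j * u) ** 2  # damped transform
    integrand = np.real(np.exp(-1j * u * K) * psi)
    return np.exp(-alpha * K) / np.pi * trapezoid(integrand, u)

def call_bachelier(F, K, sigma, T):
    """Closed-form Bachelier call, used as a check."""
    d = (F - K) / (sigma * np.sqrt(T))
    return (F - K) * norm.cdf(d) + sigma * np.sqrt(T) * norm.pdf(d)

print(call_fourier(100.0, 95.0, 10.0, 1.0))    # ~ closed form below
print(call_bachelier(100.0, 95.0, 10.0, 1.0))
```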
SSRN
This paper investigates bond risk premia in the framework of predictive systems. Unlike traditional linear predictive models, predictive systems allow predictors to be imperfectly correlated with conditional expected returns and can incorporate prior beliefs about the negative correlation between unexpected and expected returns. We find that predictive systems deliver stronger evidence of predictability than linear predictive models. Furthermore, bond risk premia inferred from predictive systems are countercyclical and increase with inflation risk, which is consistent with the implications of consumption-based asset pricing models.
arXiv
Sectoral production is modeled by cascading binary compounding processes. The sequence of processes is discovered within a self-similar hierarchical structure stylized in the economy-wide networks of production. All nesting substitution elasticities and Hicks-neutral productivity growths are measured so that the general equilibrium feedbacks between all sectoral unit cost functions reproduce the transformation of networks observed as a set of two temporally distant input-output coefficient matrices. This system of unit cost functions is examined to determine how idiosyncratic sectoral productivity shocks propagate into aggregate macroeconomic fluctuations in light of potential network transformation. We also study how sectoral productivity increments propagate into dynamic general equilibrium, allowing for network transformation, and ultimately yield social benefits.
SSRN
Neoclassical economics uses real analysis, and this approach limits its applicability to practical economic problems, including policy applications. Joseph Schumpeter and later John Maynard Keynes rejected the approach of real analysis and the notion of money neutrality. They argued that understanding changes in the economy requires closer reflection on the crucial importance of credit and money. Schumpeter and Keynes promoted monetary analysis as the proper method for analyzing the modern economy. This book joins their instructive criticism by raising doubts about the traditional views on the interest rate. We present a new theory of money and the interest rate, and I show that Wicksell's classical concept of the natural (real equilibrium) rate of interest gives no guidance to contemporary monetary policy practitioners. These terms are unobservable. The natural rate apparently became unstable after the global financial crisis. Such concepts of interest rates are key variables of real analysis, but they are inherently uncertain and imprecise. There is a growing dissatisfaction with traditional macroeconomic approaches based on the notion of money neutrality in real analysis. The endogenous theory of money highlighted the importance of money created by commercial bank credit. The central bank's presumed control over money became questionable. Central banks determine interest rates (short-term and nominal), but the quantity of money is determined endogenously. Macroprudential and liquidity requirement regulations, which are important factors in influencing commercial bank lending and endogenous money creation, became the most important instruments the central bank can use to control money. With respect to money creation, we distinguish between inside and outside money. Inside money is created by the private sector's need for money, aided by bank credit. Outside money is created by the state (not the private sector), but its creation is indirectly influenced by the money demand of the private sector. We give a brief overview of the historical process of the emergence of money by comparing the main elements of the chartalist and metallist concepts of money. Concerning the current debates about the role of banks in money creation, we compare three alternative theories of banking. All three conflict with each other in explaining even the basic facts, and yet they live in peaceful coexistence in textbooks.
These three theories treat the role of banks in money creation differently. According to the first, banks simply act as intermediaries, channelling savings to borrowers, and play no part at all in money creation. A second set of theories maintains that individual banks are unable to create money, since they cannot print banknotes, but the banking system as a whole can create money through the money multiplier process governed by the central bank. According to the third theory, banks may create money endogenously through credit creation, independently of the central bank. Describing today's money flows and interpreting today's monetary policy is only possible through endogenous money theory. Globally important central banks will eventually increase their interest rates. However, at low interest rates a relatively small increase may lead to significant losses in portfolios with longer duration. Investment funds, in their effort to minimize losses by shortening the duration of their portfolios, may trigger abrupt changes in capital flows. Increased volatility may undermine financial stability. This type of stability risk is independent of a country's liquidity situation or whether its banks are well capitalized or profitable. Such systemic risks cannot be managed by macroprudential tools, yet these tools remain an integral part of monetary policy management.
arXiv
This paper introduces novel backtests for the risk measure Expected Shortfall (ES) following the testing idea of Mincer and Zarnowitz (1969). Estimating a regression framework for the ES stand-alone is infeasible, and thus, our tests are based on a joint regression for the Value at Risk and the ES, which allows for different test specifications. These ES backtests are the first which solely backtest the ES in the sense that they only require ES forecasts as input parameters. As the tests are potentially subject to model misspecification, we provide asymptotic theory under misspecification for the underlying joint regression. We find that employing a misspecification robust covariance estimator substantially improves the tests' performance. We compare our backtests to existing approaches and find that our tests outperform the competitors throughout all considered simulations. In an empirical illustration, we apply our backtests to ES forecasts for 200 stocks of the S&P 500 index.
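The Mincer-Zarnowitz idea itself is simple: regress realizations on forecasts and test that the intercept is zero and the slope is one. Below is a sketch of the classical template with statsmodels on synthetic data; the paper's ES version requires the joint (VaR, ES) regression, which this generic illustration does not reproduce.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
forecast = rng.normal(size=500)
realized = 0.1 + 0.9 * forecast + rng.normal(scale=0.5, size=500)

# Mincer-Zarnowitz regression: realized_t = a + b * forecast_t + u_t.
# Optimal forecasts imply (a, b) = (0, 1); test the pair jointly.
ols = sm.OLS(realized, sm.add_constant(forecast)).fit()
print(ols.params)                       # estimated [a, b]
print(ols.f_test("const = 0, x1 = 1"))  # joint Wald test of (0, 1)
```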
SSRN
In this paper we review some numerical algorithms, implemented in the R language, which solve the Risk Budgeting (RB) allocation problem. We demonstrate that the well-known Sequential Quadratic Programming (SQP), whose objective function is not strictly convex, fails to converge for high-dimensional baskets. On the contrary, the newly explored algorithms tackle this issue by transforming the objective function into a strictly convex one. Among them, the "Spinu Algorithm" proves to be the most robust and fastest algorithm, both for large equity and small multi-asset portfolios. Surprisingly, the promising Cyclical Coordinate Descent (CCD) algorithm proposed by Griveau et al. (2013), which suits risk budgeting on a large basket of equity stocks well, lacks robustness when performed on a multi-asset universe. Our results confirm that the "Spinu Algorithm" is the most robust framework for implementing risk-budgeting portfolios for any type of investment universe.
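For reference, the CCD update mentioned above solves the first-order condition x_i(Σx)_i = b_i σ(x) one coordinate at a time, which reduces to a quadratic in x_i. Here is a minimal Python reimplementation of that logic (a generic sketch following Griveau et al.'s idea, not their R code):

```python
import numpy as np

def risk_budgeting_ccd(sigma, budgets, n_sweeps=200):
    """Cyclical coordinate descent for the risk budgeting portfolio:
    find x > 0 with x_i * (Sigma x)_i proportional to budgets b_i,
    then normalize to sum to one."""
    n = sigma.shape[0]
    x = np.full(n, 1.0 / n)
    for _ in range(n_sweeps):
        for i in range(n):
            vol = np.sqrt(x @ sigma @ x)            # current sigma(x)
            c = sigma[i] @ x - sigma[i, i] * x[i]   # cross term
            # Positive root of sigma_ii * x^2 + c * x - b_i * vol = 0.
            x[i] = (-c + np.sqrt(c**2 + 4 * sigma[i, i]
                                 * budgets[i] * vol)) / (2 * sigma[i, i])
    return x / x.sum()

rng = np.random.default_rng(1)
A = rng.normal(size=(100, 5))
S = A.T @ A / 100
w = risk_budgeting_ccd(S, np.full(5, 0.2))
print(w, (w * (S @ w)) / (w @ S @ w))  # risk contributions ~ budgets
```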
SSRN
This paper investigates the effects of credit rating downgrades, equity mispricing and CEO overconfidence on zero-leverage policy, using data for listed United States firms during the period 1980-2012. The results show that (1) the likelihood of zero-leverage increases significantly following a downgrade in credit rating; (2) zero-leverage is the outcome of the past attempts by firms to issue more overvalued equity capital; (3) firms with overconfident CEOs are more likely to choose zero-leverage. The results clearly suggest that the conditions prevailing in both credit and equity markets exert significant influence on zero-leverage policy. The analysis also advocates the inclusion of managerial biases in conjunction with the market-wide conditions in the analysis of zero-leverage policy. Overall, the findings reveal that zero-leverage firms find that the benefits of issuing overvalued equity outweigh the benefits associated with debt financing. These results are robust to a battery of checks.
arXiv
We study a system of heterogeneous interbank lending and borrowing based on the relative average of log-capitalization, given by the linear combination of the average within groups and the ensemble average, and describe the evolution of log-capitalization by a system of coupled diffusions. The model incorporates a game feature with homogeneity within groups and heterogeneity between groups, where banks search for the optimal lending or borrowing strategies by minimizing heterogeneous linear-quadratic costs in order to avoid approaching the default barrier. Due to the complexity of the lending and borrowing system, the closed-loop and open-loop Nash equilibria are both driven by coupled Riccati equations. In the two-group case with a sufficiently large number of banks, the existence of the equilibria is guaranteed by the solvability of the coupled Riccati equations as the number of banks in each group goes to infinity. The equilibria consist of a mean-reverting term identical to the one-group game and a group-average term owing to heterogeneity. In addition, the corresponding heterogeneous mean field game with an arbitrary number of groups is discussed, and the existence of the $\epsilon$-Nash equilibrium in the general case of $d$ heterogeneous groups is verified. Finally, as a financial implication, we observe that the Nash equilibria are governed by the mean-reverting term and the linear combination of the ensemble averages of the individual groups, and we study the influence of the relative parameters on the liquidity rate through numerical analysis.
SSRN
In January 2021, the Consumer Financial Protection Bureau will face a decision: to renew its special definition for Qualified Mortgages (QMs) made by Fannie Mae and Freddie Mac, abolish that definition, or adopt some other approach to Qualified Mortgages. Concerns about access to credit have propelled the issue of whether any definition of a QM should impose a debt-to-income (DTI) cap to the forefront of the debate. Because market discipline will not halt an inflating housing bubble occasioned by deteriorating DTI levels, we argue that the CFPB needs to mandate a general DTI cap as part of the definition of a QM. However, we would temper that DTI cap in two important respects. First, the Bureau should examine whether the 43 percent DTI limit could be modestly raised without significantly raising housing prices or default risk, that is, without increasing systemic risk. Second, the CFPB should further relax the DTI cap for loans that meet the affordable housing goals established by the Federal Housing Finance Agency. Providing targeted DTI relief to affordable housing goal loans would expand credit availability to those who really need it without creating inflationary pressures culminating in a future real estate bubble.
SSRN
This paper examines the impact of the opioid epidemic on the financing costs of local governments. We find that a higher county-level drug overdose death rate is associated with an increase in the offering yield spread of local municipal bonds. A difference-in-differences analysis around introductions of must-access Prescription Drug Monitoring Programs (PDMP) and an instrumental variable approach using opioid makers' marketing payment to local physicians suggest that the impact of opioid abuses on municipal borrowing cost is likely causal. The opioid crisis reduces future revenues of local governments and increases police and criminal justice expenditures.
arXiv
We compare the performance of US stocks based on their size (market capitalization). We regress alpha and beta on size and other factors for individual stocks in the Standard & Poor's 500, and for randomly generated portfolios. In addition, we compare exchange-traded funds (ETFs) consisting of large-, mid- and small-cap stocks, including international ETFs. Conclusions: size and market exposure (beta) are inversely related (strong evidence for ETFs and portfolios, weaker evidence for individual stocks).
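A sketch of the two-pass procedure described: estimate each stock's alpha and beta from a time-series market regression, then regress those estimates on log size. The data below are synthetic, with a built-in size-beta link so the second pass has something to find; all numbers are arbitrary stand-ins.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_stocks, n_days = 200, 750
market = rng.normal(0.0004, 0.01, n_days)     # stand-in market returns
log_size = rng.normal(10, 1.5, n_stocks)      # stand-in log market caps

# Pass 1: time-series regression per stock -> (alpha_i, beta_i).
alphas, betas = np.empty(n_stocks), np.empty(n_stocks)
for i in range(n_stocks):
    true_beta = 1.3 - 0.05 * (log_size[i] - 10)   # planted size-beta link
    r = true_beta * market + rng.normal(0, 0.02, n_days)
    fit = sm.OLS(r, sm.add_constant(market)).fit()
    alphas[i], betas[i] = fit.params

# Pass 2: cross-sectional regression of beta on log size.
print(sm.OLS(betas, sm.add_constant(log_size)).fit().params)
```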
SSRN
The aim of this paper is to provide more transparency regarding risk parity portfolios. We design a structural framework to properly describe, evaluate, and improve the performance characteristics of risk parity portfolios. This is achieved by segregating the input parameter of the risk parity portfolio, namely the variance-covariance matrix, into the volatility vector and the correlation matrix, and accordingly developing a volatility-timing and a correlation-timing portfolio, which maintain the characteristics of risk parity under specific assumptions. We investigate the relationship of the risk parity portfolio with the aforementioned portfolios and show why risk parity as an allocation scheme is methodologically constructed to achieve a superior risk-adjusted performance compared to that of traditional and heuristic weighting techniques. Finally, we demonstrate how the risk parity portfolio can be improved according to different market situations.
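The segregation referred to is simply Σ = D R D with D = diag(σ). A short numpy sketch of the decomposition, together with the inverse-volatility ("volatility-timing") weights, which coincide with risk parity when all pairwise correlations are equal:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(500, 4))          # synthetic return panel
cov = np.cov(A, rowvar=False)

# Segregate the covariance matrix: Sigma = D R D,
# with D = diag(vols) and R the correlation matrix.
vols = np.sqrt(np.diag(cov))
corr = cov / np.outer(vols, vols)
assert np.allclose(np.diag(vols) @ corr @ np.diag(vols), cov)

# Inverse-volatility weights: equal to risk parity under the
# assumption that all pairwise correlations are identical.
w_vol = (1 / vols) / (1 / vols).sum()
print(vols, w_vol)
```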
SSRN
A recently published study (Hong, Hung, and Lobo, The Accounting Review 2014) claims to show that, depending on the benchmark sample used, the 2005 mandatory adoption of IFRS is associated with a 38-82% reduction in IPO underpricing. We re-examine this result controlling for the concurrent adoption of the Prospectus Directive (PD), which mandated increased IPO prospectus disclosures, and the enforcement of these disclosures in the member states of the European Union (EU). First, we find that there is a significant data error in Hong et al.'s study that renders their reported results unreliable: approximately 30% of the treatment firms the study categorizes as mandatory IFRS adoptions are not, in fact, subject to a mandate to report in IFRS. These are firms admitted to trading on "exchange-regulated" markets in the EU that do not require IFRS. We use hand-collected prospectus data to identify the correct treatment sample. Our analysis shows that, for affected firms, there is a statistically significant decrease in IPO underpricing associated with adoption of the PD for firms based in countries that also concurrently enhanced accounting enforcement (see Christensen, Hail, and Leuz 2013), but there is no association between mandatory IFRS adoption and IPO underpricing. We also examine voluntary IFRS adoptions by firms admitted to trading on exchange-regulated markets after 2005. Overall, we find no evidence that mandatory IFRS adoption resulted in very large economic gains for IPO firms. This study also provides a brief but comprehensive description of much of EU capital markets law.
SSRN
In recent years, the discussion about the institutional design of regulatory agencies in the financial sector has focused on aspects of corporate governance. Likewise, independence in the conduct of monetary policy, and the fact that the tasks of financial regulation and supervision are often within the orbit of central banks, have led to the incorporation of transparency practices into the conduct of financial policy. While there are several efforts to measure the independence of central banks, attempts to quantify transparency in the conduct of financial policy are scarce or mainly focused on the supervisory process. This document proposes the design of an index of transparency in financial regulation and its application to Uruguay, in order to compare it later with other countries. According to the proposed index, Uruguay obtains a score of 17.75 out of 20, with particularly high values in the dimensions of transparency of processes and policies.
arXiv
In this paper we formulate a regression problem to predict realized volatility using option price data, and to enhance the predictability and liquidity of VIX-styled volatility indices. We test algorithms including regularized regression and machine learning methods such as feedforward neural networks (FNN) on the S&P 500 Index and its option data. By conducting a time series validation, we find that both Ridge regression and FNN can improve volatility indexing, with higher prediction performance and fewer options required. The best approach found is to predict the difference between the realized volatility and the VIX-styled index's prediction, rather than to predict the realized volatility directly, representing a successful combination of human learning and machine learning. We also discuss the suitability of different regression algorithms for volatility indexing and applications of our findings.
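A sketch of the residual approach described, predicting the gap between realized volatility and an index-style estimate with Ridge regression and adding the index back. The features and data-generating process below are synthetic stand-ins for the option-based inputs.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n = 1500
vix_style = 0.15 + 0.05 * np.abs(rng.normal(size=n))  # index-style estimate
X = rng.normal(size=(n, 20))                          # stand-in option features
rv = (vix_style + X[:, :3] @ np.array([0.01, -0.02, 0.015])
      + rng.normal(scale=0.01, size=n))               # realized volatility

# Predict the residual rv - vix_style, then add the index back,
# instead of predicting rv directly.
split = 1000
gap = Ridge(alpha=1.0).fit(X[:split], rv[:split] - vix_style[:split])
pred = vix_style[split:] + gap.predict(X[split:])
print("RMSE, residual approach:", np.sqrt(np.mean((pred - rv[split:]) ** 2)))
print("RMSE, index alone:",
      np.sqrt(np.mean((vix_style[split:] - rv[split:]) ** 2)))
```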
SSRN
The British banking sector had many small banks in the mid-nineteenth century. From around 1885 until the end of World War One there was a process of increasingly larger mergers between banks. By the end of the merger wave the English and Welsh market was highly concentrated, with only five major banks. News of a merger brought a persistent rise in the share prices of both the acquiring and the target bank (roughly 1% and 7%, respectively). Non-merging banks, especially those whose local market concentration rose, saw their stock prices rise.
SSRN
Where does new volatility enter the volatility of securities listed in many countries? While the literature has focused on where information enters the price, I develop a framework to study how each market's volatility contributes to the permanent volatility of the asset. I build a VECM with an Autoregressive Stochastic Volatility (ASV) framework estimated using the MCMC method and Bayesian inference. This specification allows me to define measures of a market's contribution to volatility discovery. In the application, I study the cash and 3-month futures markets of some metals traded on the London Metal Exchange. I also study the EURO STOXX 50 Index and its futures. I find that, for most of the securities, while price discovery happens in the cash market, volatility discovery mostly happens in the futures market. Overall, the results suggest that information discovery and volatility discovery do not necessarily have the same determinants.
arXiv
The paper uses diffusion models to understand the main determinants of the diffusion of solar photovoltaic panels (SPP) worldwide, focusing on the role of public incentives. We applied the generalized Bass model (GBM) to adoption data of 26 countries between 1992 and 2016. The SPP market appears to be a frail and complicated one, lacking public media support. Even the major shocks in adoption curves, following state incentives implemented after 2006, failed to go beyond short-term effects and were therefore unable to provide sustained momentum to the market. This suggests that further barriers to adoption should be removed.
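For reference, the plain Bass model underlying the GBM has cumulative adoption $F(t) = m\,\frac{1 - e^{-(p+q)t}}{1 + (q/p)\,e^{-(p+q)t}}$, where $p$ is the innovation coefficient, $q$ the imitation coefficient, and $m$ the market potential; the GBM additionally rescales time by a function of covariates such as incentives. A sketch of fitting the basic model with scipy on synthetic adoption data (the GBM intervention term is omitted):

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_cdf(t, p, q, m):
    """Cumulative adopters at time t: market potential m times the
    Bass fraction driven by innovation (p) and imitation (q)."""
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

t = np.arange(1, 26, dtype=float)                      # 25 periods
true = bass_cdf(t, 0.002, 0.5, 1000.0)                 # planted parameters
data = true + np.random.default_rng(0).normal(0, 5, t.size)

(p, q, m), _ = curve_fit(bass_cdf, t, data,
                         p0=[0.005, 0.4, 900.0], maxfev=10000)
print(f"p={p:.4f}, q={q:.3f}, m={m:.0f}")
```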