Research articles for 2019-09-30
SSRN
I investigate the exit outcomes of start-ups backed by government VCs (GVCs) and private VCs (PVCs), using a sample of 8,106 start-ups in China funded by VCs between 1991 and 2013, with exit information updated in 2018. I find that start-ups backed by GVCs are less likely to exit through domestic Initial Public Offerings (IPOs), overseas IPOs, or M&As. GVC-backed start-ups are also less likely to list on the intermediate public market before going public. I explore three potential explanations. First, I study whether GVCs and PVCs have different investment objectives. Evidence shows that GVCs support younger and more innovation-oriented start-ups, yet propensity score matching analysis shows that this difference in investment objectives cannot fully explain the GVCs' underperformance. Second, the lack of market discipline can lead to a lack of effort by VCs. I find that the performance gap between GVCs and PVCs narrows in more mature VC markets. Third, political connections between VC managers and local governors can lead to inefficient investments. I exploit political turnover as an exogenous shock to political connections and find that deals made by both GVCs and PVCs in times of political turnover have a lower probability of successful exit. However, I find no significant evidence that the performance gap varies in times of political turnover.
arXiv
We describe a new public-domain, open-source simulator of an electronic financial exchange and of the traders that interact with it. The simulator is a truly distributed, cloud-native system designed to run on widely available commercial cloud-computing services, with its various components placeable in specified geographic regions around the world, thereby enabling the study of planetary-scale latencies in contemporary automated trading systems. Our simulator allows an exchange server to be launched in the cloud in a particular geographic zone of the cloud hosting service; automated-trading clients that attach to the exchange can then also be launched in the cloud, in the same geographic zone and/or in different zones anywhere else on the planet, and those clients are then subject to the real-world latencies introduced by planetary-scale cloud communication interconnections. In this paper we describe the design and implementation of our simulator, called DBSE, which is based on a previous public-domain simulator, extended in ways partly inspired by the architecture of the real-world Jane Street Exchange. DBSE relies fundamentally on the UDP and TCP network communication protocols and implements a subset of FIX, the de facto standard protocol for financial information exchange. We show results from an example in which the exchange server is remotely launched on a cloud facility located in London (UK), with trader clients running in Ohio (USA) and Sydney (Australia). We close with a discussion of how our simulator could be further used to study planetary-scale latency arbitrage in financial markets.
arXiv
A model of sharing revenues among groups when group members are ranked several times is presented. The methodology is based on pairwise comparison matrices, allows for the use of any weighting method, and makes it possible to tune the level of inequality. Our proposal is demonstrated on the example of Formula One prize money allocation among the constructors. We introduce an axiom called scale invariance, which requires the ranking of teams to be independent of the parameter controlling inequality. The eigenvector method is revealed to violate this condition in our dataset, while the row geometric mean method always satisfies it. The revenue allocation is not influenced by the arbitrary valuation given to the race prizes in the official points scoring system of Formula One.
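As a hedged sketch of the two weighting methods contrasted above, the snippet below computes weights from a small pairwise comparison matrix via the principal-eigenvector method and the row geometric mean method, and then turns them into shares using a hypothetical inequality parameter; the matrix, the parameter, and the share rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def eigenvector_weights(A):
    """Principal-eigenvector (Saaty) weights of a pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(A)
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()

def row_geometric_mean_weights(A):
    """Row geometric mean (logarithmic least squares) weights."""
    g = np.prod(A, axis=1) ** (1.0 / A.shape[0])
    return g / g.sum()

# Hypothetical pairwise comparison matrix for three teams (A[i, j] = strength of i relative to j).
A = np.array([[1.0,  2.0,   4.0],
              [0.5,  1.0,   3.0],
              [0.25, 1 / 3, 1.0]])

alpha = 0.5  # hypothetical inequality parameter: shares = weights ** alpha, renormalised
for name, w in [("eigenvector", eigenvector_weights(A)),
                ("row geometric mean", row_geometric_mean_weights(A))]:
    shares = w ** alpha
    shares /= shares.sum()
    print(name, np.round(shares, 3))
```

Under these hypothetical shares, one can check directly whether the team ranking changes as the inequality parameter is varied, which is the scale-invariance property the paper examines.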
arXiv
To improve the efficient frontier of the classical mean-variance model in continuous time, we propose a varying terminal time mean-variance model with a constraint on the mean value of the portfolio asset, which moves with the varying terminal time. Using the embedding technique from stochastic optimal control in continuous time and varying the terminal time, we determine an optimal strategy and related deterministic terminal time for the model. Our results suggest that doing so for an investment plan requires minimizing the variance with a varying terminal time.
arXiv
Global fixed income returns span across multiple maturities and economies, that is, they naturally reside on multi-dimensional data structures referred to as tensors. In contrast to standard "flat-view" multivariate models that are agnostic to data structure and only describe linear pairwise relationships, we introduce a tensor-valued approach to model the global risks shared by multiple interest rate curves. In this way, the estimated risk factors can be analytically decomposed into maturity-domain and country-domain constituents, which allows the investor to devise rigorous and tractable global portfolio management and hedging strategies tailored to each risk domain. An empirical analysis confirms the existence of global risk factors shared by eight developed economies, and demonstrates their ability to compactly describe the global macroeconomic environment.
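As a rough illustration of separating maturity-domain and country-domain risk factors, the sketch below builds a synthetic maturity × country × time array and extracts leading factors from its mode unfoldings via plain SVD; this is a simplified stand-in for the tensor-valued model of the paper, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n_mat, n_cty, n_obs = 10, 8, 500          # maturities, countries, time observations

# Synthetic yield-change data driven by one common factor plus noise (illustrative only).
common = rng.standard_normal(n_obs)
loadings = rng.standard_normal((n_mat, n_cty))
Y = loadings[:, :, None] * common[None, None, :] + 0.1 * rng.standard_normal((n_mat, n_cty, n_obs))

# Mode-1 unfolding (maturity x rest) and mode-2 unfolding (country x rest).
Y_mat = Y.reshape(n_mat, -1)
Y_cty = Y.transpose(1, 0, 2).reshape(n_cty, -1)

# Leading left singular vectors play the role of maturity-domain and country-domain factors.
U_mat, s_mat, _ = np.linalg.svd(Y_mat, full_matrices=False)
U_cty, s_cty, _ = np.linalg.svd(Y_cty, full_matrices=False)

print("maturity-domain factor loadings:", np.round(U_mat[:, 0], 3))
print("country-domain factor loadings: ", np.round(U_cty[:, 0], 3))
print("variance share of first factor:",
      round(s_mat[0]**2 / np.sum(s_mat**2), 3), round(s_cty[0]**2 / np.sum(s_cty**2), 3))
```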
SSRN
The development of a proper method for calculating the value of copyrights and related rights is an important task in the current climate of the new economy. The costs of royalties paid to collecting societies are significant and disputed in many industries, such as hotels and gastronomy. There are many disputes and court cases related to determining the fair value of copyrights. The dominant position of collecting societies has resulted in a number of decisions of the European Commission and judgments of the European Court of Justice, as well as national courts, concerning the activities of collecting societies and the level of fees collected. The problem has been widely analyzed, mainly by legal scholars. In the valuation of copyrights, methods based on income and market comparison are preferred; however, the fact that only a limited market exists when dealing with collecting societies should be taken into consideration. The author presents a method based on the income approach. The proposed model links the financial results of hotels with the benefits hotel guests derive from using TV and radio sets. The presented method is useful for indicating the fair value of copyrights and was prepared to assist in legal disputes relating to the abuse of a dominant position by CMOs.
arXiv
We provide a verification and characterization result for optimal maximal sub-solutions of BSDEs in terms of fully coupled forward-backward stochastic differential equations. We illustrate the application thereof in utility optimization with random endowment under probability and discounting uncertainty. We show with explicit examples how to quantify the costs of incompleteness when using utility indifference pricing, as well as a way to find optimal solutions for recursive utilities.
SSRN
The purpose of this document is to present the procedures for calculating the return on the equity invested in a project partially financed with debt. It shows the calculation of the variable (period-by-period) WACC, with and without tax savings, over the amortization of the debt, how to construct the three alternative cash flows, and how to apply the consistent cost-of-capital rate. Examples are included.
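As a worked example of the kind of calculation described (not reproduced from the paper), the sketch below computes a period-by-period WACC with and without the debt tax shield for a hypothetical amortizing loan.

```python
# Illustrative per-period WACC for a project partly financed with an amortizing loan.
# All figures are hypothetical.
Ke, Kd, tax = 0.15, 0.08, 0.30      # cost of equity, cost of debt, tax rate
values = [1000.0, 950.0, 880.0]     # total project value at the start of each period
debts  = [ 400.0, 300.0, 200.0]     # outstanding debt at the start of each period

for t, (V, D) in enumerate(zip(values, debts), start=1):
    E = V - D
    wacc_no_shield   = Kd * D / V + Ke * E / V
    wacc_with_shield = Kd * (1 - tax) * D / V + Ke * E / V
    print(f"period {t}: WACC without tax savings = {wacc_no_shield:.4f}, "
          f"with tax savings = {wacc_with_shield:.4f}")
```

Because the debt is amortized, the weights D/V and E/V change each period, which is why the WACC is variable rather than a single constant rate.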
SSRN
We develop a dynamic theory of capital structure, liquidity and risk management, and payout policies for a financially constrained firm under incomplete markets. In addition to costly external equity financing, the key friction we emphasize is limited financial spanning. We show that the marginal source of external financing on an ongoing basis is debt. The firm relies on costly external equity financing or costly default only when it is near or has exceeded its endogenous debt capacity. The firm may be either endogenously risk-averse or risk-loving depending on the degree of market incompleteness and its current leverage. When it is risk-averse, the firm optimally manages its risk by fully hedging its hedgeable risk so as to minimize the volatility of its leverage. When it is risk-loving, the firm gambles for resurrection by selling insurance or excessively loading up on insurance. A key prediction of our theory is that the most important determinant of future leverage is current leverage, in line with existing empirical evidence.
SSRN
Using a unique dataset on gaming and credit card payments, we discover that credit card default rates are higher among individuals who spend more, more frequently, and more erratically on video games, and among those who have more and more diverse types of games on their mobile devices. These results are more pronounced for spending during the workweek and among overnight spenders, and are not driven by differences in financial literacy or a specific game type. Furthermore, intense gamers increase luxury, addictive, and impulsive expenditures more after receiving credit cards. Models of self-control offer a consistent explanation of our findings.
SSRN
There is a consensus that, during the Great Recession, quantitative easing put downward pressure on long-term interest rates. Using quarterly data and a vector autoregressive model, this note provides empirical evidence that quantitative easing, measured by changes in the monetary base as a percentage of gross domestic product, instead positively affects the U.S. 10-year government bond yield. One possible explanation is that increasing the size of the Federal Reserve's balance sheet raises optimism about the macroeconomy.
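A minimal sketch of a quarterly VAR of the type described, run on synthetic data with statsmodels; the variable names, lag length, and data are illustrative assumptions, not the note's actual specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
n = 120  # quarterly observations, illustrative

# Synthetic stand-ins for the two series used in the note.
qe_ratio = np.cumsum(rng.normal(0, 0.5, n))                 # monetary base as % of GDP (proxy)
yield_10y = 3.0 + 0.1 * qe_ratio + rng.normal(0, 0.3, n)    # 10-year government bond yield

data = pd.DataFrame({"qe_ratio": qe_ratio, "yield_10y": yield_10y})
results = VAR(data).fit(maxlags=4, ic="aic")                # lag order chosen by AIC
print(results.summary())

# Impulse response of the yield to a shock in the QE measure over 8 quarters.
irf = results.irf(8)
print(irf.irfs[:, data.columns.get_loc("yield_10y"), data.columns.get_loc("qe_ratio")])
```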
arXiv
In this paper, we revisit the equity premium puzzle reported in 1985 by Mehra and Prescott. We show that the large equity premium that they report can be explained by choosing a more appropriate distribution for the return data. We demonstrate that the high risk-aversion value observed by Mehra and Prescott may be attributable to the problem of fitting a proper distribution to the historical returns and is partly caused by poorly fitting the tail of the return distribution. We describe a new distribution that better fits the returns and, when used to describe historical returns, can explain the large equity risk premium, thereby resolving the puzzle.
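The argument hinges on how well the chosen distribution fits the tails of returns. The paper's new distribution is not specified in this abstract, so the sketch below compares a normal and a Student-t fit on synthetic returns purely to illustrate how such a comparison can be made via log-likelihood.

```python
import numpy as np
from scipy import stats

# Synthetic monthly "returns" with heavy tails (illustrative stand-in for historical data).
returns = stats.t.rvs(4, loc=0.005, scale=0.04, size=1000, random_state=42)

# Fit a normal and a Student-t distribution and compare log-likelihoods.
mu, sigma = stats.norm.fit(returns)
df_t, loc_t, scale_t = stats.t.fit(returns)

ll_norm = np.sum(stats.norm.logpdf(returns, mu, sigma))
ll_t = np.sum(stats.t.logpdf(returns, df_t, loc_t, scale_t))

print(f"normal log-likelihood:    {ll_norm:.1f}")
print(f"Student-t log-likelihood: {ll_t:.1f}  (heavier tails fit better here)")
```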
SSRN
We identified the impact of the expansionary monetary policy in China during the 2008–2009 global financial crisis on the credit and investment allocation among firms, after controlling for the simultaneous fiscal stimulus. We utilized the extent of exposure to the construction sector, the primary beneficiary of the fiscal stimulus, to control for the latter factor, as well as a variable indicating state ownership. We obtained robust evidence that the expansionary monetary policy led to the misallocation of bank credit to less productive firms after controlling for these confounding factors. However, we found that investment increased more for more productive firms. Additional analyses showed that this is partly because more productive firms hoarded cash before the crisis, and partly because less productive firms more often engaged in building cash reserves.
arXiv
Recently, there has been a surge of interest in the use of machine learning to aid in the accurate prediction of financial markets. Despite the exciting advances at this intersection of finance and AI, many current approaches are limited to using technical analysis to capture the historical trend of each stock price, and are thus limited to certain experimental setups to obtain good prediction results. On the other hand, professional investors additionally use their rich knowledge of inter-market and inter-company relations to map the connectivity of companies and events, and use this map to make better market predictions. For instance, they would predict the movement of a certain company's stock price based not only on its former stock price trends but also on the performance of its suppliers or customers, the overall industry, macroeconomic factors and trade policies. This paper investigates the effectiveness of work at the intersection of market prediction and graph neural networks, which hold the potential to mimic the way investors make decisions by incorporating company knowledge graphs directly into the predictive model. The main goal of this work is to test the validity of this approach across different markets and longer time horizons for backtesting, using rolling window analysis. In this work, we concentrate on the prediction of individual stock prices in the Japanese Nikkei 225 market over a period of roughly 20 years. For the knowledge graph, we use the Nikkei Value Search data, a rich dataset showing mainly supplier relations among Japanese and foreign companies. Our preliminary results show a 29.5% increase and a 2.2-fold increase in the return ratio and Sharpe ratio, respectively, when compared to the market benchmark, as well as a 6.32% increase and a 1.3-fold increase, respectively, compared to the baseline LSTM model.
SSRN
This paper develops composite indicators of financial integration within the euro area for both price-based and quantity-based indicators covering money, bond, equity and banking markets. Prior to aggregation, individual integration indicators are harmonised by applying the probability integral transform. We find that financial integration in Europe increased steadily between 1995 and 2007. The subprime mortgage crisis marked a turning point, bringing about a marked drop in both composite indicators. This fragmentation trend reversed when the European banking union and the ECB's Outright Monetary Transactions Programme were announced in 2012, with financial integration recovering more strongly when measured by price-based indicators. In a growth regression framework, we find that higher financial integration tends to be associated with an increase in per capita real GDP growth in euro area countries. This correlation is found to be stronger the higher a country's growth opportunities.
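A small sketch of the harmonisation step mentioned above: applying the probability integral transform (here via the empirical CDF) to individual indicators before aggregating them into a composite; the indicator series and the equal weighting are invented for illustration.

```python
import numpy as np
from scipy.stats import rankdata

def probability_integral_transform(x):
    """Map a series to (0, 1) via its empirical CDF (rank-based PIT)."""
    return rankdata(x) / (len(x) + 1)

rng = np.random.default_rng(7)
# Hypothetical integration indicators on very different scales.
money_mkt = rng.normal(50, 10, 100)
bond_mkt = rng.normal(0.002, 0.001, 100)
equity_mkt = rng.normal(5, 2, 100)

harmonised = np.column_stack([probability_integral_transform(s)
                              for s in (money_mkt, bond_mkt, equity_mkt)])
composite = harmonised.mean(axis=1)   # equal weights, purely illustrative
print("composite indicator, first 5 values:", np.round(composite[:5], 3))
```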
SSRN
I study the role of accounting and financial reporting in entrepreneurial finance by examining whether financial statement disclosure increases capital raised through equity crowdfunding. On average, I find a positive association between financial reporting and capital raised, suggesting that accounting reduces information asymmetry with potential investors. Additionally, the importance of financial reporting in equity crowdfunding varies predictably in the cross-section. Specifically, financial reporting is associated with greater capital raised during periods of higher macroeconomic uncertainty and when the firm has longer historical operations prior to raising capital. Finally, using a structural path analysis, I find evidence that financial reporting is indirectly associated with a lower probability of ex-post failure by increasing the likelihood of raising capital. These results provide insight into the role of financial reporting in entrepreneurial finance and inform the ongoing debate over regulation and disclosure in the equity crowdfunding market.
SSRN
Purpose â" Whether stock returns are linked to exchange rate changes and whether foreign exchange risk is priced in a domestic context are less conclusive and thus still subject to a great debate. This paper attempts to provide new empirical evidence on these two inter-related issues, which are critical to investors and corporate risk management. Design/methodology/approach â" This paper applies two different econometric approaches: Nonlinear Seemingly Unrelated Regression (NLSUR) via Hansenâs (1982) Generalized Method of Moment (GMM) and multivariate GARCH in mean (MGARCH-M) to examine the exchange rate exposure and its pricing.Findings â" Using industry data for Japan, similar to previous studies, foreign exchange risk is not priced based on the test of an unconditional two-factor asset pricing model. However, strong evidence of time-varying foreign exchange risk premium and significant exchange rate betas are obtained based on the tests of conditional asset pricing models using multivariate GARCH in mean (MGARCH-M) approach where both conditional first and second moments of industry returns and risk factors are estimated simultaneously.Research limitations/implications â" The strong empirical evidence found in this study implies that corporate currency hedging not only results in more stable cash flows for a firm, but also reduces its cost of capital, and hence is justifiable.Originality/value â" This paper conducts an in-depth investigation regarding the exchange rate exposure and its pricing by utilizing two different econometric approaches: Nonlinear Seemingly Unrelated Regression (NLSUR) via Hansenâs (1982) Generalized Method of Moment (GMM) and multivariate GARCH in mean (MGARCH-M). In doing so, a more reliable conclusion about the exchange rate exposure and its pricing can be drawn.
SSRN
We study the impact of the Affordable Care Act (ACA) on municipal healthcare borrowing costs. The ACA expanded the insured customer base for hospitals, but exposed them to greater regulatory risk. Following a favorable 2012 ACA Supreme Court ruling, healthcare yields decreased by 39 basis points, for per-issue and economy-wide interest savings of $3.0 million and $1.74 billion, respectively. The effect was larger for urban and private hospitals. Yields decreased by another 17 basis points in states that voted to expand Medicaid. However, the ACA effect on long-term yields was weak, suggesting that repeal risk remains an obstacle to long-run financing.
SSRN
Purpose â" The purpose of this paper is to provide empirical evidence on how 1999â"2001 dot-com crisis and 2007â"2009 subprime crisis affect the gains from international diversification from the perspective of US investors. Design/methodology/approach â" A conditional international CAPM with asymmetric multivariate GARCH-M specification is used to estimate international diversification gains. Findings â" The authors find that over the entire sample period, the average gains from international diversification is statistically significant and about 1.253 percent per year. During the subprime crisis period, the average gains decreases to about 0.567 percent per year, but it increases to 2.829 percent per year during the dot-com crisis. Research limitations/implications â" These research findings although confirm the conjectures that international financial turmoil tends to increase the co-movements among global financial markets, are in contrast to the conjectures that international diversification does not work during the financial crisis as evidence from the dot-com crisis. Therefore, future research on international diversification should not just focus on the correlation among international financial markets and should adopt a fully parameterized asset pricing model to study this research topic. Practical implications â" Given the empirical results found in this paper that international diversification gains may be decreasing or increasing during the financial crisis, as long as investors are not able to predict international financial crises, it is the average gains from international diversification over the longer periods that should encourage investors to diversify, regardless of potentially lower benefits over the shorter periods of time. Originality/value â" The major value of this paper is that although the increase in the conditional correlation during the financial turmoil is consistent with previous studies, the empirical results clearly show that the impact of a financial crisis on the gains from international diversification cannot be solely determined by the correlation between domestic and world stock market returns since the gains also depend on the unsystematic risk from the domestic stock market. Consequently, it is premature for previous studies to conclude that the gain from international diversification is diminishing due to an increasing correlation among international stock markets during the financial crisis.
SSRN
Exploiting confidential data from the euro area, we show that sound banks pass negative rates on to their corporate depositors without experiencing a contraction in funding and that the tendency to charge negative rates becomes stronger as policy rates move deeper into negative territory. The negative interest rate policy (NIRP) provides stimulus to the economy through firms' asset rebalancing. Firms with high current assets linked to banks offering negative rates appear to increase their investment in tangible and intangible assets and to decrease their cash holdings to avoid the costs associated with negative rates. Overall, our results challenge the commonly held view that conventional monetary policy becomes ineffective when policy rates reach the zero lower bound.
SSRN
Research into the state of the Italian economy based on official economic data; the current sovereign debt, official reserves, GDP, inflation and unemployment situation is presented and compared with the past.
SSRN
We investigate how counterparties' characteristics, and the collateral they use, interact with their demand for liquidity in the Bank of England's (BoE) operations. Between 2010 and 2016 there was regular usage of two BoE facilities: Indexed Long-Term Repos (ILTR) and the Funding for Lending Scheme (FLS). Using BoE proprietary data, we show that participation in ILTR is not skewed towards riskier counterparties, and is instead consistent with safe counterparties using the facilities to meet their liquidity needs. Collateral assets used for FLS are less liquid, since almost all assets are loan portfolios. Riskier and larger institutions are more likely to pre-position collateral in the FLS, but these counterparties do not subsequently draw upon FLS more than others do. Overall, our study points to no systemic misincentives; rather, banks react to incentives in the manner intended by the policy objectives. Our results support the view that the central bank can provide market liquidity without absorbing undue risks onto its balance sheet.
SSRN
This study evaluates the effect of loan-to-value (LTV) limits on house prices. The identification strategy exploits variation in LTV limits induced by the Hong Kong Monetary Authority's decision to introduce differentiated LTV limits anchored on the value of the property, which allows us to observe the counterfactual house price development. We estimate that a one-percentage-point decrease in LTV limits reduces house price growth by 0.8 percentage points. Overall, our results document that macroprudential policies, which regulate access to financing, are an effective policy tool to control house price growth.
arXiv
Motivated by empirical evidence for rough volatility models, this paper investigates continuous-time mean-variance (MV) portfolio selection under the Volterra Heston model. Due to the non-Markovian and non-semimartingale nature of the model, classic stochastic optimal control frameworks are not directly applicable to the associated optimization problem. By constructing an auxiliary stochastic process, we obtain the optimal investment strategy, which depends on the solution to a Riccati-Volterra equation. The MV efficient frontier is shown to maintain a quadratic curve. Numerical studies show that both roughness and volatility of volatility materially affect the optimal strategy.
arXiv
Background Food taxes and subsidies are one intervention to address poor diets. Price elasticity (PE) matrices are commonly used to model the change in food purchasing. Usually a PE matrix is generated in one setting and then applied to another setting with differing starting consumption and prices of foods. This violates econometric assumptions, resulting in likely misestimation of total food consumption. We illustrate rescaling all consumption after applying a PE matrix, using a total food expenditure elasticity (TFEe, the expenditure elasticity for all food combined given the policy-induced change in the total price of food). We use case studies of a NZ$2 per 100g saturated fat (SAFA) tax, a NZ$0.4 per 100g sugar tax, and a 20% fruit and vegetable (F&V) subsidy. Methods We estimated changes in food purchasing using a NZ PE matrix applied conventionally, then with TFEe adjustment. Impacts were quantified for total food expenditure and health-adjusted life years (HALYs) for the total NZ population alive in 2011 over the rest of their lifetime, using a multistate lifetable model. Results Two NZ studies gave TFEes of 0.68 and 0.83, with international estimates ranging from 0.46 to 0.90. Without TFEe adjustment, total food expenditure decreased with the tax policies and increased with the F&V subsidy, implausible directions of change given economic theory. After TFEe adjustment, HALY gains reduced by a third to a half for the two taxes and reversed from an apparent health loss to a health gain for the F&V subsidy. With TFEe adjustment, HALY gains (in 1000s) were 1,805 (95% uncertainty interval 1,337 to 2,340) for the SAFA tax, 1,671 (1,220 to 2,269) for the sugar tax, and 953 (453 to 1,308) for the F&V subsidy. Conclusions If PE matrices are applied in settings beyond where they were derived, additional scaling is likely required. We suggest that the TFEe is a useful scalar.
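A stylised numerical sketch of the adjustment the abstract describes: apply a PE matrix to obtain initial purchasing changes, then proportionally rescale all quantities so the implied change in total food expenditure matches the TFEe prediction. All numbers and the exact rescaling rule are illustrative assumptions, not taken from the study.

```python
import numpy as np

# Hypothetical three-food example: baseline prices, quantities, and a PE matrix.
prices0 = np.array([2.0, 3.0, 1.5])          # $ per unit
qty0 = np.array([100.0, 50.0, 80.0])         # units purchased
pe = np.array([[-0.8, 0.10, 0.05],           # pe[i, j]: % change in quantity of food i
               [0.10, -0.6, 0.02],           #           per 1% change in price of food j
               [0.05, 0.02, -0.9]])
tfee = 0.75                                  # hypothetical total food expenditure elasticity

price_change = np.array([0.10, 0.0, 0.0])    # say, a 10% tax on food 1
qty1 = qty0 * (1 + pe @ price_change)        # conventional PE-matrix application
prices1 = prices0 * (1 + price_change)

# Target total expenditure implied by the TFEe and the overall food price change.
exp0 = prices0 @ qty0
food_price_change = (prices1 @ qty0) / exp0 - 1          # expenditure-weighted price change
exp_target = exp0 * (1 + tfee * food_price_change)

# Rescale all quantities proportionally so total expenditure hits the target.
qty_adj = qty1 * exp_target / (prices1 @ qty1)
print("unadjusted expenditure:  ", round(prices1 @ qty1, 1))
print("TFEe-adjusted expenditure:", round(prices1 @ qty_adj, 1))
```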
arXiv
We present a novel Monte Carlo-based LSV calibration algorithm that applies to all stochastic volatility models, including the non-Markovian rough volatility family. Our framework overcomes the limitations of the particle method proposed by Guyon and Henry-Labordère (2012) and theoretically guarantees a variance reduction without additional computational complexity. Specifically, we obtain a closed-form and exact calibration method that allows us to remove the dependency on both the kernel function and the bandwidth parameter. This makes the algorithm more robust and less prone to errors or instabilities in a production environment. We test the efficiency of our algorithm on various hybrid (rough) local stochastic volatility models.
arXiv
This paper studies the optimal dividend for a multi-line insurance group, in which each insurance company is exposed to some external credit default risk. The external default contagion is considered in the sense that one default event can affect the default probabilities of all surviving insurance companies. The total dividend problem is formulated for the insurance group and we reveal for the first time that the optimal singular dividend strategy is still of the barrier type. Furthermore, we show that the optimal barrier for each insurance company is modulated by the current default state, namely how many and which companies have defaulted will determine the dividend threshold for each surviving company. These interesting conclusions match with observations from the real market and are based on our analysis of the associated recursive system of Hamilton-Jacobi-Bellman variational inequalities (HJBVIs), which is new to the literature. The existence of the classical solution is established and the rigorous proof of the verification theorem is provided. For the case of two companies, the value function and optimal barriers for each company can be explicitly constructed. Some numerical examples are also presented.
arXiv
This paper introduces an intermediary between conditional expectation and conditional sublinear expectation, called R-conditioning. The R-conditioning of a random vector in $L^2$ is defined as the best $L^2$-estimate, given a $\sigma$-subalgebra and a degree of model uncertainty. When the random vector represents the payoff of a derivative security in a complete financial market, its R-conditioning with respect to the risk-neutral measure is interpreted as its risk-averse value. The optimization problem defining the R-conditioning is shown to be well-posed. We show that the R-conditioning operators can be used to approximate a large class of sublinear expectations to arbitrary precision. We then introduce a novel numerical algorithm for computing the R-conditioning. This algorithm is shown to be strongly convergent.
Implementations are used to compare the risk-averse value of a Vanilla option to its traditional risk-neutral value, within the Black-Scholes-Merton framework. Concrete connections to robust finance, sensitivity analysis, and high-dimensional estimation are all treated in this paper.
SSRN
We present three detailed quantitative trading ideas for equity markets in China and Japan. The first idea discusses the emerging 50 ETF option market in China and the feasibility of a covered call strategy. The second idea evaluates the opportunities implied by the ETF purchase program of the Bank of Japan. The third idea demonstrates the predictive effectiveness of Chinese short selling and margin buying data with feature selection. Limitations and concerns for each idea are discussed separately. With a valuation model established in the first section, we are able to value the three trading ideas at USD 1.5M, 0.46M, and 0.33M, respectively, based on derived estimates of strategy performances and investment characteristics.
arXiv
The 2008 mortgage crisis is an example of an extreme event. Extreme value theory tries to estimate such tail risks. Modern finance practitioners prefer Expected Shortfall-based risk metrics (which capture tail risk) over traditional approaches like volatility or even Value-at-Risk. This paper provides a quantum annealing algorithm in QUBO form for a dynamic asset allocation problem using an expected shortfall constraint. It was motivated by the need to refine current quantum algorithms for Markowitz-type problems, which are academically interesting but not useful for practitioners. The algorithm is dynamic and the risk target emerges naturally from the market volatility. Moreover, it avoids complicated statistics such as the generalized Pareto distribution. It translates the problem into qubit form suitable for implementation on a quantum annealer like D-Wave. Such QUBO algorithms are expected to be solved faster on quantum annealing systems than by any classical algorithm on a classical computer (though this has yet to be demonstrated at scale).
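As a hedged, toy-scale illustration of casting an asset-selection problem in QUBO form (not the paper's formulation, which uses an expected shortfall constraint; this toy uses a variance penalty and is solved by brute force rather than on an annealer):

```python
import itertools
import numpy as np

# Toy problem: pick a subset of 4 assets to balance expected return against risk,
# with a penalty pushing the portfolio towards holding exactly `budget` assets.
mu = np.array([0.08, 0.12, 0.10, 0.07])                 # expected returns (hypothetical)
cov = np.array([[0.10, 0.02, 0.01, 0.00],               # covariance matrix (hypothetical)
                [0.02, 0.20, 0.03, 0.01],
                [0.01, 0.03, 0.15, 0.02],
                [0.00, 0.01, 0.02, 0.08]])
lam, penalty, budget = 0.5, 1.0, 2

# QUBO matrix Q: minimise x^T Q x over binary x, built from risk - return + budget penalty.
n = len(mu)
Q = lam * cov - np.diag(mu)
Q += penalty * (np.ones((n, n)) - 2 * budget * np.eye(n))   # (sum(x) - budget)^2 up to a constant

# Brute-force search over all binary portfolios (feasible only at toy scale;
# an annealer such as D-Wave would sample this same QUBO at larger sizes).
best = min(itertools.product([0, 1], repeat=n),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print("selected assets:", best)
```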
SSRN
I review Sunstein's recent book, emphasising its reliance on classical, as opposed to ecological, rationality.
arXiv
The possibility of re-switching of techniques in Piero Sraffa's intersectoral model, namely the return of capital-intensive techniques as the profit rate changes monotonically, is traditionally considered a paradox that puts at stake the viability of the neoclassical theory of production. It is argued here that this phenomenon can be rationalized within the neoclassical paradigm. Sectoral interdependencies can give rise to non-monotonic effects of progressive variations in income distribution on relative prices. The re-switching of techniques is therefore the result of cost-minimizing technical choices in the face of returning ranks of relative input prices, in full consistency with the neoclassical perspective.
SSRN
There are three fundamental ways of testing the validity of an investment algorithm against historical evidence: a) the walk-forward method; b) the resampling method; and c) the Monte Carlo method. By far the most common approach followed among academics and practitioners is the walk-forward method. Implicit in that choice is the assumption that a given investment algorithm should be deployed throughout all market regimes. We denote this assumption the "all-weather" hypothesis, and the algorithms based on that hypothesis "strategic investment algorithms" (or "investment strategies"). The all-weather hypothesis is not necessarily true, as demonstrated by the fact that many investment strategies have floundered in a zero-rate environment. This motivates the problem of identifying investment algorithms that are optimal for specific market regimes, denoted "tactical investment algorithms." This paper argues that backtesting against synthetic datasets should be the preferred approach for developing tactical investment algorithms. A new organizational structure for asset managers is proposed, as a tactical algorithmic factory, consistent with the Monte Carlo backtesting paradigm.
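For concreteness, a minimal sketch of the walk-forward method, the first of the three approaches listed above (the resampling and Monte Carlo alternatives would replace the historical test window with resampled or synthetic data); the toy strategy, data, and window lengths are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
returns = rng.normal(0.0003, 0.01, 2000)     # synthetic daily returns, illustrative only

train_len, test_len = 250, 50
out_of_sample = []

# Walk-forward: fit on a rolling in-sample window, apply to the next out-of-sample window.
for start in range(0, len(returns) - train_len - test_len, test_len):
    train = returns[start:start + train_len]
    test = returns[start + train_len:start + train_len + test_len]
    signal = 1.0 if train.mean() > 0 else -1.0       # toy "strategy": follow the in-sample drift
    out_of_sample.append(signal * test)

oos = np.concatenate(out_of_sample)
sharpe = oos.mean() / oos.std() * np.sqrt(252)
print(f"walk-forward out-of-sample Sharpe ratio: {sharpe:.2f}")
```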
SSRN
Until recently, few efforts have been made to systematically measure and aggregate the nominal value of the different types of sovereign government debt in default. To help fill this gap, the Bank of Canada (BoC) developed a comprehensive database of sovereign defaults that is posted on its website and updated in partnership with the Bank of England (BoE). Our database draws on previously published datasets compiled by various public and private sector sources. It combines elements of these, together with new information, to develop comprehensive estimates of stocks of government obligations in default. These include bonds and other marketable securities, bank loans and official loans, valued in US dollars, for the years 1960 to 2018 on both a country-by-country and a global basis. This update of the BoC-BoE database, and future updates, will be useful to researchers analyzing the economic and financial effects of individual sovereign defaults and, importantly, the impact on global financial stability of episodes involving multiple sovereign defaults.
arXiv
In this paper, a mathematical model based on the one-parameter Mittag-Leffler function is proposed, for the first time, to describe the relation between the unemployment rate and the inflation rate, also known as the Phillips curve. In the literature, the Phillips curve is often represented by an exponential-like shape. On the other hand, Phillips in his fundamental paper used a power function in the model definition. Considering that the ordinary as well as the generalised Mittag-Leffler function behaves between a purely exponential function and a power function, it is natural to implement it in the definition of the model used to describe the relation between the data representing the Phillips curve. For modelling purposes, data from two different European economies, France and Switzerland, were used, and an out-of-sample forecast was performed to compare the performance of the Mittag-Leffler model with that of the power-type and exponential-type models. The results demonstrate that the ability of the Mittag-Leffler function to fit data that manifest signs of stretched exponentials, oscillations or even damped oscillations can be of use when describing economic relations and phenomena, such as the Phillips curve.
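A brief sketch of the one-parameter Mittag-Leffler function via its series definition, together with a curve fit of a Phillips-curve-style relation; the functional form of the fitted model, the parameter bounds, and the data are assumptions made for illustration, not those of the paper.

```python
import numpy as np
from scipy.special import gamma
from scipy.optimize import curve_fit

def mittag_leffler(z, alpha, n_terms=200):
    """One-parameter Mittag-Leffler function E_alpha(z) via its power series."""
    k = np.arange(n_terms)
    return np.sum(np.power.outer(z, k) / gamma(alpha * k + 1), axis=-1)

def phillips_model(u, c, alpha, b):
    """Hypothetical Phillips-curve form: inflation = c * E_alpha(-b * unemployment)."""
    return c * mittag_leffler(-b * u, alpha)

# Synthetic unemployment/inflation observations (illustrative, not the paper's data).
rng = np.random.default_rng(5)
u = np.linspace(2.0, 10.0, 40)
infl = 6.0 * mittag_leffler(-0.6 * u, 0.9) + rng.normal(0, 0.1, u.size)

params, _ = curve_fit(phillips_model, u, infl, p0=[5.0, 0.85, 0.5],
                      bounds=([0.5, 0.7, 0.2], [20.0, 1.1, 1.0]))
print("fitted (c, alpha, b):", np.round(params, 3))
```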
arXiv
Financial crime is a large and growing problem, in some way touching almost every financial institution. Financial institutions are the front line in the war against financial crime and, accordingly, must devote substantial human and technology resources to this effort. Current processes to detect financial misconduct are limited in their ability to effectively differentiate between malicious behavior and ordinary financial activity. These limitations tend to result in gross over-reporting of suspicious activity that necessitates time-intensive and costly manual review. Advances in technology used in this domain, including machine learning based approaches, can improve upon the effectiveness of financial institutions' existing processes. However, a key challenge that most financial institutions continue to face is that they address financial crimes in isolation, without any insight from other firms. Where financial institutions address financial crimes through the lens of their own firm, perpetrators may devise sophisticated strategies that span institutions and geographies. Financial institutions continue to work relentlessly to advance their capabilities, forming partnerships across institutions to share insights, patterns and capabilities. These public-private partnerships are subject to stringent regulatory and data privacy requirements, making it difficult to rely on traditional technology solutions. In this paper, we propose a methodology for sharing key information across institutions by using a federated graph learning platform that enables us to build more accurate machine learning models by combining federated learning and graph learning approaches. We demonstrate that our federated model outperforms the local model by 20% on the UK FCA TechSprint data set. This new platform opens the door to efficiently detecting global money laundering activity.
arXiv
Conditional Autoregressive Value-at-Risk and Conditional Autoregressive Expectile have become two popular approaches for the direct measurement of market risk. Since their introduction, several improvements in both the Bayesian and the classical framework have been proposed to better account for asymmetry and local non-linearity. Here we propose a unified Bayesian Conditional Autoregressive Risk Measures approach using the Skew Exponential Power distribution. Further, we extend the proposed models using a semiparametric P-spline approximation, providing a flexible way to account for non-linearity. For statistical inference, we adapt the MCMC algorithm proposed in Bernardi et al. (2018) to our case. The effectiveness of the whole approach is demonstrated using real data on the daily returns of five stock market indices.
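For background on the first approach named above, a minimal sketch of the Symmetric Absolute Value CAViaR recursion, one common specification; the coefficients are invented, and the paper's Bayesian Skew Exponential Power and P-spline machinery is not reproduced here.

```python
import numpy as np

def caviar_sav(returns, beta, q0):
    """Symmetric Absolute Value CAViaR: q_t = b0 + b1*q_{t-1} + b2*|r_{t-1}|."""
    b0, b1, b2 = beta
    q = np.empty(len(returns))
    q[0] = q0
    for t in range(1, len(returns)):
        q[t] = b0 + b1 * q[t - 1] + b2 * abs(returns[t - 1])
    return q

rng = np.random.default_rng(11)
r = rng.normal(0, 0.01, 500)                       # synthetic daily returns
beta = (-0.0005, 0.9, -0.3)                        # hypothetical coefficients for a lower-tail quantile
q = caviar_sav(r, beta, q0=np.quantile(r[:50], 0.05))
print("share of returns below the 5% CAViaR quantile:", np.mean(r < q))
```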