Research articles for 2021-07-29
arXiv
This paper introduces a dynamic change of measure approach for computing the analytical solutions of expected future prices (and therefore, expected returns) of contingent claims over a finite horizon. The new approach constructs hybrid probability measures called "equivalent expectation measures" (EEMs), which provide the physical expectation of the claim's future price before the horizon date, and serve as pricing measures on or after the horizon date. The EEM theory can be used for empirical investigations of both the cross-section and the term structure of returns of contingent claims, such as Treasury bonds, corporate bonds, and financial derivatives.
arXiv
This review presents the set of electricity price models proposed in the literature since the opening of power markets. We focus on price models applied to financial pricing and risk management. We classify these models according to their ability to represent the random behavior of prices and some of their characteristics. In particular, this classification helps users to choose among the most suitable models for their risk management problems.
SSRN
AI in finance broadly refers to the applications of AI techniques in financial businesses. This area has attracted attention for decades, with both classic and modern AI techniques applied to increasingly broader areas of finance, economy and society. In contrast to either discussing the problems, aspects and opportunities of finance that have benefited from specific AI techniques, in particular some new-generation AI and data science (AIDS) areas, or reviewing the progress of applying specific techniques to resolving certain financial problems, this review offers a comprehensive and dense roadmap of the overwhelming challenges, techniques and opportunities of AI research in finance over the past decades. The landscapes and challenges of financial businesses and data are first outlined, followed by a comprehensive categorization and a dense overview of the decades of AI research in finance. We then structure and illustrate the data-driven analytics and learning of financial businesses and data. A comparison, criticism and discussion of classic vs. modern AI techniques for finance follows. Finally, the open issues and opportunities to address future AI-empowered finance and finance-motivated AI research are discussed.
arXiv
Backtesting risk measure forecasts requires identifiability (for model validation) and elicitability (for model comparison). The systemic risk measures CoVaR (conditional value-at-risk), CoES (conditional expected shortfall) and MES (marginal expected shortfall), measuring the risk of a position $Y$ given that a reference position $X$ is in distress, fail to be identifiable and elicitable. We establish the joint identifiability of CoVaR, MES and (CoVaR, CoES) together with the value-at-risk (VaR) of the reference position $X$, but show that an analogue result for elicitability fails. The novel notion of multi-objective elicitability however, relying on multivariate scores equipped with an order, leads to a positive result when using the lexicographic order on $\mathbb{R}^2$. We establish comparative backtests of Diebold--Mariano type for superior systemic risk forecasts and comparable VaR forecasts, accompanied by a traffic-light approach. We demonstrate the viability of these backtesting approaches in simulations and in an empirical application to DAX 30 and S&P 500 returns.
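The conditional risk measures in this abstract can be illustrated with a simple empirical estimator. The sketch below is our own illustration, not the paper's method: it computes CoVaR as the VaR of position Y restricted to the subsample where the reference position X is at or beyond its own VaR, with losses encoded as positive numbers (an assumption).

```python
import numpy as np

def empirical_covar(x, y, alpha=0.95):
    """Empirical CoVaR sketch: the value-at-risk of position Y, conditional
    on the reference position X being in distress (at or beyond its own VaR).
    Losses are encoded as positive numbers here -- an assumption."""
    var_x = np.quantile(x, alpha)           # VaR of the reference position X
    y_distress = y[x >= var_x]              # outcomes of Y while X is in distress
    return np.quantile(y_distress, alpha)   # VaR of Y within that subsample

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = 0.6 * x + 0.8 * rng.standard_normal(100_000)  # Y's losses co-move with X's

covar = empirical_covar(x, y)
```

Because Y's losses co-move with X's, the conditional quantile sits well above the unconditional VaR of Y, which is the systemic amplification CoVaR is meant to capture.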
arXiv
The 2017 crackdown on Rakhine Rohingyas by the Myanmar army (Tatmadaw) pushed more than 600,000 refugees into Bangladesh. Both Western and Islamic countries denounced Aung San Suu Kyi's government, but the two Asian giants, China and India, supported Myanmar's actions. Both also have high stakes in Myanmar given their long-term geopolitical and geoeconomic plans for South and Southeast Asia. In spite of Myanmar-based commonalities, China's and India's approaches differ significantly, predicting equally dissimilar outcomes. This chapter examines their foreign policy and stakes in Myanmar in order to draw a sketch of the future of Rakhine Rohingyas stuck in Bangladesh.
arXiv
City logistics involves movements of goods in urban areas respecting municipal and administrative guidelines. The importance of city logistics has grown over the years, especially given its role in minimizing traffic congestion and freeing up public space for city residents. Collaboration is key to managing city logistics operations efficiently. Collaboration can take place in the form of goods consolidation, sharing of resources, information sharing, etc. We investigate the problems of collaboration planning among stakeholders to achieve sustainable city logistics operations. Two categories of models are proposed to evaluate collaboration strategies. At the macro level, we have the simplified collaboration square model and the advanced collaboration square model, and at the micro level we have the operational level model. These collaboration decision-making models, with their mathematical elaborations on business-to-business, business-to-customer, customer-to-business, and customer-to-customer relationships, provide roadmaps for evaluating the collaboration strategies of stakeholders for achieving sustainable city logistics operations, attainable under non-chaotic situations and presumptions of human levity tendency. City logistics stakeholders can strive to achieve effective collaboration strategies for sustainable city logistics operations by mitigating the uncertainty effect and understanding the theories behind the moving nature of the individual complexities of a city. To investigate system complexity, we propose axioms of uncertainty and use spider networks and system dynamics modeling to investigate system elements and their behavior over time.
arXiv
This paper introduces new methods to study behaviours among the 52 largest cryptocurrencies between 01-01-2019 and 30-06-2021. First, we explore evolutionary correlation behaviours and apply a recently proposed turning point algorithm to identify regimes in market correlation. Next, we inspect the relationship between collective dynamics and the cryptocurrency market size, revealing an inverse relationship between the size of the market and the strength of collective dynamics. We then explore the time-varying consistency of the relationships between cryptocurrencies' size and their returns and volatility. There, we demonstrate that there is greater consistency between size and volatility than between size and returns. Finally, we study how the spread of volatility behaviours across the market changes with time by examining the structure of Wasserstein distances between probability density functions of rolling volatility. We demonstrate a new phenomenon of increased uniformity in volatility during market crashes, which we term \emph{volatility dispersion}.
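The final step described above, comparing rolling-volatility distributions via Wasserstein distances, can be sketched as follows. This is an illustrative sketch under our own assumptions (a 30-day window, and the sorted-sample formula for the 1-D Wasserstein-1 distance between equal-length samples), not the authors' code.

```python
import numpy as np

def rolling_vol(returns, window=30):
    """Rolling standard deviation as a simple volatility proxy."""
    r = np.asarray(returns)
    return np.array([r[i:i + window].std() for i in range(len(r) - window + 1)])

def w1_distance(a, b):
    """Wasserstein-1 distance between two equal-length 1-D empirical samples:
    the mean absolute difference of the sorted values."""
    return np.abs(np.sort(a) - np.sort(b)).mean()

rng = np.random.default_rng(1)
calm = rng.normal(0, 0.01, 500)  # returns of a low-volatility asset
wild = rng.normal(0, 0.05, 500)  # returns of a high-volatility asset

# A small distance between rolling-volatility distributions signals uniform
# ("dispersed") volatility across assets; a large one signals heterogeneity.
d = w1_distance(rolling_vol(calm), rolling_vol(wild))
```

Pairwise distances like `d`, computed across all assets, give the distance structure the paper examines over time.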
arXiv
Many fields have been affected in recent years by the introduction of concepts such as sensors, Industry 4.0, the internet of things, machine learning and artificial intelligence. The digital twin model emerged from the interaction of cyber-physical systems with these concepts, and it has since been used in many areas. The use of this model has produced significant gains, especially in decision-making processes; these gains contribute to every field and drive changes in terms of cost. This study reviews the historical development of the digital twin concept and gives general information about its areas of use. In light of this information, we evaluate the cost effect of the digital twin model, its appearance from a cost accounting perspective, and its use as a cost reduction method. The study was carried out to shed light on future work, given the insufficient resources on this topic in the Turkish literature from a cost accounting perspective.
SSRN
This article examines the impact of COVID-19 on market sentiment and stock market returns for firms listed in India using daily data from January 2020 to May 2021, a period that includes the first and second waves of the COVID-19 pandemic. We applied wavelet coherence to explore the co-movement of COVID-19 and sentiment. Market-related implicit sentiment proxies depicting the market's bullish sentiment negatively correlate with COVID-19, whereas sentiment proxies representing the market's bearish sentiment positively correlate with COVID-19 during the first wave of the pandemic. No co-movement was found between sentiment proxies and COVID-19 during the second wave of the pandemic. We also applied event study methodology to measure abnormal returns and regression analysis to explain the causes of abnormal returns during both pandemic waves. We found statistically significant negative abnormal returns during wave 1 of COVID-19 due to increased negative sentiment in the market. During wave 2, we did not find abnormal returns to be statistically significant. We also found that during wave 1, return on assets (ROA) is statistically associated with abnormal returns, while during wave 2, no firm-specific characteristic is statistically associated with abnormal returns. These findings are among the first empirical evidence on the relationship between COVID-19, market sentiment, and stock market returns during wave 1 and wave 2 of the pandemic.
arXiv
We propose a new estimator for the average causal effects of a binary treatment with panel data in settings with general treatment patterns. Our approach augments the two-way-fixed-effects specification with the unit-specific weights that arise from a model for the assignment mechanism. We show how to construct these weights in various settings, including situations where units opt into the treatment sequentially. The resulting estimator converges to an average (over units and time) treatment effect under the correct specification of the assignment model. We show that our estimator is more robust than the conventional two-way estimator: it remains consistent if either the assignment mechanism or the two-way regression model is correctly specified and performs better than the two-way-fixed-effect estimator if both are locally misspecified. This strong double robustness property quantifies the benefits from modeling the assignment process and motivates using our estimator in practice.
arXiv
This paper presents static and dynamic versions of univariate, multivariate, and multilevel functional time-series methods to forecast implied volatility surfaces in foreign exchange markets. We find that dynamic functional principal component analysis generally improves out-of-sample forecast accuracy. More specifically, the dynamic univariate functional time-series method shows the greatest improvement. Our models lead to multiple instances of statistically significant improvements in forecast accuracy for daily EUR-USD, EUR-GBP, and EUR-JPY implied volatility surfaces across various maturities, when benchmarked against established methods. A stylised trading strategy is also employed to demonstrate the potential economic benefits of our proposed approach.
arXiv
This paper studies a dynamic optimal reinsurance and dividend-payout problem for an insurer in a finite time horizon. The goal of the insurer is to maximize its expected cumulative discounted dividend payouts until bankruptcy or maturity, whichever comes earlier. The insurer is allowed to dynamically choose reinsurance contracts over the whole time horizon. This is a mixed singular-classical control problem, and the corresponding Hamilton-Jacobi-Bellman equation is a variational inequality with a fully nonlinear operator and a gradient constraint. The $C^{2,1}$ smoothness of the value function and a comparison principle for its gradient function are established by a penalty approximation method. We find that the surplus-time space can be divided into three non-overlapping regions by a risk-magnitude-and-time-dependent reinsurance barrier and a time-dependent dividend-payout barrier. The insurer should be exposed to higher risk as surplus increases; exposed to all the risks once surplus upward crosses the reinsurance barrier; and pay out all reserves in excess of the dividend-payout barrier. The localities of these regions are explicitly estimated.
arXiv
Stock and financial markets are examined from a communication-theoretical perspective on the dynamics of information and meaning. The study focuses on the link between the dynamics of investors' expectations and market price movement. This process is considered quantitatively in a model representation. On the supposition that available information is processed differently by different groups of investors, market asset price evolution is described from the viewpoint of communicating information and generating meaning within the market. A non-linear evolutionary equation linking investors' expectations with market asset price movement is derived. Model predictions are compared with real market data.
RePEC
Fair-value budgeting represents a more comprehensive measure of cost for government activities than the measure required under current law. However, fair-value budgeting raises practical questions: Which government activities would benefit from fair-value estimates? How might they be used? How can agencies estimate fair value without observing market prices for government risks? The use of fair value could depend on three criteria: Commitment, whether the government makes commitments that it cannot shed through future legislation;
arXiv
Forex trading is the largest market in terms of quantitative trading. Traditionally, traders refer to technical analysis based on historical data to make decisions and trade. With the development of artificial intelligence, deep learning plays an increasingly important role in forex forecasting. How to use deep learning models to predict future prices is the primary purpose of most researchers. Such prediction not only helps investors and traders make decisions, but can also be used for auto-trading systems. In this article, we propose a novel feature selection approach called 'feature importance recap', which combines the feature importance score from a tree-based model with the performance of a deep learning model. A stacking model is also developed to further improve the performance. Our results show that a proper feature selection approach can significantly improve model performance, and that for financial data, some features have high importance scores across many models. The results of the stacking model indicate that combining the predictions of several models and feeding them into a neural network can further improve performance.
SSRN
Several features of financial research make it particularly prone to the occurrence of false discoveries. First, the probability of finding a positive (a profitable investment strategy) is very low, due to intense competition. Second, true findings are mostly short-lived, as a result of the non-stationary nature of financial systems. Third, unlike in the natural sciences, it is rarely possible to verify statistical findings through controlled experiments. Finance's inability to conduct controlled experiments makes it virtually impossible to debunk a false claim. One would hope that, in such a field, researchers would be particularly careful when conducting statistical inference. Sadly, the opposite is true. Tenure-seeking researchers publish thousands of academic articles that promote dubious investment strategies, without controlling for multiple testing. Some of those articles are written for, funded, or promoted by investment firms with a commercial interest. As a consequence, today's academic finance exhibits some resemblance to medicine's predicament during the 1950-2000 period, when Big Tobacco paid for thousands of studies in support of its bottom line. Unlike finance, medical journals today impose strict controls for multiple testing. Academic finance's denial of its replication crisis risks its branding as a pseudoscience.
RePEC
Does financial intermediation affect structural change? We investigate both theoretically and empirically whether financial development accelerates structural change during the post-industrialization phase, where employment, value-added and expenditure shares shift towards services and away from manufacturing. We build a dynamic general equilibrium model where firms and households face different types of intermediation costs, and structural change can be driven by mutually independent technology differences (exogenous productivity gaps or asymmetric factor elasticities) as well as by learning-by-doing. Besides suggesting a stronger impact of financial development when productivity is endogenous and services are labor-intensive, all the model specifications robustly predict that exogenous reductions in intermediation costs (e.g., deregulation shocks) accelerate the pace and extent of structural change. We test this prediction empirically by examining the effects of state-by-state bank branching deregulation in the United States from the 1970s to the 1990s. Using a range of estimation techniques, including synthetic control methods (pooled, augmented, and with staggered treatment), we show that bank branching deregulation accelerated the structural change that was already underway, i.e., services account for a greater share of output and employment than they would have in the absence of deregulation.
SSRN
We study bank responses to the Paycheck Protection Program (PPP) and its effects on lender balance sheets and profitability. To address the endogeneity between bank decisions and balance sheet effects, we develop a Bayesian joint model that examines the decision to participate, the intensity of participation, and ultimate balance sheet outcomes. Overall, lenders were driven by risk-aversion and funding capacity rather than profitability in their decision to participate and the intensity of their participation. Indeed, with greater participation intensity, banks experienced sizable growth in their loan portfolios but a decline in their interest margins. In counterfactual exercises, we show that the PPP offset a large potential contraction in business lending, and that bank margins would have fallen even more precipitously if lenders had not participated in the program. Although the PPP was intended as a credit support program for small firms, the program indirectly supported the margins of banks that channeled these loans.
arXiv
This paper investigates the assumption of homogeneous effects of federal tax changes across the U.S. states and identifies where and why that assumption may not be valid. More specifically, what determines the transmission mechanism of tax shocks at the state level? How vital are states' fiscal structures, financial conditions, labor market rigidities, and industry mix? Do these economic and structural characteristics drive the transmission mechanism of the tax changes at the state level at different horizons? This study employs a panel factor-augmented vector autoregression (FAVAR) technique to address these questions. The findings show that state economies respond homogeneously in terms of employment and price levels; however, they react heterogeneously in terms of real GDP and personal income growth. In most states, these reactions are statistically significant, and the heterogeneity in the effects of tax cuts is significantly related to the state's fiscal structure, manufacturing and financial composition, and the labor market's rigidity. A cross-state regression analysis shows that states with higher tax elasticity, higher personal income tax, strict labor market regulation, and economic policy uncertainties are relatively less responsive to federal tax changes. In contrast, the magnitude of the response in real GDP, personal income, and employment to tax cuts is relatively higher in states with a larger share of finance, manufacturing, lower tax burdens, and flexible credit markets.
SSRN
We study the evolution of US mortgage credit supply during the COVID-19 pandemic. Although the mortgage market experienced a historic boom in 2020, we show there was also a large and sustained increase in intermediation markups that limited the pass-through of low rates to borrowers. Markups typically rise during periods of peak demand, but this historical relationship explains only part of the large increase during the pandemic. We present evidence that pandemic-related labor market frictions and operational bottlenecks contributed to unusually inelastic credit supply, and that technology-based lenders, likely less constrained by these frictions, gained market share. Rising forbearance and default risk did not significantly affect rates on "plain-vanilla" conforming mortgages, but it did lead to higher spreads on mortgages without government guarantees and loans to the riskiest borrowers. Mortgage-backed securities purchases by the Federal Reserve also supported the flow of credit in the conforming segment.
SSRN
The main issue addressed in this paper is whether a new financial crisis can be avoided. After reviewing the key elements that were present in the 2007/2009 financial crisis, there is an analysis of the regulatory reforms which took place during and after the financial meltdown. The role played in it by the shadow banking system and the regulatory reforms dealing with it deserve particular attention. The regulatory reforms are assessed in the context of systemic risk and run vulnerability in order to recommend what should be done to prevent a new financial crisis from happening. The main conclusions are: 1) A key issue in avoiding a new financial crisis is preventing an excessive concentration of loans in any one sector, region, or kind of asset in the economy. 2) The role of the central bank as lender of last resort should be reassessed in light of the experience of what has been done in the context of the COVID-19 pandemic. 3) In order to prevent managers from taking excessive risks using other people's money, managerial compensation schemes should be changed. 4) Issues which have to do with the conflict of interests in the credit rating agencies are still waiting for better regulation. 5) After the failure of mainstream economic theory, it is time to reevaluate the contributions of authors like Keynes, Kindleberger and Minsky on the subject of economic crises.
arXiv
Identifying the instances of jumps in a discrete time series sample of a jump diffusion model is a challenging task. We have developed a novel statistical technique for jump detection and volatility estimation in return time series data using a threshold method. Since we derive the threshold and the volatility estimator simultaneously by solving an implicit equation, we obtain unprecedented accuracy across a wide range of parameter values. Using this method, the increments attributed to jumps have been removed from a large collection of historical data of Indian sectoral indices. Subsequently, we test for the presence of regime-switching dynamics in the volatility coefficient using a new discriminating statistic. The statistic is shown to be sensitive to the transition kernel of the regime-switching model. We perform the testing using a bootstrap method and find a clear indication of the presence of multiple volatility regimes in the data.
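A minimal version of such a threshold scheme can be sketched as an iterative fixed point: estimate the diffusion volatility from the increments currently classified as diffusion, reset the threshold to a multiple of that estimate, and repeat until the threshold stops moving. This is our own simplified sketch (the multiplier `k` and the stopping rule are assumptions), not the paper's implicit equation.

```python
import numpy as np

def threshold_jump_filter(returns, k=3.0, tol=1e-8, max_iter=100):
    """Iterative threshold method: increments larger than k * sigma are
    classified as jumps; sigma is re-estimated from the remaining
    (diffusion) increments until convergence."""
    r = np.asarray(returns)
    sigma = r.std()                         # initial estimate, jumps included
    for _ in range(max_iter):
        threshold = k * sigma
        diffusion = r[np.abs(r) <= threshold]
        new_sigma = diffusion.std()
        if abs(new_sigma - sigma) < tol:
            break
        sigma = new_sigma
    jumps = np.abs(r) > k * sigma
    return sigma, jumps

rng = np.random.default_rng(2)
r = rng.normal(0, 0.01, 5000)               # pure diffusion increments
jump_idx = rng.choice(5000, 20, replace=False)
r[jump_idx] += rng.choice([-1.0, 1.0], 20) * 0.1   # inject large jumps

sigma_hat, jumps = threshold_jump_filter(r)
```

On this synthetic sample, the recovered `sigma_hat` is close to the true diffusion volatility of 0.01 and all injected jumps are flagged.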
SSRN
This paper examines how investors perceive business group membership in Korea during the COVID-19 pandemic. Evidence of a time-varying and heterogeneous value of affiliation emerges from stock price performance analysis. I find that investors discount business group affiliation during a market collapse, but are willing to pay a premium for affiliation during market recovery. Overall, this pattern is more pronounced for financially weak affiliates and large business groups. Results also show that business group membership alleviates investors' concerns regarding financial flexibility, highlighting the role of internal capital markets as a substitute for external finance.
SSRN
We study the impact of transparency on liquidity in OTC markets. We do so by providing an analysis of liquidity in a corporate bond market without trade transparency (Germany), and comparing our findings to a market with full post-trade disclosure (the U.S.). We employ a unique regulatory dataset of transactions of German financial institutions from 2008 to 2014 and find that: First, overall trading activity is much lower in the German market than in the U.S. Second, similar to the U.S., the determinants of German corporate bond liquidity are in line with search theories of OTC markets. Third, surprisingly, frequently traded German bonds have transaction costs that are 39-61 bp lower than a matched sample of bonds in the U.S. Our results support the notion that, while market liquidity is generally higher in transparent markets, a subset of bonds could be more liquid in more opaque markets because of investors "crowding" their demand into a small number of more actively traded securities.
arXiv
We examine machine learning and factor-based portfolio optimization. We find that factors based on autoencoder neural networks exhibit a weaker relationship with commonly used characteristic-sorted portfolios than popular dimensionality reduction techniques. Machine learning methods also lead to covariance and portfolio weight structures that diverge from simpler estimators. Minimum-variance portfolios using latent factors derived from autoencoders and sparse methods outperform simpler benchmarks in terms of risk minimization. These effects are amplified for investors with an increased sensitivity to risk-adjusted returns, during high volatility periods or when accounting for tail risk.
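The minimum-variance construction referenced above has a standard closed form once a covariance estimate (from latent factors or otherwise) is in hand: w = Σ⁻¹1 / (1ᵀΣ⁻¹1). A minimal sketch with a toy covariance matrix follows; the numbers are illustrative, not from the paper.

```python
import numpy as np

def min_variance_weights(cov):
    """Fully-invested minimum-variance portfolio in closed form:
    w = inv(Sigma) @ 1, normalized so the weights sum to one."""
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / w.sum()

# Toy covariance estimate: asset 0 is much less volatile than the others,
# so the minimum-variance portfolio tilts heavily towards it.
cov = np.array([[0.010, 0.002, 0.001],
                [0.002, 0.040, 0.004],
                [0.001, 0.004, 0.090]])
w = min_variance_weights(cov)
```

In the paper's setting, `cov` would be the covariance implied by autoencoder-derived latent factors rather than a raw sample estimate.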
arXiv
The MobilityCoin is a new, all-encompassing currency for the management of the multimodal urban transportation system. MobilityCoins include and replace various existing transport policy instruments, while also incentivizing a shift to more sustainable modes and empowering the public to vote for infrastructure measures.
arXiv
This article examines neural network-based approximations for the superhedging price process of a contingent claim in a discrete time market model. First we prove that the $\alpha$-quantile hedging price converges to the superhedging price at time $0$ for $\alpha$ tending to $1$, and show that the $\alpha$-quantile hedging price can be approximated by a neural network-based price. This provides a neural network-based approximation for the superhedging price at time $0$ and also the superhedging strategy up to maturity. To obtain the superhedging price process for $t>0$, by using the Doob decomposition it is sufficient to determine the process of consumption. We show that it can be approximated by the essential supremum over a set of neural networks. Finally, we present numerical results.
arXiv
Online trading has attracted millions of people around the world. In March 2021, it was reported that there were 18 million accounts with just one broker. Historically, manipulation in financial markets has been understood as fraudulently influencing the prices of shares, currency pairs or any other indices. This article introduces the idea that online trading platform technical issues can be considered a form of broker manipulation to control traders' profit and loss. More importantly, it shows these technical issues are contributing factors to the 82% risk of retail traders losing money. We identify trading platform technical issues of one of the world's leading online trading providers and calculate retail traders' losses caused by these issues. To do this, we independently record the details of each trade using the REST API response provided by the broker. We show that traders' log activity files are the only way to assess any suspected profit or loss manipulation by the broker. Therefore, it is essential for any retail trader to have access to their log files. We compare our findings with the broker's Trustpilot customer reviews. We illustrate how traders' profit and loss can be negatively affected by the broker's platform technical issues, such as not being able to close profitable trades, trades closing with delays, disappearance of trades, disappearance of profit from clients' statements, profit and loss discrepancies, stop losses not being triggered, and stop losses or limit orders triggered too early. Although regulatory bodies try to ensure that consumers get a fair deal, these attempts are hugely insufficient in protecting retail traders. Therefore, regulatory bodies such as the FCA should take these technical issues seriously and not rely on brokers' internal investigations, because under any other circumstances, these platform manipulations would be considered crimes and the conniving misappropriation of funds.
arXiv
Large digital platforms create environments where different types of user interactions are captured; these relationships offer a novel source of information for fraud detection problems. In this paper we propose a framework of relational graph convolutional network methods for preventing fraudulent behaviour in the financial services of a Super-App. To this end, we apply the framework to different heterogeneous graphs of users, devices, and credit cards, and finally use an interpretability algorithm for graph neural networks to determine the relations most important to the classification task on users. Our results show that there is added value in considering models that take advantage of the alternative data of the Super-App and the interactions found in its high connectivity, further proving how these can be leveraged into better decisions and fraud detection strategies.
arXiv
Non-governmental organisations have made a significant contribution to the development of Bangladesh. Today, Bangladesh has more than 2000 NGOs, and a few of them are among the largest in the world. NGOs are claimed to have impacts on sustainable development in Bangladesh. However, to what extent they have fostered equity and social inclusion in the urban cities of Bangladesh remains a subject requiring thorough examination. The 11th goal of the Sustainable Development Goals (SDGs) advocates for making cities and human settlements inclusive, safe, resilient and sustainable. Bangladesh, the most densely populated country in the world, faces multifaceted urbanization challenges. The capital city Dhaka itself has experienced staggering population growth in the last few decades. Today, Dhaka has become one of the fastest growing megacities in the world. Dhaka started its journey with a manageable population of 2.2 million in 1975, which has now reached 14.54 million, with growth averaging 6 per cent each year. As this rapid growth of Dhaka City is not commensurate with its industrial development, a significant portion of its population is living in informal settlements or slums, where they experience the highest levels of poverty and vulnerability. Many NGOs have taken either concerted or individual efforts to address socio-economic challenges in the city. Earlier results suggest that programs undertaken by NGOs have shown potential to positively contribute to fostering equity and reducing social exclusion. This paper attempts to explore what types of relevant NGO programs are currently in place, taking the case of Dhaka city.
arXiv
Predicting the future price trends of stocks is a challenging yet intriguing problem given its critical role in helping investors make profitable decisions. In this paper, we present a collaborative temporal-relational modeling framework for end-to-end stock trend prediction. The temporal dynamics of stocks are first captured with an attention-based recurrent neural network. Then, different from existing studies relying on the pairwise correlations between stocks, we argue that stocks are naturally connected as a collective group, and introduce hypergraph structures to jointly characterize the stock group-wise relationships of industry-belonging and fund-holding. A novel hypergraph tri-attention network (HGTAN) is proposed to augment the hypergraph convolutional networks with a hierarchical organization of intra-hyperedge, inter-hyperedge, and inter-hypergraph attention modules. In this manner, HGTAN adaptively determines the importance of nodes, hyperedges, and hypergraphs during the information propagation among stocks, so that the potential synergies between stock movements can be fully exploited. Extensive experiments on real-world data demonstrate the effectiveness of our approach. Also, the results of investment simulation show that our approach can achieve a more desirable risk-adjusted return. The data and codes of our work have been released at https://github.com/lixiaojieff/HGTAN.
SSRN
This paper subjects the global fear index (GFI) for the COVID-19 pandemic to an empirical investigation to determine its relationship with realized volatility, continuous volatility, and jump volatility for seven international indices. The results show evidence of a positive relationship between the measures of realized volatility and the global fear index, which is persistent in periods of high volatility, and a negative relationship in periods of low volatility. It is observed that when serial dependence, systemic risk, and alternative measures of volatility are considered, the significance of the relationship between the COVID-19 global fear index and volatility decreases substantially.
SSRN
Excessive household borrowing has been identified as an important determinant of financial crises. Borrower-based macroprudential instruments have been proposed as a possible remedy. In Germany, two instruments have been available to macroprudential supervisors since 2017: a cap on the loan-to-value (LTV) ratio and an amortization requirement, but neither has been activated so far. Therefore, this paper presents a simulation tool that allows the impact of activating borrower-based instruments to be evaluated ex ante. The simulation is based on microdata from the German Panel on Household Finances (PHF) and is at the same time calibrated to match aggregate developments in the residential real estate market. This micro-macro consistent simulation approach can be used to detect vulnerabilities in household balance sheets and perform an ex ante analysis of the activation and calibration of borrower-based macroprudential instruments. An illustrative example of a hypothetical activation shows that the introduction of a cap on the loan-to-value (LTV) ratio of new mortgage loans in Germany could improve important indicators of household vulnerability.
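The mechanics of an LTV cap in such a simulation are straightforward: each new loan is truncated at the cap times the property value, and the constrained households are flagged. Below is a minimal sketch under our own assumptions (an 80% cap and uniformly simulated loans, not the PHF microdata or the paper's calibration).

```python
import numpy as np

def apply_ltv_cap(loans, values, cap=0.8):
    """Cap each new mortgage at cap * property value. Returns the capped
    loan amounts and a flag marking which households were constrained."""
    loans = np.asarray(loans)
    max_loan = cap * np.asarray(values)
    capped = np.minimum(loans, max_loan)
    constrained = loans > max_loan
    return capped, constrained

rng = np.random.default_rng(3)
values = rng.uniform(200_000, 600_000, 1000)   # simulated property values
ltvs = rng.uniform(0.5, 1.1, 1000)             # some loans exceed an 80% cap
loans = ltvs * values

capped, constrained = apply_ltv_cap(loans, values, cap=0.8)
```

Aggregating `constrained` and the credit shortfall `loans - capped` over the household sample is what links the micro simulation back to vulnerability indicators.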
arXiv
There is mounting public concern over the influence that AI based systems has in our society. Coalitions in all sectors are acting worldwide to resist hamful applications of AI. From indigenous people addressing the lack of reliable data, to smart city stakeholders, to students protesting the academic relationships with sex trafficker and MIT donor Jeffery Epstein, the questionable ethics and values of those heavily investing in and profiting from AI are under global scrutiny. There are biased, wrongful, and disturbing assumptions embedded in AI algorithms that could get locked in without intervention. Our best human judgment is needed to contain AI's harmful impact. Perhaps one of the greatest contributions of AI will be to make us ultimately understand how important human wisdom truly is in life on earth.