# Research articles for 2020-11-18

A New Form of Banking -- Concept and Mathematical Model of Venture Banking
Brian P Hanley
arXiv

This theoretical model contains the concept, equations, and graphical results for venture banking. A system of 27 equations describes the behavior of the venture-bank and underwriter system, allowing phase-space-type graphs that show where profits and losses occur. These results confirm and expand those obtained from the original spreadsheet-based model. An example investment in a castle at a loss is provided to clarify the concept. This model requires that all investments are in enterprises that create new utility value. The assessed utility value created is the new money out of which the venture bank and underwriter are paid. The model presented chooses parameters that ensure that the venture-bank experiences losses before the underwriter does. Parameters are: DIN Premium, 0.05; Clawback lien fraction, 0.77; Clawback bonds and equity futures discount, 1.5 x (USA 12-month LIBOR); Range of clawback bonds sold, 0 to 100%; Range of equity futures sold, 0 to 70%.

A Risk Based approach for the Solvency Capital requirement for Health Plans
Fabio Baione, Davide Biancalana, Paolo De Angelis
arXiv

The study deals with the assessment of risk measures for Health Plans in order to assess the Solvency Capital Requirement. For the estimation of the individual health care expenditure for several episode types, we suggest an original approach based on a three-part regression model. We propose three Generalized Linear Models (GLMs) to assess claim counts, the allocation of each claim to a specific episode, and the average severity of expenditures, respectively. One of the main practical advantages of our proposal is the reduction in the number of regression models compared to a traditional approach, where several two-part models for each episode type are required. As most health plans require co-payments or co-insurance, and considering the non-linearity of the reimbursement function, we adopt a Monte Carlo simulation to assess the health plan costs. The simulation approach provides the probability distribution of the Net Asset Value of the Health Plan and the estimate of several risk measures.
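To make the three-part decomposition concrete, here is a toy sketch of how fitted outputs of the three models could compose into an expected annual cost per insured. All numbers, episode names, and the Poisson frequency assumption are invented for illustration, not taken from the paper:

```python
# Hypothetical fitted outputs of the three GLMs for one insured:
expected_claims = 2.4  # GLM 1: expected annual claim count (e.g. Poisson mean)
episode_probs = {"dental": 0.5, "specialist": 0.3, "hospital": 0.2}   # GLM 2
mean_severity = {"dental": 80.0, "specialist": 150.0, "hospital": 900.0}  # GLM 3

# Expected annual expenditure = E[N] * sum_j P(episode j) * E[cost | episode j]
expected_cost = expected_claims * sum(
    p * mean_severity[ep] for ep, p in episode_probs.items()
)
# 2.4 * (0.5*80 + 0.3*150 + 0.2*900) = 636.0
```

The practical point of the paper's design is visible here: one frequency model and one allocation model are shared across all episode types, instead of a separate two-part model per type.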

An application of Zero-One Inflated Beta regression models for predicting health insurance reimbursement
Fabio Baione, Davide Biancalana, Paolo De Angelis
arXiv

In actuarial practice, the dependency between contract limitations (deductibles, copayments) and health care expenditures is typically measured by Monte Carlo simulation. We propose, for the same goal, an alternative approach based on the Generalized Linear Model for Location, Scale and Shape (GAMLSS). We focus on the estimate of the ratio between the one-year reimbursement amount (after the effect of limitations) and the one-year expenditure (before the effect of limitations). We suggest a regression model to investigate the relation between this response variable and a set of covariates, such as limitations and other rating factors related to health risk. In this way, a dependency structure between reimbursement and limitations is provided. The density function of the ratio is a mixture distribution: it places probability mass at 0 and 1, in addition to a continuous probability density within (0, 1). This random variable does not belong to the exponential family, so an ordinary Generalized Linear Model is not suitable. GAMLSS introduces a probability structure compliant with the density of the response variable; in particular, a zero-one inflated beta density is assumed. The latter is a mixture of a Bernoulli distribution and a Beta distribution.
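A minimal sketch of the zero-one inflated beta density just described (parameter values are invented; this is not the authors' implementation): two point masses plus a Beta density carrying the remaining weight.

```python
import math

def zoib_density(y, p0, p1, a, b):
    """Zero-one inflated beta: point masses p0 at y=0 and p1 at y=1
    (the Bernoulli part), plus a Beta(a, b) density on (0, 1) carrying
    the remaining weight 1 - p0 - p1."""
    if y == 0.0:
        return p0
    if y == 1.0:
        return p1
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return (1 - p0 - p1) * y ** (a - 1) * (1 - y) ** (b - 1) / B

# Sanity check: the discrete masses plus the integral over (0, 1)
# (midpoint rule) sum to one.
n = 100_000
cont = sum(zoib_density((i + 0.5) / n, 0.10, 0.05, 2.0, 5.0) for i in range(n)) / n
total_mass = 0.10 + 0.05 + cont
```

Because the support mixes atoms with a continuous part, neither a plain Beta GLM nor a Bernoulli model fits the ratio on its own, which is the motivation for the GAMLSS machinery.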

Assessing the use of transaction and location based insights derived from Automatic Teller Machines (ATMs) as near real time sensing systems of economic shocks
Dharani Dhar Burra, Sriganesh Lokanathan
arXiv

Big data sources provide a significant opportunity for governments and development stakeholders to sense and identify, in near real time, the economic impacts of shocks on populations at high spatial and temporal resolutions. In this study, we assess the potential of transaction- and location-based measures obtained from automatic teller machine (ATM) terminals, belonging to a major private-sector bank in Indonesia, to sense in near real time the impacts of shocks across income groups. For each customer, and separately for the years 2014 and 2015, we model the relationship between aggregate measures of cash withdrawals for each year, the total inter-terminal distance traversed by the customer for that year, and the reported customer income group. Results reveal that the model was able to predict the corresponding income groups with 80% accuracy, with high precision and recall values in comparison to the baseline model, across both years. Shapley values suggest that the total inter-terminal distance traversed by a customer in each year differed significantly between customer income groups. A Kruskal-Wallis test further showed that customers in the lower-middle-class income group have significantly higher median inter-terminal distances traversed (7.21 km for 2014 and 2015) in comparison to the high (2.55 km and 0.66 km for 2014 and 2015) and low (6.47 km for 2014 and 2015) income groups. Although no major shocks were noted in 2014 and 2015, our results show that lower-middle-class income group customers exhibit relatively high mobility in comparison to customers in the low and high income groups. Additional work is needed to leverage the sensing capabilities of this data to provide insights on who is impacted by a shock, where, and by how much, to facilitate targeted responses.

Autoregressive models of the time series under volatility uncertainty and application to VaR model
Shige Peng, Shuzhen Yang
arXiv

Financial time series admit inherent uncertainty and randomness that change over time. To describe the volatility uncertainty of a time series, we assume that the volatility of risky assets lies between the minimum volatility and maximum volatility of the assets. This study establishes autoregressive models to determine the maximum and minimum volatilities, where the ratio of minimum to maximum volatility measures the volatility uncertainty. By utilizing the value-at-risk (VaR) predictor model under volatility uncertainty, we introduce risk and uncertainty, and show that the autoregressive model of volatility uncertainty is a powerful tool for predicting the VaR for a benchmark dataset.
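A stylized sketch of the volatility-bound idea: given a minimum and maximum volatility (here invented numbers; the paper estimates them with autoregressive models), a VaR interval follows, and the min/max ratio summarizes the uncertainty. Normality of one-period P&L is our simplifying assumption, not the paper's:

```python
from statistics import NormalDist

def var_normal(sigma, alpha=0.99, mu=0.0):
    """Value-at-Risk at level alpha for a normally distributed one-period
    P&L: the loss threshold exceeded with probability 1 - alpha."""
    return -(mu + sigma * NormalDist().inv_cdf(1 - alpha))

# Hypothetical volatility bounds produced by the two autoregressive models
sigma_min, sigma_max = 0.012, 0.031
uncertainty_ratio = sigma_min / sigma_max   # near 1 => little uncertainty
var_lo, var_hi = var_normal(sigma_min), var_normal(sigma_max)  # VaR interval
```

The interval [var_lo, var_hi] shows why a single-volatility VaR can understate risk whenever the ratio is well below one.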

Capital Structure and Financial Performance: Case Study from Pakistan Pharmaceutical Sector
Rehan, Muhammad; Karaca, Süleyman Serdar; Alvi, Jahanzaib
SSRN
The main purpose of this research is to examine the relationship between capital structure and the financial performance of the pharmaceutical companies listed on the Pakistan Stock Exchange. A further specific objective is to examine the relation of debt-equity with gross profit, earnings per share, return on capital, and return on equity. This research finds that capital structure is adversely linked with profitability, suggesting that a decrease in the profitability of the organizations is due to an increase in debt capital, and vice versa. Further, the results indicate that the effect is not dramatically significant: debt to equity is unrelated to ROE, so increasing or decreasing debt or equity financing would not materially affect ROE.

Cooperation in the Age of COVID-19: Evidence from Public Goods Games
Patrick Mellacher
arXiv

Does COVID-19 change the willingness to cooperate? Four Austrian university courses in economics play a public goods game in consecutive semesters on the e-learning platform Moodle: two of them in the year before the crisis, one immediately after the beginning of the first lockdown in March 2020 and the last one in the days before the announcement of the second lockdown in October 2020. Between 67% and 76% of the students choose to cooperate, i.e. contribute to the public good, in the pre-crisis year. Immediately after the imposition of the lockdown, 71% choose to cooperate. Seven months into the crisis, however, cooperation drops to 43%. Depending on whether two types of biases resulting from the experimental design are eliminated or not, probit and logit regressions show that this drop is statistically significant at the 0.05 or the 0.1 significance level.

Corporate Governance, Law, Culture, Environmental Performance and CSR Disclosure: A Global Perspective
Lu, Jing; Wang, Jun
SSRN
This paper investigates the impact of corporate governance and cultural background on firms' environmental performance and CSR disclosure from a global perspective. It also provides evidence of a positive relationship between performance and CSR disclosure, supporting the voluntary disclosure theory. We find that common internal corporate governance best practices (such as CEO non-duality, a board with an ESG committee, and a gender-diversified board) are associated with better environmental performance and more disclosure of CSR-related information. Debt is an effective internal governance vehicle and positively affects firms' environmental performance and CSR disclosure. Cross-listed firms perform better environmentally and disclose more CSR information. Firms residing in countries with stronger legal systems have less voluntary CSR disclosure, implying that external governance is functional and may partially serve as a substitute for internal governance. In terms of cultural influence, we find that firms in countries with low power distance, individualism, femininity, high uncertainty avoidance, and long-term orientation perform better environmentally. Firms in low power distance, collectivistic, feminine, long-term oriented, high uncertainty avoidance, and restrained countries disclose more CSR information.

Cournot-Nash equilibrium and optimal transport in a dynamic setting
Beatrice Acciaio, Julio Backhoff-Veraguas, Junchao Jia
arXiv

We consider a large population dynamic game in discrete time. The peculiarity of the game is that players are characterized by time-evolving types, and so reasonably their actions should not anticipate the future values of their types. When interactions between players are of mean-field kind, we relate Nash equilibria for such games to an asymptotic notion of dynamic Cournot-Nash equilibria. Inspired by the works of Blanchet and Carlier for the static situation, we interpret dynamic Cournot-Nash equilibria in the light of causal optimal transport theory. Further specializing to games of potential type, we establish existence, uniqueness and characterization of equilibria. Moreover we develop, for the first time, a numerical scheme for causal optimal transport, which is then leveraged in order to compute dynamic Cournot-Nash equilibria. This is illustrated in a detailed case study of a congestion game.

Determinants of Hedge Fund Investment in Corporate Endgames
Dobmeier, Ludwig; Lavrova, Renata; Schwetzler, Bernhard
SSRN
Under German law, the corporate endgame process of obtaining full control over a company offers multiple investment opportunities for investors with high investment flexibility, and is therefore particularly attractive to hedge funds. This paper investigates the determinants of hedge fund investment in corporate endgame processes based on a sample of 76 endgame situations of publicly listed German companies and investment data of 326 hedge funds. Examining characteristics of investment targets, we find that hedge funds invest in companies with a non-dominant majority owner and a high stake held by index funds, as the latter's inability to react in change-of-control situations creates a supportive investment environment for hedge funds. Hedge funds are most likely to invest after takeover consummation and before announcement of a new endgame transaction. Investigating the determinants of ongoing engagement after initial investment, we find that the presence of other institutional investors, especially hedge funds, positively affects engagement likelihood, serving as a validation of the fund's own investment approach. Abnormal performance and trading liquidity of the target stock also positively affect hedge funds' engagement. The results indicate that the endgame process in Germany is an attractive investment opportunity for hedge funds, while hedge fund involvement also adds complexity to the corporate control process.

Does the Market for Corporate Control Influence Executive Risk-Taking Incentives? Evidence From Takeover Vulnerability
Ongsakul, Viput; Chatjuthamard, Pattanaporn; Jiraporn, Napatsorn (Pom); Jiraporn, Pornsit
SSRN
Purpose: We investigate the role of the market for corporate control as an external governance mechanism and its effect on executive risk-taking incentives. Managers tend to be risk-averse as they are more exposed to idiosyncratic risk, resulting in sub-optimal risk-taking that does not maximize shareholders' wealth. The takeover market alleviates this problem, inducing managers to take more risk. Therefore, risk-taking incentives inside the firm are less powerful when the outside takeover market is more active. Design/Methodology: Exploiting a novel measure of takeover vulnerability recently constructed by Cain, McKeown, and Solomon (2017), we explore how takeover vulnerability influences executive risk-taking incentives. Using a large sample of U.S. firms, we employ fixed-effects regressions, propensity score matching, and instrumental variable analysis. Findings: Consistent with our hypothesis, a more active takeover market results in less powerful risk-taking incentives. Specifically, a rise in takeover vulnerability by one standard deviation diminishes executive risk-taking incentives by 22.39%, an economically meaningful magnitude. Originality/Value: Our study is the first to explore the effect of the takeover market on managerial risk-taking incentives, using a novel measure of takeover susceptibility. Our measure of takeover vulnerability is considerably less susceptible to endogeneity, enabling us to draw causal inferences with more confidence.

Emotions in Online Content Diffusion
Yifan Yu, Shan Huang, Yuchen Liu, Yong Tan
arXiv

Social media-transmitted online information, particularly content that is emotionally charged, shapes our thoughts and actions. In this study, we incorporate social network theories and analyses to investigate how emotions shape online content diffusion, using a computational approach. We rigorously quantify and characterize the structural properties of diffusion cascades, in which more than six million unique individuals transmitted 387,486 articles in a massive-scale online social network, WeChat. We detected the degree of eight discrete emotions (i.e., surprise, joy, anticipation, love, anxiety, sadness, anger, and disgust) embedded in these articles, using a newly generated domain-specific and up-to-date emotion lexicon. We found that articles with a higher degree of anxiety and love reached a larger number of individuals and diffused more deeply, broadly, and virally, whereas sadness had the opposite effect. Age and network degree of the individuals who transmitted an article and, in particular, the social ties between senders and receivers, significantly mediated how emotions affect article diffusion. These findings offer valuable insight into how emotions facilitate or hinder information spread through social networks and how people receive and transmit online content that induces various emotions.

Equity Default Clawback Swaps to Implement Venture Banking
Brian P. Hanley
arXiv

In this theoretical paper, I propose the creation of a venture bank, able to multiply the capital of a venture capital firm by at least 47 times, without requiring access to the Federal Reserve or other central bank apart from settlement. This concept rests on obtaining default swap instruments on loans in order to create the capital required, and expand Tier 1 and 2 base capital. Profitability depends on overall portfolio performance, availability of equity default swaps, cost of the default swap, and the multiple of original capital (MOC) adopted by the venture bank. A new derivative financial instrument, the equity default swap (EDS), is introduced to cover loans made as venture investments. An EDS is similar to a credit default swap (CDS) but with some unique features. The features and operation of these new derivative instruments are outlined along with audit requirements. This instrument would be traded on open-outcry exchanges with special features to ensure orderly operation of the market. It is the creation of public markets for EDSs that makes possible the use of public market pricing to indirectly provide a potential market capitalization for the underlying venture-bank investment. Full coverage insulates the venture-bank from losses in most situations, and multiplies profitability quite dramatically in all scenarios. Ten-year returns above 20X are attainable. Further, a new feature for EDS derivatives, a clawback lien, closes out the equity default swap. Here it is optimized at 77%, and is to be paid back to the underwriter at a future date to prevent a perverse incentive to deliberately fail. This new feature creates an Equity Default Clawback Swap (EDCS), which can be used safely. This proposal also solves an old problem in banking, because it matches the term of the loan with the term of the investment. I show that the venture-bank investment and the EDCS underwriting business are profitable.

Estimating the Welfare Effects of School Vouchers
Vishal Kamat, Samuel Norris
arXiv

We analyze the welfare effects of voucher provision in the DC Opportunity Scholarship Program (OSP), a school voucher program in Washington, DC, that randomly allocated a voucher to students. To do so, we develop new discrete choice tools to learn about the average willingness to pay for a voucher of a given amount and its average costs in a nonparametric model of school choice. Our tools exploit the insight that these welfare parameters can be expressed as functions of the underlying demand for the different schools. However, while the random allocation of the voucher reveals the value of demand at two prices, the parameters generally depend on their values beyond these prices. Our tools show how to sharply characterize what we can learn when demand is allowed to remain entirely nonparametric or to be parameterized in a flexible manner, both of which imply that the parameters are not necessarily point identified. Applying our tools to the OSP, we find that provision of the status-quo as well as a wide range of counterfactual voucher amounts has a positive net average benefit. These positive results arise due to the presence of many low-tuition schools in the program; removing these schools from the program can result in a negative net average benefit.

Financing Costs and the Efficiency of Public-Private Partnerships
Avdiu, Besart; Weichenrieder, Alfons J.
SSRN
The paper compares provision of public infrastructure via public-private partnerships (PPPs) with provision under government management. Due to soft budget constraints of government management, PPPs exert more effort and therefore have a cost advantage in building infrastructure. At the same time, hard budget constraints for PPPs introduce a bankruptcy risk and bankruptcy costs. Consequently, if bankruptcy costs are high, PPPs may be less efficient than public management, although this does not result from PPPs' higher interest costs.

Firm Valuation, Capital Access and Transparency Effects of Dividend Distributions: An Emerging Market Story
Le, Ben Van; Reddy, Nischala
SSRN
This paper investigates the impact of the relative ease of capital access and a firm's information transparency on the relationship between dividend cash payments and firm valuation, using panel data of Vietnamese firms. Several firms in Vietnam are state-owned enterprises (SOEs) that enjoy benefits, such as easier access to external capital at a lower cost, when compared to non-SOEs. We find that there is a significant and positive relationship between firm valuation and dividend cash payments. In addition, the same increase in dividend payment is associated with a larger increase in firm value for firms that have less access to external capital or lower information transparency than for firms with better access to capital or more information transparency.

Government as the First Investor in Biopharmaceutical Innovation: Evidence From New Drug Approvals 2010-2019
Cleary, Ekaterina; Jackson, Matthew J.; Ledley, Fred
SSRN
The discovery and development of new medicines classically involves a linear process of basic biomedical research to uncover potential targets for drug action, followed by applied, or translational, research to identify candidate products and establish their effectiveness and safety. This Working Paper describes the public sector contribution to that process by tracing funding from the National Institutes of Health (NIH) related to published research on each of the 356 new drugs approved by the U.S. Food and Drug Administration from 2010-2019 as well as research on their 219 biological targets. Specifically, we describe the timelines of clinical development for these products and proxy measures of their importance, including designations as first-in-class or expedited approvals. We model the maturation of basic research on the biological targets to determine the initiation and established points of this research and demonstrate that none of these products were approved before this enabling research passed the established point. This body of essential research comprised 2 million publications, of which 424 thousand were supported by 515 thousand Funding Years of NIH Project support totaling $195 billion. Research on the 356 drugs comprised 244 thousand publications, of which 39 thousand were supported by 64 thousand Funding Years of NIH Project support totaling $36 billion. Overall, NIH funding contributed to research associated with every new drug approved from 2010-2019, totaling $230 billion. This funding supported investigator-initiated Research Projects, Cooperative Agreements for government-led research on topics of particular importance, as well as Research Program Projects and Centers and training to support the research infrastructure. This NIH funding also produced 22 thousand patents, which provided marketing exclusivity for 27 (8.6%) of the drugs approved 2010-2019.
These data demonstrate the essential role of public sector-funded basic research in drug discovery and development, as well as the scale and character of this funding. They also demonstrate the limited mechanisms available for recognizing the value created by these early investments and ensuring appropriate public returns. This analysis demonstrates the importance of sustained public investment in basic biomedical science, as well as the need for policy innovations that fully realize the value of public sector investments in pharmaceutical innovation and ensure that these investments yield meaningful improvements in health.

Marginal Returns to Talent for Material Risk Takers in Banking
Stieglitz, Moritz; Wagner, Konstantin
SSRN
Economies of scale can explain compensation differentials over time, across firms of different size, different hierarchy-levels, and different industries. Consequently, the most talented individuals tend to match with the largest firms in industries where marginal returns to their talent are greatest. We explore a new dimension of this size-pay nexus by showing that marginal returns also differ across activities within firms and industries. Using hand-collected data on managers in European banks well below the level of executive directors, we find that the size-pay nexus is strongest for investment banking business units and for banks with a market-based business model. Thus, managerial compensation is most sensitive to size increases for activities that can easily be scaled up.

Masters of Illusion: Bank and Regulatory Accounting for Losses in Distressed Banks
Kane, Edward J.
SSRN
This essay is part of a larger work on the history of Federal Reserve policymaking entitled Banking on Bull. The study seeks to explain why the instruments of central banking inevitably break down over time. A big part of the explanation is that policymakers want accounting measures of bank net worth to be flexible enough to allow bankers and regulators to slow the release of adverse information about distressed banks, particularly very large ones. Modern regulatory frameworks focus on maintaining what can be described as the adequacy of accounting capital. But this framework is bull, because in tough times, bank accountants know how to make losses disappear.

Modality for Scenario Analysis and Maximum Likelihood Allocation
Takaaki Koike, Marius Hofert
arXiv

We study the variability of a risk from the statistical viewpoint of multimodality of the conditional loss distribution given that the aggregate loss equals an exogenously provided capital. This conditional distribution serves as a building block for calculating risk allocations such as the Euler capital allocation of Value-at-Risk. A superlevel set of this conditional distribution can be interpreted as a set of severe and plausible stress scenarios the given capital is supposed to cover. We show that various distributional properties of this conditional distribution, such as modality, dependence and tail behavior, are inherited from those of the underlying joint loss distribution. Among these properties, we find that modality of the conditional distribution is an important feature in risk assessment related to the variety of risky scenarios likely to occur in a stressed situation. Under unimodality, we introduce a novel risk allocation method called maximum likelihood allocation (MLA), defined as the mode of the conditional distribution given the total capital. Under multimodality, a single vector of allocations can be less sound. To overcome this issue, we investigate the so-called multimodality adjustment to increase the soundness of risk allocations. Properties of the conditional distribution, MLA and multimodality adjustment are demonstrated in numerical experiments. In particular, we observe that negative dependence among losses typically leads to multimodality, and thus a higher multimodality adjustment can be required.
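To illustrate the MLA idea in the simplest unimodal case: for two independent normal losses the conditional distribution given the aggregate is again normal, so its mode (the MLA) is available in closed form. The following sketch, with invented parameters and a brute-force conditioning step that is far cruder than the paper's methods, cross-checks the formula by simulation:

```python
import random
import statistics

def mla_two_normals(mu1, s1, mu2, s2, K):
    """Mode of (X1, X2) given X1 + X2 = K for independent normal losses:
    the conditional law of X1 is normal, so its mode equals its mean
    mu1 + s1^2 / (s1^2 + s2^2) * (K - mu1 - mu2)."""
    a1 = mu1 + s1 ** 2 / (s1 ** 2 + s2 ** 2) * (K - mu1 - mu2)
    return a1, K - a1

# Monte Carlo cross-check: keep simulated pairs whose sum lands near K
# and compare the empirical conditional mean (= mode, by normality).
random.seed(0)
K, eps, kept = 10.0, 0.05, []
for _ in range(200_000):
    x1, x2 = random.gauss(3.0, 1.0), random.gauss(5.0, 2.0)
    if abs(x1 + x2 - K) < eps:
        kept.append(x1)
a1, a2 = mla_two_normals(3.0, 1.0, 5.0, 2.0, K)  # closed form: (3.4, 6.6)
empirical = statistics.mean(kept)                # should be close to a1
```

Note that the allocations sum to the capital K by construction, which is the full-allocation property a capital allocation rule needs.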

Nonfinancial Resources Management: A Qualitative Study of Retention and Engagement in Nonprofit Community Fund Management
Edeigba, Jude; Singh, Deepica
SSRN
This study identifies the factors affecting volunteer retention and engagement in New Zealand. Little research has focused on volunteering for nonprofit financing organisations in New Zealand. These organisations are particularly involved in raising funds through public benevolence. The number of nonprofit organisations has been rising around the world while the number of volunteers decreases. Information on the factors influencing volunteer retention and engagement is expected to support the management of nonprofit community services. Therefore, this study uses a case study of a nonprofit organisation to identify the factors contributing to volunteer retention and engagement. The interview data are analysed using thematic analysis. Benightedness, communication, management support, volunteer skills, and volunteer participation in management decision-making are associated with volunteer retention and engagement. These findings are expected to enhance the operation processes of nonprofit Community Fund Management Organisations. Future research is suggested to enhance the management of volunteers in other types of nonprofit organisations.

On Simultaneous Long-Short Stock Trading Controllers with Cross-Coupling
Atul Deshpande, John A. Gubner, B. Ross Barmish
arXiv

The Simultaneous Long-Short (SLS) controller for trading a single stock is known to guarantee positive expected value of the resulting gain-loss function with respect to a large class of stock price dynamics. In the literature, this is known as the Robust Positive Expectation (RPE) property. An obvious way to extend this theory to the trading of two stocks is to trade each one of them using its own independent SLS controller. Motivated by the fact that such a scheme does not exploit any correlation between the two stocks, we study the case when the relative sign between the drifts of the two stocks is known. The main contributions of this paper are three-fold: First, we put forward a novel architecture in which we cross-couple two SLS controllers for the two-stock case. Second, we derive a closed-form expression for the expected value of the gain-loss function. Third, we use this closed-form expression to prove that the RPE property is guaranteed with respect to a large class of stock-price dynamics. When more information over and above the relative sign is assumed, additional benefits of the new architecture are seen. For example, when bounds or precise values for the means and covariances of the stock returns are included in the model, numerical simulations suggest that our new controller can achieve lower trading risk than a pair of decoupled SLS controllers for the same level of expected trading gain.
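The single-stock SLS scheme that this paper builds on can be illustrated with a small simulation (not the paper's cross-coupled architecture; discrete-time i.i.d. Gaussian returns and all parameter values are our invented assumptions). Each side's exposure is modulated by its own accumulated gain, and the average gain comes out positive whether the drift is up or down, which is the RPE property in miniature:

```python
import random

def sls_gain(returns, I0=1.0, K=2.0):
    """One-stock Simultaneous Long-Short controller: each side trades
    exposure I0 + K * (its own accumulated gain), so the winning side
    is scaled up and the losing side is scaled down."""
    gL = gS = 0.0
    for r in returns:
        gL += (I0 + K * gL) * r       # long side profits when r > 0
        gS += (I0 + K * gS) * (-r)    # short side profits when r < 0
    return gL + gS

def avg_gain(mu, n_paths=10_000, n_steps=50, sigma=0.02):
    total = 0.0
    for _ in range(n_paths):
        total += sls_gain([random.gauss(mu, sigma) for _ in range(n_steps)])
    return total / n_paths

random.seed(1)
up, down = avg_gain(0.005), avg_gain(-0.005)  # both positive on average
```

The decoupled two-stock extension the paper criticizes would simply run `sls_gain` once per stock; the paper's contribution is cross-coupling the two controllers to exploit the known relative drift sign.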

Precise asymptotics: robust stochastic volatility models
Peter K. Friz, Paul Gassiat, Paolo Pigato
arXiv

We present a new methodology to analyze large classes of (classical and rough) stochastic volatility models, with special regard to short-time and small noise formulae for option prices. Our main tool is the theory of regularity structures, which we use in the form of [Bayer et al; A regularity structure for rough volatility, 2017]. In essence, we implement a Laplace method on the space of models (in the sense of Hairer), which generalizes classical works of Azencott and Ben Arous on path space and then Aida, Inahama--Kawabi on rough path space. When applied to rough volatility models, e.g. in the setting of [Forde-Zhang, Asymptotics for rough stochastic volatility models, 2017], one obtains precise asymptotics for European options which refine known large deviation asymptotics.

Predicting Disaggregated CPI Inflation Components via Hierarchical Recurrent Neural Networks
Oren Barkan, Itamar Caspi, Allon Hammer, Noam Koenigstein
arXiv

We present a hierarchical architecture based on Recurrent Neural Networks (RNNs) for predicting disaggregated inflation components of the Consumer Price Index (CPI). While the majority of existing research is focused mainly on predicting the inflation headline, many economic and financial entities are more interested in its partial disaggregated components. To this end, we developed the novel Hierarchical Recurrent Neural Network (HRNN) model that utilizes information from higher levels in the CPI hierarchy to improve predictions at the more volatile lower levels. Our evaluations, based on a large dataset from the US CPI-U index, indicate that the HRNN model significantly outperforms a vast array of well-known inflation prediction baselines.

Pricing the Information Quantity in Artworks
Lan Ju, Zhiyong Tu, Changyong Xue
arXiv

In traditional art pricing models, the variables that capture a painting's content are often missing. Recent research has started to apply computer graphics techniques to extract information from the painting content. Most of this research concentrates on reading the color information from painting images and analyzes how different color compositions can affect the sales prices of paintings. This paper takes a different approach: it abstracts away from the interpretation of the content information and focuses only on measuring the quantity of information contained. We extend the concept of Shannon entropy in information theory to the painting scenario, and suggest using the variances of a painting's composing elements, i.e., line, color, value, shape/form, and space, to measure the amount of information in the painting. These measures are calculated at the pixel level based on a picture's digital image. We include them in the traditional hedonic regression model to test their significance based on auction samples from two famous artists (Picasso and Renoir). We find that all the variance measurements significantly explain the sales price at either the 1% or 5% level. The adjusted R-squared is also increased by more than ten percent. Our method greatly improves the traditional pricing models, and may also find applications in other areas such as art valuation and authentication.
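The paper's actual element definitions (line, shape/form, space) need real graphics machinery, but the pixel-level variance idea for the "value" and color elements can be sketched in a few lines. The function name, the (r, g, b) representation, and the toy images below are all our own illustrative assumptions:

```python
import statistics

def element_variances(pixels):
    """pixels: list of (r, g, b) tuples in [0, 255]. Returns illustrative
    per-element variance measures: brightness ('value') and each channel."""
    value = [(r + g + b) / 3 for r, g, b in pixels]
    channel = {c: statistics.pvariance([p[i] for p in pixels])
               for i, c in enumerate("rgb")}
    return {"value_var": statistics.pvariance(value), "channel_var": channel}

flat = [(128, 128, 128)] * 100                # uniform grey: zero variance
checker = [(0, 0, 0), (255, 255, 255)] * 50   # max contrast: large variance
```

The intuition matches the entropy framing: a flat grey canvas carries no variation, hence no "information", while a high-contrast image maximizes the value variance.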

Principal Component Analysis and Factor Analysis for Feature Selection in Credit Rating
Shenghuan Yang,Ionut Florescu
arXiv

A credit rating is an evaluation of the credit risk of a company that values its ability to pay back debt and predicts the likelihood of the debtor defaulting. Many features influence a credit rating, so it is important to select substantive features to explore the main reasons for credit rating changes. To address this issue, this paper exploits Principal Component Analysis and Factor Analysis as feature selection algorithms to select important features, group similar features together, and obtain a minimal set of features for four sectors: Financial, Energy, Health Care, and Consumer Discretionary. The paper uses two data sets, Financial Ratio and Balance Sheet, with two mappings, Detailed Mapping and Coarse Mapping, that convert the target variable (credit rating) into a categorical variable. To test the accuracy of credit rating prediction, a Random Forest classifier is trained and tested on the feature sets. The results show that the accuracy of the Financial Ratio feature sets is higher than that of the Balance Sheet feature sets. Moreover, Factor Analysis can significantly reduce the number of features while retaining almost the same accuracy, dramatically decreasing the time spent analyzing the data. In addition, using Factor Analysis we identify seven dominant factors in Financial Ratio and ten in Balance Sheet that affect credit rating changes, which better explains why credit ratings change.
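The PCA half of the pipeline can be sketched with numpy alone: standardize, eigendecompose the covariance, and keep the fewest components reaching a variance target. The synthetic "financial ratio" matrix and the 90% threshold below are illustrative assumptions, not the paper's data or settings:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical financial-ratio matrix: 200 firms x 12 correlated features
# driven by 3 latent factors plus noise.
latent = rng.normal(size=(200, 3))
loadings = rng.normal(size=(3, 12))
X = latent @ loadings + 0.1 * rng.normal(size=(200, 12))

def pca_reduce(X, var_target=0.90):
    """Project standardized features onto the fewest principal
    components explaining at least var_target of total variance."""
    Z = (X - X.mean(0)) / X.std(0)
    cov = np.cov(Z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)           # ascending order
    eigval, eigvec = eigval[::-1], eigvec[:, ::-1]  # descending
    ratio = np.cumsum(eigval) / eigval.sum()
    k = int(np.searchsorted(ratio, var_target)) + 1
    return Z @ eigvec[:, :k], k

scores, k = pca_reduce(X)
print(f"kept {k} of {X.shape[1]} features")
```

In the paper's pipeline the reduced scores would then be fed to a Random Forest classifier; the point of the sketch is only the dimensionality reduction step, which recovers roughly the latent factor count here.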

Recent Developments in Financing and Bank Lending to the Non-financial Private Sector
SSRN
The COVID-19 pandemic has significantly altered the financing of the non-financial private sector. Financing of the self-employed and businesses has risen as a consequence of both the increase in demand, stemming from greater liquidity needs and from the perceived increase in refinancing risks, and the expansion of supply, stimulated by the introduction of public guarantee programmes and by the European Central Bank's policies on the provision of liquidity to credit institutions. In contrast, new lending to individuals has fallen, largely as a consequence of the deterioration in the macroeconomic outlook, which has reduced the supply and demand for credit in this segment. The adverse impact of the COVID-19 crisis on the credit quality of deposit institutions' portfolios is currently being mitigated by the measures taken by the economic authorities and the institutions themselves (in particular, the public guarantee programme and legislative and banking sector moratoria). However, non-performing loans have increased since the start of the pandemic, both in the case of lending to non-financial corporations and to households. The non-performing loans ratio of deposit institutions has, however, held steady since March, as the expansion in lending (the denominator of the ratio) has offset the increase in the volume of non-performing loans (the numerator).

Sovereign Debt, Default Risk, and the Liquidity of Government Bonds
Chaumont, Gaston
SSRN
Secondary markets for sovereign bonds are illiquid because of trading frictions. I build a framework with endogenous illiquidity to study its implications for credit spreads and default risk. The model integrates directed search in secondary markets into a macro model of sovereign default. In equilibrium, investors face a state-dependent trade-off between transaction costs and trading probabilities that generates a time-varying liquidity premium. In the calibrated model, illiquidity increases with default risk and accounts for a sizable fraction of credit spreads - between 10% and 50%. Changes in trade flows significantly affect bond prices and amplify the volatility of credit spreads.

Tailwind and Headwind Bidding in German Takeover Offers - The Impact of Price Runups on Takeover Success
Schwetzler, Bernhard,Uhlenkamp, Lisa M.
SSRN
Pre-bid target share price runups increase the cost of takeovers and thus are seen as a detriment to the efficiency of the market for corporate control. This paper investigates the three-way relationship of pre-bid runups, offer premia and takeover success with a sample of 324 takeover offers for German publicly listed companies from 2006 to 2019. Germany is of particular interest as the minimum offer price required by takeover law is the 3-month average stock price of the target (VWAP). Combined with price runups or rundowns before the offer, this regulation creates "headwind" and "tailwind" environments for the bidder: in case of a runup, the stock price at the offer is higher than the VWAP, and thus a positive premium on the VWAP may not be sufficient to create a positive premium on the current stock price ("headwind"). If the pre-bid stock price decreases, the VWAP is higher than the current stock price and the legal minimum requirement automatically enforces a positive premium on the current stock price ("tailwind"). This asymmetry in setting the offer price is hypothesized to additionally reduce the chances of an offer being successful. Our results support this hypothesis. While we document a generally significant negative effect of runups on takeover success, we also show that the impact of the (VWAP) premium on takeover success is significantly higher in headwind than in tailwind environments. Our results further suggest that offer announcements in the German takeover market are preceded by significant price runups and that these are only partly substituted by lower markups.

Tempered stable distributions and finite variation Ornstein-Uhlenbeck processes
Nicola Cufaro Petroni,Piergiacomo Sabino
arXiv

Constructing Lévy-driven Ornstein-Uhlenbeck processes is a task closely related to the notion of self-decomposability. In particular, their transition laws are linked to the properties of what will be hereafter called the a-remainder of their self-decomposable stationary laws. In the present study we fully characterize the Lévy triplet of these a-remainders and we provide a general framework to deduce the transition laws of the finite variation Ornstein-Uhlenbeck processes associated with tempered stable distributions. We focus finally on the subclass of the exponentially-modulated tempered stable laws and derive algorithms for the exact generation of the skeleton of the Ornstein-Uhlenbeck processes related to such distributions, with the further advantage of a procedure computationally more efficient than those already available in the literature.
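To make the "exact skeleton generation" idea concrete, here is the classical gamma special case (a finite-variation OU process whose stationary law is Gamma), not the paper's tempered-stable algorithms: the a-remainder of a Gamma(alpha, rate beta) law has the well-known compound-Poisson representation used below, so the discrete-time skeleton can be sampled with no discretization error. Parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

def gamma_ou_skeleton(x0, lam, alpha, beta, dt, n_steps):
    """Exact skeleton of a Levy-driven, finite-variation OU process with
    Gamma(alpha, rate beta) stationary law.  The innovation is the
    a-remainder of the gamma law, simulated exactly as
        X_{k+1} = a X_k + sum_{i<=N} a^{U_i} E_i,   a = exp(-lam*dt),
    with N ~ Poisson(alpha*lam*dt), U_i ~ U(0,1), E_i ~ Exp(rate beta)."""
    a = np.exp(-lam * dt)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        n = rng.poisson(alpha * lam * dt)
        jumps = (a ** rng.random(n)) * rng.exponential(1.0 / beta, n)
        x[k + 1] = a * x[k] + jumps.sum()
    return x

path = gamma_ou_skeleton(x0=2.0, lam=1.0, alpha=2.0, beta=1.0,
                         dt=0.1, n_steps=10000)
print(f"sample mean {path[1000:].mean():.2f} (stationary mean {2.0 / 1.0})")
```

The path stays strictly positive (decay plus nonnegative jumps) and its long-run average converges to the stationary mean alpha/beta, which is a quick sanity check that the transition law is being sampled correctly.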

The Impact of Informational Content of Intangible Assets on the Firm Performance: Evidence from MENA Countries Stock Exchanges