Author: Alain P. Chaboud
Publisher:
ISBN:
Category : Exchange rate pass-through
Languages : en
Pages : 58
Book Description
Using two newly available ultrahigh-frequency datasets, we investigate empirically how frequently one can sample certain foreign exchange and U.S. Treasury security returns without contaminating estimates of their integrated volatility with market microstructure noise. Using volatility signature plots and a recently proposed formal decision rule to select the sampling frequency, we find that one can sample FX returns as frequently as once every 15 to 20 seconds without contaminating volatility estimates; bond returns may be sampled as frequently as once every 2 to 3 minutes on days without U.S. macroeconomic announcements, and as frequently as once every 40 seconds on announcement days. With a simple realized kernel estimator, the sampling frequencies can be increased to once every 2 to 5 seconds for FX returns and to about once every 30 to 40 seconds for bond returns. These sampling frequencies, especially in the case of FX returns, are much higher than those often recommended in the empirical literature on realized volatility in equity markets. We suggest that the generally superior depth and liquidity of trading in FX and government bond markets contributes importantly to this difference.
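The volatility signature plot described in the abstract can be sketched as follows. This is a hypothetical illustration with simulated data, not the paper's datasets or decision rule: realized variance (the sum of squared returns) is computed at several sampling intervals, and with i.i.d. microstructure noise it is biased upward at the highest frequencies, flattening toward the integrated volatility as the interval lengthens. All parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 23400                     # one-second log-price observations (illustrative)
true_sigma = 0.01             # daily volatility of the efficient price
efficient = np.cumsum(rng.normal(0.0, true_sigma / np.sqrt(n), n))
noise = rng.normal(0.0, 2e-4, n)          # microstructure noise in the log price
observed = efficient + noise

def realized_variance(log_price, step):
    """Sum of squared returns sampled every `step` observations."""
    sampled = log_price[::step]
    returns = np.diff(sampled)
    return np.sum(returns ** 2)

# Plotting realized variance against the sampling interval gives the
# signature plot; the noise-induced bias shrinks as the interval grows.
for step in (1, 5, 15, 60, 300):
    rv = realized_variance(observed, step)
    print(f"sampling every {step:>3d} obs: RV = {rv:.6f}")
```

The decision rule in the paper amounts to picking the highest frequency at which this curve has flattened out.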
Frequency of Observation and the Estimation of Integrated Volatility in Deep and Liquid Financial Markets
Author: Alain Chaboud
Publisher:
ISBN:
Category : Bond market
Languages : en
Pages : 60
Book Description
Using two newly available ultrahigh-frequency datasets, we investigate empirically how frequently one can sample certain foreign exchange and U.S. Treasury security returns without contaminating estimates of their integrated volatility with market microstructure noise. We find that one can sample FX returns as frequently as once every 15 to 20 seconds without contaminating volatility estimates; bond returns may be sampled as frequently as once every 2 to 3 minutes on days without U.S. macroeconomic announcements, and as frequently as once every 40 seconds on announcement days. With a simple realized kernel estimator, the sampling frequencies can be increased to once every 2 to 5 seconds for FX returns and to about once every 30 to 40 seconds for bond returns. These sampling frequencies, especially in the case of FX returns, are much higher than those often recommended in the empirical literature on realized volatility in equity markets. The higher sampling frequencies for FX and bond returns likely reflect the superior depth and liquidity of these markets.
Econophysics Approaches to Large-Scale Business Data and Financial Crisis
Author: Misako Takayasu
Publisher: Springer Science & Business Media
ISBN: 4431538534
Category : Science
Languages : en
Pages : 320
Book Description
In recent years, as part of the increasing “informationization” of industry and the economy, enterprises have been accumulating vast amounts of detailed data such as high-frequency transaction data in financial markets and point-of-sale information on individual items in the retail sector. Similarly, vast amounts of data are now available on business networks based on interfirm transactions and shareholdings. In the past, these types of information were studied only by economists and management scholars. More recently, however, researchers from other fields, such as physics, mathematics, and information sciences, have become interested in this kind of data and, based on novel empirical approaches to searching for regularities and “laws” akin to those in the natural sciences, have produced intriguing results. This book is the proceedings of the international conference THIC-APFA7, titled “New Approaches to the Analysis of Large-Scale Business and Economic Data,” held in Tokyo, March 1–5, 2009. The letters THIC denote the Tokyo Tech (Tokyo Institute of Technology)–Hitotsubashi Interdisciplinary Conference. The conference series, titled APFA (Applications of Physics in Financial Analysis), focuses on the analysis of large-scale economic data. It has traditionally brought physicists and economists together to exchange viewpoints and experience (APFA1 in Dublin 1999, APFA2 in Liège 2000, APFA3 in London 2001, APFA4 in Warsaw 2003, APFA5 in Torino 2006, and APFA6 in Lisbon 2007). The aim of the conference is to establish fundamental analytical techniques and data collection methods, taking into account the results from a variety of academic disciplines.
International Finance Discussion Papers
Author:
Publisher:
ISBN:
Category : International finance
Languages : en
Pages : 68
Book Description
Testing for Cointegration Using the Johansen Methodology when Variables are Near-integrated
Author: Erik Hjalmarsson
Publisher:
ISBN:
Category : Econometric models
Languages : en
Pages : 28
Book Description
We investigate the properties of Johansen's (1988, 1991) maximum eigenvalue and trace tests for cointegration under the empirically relevant situation of near-integrated variables. Using Monte Carlo techniques, we show that in a system with near-integrated variables, the probability of reaching an erroneous conclusion regarding the cointegrating rank of the system is generally substantially higher than the nominal size. The risk of concluding that completely unrelated series are cointegrated is therefore non-negligible. The spurious rejection rate can be reduced by performing additional tests of restrictions on the cointegrating vector(s), although it is still substantially larger than the nominal size.
On the Application of Automatic Differentiation to the Likelihood Function for Dynamic General Equilibrium Models
Author: Houtan Bastani
Publisher:
ISBN:
Category : Simulation methods
Languages : en
Pages : 28
Book Description
A key application of automatic differentiation (AD) is to facilitate numerical optimization problems. Such problems are at the core of many estimation techniques, including maximum likelihood. As one of the first applications of AD in the field of economics, we used Tapenade to construct derivatives for the likelihood function of any linear or linearized general equilibrium model solved under the assumption of rational expectations. We view our main contribution as providing an important check on finite-difference (FD) numerical derivatives. We also construct Monte Carlo experiments to compare maximum-likelihood estimates obtained with and without the aid of automatic derivatives. We find that the convergence rate of our optimization algorithm can increase substantially when we use AD derivatives.
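The check the authors describe, AD derivatives against finite-difference (FD) ones, can be illustrated in miniature. This is a hypothetical sketch, not the paper's Tapenade-generated code: a minimal forward-mode AD via dual numbers, applied to a Gaussian log-likelihood in the mean parameter (sigma fixed at 1), with the data and step size chosen purely for illustration.

```python
class Dual:
    """Number of the form val + der * eps, with eps**2 == 0 (forward-mode AD)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.der - other.der)
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.der + self.der * other.val)

def loglik(mu, data):
    """Gaussian log-likelihood in mu with sigma = 1, up to an additive constant."""
    m = mu if isinstance(mu, Dual) else Dual(mu)
    total = Dual(0.0)
    for x in data:
        d = Dual(x) - m
        total = total - d * d * Dual(0.5)
    return total

data = [1.0, 2.0, 0.5]
mu = 1.0

# AD: seed der = 1.0 on mu and read the derivative off the result.
ad_grad = loglik(Dual(mu, 1.0), data).der   # analytic value is sum(x - mu) = 0.5

# FD: central difference with a small step, for comparison.
h = 1e-6
fd_grad = (loglik(mu + h, data).val - loglik(mu - h, data).val) / (2 * h)

print(f"AD gradient: {ad_grad:.10f}")
print(f"FD gradient: {fd_grad:.10f}")
```

The AD value is exact to machine precision, which is what makes it a useful check on the FD approximation; tools like Tapenade apply the same idea mechanically to much larger likelihood codes.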
DSGE Models and Central Banks
Author: Camilo Ernesto Tovar Mora
Publisher:
ISBN:
Category : Banks and banking, Central
Languages : en
Pages : 36
Book Description
Over the past 15 years there has been remarkable progress in the specification and estimation of dynamic stochastic general equilibrium (DSGE) models. Central banks in developed and emerging market economies have become increasingly interested in their usefulness for policy analysis and forecasting. This paper reviews some issues and challenges surrounding the use of these models at central banks. It recognises that they offer coherent frameworks for structuring policy discussions. Nonetheless, they are not ready to accomplish all that is being asked of them. First, they still need to incorporate relevant transmission mechanisms or sectors of the economy; second, issues remain on how to empirically validate them; and finally, challenges remain on how to effectively communicate their features and implications to policy makers and to the public. Overall, at their current stage DSGE models have important limitations. How much of a problem this is will depend on their specific use at central banks.
A Residual-based Cointegration Test for Near Unit Root Variables
Author: Erik Hjalmarsson
Publisher:
ISBN:
Category : Econometric models
Languages : en
Pages : 40
Book Description
Methods of inference based on a unit root assumption in the data are typically not robust to even small deviations from this assumption. In this paper, we propose robust procedures for a residual-based test of cointegration when the data are generated by a near unit root process. A Bonferroni method is used to address the uncertainty regarding the exact degree of persistence in the process. We thus provide a method for valid inference in multivariate near unit root processes where standard cointegration tests may be subject to substantial size distortions and standard OLS inference may lead to spurious results. Empirical illustrations are given by: (i) a re-examination of the Fisher hypothesis, and (ii) a test of the validity of the cointegrating relationship between aggregate consumption, asset holdings, and labor income, which has attracted a great deal of attention in the recent finance literature.
Estimating Hedge Fund Leverage
Author: Patrick McGuire
Publisher:
ISBN:
Category : Financial leverage
Languages : en
Pages : 48
Book Description
Do Differences in Financial Development Explain the Global Pattern of Current Account Imbalances?
Author: Joseph W. Gruber
Publisher:
ISBN:
Category : Accounts current
Languages : en
Pages : 78
Book Description
This paper addresses the popular view that differences in financial development explain the pattern of global current account imbalances. One strain of thinking explains the net flow of capital from developing to industrial economies on the basis of the industrial economies' more advanced financial systems and correspondingly more attractive assets. A related view addresses why the United States has attracted the lion's share of capital flows from developing to industrial economies; it stresses the exceptional depth, breadth, and safety of U.S. financial markets.
The Financial Turmoil of 2007-?
Author: C. E. V. Borio
Publisher:
ISBN:
Category : Banks and banking
Languages : en
Pages : 40
Book Description
The unfolding financial turmoil in mature economies has prompted the official and private sectors to reconsider policies, business models and risk management practices. Regardless of its future evolution, it already threatens to become one of the defining economic moments of the 21st century. This essay seeks to provide a preliminary assessment of the events and to draw some lessons for policies designed to strengthen the financial system on a long-term basis. It argues that the turmoil is best seen as a natural result of a prolonged period of generalised and aggressive risk-taking, which happened to have the subprime market at its epicentre. In other words, it represents the archetypal example of financial instability with potentially serious macroeconomic consequences that follows the build-up of financial imbalances in good times. The significant idiosyncratic elements, including the threat of an unprecedented involuntary "reintermediation" wave for banks and the dislocations associated with new credit risk transfer instruments, are arguably symptoms of more fundamental common causes. The policy response, while naturally taking into account the idiosyncratic weaknesses brought to light by the turmoil, should be firmly anchored to the more enduring factors that drive financial instability. This essay highlights possible mutually reinforcing steps in three areas: accounting, disclosure and risk management; the architecture of prudential regulation; and monetary policy.