Asymptotic Normality of Least Squares Estimators in Autoregressive Linear Regression Models
Author: B. B. van der Genugten
Publisher:
ISBN:
Category :
Languages : en
Pages : 27
Book Description
Partially Linear Models
Author: Wolfgang Härdle
Publisher: Springer Science & Business Media
ISBN: 3642577008
Category : Mathematics
Languages : en
Pages : 210
Book Description
In the last ten years, there has been increasing interest and activity in the general area of partially linear regression smoothing in statistics. Many methods and techniques have been proposed and studied. This monograph aims to give an up-to-date presentation of the state of the art of partially linear regression techniques. The emphasis is on methodologies rather than on theory, with a particular focus on applications of partially linear regression techniques to various statistical problems. These problems include least squares regression, asymptotically efficient estimation, bootstrap resampling, censored data analysis, linear measurement error models, nonlinear measurement error models, and nonlinear and nonparametric time series models.
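As a small illustration of the partially linear idea described above, the following sketch fits y = x*beta + g(t) + e by a Robinson-type double-residual method: smooth x and y on t, then regress residual on residual. The function names, bandwidth, and simulated model are mine, not from the book; this is a minimal sketch, not the monograph's methodology.

```python
import numpy as np

def nw_smooth(t, v, h=0.1):
    """Nadaraya-Watson kernel estimate of E[v | t] with a Gaussian kernel."""
    d = (t[:, None] - t[None, :]) / h
    w = np.exp(-0.5 * d**2)
    return (w @ v) / w.sum(axis=1)

def partially_linear_fit(x, t, y, h=0.1):
    """Double-residual estimate of beta in y = x*beta + g(t) + e."""
    x_res = x - nw_smooth(t, x, h)   # strip the nonparametric part from x
    y_res = y - nw_smooth(t, y, h)   # ... and from y
    return np.sum(x_res * y_res) / np.sum(x_res**2)  # OLS on residuals

rng = np.random.default_rng(0)
n = 500
t = rng.uniform(0, 1, n)
x = t + rng.normal(0, 1, n)                       # x correlated with t
y = 2.0 * x + np.sin(2 * np.pi * t) + rng.normal(0, 0.1, n)
beta_hat = partially_linear_fit(x, t, y)          # close to the true slope 2
```

Removing the conditional means of both x and y given t kills the unknown g(t), so an ordinary least squares fit on the residuals recovers beta.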
On the Rates of Convergence to Asymptotic Normality of Least Squares Estimators in Linear Regression Model with Autocorrelated Errors
Author: Subhash Chander Sharma
Publisher:
ISBN:
Category : Asymptotes
Languages : en
Pages : 236
Book Description
Asymptotic Properties of Some Estimators in Moving Average Models
Author: Stanford University. Department of Statistics
Publisher:
ISBN:
Category : Time-series analysis
Languages : en
Pages : 318
Book Description
The author considers estimation procedures for the moving average model of order q. Walker's method uses k sample autocovariances (k ≥ q). Assume that k depends on T in such a way that k tends to infinity as T tends to infinity. The estimates are consistent, asymptotically normal and asymptotically efficient if k = k(T) dominates log T and is dominated by T^(1/2). The approach in proving these theorems involves obtaining an explicit form for the components of the inverse of a symmetric matrix with equal elements along its five central diagonals and zeroes elsewhere. The asymptotic normality follows from a central limit theorem for normalized sums of random variables that are dependent of order k, where k tends to infinity with T. An alternative form of the estimator facilitates the calculations and the analysis of the role of k, without changing the asymptotic properties.
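The simplest member of the family of autocovariance-based estimators discussed above is the moment estimator for an MA(1) model, which inverts rho(1) = theta/(1 + theta^2). This is not Walker's k-autocovariance procedure itself, only a q = 1, k = 1 illustration under assumptions of my own choosing.

```python
import numpy as np

def sample_autocov(x, lag):
    """Biased sample autocovariance at the given lag."""
    x = x - x.mean()
    n = len(x)
    return np.dot(x[:n - lag], x[lag:]) / n

def ma1_moment_estimate(x):
    """Invert rho(1) = theta / (1 + theta^2) for the invertible root |theta| < 1."""
    r1 = sample_autocov(x, 1) / sample_autocov(x, 0)
    if abs(r1) >= 0.5:                 # the moment equation has no real root
        r1 = np.sign(r1) * 0.499
    return (1 - np.sqrt(1 - 4 * r1**2)) / (2 * r1)

rng = np.random.default_rng(1)
T = 20000
eps = rng.normal(size=T + 1)
x = eps[1:] + 0.5 * eps[:-1]           # MA(1) with theta = 0.5
theta_hat = ma1_moment_estimate(x)     # close to 0.5 for large T
```

Using more autocovariances (k growing with T, as in the abstract) is what restores asymptotic efficiency, which this single-lag estimator lacks.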
Asymptotic Normality of Minimum L1-Norm Estimates in Linear Models
Author: Z. D. Bai
Publisher:
ISBN:
Category :
Languages : en
Pages : 36
Book Description
This document considers a standard linear regression model with known design p-vectors, an unknown p-vector of regression coefficients, and a sequence of independent random errors, each having median zero. Keywords: Minimization problem; Least squares estimates.
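A minimum L1-norm (least absolute deviations) fit of the kind studied in this report can be computed by iteratively reweighted least squares. The sketch below is a generic illustration under a median-zero, heavy-tailed error assumption; the helper name and tuning constants are mine.

```python
import numpy as np

def lad_fit(X, y, n_iter=100, delta=1e-6):
    """Minimum L1-norm regression via iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS starting value
    for _ in range(n_iter):
        r = np.abs(y - X @ beta)
        w = 1.0 / np.maximum(r, delta)               # IRLS weights for the L1 loss
        Xw = X * w[:, None]
        beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)   # weighted normal equations
    return beta

rng = np.random.default_rng(2)
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])
err = rng.standard_t(df=2, size=n)                   # heavy tails, median zero
y = X @ np.array([1.0, 3.0]) + err
beta_hat = lad_fit(X, y)                             # near (1, 3)
```

Under median-zero errors the L1 estimator stays consistent even when the error variance is infinite, which is exactly the setting where least squares breaks down.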
Robust Methods and Asymptotic Theory in Nonlinear Econometrics
Author: H. J. Bierens
Publisher: Springer Science & Business Media
ISBN: 3642455298
Category : Mathematics
Languages : en
Pages : 211
Book Description
This Lecture Note deals with asymptotic properties, i.e. weak and strong consistency and asymptotic normality, of parameter estimators of nonlinear regression models and nonlinear structural equations under various assumptions on the distribution of the data. The estimation methods involved are nonlinear least squares estimation (NLLSE), nonlinear robust M-estimation (NLRME) and nonlinear weighted robust M-estimation (NLWRME) for the regression case, and nonlinear two-stage least squares estimation (NL2SLSE) and a new method called minimum information estimation (MIE) for the case of structural equations. The asymptotic properties of the NLLSE and the two robust M-estimation methods are derived from further elaborations of results of Jennrich. Special attention is paid to the comparison of the asymptotic efficiency of NLLSE and NLRME. It is shown that if the tails of the error distribution are fatter than those of the normal distribution, NLRME is more efficient than NLLSE. The NLWRME method is appropriate if the distributions of both the errors and the regressors have fat tails. This study also improves and extends the NL2SLSE theory of Amemiya. The method involved is a variant of the instrumental variables method, requiring at least as many instrumental variables as parameters to be estimated. The new MIE method requires fewer instrumental variables. Asymptotic normality can be derived by employing only one instrumental variable, and consistency can even be proved without using any instrumental variables at all.
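For concreteness, nonlinear least squares of the NLLSE type can be computed by Gauss-Newton iteration. The toy exponential model, starting value, and function name below are my own choices, a sketch of the generic technique rather than the book's framework.

```python
import numpy as np

def gauss_newton_exp(x, y, a0=0.1, n_iter=50):
    """Nonlinear least squares for y = exp(a*x) + e by Gauss-Newton."""
    a = a0
    for _ in range(n_iter):
        f = np.exp(a * x)
        r = y - f                       # current residuals
        J = x * f                       # derivative of exp(a*x) w.r.t. a
        a += (J @ r) / (J @ J)          # Gauss-Newton update
    return a

rng = np.random.default_rng(3)
n = 300
x = rng.uniform(0, 1, n)
y = np.exp(0.8 * x) + rng.normal(0, 0.05, n)
a_hat = gauss_newton_exp(x, y)          # near the true value 0.8
```

Replacing the squared-error objective by a bounded loss in the same iteration is the basic move behind the robust M-estimators (NLRME, NLWRME) that the book compares with NLLSE.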
Seemingly Unrelated Regression Equations Models
Author: Virendera K. Srivastava
Publisher: CRC Press
ISBN: 9780824776107
Category : Mathematics
Languages : en
Pages : 398
Book Description
The seemingly unrelated regression equations model; The least squares estimator and its variants; Approximate distribution theory for feasible generalized least squares estimators; Exact finite-sample properties of feasible generalized least squares estimators; Iterative estimators; Shrinkage estimators; Autoregressive disturbances; Heteroscedastic disturbances; Constrained error covariance structures; Prior information; Some miscellaneous topics.
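The feasible generalized least squares estimator that this contents list refers to can be sketched in a few lines: OLS per equation, estimate the cross-equation error covariance from the residuals, then one GLS step on the stacked system. The function name and the two-equation simulation are mine, assuming homoscedastic errors correlated only across equations.

```python
import numpy as np

def sur_fgls(X_list, y_list):
    """One-step feasible GLS for a SUR system."""
    n = len(y_list[0])
    resid = []
    for X, y in zip(X_list, y_list):                 # equation-by-equation OLS
        b = np.linalg.lstsq(X, y, rcond=None)[0]
        resid.append(y - X @ b)
    S = np.cov(np.array(resid), bias=True)           # cross-equation covariance
    # Stack the system block-diagonally: y_big = X_big @ beta + e
    X_big = np.zeros((len(X_list) * n, sum(X.shape[1] for X in X_list)))
    col = 0
    for i, X in enumerate(X_list):
        X_big[i * n:(i + 1) * n, col:col + X.shape[1]] = X
        col += X.shape[1]
    y_big = np.concatenate(y_list)
    Om_inv = np.kron(np.linalg.inv(S), np.eye(n))    # Var(e) = S kron I_n
    A = X_big.T @ Om_inv @ X_big
    return np.linalg.solve(A, X_big.T @ Om_inv @ y_big)

rng = np.random.default_rng(4)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
e = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=n)
y1 = 1.0 + 2.0 * x1 + e[:, 0]
y2 = -1.0 + 0.5 * x2 + e[:, 1]
beta = sur_fgls([np.column_stack([np.ones(n), x1]),
                 np.column_stack([np.ones(n), x2])],
                [y1, y2])    # approx (1, 2, -1, 0.5)
```

The efficiency gain over equation-by-equation OLS comes entirely from the off-diagonal blocks of S: with uncorrelated errors the GLS step reduces to OLS.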
Nonparametric Estimation of Autoregression and Multiple Regression Parameters
Author: Ronald Carlton Pruitt
Publisher:
ISBN:
Category :
Languages : en
Pages : 212
Book Description
Sign-based Methods in Linear Statistical Models
Author: M. V. Boldin
Publisher: American Mathematical Soc.
ISBN: 9780821897768
Category : Mathematics
Languages : en
Pages : 252
Book Description
For nonparametric statistics, the last half of this century was the time when rank-based methods originated, were vigorously developed, reached maturity, and received wide recognition. The rank-based approach in statistics consists in ranking the observed values and using only the ranks rather than the original numerical data. In fitting relationships to observed data, the ranks of residuals from the fitted dependence are used. The sign-based approach is based on the assumption that random errors take positive or negative values with equal probabilities. Under this assumption, the sign procedures are distribution-free. These procedures are robust to violations of model assumptions, for instance, to even a considerable number of gross errors in observations. In addition, sign procedures have fairly high relative asymptotic efficiency, in spite of the obvious loss of information incurred by the use of signs instead of the corresponding numerical values. In this work, sign-based methods in the framework of linear models are developed. The first part of the book treats linear and factor models involving independent observations; the second part considers linear models of time series, primarily autoregressive models.
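A sign-based slope estimate of the kind described above can be found by solving the sign-score equation S(b) = sum over i of x_i * sign(y_i - b*x_i) = 0; since S is nonincreasing in b, bisection suffices. The bracketing interval, function names, and Cauchy-error simulation are my own illustrative choices, not a procedure taken from the book.

```python
import numpy as np

def sign_slope(x, y, lo=-10.0, hi=10.0, tol=1e-8):
    """Sign-based slope estimate for y = b*x + e with median(e) = 0."""
    def S(b):
        # Sign score: each observation contributes only the sign of its residual.
        return np.sum(x * np.sign(y - b * x))
    while hi - lo > tol:                # S(b) is nonincreasing, so bisect
        mid = 0.5 * (lo + hi)
        if S(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(5)
n = 1000
x = rng.normal(size=n)
y = 1.5 * x + rng.standard_cauchy(size=n)   # errors prone to gross outliers
b_hat = sign_slope(x, y)                    # near 1.5 despite Cauchy noise
```

Because only the signs of residuals enter, arbitrarily large outliers move the estimate no more than small ones do, which is the robustness property the description emphasizes.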
Asymptotic Distribution of the Bias Corrected Least Squares Estimators in Measurement Error Linear Regression Models Under Long Memory
Author: Hira Koul
Publisher:
ISBN:
Category :
Languages : en
Pages : 0
Book Description
This article derives the consistency and asymptotic distribution of the bias corrected least squares estimators (LSEs) of the regression parameters in linear regression models when covariates have measurement error (ME) and the errors and covariates form mutually independent long memory (LM) moving average processes. In the structural ME linear regression model, the nature of the asymptotic distribution of suitably standardized bias corrected LSEs depends on the range of the LM parameters of the covariate, ME and regression error processes: the limiting distribution is Gaussian in one range of these parameters and non-Gaussian in the complementary range. In the Gaussian case, consistent estimators of the asymptotic variances of these estimators and a log()-consistent estimator of an underlying LM parameter are also provided; they are useful in the construction of large sample confidence intervals for the regression parameters. The article also discusses the asymptotic distribution of these estimators in some functional ME linear regression models, where the unobservable covariate is non-random. In these models, the limiting distribution of the bias corrected LSEs is always Gaussian, again determined by the range of the LM parameters.
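The bias correction itself is easy to see in the classical i.i.d. setting (not the long memory setting of the article): measurement error attenuates the naive LSE toward zero, and subtracting the known ME variance from the denominator removes the bias. All simulation choices below are mine.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5000
beta = 2.0
x = rng.normal(size=n)                 # true covariate (unobserved)
u = rng.normal(0, 0.5, size=n)         # measurement error, variance 0.25
w = x + u                              # observed covariate
y = beta * x + rng.normal(0, 0.3, n)

naive = (w @ y) / (w @ w)              # attenuated: plim = 2 * 1/(1 + 0.25) = 1.6
corrected = (w @ y) / (w @ w - n * 0.25)   # subtract known ME variance: plim = 2
```

Under long memory the same correction applies, but, as the abstract explains, the limit distribution of the corrected estimator need no longer be Gaussian.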