Asymptotic properties of least squares estimators in regression models with forecast feedback
Author: Michael Mohr
Publisher:
ISBN:
Category :
Languages : de
Pages : 25
Book Description
Asymptotic Theory for Ordinary Least Squares Estimators in Regression Models with Forecast Feedback
Author: Michael Mohr
Publisher:
ISBN:
Category :
Languages : en
Pages : 107
Book Description
Asymptotic properties of least-squares estimators in semimartingale regression models
Author: Norbert Christopeit
Publisher:
ISBN:
Category :
Languages : de
Pages : 14
Book Description
Asymptotic Properties of Nonlinear Least Squares Estimates in Stochastic Regression Models
Author: Stanford University. Department of Statistics
Publisher:
ISBN:
Category :
Languages : en
Pages : 12
Book Description
Properties of Ordinary Least Squares Estimators in Regression Models with Non-spherical Disturbances
Author: Denzil G. Fiebig
Publisher:
ISBN:
Category : Least squares
Languages : en
Pages : 44
Book Description
Asymptotic Properties of Nonlinear Least Squares Estimators in a Replicated Time Series Model
Author: Jeremy Sin-hing Wu
Publisher:
ISBN:
Category :
Languages : en
Pages : 246
Book Description
Asymptotic Properties of Maximum Likelihood Estimators in the General Sampling Framework, and Some Results in Non-normal Linear Regression
Author: Robert Ernest Tarone
Publisher:
ISBN:
Category :
Languages : en
Pages : 190
Book Description
Some Properties of the Least Squares Estimator in Regression Analysis when the Independent Variables are Stochastic
Author: P. K. Bhattacharya (Mathematician)
Publisher:
ISBN:
Category : Matrices
Languages : en
Pages : 32
Book Description
For the linear regression of y on x observations, the loss incurred in estimating the true regression function by another function is taken as the loss function. Under this loss it is shown, under certain conditions, that if the class of estimates which are linear in the y's and have bounded risk is non-empty, then the estimate obtained by the method of least squares belongs to this class and has uniformly minimum risk within it. A necessary and sufficient condition on the distribution function of the x observations is obtained for this class to be non-empty; unfortunately, this condition is not easy to verify in particular cases and is violated in a very simple situation. However, by a sequential modification of the sampling scheme, the condition may always be satisfied at the cost of an arbitrarily small increase in the expected sample size. It is also shown, under certain further conditions on the family of admissible distributions, that the least squares estimator is minimax in the class of all estimators. (Author).
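As a concrete illustration of the setup in this abstract, here is a minimal sketch (plain Python; the function name `ols_line` and the toy data are illustrative assumptions, not taken from the book) of a least squares fit of y on x. The comment notes why the estimate is linear in the y's, which is exactly the class of estimates the abstract considers.

```python
def ols_line(x, y):
    """Least squares fit of y ~ a + b*x.

    The slope can be written b = sum(c_i * y_i) with weights
    c_i = (x_i - xbar) / Sxx that depend only on the x's, so the
    estimate is linear in the y's -- the class of estimates the
    abstract above restricts attention to.
    """
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)          # Sxx
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = ybar - b * xbar
    return a, b

# On noiseless data generated from y = 2 + 3*x the true line is recovered:
a, b = ols_line([1.0, 2.0, 3.0, 4.0], [5.0, 8.0, 11.0, 14.0])
# a = 2.0, b = 3.0
```

With stochastic x's, as in the abstract, the same formula applies; the risk results concern how this estimator behaves when the x observations are themselves random.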
Asymptotic Properties of the Ordinary Least Squares Estimator in Simultaneous Equations Models
Author: Virendra K. Srivastava
Publisher:
ISBN:
Category : Econometrics
Languages : en
Pages : 18
Book Description
Asymptotic Properties of Some Estimators in Moving Average Models
Author: Stanford University. Department of Statistics
Publisher:
ISBN:
Category : Time-series analysis
Languages : en
Pages : 318
Book Description
The author considers estimation procedures for the moving average model of order q. Walker's method uses k sample autocovariances (k >= q). Assume that k depends on T in such a way that k tends to infinity as T tends to infinity. The estimates are consistent, asymptotically normal and asymptotically efficient if k = k(T) dominates log T and is dominated by T^(1/2). The approach in proving these theorems involves obtaining an explicit form for the components of the inverse of a symmetric matrix with equal elements along its five central diagonals and zeros elsewhere. The asymptotic normality follows from a central limit theorem for normalized sums of random variables that are dependent of order k, where k tends to infinity with T. An alternative form of the estimator facilitates the calculations and the analysis of the role of k, without changing the asymptotic properties.
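To make the ingredients of this abstract concrete, here is a small sketch (plain Python; the function names are illustrative assumptions, and this is not Walker's full estimation procedure) of the sample autocovariances the method starts from, together with one admissible rate k(T) that dominates log T while being dominated by T^(1/2) for large T.

```python
import math

def sample_autocov(x, k):
    """Sample autocovariances c_0, ..., c_k of a series of length T."""
    T = len(x)
    m = sum(x) / T
    return [sum((x[t] - m) * (x[t + h] - m) for t in range(T - h)) / T
            for h in range(k + 1)]

def k_of_T(T):
    """One rate meeting the abstract's condition: (log T)^2 eventually
    grows faster than log T but slower than T**0.5."""
    return max(1, int(math.log(T) ** 2))

x = [1.0, -1.0, 1.0, -1.0]   # toy series with mean zero
c = sample_autocov(x, 1)     # c[0] = 1.0, c[1] = -0.75
```

Walker's method would then estimate the q moving average coefficients from these k autocovariances; the asymptotic results quoted above govern how fast k may grow with T.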