Asymptotic Theory for Ordinary Least Squares Estimators in Regression Models with Forecast Feedback

Author: Michael Mohr
Publisher:
ISBN:
Category:
Languages: German
Pages: 107

Book Description

Asymptotic properties of least squares estimators in regression models with forecast feedback

Author: Michael Mohr
Publisher:
ISBN:
Category:
Languages: German
Pages: 25

Book Description


Stochastic Approximation and Optimization of Random Systems

Author: L. Ljung
Publisher: Birkhäuser
ISBN: 3034886098
Category: Mathematics
Languages: English
Pages: 120

Book Description
The DMV seminar "Stochastische Approximation und Optimierung zufälliger Systeme" was held at Blaubeuren, 28 May to 4 June 1989. The goal was to give an approach to the theory and application of stochastic approximation in view of optimization problems, especially in engineering systems. These notes are based on the seminar lectures. They consist of three parts: I. Foundations of stochastic approximation (H. Walk); II. Applicational aspects of stochastic approximation (G. Pflug); III. Applications to adaptation algorithms (L. Ljung). The prerequisites for reading this book are basic knowledge in probability, mathematical statistics and optimization. We would like to thank Prof. M. Barner and Prof. G. Fischer for the organization of the seminar. We also thank the participants for their cooperation and our assistants and secretaries for typing the manuscript. November 1991, L. Ljung, G. Pflug, H. Walk.
Table of contents
I Foundations of stochastic approximation (H. Walk)
§1 Almost sure convergence of stochastic approximation procedures 2
§2 Recursive methods for linear problems 17
§3 Stochastic optimization under stochastic constraints 22
§4 A learning model; recursive density estimation 27
§5 Invariance principles in stochastic approximation 30
§6 On the theory of large deviations 43
References for Part I 45
II Applicational aspects of stochastic approximation (G. Pflug)
§7 Markovian stochastic optimization and stochastic approximation procedures 53
§8 Asymptotic distributions 71
§9 Stopping times 79
§10 Applications of stochastic approximation methods 80
References for Part II 90
III Applications to adaptation algorithms (L. Ljung)
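
As a rough illustration of the recursive procedures studied in Part I, here is a minimal Robbins-Monro-type stochastic approximation sketch in Python; the quadratic target, the unit-variance noise and the 1/n step sizes are illustrative assumptions, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_gradient(theta):
    # Noisy observation of the gradient of f(theta) = (theta - 2)^2 / 2;
    # the quadratic target and the unit-variance noise are illustrative choices.
    return (theta - 2.0) + rng.normal(scale=1.0)

theta = 0.0
for n in range(1, 10001):
    a_n = 1.0 / n                      # step sizes with sum a_n = inf, sum a_n^2 < inf
    theta -= a_n * noisy_gradient(theta)

print(theta)                           # settles near the root theta* = 2
```

The 1/n schedule is the textbook choice that makes almost sure convergence arguments go through; other decreasing schedules satisfying the same summability conditions would do as well.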

An Asymptotic Theory for Weighted Least Squares with Weights Estimated by Replication

Author: Raymond J. Carroll
Publisher:
ISBN:
Category:
Languages: English
Pages: 19

Book Description
This document considers a heteroscedastic linear regression model with replication. To estimate the variances, one can use either the sample variances or the sample average squared errors from a regression fit. The authors study the large-sample properties of the resulting weighted least squares estimates with estimated weights when the number of replicates is small. The estimates are generally inconsistent for asymmetrically distributed data. If sample variances based on m replicates are used, the weighted least squares estimates are inconsistent for m = 2 replicates even when the data are normally distributed. With between 3 and 5 replicates, the rates of convergence are slower than the usual square root of N. With m >= 6 replicates, the effect of estimating the weights is to increase variances by the factor (m-3)/(m-5), relative to weighted least squares estimates with known weights.
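
To make the setup concrete, here is a small Python sketch of weighted least squares with weights estimated from replicate sample variances, compared against the ideal weights; the straight-line model, the variance function and all variable names are illustrative assumptions, not the authors' design.

```python
import numpy as np

rng = np.random.default_rng(1)

n, m = 200, 6                            # n design points, m replicates each
x = rng.uniform(0.0, 1.0, size=n)
X = np.column_stack([np.ones(n), x])
beta = np.array([1.0, 2.0])
sigma = 0.5 + x                          # heteroscedastic standard deviations (assumed)

# m replicate responses per design point
Y = beta[0] + beta[1] * x[:, None] + rng.normal(scale=sigma[:, None], size=(n, m))

ybar = Y.mean(axis=1)
s2 = Y.var(axis=1, ddof=1)               # sample variances from the replicates

def wls(X, y, w):
    # Weighted least squares: solve (X' W X) beta = X' W y with W = diag(w)
    XtW = X.T * w
    return np.linalg.solve(XtW @ X, XtW @ y)

beta_est = wls(X, ybar, 1.0 / s2)            # weights estimated by replication
beta_known = wls(X, ybar, 1.0 / sigma**2)    # ideal weights, for comparison
print(beta_est, beta_known)
```

Repeating this over many simulated datasets and comparing the spread of the two estimators is one way to see the variance inflation the abstract describes.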

Properties of Ordinary Least Squares Estimators in Regression Models with Non-spherical Disturbances

Author: Denzil G. Fiebig
Publisher:
ISBN:
Category: Least squares
Languages: English
Pages: 44

Book Description


Asymptotic Normality of Least Squares Estimators in Autoregressive Linear Regression Models

Author: B. B. van der Genugten
Publisher:
ISBN:
Category:
Languages: English
Pages: 27

Book Description


Asymptotic Theory of Nonlinear Regression

Author: A. V. Ivanov
Publisher: Springer
ISBN:
Category: Language Arts & Disciplines
Languages: English
Pages: 344

Book Description
This book presents up-to-date mathematical results in the asymptotic theory of nonlinear regression, based on various asymptotic expansions of the Least Squares Estimator, its characteristics, and the distribution functions of functionals of the Least Squares Estimator. It is divided into four chapters. Chapter 1 establishes assertions on the probabilities of large deviations of the normed Least Squares Estimator of the regression function parameters. Chapter 2 gives conditions for the asymptotic normality of the Least Moduli Estimator; an asymptotic expansion of the Least Squares Estimator, as well as of its distribution function, is obtained, and the first two terms of these expansions are calculated. Separately, the Berry-Esseen inequality for the distribution of the Least Squares Estimator is deduced. The third chapter deals with asymptotic expansions related to functionals of the Least Squares Estimator. Lastly, Chapter 4 offers a comparison of the powers of statistical tests based on Least Squares Estimators. The Appendix gives an overview of subsidiary facts and a list of principal notations. Additional background information, grouped per chapter, is presented in the Commentary section. The volume concludes with an extensive Bibliography. Audience: This book will be of interest to mathematicians and statisticians whose work involves stochastic analysis, probability theory, mathematics of engineering, mathematical modelling, systems theory or cybernetics.
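
As a small illustration of the kind of nonlinear least squares problem treated asymptotically in the book, here is a Python sketch of a one-parameter Gauss-Newton fit with a plug-in asymptotic variance; the exponential-decay regression function, the starting value and the variance formula are illustrative assumptions, not drawn from the book.

```python
import numpy as np

rng = np.random.default_rng(5)

# Nonlinear regression y_i = g(x_i, theta) + e_i with g(x, theta) = exp(-theta * x);
# the model, noise level and starting value are illustrative choices.
n = 200
x = np.linspace(0.0, 5.0, n)
theta_true = 0.7
y = np.exp(-theta_true * x) + rng.normal(scale=0.05, size=n)

theta = 0.2                                       # starting value
for _ in range(50):
    g = np.exp(-theta * x)
    J = -x * g                                    # derivative of g with respect to theta
    theta += np.sum(J * (y - g)) / np.sum(J ** 2) # Gauss-Newton step

# Recompute at the final estimate and form the usual plug-in asymptotic variance
g = np.exp(-theta * x)
J = -x * g
sigma2_hat = np.mean((y - g) ** 2)
print(theta, sigma2_hat / np.sum(J ** 2))
```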

Asymptotic Theory for Weighted Least Squares Estimators in Aalen's Additive Risk Model

Author: Ian W. McKeague
Publisher:
ISBN:
Category:
Languages: English
Pages: 16

Book Description
Let h(t | Z_i) be the conditional hazard function for the survival time of individual i given the p-dimensional covariate process Z_i(t). This document develops inference for Aalen's additive risk model h(t | Z_i) = Z_i(t)' alpha(t), where alpha is a p-vector of unknown hazard functions. The theory of counting processes is used to obtain weak convergence results for weighted least squares estimators of the hazard functions and the cumulative hazard functions based on continuous data. Results for weighted least squares estimators based on grouped data are also described. Keywords: regression models, biostatistics.
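
For orientation, here is a minimal Python sketch of the least squares estimator of the cumulative regression functions in Aalen's additive model, using identity weights rather than the weighted estimators analyzed in the document; constant covariates, no censoring and the stopping rule are simplifying assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(2)

n, p = 300, 2
Z = np.column_stack([np.ones(n), rng.uniform(size=n)])  # intercept plus one covariate, held constant in time
true_alpha = np.array([0.5, 1.0])                       # constant hazard components (assumed for the demo)
T = rng.exponential(1.0 / (Z @ true_alpha))             # survival times; no censoring for simplicity

event_times = np.sort(T)
A_hat = np.zeros(p)                                     # cumulative regression functions A(t) at the last used event
last_t = 0.0
for t in event_times:
    at_risk = T >= t
    if at_risk.sum() < 10:        # stop while enough subjects remain at risk to keep the design well conditioned
        break
    Zr = Z[at_risk]
    dN = (T[at_risk] == t).astype(float)                # event indicator at time t among those at risk
    # Identity-weight least squares increment: dA(t) = (Zr' Zr)^{-1} Zr' dN(t)
    A_hat += np.linalg.solve(Zr.T @ Zr, Zr.T @ dN)
    last_t = t

print(A_hat / last_t)   # rough average slopes; near true_alpha in this constant-hazard setup
```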

Asymptotic Theory of the Least Squares Estimators of Sinusoidal Signal

Author:
Publisher:
ISBN:
Category:
Languages: English
Pages: 16

Book Description
The consistency and the asymptotic normality of the least squares estimators are derived for the sinusoidal model under the assumption of stationary random errors. It is observed that the model does not satisfy the standard sufficient conditions of Jennrich (1969), Wu (1981) or Kundu (1991). Recently the consistency and the asymptotic normality were derived for the sinusoidal signal under the assumption of normal errors (Kundu, 1993) and under the assumption of independent and identically distributed errors in Kundu and Mitra (1996). This paper generalizes those results. Hannan (1971) also considered a similar kind of model and established the result, after taking the Fourier transform of the data, for the one-parameter model. We establish the result without taking the Fourier transform of the data, and we give an explicit expression for the asymptotic distribution in the multiparameter case, which is not available in the literature. Our approach is different from Hannan's. We carry out a small simulation study to examine the small-sample properties of the two types of estimators.
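
As a concrete reference point, here is a small Python sketch of least squares estimation of a single sinusoid, profiling out the linear amplitudes and grid-searching the frequency; the parameter values, grid and noise model are illustrative assumptions, not the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 500
t = np.arange(n)
A0, B0, w0 = 2.0, 1.0, 0.3                       # true amplitudes and frequency (illustrative)
y = A0 * np.cos(w0 * t) + B0 * np.sin(w0 * t) + rng.normal(scale=1.0, size=n)

def rss(w):
    # For a fixed frequency w, the amplitudes enter linearly, so profile them out by linear LS.
    X = np.column_stack([np.cos(w * t), np.sin(w * t)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ coef) ** 2), coef

# Crude grid search over the frequency; a finer local search would follow in practice.
grid = np.linspace(0.01, np.pi - 0.01, 2000)
vals = [rss(w)[0] for w in grid]
w_hat = grid[int(np.argmin(vals))]
_, (A_hat, B_hat) = rss(w_hat)
print(w_hat, A_hat, B_hat)
```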

Asymptotic Distribution of the Bias Corrected Least Squares Estimators in Measurement Error Linear Regression Models Under Long Memory

Author: Hira Koul
Publisher:
ISBN:
Category:
Languages: English
Pages: 0

Book Description
This article derives the consistency and asymptotic distribution of the bias corrected least squares estimators (LSEs) of the regression parameters in linear regression models when the covariates are measured with error (ME) and the errors and covariates form mutually independent long memory (LM) moving average processes. In the structural ME linear regression model, the nature of the asymptotic distribution of the suitably standardized bias corrected LSEs depends on the range of values of the LM parameters of the covariate, ME and regression error processes. This limiting distribution is Gaussian over part of that range and non-Gaussian over the rest. In the Gaussian case, consistent estimators of the asymptotic variances of these estimators and a log(n)-consistent estimator of an underlying LM parameter are also provided; they are useful in the construction of large sample confidence intervals for the regression parameters. The article also discusses the asymptotic distribution of these estimators in some functional ME linear regression models, where the unobservable covariate is non-random. In these models, the limiting distribution of the bias corrected LSEs is always Gaussian, determined by the range of values of the LM parameters.
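
For intuition about the bias correction itself, here is a short Python sketch of the standard corrected estimator (W'W - n*Sigma_u)^{-1} W'y in a structural measurement error model with known error variance; it uses i.i.d. errors rather than the long memory processes studied in the article, and all parameter values and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

n = 5000
beta = np.array([1.0, 2.0])
x = rng.normal(size=n)                                 # latent covariate (i.i.d. here, not long memory)
sigma_u2 = 0.25                                        # known measurement-error variance (assumed)
w = x + rng.normal(scale=np.sqrt(sigma_u2), size=n)    # observed, error-prone covariate
y = beta[0] + beta[1] * x + rng.normal(scale=0.5, size=n)

W = np.column_stack([np.ones(n), w])
Sigma_u = np.diag([0.0, sigma_u2])                     # no error in the intercept column

beta_naive = np.linalg.solve(W.T @ W, W.T @ y)               # ordinary LSE on the error-prone covariate
beta_bc = np.linalg.solve(W.T @ W - n * Sigma_u, W.T @ y)    # bias corrected LSE
print(beta_naive, beta_bc)   # naive slope is attenuated; the corrected slope is near 2
```

The correction simply removes the expected contribution of the measurement error from the design cross-products; the article's contribution concerns how the limiting distribution of this corrected estimator changes under long memory, which this i.i.d. sketch does not reproduce.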