Asymptotic Property
Asymptotic Property: An asymptotic property is a property of an estimator that holds as the sample size approaches infinity.
Asymptotic Relative Efficiency (of estimators): Unbiased estimators are usually compared in terms of their variances. The limit (as the sample size tends to infinity) of the ratio of the variance of the first estimator to the variance of the second estimator is called the asymptotic efficiency of the second estimator...
Asymptotically Unbiased Estimator: An asymptotically unbiased estimator is an estimator that is unbiased as the sample size tends to infinity. Some biased estimators are asymptotically unbiased, and all unbiased estimators are asymptotically unbiased.
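As an illustration (not part of the glossary entry), the maximum-likelihood variance estimator that divides by n rather than n-1 is biased, but its bias vanishes as n grows, so it is asymptotically unbiased. Its exact expectation, (n-1)/n times the true variance, makes the bias easy to compute directly:

```python
# Illustrative sketch: bias of the 1/n variance estimator.
# E[estimate] = (n - 1) / n * sigma^2, so bias = -sigma^2 / n -> 0 as n grows.

def mle_variance_bias(n: int, sigma2: float = 1.0) -> float:
    """Exact bias (E[estimate] - true value) of the 1/n variance estimator."""
    return (n - 1) / n * sigma2 - sigma2

for n in (10, 100, 10_000):
    print(n, mle_variance_bias(n))  # bias shrinks toward 0
```

The estimator is biased at every finite n (the bias is always negative), yet asymptotically unbiased because the bias tends to zero.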
Biased Estimator: An estimator is a biased estimator if its expected value is not equal to the value of the population parameter being estimated.
Coefficient of Determination: In regression analysis, the coefficient of determination is a measure of goodness-of-fit (i.e. how well or tightly the data fit the estimated model). The coefficient is defined as the ratio of two sums of squares: r² = SSR / SST, where SSR is the sum of...
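A minimal sketch of the ratio r² = SSR / SST for fitted values from a least-squares fit with an intercept (the setting in which this ratio equals the coefficient of determination). The function name and inputs are illustrative, not from the glossary:

```python
# Illustrative: r^2 = SSR / SST for observed values y and fitted values y_hat.
from statistics import mean

def r_squared(y, y_hat):
    y_bar = mean(y)
    ssr = sum((f - y_bar) ** 2 for f in y_hat)  # regression sum of squares
    sst = sum((v - y_bar) ** 2 for v in y)      # total sum of squares
    return ssr / sst

print(r_squared([1, 2, 3], [1, 2, 3]))  # perfect fit: 1.0
print(r_squared([1, 2, 3], [2, 2, 2]))  # constant prediction: 0.0
```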
Confidence Interval: A confidence interval is an interval, bracketing a sample estimate, that quantifies the uncertainty around that estimate. Since there are a variety of samples that might be drawn from a population, there are likewise a variety of confidence intervals that might be imagined for a given population parameter...
Consistent Estimator: An estimator is a measure or metric intended to be calculated from a sample drawn from a larger population. A consistent estimator is an estimator with the property that the probability of the estimated value and the true value of the population parameter not lying within c units...
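A small simulation (illustrative, not from the glossary) of the classic consistent estimator, the sample mean: as the number of uniform(0, 1) draws grows, the estimate concentrates around the true mean 0.5.

```python
# Illustrative: the sample mean of uniform(0, 1) draws is consistent for 0.5.
import random
from statistics import mean

random.seed(0)  # fixed seed so the demonstration is reproducible

def sample_mean_error(n: int) -> float:
    """Absolute error of the sample mean of n uniform(0, 1) draws."""
    draws = [random.random() for _ in range(n)]
    return abs(mean(draws) - 0.5)

for n in (10, 1_000, 100_000):
    print(n, sample_mean_error(n))  # the error tends to shrink with n
```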
Cox Regression: See Proportional hazard model.
Cramer-Rao Inequality: Every unbiased estimator has a variance greater than or equal to a lower bound called the Cramer-Rao lower bound. If the variance of an unbiased estimator achieves the Cramer-Rao lower bound, then that estimator is a minimum variance unbiased estimator, or, simply,...
Efficiency: For an unbiased estimator, efficiency indicates how close its precision comes to the theoretical limit of precision given by the Cramer-Rao inequality. A measure of efficiency is the ratio of the theoretically minimal variance to the actual variance of the estimator. This measure falls between 0 and 1...
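A worked illustration (assumed example, not from the glossary): for normal data the Cramer-Rao bound for estimating the mean is sigma²/n, which the sample mean attains (efficiency 1). The sample median's large-sample variance is pi·sigma²/(2n), giving efficiency 2/pi, about 0.64:

```python
# Illustrative: efficiency = minimal variance / actual variance.
import math

def efficiency(min_variance: float, actual_variance: float) -> float:
    """Ratio of the Cramer-Rao minimal variance to an estimator's variance."""
    return min_variance / actual_variance

sigma2, n = 1.0, 100
cr_bound = sigma2 / n                        # Cramer-Rao bound for the mean
median_var = math.pi * sigma2 / (2 * n)      # large-sample variance of median

print(efficiency(cr_bound, cr_bound))    # sample mean: 1.0
print(efficiency(cr_bound, median_var))  # sample median: 2/pi, about 0.637
```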
Estimation: Estimation is deriving a guess about the actual value of a population parameter (or parameters) from a sample drawn from this population. See also Estimator.
Estimator: A statistic, measure, or model, applied to a sample, intended to estimate some parameter of the population that the sample came from.
Forward Selection: Forward selection is one of several computer-based iterative variable-selection procedures. It resembles step-wise regression except that a variable added to the model is not permitted to be removed in the subsequent steps. See also Backward elimination.
Kaplan-Meier Estimator: The Kaplan-Meier estimator is aimed at estimation of the survival function from censored life-time data. The value of the survival function between successive distinct uncensored observations is taken as constant, and the graph of the Kaplan-Meier estimate of the survival function is a series of horizontal steps of...
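A minimal Kaplan-Meier sketch, using only the standard library; the input format (a list of `(time, observed)` pairs, with `observed=False` meaning censored) is an assumption for illustration. At each distinct event time the survival estimate is multiplied by the fraction of at-risk subjects who survive:

```python
# Illustrative Kaplan-Meier estimator. Returns the (time, S(t)) step points
# of the estimated survival function; censored times reduce the risk set
# without producing a step.

def kaplan_meier(data):
    data = sorted(data)           # sort by time
    n_at_risk = len(data)
    survival = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for time, obs in data if time == t and obs)
        removed = sum(1 for time, obs in data if time == t)
        if deaths:
            survival *= (n_at_risk - deaths) / n_at_risk
            steps.append((t, survival))
        n_at_risk -= removed
        i += removed              # entries at time t are contiguous
    return steps

# One death at t=1 (S=3/4), censoring at t=2, deaths at t=3 and t=4.
print(kaplan_meier([(1, True), (2, False), (3, True), (4, True)]))
```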
Least Squares Method: In a narrow sense, the Least Squares Method is a technique for fitting a straight line through a set of points in such a way that the sum of the squared vertical distances from the observed points to the fitted line is minimized. In a wider sense,...
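The narrow-sense method above has a closed form; a hedged stdlib sketch (illustrative names) computes the slope and intercept that minimize the sum of squared vertical residuals:

```python
# Illustrative: closed-form least-squares slope and intercept.
from statistics import mean

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing sum of squared vertical residuals."""
    x_bar, y_bar = mean(xs), mean(ys)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    intercept = y_bar - slope * x_bar
    return slope, intercept

print(fit_line([0, 1, 2], [1, 3, 5]))  # points on y = 2x + 1
```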
Line of Regression: The line of regression is the line that best fits the data in simple linear regression, i.e. the line that corresponds to the "best-fit" parameters (slope and intercept) of the regression equation.
Linear Regression: Linear regression is aimed at finding the "best-fit" linear relationship between the dependent variable and independent variable(s). See also: Regression analysis, Simple linear regression, Multiple regression
Logistic Regression: Logistic regression is used with binary data when you want to model the probability that a specified outcome will occur. Specifically, it is aimed at estimating parameters a and b in the following model: Li = log(pi / (1 - pi)) = a + b xi, where pi is...
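A hedged sketch of estimating a and b in the model above by plain gradient ascent on the log-likelihood (one of several possible fitting methods; the learning rate and step count are illustrative assumptions):

```python
# Illustrative: fit log(p/(1-p)) = a + b*x to binary data by gradient ascent.
import math

def fit_logistic(xs, ys, steps=5000, lr=0.5):
    """Return (a, b) maximizing the Bernoulli log-likelihood."""
    a = b = 0.0
    n = len(xs)
    for _ in range(steps):
        grad_a = grad_b = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))  # modeled probability
            grad_a += (y - p) / n                     # d log-lik / d a
            grad_b += (y - p) * x / n                 # d log-lik / d b
        a += lr * grad_a
        b += lr * grad_b
    return a, b

# Outcome becomes more likely as x grows, so the fitted b should be positive.
a, b = fit_logistic([-2, -1, 0, 1, 2], [0, 0, 0, 1, 1])
print(a, b)
```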
Loglinear regression: Loglinear regression is a kind of regression aimed at finding the best fit between the data and a loglinear model. The major assumption of loglinear regression is that a linear relationship exists between the log of the dependent variable and the independent variables.
Margin of Error: A margin of error typically refers to a range within which an unknown parameter is estimated to fall, given the variation that can arise from one sample to another. For example, in an opinion survey based on a randomly drawn sample from a larger population, results are usually...
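As a concrete illustration (assuming the usual normal approximation for a sample proportion, which the glossary entry does not spell out), the 95% margin of error for a proportion is z·sqrt(p(1-p)/n) with z about 1.96:

```python
# Illustrative: 95% margin of error for a sample proportion,
# assuming the normal approximation holds.
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """z * sqrt(p_hat * (1 - p_hat) / n)."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

print(margin_of_error(0.5, 1000))  # about 0.031, the familiar "plus or minus 3 points"
```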