Asymptotic Relative Efficiency (of estimators)

Asymptotic Relative Efficiency (of estimators): Unbiased estimators are usually compared in terms of their variances. The limit (as the sample size tends to infinity) of the ratio of the variance of the first estimator to the variance of the second estimator is called the asymptotic relative efficiency of the second estimator relative to the first.

Asymptotically Unbiased Estimator

Asymptotically Unbiased Estimator: An asymptotically unbiased estimator is an estimator whose bias tends to zero as the sample size tends to infinity. Some biased estimators are asymptotically unbiased, and every unbiased estimator is trivially asymptotically unbiased.

Biased Estimator

Biased Estimator: An estimator is biased if its expected value is not equal to the value of the population parameter being estimated.
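As a concrete illustration (a minimal simulation sketch, not part of the original glossary; the function names are made up here): the sample variance computed with divisor n is biased downward, with expected value ((n-1)/n)·σ², while the n-1 divisor removes the bias. The n-divisor version is nevertheless asymptotically unbiased, since (n-1)/n tends to 1 as n grows.

```python
import random

def variance_mle(xs):
    """Variance estimate with divisor n: biased, but asymptotically unbiased."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def variance_unbiased(xs):
    """Variance estimate with divisor n-1: unbiased at any sample size."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Average each estimator over many samples from N(0, 1), where sigma^2 = 1.
random.seed(0)
def mean_estimate(estimator, n, reps=20000):
    return sum(estimator([random.gauss(0, 1) for _ in range(n)])
               for _ in range(reps)) / reps

print(mean_estimate(variance_mle, 5))    # near (5-1)/5 = 0.8: visibly biased
print(mean_estimate(variance_mle, 100))  # near 0.99: bias has nearly vanished
```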

Coefficient of Determination

Coefficient of Determination: In regression analysis, the coefficient of determination is a measure of goodness-of-fit (i.e. how well or tightly the data fit the estimated model). The coefficient is defined as the ratio of two sums of squares: r2 = SSR/SST, where SSR is the sum of squares due to regression and SST is the total sum of squares.
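A minimal sketch of the computation, assuming the fitted values come from a least-squares regression with an intercept (so that SSR/SST lies between 0 and 1); the helper name r_squared is hypothetical:

```python
def r_squared(y, y_hat):
    """Coefficient of determination r2 = SSR / SST for fitted values y_hat."""
    y_bar = sum(y) / len(y)
    sst = sum((yi - y_bar) ** 2 for yi in y)      # total sum of squares
    ssr = sum((fi - y_bar) ** 2 for fi in y_hat)  # regression sum of squares
    return ssr / sst

# A perfect fit gives r2 = 1.0:
print(r_squared([1, 2, 3], [1, 2, 3]))  # 1.0
```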

Confidence Interval

Confidence Interval: A confidence interval is an interval around a sample estimate that quantifies the uncertainty in that estimate. Since many different samples might be drawn from a population, there are likewise many different confidence intervals that might be computed for a given population parameter; a 95% confidence procedure produces intervals that cover the true parameter value in 95% of samples.

Consistent Estimator

Consistent Estimator: An estimator is a measure or metric intended to be calculated from a sample drawn from a larger population. A consistent estimator is one with the property that, for any positive c, the probability that the estimated value and the true value of the population parameter differ by more than c units tends to zero as the sample size tends to infinity.

Cramér-Rao Inequality

Cramér-Rao Inequality: Every unbiased estimator has a variance greater than or equal to a lower bound called the Cramér-Rao lower bound. If the variance of an unbiased estimator achieves the Cramér-Rao lower bound, then that estimator is a minimum variance unbiased estimator, or, simply, an efficient estimator.
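Under standard regularity conditions, the bound for an unbiased estimator of a scalar parameter can be written as:

```latex
\operatorname{Var}(\hat\theta) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) = -\,\mathbb{E}\!\left[\frac{\partial^2}{\partial\theta^2}\,\ln f(X;\theta)\right],
```

where I(θ) is the Fisher information of the sample; for n independent, identically distributed observations it equals n times the information in a single observation.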

Efficiency

Efficiency: For an unbiased estimator, efficiency indicates how close the estimator's variance comes to the theoretical limit given by the Cramér-Rao inequality. A measure of efficiency is the ratio of the theoretically minimal variance to the actual variance of the estimator. This measure falls between 0 and 1; an estimator whose efficiency equals 1 is said to be efficient.

Estimation

Estimation: Estimation is the process of inferring the value of a population parameter (or parameters) from a sample drawn from that population. See also Estimator.

Estimator

Estimator: A statistic, measure, or model applied to a sample, intended to estimate the corresponding parameter of the population from which the sample was drawn.

Forward Selection

Forward Selection: Forward selection is one of several computer-based iterative variable-selection procedures. It resembles stepwise regression except that a variable added to the model is not permitted to be removed in subsequent steps. See also Backward elimination.
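A bare-bones sketch of the idea (added for illustration; the function name is made up, and the residual-sum-of-squares criterion is only one common choice — statistical packages often use F-tests or p-value thresholds instead):

```python
import numpy as np

def forward_select(X, y, n_keep):
    """Greedy forward selection: at each step, add the column of X that most
    reduces the residual sum of squares; chosen columns are never removed."""
    chosen = []
    remaining = list(range(X.shape[1]))

    def rss(cols):
        # Fit an intercept plus the candidate columns by least squares.
        A = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        return float(((y - A @ beta) ** 2).sum())

    while remaining and len(chosen) < n_keep:
        best = min(remaining, key=lambda c: rss(chosen + [c]))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

Because a variable is never dropped once added, the path of models is nested, unlike full stepwise regression.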

Kaplan-Meier Estimator

Kaplan-Meier Estimator: The Kaplan-Meier estimator is aimed at estimation of the survival function from censored life-time data. The value of the survival function between successive distinct uncensored observations is taken as constant, so the graph of the Kaplan-Meier estimate of the survival function is a series of horizontal steps of declining magnitude.
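A hand-rolled sketch of the computation (added here for illustration; the function name is hypothetical, and production code would use a survival-analysis library). At each distinct uncensored time t the survival estimate is multiplied by (n_t - d_t)/n_t, where n_t is the number at risk just before t and d_t the number of events at t; censored observations simply leave the risk set:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function.
    times:  observation times; events: 1 if the event occurred, 0 if censored.
    Returns (t, S(t)) pairs at each distinct uncensored time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    steps = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths, n_t = 0, at_risk
        # Process every observation tied at time t.
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            at_risk -= 1
            i += 1
        if deaths:  # censored-only times do not create a step
            s *= (n_t - deaths) / n_t
            steps.append((t, s))
    return steps

# Event at t=1, censored at t=2, event at t=3:
print(kaplan_meier([1, 2, 3], [1, 0, 1]))
```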

Least Squares Method

Least Squares Method: In a narrow sense, the Least Squares Method is a technique for fitting a straight line through a set of points in such a way that the sum of the squared vertical distances from the observed points to the fitted line is minimized. In a wider sense, the term refers to any model-fitting approach that minimizes a sum of squared differences between observed and fitted values.
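For the straight-line case the minimizing slope and intercept have a closed form, sketched below (an illustration added here; the helper name is made up):

```python
def fit_line(xs, ys):
    """Least-squares slope b and intercept a for the line y = a + b*x,
    minimizing the sum of squared vertical distances to the line."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # b = Sxy / Sxx, the ratio of centered cross-products to centered squares.
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx  # the fitted line passes through (mean x, mean y)
    return a, b

print(fit_line([0, 1, 2], [1, 3, 5]))  # exact fit: intercept 1.0, slope 2.0
```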

Line of Regression

Line of Regression: The line of regression is the line that best fits the data in simple linear regression, i.e. the line that corresponds to the "best-fit" parameters (slope and intercept) of the regression equation.

Linear Regression

Linear Regression: Linear regression is aimed at finding the "best-fit" linear relationship between the dependent variable and independent variable(s). See also: Regression analysis, Simple linear regression, Multiple regression.

Logistic Regression

Logistic Regression: Logistic regression is used with binary data when you want to model the probability that a specified outcome will occur. Specifically, it is aimed at estimating parameters a and b in the following model: L_i = log(p_i / (1 - p_i)) = a + b*x_i, where p_i is the probability that the outcome occurs for the i-th value of the independent variable x_i.
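The parameters a and b are usually found by maximizing the log-likelihood. The sketch below (an illustration added here, with made-up names) does so by plain gradient ascent; real software typically uses iteratively reweighted least squares instead:

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Estimate a and b in log(p/(1-p)) = a + b*x by gradient ascent
    on the log-likelihood of binary outcomes ys (0 or 1)."""
    a = b = 0.0
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(a + b * x)))  # modeled P(outcome | x)
            ga += y - p        # gradient of log-likelihood w.r.t. a
            gb += (y - p) * x  # gradient of log-likelihood w.r.t. b
        a += lr * ga / len(xs)
        b += lr * gb / len(xs)
    return a, b

# Outcomes become more likely as x grows, so the fitted slope b is positive:
a_hat, b_hat = fit_logistic([0, 1, 2, 3, 4, 5], [0, 0, 1, 0, 1, 1])
print(a_hat, b_hat)
```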

Loglinear regression

Loglinear regression: Loglinear regression is a kind of regression aimed at finding the best fit between the data and a loglinear model. The major assumption of loglinear regression is that a linear relationship exists between the log of the dependent variable and the independent variables.

Margin of Error

Margin of Error: A margin of error typically refers to a range within which an unknown parameter is estimated to fall, given the variation that can arise from one sample to another. For example, in an opinion survey based on a randomly drawn sample from a larger population, results are usually reported with a margin of error such as "plus or minus 3 percentage points."
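For a sample proportion, a common approximate formula at roughly 95% confidence uses the normal approximation; the sketch below is an illustration added here (function name made up):

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Approximate margin of error for a sample proportion p_hat from a
    sample of size n, at ~95% confidence (normal approximation)."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# A poll of 1000 people with 50% support:
print(round(margin_of_error(0.5, 1000), 3))  # 0.031, i.e. about +/- 3 points
```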
