Likelihood Function:
The likelihood function is a fundamental concept in statistical inference. It indicates how likely a particular population is to produce an observed sample.
Let P(X; T) be the distribution of a random vector X, where T is the vector of parameters of the distribution. If Xo is the observed realization of the vector X (an outcome of an experiment), then the function L(T) = P(Xo; T) is called the likelihood function. In other words, you substitute the observed values for the random vector in the expression for the probability of the random vector, and consider the resulting expression as a function of the parameters T. The likelihood function varies from outcome to outcome of the same experiment, for example, from sample to sample.
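As a minimal sketch of this substitution step in Python (the sample of counts and the Poisson model here are hypothetical, chosen only for illustration): the observed data are held fixed, and the joint probability is treated as a function of the parameter T.

    import math

    # Hypothetical observed sample, modeled as i.i.d. Poisson(T) counts
    x_obs = [2, 0, 3, 1, 2]

    def likelihood(T):
        # P(X = x_obs; T), viewed as a function of the parameter T
        p = 1.0
        for x in x_obs:
            p *= math.exp(-T) * T**x / math.factorial(x)
        return p

    print(likelihood(1.0), likelihood(1.6), likelihood(3.0))
    # Largest near T = 1.6, the sample mean of the observed counts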
The likelihood function itself is not a probability (nor a density), because its argument is the parameter T of the distribution, not the random vector X itself. For example, the sum (or integral) of the likelihood function over all possible values of T need not equal 1.
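A quick numeric check makes this concrete. The following Python sketch (the binomial setup with N=10 trials and x=3 successes is a hypothetical choice for illustration) integrates a binomial likelihood over T in [0, 1]; the result is 1/(N+1), not 1.

    import math

    N, x = 10, 3  # hypothetical: 10 trials, 3 successes observed

    def likelihood(t):
        # Binomial probability of the fixed data, as a function of t
        return math.comb(N, x) * t**x * (1 - t)**(N - x)

    # Midpoint Riemann sum of L(t) over t in [0, 1]
    steps = 100_000
    integral = sum(likelihood((i + 0.5) / steps) for i in range(steps)) / steps
    print(integral)  # ~0.0909 = 1/(N + 1), not 1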
Even if the set of all possible values of the random vector X is discrete, the likelihood function may still be continuous (as long as the set of possible values of the parameters T is continuous).
Suppose you have a sample of 50 balls – 10 white and 40 black – drawn randomly from a large bag of black and white balls (the population). The question of interest is the proportion of white balls in the population. A binomial distribution Pb(X; N=50, p=T) is a reasonable statistical model for the number X of white balls in a sample of N=50 balls drawn from a population with proportion T of white balls. To obtain the likelihood function for your data, substitute the observation X=10 into the formula for the binomial distribution, and consider the expression as a function of T. (Note: the number of possible outcomes X is finite – 51 – but the likelihood function is still a function of the continuous parameter T, the proportion of white balls in the population.)
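As a minimal Python sketch (assuming only the numbers given above), the resulting likelihood can be evaluated at a few candidate proportions; it is largest near T = 0.2, the sample proportion of white balls.

    import math

    N, x = 50, 10  # 50 balls drawn, 10 of them white

    def likelihood(t):
        # Binomial probability of observing X = 10, as a function of t
        return math.comb(N, x) * t**x * (1 - t)**(N - x)

    for t in (0.1, 0.2, 0.3, 0.5):
        print(f"T = {t:.1f}  L(T) = {likelihood(t):.4f}")
    # L(T) peaks near T = 0.2, the sample proportion of white balls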
See also: Maximum Likelihood Estimator