On 2 Jul 2005, at 06:01, Spencer Graves wrote:

The issue is not 30 observations but whether it is possible to
perfectly separate the two possible outcomes. Consider the following:

tst.glm <- data.frame(x=1:3, y=c(0, 1, 0))
glm(y~x, family=binomial, data=tst.glm)
tst2.glm <- data.frame(x=1:1000,
                       y=rep(0:1,
On 02-Jul-05 Kerry Bush wrote:
I have a very simple problem. When using glm to fit a
binary logistic regression model, sometimes I receive
the following warning:
Warning messages:
1: fitted probabilities numerically 0 or 1 occurred
in: glm.fit(x = X, y = Y, weights = weights, start = start,
I agree with Ted: in model-fitting terms, it is a
resounding success! With any data set having at least one point with a
binomial yield of 0 or 100%, you can get this phenomenon by adding a
series of random variables sequentially to a model. Eventually, you will
add enough variables to separate the outcomes perfectly, and the
warning will appear.
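The point above can be checked numerically. The sketch below is in Python rather than the thread's R, and the data are made up for illustration: once you have as many generic continuous predictors as observations, you can always solve for coefficients that give every observation whatever linear score you like, so the outcomes are perfectly separated and the fitted probabilities can be driven to 0 and 1 -- exactly the situation glm is warning about.

```python
# Illustration (not from the thread): n random predictors for n observations
# almost surely let a logistic model separate any 0/1 outcome perfectly.
import math
import random

random.seed(1)

n = 8                          # observations
y = [0, 1, 0, 1, 1, 0, 1, 0]  # arbitrary binary outcomes
# One random continuous predictor per column -- the "random variables
# added sequentially" from the text, here added all at once.
X = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n)]

def solve(A, rhs):
    """Gaussian elimination with partial pivoting for a square system."""
    m = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, m):
            f = M[r][col] / M[col][col]
            for c in range(col, m + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = M[r][m] - sum(M[r][c] * x[c] for c in range(r + 1, m))
        x[r] = s / M[r][r]
    return x

# Ask for linear score +1 wherever y=1 and -1 wherever y=0.  A random
# square X is invertible with probability 1, so a solution exists:
# the data are perfectly separated.
beta = solve(X, [1.0 if yi else -1.0 for yi in y])

# Scale the coefficients up and the fitted probabilities go numerically
# to 0 or 1 -- the warning from glm.fit.
probs = [1 / (1 + math.exp(-20 * sum(b * x for b, x in zip(beta, row))))
         for row in X]
```

Nothing here depends on the random draw: any continuous predictors in "general position" work, which is why adding enough junk variables eventually triggers the warning.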
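Spencer's point about separation can also be seen directly from the likelihood. The sketch below is in Python rather than R (pure gradient ascent standing in for glm's IWLS, with toy data of my own choosing): on his non-separable tst.glm-style data the fit converges to finite coefficients, while on perfectly separated data the slope never stops growing and the fitted probabilities are pushed to 0 and 1, which is when glm emits the warning.

```python
# Why perfect separation breaks the logistic MLE: the log-likelihood keeps
# improving as the slope grows, so the "optimum" is at infinity.
import math

def fit_logistic(xs, ys, steps, lr=0.01):
    """Plain gradient ascent on the Bernoulli log-likelihood."""
    a = b = 0.0  # intercept, slope
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(a + b * x)))
            ga += y - p
            gb += (y - p) * x
        a += lr * ga
        b += lr * gb
    return a, b

# Analogue of tst.glm: y = 0,1,0 cannot be split by any threshold on x,
# so the fit settles down to finite coefficients (slope ~0 by symmetry).
a0, b0 = fit_logistic([1, 2, 3], [0, 1, 0], steps=50_000)

# Perfectly separated data: every x with y=0 lies below every x with y=1.
xs, ys = [1, 2, 3, 4], [0, 0, 1, 1]
_, b_short = fit_logistic(xs, ys, steps=5_000)
a_long, b_long = fit_logistic(xs, ys, steps=50_000)
# b_long > b_short: the slope is still climbing after ten times the work,
# and the fitted probability at x=4 is already numerically near 1.
```

The same divergence is what makes glm stop with large, unstable coefficients and huge standard errors on separated data, even though the "fit" is, as noted above, a resounding success.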