On Nov 9, 2009, at 1:54 PM, Roberto Patuelli wrote:

Dear Daniel,

Thanks for your prompt reply.
Indeed, I was aware of the possibility of computing at mean(x) or taking the mean afterwards.
But what you suggest are marginal effects, right?

They might be called "marginal effects" by some.


Isn't that the effect on y of a 1-unit increase in x (which is not what I was interested in)?

Not exactly. The coefficient in a logistic regression is the increase in log-odds(y) for a one-unit increase in x. On the original y scale, the odds ratio for the event y=1 (versus y=0) between two situations differing by one unit of x is exp(coef(x)).
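A minimal sketch of that interpretation (assuming y and x as in Daniel's simulation below):

# exp() of a slope turns a log-odds coefficient into an odds ratio
# for a one-unit increase in that predictor
fit <- glm(y ~ x, family = binomial)
exp(coef(fit)["x"])   # odds ratio for y=1 per unit of x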

I'm concerned that this distinction may have escaped you since you stated that Poisson regression coefficients would have the same interpretation as OLS estimates.


I'm interested in the effect on y of a 1% increase in x (called percentage effects, right?).

You might attract more interest if you posed a specific question about a specific dataset, analyzed with methods you understand. The term "percentage effect" may be a domain-specific term with a particular interpretation, but it is not a familiar term to some of us readers.
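If by "percentage effect" you mean the elasticity of Pr(y=1) with respect to x, here is a minimal sketch (assuming reg and x as in Daniel's code below): under the logit link dP/dx = beta*p*(1-p), so the point elasticity is (dP/dx)*(x/P) = beta*x*(1-p).

# elasticity of Pr(y = 1) w.r.t. x in a logit model,
# averaged over the observations
p <- fitted(reg)
mean(coef(reg)["x"] * x * (1 - p))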


Could you please clarify?

Thanks
Roberto


----- Original Message -----
From: "Daniel Malter" <dan...@umd.edu>
To: "Patuelli Roberto" <roberto.patue...@usi.ch>; <r-help@r-project.org>
Sent: Monday, November 09, 2009 7:44 PM
Subject: AW: [R] Percentage effects in logistic regression


Somebody might have done this, but in fact it's not difficult to compute the
marginal effects yourself (which is the beauty of R). For a univariate
logistic regression, I illustrate two ways to compute them below (one
corresponds to the mfx command in Stata, the other to the margeff command).
With the first, you compute the marginal effect based on the mean fitted
value; with the second, you compute the marginal effect based on the fitted
value for each observation and then average over the individual marginal
effects. The second way is often considered better. You can easily extend
the R code below to a multivariate regression.

#####
#####Simulate data and run regression
#####

set.seed(343)
x <- rnorm(100, 0, 1)        # covariate (here also the linear predictor)
lp <- exp(x)/(1 + exp(x))    # Pr(y = 1) via the inverse logit
y <- rbinom(100, 1, lp)      # Bernoulli draws with probability lp

# Run logistic regression
reg <- glm(y ~ x, family = binomial)
summary(reg)

#####
#####Regression output
#####

Coefficients:
            Estimate Std. Error z value Pr(>|z|)
(Intercept)   0.1921     0.2175   0.883 0.377133
x             0.9442     0.2824   3.343 0.000829 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

(Dispersion parameter for binomial family taken to be 1)

  Null deviance: 138.47  on 99  degrees of freedom
Residual deviance: 125.01  on 98  degrees of freedom
AIC: 129.01

#####
#####Compute marginal effects
#####

# Logit marginal effect: dP/dx = beta*p*(1 - p)
# Way 1: evaluate at the mean fitted probability (cf. Stata's mfx)
mean(fitted(reg)) * mean(1 - fitted(reg)) * coef(reg)[2]

0.2356697
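The small difference from Stata's .2355241 below likely arises because mfx evaluates the derivative at the mean of x rather than at the mean fitted value; a sketch of that variant:

# evaluate dP/dx at the sample mean of x instead
p.bar <- plogis(sum(coef(reg) * c(1, mean(x))))
p.bar * (1 - p.bar) * coef(reg)[2]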

# Way 2: average the observation-level effects (cf. Stata's margeff)
mean(fitted(reg) * (1 - fitted(reg)) * coef(reg)[2])

0.2057041
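A sketch of the multivariate extension mentioned above, with a hypothetical second covariate z:

# hypothetical second covariate; average marginal effect per predictor
z <- rnorm(100)
reg2 <- glm(y ~ x + z, family = binomial)
p2 <- fitted(reg2)
sapply(coef(reg2)[-1], function(b) mean(p2 * (1 - p2) * b))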


#####
#####Check with Stata
#####

Logistic regression                               Number of obs   =        100
                                                  LR chi2(1)      =      13.46
                                                  Prob > chi2     =     0.0002
Log likelihood = -62.506426                       Pseudo R2       =     0.0972

------------------------------------------------------------------------------
           y |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           x |   .9441896   .2824403     3.34   0.001     .3906167    1.497762
       _cons |   .1920529   .2174531     0.88   0.377    -.2341474    .6182532
------------------------------------------------------------------------------

#####
#####Compute marginal effects in Stata
#####

#Way 1
Marginal effects after logit
      y  = Pr(y) (predict)
         =  .52354297
------------------------------------------------------------------------------
variable |      dy/dx    Std. Err.     z    P>|z|  [    95% C.I.   ]       X
---------+--------------------------------------------------------------------
       x |   .2355241      .07041    3.35   0.001   .097532  .373516  -.103593
------------------------------------------------------------------------------

#Way 2
Average marginal effects on Prob(y==1) after logit

------------------------------------------------------------------------------
           y |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           x |   .2057041   .0473328     4.35   0.000     .1129334    .2984747
------------------------------------------------------------------------------


HTH,
Daniel



-------------------------
cuncta stricte discussurus
-------------------------

-----Original Message-----
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On
Behalf Of Roberto Patuelli
Sent: Monday, November 09, 2009 12:04 PM
To: r-help@r-project.org
Subject: [R] Percentage effects in logistic regression

Dear ALL,

I'm trying to figure out what the percentage effects are in a logistic
regression. To be clearer: I'm not interested in the effect on y of a 1-unit
increase in x, but in the percentage effect on y of a 1% increase in x (in
economics this is also often called an "elasticity").
For example, if my independent variables are in logs, the betas can be
directly interpreted as percentage effects in both OLS and Poisson
regression. What about logistic regression?

Is there a package (maybe effects?) that can compute these automatically?

Thanks and best regards,
Roberto Patuelli



********************
Roberto Patuelli, Ph.D.
Istituto Ricerche Economiche (IRE) (Institute for Economic Research)
Università della Svizzera Italiana (University of Lugano)
via Maderno 24, CP 4361
CH-6904 Lugano
Switzerland
Phone: +41-(0)58-666-4166
Fax: +39-02-700419665


David Winsemius, MD
Heritage Laboratories
West Hartford, CT

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
