I am not so sure what we should understand the "real interest rate" to
mean nowadays. It is normally thought of as the "average"
inflation-adjusted cost of credit.

 

However, the cost of credit differs greatly depending on what type of
credit we are talking about, so while an "average" interest rate
adjusted for the CPI might be an indicator, it says little about how
much interest most ordinary people actually have to pay. The other
aspect is that the official US inflation rate, currently put at circa
1.8%, may understate the true inflation rate.

 

The "benchmark" US real interest rate is near-zero, but US consumer
interest rates range up to 5% or so, and on credit cards you can pay
up to about 12% in the US. So presumably, if you wanted a "really
real" interest rate, you would have to calculate a weighted average
cost across consumer credit and investment credit, factoring in the
structure of outstanding debts by type. Just because some bankers can
get cheap credit, it doesn't mean that everybody else can.
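To make the idea concrete, here is a minimal sketch of such a
debt-weighted calculation. The balances and rates below are purely
illustrative placeholders, not actual US data; only the 1.8% CPI
figure comes from the text above.

```python
# Sketch of a debt-weighted "really real" interest rate.
# All balances and rates are hypothetical illustrations, NOT data.

# credit type -> (outstanding balance in $bn, nominal interest rate)
debt_by_type = {
    "mortgages":    (10_000, 0.045),  # hypothetical
    "credit_cards": (900,    0.12),   # hypothetical
    "auto_loans":   (1_200,  0.05),   # hypothetical
    "interbank":    (2_000,  0.002),  # hypothetical near-zero benchmark rate
}

cpi_inflation = 0.018  # the official rate cited above

total_debt = sum(bal for bal, _ in debt_by_type.values())
weighted_nominal = sum(bal * rate for bal, rate in debt_by_type.values()) / total_debt
weighted_real = weighted_nominal - cpi_inflation  # simple approximation

print(f"Debt-weighted nominal rate: {weighted_nominal:.2%}")
print(f"Debt-weighted real rate:    {weighted_real:.2%}")
```

With these made-up weights, the debt-weighted rate comes out well
above the near-zero benchmark, which is the point: the structure of
who owes what, at which rate, drives the "really real" figure.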

 

No doubt Jim is right when he reports that saving behavior responds
very little to changes in real interest rates. It would be
interesting to investigate how heavy household indebtedness affects
the Keynesian "multiplier effect". My hunch is that the more people
are in debt, the lower the multiplier effect of additional sales will
be, in part because if people get extra income, they will very likely
use it first to pay off debt.
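The hunch can be put in textbook terms: the simple Keynesian
multiplier is k = 1/(1 - MPC), so anything that diverts marginal
income away from consumption (such as debt repayment) lowers the
effective MPC and shrinks k. The MPC and the debt-repayment share
below are assumed round numbers for illustration only.

```python
# Simple Keynesian multiplier k = 1/(1 - MPC). If a share of each
# extra dollar goes to debt repayment instead of consumption, the
# effective MPC falls and the multiplier shrinks.
# Both parameters below are illustrative assumptions, not data.

def multiplier(mpc: float) -> float:
    """Keynesian expenditure multiplier for a given marginal propensity to consume."""
    return 1.0 / (1.0 - mpc)

base_mpc = 0.8               # assumed MPC with little household debt
debt_repayment_share = 0.25  # assumed share of extra income used to pay down debt

indebted_mpc = base_mpc * (1 - debt_repayment_share)  # effective MPC = 0.6

print(round(multiplier(base_mpc), 2))      # -> 5.0
print(round(multiplier(indebted_mpc), 2))  # -> 2.5
```

So under these assumptions, a quarter of marginal income going to
debt service halves the multiplier, which is the direction my hunch
points in.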

 

J.

_______________________________________________
pen-l mailing list
[email protected]
https://lists.csuchico.edu/mailman/listinfo/pen-l
