Thank you for the quick response! I think you are on the right track, but is
there any way of "calling" (is that the word for it?) the function price_call
in the mapply, so that this price_call function is changed to handle
vectors? I believe that this should, in theory if it is correct, make the
r
Dear R-users
I am trying to "vectorize" a function so that it can handle two vectors of
inputs. I want the function to use phi (a function), k[1] and t[1] for the
first price, and so on for the second, third, and fourth prices. I tried to
use mapply, but I don't know how to specify to R what input
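A sketch of the usual mapply pattern for this, with a placeholder phi and a dummy price_call standing in for the real pricer (only the calling pattern is the point here — the real function would of course compute an option price):

```r
# Placeholder characteristic function and a dummy scalar pricer; both
# stand in for the poster's real phi and price_call.
phi <- function(u) 1 / (1 + u^2)

price_call <- function(phi, k, t) {
  # takes ONE strike and ONE maturity, plus the function phi
  phi(k) * sqrt(t)
}

k <- c(90, 95, 100, 105)        # strikes
t <- c(0.25, 0.50, 0.75, 1.00)  # maturities

# mapply walks k and t in parallel (k[1] with t[1], k[2] with t[2], ...);
# MoreArgs passes phi unchanged into every call
prices <- mapply(price_call, k = k, t = t, MoreArgs = list(phi = phi))
```

This leaves price_call itself scalar and lets mapply do the pairing, which is usually easier than rewriting the pricer to accept vectors.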
Hi
I have created the following plot of the empirical returns. What I now
want to do is to overlay a curve/line with the normal density as a
comparison of the two. Does anyone know how to do this?
(NB the last two lines are the problem, and are wrong, I know).
Thank you in advance!
Rikke
http
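For the overlay itself, the usual recipe is to draw the histogram on the density scale (freq = FALSE) and then add the fitted normal with curve(..., add = TRUE). A minimal sketch with placeholder returns standing in for the real data:

```r
# 'returns' is simulated here only so the sketch is self-contained
set.seed(1)
returns <- rnorm(500, mean = 0.001, sd = 0.02)

# freq = FALSE puts the histogram on the density scale, so the normal
# density curve is directly comparable
hist(returns, breaks = 30, freq = FALSE,
     main = "Empirical returns vs. fitted normal density")
curve(dnorm(x, mean = mean(returns), sd = sd(returns)),
      add = TRUE, lwd = 2)
```

The common mistake is leaving the histogram on the count scale, in which case the density curve looks flat along the x-axis.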
Dear R-users.
I am faced with a problem I don't know how to solve.
I need to calibrate the Heston stochastic volatility model, and have (to my
own belief) created code for calculating the prices of options under this
model. However, when I calibrate the model using nlminb I also evaluate my
initial
I think I have found my problem, but I don't know how to correct it. I have
found an old post saying that it might be a problem if the starting values
are evaluated at Inf (see link here:
http://r.789695.n4.nabble.com/Help-about-nlminb-function-td3089048.html)
But how can I run nlminb without the st
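One common workaround is to wrap the objective so nlminb never receives Inf or NaN: return a large finite penalty instead, and/or supply lower/upper bounds that keep the parameters in the valid region. A toy sketch (raw_obj stands in for the real sum-of-squares objective):

```r
# Stand-in objective: blows up (NaN) for non-positive p[1], mimicking a
# model that is undefined for part of the parameter space
raw_obj <- function(p) {
  (log(p[1]) - 1)^2 + (p[2] - 2)^2
}

safe_obj <- function(p) {
  val <- raw_obj(p)
  if (!is.finite(val)) return(1e10)  # large finite penalty, not Inf/NaN
  val
}

# bounds additionally keep log() away from invalid arguments
fit <- nlminb(start = c(0.5, 0), objective = safe_obj,
              lower = c(1e-8, -Inf))
```

With a finite penalty the optimizer can recover from a bad region instead of stalling at the starting values.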
Thank you for your help, even though there was such an obvious mistake; I'm
sorry for that.
I have now tried to incorporate your suggested solution, but just as last
time (the other post that you referred to), I get the values of the initial
parameters when I run nlminb.
I have changed the code a bi
Dear R-users
I need to calibrate kappa, rho, eta, theta, v0 in the following code, see
below. However when I run it, I get:
y <- function(kappahat, rhohat, etahat, thetahat, v0hat) {
  sum(difference(k, t, S0, X, r, implvol, q,
                 kappahat, rhohat, etahat, thetahat, v0hat)^2)
}
> nlminb(start=list(kappa
thank you for your help!
Kind regards, Rikke
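One likely culprit in calls like the truncated one above: nlminb passes the whole parameter vector to the objective as a single argument, so an objective written with five separate formal arguments never receives the trial parameters properly. A toy sketch of the unpacking pattern (difference here is a placeholder residual function, not the poster's):

```r
# Placeholder residuals: zero at the "true" parameter vector
difference <- function(p) p - c(1, -0.5, 0.3, 0.04, 0.04)

# The objective takes ONE numeric vector and unpacks it itself
y <- function(par) {
  kappahat <- par[1]; rhohat <- par[2]; etahat <- par[3]
  thetahat <- par[4]; v0hat  <- par[5]
  sum(difference(c(kappahat, rhohat, etahat, thetahat, v0hat))^2)
}

# start is a named numeric vector, not five separate arguments
fit <- nlminb(start = c(kappa = 2, rho = 0, eta = 0.5, theta = 0.1, v0 = 0.1),
              objective = y)
```

If the objective silently ignores its argument (or errors into a constant), nlminb returns the starting values unchanged, which matches the symptom described earlier in the thread.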
--
View this message in context:
http://r.789695.n4.nabble.com/Calibrating-the-risk-free-interest-rate-using-nlminb-tp3747509p3748442.html
Sent from the R help mailing list archive at Nabble.com.
I used:
marketdata <- read.csv(file="S&P 500 calls, jan-jun 2010.csv", header=TRUE,
sep=";")
after changing my working directory to the folder where the file is saved.
The data imported should be correct.
The spot is equal to S0, I typed it double in the post, sorry for that.
So S0 = 1136.03 is the spot
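For semicolon-separated files, read.csv(..., sep = ";") as used above is fine (read.csv2 is the shortcut, defaulting to sep = ";" and dec = ","). A quick way to sanity-check an import, demonstrated here on a small temporary file with made-up columns rather than the poster's actual data:

```r
# write a tiny semicolon-separated file so the sketch is self-contained
tmp <- tempfile(fileext = ".csv")
writeLines(c("strike;bid;ask",
             "1100;45.2;46.0",
             "1150;20.1;20.8"), tmp)

marketdata <- read.csv(tmp, header = TRUE, sep = ";")
str(marketdata)    # check column names and types after import

# average of bid and ask as a stand-in market price
mid <- (marketdata$bid + marketdata$ask) / 2
```

str() (or head()) right after the import catches the usual failure mode: everything landing in one column because the separator was wrong.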
Dear R-users
I am trying to find a value for the risk-free rate by minimizing the
difference between a BS call value computed with implied volatilities and
the market price of a call (assuming this is just the average of the bid
and ask prices).
Here is my data:
http://r.789695.n4.nabble.com/file/n3747509/S%26P_500_calls
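Since the Black-Scholes call price is increasing in r, one option is to solve BS(r) = market price directly with uniroot rather than minimizing a squared difference. A sketch with the standard BS call formula and illustrative numbers (the market price and option terms below are made up, not taken from the posted data):

```r
# Standard Black-Scholes call price
bs_call <- function(S0, K, Tm, r, sigma, q = 0) {
  d1 <- (log(S0 / K) + (r - q + sigma^2 / 2) * Tm) / (sigma * sqrt(Tm))
  d2 <- d1 - sigma * sqrt(Tm)
  S0 * exp(-q * Tm) * pnorm(d1) - K * exp(-r * Tm) * pnorm(d2)
}

S0 <- 1136.03                 # spot, as stated in the thread
K <- 1150; Tm <- 0.5; sigma <- 0.25   # illustrative strike/maturity/vol
market <- 80                  # illustrative average of bid and ask

# the price is monotone in r, so the root is unique on a sensible interval
imp_r <- uniroot(function(r) bs_call(S0, K, Tm, r, sigma) - market,
                 interval = c(-0.05, 0.20), tol = 1e-10)$root
```

uniroot is both faster and more robust here than a one-dimensional minimizer, since there is nothing to get stuck on when the function is monotone.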
Hello everyone !
I am currently trying to convert a program from S-plus to R, and I am
having some trouble with the S-plus function called "influence(data,
statistic,...)".
This function aims to "calculate empirical influence values and related
quantities",
and is part of the Resample library t
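In R, the closest counterpart appears to be empinf() in the boot package (a recommended package shipped with R), which likewise computes empirical influence values. A minimal sketch for the sample mean, where the statistic must accept the data and a vector of weights:

```r
library(boot)

# toy data; for the mean, the empirical influence values should come out
# as approximately x_i - mean(x)
x <- c(2.1, 3.4, 1.9, 5.6, 4.2)

# statistic in weighted form (stype = "w"): data and weights
wmean <- function(d, w) sum(d * w) / sum(w)

L <- empinf(data = x, statistic = wmean, type = "inf", stype = "w")
```

Whether empinf matches the S-PLUS influence() output exactly would need checking against the Resample documentation, but it covers the same "empirical influence values" use case.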
Just to clarify some of your language here before some others rip you apart:
you mean to say you'd like to take a random sample of size 5, not 5 random
samples.
Now, I believe you can control the probability with which each element of your
original data set is sampled (using weights), but in
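A sketch of that pattern with sample(): one draw of size 5, without replacement, with per-element selection probabilities supplied via prob (the data and weights below are illustrative):

```r
set.seed(42)
x <- 1:20
w <- x / sum(x)   # larger elements are more likely to be drawn

# one random sample of size 5; prob controls the selection weights
s <- sample(x, size = 5, replace = FALSE, prob = w)
```

Note that without replacement the weights act sequentially (each draw renormalizes over the remaining elements), so they are selection weights rather than exact inclusion probabilities.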
Could I have some suggestions as to how (various ways) I can display my
confidence interval results?
rm(list = ls())
set.seed(1)
func <- function(d, t, beta, lambda, alpha, p.gamma, delta, B) {
  d <- c(5, 1, 5, 14, 3, 19, 1, 1, 4, 22)
  t <- c(94.32, 15.72, 62.88, 125.76, 5.24, 31.44, 1.048, 1.048, 2.096, 10.
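Two common displays, sketched with fake draws standing in for whatever the function above produces: a printed quantile table, and a point-with-error-bars plot.

```r
# three fake "parameters", each with 1000 simulated draws
set.seed(1)
draws <- replicate(3, rnorm(1000, mean = runif(1, 1, 3)))

# 95% equal-tailed intervals plus the median, one row per parameter
ci <- t(apply(draws, 2, quantile, probs = c(0.025, 0.5, 0.975)))
colnames(ci) <- c("lower", "median", "upper")
print(round(ci, 3))

# error-bar plot: points at the medians, whiskers spanning each interval
plot(ci[, "median"], ylim = range(ci), pch = 19,
     xlab = "parameter", ylab = "value")
arrows(x0 = 1:3, y0 = ci[, "lower"], y1 = ci[, "upper"],
       angle = 90, code = 3, length = 0.05)
```

The table is the safest default for a report; the plot is more useful when comparing many intervals at a glance.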
Does anyone know how to assign column names (only) to a matrix?
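A sketch of the two usual ways: colnames<- sets only the column names (row names stay NULL), or dimnames at creation time with NULL for the rows.

```r
# after creation: only the column names are assigned
m <- matrix(1:6, nrow = 2)
colnames(m) <- c("a", "b", "c")

# at creation time: dimnames = list(rownames, colnames), rows left NULL
m2 <- matrix(1:6, nrow = 2, dimnames = list(NULL, c("a", "b", "c")))
```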