Re: [R] Download data ph soil

2020-12-03 Thread Jim Lemon
Hi Jose,
Searching for "soil pH data" reveals a bucketload of sites with this
sort of data in lots of formats.
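Once you have a GeoTIFF, reading it into R for your map is straightforward.
A minimal sketch with the raster package (the file name below is
hypothetical):

library(raster)                      # install.packages("raster") if needed;
                                     # GeoTIFFs may also need the rgdal package
ph <- raster("soil_ph_europe.tif")   # read a single-band GeoTIFF
plot(ph, main = "Soil pH")           # quick map of the raster values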

Jim



On Thu, Dec 3, 2020 at 10:07 PM José Luis Aguilar
 wrote:
>
> Dear list members,
>
> I am looking for soil pH data for Europe and Africa, but I have not been
> able to find any. I need them to make a map in R for my research.
>
> Could someone please tell me where to find these data, preferably in tif
> format?
>
> thank you
>



[R] Download data ph soil

2020-12-03 Thread José Luis Aguilar
Dear list members,

I am looking for soil pH data for Europe and Africa, but I have not been
able to find any. I need them to make a map in R for my research.

Could someone please tell me where to find these data, preferably in tif
format?

thank you



Re: [R] Download data from NASA for multiple locations - RCurl

2017-10-16 Thread David Winsemius

> On Oct 16, 2017, at 1:43 PM, Miluji Sb  wrote:
> 
> I have done the following using readLines
> 
> directory <- "~/"
> files <- list.files(directory)
> data_frames <- vector("list", length(files))
> for (i in seq_along(files)) {
>   df <- readLines(file.path(directory, files[i]))
>   df <- df[-(1:13)]
>   df <- data.frame(year = substr(df,1,4),
>month = substr(df, 6,7),
>day = substr(df, 9, 10),
>hour = substr(df, 12, 13),
>temp = substr(df, 21, 27))




>   data_frames[[i]] <- df
> }
> 
> What I have been having trouble with is adding the following information 
> from the cities file (100 cities) for each of the downloaded data files. I 
> would like to do the following but automatically:
> 
> ###
> mydata$city <- rep(cities[1,1], nrow(mydata))  
> mydata$state <- rep(cities[1,2], nrow(mydata))
> mydata$lon <- rep(cities[1,3], nrow(mydata))
> mydata$lat <- rep(cities[1,4], nrow(mydata))
> ###
> 

Why not store  the lat/lon data in the file name and then extract all 4 items 
from the file name within the loop?
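A sketch of that idea, assuming file names that encode all four fields
(the naming scheme below is hypothetical; df is the data frame built
inside your loop):

fname <- "Boston_MA_-71.06_42.36.asc"
parts <- strsplit(tools::file_path_sans_ext(fname), "_")[[1]]
df$city  <- parts[1]              # "Boston"
df$state <- parts[2]              # "MA"
df$lon   <- as.numeric(parts[3])  # -71.06
df$lat   <- as.numeric(parts[4])  # 42.36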

-- 
David.


> The information for cities looks like this:
> 
> ###
> cities <-  dput(droplevels(head(cities, 5)))
> structure(list(city = structure(1:5, .Label = c("Boston", "Bridgeport", 
> "Cambridge", "Fall River", "Hartford"), class = "factor"), state = 
> structure(c(2L, 
> 1L, 2L, 2L, 1L), .Label = c(" CT ", " MA "), class = "factor"), 
> lon = c(-71.06, -73.19, -71.11, -71.16, -72.67), lat = c(42.36, 
> 41.18, 42.37, 41.7, 41.77)), .Names = c("city", "state", 
> "lon", "lat"), row.names = c(NA, 5L), class = "data.frame")
> ###
> 
> Apologies if this seems trivial but I have been having a hard time. Thank you 
> again.
> 
> Sincerely,
> 
> Milu
> 
> On Mon, Oct 16, 2017 at 7:13 PM, David Winsemius  
> wrote:
> 
> > On Oct 15, 2017, at 3:35 PM, Miluji Sb  wrote:
> >
> > Dear David,
> >
> > This is amazing, thank you so much. If I may ask another question:
> >
> > The output looks like the following:
> >
> > ###
> > dput(head(x,15))
> > c("Metadata for Requested Time Series:", "", 
> > "prod_name=GLDAS_NOAH025_3H_v2.0",
> > "param_short_name=Tair_f_inst", "param_name=Near surface air temperature",
> > "unit=K", "begin_time=1970-01-01T00", "end_time=1979-12-31T21",
> > "lat= 42.36", "lon=-71.06", "Request_time=2017-10-15 22:20:03 GMT",
> > "", "Date   Data", "1970-01-01T00:00:00\t267.769",
> > "1970-01-01T03:00:00\t264.595")
> > ###
> >
> > Thus I need to drop the first 13 rows and do the following to add 
> > identifying information:
> 
> Are you having difficulty reading in the data from disk? The `read.table` 
> function has a "skip" parameter.
> >
> > ###
> > mydata <- data.frame(year = substr(x,1,4),
> 
> That would not appear to do anything useful with x. The `x` object is not a 
> long string. The items you want are in separate elements of x.
> 
> substr(x,1,4)   # now returns
>  [1] "Meta" "" "prod" "para" "para" "unit" "begi" "end_" "lat=" "lon=" 
> "Requ" "" "Date"
> [14] "1970" "1970"
> 
> You need to learn basic R indexing. The year might be extracted from the 7th 
> element of x via code like this:
> 
> year <- substr(x[7], 12, 15)   # x[7] is "begin_time=1970-01-01T00"
> 
> >  month = substr(x, 6,7),
> >  day = substr(x, 9, 10),
> >  hour = substr(x, 12, 13),
> >  temp = substr(x, 21, 27))
> 
> The time and temp items would naturally be read in with read.table (or in the 
> case of tab-delimited data with read.delim) after skipping the first 13 lines.
> 
> 
> >
> > mydata$city <- rep(cities[1,1], nrow(mydata))
> 
> There's no need to use `rep` with data.frame. If one argument to data.frame 
> is length n then all single-element arguments will be "recycled" to fill in 
> the needed number of rows. Please take the time to work through all the pages 
> of "Introduction to R" (shipped with all distributions of R) or pick another 
> introductory text. We cannot provide tutoring to all students. You need to 
> put in the needed self-study first.
> 
> --
> David.
> 
> 
> > mydata$state <- rep(cities[1,2], nrow(mydata))
> > mydata$lon <- rep(cities[1,3], nrow(mydata))
> > mydata$lat <- rep(cities[1,4], nrow(mydata))
> > ###
> >
> > Is it possible to incorporate these into your code so the data looks like 
> > this:
> >
> > dput(droplevels(head(mydata)))
> > structure(list(year = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "1970", 
> > class = "factor"),
> > month = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "01", class = 
> > "factor"),
> > day = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "01", class = 
> > "factor"),
> > hour = structure(1:6, .Label = c("00", "03", "06", "09",
> > "12", "15"), class = "factor"), temp = structure(c(6L, 4L,
> > 2L, 1L, 3L, 5L), .Label = c("261.559", "262.525", "262.648",
> > "264.595", 

Re: [R] Download data from NASA for multiple locations - RCurl

2017-10-16 Thread Miluji Sb
I have done the following using readLines

directory <- "~/"
files <- list.files(directory)
data_frames <- vector("list", length(files))
for (i in seq_along(files)) {
  df <- readLines(file.path(directory, files[i]))
  df <- df[-(1:13)]
  df <- data.frame(year = substr(df,1,4),
   month = substr(df, 6,7),
   day = substr(df, 9, 10),
   hour = substr(df, 12, 13),
   temp = substr(df, 21, 27))
  data_frames[[i]] <- df
}

What I have been having trouble with is adding the following
information from the cities file (100 cities) for each of the downloaded
data files. I would like to do the following but automatically:

###
mydata$city <- rep(cities[1,1], nrow(mydata))
mydata$state <- rep(cities[1,2], nrow(mydata))
mydata$lon <- rep(cities[1,3], nrow(mydata))
mydata$lat <- rep(cities[1,4], nrow(mydata))
###
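A sketch of doing that automatically inside the loop, assuming the i-th
downloaded file corresponds to the i-th row of cities (that correspondence
is an assumption); the length-1 values are recycled to fill every row:

df$city  <- cities[i, "city"]
df$state <- cities[i, "state"]
df$lon   <- cities[i, "lon"]
df$lat   <- cities[i, "lat"]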

The information for cities looks like this:

###
cities <-  dput(droplevels(head(cities, 5)))
structure(list(city = structure(1:5, .Label = c("Boston", "Bridgeport",
"Cambridge", "Fall River", "Hartford"), class = "factor"), state =
structure(c(2L,
1L, 2L, 2L, 1L), .Label = c(" CT ", " MA "), class = "factor"),
lon = c(-71.06, -73.19, -71.11, -71.16, -72.67), lat = c(42.36,
41.18, 42.37, 41.7, 41.77)), .Names = c("city", "state",
"lon", "lat"), row.names = c(NA, 5L), class = "data.frame")
###

Apologies if this seems trivial but I have been having a hard time. Thank
you again.

Sincerely,

Milu

On Mon, Oct 16, 2017 at 7:13 PM, David Winsemius 
wrote:

>
> > On Oct 15, 2017, at 3:35 PM, Miluji Sb  wrote:
> >
> > Dear David,
> >
> > This is amazing, thank you so much. If I may ask another question:
> >
> > The output looks like the following:
> >
> > ###
> > dput(head(x,15))
> > c("Metadata for Requested Time Series:", "", "prod_name=GLDAS_NOAH025_3H_
> v2.0",
> > "param_short_name=Tair_f_inst", "param_name=Near surface air
> temperature",
> > "unit=K", "begin_time=1970-01-01T00", "end_time=1979-12-31T21",
> > "lat= 42.36", "lon=-71.06", "Request_time=2017-10-15 22:20:03 GMT",
> > "", "Date   Data", "1970-01-01T00:00:00\t267.769",
> > "1970-01-01T03:00:00\t264.595")
> > ###
> >
> > Thus I need to drop the first 13 rows and do the following to add
> identifying information:
>
> Are you having difficulty reading in the data from disk? The `read.table`
> function has a "skip" parameter.
> >
> > ###
> > mydata <- data.frame(year = substr(x,1,4),
>
> That would not appear to do anything useful with x. The `x` object is not
> a long string. The items you want are in separate elements of x.
>
> substr(x,1,4)   # now returns
>  [1] "Meta" "" "prod" "para" "para" "unit" "begi" "end_" "lat=" "lon="
> "Requ" "" "Date"
> [14] "1970" "1970"
>
> You need to learn basic R indexing. The year might be extracted from the
> 7th element of x via code like this:
>
> year <- substr(x[7], 12, 15)   # x[7] is "begin_time=1970-01-01T00"
>
> >  month = substr(x, 6,7),
> >  day = substr(x, 9, 10),
> >  hour = substr(x, 12, 13),
> >  temp = substr(x, 21, 27))
>
> The time and temp items would naturally be read in with read.table (or in
> the case of tab-delimited data with read.delim) after skipping the first 13
> lines.
>
>
> >
> > mydata$city <- rep(cities[1,1], nrow(mydata))
>
> There's no need to use `rep` with data.frame. If one argument to
> data.frame is length n then all single-element arguments will be
> "recycled" to fill in the needed number of rows. Please take the time to
> work through all the pages of "Introduction to R" (shipped with all
> distributions of R) or pick another introductory text. We cannot provide
> tutoring to all students. You need to put in the needed self-study first.
>
> --
> David.
>
>
> > mydata$state <- rep(cities[1,2], nrow(mydata))
> > mydata$lon <- rep(cities[1,3], nrow(mydata))
> > mydata$lat <- rep(cities[1,4], nrow(mydata))
> > ###
> >
> > Is it possible to incorporate these into your code so the data looks
> like this:
> >
> > dput(droplevels(head(mydata)))
> > structure(list(year = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label =
> "1970", class = "factor"),
> > month = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "01", class =
> "factor"),
> > day = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "01", class =
> "factor"),
> > hour = structure(1:6, .Label = c("00", "03", "06", "09",
> > "12", "15"), class = "factor"), temp = structure(c(6L, 4L,
> > 2L, 1L, 3L, 5L), .Label = c("261.559", "262.525", "262.648",
> > "264.595", "265.812", "267.769"), class = "factor"), city =
> structure(c(1L,
> > 1L, 1L, 1L, 1L, 1L), .Label = "Boston", class = "factor"),
> > state = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = " MA ", class
> = "factor"),
> > lon = c(-71.06, -71.06, -71.06, -71.06, -71.06, -71.06),
> > lat = c(42.36, 42.36, 42.36, 42.36, 42.36, 42.36)), 

Re: [R] Download data from NASA for multiple locations - RCurl

2017-10-16 Thread David Winsemius

> On Oct 15, 2017, at 3:35 PM, Miluji Sb  wrote:
> 
> Dear David,
> 
> This is amazing, thank you so much. If I may ask another question:
> 
> The output looks like the following:
> 
> ###
> dput(head(x,15))
> c("Metadata for Requested Time Series:", "", 
> "prod_name=GLDAS_NOAH025_3H_v2.0", 
> "param_short_name=Tair_f_inst", "param_name=Near surface air temperature", 
> "unit=K", "begin_time=1970-01-01T00", "end_time=1979-12-31T21", 
> "lat= 42.36", "lon=-71.06", "Request_time=2017-10-15 22:20:03 GMT", 
> "", "Date   Data", "1970-01-01T00:00:00\t267.769", 
> "1970-01-01T03:00:00\t264.595")
> ###
> 
> Thus I need to drop the first 13 rows and do the following to add identifying 
> information:

Are you having difficulty reading in the data from disk? The `read.table` 
function has a "skip" parameter.
> 
> ###
> mydata <- data.frame(year = substr(x,1,4),

That would not appear to do anything useful with x. The `x` object is not a 
long string. The items you want are in separate elements of x. 

substr(x,1,4)   # now returns
 [1] "Meta" "" "prod" "para" "para" "unit" "begi" "end_" "lat=" "lon=" 
"Requ" "" "Date"
[14] "1970" "1970"

You need to learn basic R indexing. The year might be extracted from the 7th 
element of x via code like this:

year <- substr(x[7], 12, 15)   # x[7] is "begin_time=1970-01-01T00"

>  month = substr(x, 6,7),
>  day = substr(x, 9, 10),
>  hour = substr(x, 12, 13),
>  temp = substr(x, 21, 27))

The time and temp items would naturally be read in with read.table (or in the 
case of tab-delimited data with read.delim) after skipping the first 13 lines.
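A sketch under those assumptions, i.e. 13 metadata lines followed by
tab-delimited "timestamp<TAB>value" rows (the file name is hypothetical):

mydata <- read.delim("BostonMA.asc", skip = 13, header = FALSE,
                     col.names = c("datetime", "temp"))
mydata$year <- substr(mydata$datetime, 1, 4)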


> 
> mydata$city <- rep(cities[1,1], nrow(mydata))

There's no need to use `rep` with data.frame. If one argument to data.frame is 
length n then all single-element arguments will be "recycled" to fill in the 
needed number of rows. Please take the time to work through all the pages of 
"Introduction to R" (shipped with all distributions of R) or pick another 
introductory text. We cannot provide tutoring to all students. You need to put 
in the needed self-study first.
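For example, the length-one arguments here are recycled to match the
three-row column, so no rep() is needed:

data.frame(temp = c(267.8, 264.6, 261.6),
           city = "Boston", lon = -71.06, lat = 42.36)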

-- 
David.


> mydata$state <- rep(cities[1,2], nrow(mydata))
> mydata$lon <- rep(cities[1,3], nrow(mydata))
> mydata$lat <- rep(cities[1,4], nrow(mydata))
> ###
> 
> Is it possible to incorporate these into your code so the data looks like 
> this:
> 
> dput(droplevels(head(mydata)))
> structure(list(year = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "1970", 
> class = "factor"), 
> month = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "01", class = 
> "factor"), 
> day = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "01", class = 
> "factor"), 
> hour = structure(1:6, .Label = c("00", "03", "06", "09", 
> "12", "15"), class = "factor"), temp = structure(c(6L, 4L, 
> 2L, 1L, 3L, 5L), .Label = c("261.559", "262.525", "262.648", 
> "264.595", "265.812", "267.769"), class = "factor"), city = 
> structure(c(1L, 
> 1L, 1L, 1L, 1L, 1L), .Label = "Boston", class = "factor"), 
> state = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = " MA ", class = 
> "factor"), 
> lon = c(-71.06, -71.06, -71.06, -71.06, -71.06, -71.06), 
> lat = c(42.36, 42.36, 42.36, 42.36, 42.36, 42.36)), .Names = c("year", 
> "month", "day", "hour", "temp", "city", "state", "lon", "lat"
> ), row.names = c(NA, 6L), class = "data.frame")
> 
> Apologies for asking repeated questions and thank you again!

Of course it's possible. I don't understand where the difficulty lies. 
> 
> Sincerely,
> 
> Milu
> 
> On Sun, Oct 15, 2017 at 11:45 PM, David Winsemius  
> wrote:
> 
> > On Oct 15, 2017, at 2:02 PM, Miluji Sb  wrote:
> >
> > Dear all,
> >
> > I am trying to download time-series climatic data from GES DISC (NASA)
> > Hydrology Data Rods web-service. Unfortunately, no wget method is
> > available.
> >
> > Five parameters are needed for data retrieval: variable, location,
> > startDate, endDate, and type. For example:
> >
> > ###
> > https://hydro1.gesdisc.eosdis.nasa.gov/daac-bin/access/timeseries.cgi?variable=GLDAS2:GLDAS_NOAH025_3H_v2.0:Tair_f_inst&startDate=1970-01-01T00&endDate=1979-12-31T00&location=GEOM:POINT(-71.06,%2042.36)&type=asc2
> > ###
> >
> > In this case, variable: Tair_f_inst (temperature), location: (-71.06,
> > 42.36), startDate: 01 January 1970; endDate: 31 December 1979; type:  asc2
> > (output 2-column ASCII).
> >
> > I am trying to download data for 100 US cities, for which I have the
> > following data.frame:
> >
> > ###
> > cities <-  dput(droplevels(head(cities, 5)))
> > structure(list(city = structure(1:5, .Label = c("Boston", "Bridgeport",
> > "Cambridge", "Fall River", "Hartford"), class = "factor"), state =
> > structure(c(2L,
> > 1L, 2L, 2L, 1L), .Label = c(" CT ", " MA "), class = "factor"),
> >lon = c(-71.06, -73.19, -71.11, -71.16, -72.67), lat = c(42.36,
> >41.18, 42.37, 41.7, 41.77)), .Names = c("city", "state",
> > 

Re: [R] Download data from NASA for multiple locations - RCurl

2017-10-15 Thread Miluji Sb
Dear David,

This is amazing, thank you so much. If I may ask another question:

The output looks like the following:

###
dput(head(x,15))
c("Metadata for Requested Time Series:", "",
"prod_name=GLDAS_NOAH025_3H_v2.0",
"param_short_name=Tair_f_inst", "param_name=Near surface air temperature",
"unit=K", "begin_time=1970-01-01T00", "end_time=1979-12-31T21",
"lat= 42.36", "lon=-71.06", "Request_time=2017-10-15 22:20:03 GMT",
"", "Date   Data", "1970-01-01T00:00:00\t267.769",
"1970-01-01T03:00:00\t264.595")
###

Thus I need to drop the first 13 rows and do the following to add
identifying information:

###
mydata <- data.frame(year = substr(x,1,4),
 month = substr(x, 6,7),
 day = substr(x, 9, 10),
 hour = substr(x, 12, 13),
 temp = substr(x, 21, 27))

mydata$city <- rep(cities[1,1], nrow(mydata))
mydata$state <- rep(cities[1,2], nrow(mydata))
mydata$lon <- rep(cities[1,3], nrow(mydata))
mydata$lat <- rep(cities[1,4], nrow(mydata))
###

Is it possible to incorporate these into your code so the data looks like
this:

dput(droplevels(head(mydata)))
structure(list(year = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "1970",
class = "factor"),
month = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "01", class =
"factor"),
day = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = "01", class =
"factor"),
hour = structure(1:6, .Label = c("00", "03", "06", "09",
"12", "15"), class = "factor"), temp = structure(c(6L, 4L,
2L, 1L, 3L, 5L), .Label = c("261.559", "262.525", "262.648",
"264.595", "265.812", "267.769"), class = "factor"), city =
structure(c(1L,
1L, 1L, 1L, 1L, 1L), .Label = "Boston", class = "factor"),
state = structure(c(1L, 1L, 1L, 1L, 1L, 1L), .Label = " MA ", class =
"factor"),
lon = c(-71.06, -71.06, -71.06, -71.06, -71.06, -71.06),
lat = c(42.36, 42.36, 42.36, 42.36, 42.36, 42.36)), .Names = c("year",
"month", "day", "hour", "temp", "city", "state", "lon", "lat"
), row.names = c(NA, 6L), class = "data.frame")

Apologies for asking repeated questions and thank you again!

Sincerely,

Milu

On Sun, Oct 15, 2017 at 11:45 PM, David Winsemius 
wrote:

>
> > On Oct 15, 2017, at 2:02 PM, Miluji Sb  wrote:
> >
> > Dear all,
> >
> > I am trying to download time-series climatic data from GES DISC (NASA)
> > Hydrology Data Rods web-service. Unfortunately, no wget method is
> > available.
> >
> > Five parameters are needed for data retrieval: variable, location,
> > startDate, endDate, and type. For example:
> >
> > ###
> > https://hydro1.gesdisc.eosdis.nasa.gov/daac-bin/access/timeseries.cgi?variable=GLDAS2:GLDAS_NOAH025_3H_v2.0:Tair_f_inst&startDate=1970-01-01T00&endDate=1979-12-31T00&location=GEOM:POINT(-71.06,%2042.36)&type=asc2
> > ###
> >
> > In this case, variable: Tair_f_inst (temperature), location: (-71.06,
> > 42.36), startDate: 01 January 1970; endDate: 31 December 1979; type:
> asc2
> > (output 2-column ASCII).
> >
> > I am trying to download data for 100 US cities, for which I have the
> > following data.frame:
> >
> > ###
> > cities <-  dput(droplevels(head(cities, 5)))
> > structure(list(city = structure(1:5, .Label = c("Boston", "Bridgeport",
> > "Cambridge", "Fall River", "Hartford"), class = "factor"), state =
> > structure(c(2L,
> > 1L, 2L, 2L, 1L), .Label = c(" CT ", " MA "), class = "factor"),
> >lon = c(-71.06, -73.19, -71.11, -71.16, -72.67), lat = c(42.36,
> >41.18, 42.37, 41.7, 41.77)), .Names = c("city", "state",
> > "lon", "lat"), row.names = c(NA, 5L), class = "data.frame")
> > ###
> >
> > Is it possible to download the data for the multiple locations
> > automatically (e.g. RCurl) and save them as csv? Essentially, reading
> > coordinates from the data.frame and entering it in the URL.
> >
> > I would also like to add identifying information to each of the data
> files
> > from the cities data.frame. I have been doing the following for a single
> > file:
>
> Didn't seem that difficult:
>
> library(downloader)  # makes things easier for Macs; perhaps not needed.
> # If not used, you will need to use download.file().
>
> for( i in 1:5) {
>   target1 <- paste0("https://hydro1.gesdisc.eosdis.nasa.gov/daac-bin/access/timeseries.cgi?variable=GLDAS2:GLDAS_NOAH025_3H_v2.0:Tair_f_inst&startDate=1970-01-01T00&endDate=1979-12-31T00&location=GEOM:POINT(",
>                     cities[i, "lon"],
>                     ",%20", cities[i, "lat"],
>                     ")&type=asc2")
>   target2 <- paste0("~/",   # change to whatever destination directory you may prefer
>                     cities[i, "city"],
>                     cities[i, "state"], ".asc")
>   download(url=target1, destfile=target2)
> }
>
> Now I have 5 named files with extensions ".asc" in my user directory
> (since I'm on a Mac). It is a slow website so patience is needed.
>
> --
> David
>
>
> >
> > ###
> > x <- readLines(con=url("
> > 

Re: [R] Download data from NASA for multiple locations - RCurl

2017-10-15 Thread David Winsemius

> On Oct 15, 2017, at 2:02 PM, Miluji Sb  wrote:
> 
> Dear all,
> 
> I am trying to download time-series climatic data from GES DISC (NASA)
> Hydrology Data Rods web-service. Unfortunately, no wget method is
> available.
> 
> Five parameters are needed for data retrieval: variable, location,
> startDate, endDate, and type. For example:
> 
> ###
> https://hydro1.gesdisc.eosdis.nasa.gov/daac-bin/access/timeseries.cgi?variable=GLDAS2:GLDAS_NOAH025_3H_v2.0:Tair_f_inst&startDate=1970-01-01T00&endDate=1979-12-31T00&location=GEOM:POINT(-71.06,%2042.36)&type=asc2
> ###
> 
> In this case, variable: Tair_f_inst (temperature), location: (-71.06,
> 42.36), startDate: 01 January 1970; endDate: 31 December 1979; type:  asc2
> (output 2-column ASCII).
> 
> I am trying to download data for 100 US cities, for which I have the
> following data.frame:
> 
> ###
> cities <-  dput(droplevels(head(cities, 5)))
> structure(list(city = structure(1:5, .Label = c("Boston", "Bridgeport",
> "Cambridge", "Fall River", "Hartford"), class = "factor"), state =
> structure(c(2L,
> 1L, 2L, 2L, 1L), .Label = c(" CT ", " MA "), class = "factor"),
>lon = c(-71.06, -73.19, -71.11, -71.16, -72.67), lat = c(42.36,
>41.18, 42.37, 41.7, 41.77)), .Names = c("city", "state",
> "lon", "lat"), row.names = c(NA, 5L), class = "data.frame")
> ###
> 
> Is it possible to download the data for the multiple locations
> automatically (e.g. RCurl) and save them as csv? Essentially, reading
> coordinates from the data.frame and entering it in the URL.
> 
> I would also like to add identifying information to each of the data files
> from the cities data.frame. I have been doing the following for a single
> file:

Didn't seem that difficult:

library(downloader)  # makes things easier for Macs; perhaps not needed.
# If not used, you will need to use download.file().

for( i in 1:5) { 
  target1 <- paste0("https://hydro1.gesdisc.eosdis.nasa.gov/daac-bin/access/timeseries.cgi?variable=GLDAS2:GLDAS_NOAH025_3H_v2.0:Tair_f_inst&startDate=1970-01-01T00&endDate=1979-12-31T00&location=GEOM:POINT(",
                    cities[i, "lon"],
                    ",%20", cities[i, "lat"],
                    ")&type=asc2")
  target2 <- paste0("~/",# change for whatever destination directory you 
may prefer.
cities[i,"city"], 
cities[i,"state"], ".asc")
  download(url=target1, destfile=target2)
}

Now I have 5 named files with extensions ".asc" in my user directory (since I'm 
on a Mac). It is a slow website so patience is needed.

-- 
David


> 
> ###
> x <- readLines(con=url("
> https://hydro1.gesdisc.eosdis.nasa.gov/daac-bin/access/timeseries.cgi?variable=GLDAS2:GLDAS_NOAH025_3H_v2.0:Tair_f_inst&startDate=1970-01-01T00&endDate=1979-12-31T00&location=GEOM:POINT(-71.06,%2042.36)&type=asc2
> "))
> x <- x[-(1:13)]
> 
> mydata <- data.frame(year = substr(x,1,4),
> month = substr(x, 6,7),
> day = substr(x, 9, 10),
> hour = substr(x, 12, 13),
> temp = substr(x, 21, 27))
> 
> mydata$city <- rep(cities[1,1], nrow(mydata))
> mydata$state <- rep(cities[1,2], nrow(mydata))
> mydata$lon <- rep(cities[1,3], nrow(mydata))
> mydata$lat <- rep(cities[1,4], nrow(mydata))
> ###
> 
> Help and advice would be greatly appreciated. Thank you!
> 
> Sincerely,
> 
> Milu
> 

David Winsemius
Alameda, CA, USA

'Any technology distinguishable from magic is insufficiently advanced.'   
-Gehm's Corollary to Clarke's Third Law



[R] Download data from NASA for multiple locations - RCurl

2017-10-15 Thread Miluji Sb
Dear all,

I am trying to download time-series climatic data from GES DISC (NASA)
Hydrology Data Rods web-service. Unfortunately, no wget method is
available.

Five parameters are needed for data retrieval: variable, location,
startDate, endDate, and type. For example:

###
https://hydro1.gesdisc.eosdis.nasa.gov/daac-bin/access/timeseries.cgi?variable=GLDAS2:GLDAS_NOAH025_3H_v2.0:Tair_f_inst&startDate=1970-01-01T00&endDate=1979-12-31T00&location=GEOM:POINT(-71.06,%2042.36)&type=asc2
###

In this case, variable: Tair_f_inst (temperature), location: (-71.06,
42.36), startDate: 01 January 1970; endDate: 31 December 1979; type:  asc2
(output 2-column ASCII).

I am trying to download data for 100 US cities, for which I have the
following data.frame:

###
cities <-  dput(droplevels(head(cities, 5)))
structure(list(city = structure(1:5, .Label = c("Boston", "Bridgeport",
"Cambridge", "Fall River", "Hartford"), class = "factor"), state =
structure(c(2L,
1L, 2L, 2L, 1L), .Label = c(" CT ", " MA "), class = "factor"),
lon = c(-71.06, -73.19, -71.11, -71.16, -72.67), lat = c(42.36,
41.18, 42.37, 41.7, 41.77)), .Names = c("city", "state",
"lon", "lat"), row.names = c(NA, 5L), class = "data.frame")
###

Is it possible to download the data for the multiple locations
automatically (e.g. RCurl) and save them as csv? Essentially, reading
coordinates from the data.frame and entering it in the URL.

I would also like to add identifying information to each of the data files
from the cities data.frame. I have been doing the following for a single
file:

###
x <- readLines(con=url("
https://hydro1.gesdisc.eosdis.nasa.gov/daac-bin/access/timeseries.cgi?variable=GLDAS2:GLDAS_NOAH025_3H_v2.0:Tair_f_inst&startDate=1970-01-01T00&endDate=1979-12-31T00&location=GEOM:POINT(-71.06,%2042.36)&type=asc2
"))
x <- x[-(1:13)]

mydata <- data.frame(year = substr(x,1,4),
 month = substr(x, 6,7),
 day = substr(x, 9, 10),
 hour = substr(x, 12, 13),
 temp = substr(x, 21, 27))

mydata$city <- rep(cities[1,1], nrow(mydata))
mydata$state <- rep(cities[1,2], nrow(mydata))
mydata$lon <- rep(cities[1,3], nrow(mydata))
mydata$lat <- rep(cities[1,4], nrow(mydata))
###

Help and advice would be greatly appreciated. Thank you!

Sincerely,

Milu



Re: [R] Download data from Internet contained in a Zip file

2016-12-25 Thread Jeff Newmiller
The fact that you are downloading a zip is irrelevant. The "s" in "https" means 
you are trying to download using the secure http protocol. I don't use a Mac, 
but Google finds https://stat.ethz.ch/pipermail/r-sig-mac/2011-July/008395.html 
when I search on "r unsupported url scheme mac osx" which says you need to use 
different options (but that post is old so the options may be different now). 
In any case, you should be posting this on R-sig-mac because the answer depends 
on both your OS and version of R.
-- 
Sent from my phone. Please excuse my brevity.

On December 25, 2016 1:50:00 PM PST, Christofer Bogaso 
 wrote:
Hi Jeff, I don't think I understood all that you said. I am using Mac
OS X 10.7.5 as my OS.
>
>On Mon, Dec 26, 2016 at 3:10 AM, Jeff Newmiller
> wrote:
>> This has nothing to do with the fact that you are dealing with a zip
>file and everything to do with the "s" in "https", and all the
>solutions I have seen involve knowing what operating system you are
>using. I highly recommend that you study what Google has to say when
>you include that detail and leave out the zip keyword.
>> --
>> Sent from my phone. Please excuse my brevity.
>>
>> On December 25, 2016 12:32:16 PM PST, Christofer Bogaso
> wrote:
>>>Hi again,
>>>
>>>I was following the instructions available at
>>>"http://stackoverflow.com/questions/3053833/using-r-to-download-zipped-data-file-extract-and-import-data"
>>>to download data contained in a zip file from the address:
>>>
>>>https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip
>>>
>>>However, when I tried to follow the instructions I got the error below:
>>>
>>>> temp <- tempfile()
>>>> download.file("https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip", temp)
>>>Error in download.file("https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip", :
>>>  unsupported URL scheme
>>>
>>>Can someone here please tell me what went wrong above?
>>>
>>>Highly appreciate your feedback.
>>>
>>>Thanks for your time.
>>>
>>



Re: [R] Download data from Internet contained in a Zip file

2016-12-25 Thread Christofer Bogaso
Hi Jeff, I don't think I understood all that you said. I am using Mac
OS X 10.7.5 as my OS.

On Mon, Dec 26, 2016 at 3:10 AM, Jeff Newmiller
 wrote:
> This has nothing to do with the fact that you are dealing with a zip file and 
> everything to do with the "s" in "https", and all the solutions I have seen 
> involve knowing what operating system you are using. I highly recommend that 
> you study what Google has to say when you include that detail and leave out 
> the zip keyword.
> --
> Sent from my phone. Please excuse my brevity.
>
> On December 25, 2016 12:32:16 PM PST, Christofer Bogaso 
>  wrote:
>>Hi again,
>>
>>I was following the instructions available at
>>"http://stackoverflow.com/questions/3053833/using-r-to-download-zipped-data-file-extract-and-import-data"
>>to download data contained in a zip file from the address:
>>
>>https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip
>>
>>However, when I tried to follow the instructions I got the error below:
>>
>>> temp <- tempfile()
>>> download.file("https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip", temp)
>>Error in download.file("https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip", :
>>  unsupported URL scheme
>>
>>Can someone here please tell me what went wrong above?
>>
>>Highly appreciate your feedback.
>>
>>Thanks for your time.
>>
>



Re: [R] Download data from Internet contained in a Zip file

2016-12-25 Thread Jeff Newmiller
This has nothing to do with the fact that you are dealing with a zip file and 
everything to do with the "s" in "https", and all the solutions I have seen 
involve knowing what operating system you are using. I highly recommend that 
you study what Google has to say when you include that detail and leave out the 
zip keyword.
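A minimal sketch of one commonly suggested workaround, assuming a system
curl is installed (the right method depends on your OS and R version):

temp <- tempfile(fileext = ".zip")
download.file("https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip",
              destfile = temp, method = "curl")
unzip(temp, exdir = tempdir())  # then read the extracted file(s)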
-- 
Sent from my phone. Please excuse my brevity.

On December 25, 2016 12:32:16 PM PST, Christofer Bogaso 
 wrote:
>Hi again,
>
>I was following the instructions available at
>"http://stackoverflow.com/questions/3053833/using-r-to-download-zipped-data-file-extract-and-import-data"
>to download data contained in a zip file from the address:
>
>https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip
>
>However, when I tried to follow the instructions I got the error below:
>
>> temp <- tempfile()
>> download.file("https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip", temp)
>Error in download.file("https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip", :
>  unsupported URL scheme
>
>Can someone here please tell me what went wrong above?
>
>Highly appreciate your feedback.
>
>Thanks for your time.
>



[R] Download data from Internet contained in a Zip file

2016-12-25 Thread Christofer Bogaso
Hi again,

I was following the instructions available at
"http://stackoverflow.com/questions/3053833/using-r-to-download-zipped-data-file-extract-and-import-data"
to download data contained in a zip file from the address:

https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip

However, when I tried to follow the instructions I got the error below:

> temp <- tempfile()
> download.file("https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip", temp)
Error in download.file("https://npscra.nsdl.co.in/download.php?path=download/=NAV_File_23122016.zip", :
  unsupported URL scheme

Can someone here please tell me what went wrong above?

Highly appreciate your feedback.

Thanks for your time.



Re: [R] Download data

2013-09-19 Thread jcrosbie
Thank you for all your help. I'm still not able to figure out how to
automate downloads from websites.

This is a daily task to download the needed data. I would also like to
be able to do this on other websites, such as:

http://ets.aeso.ca/ets_web/docroot/Market/Reports/HistoricalReportsStart.html

and

http://www.ngx.com/?page_id=561







Re: [R] Download data

2013-09-19 Thread Jeff Newmiller
I am sorry to hear that you are having difficulty, but your automation task is 
one that requires operating-system-specific knowledge that would be off-topic 
for this list, and web-scraping of forms really requires knowledge of web 
protocols and (in this case) Java and JavaScript that are also off-topic here. 
There exist packages in CRAN that may be helpful in your endeavor, but likely 
only if you study the appropriate subject areas outside of R first so that you 
know what you need to accomplish in detail. My quick estimation is that the 
aeso web site will be unusually difficult to extract data from, so you may need 
to pay a consultant to help you with this and/or ask the website developers if 
they support an automation mechanism that you can use.
---
Jeff Newmiller   jdnew...@dcn.davis.ca.us
---
Sent from my phone. Please excuse my brevity.

jcrosbie ja...@crosb.ie wrote:
Thank you for all your help. I'm still not able to figure out how to
automate downloads from websites.

This is a daily task to download the needed data. I would also like to
be able to do this on other websites, such as:

http://ets.aeso.ca/ets_web/docroot/Market/Reports/HistoricalReportsStart.html

and

http://www.ngx.com/?page_id=561








Re: [R] Download data

2013-05-29 Thread Jim Lemon

On 05/29/2013 02:02 AM, jcrosbie wrote:

Hi, I'm trying to download data from:
http://www.ngx.com/settlehistory.html

Is it possible to fetch the data with R?


Hi jcrosbie,
The simplest way seems to be to highlight the desired spreadsheet (less 
the title row), copy (Ctrl-C) and paste (Ctrl-V) it into a text editor 
and save it (e.g. ss1.tab). This produces a TAB delimited file that can 
be read into a data frame in R with:


ss1 <- read.table("ss1.tab", sep = "\t")

Jim



Re: [R] Download data

2013-05-29 Thread Pascal Oettli

Hi,

The combination of read.table() (and its arguments) and stdin() can also
be used, directly in R.


 read.table(stdin(), ...)
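A concrete sketch: run the line below, paste the copied tab-delimited rows
at the console, then end the input (a blank line or EOF, depending on the
front end):

ss1 <- read.table(stdin(), sep = "\t")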

Regards,
Pascal

On 29/05/2013 21:35, Jim Lemon wrote:

On 05/29/2013 02:02 AM, jcrosbie wrote:

Hi, I'm trying to download data from:
http://www.ngx.com/settlehistory.html

Is it possible to fetch the data with R?


Hi jcrosbie,
The simplest way seems to be to highlight the desired spreadsheet (less
the title row), copy (Ctrl-C) and paste (Ctrl-V) it into a text editor
and save it (e.g. ss1.tab). This produces a TAB delimited file that can
be read into a data frame in R with:

ss1 <- read.table("ss1.tab", sep = "\t")

Jim






Re: [R] Download data

2013-05-29 Thread Adams, Jean
I tried reading in the data using the XML package, but I can't figure out
how to read either ALL of the tables or a particular table.  The code below
just reads the first table.  Maybe someone else will know how.

Jean


library(XML)
look <- readHTMLTable("http://www.ngx.com/settlehistory.html")
head(look[[1]])
  V1 V2 V3 V4 V5 V6 V7 V8 V9 V10
1 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-05-01 2013-05-31 0 -16.
2 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-06-01 2013-06-30 0 -18.2500
3 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-07-01 2013-07-31 0 -19.7500
4 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-08-01 2013-08-31 0 -21.2500
5 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-09-01 2013-09-30 0 -22.7500
6 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-10-01 2013-10-31 0 -23.
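readHTMLTable() returns a list with one element per HTML table on the page,
so the other tables can at least be inspected; a sketch:

length(look)       # how many tables were found
names(look)        # their ids/names, where present
lapply(look, dim)  # dimensions of each table
look[[2]]          # a particular table, by position (if present)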



On Tue, May 28, 2013 at 11:02 AM, jcrosbie ja...@crosb.ie wrote:

 Hi, I'm trying to download data from:
 http://www.ngx.com/settlehistory.html

 Is it possible to fetch the data with R?

 Thank you








Re: [R] Download data

2013-05-29 Thread james crosbie
Thank you, I will have to wait until I get home from work to test the XML.
But I'm looking for something more along the lines of an automated process,
rather than copying and pasting.

James



 From: Adams, Jean jvad...@usgs.gov
To: jcrosbie ja...@crosb.ie 
Cc: R help r-help@r-project.org 
Sent: Wednesday, May 29, 2013 6:50 AM
Subject: Re: [R] Download data



I tried reading in the data using the XML package, but I can't figure out how 
to read either ALL of the tables or a particular table.  The code below just 
reads the first table.  Maybe someone else will know how.

Jean



library(XML)
look <- readHTMLTable("http://www.ngx.com/settlehistory.html")
head(look[[1]])
  V1 V2 V3 V4 V5 V6 V7 V8 V9 V10
1 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-05-01 2013-05-31 0 -16.
2 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-06-01 2013-06-30 0 -18.2500
3 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-07-01 2013-07-31 0 -19.7500
4 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-08-01 2013-08-31 0 -21.2500
5 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-09-01 2013-09-30 0 -22.7500
6 Crude Firm Phys AHS, ID, WTI, Edm-Enbridge 2013-10-01 2013-10-31 0 -23.




On Tue, May 28, 2013 at 11:02 AM, jcrosbie ja...@crosb.ie wrote:

Hi, I'm trying to download data from:
http://www.ngx.com/settlehistory.html

Is it possible to fetch the data with R?

Thank you






[R] Download data

2013-05-28 Thread jcrosbie
Hi, I'm trying to download data from:
http://www.ngx.com/settlehistory.html

Is it possible to fetch the data with R? 

Thank you






Re: [R] Download data

2013-05-24 Thread David Reiner
Quandl package is your friend:
library(Quandl)  # download and install if necessary
oil <- Quandl('FRED/WCOILWTICO', type = 'xts')
# Search the quandl.com site to make sure this is the data you want.
# I just put in your text string 'WTI - Cushing, Oklahoma' and got several
# results, one of which might be the one you want.

This may give a warning if you haven't gotten an authentication token.
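A sketch of registering a token, assuming the token-setting call of the
Quandl package of that era (the token string is hypothetical):

Quandl.auth("your_auth_token_here")  # register the token for this session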

David L. Reiner, Ph.D.
Head Quant
XR Trading LLC
550 West Jackson Boulevard, Suite 1000
Chicago, IL 60661-5704
(312) 244-4610 direct
(312) 244-4500 main
david.rei...@xrtrading.com

-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On 
Behalf Of Christofer Bogaso
Sent: Thursday, May 23, 2013 11:32 PM
To: r-help
Subject: [R] Download data

Hello again,

I need to download 'WTI - Cushing, Oklahoma' from
'http://www.eia.gov/dnav/pet/pet_pri_spt_s1_d.htm', which is available under
the column 'View History'.

While I can get the data manually, I was looking for an R
implementation which can download the data directly into R.

Can somebody show me how to achieve that?

Thanks for your help.






Re: [R] Download data

2013-05-24 Thread David Reiner
Actually, I think the one you want is 'FRED/DCOILWTICO'.

-- David


-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On 
Behalf Of David Reiner
Sent: Friday, May 24, 2013 8:46 AM
To: Christofer Bogaso; r-help
Subject: Re: [R] Download data

Quandl package is your friend:
library(Quandl)  # download and install if necessary
oil <- Quandl('FRED/WCOILWTICO', type = 'xts')
# Search the quandl.com site to make sure this is the data you want.
# I just put in your text string 'WTI - Cushing, Oklahoma' and got several
# results, one of which might be the one you want.

This may give a warning if you haven't gotten an authentication token.

David L. Reiner, Ph.D.
Head Quant
XR Trading LLC
550 West Jackson Boulevard, Suite 1000
Chicago, IL 60661-5704
(312) 244-4610 direct
(312) 244-4500 main
david.rei...@xrtrading.com

-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On 
Behalf Of Christofer Bogaso
Sent: Thursday, May 23, 2013 11:32 PM
To: r-help
Subject: [R] Download data

Hello again,

I need to download 'WTI - Cushing, Oklahoma' from
'http://www.eia.gov/dnav/pet/pet_pri_spt_s1_d.htm', which is available under
the column 'View History'.

While I can get the data manually, I was looking for an R
implementation which can download the data directly into R.

Can somebody show me how to achieve that?

Thanks for your help.






[R] Download data

2013-05-23 Thread Christofer Bogaso
Hello again,

I need to download 'WTI - Cushing, Oklahoma' from
'http://www.eia.gov/dnav/pet/pet_pri_spt_s1_d.htm', which is available under
the column 'View History'.

While I can get the data manually, I was looking for an R
implementation which can download the data directly into R.

Can somebody show me how to achieve that?

Thanks for your help.
