[R] load ff object in a different computer

2013-05-26 Thread Djordje Bajic
Hi all,

I am having trouble loading an ff object previously saved on a different
computer. I have both files, .ffData and .RData; the first of them is
13 MB, so I know the data is in there. But when I try to ffload it,

checkdir error:  cannot create /home/_myUser_
 Permission denied
 unable to process
home/_myUser_/Rtempdir/ff1a831d500b8d.ff.

and some warnings. On the original computer, this temporary file is
deleted each time I exit R, so my expectation is that the data is actually
stored in the .ffData and .RData files that I have here. But maybe I don't
really understand the underpinnings of ff. On this computer, my username is
different, so user xyz does not exist. I tried changing the option
Rtempdir, but no luck.

In addition, when I open the .ffData file to see what is inside, there is
only a path to the ff1a831... temporary file. As information about the ff
package on the internet is rather scarce, could anyone please help me
understand this, and if possible recover my data?

Thank you!

Djordje


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] load ff object in a different computer

2013-05-26 Thread Djordje Bajic
*SOLVED*

Thanks Milan, I have received some feedback off-list and managed to solve
the issue.

I had saved the object as follows:

x.Arr <- ff(NA, dim = rep(ngen, 3), vmode = "double")
ffsave(x.Arr, file = "x.Arr")
finalizer(x.Arr) <- "delete"

The problem was related to the rootpath argument. As the path on one
computer does not exist on the other, the solution was to set it when
ffloading:

ffload(file = "/path/to/saved/x.Arr",
       rootpath = "/path/on/your/other/computer/where/to/extract/x.Arr")
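For reference, a minimal end-to-end sketch of the round trip, with toy
dimensions and placeholder /tmp paths (not the actual paths from my setup),
assuming the ff package is installed:

```r
library(ff)

## On the original machine: create and save a small ff array
ngen <- 10                               # toy stand-in for the real size
x.Arr <- ff(NA, dim = rep(ngen, 3), vmode = "double")
ffsave(x.Arr, file = "/tmp/x.Arr")       # writes x.Arr.ffData + x.Arr.RData

## On the other machine: extract under a directory you can write to,
## instead of the (nonexistent) home directory recorded at save time
ffload(file = "/tmp/x.Arr",
       rootpath = "/tmp/ff-extract")
```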

Cheers,

Djordje



2013/5/26 Milan Bouchet-Valat nalimi...@club.fr

 On Sunday, 26 May 2013 at 13:53 +0200, Djordje Bajic wrote:
  Hi all,
 
  I am having trouble loading an ff object previously saved on a different
  computer. I have both files, .ffData and .RData; the first of them is
  13 MB, so I know the data is in there. But when I try to ffload it,
 
  checkdir error:  cannot create /home/_myUser_
   Permission denied
   unable to process
  home/_myUser_/Rtempdir/ff1a831d500b8d.ff.
 
  and some warnings. On the original computer, this temporary file is
  deleted each time I exit R, so my expectation is that the data is actually
  stored in the .ffData and .RData files that I have here. But maybe I don't
  really understand the underpinnings of ff. On this computer, my username is
  different, so user xyz does not exist. I tried changing the option
  Rtempdir, but no luck.
 
  In addition, when I open the .ffData file to see what is inside, there is
  only a path to the ff1a831... temporary file. As information about the ff
  package on the internet is rather scarce, could anyone please help me
  understand this, and if possible recover my data?
 Please tell us exactly how you saved that ff object. You should try to
 reproduce the problem with very simple data you post in your message
 using dput(), and provide us with all the code and the errors it
 triggers.


 Regards

  Thank you!
 
  Djordje
 
 






Re: [R] Merging data tables

2012-12-28 Thread Djordje Bajic
See ?merge. For your data frames, the following should work:

merge(bat_activity, weather, by = c("date", "time"), all = TRUE)
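As a concrete sketch with toy rows made up from the samples below (note that
in the posted tables one file uses Date/Time and the other date/time; the
column names must match exactly, including case, before merging):

```r
## Toy versions of the two tables
bat_activity <- data.frame(
  date   = c("6/3/2011", "6/3/2011"),
  time   = c("10:01", "10:08"),
  Label  = "Tadbra",
  Number = c(2, 1)
)
weather <- data.frame(
  date  = c("6/3/2011", "6/3/2011", "6/3/2011"),
  time  = c("10:01", "10:05", "10:08"),
  mph_a = c(2.5, 2.0, 1.9)
)

## Outer join: keeps weather rows with no bat activity (Number becomes NA)
m <- merge(bat_activity, weather, by = c("date", "time"), all = TRUE)
nrow(m)  # 3 rows: 10:01 and 10:08 matched, 10:05 is weather-only
```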


2012/12/28 Neotropical bat risk assessments neotropical.b...@gmail.com

 Hi all,

 I am trying to merge several data sets to end up with a long data
 format by date and time so I can run correlations and plots. I am using
 Deducer as an R GUI but can just use the R console if easier.

 The data sets are weather (wind speed, relative humidity and temperatures
 by date and minute) and bat activity (date, time, label, and an activity
 index number). The bat activity covers only the nocturnal time frames,
 while the weather data was recorded for 24 hours, so there is a lot of
 weather data with no related bat activity.

 I have failed so far to achieve what I need, and I have tried plyr and
 reshape2. There are many null rows with no data, or 0 (zero) for wind
 speed. What other tools or steps would be more appropriate, short of
 manually cutting and pasting in Excel for a day or more?


 Data formats are:
 bat activity
 Date      Time   Label   Number
 6/3/2011  10:01  Tadbra  2
 6/3/2011  10:02  Tadbra  4
 6/3/2011  10:08  Tadbra  1
 6/3/2011  10:13  Tadbra  2
 6/3/2011  10:49  Tadbra  2
 6/3/2011  10:51  Tadbra  2
 6/3/2011  10:52  Tadbra  4


 Weather:

 date      time   Temp_I  Temp_E  RH    mph_a  mph_x
 6/3/2011  0:00   15      15.7    30.4  4.4    15.5
 6/3/2011  0:30   15      15.2    31.6  5.7    11.2
 6/3/2011  1:00   15      15.1    31.3  10.3   17.5
 6/3/2011  1:30   14      13.6    44.5  4.7    12.1
 6/3/2011  2:00   12.5    13.2    37.9  2.1    6.5
 6/3/2011  2:30   12.5    13.5    35.3  6.3    10.1
 6/3/2011  3:00   12      12.1    37.7  3      7.4
 6/3/2011  3:30   11.5    11.5    38.7  3.4    6
 6/3/2011  4:00   10      9.9     52.7  1.4    4.2
 6/3/2011  4:30   9.5     9.1     43.2  1.3    3.8
 6/3/2011  5:00   8       8.7     59.2  1.2    3.1
 6/3/2011  5:30   7       8       62.5  1.1    4.2
 6/3/2011  6:00   6       7.8     47.8  0.5    2.2
 6/3/2011  6:30   7.5     11.5    37.5  1.7    3.8
 6/3/2011  7:00   10.5    14      33.1  0.6    2.2
 6/3/2011  7:30   14      17.3    32.1  1.6    3.6
 6/3/2011  8:00   17.5    20.3    23.9  0.4    2
 6/3/2011  8:30   21.5    22.8    20.7  0.4    1.8
 6/3/2011  9:00   24.5    24.9    14.1  0.3    2.2
 6/3/2011  9:30   26      26.9    20.3  1.7    5.6
 6/3/2011  10:00  27.5    27.4    20.7  2.5    6.5
 6/3/2011  10:30  28.5    29.8    10    1.6    4.2
 6/3/2011  11:00  29.5    29.4    9.8   1.8    4
 6/3/2011  11:30  30      29.9    9.2   2.7    7.8


 Thanks for any suggestions,

 Bruce







Re: [R] Read File for Matrix with rownames

2012-03-23 Thread Djordje Bajic
First problem: the blank field at the start of the first line. Try
removing it, so that the file looks like this:

1,2,3,4
1,484,43,67,54
2,54,35,67,34
3,69,76,78,55
4,67,86,44,34

Second: your column and row names are numeric. R prefixes the column names
with an X, but it recognizes the row names and sets them correctly. To see
it, try:

test.csv:

23,3,33,31
25,484,43,67,54
54,54,35,67,34
43,69,76,78,55
34,67,86,44,34

test <- read.table("test.csv", sep = ",", header = TRUE)

Then you can remove the X from the column names:

colnames(test) <- gsub("X", "", colnames(test))
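An alternative sketch that skips the gsub() step entirely: when the header
row has one fewer field than the data rows, read.csv takes the first column
as row names automatically, and check.names = FALSE keeps the numeric column
names untouched (tempfile() is used here only to make the example
self-contained):

```r
## Write the test.csv content from above to a temporary file
tmp <- tempfile(fileext = ".csv")
writeLines(c("23,3,33,31",
             "25,484,43,67,54",
             "54,54,35,67,34"), tmp)

## Header is one field short, so column 1 becomes the row names;
## check.names = FALSE prevents the X prefix on numeric column names
A <- as.matrix(read.csv(tmp, header = TRUE, check.names = FALSE))
colnames(A)   # "23" "3" "33" "31"
rownames(A)   # "25" "54"
A["25", "3"]  # 43
```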




2012/3/23 MSousa ricardosousa2...@clix.pt

 Good morning,

I'm trying to read the file into an array, with the following code.

 A <- as.matrix(read.csv("~/Desktop/Results/Cfile.csv", header = FALSE,
 sep = ","))

 The content of the file
,1,2,3,4
 1, 484,43,67,54
 2,54,35,67,34
 3,69,76,78,55
 4,67,86,44,34

 What I need is for the first line to be the column names and the first
 column to be the row names.

 Thanks


 --
 View this message in context:
 http://r.789695.n4.nabble.com/Read-File-for-Matrix-with-rownames-tp4498280p4498280.html
 Sent from the R help mailing list archive at Nabble.com.






Re: [R] best option for big 3D arrays?

2012-02-27 Thread Djordje Bajic
Steven, sorry for the delay in responding,

I have been investigating this also and here is the way I do it (though
probably not the best way):

# .. define a 3D array
ngen <- 904
gratios <- ff(NA, dim = rep(ngen, 3), vmode = "double")

# .. fill the array with standard R functions

ffsave(gratios, file = "mydir/myfile")   # without extension
finalizer(gratios) <- "delete"

# ..

So, you first define the ff object, you put the data inside, and you
ffsave it. The ffsave function generates two files, with extensions
.ffData and .RData. Then you set 'delete' as the 'finalizer' of the
object; this way you prevent ff from keeping the backing file in some tmp
dir and occupying disk space forever. Then you can access your object in
the next R session:

ffload("mydir/myfile")    # also without extension
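Putting the pieces together, the whole cycle might look like this (toy
dimensions and a placeholder /tmp path for illustration; assumes the ff
package is installed and the target directory is writable):

```r
library(ff)

dims <- rep(5, 3)                        # toy stand-in for rep(904, 3)
gratios <- ff(0, dim = dims, vmode = "double")
gratios[1, 1, 1] <- 42                   # fill with standard R indexing

ffsave(gratios, file = "/tmp/gratios")   # gratios.ffData + gratios.RData
finalizer(gratios) <- "delete"           # backing file removed on gc
rm(gratios); gc()

ffload("/tmp/gratios")                   # restores the object by name
gratios[1, 1, 1]                         # 42
```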

I hope this helped.

Cheers,

djordje



2012/2/23 steven mosher mosherste...@gmail.com

 Did you have to use a particular filename or extension?

 I created a similar file but then could not read it back in

 Steve

 On Mon, Feb 13, 2012 at 6:45 AM, Djordje Bajic je.li@gmail.com wrote:

 I've been investigating and I partially respond myself. I tried the
 packages 'bigmemory' and 'ff' and for me the latter did the work I need
 pretty straightforward. I create the array in filebacked form with the
 function ff, and it seems that the usual R indexing works well. I have yet
 to see the limitations, but I hope it helps.

 a foo example:

 myArr <- ff(NA, dim = rep(904, 3), filename = "arr.ffd", vmode = "double")
 myMat <- matrix(1:904^2, ncol = 904)
 for (i in 1:904) {
   myArr[, , i] <- myMat
 }

 Thanks all,

 2012/2/11 Duncan Murdoch murdoch.dun...@gmail.com

  On 12-02-10 9:12 AM, Djordje Bajic wrote:
 
  Hi all,
 
   I am trying to fill a 904x904x904 array, but at some point of the loop R
   states that the 5.5 GB vector is too big to allocate. I have looked at
   packages such as bigmemory, but I need help to decide which is the best
   way to store such an object. It would be perfect to store it in this cube
   form (for indexing and computation purposes). If not possible, maybe the
   best is to store the 904 matrices separately and read them individually
   when needed?
  
   I have never dealt with such a big dataset, so any help will be
   appreciated.
  
   (R+ESS, Debian 64bit, 4Gb RAM, 4core)
 
 
  I'd really recommend getting more RAM, so you can have the whole thing
  loaded in memory.  16 Gb would be nice, but even 8Gb should make a
  substantial difference.  It's going to be too big to store as an array
  since arrays have a limit of 2^31-1 entries, but you could store it as a
  list of matrices, e.g.
 
   x <- vector("list", 904)
   for (i in 1:904)
     x[[i]] <- matrix(0, 904, 904)
 
  and then refer to entry i,j,k as x[[i]][j,k].
 
  Duncan Murdoch
 
 
 









Re: [R] best option for big 3D arrays?

2012-02-13 Thread Djordje Bajic
I've been investigating and I partially respond myself. I tried the
packages 'bigmemory' and 'ff' and for me the latter did the work I need
pretty straightforward. I create the array in filebacked form with the
function ff, and it seems that the usual R indexing works well. I have yet
to see the limitations, but I hope it helps.

a foo example:

myArr <- ff(NA, dim = rep(904, 3), filename = "arr.ffd", vmode = "double")
myMat <- matrix(1:904^2, ncol = 904)
for (i in 1:904) {
  myArr[, , i] <- myMat
}

Thanks all,

2012/2/11 Duncan Murdoch murdoch.dun...@gmail.com

 On 12-02-10 9:12 AM, Djordje Bajic wrote:

 Hi all,

 I am trying to fill a 904x904x904 array, but at some point of the loop R
 states that the 5.5 GB vector is too big to allocate. I have looked at
 packages such as bigmemory, but I need help to decide which is the best
 way to store such an object. It would be perfect to store it in this cube
 form (for indexing and computation purposes). If not possible, maybe the
 best is to store the 904 matrices separately and read them individually
 when needed?

 I have never dealt with such a big dataset, so any help will be appreciated.

 (R+ESS, Debian 64bit, 4Gb RAM, 4core)


 I'd really recommend getting more RAM, so you can have the whole thing
 loaded in memory.  16 Gb would be nice, but even 8Gb should make a
 substantial difference.  It's going to be too big to store as an array
 since arrays have a limit of 2^31-1 entries, but you could store it as a
 list of matrices, e.g.

 x <- vector("list", 904)
 for (i in 1:904)
   x[[i]] <- matrix(0, 904, 904)

 and then refer to entry i,j,k as x[[i]][j,k].

 Duncan Murdoch







[R] best option for big 3D arrays?

2012-02-10 Thread Djordje Bajic
Hi all,

I am trying to fill a 904x904x904 array, but at some point of the loop R
states that the 5.5 GB vector is too big to allocate. I have looked at
packages such as bigmemory, but I need help to decide which is the best
way to store such an object. It would be perfect to store it in this cube
form (for indexing and computation purposes). If not possible, maybe the
best is to store the 904 matrices separately and read them individually
when needed?

I have never dealt with such a big dataset, so any help will be appreciated.

(R+ESS, Debian 64bit, 4Gb RAM, 4core)

