Re: [R] Help on Aggregation

2018-03-12 Thread PIKAL Petr
, "CET", "CET", "CET", "CET", "CET", "CET", "CET",
"CET", "CET", "CET", "CET", "CET", "CET", "CET", "CET", "CET",
"CET", "CET", "CET", "CET", "CET", "CET", "CET", "CET", ""
), gmtoff = c(NA_integer_, NA_integer_, NA_integer_, NA_integer_,
NA_integer_, NA_integer_, NA_integer_, NA_integer_, NA_integer_,
NA_integer_, NA_integer_, NA_integer_, NA_integer_, NA_integer_,
NA_integer_, NA_integer_, NA_integer_, NA_integer_, NA_integer_,
NA_integer_, NA_integer_, NA_integer_, NA_integer_, NA_integer_,
NA_integer_, NA_integer_, NA_integer_, NA_integer_, NA_integer_,
NA_integer_, NA_integer_, NA_integer_, NA_integer_, NA_integer_,
NA_integer_, NA_integer_, NA_integer_, NA_integer_, NA_integer_,
NA_integer_, NA_integer_)), class = c("POSIXlt", "POSIXt"
))), row.names = c(NA, -41L), class = "data.frame")
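A minimal sketch of one way to get daily averages from data like the sample quoted below (an illustration only, assuming the data have been read into a data frame called met; not necessarily what the missing part of this reply contained):

# toy version of a few rows of the posted data; in practice read the real file
met <- read.table(text = "
Neph_no Date Time Temp_C Pressure_kPa RH
9 2014/03/28 10:00:00 38.4 95.9 29.7
9 2014/03/28 11:00:00 37.8 95.8 29.2
9 2014/03/29 00:00:00 27.0 96.0 80.6
9 2014/03/29 01:00:00 27.1 96.0 80.4
", header = TRUE, stringsAsFactors = FALSE)

# convert the character date to Date class so rows group by calendar day
met$Date <- as.Date(met$Date, format = "%Y/%m/%d")

# one row per instrument and day, with the mean of each measurement column
daily <- aggregate(cbind(Temp_C, Pressure_kPa, RH) ~ Neph_no + Date,
                   data = met, FUN = mean)
daily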
>


> -Original Message-
> From: R-help [mailto:r-help-boun...@r-project.org] On Behalf Of Emeka Don
> Sent: Monday, March 12, 2018 12:18 PM
> To: r-help@r-project.org
> Subject: [R] Help on Aggregation
>
> Dear All,
> 1. I have been trying to aggregate my data, but I have not been getting it
> right. I want to convert the hourly data into daily averages.
> Here is a sample of the data:
> Neph_no Date Time Temp_C   Pressure_kPa RH
> 9   2014/03/28  10:00:00 38.4 95.9 29.7
> 9   2014/03/28  11:00:00 37.8 95.8 29.2
> 9   2014/03/28  12:00:00 36.7 95.8 35.1
> 9   2014/03/28  13:00:00 35.4 95.8 38.9
> 9   2014/03/28  14:00:00 34.1 95.8 44
> 9   2014/03/28  15:00:00 32.7 95.9 52.9
> 9   2014/03/28  16:00:00 31.8 96 55.1
> 9   2014/03/28  17:00:00 31.2 96 57.8
> 9   2014/03/28  18:00:00 30.6 96.1 62.1
> 9   2014/03/28  19:00:00 29.8 96.1 68.1
> 9   2014/03/28  20:00:00 29.1 96.1 69.5
> 9   2014/03/28  21:00:00 28.6 96 68.3
> 9   2014/03/28  22:00:00 28 96 71.9
> 9   2014/03/28  23:00:00 27.4 95.9 76.4
> 9   2014/03/29  00:00:00 27 96 80.6
> 9   2014/03/29  01:00:00 27.1 96 80.4
> 9   2014/03/29  02:00:00 27 96.1 80.5
> 9   2014/03/29  03:00:00 27.8 96.1 78.1
> 9   2014/03/29  04:00:00 30.4 96.2 66.8
> 9   2014/03/29  05:00:00 33.7 96.3 54
> 9   2014/03/29  06:00:00 36.3 96.3 45
> 9   2014/03/29  07:00:00 37.7 96.3 38.8
> 9   2014/03/29  08:00:00 38.7 96.3 34.7
> 9   2014/03/29  09:00:00 38.9 96.1 32.6
> 9   2014/03/29  10:00:00 39.4 96 30.2
> 9   2014/03/29  11:00:00 38.9 95.9 31.5
> 9   2014/03/29  12:00:00 38.2 95.8 33.8
> 9   2014/03/29  13:00:00 37.4 95.8 35.4
> 9   2014/03/29  14:00:00 35.8 95.9 39.5
> 9   2014/03/29  15:00:00 33.9 96 46.9
> 9   2014/03/29  16:00:00 31.4 96.1 59.8
> 9   2014/03/29  17:00:00 29.4 96.3 72.9
> 9   2014/03/29  18:00:00 29 96.3 69.6
> 9   2014/03/29  19:00:00 26.5 96.2 81.7
> 9   2014/03/29  20:00:00 27 96.1 82.9
> 9   2014/03/29  21:00:00 27.5 96.1 81.7
> 9   2014/03/29  22:00:00 27.4 96 82.6
> 9   2014/03/29  23:00:00 27.3 96 83.1
> 9   2014/03/30  00:00:00 27.1 96 84.5
> 9   2014/03/30  01:00:00 27.6 96.1 81.8
> 9   2014/03/30  02:00:00 27.8 96.1 81
>
> 2. I have a data set of about 6100 rows and 6 columns, but whenever I read it
> into R the header and a large chunk of the data are dropped (i.e. only rows
> 3550-6099 are imported). This is the code I used:
> met_data = read.csv(file.choose(), header = TRUE)
> How do I read the entire data set into R without losing any rows?
> Thank you.
>
> --
> Onyeuwaoma Nnaemeka Dom
>



[R] Help on Aggregation

2018-03-12 Thread Emeka Don
Dear All,
1. I have been trying to aggregate my data, but I have not been getting it
right. I want to convert the hourly data into daily averages.
Here is a sample of the data:
Neph_no Date Time Temp_C   Pressure_kPa RH
9   2014/03/28  10:00:00 38.4 95.9 29.7
9   2014/03/28  11:00:00 37.8 95.8 29.2
9   2014/03/28  12:00:00 36.7 95.8 35.1
9   2014/03/28  13:00:00 35.4 95.8 38.9
9   2014/03/28  14:00:00 34.1 95.8 44
9   2014/03/28  15:00:00 32.7 95.9 52.9
9   2014/03/28  16:00:00 31.8 96 55.1
9   2014/03/28  17:00:00 31.2 96 57.8
9   2014/03/28  18:00:00 30.6 96.1 62.1
9   2014/03/28  19:00:00 29.8 96.1 68.1
9   2014/03/28  20:00:00 29.1 96.1 69.5
9   2014/03/28  21:00:00 28.6 96 68.3
9   2014/03/28  22:00:00 28 96 71.9
9   2014/03/28  23:00:00 27.4 95.9 76.4
9   2014/03/29  00:00:00 27 96 80.6
9   2014/03/29  01:00:00 27.1 96 80.4
9   2014/03/29  02:00:00 27 96.1 80.5
9   2014/03/29  03:00:00 27.8 96.1 78.1
9   2014/03/29  04:00:00 30.4 96.2 66.8
9   2014/03/29  05:00:00 33.7 96.3 54
9   2014/03/29  06:00:00 36.3 96.3 45
9   2014/03/29  07:00:00 37.7 96.3 38.8
9   2014/03/29  08:00:00 38.7 96.3 34.7
9   2014/03/29  09:00:00 38.9 96.1 32.6
9   2014/03/29  10:00:00 39.4 96 30.2
9   2014/03/29  11:00:00 38.9 95.9 31.5
9   2014/03/29  12:00:00 38.2 95.8 33.8
9   2014/03/29  13:00:00 37.4 95.8 35.4
9   2014/03/29  14:00:00 35.8 95.9 39.5
9   2014/03/29  15:00:00 33.9 96 46.9
9   2014/03/29  16:00:00 31.4 96.1 59.8
9   2014/03/29  17:00:00 29.4 96.3 72.9
9   2014/03/29  18:00:00 29 96.3 69.6
9   2014/03/29  19:00:00 26.5 96.2 81.7
9   2014/03/29  20:00:00 27 96.1 82.9
9   2014/03/29  21:00:00 27.5 96.1 81.7
9   2014/03/29  22:00:00 27.4 96 82.6
9   2014/03/29  23:00:00 27.3 96 83.1
9   2014/03/30  00:00:00 27.1 96 84.5
9   2014/03/30  01:00:00 27.6 96.1 81.8
9   2014/03/30  02:00:00 27.8 96.1 81

2. I have a data set of about 6100 rows and 6 columns, but whenever I read it
into R the header and a large chunk of the data are dropped (i.e. only rows
3550-6099 are imported). This is the code I used:
met_data = read.csv(file.choose(), header = TRUE)
How do I read the entire data set into R without losing any rows? (One way to
check the file is sketched below.)
Thank you.
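
One way to see what read.csv is actually doing with the file (assuming it has been saved as met.csv; the file name here is made up) is to compare the raw line count with what arrives:

# how many lines are in the raw file?
length(readLines("met.csv"))

# how many comma-separated fields does each line appear to have?
# a mix of counts usually points to unbalanced quotes or stray separators
table(count.fields("met.csv", sep = ","))

# turning off quote handling often recovers the "missing" rows
met_data <- read.csv("met.csv", header = TRUE, quote = "")
dim(met_data)   # should report roughly 6100 rows and 6 columns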

-- 
Onyeuwaoma Nnaemeka Dom


__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Help with aggregation

2008-10-02 Thread Michael Pearmain
Hi All,
I seem to be having a few troubles aggregating data back onto the data frame.
I want to take the maximum value for each user and then apply that maximum
back against all ids that match (i.e. a one-to-many match).
Can anyone offer any advice? Is there a better way of doing this?
Dummy data and code are listed below:

dataset is called Mcookie

user_id  c_we_conversion
1 1
1 0
1 0
2 1
2 1
3 0
3 0

new data

user_id  c_we_conversion  c_we_converter
1 1  1
1 0  1
1 0  1
2 1  1
2 1  1
3 0  0
3 0  0

library(Hmisc)
# per-user maximum of c_we_conversion
myAgg <- summarize(Mcookie$c_we_conversion, by = Mcookie$user_id, FUN = max,
                   na.rm = TRUE)
names(myAgg) <- c("user_id", "c_we_converter")
# merge the per-user maximum back onto every matching row
Mcookie <- merge(Mcookie, myAgg, by.x = "user_id", by.y = "user_id")


Thanks in advance,

Mike


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Help with aggregation

2008-10-02 Thread Hans Gardfjell
See ?ave (in package stats)
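
For example, a minimal sketch using the Mcookie data above (assuming the columns are as posted):

# per-user maximum of c_we_conversion, repeated on every row of that user
Mcookie$c_we_converter <- ave(Mcookie$c_we_conversion, Mcookie$user_id,
                              FUN = max)

This avoids the separate summarize()/merge() step, because ave() returns a vector the same length as its input.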

/hans

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Michael Pearmain
Sent: 2 October 2008 15:28
To: r-help@r-project.org
Subject: [R] Help with aggregation

Hi All,
I seem to be having a few troubles aggregating data back onto the data frame.
I want to take the maximum value for each user and then apply that maximum
back against all ids that match (i.e. a one-to-many match).
Can anyone offer any advice? Is there a better way of doing this?
Dummy data and code are listed below:

dataset is called Mcookie

user_id  c_we_conversion
1 1
1 0
1 0
2 1
2 1
3 0
3 0

new data

user_id  c_we_conversion  c_we_converter
1 1  1
1 0  1
1 0  1
2 1  1
2 1  1
3 0  0
3 0  0

library(Hmisc)
myAgg <- summarize(Mcookie$c_we_conversion, by = Mcookie$user_id, FUN = max,
                   na.rm = TRUE)
names(myAgg) <- c("user_id", "c_we_converter")
Mcookie <- merge(Mcookie, myAgg, by.x = "user_id", by.y = "user_id")


Thanks in advance,

Mike


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
