Hi Tim,
IMHO, if I understood your problem correctly, you have so much data:

>Each animal has approximately 432000 data points (once/second for 5 days)

that I don't see any need for "statistical" approaches to compute your
probability surfaces (or home-range probability polygons): you already have an
essentially complete description of the animal's movements.
In my opinion, kernel estimators and the like are most useful when you have FEW
data points and, under some assumptions, can "extend" your information (as in
the case of VHF data).
In your case you instead need to "extract" (or synthesize) the information
already contained in the data.
I don't know your final scientific goal, but if you want a probability surface,
for example, you could create a grid and count how many points fall inside each
cell: that gives you how many seconds (and thus the probability) your animal
spent there. Alternatively, put a small buffer around each point, or something
along those lines.
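
A minimal sketch of the grid-count idea in Python/numpy (the array names and
the cell size are just placeholders; one fix per second is assumed, as in your
data):

import numpy as np

def occupancy_surface(x, y, cell_size):
    """Count fixes per grid cell and normalize to a probability surface."""
    # Grid edges covering the data extent.
    x_edges = np.arange(x.min(), x.max() + cell_size, cell_size)
    y_edges = np.arange(y.min(), y.max() + cell_size, cell_size)
    # With one fix per second, each cell count is the number of seconds
    # spent there, so counts / total is the occupancy probability.
    counts, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges])
    return counts / counts.sum(), x_edges, y_edges

# e.g.: prob, xe, ye = occupancy_surface(x, y, cell_size=25.0)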
I also suggest a spatial database to manage (and partly analyze) such a large
amount of vector data.
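
Just to give the idea, a rough sketch of doing the per-cell count on the
database side (a real setup would use a proper spatial database such as PostGIS
or SpatiaLite; plain SQLite is used here only for illustration, and the table
and column names are made up):

import sqlite3

con = sqlite3.connect("fixes.db")
con.execute("CREATE TABLE IF NOT EXISTS fix (t INTEGER, x REAL, y REAL)")
# bulk load, e.g.: con.executemany("INSERT INTO fix VALUES (?, ?, ?)", rows)

CELL = 25.0  # cell size in map units (assumes positive coordinates)
cursor = con.execute(
    """SELECT CAST(x / ? AS INTEGER) AS col,
              CAST(y / ? AS INTEGER) AS row,
              COUNT(*) AS seconds      -- one fix per second
       FROM fix GROUP BY col, row""",
    (CELL, CELL),
)
for col, row, seconds in cursor:
    print(col, row, seconds)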

bye
ferdinando