> Date: Sat, 20 Nov 2010 12:29:55 +0100
> From: matevz.pav...@gi-zrmk.si
> To: r-sig-ecol...@r-project.org; r-sig-...@stat.math.ethz.ch; 
> r-help@r-project.org
> Subject: [R] Zerodist
>
> Hi all,
>
> I got the error
>
> > "chfactor.c", line 130: singular matrix in function LDLfactor()
> > Error in predict.gstat(g, newdata = newdata, block = block, nsim = nsim, :
> >   LDLfactor
>
> probably because there are some point pairs with zero distance in the
> data, which makes the matrix singular. My question is, how can I delete
> these duplicate points from the data set?



Is your question literally about finding duplicates, so you can diagnose
your input file and decide which data points to use, or do you want some
diagnostic of the matrix itself, to find equations that are almost
identical or otherwise make it hard to invert? In the first case, I
would use utilities like bash sort on whichever keys you are worried
about,

sort -k 4 datafile | uniq -c

and see the man pages for the various options (text can contain things
like stray whitespace that keeps otherwise-identical lines from
matching, etc.).
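
Since your subject line says Zerodist: if the points are already in a
SpatialPointsDataFrame, the sp package has zerodist() for exactly this
job. A minimal sketch, assuming your observations live in an object
called pts (hypothetical name):

    library(sp)                 # provides zerodist()
    zd <- zerodist(pts)         # two-column matrix of index pairs at distance zero
    if (nrow(zd) > 0)
        pts <- pts[-zd[, 2], ]  # keep the first point of each duplicate pair

After that, predict.gstat() should no longer hit the LDLfactor error
from exact duplicates (near-duplicates can still make the matrix close
to singular, though).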

I seem to recall, from the few times I've done modelling, that I used
SVD to diagnose the relevant matrices, but I'm not sure whether there
are better options. Almost-singular cases can be hard to relate to a
physical situation, or even to find just by visually inspecting a big
matrix.
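
For that second case, a minimal sketch in base R, assuming K is the
matrix you suspect (hypothetical name, e.g. the covariance matrix of
the kriging system):

    s <- svd(K)$d           # singular values, largest first
    s[1] / s[length(s)]     # condition number: huge or Inf means (near-)singular
    kappa(K)                # base R's quick estimate of the same quantity

A huge ratio means some rows or columns are close to linearly
dependent, which is exactly what duplicate or near-duplicate points
produce.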

>
> Thanks, m
>
______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
