Hi all, I deeply appreciate your invaluable attention in answering my questions (and others' — see the log below). I have read the replies carefully. Here I would like to add something to our discussion:
As a Matlab user, the truth is that automating many processes (data preparation, say) is easy: a few lines of code replace hundreds of button clicks, while still giving control over the computation core and the graphing features. One wants to do the same task using S, R, Python or SAS. Of course, Matlab is the most powerful for this. There are some free tools similar to it, e.g. FreeMat and Octave, but all of them share a weakness: non-interactive graph output. In addition, drawing very large datasets, say as an h-scatter plot, is far too slow, and sometimes crashes. Nowadays, visualization is a science rather than an art. No impressive output, no success, no future! I think developing such a powerful-inside, impressive-outside tool for geostatistics is not too hard, but it is a necessity for the near future. My list was short, but I have tested more software for the two features mentioned above. Handling large datasets is difficult with all the available tools. (Try it yourself if you wish!) Why? Is this going to be the nth mystery of life?! ;) Can this year be a year in which Geostatistics pays more attention to its users? Will Geostatistics celebrate a modern fresh look and a young heart (core) at the end of the year? I am hopeful.

Regards, Younes

Beginning of Log (Shortened) =========================================================

<< Guillaume wrote: "...Vesper. Not very powerful, but the interface is not bad... The command line is not a worry for professionals, so Gstat, GSLIB, R or Matlab do the job perfectly fine. Novice users tend to do their analyses within a GIS package (e.g. the ArcGIS Geostatistical toolbox, etc.). Probably the best tool I have seen so far is the GSTAT interface used within the Idrisi GIS. But you are probably right in saying that there are no user-friendly programs that are inexpensive and that would suit both novice users and professionals."
<< Edzer wrote: "...looking for this free or cheap, all-capable package with a complete, friendly and robust graphical user interface with dynamic graphics. A problem is that such a thing is hard and expensive to develop, and unlikely to arise as a side product of a research project. Look at the worlds of GIS or image analysis: there is a lot of high-quality software out there for free, but the thing you are looking for is very expensive. In your list I missed at least:
9. ArcGIS + Geostatistical Analyst
10. SGeMS, the new Stanford software after GSLIB
11. other packages in R, such as gstat, RandomFields, RSAGA, and so on
12. ... (I hope others will finish this list!)
...I have the impression that I am not alone in thinking that although graphical, interactive exploratory data analysis is a very nice thing to have, a solid data analysis should start from the principle of reproducibility, and therefore depend as little as possible on the reproduction of long sequences of mouse clicks. Are users of this list aware of other communities and/or mailing lists where considerable activity around geostatistics and/or geostatistical software takes place?..."

<< Seth wrote: "I believe that software for statistical analysis of spatial data must recognize that many analyses now run on GIS data covering a large area, and so in a data-rich environment. I have the exact opposite problem from many: too many data points! The point-and-click software I am familiar with that can do geostats (IDRISI, ArcMap, SAM) either handles only vector data and/or only smaller data sets. For instance, a simple semivariogram on a sample of my data took all night to calculate in IDRISI and came back with an error. For simple analyses of large data sets, I have taken to writing my own code in Fortran 90. I can calculate a semivariogram with thousands of points covering half of a US state, with a huge maximum search distance and many bins, in about 30 minutes.
Most spatial autocorrelation stats are simple to write code for...."

<< Pierre wrote: "Your list is far from complete. I would recommend you take a look at the following paper...which provides an overview and comparison of the functionalities of a series of geostat packages, most of them listed on the ai-geostats website."

<< Seth wrote again: "I am wondering which geostatistical software is best at handling very large data sets. With the advent of GIS and remote sensing, having too much data is a problem. Sampling is of course useful, but only up to a point when the study area is large. I have read elsewhere that among the commercial stat packages, SAS is best at handling large data sets. Is this true? Also, I have produced my own little routine in IDRISI that creates 'random' samples clustered by inverse distance, so that short lags are preferred. Are there any software packages that can create a random sample of points showing a pre-specified clustering pattern in space?"

<< JanWMerks wrote: "I read with a great deal of interest your message about Geostatistics in pain. Read what I have found out. Real statistics turned into surreal geostatistics under the guidance of Professor Dr Georges Matheron..."

<< Sebastiano wrote: "In general I agree with the comments in the preceding replies. I would add that the problem, if one exists, does not lie in the lack of a good GUI but perhaps in the lack of a standard, internationally accepted set of programming routines for geostatistical analysis. As a final consideration, I think the world of spatial analysis is more complex than in the past, and a kind of holistic view is needed. For example, I am thinking of other techniques based on statistical learning theory, data mining, etc..."

<< Paul wrote: Being an R user myself (gstat, automap), I would like to comment a little on your problem with command-line tools.
I think what is most important is that each application has its own best tool. When a novice user wants to quickly make some maps, a GUI is probably the preferred tool. But if you, as in my case, want to interpolate thousands of maps, put them on a web service and allow users to get those maps from the web, a GUI tool such as ArcGIS is probably not the best option. R is great for these kinds of large analyses. In addition, I am a Linux user and would not trade my command line for any GUI :). So I believe a GUI is not by definition better than the command line; it is just that people are used to GUIs nowadays, which makes the command line seem old. I agree that it takes quite some time to learn R and that it is not a tool suitable for the casual user. A great combo would be a tool with the flexibility and power of R and the ease of use of, e.g., ArcGIS. Hope you find a solution that suits your particular needs.

End of Log ======================================================

Kind Regards, Younes

+ To post a message to the list, send it to ai-geost...@jrc.ec.europa.eu
+ To unsubscribe, send email to majordomo@jrc.ec.europa.eu with no subject and "unsubscribe ai-geostats" in the message body. DO NOT SEND Subscribe/Unsubscribe requests to the list
+ As a general service to list users, please remember to post a summary of any useful responses to your questions.
+ Support to the forum can be found at http://www.ai-geostats.org/
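To make Seth's brute-force semivariogram calculation concrete, here is a rough sketch in Python/NumPy of the same idea (the function name, the binning scheme and all defaults are my own choices for illustration; Seth's actual Fortran 90 code is not shown on the list):

```python
import numpy as np

def empirical_semivariogram(x, y, z, n_bins=20, max_dist=None):
    """Brute-force empirical semivariogram: for each lag bin h,
    gamma(h) = mean over pairs in the bin of 0.5 * (z_i - z_j)^2."""
    coords = np.column_stack([x, y])
    # all pairwise distances (O(n^2) memory; fine for a few thousand points)
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2   # semivariance per pair
    iu = np.triu_indices(len(z), k=1)           # count each pair once
    d, sq = d[iu], sq[iu]
    if max_dist is None:
        max_dist = d.max()
    edges = np.linspace(0.0, max_dist, n_bins + 1)
    which = np.digitize(d, edges) - 1           # bin index per pair
    lags, gammas = [], []
    for b in range(n_bins):
        mask = which == b
        if mask.any():                          # skip empty bins
            lags.append(d[mask].mean())
            gammas.append(sq[mask].mean())
    return np.array(lags), np.array(gammas)
```

For the data sizes Seth mentions (thousands of points), a chunked loop over pairs would replace the full n-by-n distance matrix, but the binning logic stays the same.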
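On Seth's question about generating random samples with a pre-specified clustering pattern (short lags preferred), one minimal sketch of a possible approach in Python follows; all names and the choice of an exponential radial offset are my own assumptions, not the behaviour of his IDRISI routine or of any existing package:

```python
import numpy as np

def clustered_sample(n_clusters=10, pts_per_cluster=20, extent=100.0,
                     scale=2.0, seed=None):
    """Random 2-D points clustered around random seeds; a small 'scale'
    concentrates points near each seed so short lags dominate."""
    rng = np.random.default_rng(seed)
    seeds = rng.uniform(0.0, extent, size=(n_clusters, 2))
    # exponential radial offsets: most points fall close to their seed
    r = rng.exponential(scale, size=(n_clusters, pts_per_cluster))
    theta = rng.uniform(0.0, 2.0 * np.pi, size=(n_clusters, pts_per_cluster))
    offsets = np.stack([r * np.cos(theta), r * np.sin(theta)], axis=-1)
    pts = (seeds[:, None, :] + offsets).reshape(-1, 2)
    return np.clip(pts, 0.0, extent)       # keep points inside the study area
```

Tuning `scale` relative to `extent` would let one trade off short-lag density against coverage of the study area.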