You are very welcome, it is the work of Paul Ramsey though ;-)
Cheers,
Rémi-C
2014-12-08 0:40 GMT+01:00 George Merticariu <[email protected]>:

> Hello Remi,
>
> Your suggestion with point clouds worked perfectly.
>
> Thank you!
>
> Best regards,
> George
>
> On Wed, Dec 3, 2014 at 1:21 PM, Rémi Cura <[email protected]> wrote:
>
>> Hey,
>> I'm afraid you haven't given the most important information,
>> which makes it impossible to choose between a value per row or a file per row!
>> So I discuss both.
>>
>> First hypothesis: you will have few files at a time,
>> - the number of values is not very high (a few million)
>> - you want to get parts of the data based on index (1D, 2D, 3D) or maybe value
>>
>> You store each value of the file in its own row.
>> Thus a file with a char array of size N yields a table of N rows:
>>
>> CREATE TABLE one_value_per_row (
>>   gid int,  -- contains 1, 2, ..., N, ordered by x, y, z
>>   X int,
>>   Y int,
>>   Z int,
>>   value smallint  -- or something like that
>> );
>>
>> If you want your points ordered by the original array index:
>>
>> SELECT *
>> FROM one_value_per_row
>> WHERE gid BETWEEN 2 AND 45;
>>
>> More interestingly, you can select a 1D/2D/3D line/square/cube like this:
>>
>> SELECT *
>> FROM one_value_per_row
>> WHERE X BETWEEN 2 AND 3
>>   AND Y BETWEEN 56 AND 89
>>   AND Z BETWEEN 21 AND 42;
>>
>> Of course, for this kind of query to be efficient you need some indexes:
>>
>> CREATE INDEX ON one_value_per_row (gid);  -- the same for all columns
>>
>> This approach __won't__ scale.
>>
>> Second hypothesis: you have a great many files (in the hundreds or more).
>> *You store each file in one row.*
>> Now you have at least 3 options.
>> * You store the data as a postgres type; the simplest is a postgres array,
>>   so each row would contain a smallint[] (for instance).
>>   This may not perform well with millions of values.
>>   You access your data like you would a C char[] (e.g. value[35]).
>>   I think it is a pretty bad idea.
>>
>> * You ask the database to store the original binary, and you create
>>   C/Python functions in the database to access the data
>>   (like get_values(Xmin=1, XMax=3, YMin=56, YMax=89, Zmin=21, ZMax=42)).
>>   This is the same approach as the pg_pointcloud project
>>   <https://github.com/pgpointcloud/pointcloud>.
>>   For instance, your data could be stored as a 3D numpy array; slicing
>>   it would then be efficient and very easy (one line).
>>
>> * You don't want to redevelop, and use pg_pointcloud out of the box:
>>   you consider each value of your original array to be a 3D point.
>>   Then each file is a point cloud stored in one row of the database.
>>   You define each value of your initial array as a point, with X, Y, Z
>>   being the 3D coordinates; you store the value in an attribute. You can
>>   also store the original array index.
>>   This is the easiest and most efficient out-of-the-box solution
>>   (see this introduction presentation
>>   <http://boundlessgeo.com/wp-content/uploads/2013/10/pgpointcloud-foss4-2013.pdf>
>>   and this simple-to-complex presentation
>>   <https://github.com/Remi-C/Postgres_Day_2014_10_RemiC/raw/master/presentation/A%20PostgreSQL%20Server%20for%20Point%20Cloud%20Storage%20and%20Processing.pdf>).
>>
>> You would get your data like this:
>>
>> SELECT pc_get(pt, 'X'), pc_get(pt, 'Y'), pc_get(pt, 'Z'), pc_get(pt, 'value')
>> FROM one_file_per_row,
>>   PC_FilterBetween(
>>     PC_FilterBetween(
>>       PC_FilterBetween(point_cloud, 'X', 2, 3),
>>     'Y', 56, 89),
>>   'Z', 21, 42) AS patch,
>>   pc_explode(patch) AS pt
>> WHERE file_name = ....;
>>
>> Choosing a solution depends on a lot of factors, weighing performance,
>> storage, usage, ease of creation, ease of maintenance, etc.
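A minimal sketch of the second option above (store the original binary, slice it with a server-side Python function). The function name `get_values`, the C-ordered byte layout, and the inclusive bounds are assumptions for illustration only, not pg_pointcloud's actual API:

```python
import numpy as np

def get_values(raw, shape, xmin, xmax, ymin, ymax, zmin, zmax):
    """Reinterpret a stored binary file as a 3D cube and return a sub-domain.

    Assumes the file is a C-ordered (row-major) dump of unsigned bytes,
    and that bounds are inclusive, like SQL's BETWEEN.
    """
    cube = np.frombuffer(raw, dtype=np.uint8).reshape(shape)
    # Slicing is the promised "one line": numpy returns a view, no copy.
    return cube[xmin:xmax + 1, ymin:ymax + 1, zmin:zmax + 1]

# Toy cube: 4x4x4 = 64 bytes holding the values 0..63.
raw = bytes(range(64))
section = get_values(raw, (4, 4, 4), 2, 3, 1, 2, 0, 3)
print(section.shape)  # (2, 2, 4)
```

Wrapped in a PL/Python function, this is essentially what the "store the binary" option would run on each request.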
>>
>> Cheers,
>> Rémi-C
>>
>> 2014-12-02 21:07 GMT+01:00 George Merticariu <[email protected]>:
>>
>>> Hello!
>>>
>>> Thank you for your answer!
>>>
>>> Please find below more information:
>>>
>>> The type of data I will insert is a 1D array of values between 0 and 255.
>>> This array should be mapped to different dimensions:
>>> 1D - grey string
>>> 2D - grey image
>>> 3D - grey cube
>>>
>>> There will be a lot of reads from the arrays and only one write, at
>>> insert. The size of the array will be kept constant and is given by the
>>> file size.
>>>
>>> Is there a way to directly import the 1D binary file into arrays of
>>> higher dimensions, or do I need to convert it first and then import it?
>>>
>>> Best,
>>> George
>>>
>>> On Tue, Nov 25, 2014 at 5:56 PM, Rémi Cura <[email protected]> wrote:
>>>
>>>> Hey,
>>>> you don't even need PostGIS for this; you could store the cube one cell
>>>> per row, with x, y, z numbers and indexes on them, then query like:
>>>>
>>>> SELECT *
>>>> FROM my_cube_table
>>>> WHERE x BETWEEN 100 AND 200
>>>>   AND y BETWEEN 23 AND 45
>>>>   AND z BETWEEN 45 AND 67;
>>>>
>>>> You could also use the postgres range types.
>>>> Of course you could store cubes as meshes and use pure 3D functions
>>>> (PostGIS), etc.
>>>>
>>>> It is very hard to answer you if you don't explain what you want to do
>>>> with these cubes, how many you will have, whether you read or write
>>>> them more, what kind of data they contain, whether the size of each
>>>> cube may change, whether you want to convert this to geo types...
>>>>
>>>> Cheers,
>>>> Rémi-C
>>>>
>>>> 2014-11-25 17:32 GMT+01:00 George Merticariu <[email protected]>:
>>>>
>>>>> Hello!
>>>>>
>>>>> I want to use PostGIS for handling 3D cubes but I couldn't figure out
>>>>> how to do it from the manual.
>>>>>
>>>>> The main tasks I want to accomplish are:
>>>>>
>>>>> 1. Import a 1D char array file (grey cube) into a 3D cube.
>>>>> 2. Retrieve sections from the cube, where a section is defined by a
>>>>> domain.
>>>>>
>>>>> Example:
>>>>>
>>>>> Given a file of 1024*1024*1024 bytes, I want to import it into a cube
>>>>> with the domain [0:1023, 0:1023, 0:1023]. Then, select sub-domains
>>>>> (examples):
>>>>>
>>>>> - [100:200, 100:200, 0:100]
>>>>> - [0:1, 0:1023, 0:1023]
>>>>>
>>>>> Is this possible using PostGIS? If yes, are there any detailed
>>>>> tutorials which explain how to do that?
>>>>>
>>>>> Thank you!
>>>>>
>>>>> Best regards,
>>>>> George Merticariu
>>>>>
>>>>> _______________________________________________
>>>>> postgis-users mailing list
>>>>> [email protected]
>>>>> http://lists.osgeo.org/cgi-bin/mailman/listinfo/postgis-users
>>>
>>> --
>>> George Merticariu
>>>
>>> Jacobs University Bremen
>>> B.Sc. Computer Science
>>> Class of 2014
>>>
>>> [email protected]
>>> [email protected]
>
> --
> George Merticariu
>
> Jacobs University Bremen
> M.Sc. Computer Science
> Class of 2016
>
> [email protected]
> [email protected]
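The 1D-file-to-3D-cube mapping George asks about is plain index arithmetic, which is also what a one-value-per-row table would store as (gid, X, Y, Z). A small sketch with numpy; the row-major byte order and the tiny edge length are assumptions for illustration (the real file is 1024^3 bytes):

```python
import numpy as np

N = 4  # cube edge length; George's real cube is 1024, kept tiny here

def gid_from_xyz(x, y, z, n=N):
    # Row-major (C-order) linearisation: consecutive z values are adjacent.
    return x * n * n + y * n + z

def xyz_from_gid(gid, n=N):
    x, rest = divmod(gid, n * n)
    y, z = divmod(rest, n)
    return x, y, z

# Cross-check against numpy's own C-ordered reshape of the "file".
flat = np.arange(N ** 3, dtype=np.uint8)  # stands in for the raw 1D file
cube = flat.reshape(N, N, N)              # the [0:N-1, 0:N-1, 0:N-1] domain

assert int(cube[3, 1, 2]) == gid_from_xyz(3, 1, 2)
assert xyz_from_gid(int(cube[3, 1, 2])) == (3, 1, 2)

# A sub-domain such as [1:3, 0:2, 2:3] is then a plain (inclusive) slice:
sub = cube[1:4, 0:3, 2:4]
print(sub.shape)  # (3, 3, 2)
```

If the original file were dumped in a different order (e.g. Fortran order), the linearisation formula and the `reshape` order argument would both have to change to match.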
