Re: [PD] Best way to deal with many tables.

2009-02-22 Thread David Doukhan
2009/2/21 B. Bogart b...@ekran.org:
 Thanks all for your comments.
 David, How fast was your python code to generate the tables? Seems like
 it would be about the same issues as dynamic patching.

The Python code does not generate the tables: it generates a patch.
I.e., suppose you wrote a patch that loads a piece of data into an array:
this is your atomic component.
Then, with the Python script (or any language you like), you generate
a patch containing all those atomic components.
The goal is just to avoid putting a huge number of components in a
patch yourself, since it can be automated.
This is not a dynamic solution: once the patch has been generated, it
won't change until you generate it again.
So the performance (speed, etc.) will be the same
as a non-dynamic solution.
I don't know if this corresponds to what you're looking for; if it seems
to suit your needs, I can try to explain better.
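A minimal sketch of what such a generator could look like (the filename, table names, and counts here are made up for illustration):

```python
# Sketch (hypothetical names/sizes): emit a Pd patch containing many [table]
# objects, one per histogram, so the patch never has to be drawn by hand.
N_TABLES = 100   # e.g. one table per image
SIZE = 768       # 3 x 256 histogram bins

lines = ["#N canvas 0 0 450 300 10;"]
for i in range(N_TABLES):
    # each line declares one [table hist<i> 768] object in the patch file
    lines.append(f"#X obj 10 {10 + 20 * i} table hist{i} {SIZE};")

with open("tables.pd", "w") as f:
    f.write("\n".join(lines) + "\n")
```

Regenerating the file and reopening the patch is the whole "update" cycle, which is why the runtime performance is identical to a hand-built patch.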

-- 
David Doukhan

___
Pd-list@iem.at mailing list
UNSUBSCRIBE and account-management - 
http://lists.puredata.info/listinfo/pd-list


Re: [PD] Best way to deal with many tables.

2009-02-22 Thread marius schebella
cyrille henry wrote:
 the best 2D table are probably images.
 if the 8bits limitation is not a problem, you can store your arrays in 1 (or 
 more) big image (1000x768).

hi,
just curious, are you using [pix_set] for that, or sig2pix? or an external
program?
Because with [pix_set] the range is between 0 and 1, so if I wanted to store
values between 0 and 256 I would just divide by 256? Isn't there a limit on the
length of the message that is passed between objects?
marius.



 pix_crop + pix_pix2sig to get a row of your image in a table.
 
 Cyrille
 




Re: [PD] Best way to deal with many tables.

2009-02-22 Thread cyrille henry


marius schebella wrote:
 cyrille henry wrote:
 the best 2D table are probably images.
 if the 8bits limitation is not a problem, you can store your arrays in 
 1 (or more) big image (1000x768).
 
 hi,
 just curios, are you using [pix_set] for that or sig2pix? or an external 
 program.
[pix_image] to load a static image.
Or you can draw anything in a framebuffer and use [pix_snap].

But this will not replace iemmatrix.

 because with pix set the range is bet 0 and 1 and if I wanted to store 
 values bet 0 and 256 I just divide by 256?
you have to divide by 255, not 256.
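A one-line way to see why (sketch):

```python
# Pixel values run 0..255; dividing by 255 maps that range exactly onto 0..1.
assert 255 / 255 == 1.0    # full intensity maps to exactly 1.0
assert 255 / 256 < 1.0     # dividing by 256 can never reach 1.0
assert 0 / 255 == 0.0      # zero maps to zero either way
```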

cyrille

 isn't there a size limit of 
 length of the message that is passed between objects?
Gem passes pointers...

Cyrille

 marius.
 
 
 


Re: [PD] Best way to deal with many tables.

2009-02-21 Thread Roman Haefeli
On Fri, 2009-02-20 at 20:59 -0800, B. Bogart wrote:
 Hey all.
 
 I've managed to get my patches to use less objects, and more messages.
 
 Problem I have now is storing data in an organized way.
 
 Basically the system I'm working on needs to store the RGB hists of many
 images (10,000 ideally, RAM permitting). RGB hists are concatenated into
 tables of 768 elements each.
 
 What is the best way to deal with this number of tables? There are the
 usual thoughts of using dynamic patching and such, but really I'd like a
 more elegant solution.

what is not elegant about dynamic patching? I find the concept of
dynamic patching actually quite elegant (and I am using it for exactly this
kind of problem), but the inelegant part is the fact that it is not
officially supported yet.
 
 Has anyone worked on something like a multi-table or nested table?

you might want to try GridFlow. There you can have 'tables' with n
dimensions. IIRC, this approach would save you some memory, since
GridFlow lets you set the number type.

 I could put everything in one giant table, but each chunk needs to be a
 list in the end and it seems to be iterating over a section of the table
 to dump it as a list would be a lot slower than using [tabdump].

I think this is the least favorable approach. You might trigger some
problems with floating-point precision, depending on your table size.
Also, I haven't found a fast way to transfer a section of a table into a
list.
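For what it's worth, the precision worry can be made concrete: Pd table indices are single-precision floats, which only represent integers exactly up to 2^24 (a sketch, assuming nothing about the actual table sizes in use):

```python
import numpy as np

# float32 represents every integer exactly only up to 2**24 = 16,777,216.
limit = 2 ** 24
assert int(np.float32(limit)) == limit          # still exact
assert int(np.float32(limit + 1)) == limit      # 16_777_217 rounds back down

# 10,000 histograms x 768 bins = 7,680,000 indices: below the limit,
# but a bigger SOM or longer histograms would get uncomfortably close.
assert 10_000 * 768 < limit
```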

 Just wondering if anyone has any suggestions.
 
 I've already mentioned my wish to have a generic storage system (similar
 to data-structures but independent of any graphical representation) namely:
 
 tables of floats (done), tables of symbols, and most importantly tables
 of tables!

yeah, and native nested-list support ('native' != 'implementing it with
some whacky delimiter symbol or a prepended number of elements')

roman









Re: [PD] Best way to deal with many tables.

2009-02-21 Thread cyrille henry
hello,

Your RGB hists are 1D tables,
so what you need is a 2D table.
The best 2D tables are probably images.
If the 8-bit limitation is not a problem, you can store your arrays in 1 (or
more) big image (1000x768).
pix_crop + pix_pix2sig to get a row of your image in a table.

Cyrille




Re: [PD] Best way to deal with many tables.

2009-02-21 Thread David Doukhan
What I did in such a situation was to generate the Pd patch using a Python script.
Using that method, all the arrays were stored in memory and could be
accessed dynamically by the other components through their names.



-- 
David Doukhan



Re: [PD] Best way to deal with many tables.

2009-02-21 Thread B. Bogart
Thanks all for your comments.

Roman, the major inelegance of dynamic patching is the massive CPU cost of
generating a graph containing 1000s of objects. In theory one only
needs to do this once, but I'm always changing the patch and also the
number of tables. Dynamic patching seems to work great in the 100s, but
starts getting ugly in the 1000s.

Cyrille, this is an interesting idea; [pix_histo] outputs tables though. I
don't know what you mean by 2D tables; they should be 3D for each colour
channel, right? Even if there were an object to dump a hist into an
image, rather than into a table, I don't think the Gem operations
on pixels would be any faster than those on tables (slicing out,
concatenating, listing, etc.). Please share if you have some ideas
already implemented that do this kind of thing.

David, how fast was your Python code to generate the tables? It seems like
it would have about the same issues as dynamic patching.

Looks like I should just do the ol' tried-and-true dynamic patching for
now (at least until some better method comes along).

The reason I removed the dynamic patching from the rest of the patch
was that it was just becoming too unscalable. My current 75x75-unit SOM
is so far the biggest, but I would like to get as big as possible within
the amount of RAM available, say 100x100+.

Another option would be to use tables to store the RGB hists, read them
directly in Python, and concatenate them for storage in numpy arrays.
Operations on these are fast and flexible. Has anyone tried this
approach? (I believe this is how vasp does it.) That way I could just
dump a list right into ann_som and let Python store all the hist data. I
think I've convinced myself here.
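A sketch of the numpy side of that plan (names like `set_hist` are made up, and the Pd-to-Python bridge is left out):

```python
import numpy as np

# One float32 array holds every histogram; "dumping a chunk as a list"
# is just a row slice, with no per-table Pd objects at all.
store = np.zeros((10_000, 768), dtype=np.float32)

def set_hist(i, hist):
    store[i] = hist                 # overwrite one image's histogram

def get_hist_list(i):
    # flat list, ready to send on to something like ann_som
    return store[i].tolist()
```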

Thanks all for your comments!

.b.




Re: [PD] Best way to deal with many tables.

2009-02-21 Thread cyrille henry


B. Bogart wrote:
 Thanks all for your comments.
...
 
 Cyrille, This is an interesting idea, pix_histo outputs tables though. I
 don't know what you mean by 2D tables,
What I call a 2D table is just a 2D matrix...

 they should be 3D for each colour
 channel right??
Why 3D?
density = function(image, color index)
It's only 2D.

 Even if there was an object to dump a hist into an
 image, rather than into a table, I don't think the gem pixel operations
 on pixels would be any faster than those on tables (slicing out,
 concatenating, listing etc..) Please share if you have some ideas
 already implemented that does this kind of thing.

I don't know if I understood:
do you have the histograms and only wish to access them in real time,
or do you want to modify them in RT?

I originally thought that you could script something to write the histogram of
each of your images into a table of 3x256 values (1 table for the 3 histos), then concatenate
them offline into 1 big image.
Accessing the data in RT would then be simple.
If you wish to modify this data in RT, it will be harder.

Maybe a matrix would be the best...
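For the offline scripting step, computing one 3x256 row could look like this (numpy sketch with a dummy image):

```python
import numpy as np

# Build the concatenated R,G,B histogram (768 values) for one image.
img = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)  # dummy RGB image

hist = np.concatenate(
    [np.bincount(img[..., c].ravel(), minlength=256) for c in range(3)]
)
# 256 R counts, then 256 G counts, then 256 B counts
```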

Cyrille




Re: [PD] Best way to deal with many tables.

2009-02-21 Thread Jonathan Wilkes



--- On Sat, 2/21/09, B. Bogart b...@ekran.org wrote:

 From: B. Bogart b...@ekran.org
 Subject: [PD] Best way to deal with many tables.
 To: PD list pd-list@iem.at
 Date: Saturday, February 21, 2009, 5:59 AM
 Hey all.
 
 I've managed to get my patches to use less objects, and
 more messages.
 
 Problem I have now is storing data in an organized way.
 
 Basically the system I'm working on needs to store the
 RGB hists of many
 images (10,000 ideally, RAM permitting). RGB hists are
 concatenated into
 tables of 768 elements each.
 
 What is the best way to deal with this number of tables?
 There are the
 usual thoughts of using dynamic patching and such, but
 really I'd like a
 more elegant solution.
 
 Has anyone worked on something like a multi-table or nested
 table?
 
 I could put everything in one giant table, but each chunk
 needs to be a
 list in the end and it seems to be iterating over a section
 of the table
 to dump it as a list would be a lot slower than using
 [tabdump].
 
 Just wondering if anyone has any suggestions.
 
 I've already mentioned my wish to have a generic
 storage system (similar
 to data-structures but independent of any graphical
 representation) namely:

What about nesting arrays in data structures without using drawing 
instructions?  For example, three templates, something like the following:
1. rgb value: [struct rgb float value]
2. histogram with an array of rgb values: [struct histogram array values rgb]
3. master list with an array of histograms: [struct histograms float dummy-variable array master-list histogram]

Traversing would only happen once, for [append histograms dummy-variable].  You
could then get and set everything else (e.g., array sizes and rgb values) using
[setsize] and [element].  And maybe use [until] with a counter to dump the rgb
values of a given histogram from the master list.

-Jonathan



 



[PD] Best way to deal with many tables.

2009-02-20 Thread B. Bogart
Hey all.

I've managed to get my patches to use fewer objects, and more messages.

Problem I have now is storing data in an organized way.

Basically the system I'm working on needs to store the RGB hists of many
images (10,000 ideally, RAM permitting). RGB hists are concatenated into
tables of 768 elements each.

What is the best way to deal with this number of tables? There are the
usual thoughts of using dynamic patching and such, but really I'd like a
more elegant solution.

Has anyone worked on something like a multi-table or nested table?

I could put everything in one giant table, but each chunk needs to be a
list in the end, and it seems that iterating over a section of the table
to dump it as a list would be a lot slower than using [tabdump].

Just wondering if anyone has any suggestions.

I've already mentioned my wish to have a generic storage system (similar
to data-structures but independent of any graphical representation) namely:

tables of floats (done), tables of symbols, and most importantly tables
of tables!

.b.
