On 12/20/2010 06:14 PM, John Fabiani wrote:
> On Monday, December 20, 2010 12:52:03 pm John Fabiani wrote:
>> On Monday, December 20, 2010 12:12:12 pm marcelo nicolet wrote:
>>> On 12/20/2010 05:03 PM, John Fabiani wrote:
>>>> On Monday, December 20, 2010 11:49:02 am marcelo nicolet wrote:
>>>>> Thank you, Ed.
>>>>> In fact, I was reading about the internals of psycopg and the module
>>>>> interprets Python lists as Postgres arrays, and vice versa.
>>>>> The only thing I would need, then, is to "present" that list to dabo
>>>>> as discrete values.
>>>>> Where is the appropriate place to do that?
>>>>> TIA
>>>>>
>>>>> On 12/20/2010 04:09 PM, Ed Leafe wrote:
>>>>>> On Dec 19, 2010, at 5:39 PM, marcelo nicolet wrote:
>>>>>>> But it would be nice to take advantage of the arrays this engine
>>>>>>> supports. In fact, the business model of the project focuses on a
>>>>>>> vector of integers, which is very inconvenient to manage in true
>>>>>>> normalized form, and also as a set of scalars in the same relation.
>>>>>>> So, I am asking for an example of how to integrate non standard
>>>>>>> types from the backend into dabo.
>>>>>>>
>>>>>> I would assume that you would have to create custom handlers for these
>>>>>> data types. I would imagine that psycopg would make the necessary
>>>>>> conversion from the PostgreSQL datatype to the nearest Python type
>>>>>> (list in this case). I'm not sure how Dabo would handle lists as
>>>>>> column values, as I've never tried anything like that. However, if
>>>>>> you do try and get errors at the Dabo level, please paste the
>>>>>> tracebacks here so that we can see what would have to be improved in
>>>>>> order to handle these correctly.
>>>>>>
>>>>>> -- Ed Leafe
>>>> I have several questions:
>>>> Does each array field have a fixed number of elements?
>>>> Is every element within the array of the same data type?
>>>> Are you going to search on any of the elements?
>>>> Will you be adding elements to the array?
>>>>
>>>>
>>>> Johnf
>>> Yes, they are of fixed size; all elements are integers (in this case, of
>>> course). The arrays will not change in size once stored.
>>> Regarding queries, the arrays will always be retrieved in their entirety;
>>> if the user requests some filtering, it will be done programmatically, not
>>> at the backend level.
>> Untested, and just the things I'd try first:
>> First, I believe dbPostgres.py will return '?' as the datatype, but the
>> field name should be there. So it might be possible to use the standard
>> createBizobj() routine to create the bizobj. You could then use a
>> virtual field routine to populate the added fields. Of course, you then
>> need to handle the save (I think you will no matter what you do).
>>
>> Something like:
>>
>>     self.VirtualFields = {'listitem1': self.getitem_1,
>>                           'listitem2': self.getitem_2, ...}
>>
>>     def getitem_1(self):
>>         listitems = self.Record.fieldname
>>         return listitems[0]
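To make that virtual-field pattern concrete outside of Dabo, here is a minimal, self-contained Python sketch of the same idea: a dict mapping virtual field names to getter callables that index into a stored list. The class, the `Record` dict, and the `vector` column name are all stand-ins for whatever the real bizobj exposes, not Dabo's actual API.

```python
# Minimal stand-in for the VirtualFields pattern discussed above:
# map virtual field names to getters that pull individual elements
# out of a list-valued column. 'vector' is a hypothetical array
# column name; FakeBizobj is not a real Dabo class.

class FakeBizobj:
    def __init__(self, record):
        self.Record = record  # stand-in for the current record
        # One virtual field per array slot.
        self.VirtualFields = {
            "listitem1": self.getitem_1,
            "listitem2": self.getitem_2,
        }

    def getitem_1(self):
        return self.Record["vector"][0]

    def getitem_2(self):
        return self.Record["vector"][1]

    def get_virtual(self, name):
        # Resolve a virtual field by calling its getter.
        return self.VirtualFields[name]()

biz = FakeBizobj({"vector": [10, 20, 30]})
print(biz.get_virtual("listitem1"))  # 10
print(biz.get_virtual("listitem2"))  # 20
```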
>>
>> A second way would be to create a DataSource (manually defined), then
>> write the method to retrieve the data and populate the DataSource/DataSet.
>>
>> Once the data is in a bizobj, it should lend itself to all the filtering
>> required.
>>
>> Then override the bizobj's save routine to populate the array: rebuild
>> the list and pass it to psycopg.
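A hedged sketch of that save step, with no real database involved: gather the discrete per-element values back into one Python list, and pass that list as a single query parameter (psycopg adapts a Python list to a Postgres array). The `mytable` and `vector` names are invented for illustration; with a live connection you would hand `sql` and `params` to `cursor.execute()`.

```python
# Rebuild the array column from discrete per-element values before
# saving. 'mytable' and 'vector' are hypothetical names. psycopg
# would adapt the Python list in params["vector"] to int[].

def build_save(values_by_slot, pk):
    # values_by_slot: {'listitem1': 10, 'listitem2': 20, ...}
    # Rebuild in slot order (listitem1, listitem2, ...), rather than
    # sorting key strings, so 'listitem10' would not sort before
    # 'listitem2'.
    n = len(values_by_slot)
    ordered = [values_by_slot["listitem%d" % i] for i in range(1, n + 1)]
    sql = "UPDATE mytable SET vector = %(vector)s WHERE id = %(id)s"
    return sql, {"vector": ordered, "id": pk}

sql, params = build_save({"listitem1": 10, "listitem2": 20}, pk=7)
print(params["vector"])  # [10, 20]
```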
>>
>> I try my hardest to always get everything into a DataSource first. That
>> way I can work within the Dabo framework and use all the tools.
>>
>> I will not say I haven't gone way outside of Dabo to get the job done
>> because I have. But I call that the 'dark side', and it's always better to
>> work with the Force, Luke!
>>
>> Johnf
> Oh I forgot some important thoughts:
> You can always use Postgres functions to return the array as needed. I've
> had to do this several times in the past. You could return 'SETOF ', or maybe
> you could create a view that expands the array. This also makes creating the
> bizobj very easy. And don't forget you can use 'Rules' to allow inserts,
> updates, and deletes on views. Like I said, using views makes working with
> Dabo much easier!
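For completeness, a plain-SQL sketch of the view-plus-rules idea; the table and column names are invented, and assume a fixed three-element integer array. Subscripting keeps one row per record, while unnest() gives one row per element (the 'SETOF' flavor):

```sql
-- Hypothetical table: mytable(id int, vector int[])
-- Postgres arrays are 1-based, hence vector[1].

-- One row per record, array expanded into discrete columns:
CREATE VIEW mytable_expanded AS
SELECT id, vector[1] AS item1, vector[2] AS item2, vector[3] AS item3
FROM mytable;

-- Alternatively, one row per element (the 'SETOF' flavor):
-- SELECT id, unnest(vector) AS item FROM mytable;

-- A rule so the view accepts updates, rebuilding the array:
CREATE RULE mytable_expanded_upd AS ON UPDATE TO mytable_expanded
DO INSTEAD
UPDATE mytable
SET vector = ARRAY[NEW.item1, NEW.item2, NEW.item3]
WHERE id = OLD.id;
```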
>
> Johnf
Thank you again, John. But it's exactly in the bizobj that it's
convenient to have an "array flavor" of this data, because some of the
biz rules involve a sum over the elements.
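Since the rule in question is a sum over the elements, one middle ground is to keep the column as a Python list in the record and expose the aggregate itself as a computed value. A minimal sketch, with `vector` as a hypothetical column name:

```python
# Expose the business rule (a sum over the array's elements) as a
# computed value on the record, instead of expanding the array into
# discrete fields first. 'vector' is a hypothetical column name;
# psycopg hands an int[] column back as a plain Python list, so
# sum() applies directly.

def vector_total(record):
    return sum(record["vector"])

print(vector_total({"vector": [3, 5, 7]}))  # 15
```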
_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://leafe.com/mailman/listinfo/dabo-users
Searchable Archives: http://leafe.com/archives/search/dabo-users
This message: http://leafe.com/archives/byMID/[email protected]