Good point!
:)

Martin


On 6 April 2013 14:53, Niphlod <[email protected]> wrote:

> I'd go for
>
>     rows = db(
>         (db.my_table.created_on > date_one) &
>         (db.my_table.my_field == my_criteria)
>     ).select(db.my_table.membership)
>     .....
>
> if your "my_criteria" is not a function, but a fixed value ....
>
> On Saturday, April 6, 2013 4:27:10 AM UTC+2, Martin Barnard wrote:
>
>> Thanks again for the help - it's invaluable to the Oracle noob! :D
>>
>> I'm actually building an analysis system so that my boss can view a
>> statistical subset of the information covering a few months or weeks. One
>> part of the data is stored in *our* MySQL database (which our department
>> controls), whilst the rest is stored in IT's Oracle system. I have
>> to pull a bunch of data from the MySQL database (usually with a date-based
>> search parameter), then search the Oracle system for the rest of the data
>> for each row returned from MySQL (by membership).
>>
>> My current solution (for those on similar paths):
>>
>> rows = db(db.my_table.created_on > date_one).select()
>> ml = []
>> for row in rows:
>>     if row.my_field == my_criteria:
>>         ml.append(row.membership)
>>
>> rows = odb(odb.other_table.membership.belongs(ml)).select()
>>
>> Of course, I have to worry about the length of the list that I'm passing,
>> as I read that it cannot exceed 1000 items, but this shouldn't be an issue
>> in normal use, and I'll wrap it in a test first.
>> More efficient methods welcome!!!
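[A sketch of the guard Martin mentions above, splitting `ml` into slices that stay under Oracle's 1000-item IN list limit; the `chunks` helper and its default size are assumptions for illustration, not part of the original code:]

```python
def chunks(items, size=1000):
    """Yield successive slices of `items`, each at most `size` long.

    Oracle rejects IN (...) lists longer than 1000 entries, so the
    membership list is sliced before being passed to .belongs().
    """
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Hypothetical use with the DAL code above (odb and ml as in the message):
# results = []
# for batch in chunks(ml):
#     results += odb(odb.other_table.membership.belongs(batch)).select().as_list()
```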
>>
>> Martin
>>
>>
>>
>>
>>
>> On 6 April 2013 01:11, Niphlod <[email protected]> wrote:
>>
>>>
>>>
>>> On Friday, April 5, 2013 11:58:37 PM UTC+2, Martin Barnard wrote:
>>>>
>>>>  Thanks for the info, Niphlod.
>>>>
>>>> I will look into the efficiency of the IN clause for my needs, as it
>>>> appears to offer a solution which may mollify the IT DBA and his demands
>>>> for bind vars (they are concerned that a looped select will bring the db to
>>>> its knees).
>>>>
>>>>
>>> .... a db(whatever.belongs(a_set)) issues ONE query only.
>>>
>>> If you want to "force" a looping query, you have to do it explicitly:
>>>
>>> mems = [1,2,3,4,5,6,...]
>>> for c in mems:
>>>      one_result = db(db.table.field == c).select()
>>>      ......
>>>
>>> Of course, for zillions of values in mems, it's not a smart move.
>>>
>>> The smartest move with a huge set (i.e. the technique with the best
>>> "balance") would be "paginating" through your "mems".
>>>
>>> You'd loop a few times, but if you have thousands of values in "mems",
>>> a single IN () (or thousands of ORs) will take some time ....
>>>
>>> Try to "draw a limit" with your DBA: if he says that you're "allowed"
>>> to do an IN() with 500 values at a time, you have it covered ^_^
>>>
>>> On the other hand, you have a requirement..... fetch zillions of rows.....
>>> either you do it in one shot or in zillions/500 batches.
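[For illustration, the "zillions/500" arithmetic above can be sketched as follows; the 500-value cap is the hypothetical DBA-agreed limit from the message, not a fixed Oracle constant:]

```python
import math

def queries_needed(n_values, in_limit=500):
    """Number of IN() queries needed to cover n_values memberships
    when each IN() list may hold at most in_limit entries."""
    return math.ceil(n_values / in_limit)
```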
>>>
>>> --
>>>
>>> ---
>>> You received this message because you are subscribed to a topic in the
>>> Google Groups "web2py-users" group.
>>> To unsubscribe from this topic, visit https://groups.google.com/d/**
>>> topic/web2py/E2pVWl_71t4/**unsubscribe?hl=en<https://groups.google.com/d/topic/web2py/E2pVWl_71t4/unsubscribe?hl=en>
>>> .
>>>  To unsubscribe from this group and all its topics, send an email to
>>> web2py+un...@**googlegroups.com.
>>>
>>> For more options, visit 
>>> https://groups.google.com/**groups/opt_out<https://groups.google.com/groups/opt_out>
>>> .
>>>
>>>
>>>
>>
