Hi,

I guess if you do not want to have a physical database, the only option is
an in-memory one.

# define db in memory

db = DAL('sqlite:memory')  # you get a temporary SQLite instance in memory

db.define_table('apitable', Field('apifield'), Field(....), .........)

then

# Call your API

apiresponse = urllib.urlopen('http://.......')  # or some web service client
# or module

# parse the API results to transform them into a Python dict

apiresults = YOURPARSER(apiresponse)

# insert into the temp db

db.apitable.insert(**apiresults)

# Deal with your DAL temporary objects
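The whole flow above can be sketched end to end with only the standard library. This is just an illustration, not web2py code: sqlite3's ":memory:" database stands in for DAL('sqlite:memory'), and a hard-coded JSON string stands in for the real API response. The table and field names are made up for the example.

```python
import json
import sqlite3

# "define db in memory" -- the stdlib equivalent of DAL('sqlite:memory')
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE apitable (apifield TEXT, other TEXT)")

# "call your API" -- a canned payload instead of urllib.urlopen(...)
apiresponse = '{"apifield": "value1", "other": "value2"}'

# "parse the API results to transform them into a Python dict"
apiresults = json.loads(apiresponse)

# "insert into temp db" -- like db.apitable.insert(**apiresults)
db.execute(
    "INSERT INTO apitable (apifield, other) VALUES (:apifield, :other)",
    apiresults,
)

# the row is now queryable like any other table
row = db.execute("SELECT apifield, other FROM apitable").fetchone()
print(row)  # -> ('value1', 'value2')
```

Once the API data sits in the in-memory table, anything that only needs table definitions and rows (CRUD, SQLFORM, grids) can work against it; the data just has to be re-inserted whenever the API is polled again.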


Any other idea?



--
Bruno Rocha
[ About me: http://zerp.ly/rochacbruno ]



On Wed, Apr 27, 2011 at 7:38 PM, Ross Peoples <[email protected]> wrote:

> I was wondering if anyone has ever tried anything like this before: I
> have an application that will be running mostly from data via Python
> API calls. This means that I won't really need to use the database at
> all. The problem with this approach, is that some important web2py
> features are only available when using the DAL. I'll use CRUD and
> SQLFORM as an example, but also plugins like PowerTables all require
> the DAL and table definitions.
>
> I had thought of loading some of the API's information into a
> database, and working off of that, and periodically refreshing it, but
> I'd have to do that every 30 seconds or so, and that's a lot of extra
> work that could potentially introduce bugs and make debugging harder.
>
> So my question is: Is it possible to define a set of table definitions
> (reference fields and all) and pull information from the API instead
> of from the database so that I could use things like CRUD and
> PowerTables?
