I can't give you a definitive answer, except to say that speaking personally, I never base reports on temp views. Since you can create a temp table as easily as a temp view, why not just insert the data into the temp table and call it a day? My guess is that basing a report on a table that already holds the data you need should be faster than asking R:Base to evaluate a view row by row while it is generating the report. I have no technical knowledge of the report generator code, though.
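A minimal sketch of the pattern Karen describes: materialize the report data into a temporary table once, then point the report at that table instead of at a temp view. This is generic SQL with invented table and column names for illustration; the exact R:BASE syntax for temporary objects may differ, so check it against your R:BASE version's documentation.

```sql
-- Sketch only; names (RptData, Invoices, CustID, InvAmt) are hypothetical.
-- Instead of defining the report on a temp view that is evaluated at print
-- time, load the summarized rows into a temp table up front:
CREATE TEMPORARY TABLE RptData (CustID INTEGER, InvTotal DOUBLE)
INSERT INTO RptData (CustID, InvTotal)
  SELECT CustID, SUM(InvAmt) FROM Invoices GROUP BY CustID
-- The report then reads RptData directly; no view evaluation per row
-- while the report generator runs.
```

The trade-off is that the temp table is a snapshot: it must be refreshed before each report run, but the report itself only ever scans precomputed rows.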
Karen

-----Original Message-----
From: tfred <[email protected]>
To: RBASE-L Mailing List <[email protected]>
Sent: Wed, Jun 25, 2014 11:05 pm
Subject: [RBASE-L] - TEMP VIEW Question

We have had a problem of multiple machines dropping R:Base in multi-user while creating complicated reports, and we are working our way through it. In the RBase-l archives, we keep finding suggestions on isolating users. I just changed all our PROJECT commands to temp tables with WHERE LIMIT = 0. Previously we used SomeID = 0, which works, but I see how LIMIT can avoid some conflicts that could lead to our problem.

Now all reports are being converted to temp views. The smaller reports work great. I am working toward a very large report (15-20 pages) which summarizes a lot of data. It is based on 12 tables/views, 7 of which are multi-table views.

Several questions:

1. Is it better to create a new TEMP VIEW for each individual data table being used (major rewrite needed), or TEMP VIEWS of the existing VIEWS (minimal rewriting)? I assume regular VIEWS are functionally like TABLES.

2. Is there some way to create the TEMP VIEWS with the Query Builder and then reload them on the fly using variables, or should I just use QB to build the SQL code that creates the TEMP VIEWS before running the report?

I have worked on this before, and sometimes the light bulb of understanding just takes a while to come on.

Tom Frederick
Jacksonville, IL
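The WHERE LIMIT = 0 pattern Tom mentions can be sketched as below. The syntax is approximated from the commands named in this thread and is not verified against the R:BASE documentation; the table names are hypothetical, and your R:BASE version may require a different form.

```sql
-- Assumed R:BASE PROJECT syntax, based on the thread; verify locally.
-- Older pattern: a condition on a real column that never matches a row,
-- so only the table structure is copied:
PROJECT TmpOrders FROM Orders USING ALL WHERE SomeID = 0
-- Newer pattern: LIMIT = 0 copies structure with zero rows without
-- evaluating a column condition, which can sidestep some multi-user
-- locking conflicts:
PROJECT TmpOrders FROM Orders USING ALL WHERE LIMIT = 0
```

Either way the result is an empty per-user work table that can then be filled and used as the report's data source.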

