in your database?
Regards
Guenther
--
DavaoSOFT, the home of ERPel
ERPel, the German inventory management system for LINUX
http://www.davaosoft.com
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo
>> ** Results in:
>> **
>> ** 1 2 3 4 5 6 7 8 9
Thanks a lot! This is exactly what I have been looking for.
Keep up the good work,
Guenther
On 2011-06-07 23:52, Petite Abeille wrote:
> The short of it: no, not out-of-the-box.
Thanks for the quick reply.
I guess a loadable extension needs to be used for this purpose then.
Is there any such extension already known to be available? I would like
to avoid reinventing the wheel.
I need a helper table of consecutive integers, e.g. "int_seq".
Is it possible in SQLite to create such a table implicitly "on the fly"
using some sort of recursive view/query or built-in special function?
Regards,
Guenther
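For readers finding this thread later: SQLite 3.8.3 and newer (released well after this exchange) supports recursive common table expressions, which generate exactly such an integer sequence on the fly, with no helper table and no loadable extension. A minimal sketch via Python's sqlite3 module; the name "int_seq" simply mirrors the table name from the question:

```python
import sqlite3

con = sqlite3.connect(":memory:")
rows = con.execute(
    """
    WITH RECURSIVE int_seq(n) AS (
        SELECT 1            -- anchor: start the sequence at 1
        UNION ALL
        SELECT n + 1 FROM int_seq WHERE n < 9   -- recurse until 9
    )
    SELECT n FROM int_seq
    """
).fetchall()
print([n for (n,) in rows])  # → [1, 2, 3, 4, 5, 6, 7, 8, 9]
```

The upper bound (9 here) can be parameterized, and the whole CTE can be wrapped in a view if a reusable "table" of integers is wanted.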
had used Access for that, which seems to index
even views, but at some point Access threw unexpected errors when I then
tried to access those views of views via ODBC. I don't know why, but
that's when I switched to SQLite and this problem came up.
Best regards
Guenther
P Kishor wrote:
> On
Hi P,
thanks, I had considered that.
Hope there is another solution though.
Günther
P Kishor wrote:
> On 10/13/08, Guenther Schmidt <[EMAIL PROTECTED]> wrote:
>
>> Hi,
>>
>> unfortunately I've hit a point where I run a query where one VIEW joins
>> another VIEW.
Hi,
unfortunately I've hit a point where I run a query where one VIEW joins
another VIEW.
Both views are rather large and since there is no index on the fields
where they join the query takes very very long. (About 15 min).
Does anybody here know a solution to this problem?
Best regards
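A workaround commonly suggested for slow joins between unindexable views (an editorial sketch, not taken from the thread; the table and column names are made up) is to materialize one view into a TEMP table and index the join column before running the query:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript(
    """
    CREATE TABLE orders(id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    CREATE TABLE customers(id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO customers VALUES (1, 'a'), (2, 'b');
    INSERT INTO orders VALUES (10, 1, 5.0), (11, 2, 7.5), (12, 1, 2.5);

    -- Stand-ins for the two large views; SQLite cannot index a view directly.
    CREATE VIEW v_orders AS SELECT customer_id, total FROM orders;
    CREATE VIEW v_customers AS SELECT id, name FROM customers;

    -- Materialize one side into a TEMP table, then index the join column.
    CREATE TEMP TABLE t_orders AS SELECT * FROM v_orders;
    CREATE INDEX temp.idx_t_orders_cust ON t_orders(customer_id);
    """
)
rows = con.execute(
    """
    SELECT c.name, SUM(t.total)
    FROM v_customers AS c
    JOIN t_orders AS t ON t.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
    """
).fetchall()
print(rows)  # → [('a', 7.5), ('b', 7.5)]
```

The temp table costs one full scan of the view up front, but the subsequent join becomes an indexed lookup instead of a nested full scan of both views.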
On Mon, 2007-03-19 at 01:46 +0200, Dimitris Servis wrote:
> 2007/3/19, guenther <[EMAIL PROTECTED]>:
> > On Sun, 2007-03-18 at 23:51 +0200, Dimitris Servis wrote:
> > > in my wildest dreams... if you read carefully, *each* file is about
> > > 100-200MB. I now en
100 MByte, stuffed
into a single (target) database file results in that database file
being 100*100 MByte. Considering "possibly 200 or more", this easily
could result in a single 64+ GByte file.
So, in what way was this meant to be a response regarding my
concerns? ;)
guenther
storage backend that easily can handle files of this size?
Just a thought...
guenther
--
char *t="[EMAIL PROTECTED]";
main(){ char h,m=h=*t++,*x=t+2*h,c,i,l=*x,s=0; for (i=0;i<l;i++){ i%8? c<<=1:
(c=*++x); c&