[tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-06-04 Thread 'Mark S.' via TiddlyWiki
Here's some code that runs in the prerelease 5.1.20 (it needs the new operators). ABSOLUTELY make sure you have a copy of everything in your browser and on your desktop, because if you have a lot of tiddlers, this may crash your browser. You may get a lot of those "A script on this page is slow

[tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-06-04 Thread Magnus
Check out https://groups.google.com/d/topic/tiddlywiki/hr75FTeEL_g/discussion -- You received this message because you are subscribed to the Google Groups "TiddlyWiki" group. To unsubscribe from this group and stop receiving emails from it, send an email to tiddlywiki+unsubscr...@googlegroups.

[tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-06-04 Thread Sean Boyle
In the spirit of large tiddlywikis, is there a simple way to get a listing of tiddlers by size, sorted largest first? I would like to do some weeding. On Friday, May 31, 2019 at 3:50:24 AM UTC-7, Jeremy Ruston wrote: > > Several of the projects I’m working on for Federatial clients involve > la
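The thread doesn't include an answer to Sean's question, but the weeding can be done offline. Below is a minimal Python sketch (not from the thread) that ranks tiddlers by text size, largest first, assuming a JSON export where each tiddler is a dict with `title` and `text` fields; the sample data is hypothetical.

```python
# Sketch: rank tiddlers by text size, largest first, from a JSON export.
# Assumes each tiddler is a dict with "title" and "text" keys, as in
# TiddlyWiki's "export all tiddlers" JSON format.

def tiddlers_by_size(tiddlers):
    """Return (title, size) pairs sorted by text length, largest first."""
    sized = [(t.get("title", ""), len(t.get("text", ""))) for t in tiddlers]
    return sorted(sized, key=lambda pair: pair[1], reverse=True)

# Hypothetical sample data standing in for a real export.
sample = [
    {"title": "Big", "text": "x" * 500},
    {"title": "Small", "text": "hi"},
    {"title": "Medium", "text": "y" * 50},
]

for title, size in tiddlers_by_size(sample):
    print(title, size)
```

Pointing this at a real export (e.g. via `json.load`) gives a quick worst-offenders list without touching the wiki itself.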

Re: [tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-06-02 Thread PMario
On Monday, June 3, 2019 at 5:33:42 AM UTC+2, Mark S. wrote: > > Does field:y include field:text ? > There is a limit of 128 characters, so the text field will usually be excluded. -m
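The effect of the 128-character cutoff PMario mentions can be illustrated with a small sketch. This is not TiddlyWiki's actual implementation; the constant and the data structure are assumptions used to show why long values (such as most text fields) would fall back to an unindexed scan.

```python
# Illustrative sketch of a field index with a length cutoff: values
# longer than 128 characters are not indexed, so filters on long fields
# (typically "text") cannot be served from the index. The cutoff and
# structure are assumptions for illustration, not TiddlyWiki source.

MAX_INDEXED_VALUE_LENGTH = 128

def build_field_index(tiddlers, field):
    """Map field value -> list of titles, skipping over-long values."""
    index = {}
    for t in tiddlers:
        value = t.get(field, "")
        if 0 < len(value) <= MAX_INDEXED_VALUE_LENGTH:
            index.setdefault(value, []).append(t["title"])
    return index

# Hypothetical sample: short "status" values get indexed, a 200-character
# value and a long "text" field do not.
sample = [
    {"title": "Short", "status": "draft"},
    {"title": "AlsoShort", "status": "draft"},
    {"title": "LongText", "text": "x" * 5000, "status": "x" * 200},
]

print(build_field_index(sample, "status"))
print(build_field_index(sample, "text"))
```

The second lookup returns an empty index: every `text` value is either missing or over the cutoff, matching the "text field will usually be excluded" behaviour described above.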

Re: [tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-06-02 Thread 'Mark S.' via TiddlyWiki
Does field:y include field:text? Thanks! > * [all[tiddlers]tag[x]... > * [all[shadows]tag[x]... > * [all[tiddlers+shadows]tag[x]... > * [all[shadows+tiddlers]tag[x]... > * [all[tiddlers]field:y[x]... > * [all[shadows]field:y[x]... > * [all[tiddlers+shadows]field:y[x]... > * [all[shadows+tiddlers

Re: [tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-06-01 Thread Jeremy Ruston
Hi Diego > I've recently noticed my wiki (10MB) has been very slow, much more so on FF > than on Chrome. I'm very interested in your results, and how I would go about > debugging my wiki. The improved indexer is live on the prerelease. You can read more about it here: https://tiddlywiki.com/prer

Re: [tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-05-31 Thread 'Mark S.' via TiddlyWiki
What platform? What operating system? How much memory? How many GHz? I find I have to reboot FF periodically. In the process manager, I can see it slowly take over more and more memory. It's done this in every version, including the latest, greatest, "improved" version. On Friday, May 31, 2019

Re: [tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-05-31 Thread Diego Mesa
Hey Jeremy, I've recently noticed my wiki (10MB) has been very slow, much more so on FF than on Chrome. I'm very interested in your results, and how I would go about debugging my wiki. On Friday, May 31, 2019 at 11:13:50 AM UTC-4, Jeremy Ruston wrote: > > Hi Mark > > What are the physical charact

Re: [tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-05-31 Thread Jeremy Ruston
Hi Mark > What are the physical characteristics of the machine that ran your tests? > RAM? GHz? Make? Type of HD? I’ve recently got a modern Mac with 16GB RAM, a 512GB SSD and a 3 GHz Intel Core i5. But I went back to my old 2013 MacBook Pro (also 16GB RAM and a 512GB SSD) and tried the file there

Re: [tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-05-31 Thread Jeremy Ruston
Hi Mark > Is the indexing something that happens automatically, or something that you > activate? The improved indexing doesn’t need activating; it should be entirely invisible. It is worth knowing what gets optimised though: * [all[tiddlers]tag[x]... * [all[shadows]tag[x]... * [all[tiddlers+s
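The reason filters of the form `[all[tiddlers]tag[x]...` benefit from indexing can be shown with a small sketch. This is an illustration of the general technique, not TiddlyWiki's source: building a tag-to-titles map once turns each tag lookup into a dictionary access instead of a scan over every tiddler.

```python
# Illustrative sketch (not TiddlyWiki internals): why filters like
# [all[tiddlers]tag[x]] can be served from an index. A tag -> titles
# map, built once, answers each tag query without rescanning the
# whole tiddler store.

def build_tag_index(tiddlers):
    """Map each tag to the list of tiddler titles carrying it."""
    index = {}
    for t in tiddlers:
        for tag in t.get("tags", []):
            index.setdefault(tag, []).append(t["title"])
    return index

# Hypothetical sample store.
tiddlers = [
    {"title": "A", "tags": ["x"]},
    {"title": "B", "tags": ["x", "y"]},
    {"title": "C", "tags": ["y"]},
]

index = build_tag_index(tiddlers)
print(index["x"])  # titles tagged "x", found without a full scan
```

On a wiki with tens of thousands of tiddlers, the difference between a dictionary lookup and a per-query scan is exactly the kind of speedup Jeremy describes as "entirely invisible" to the user.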

[tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-05-31 Thread 'Mark S.' via TiddlyWiki
Is the indexing something that happens automatically, or something that you activate? Thanks! On Friday, May 31, 2019 at 5:55:37 AM UTC-7, TonyM wrote: > > Thanks for sharing Jeremy. > > It's great to hear about the possibilities. > > Can you tell the community: single file or server, locally or remotely

[tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-05-31 Thread 'Mark S.' via TiddlyWiki
What are the physical characteristics of the machine that ran your tests? RAM? GHz? Make? Type of HD? Thanks! On Friday, May 31, 2019 at 3:50:24 AM UTC-7, Jeremy Ruston wrote: > > Several of the projects I’m working on for Federatial clients involve > large wikis, in the 10MB to 100MB range. I

[tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-05-31 Thread bimlas
I just tried to import 150MB of images. Previously, I thought it would take at least 20 minutes to load the wiki, but it loaded surprisingly quickly. O_O I don't think it's a good idea to create such a wiki, and to keep the files physically there too, but it's good to know that Tiddly

[tw5] Re: Another datapoint on extremely large TiddlyWikis

2019-05-31 Thread @TiddlyTweeter
Jeremy, thanks ... Useful test ... > There are just over 3,000 images, weighing in at about 197MB of base64-encoded text. FYI, that is particularly interesting to me. Being able to keep images in a wiki, rather than external, has good upsides for maintenance and ease of portability. I'm gonn
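The figures quoted above are consistent with base64's fixed overhead: the encoding represents every 3 bytes as 4 characters, so embedded binary grows by about a third. A quick sketch of the arithmetic (the per-image average is our back-of-envelope estimate, not a figure from the thread):

```python
# Base64 encodes 3 bytes as 4 characters, a ~33% size overhead.
# So ~197MB of base64 text corresponds to roughly 197 * 3/4 ≈ 148MB
# of underlying binary image data.
import base64

raw = b"\x00" * 3000          # 3,000 bytes of sample binary data
encoded = base64.b64encode(raw)
print(len(encoded))            # 4,000 characters: exactly 4/3 of the input

base64_total_mb = 197
binary_mb = base64_total_mb * 3 / 4
print(round(binary_mb))        # ~148MB of actual image data
```

Spread over "just over 3,000 images", that works out to an average of roughly 50KB of binary data per image, which helps explain why the wiki stays usable despite its size.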