Figure out where the real bottleneck is. I just did a demo app that loads
over 35,000 records from an uncompressed CSV file (over 2 meg). It's all
doable; you just have to figure out which part is slow. The time is basically
going to one (or more) of three places:

   1. Loading
   2. Parsing
   3. Rendering

If the time is spent loading the file then simply compressing the file and
sending the compressed version across the wire might help. Using a binary
protocol like AMF will get you the best transfer speed though. I actually
like CSV files because they're small and easy to export from Excel.
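
If you go the compression route, the client side can look roughly like this.
This is a hedged sketch, not from the original post: it assumes the server
writes a zlib-compressed copy of the CSV (PHP's gzcompress() produces
zlib-format data that Flash can inflate), and the path and handler names are
made up.

    // Load a zlib-compressed CSV and inflate it on the client.
    // The path and handler names are hypothetical; this assumes the
    // functions live in a class or component.
    import flash.events.Event;
    import flash.net.URLLoader;
    import flash.net.URLLoaderDataFormat;
    import flash.net.URLRequest;
    import flash.utils.ByteArray;

    private function loadCompressedCsv():void {
        var loader:URLLoader = new URLLoader();
        loader.dataFormat = URLLoaderDataFormat.BINARY; // raw bytes, not text
        loader.addEventListener(Event.COMPLETE, onCsvLoaded);
        loader.load(new URLRequest("data/records.csv.z"));
    }

    private function onCsvLoaded(event:Event):void {
        var bytes:ByteArray = URLLoader(event.target).data as ByteArray;
        bytes.uncompress(); // inflates the zlib data in place
        var csvText:String = bytes.readUTFBytes(bytes.length);
        // hand csvText off to the parser
    }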

If parsing the data is taking a long time then examine what you are doing as
you parse the data. Moving away from JSON will likely save time, since JSON
has to be decoded and processed even before your own processing happens. I'd
guess that XML would be faster than JSON, although I don't know that. If you
use AMF then you mostly get this step for free, since AMF decodes straight
into native AS3 objects (the decoding still takes time, but it's fast).
Assuming you're doing
the conversion to your VOs manually, a few tips:

   - The big thing here is to make sure your VOs do not use automatic
   databinding. If your VOs are [Bindable] objects or have [Bindable]
   properties, every property assignment during parsing dispatches a change
   event. You can still bind to the data in your views, but remove the
   metadata from the VOs and control the bindings yourself.
   - Do not parse the records into an ArrayCollection; parse them into an
   Array. If you use an ArrayCollection's addItem() then the collection
   dispatches an event every time you add an item. Instead, push onto a simple
   Array, and if you want an ArrayCollection in your app then create the
   collection after the parsing is done and set its source to the Array.
   - Do the parsing in batches. If you can split up the parsing then you can
   keep your app UI from locking up. I usually parse around 500 records at a
   time. This won't speed up the processing (in fact, it will add a tiny bit
   to the total time), but it lets you show progress as you are parsing. I
   have my parsers dispatch progress events as they process batches and I
   show that progress in a progress bar (see the sketch after this list).
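
Putting those three tips together, a parser loop can be shaped something like
this. It's a rough sketch under assumptions that aren't in the original post:
a two-field CSV, a hypothetical plain (non-Bindable) RecordVO class, and the
code living in a component or other EventDispatcher subclass so
dispatchEvent() works.

    import flash.events.ProgressEvent;
    import flash.events.TimerEvent;
    import flash.utils.Timer;
    import mx.collections.ArrayCollection;

    private static const BATCH_SIZE:int = 500; // records per batch
    private var rows:Array;                    // raw CSV lines
    private var records:Array = [];            // parse into a plain Array
    private var cursor:int = 0;
    private var timer:Timer = new Timer(1);    // yields to the UI between batches

    private function startParsing(csvText:String):void {
        rows = csvText.split("\n");
        timer.addEventListener(TimerEvent.TIMER, parseBatch);
        timer.start();
    }

    private function parseBatch(event:TimerEvent):void {
        var end:int = Math.min(cursor + BATCH_SIZE, rows.length);
        for (; cursor < end; cursor++) {
            var fields:Array = rows[cursor].split(",");
            var vo:RecordVO = new RecordVO(); // plain VO, no [Bindable]
            vo.name = fields[0];
            vo.value = Number(fields[1]);
            records.push(vo);                 // no per-item collection events
        }
        dispatchEvent(new ProgressEvent(ProgressEvent.PROGRESS,
            false, false, cursor, rows.length)); // drives a ProgressBar
        if (cursor >= rows.length) {
            timer.stop();
            // wrap the finished Array exactly once
            var ac:ArrayCollection = new ArrayCollection(records);
            // ...hand ac to the view
        }
    }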

And if most of the processing time is spent after you have your data loaded
and parsed, then figure out what you're doing to render that data. Try to
show fewer items, or group the items, or something. I assume you don't need
to create 5,000 renderers all at once. Use the list controls to take
advantage of item renderer recycling, etc.
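
For example (again a hypothetical sketch; MyRecordRenderer stands in for
whatever renderer class you'd write), a List only creates enough renderers
to fill its visible rows and recycles them as you scroll:

    import mx.collections.ArrayCollection;
    import mx.controls.List;
    import mx.core.ClassFactory;

    var list:List = new List();
    list.dataProvider = new ArrayCollection(records);       // the parsed Array
    list.itemRenderer = new ClassFactory(MyRecordRenderer); // hypothetical class
    list.rowCount = 20; // roughly 20 renderer instances exist, not 5,000
    addChild(list);     // assumes this runs inside a Flex container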

In my example app that loads 35,000 records the loading of the 2 meg file
takes about 10 seconds. Looping over all the records and turning them into
AS3 value objects takes about 2-3 seconds. Then post-processing those
records to group them takes about 10 seconds (this loops over the entire
dataset and does some aggregation calculations). So all in all that takes
almost 30 seconds, but it's 35 thousand frickin records and during the
entire process the app is responsive and progress is shown for each step.

Doug

On Wed, May 21, 2008 at 12:51 PM, Battershall, Jeff <
[EMAIL PROTECTED]> wrote:

>    Is there a reason why the entire dataset is needed all at once?  Some
> sort of pagination scheme would help.
>
> Jeff
>
>  -----Original Message-----
> From: flexcoders@yahoogroups.com [mailto:[EMAIL PROTECTED]] On
> Behalf Of Tracy Spratt
> Sent: Wednesday, May 21, 2008 3:40 PM
> To: flexcoders@yahoogroups.com
> Subject: RE: [flexcoders] 5,000 record dataset needs love
>
>  Are you certain the bottleneck is the "processing" as opposed to the
> rendering?
>
> Tracy
>
>
>  ------------------------------
>
> From: flexcoders@yahoogroups.com [mailto:[EMAIL PROTECTED]] On
> Behalf Of Tom Longson
> Sent: Tuesday, May 20, 2008 10:53 PM
> To: flexcoders@yahoogroups.com
> Subject: [flexcoders] 5,000 record dataset needs love
>
>
>
> Dear Super Smart Flex Developer Mailing List,
>
> We are currently having major issues processing a dataset that is
> essential for our skunkworks web site. The dataset is stored as JSON,
> consists of 5000 records, and has numerous strings. It is 1.4mb
> uncompressed / 85kb compressed. Processing the data, which involves
> creating a custom object to hold it, currently takes as much as 60 seconds.
>
> We are in the process of attacking this beast to make it run faster,
> and we are considering the following approaches:
>
> 1. Create an index for repetitive strings within the dataset, so each
> string is stored once and referenced by number, because integer assignment
> is faster than string assignment (we assume).
> 2. Try substituting XML for JSON (no idea if this would be faster or not).
> 3. Attempt to deliver an actionscript binary blob to the application
> from PHP (not even sure if that's possible... ASON?).
> 4. Create a compiled swf with our data postprocessed and attempt to
> access it (again, not sure if that's possible).
> 5. <insert your solution here>
>
> Your expert, snarky, helpful advice is much appreciated,
> Tom
>
