But even that should be a native (or Java) app, not web-based.

Last time I tried, stard...@home used a *lot* of CPU while changing the virtual microscope focus. Wasted CPU cycles that a BOINC project could have used in the meantime.

I presume that each time the <img>'s src attribute was changed, the browser would check whether the image was in the cache and whether the cached copy was fresh enough or needed to be refetched, then decompress the JPEG image (!), recalculate the layout of the entire webpage (in case the new image had a different size), and show it on screen. And all of that several times per second while I moved the mouse on the focus bar.
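A native (or Java) viewer could avoid that whole pipeline by decoding every focal plane exactly once, up front. The sketch below is only an illustration of that idea, not anything from the actual stardust@home client; the class and method names are made up for this example.

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Decode every focal plane ONCE when the stack is loaded. Changing the
// focus afterwards is just picking a different already-decoded frame:
// no cache check, no refetch, no JPEG decode, no page re-layout per
// mouse move on the focus bar.
public class FocusStack {
    private final List<BufferedImage> frames = new ArrayList<>();

    /** Decode one JPEG focal plane and keep it uncompressed in RAM. */
    public void addFrame(byte[] jpegBytes) throws IOException {
        frames.add(ImageIO.read(new ByteArrayInputStream(jpegBytes)));
    }

    /** O(1) lookup on the hot path; nothing is decoded here. */
    public BufferedImage frameAt(int focus) {
        return frames.get(focus);
    }

    public int size() {
        return frames.size();
    }
}
```

All the expensive work happens in addFrame(), off the interactive path; frameAt() is what the focus-bar mouse handler would call.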

Storing an entire "movie" uncompressed in *RAM* would be more than feasible, and much faster. But only a native app would be able to do that...
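As a rough sanity check on the "more than feasible" claim: the frame size and frame count below are assumptions for illustration, not stardust@home's actual numbers.

```java
// Back-of-envelope RAM estimate for an uncompressed focus "movie".
// Assumed values: 512x384 frames, 40 focal planes, 32-bit RGBA pixels.
public class FocusMovieRam {
    static long movieBytes(int width, int height, int frames, int bytesPerPixel) {
        return (long) width * height * bytesPerPixel * frames;
    }

    public static void main(String[] args) {
        long bytes = movieBytes(512, 384, 40, 4);
        // 512 * 384 * 4 * 40 = 31,457,280 bytes = 30 MiB
        System.out.printf("Uncompressed movie: %.1f MiB%n",
                bytes / (1024.0 * 1024.0));
    }
}
```

Around 30 MiB under these assumptions: trivial for a desktop app, and workable even on 2009-era mobile hardware if the frames are kept a bit smaller.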

In addition, on a mobile device I would like to download a lot of images in advance, look for cosmic dust on the go, without Wi-Fi in range and without paying small fortunes for EDGE/3G, and finally connect back to the 'net and report the results.

Aren't long airplane flights a *perfect* opportunity to kill time in distributed thinking? No Internet up there. And no, Google Gears is not the solution.

Sent from my iPod

On 28/10/2009, at 13:51, David Anderson <[email protected]> wrote:

My 2 cents on mobile devices:
Their best potential is for distributed thinking apps like stard...@home.
For computing, we should concentrate on GPUs,
which are 1000X faster than cell phones, and plenty numerous.

-- David
_______________________________________________
boinc_dev mailing list
[email protected]
http://lists.ssl.berkeley.edu/mailman/listinfo/boinc_dev
To unsubscribe, visit the above URL and
(near bottom of page) enter your email address.
_______________________________________________
