Using web browsers as a supercomputing platform

The recent combination of old web technologies under the new buzzword AJAX, more interactive web applications like TiddlyWiki, and the well-known way SETI@home utilizes spare CPU power inspired me to a new idea. Well, surely not new, because I doubt that nobody has thought of it before, but new in the sense that I have never read or heard of anything like it.

So without further distraction I'll tell you my idea: what about using the spare computing power of web-surfing people's browsers to crunch some numbers? I mean using JavaScript to fetch work units from a server, process them locally and upload the results.

The process I have in mind starts with loading a certain web page which uses AJAX techniques to provide an interactive experience to the user. Setting this page as the browser's home page could considerably increase the participation level. Then the user can, if not already done, log in to an account and let the computations begin. The JavaScript on the page loads data and processing instructions from a server, processes the data, and delivers the results back to the server.
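To make the cycle concrete, here is a minimal sketch of the fetch/process/upload loop. All names here are my own illustrations: `fetchWorkUnit` and `uploadResult` stand in for the actual AJAX calls (XMLHttpRequest against some hypothetical server endpoints) and are stubbed out so the flow can be followed end to end.

```javascript
// Sketch of the fetch -> process -> upload cycle. The task type and
// the stubbed transfer functions are illustrative assumptions only.

function fetchWorkUnit() {
  // Stub: a real version would GET a unit from the project server.
  return { id: 42, task: "sum-of-squares", numbers: [1, 2, 3, 4] };
}

function processUnit(unit) {
  // The processing instructions would normally arrive with the unit;
  // a single hard-coded task illustrates the idea.
  if (unit.task === "sum-of-squares") {
    var sum = 0;
    for (var i = 0; i < unit.numbers.length; i++) {
      sum += unit.numbers[i] * unit.numbers[i];
    }
    return { id: unit.id, result: sum };
  }
  throw new Error("unknown task: " + unit.task);
}

function uploadResult(res) {
  // Stub: a real version would POST the result back to the server.
  return res;
}

var unit = fetchWorkUnit();
var answer = uploadResult(processUnit(unit));
// answer.result is 1 + 4 + 9 + 16 = 30
```

In a real page the loop would simply repeat: after a successful upload the script fetches the next unit.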

The advantages I see in this approach are:

  • Keeping everything contained in one webpage lowers the entry barrier for possible participants quite a bit. Just loading a page is much easier than installing a separate application.
  • Using JavaScript for computations removes the need to create separate applications for every different platform.

Of course every coin has two sides; I also see disadvantages:

  • JavaScript is SLOW! Of course it is: it's an interpreted language inside a quite tight corset of restrictions, runtime checks and conversions.
  • Different browsers have different restrictions and interpretations of the same JavaScript code. This adds the need for browser detection and alternate code paths.
  • Most browsers use a single thread that is responsible for the presentation of all open pages and tabs. Running JavaScript on one page also slows down the other open pages and probably even makes the browser less responsive.
  • Some browsers have long-running-script detection which kicks in when a JavaScript function runs for an extended time.
  • The computation has to deal with possible interruption at any time: the user may close the browser, navigate to another page, or abort it some other way.
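One technique that addresses both the single-thread problem and the long-running-script watchdog, beyond what is listed above, is to split the computation into small slices and yield between them with `setTimeout`. This is my own suggested workaround, not part of the original idea; the task (summing an array) and the chunk size are arbitrary illustrations:

```javascript
// Sketch: process the work in small slices so the browser's UI thread
// is released between slices and no single function runs long enough
// to trigger the long-running-script detection.

function makeTask(numbers) {
  return { numbers: numbers, index: 0, sum: 0 };
}

// Process at most `limit` items, then return control to the caller.
// Returns true while there is still work left.
function step(task, limit) {
  var end = Math.min(task.index + limit, task.numbers.length);
  while (task.index < end) {
    task.sum += task.numbers[task.index];
    task.index++;
  }
  return task.index < task.numbers.length;
}

// Reschedule itself between slices so the browser stays responsive.
function run(task, limit, done) {
  if (step(task, limit)) {
    setTimeout(function () { run(task, limit, done); }, 0);
  } else {
    done(task.sum);
  }
}
```

The price is some scheduling overhead per slice, so the chunk size has to balance responsiveness against throughput.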

But for some of those problems I can imagine possible solutions or at least workarounds:

  • The slowness of JavaScript can be avoided to a certain degree if the JavaScript passes some or most of its computation on to plugins, e.g. Java applets or other forms of objects embedded in the web page. Of course this requires support for those objects through third-party add-ons in the browser, but nowadays this shouldn't be a large problem.
  • Different browser platforms could be unified by providing a JavaScript library with a unified API for the different computation algorithms.
  • The single-thread problem could possibly be solved in the same way as the slowness problem: through the use of plugins.
  • Closing the browser shouldn't destroy the computation results. I expect this is what the onunload handler is for, but this has to be evaluated.
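Whether onunload fires reliably enough to upload anything still has to be evaluated, but the checkpoint itself could look like the following sketch. The state fields and function names are assumptions of mine, and `sendToServer` is a hypothetical placeholder for the actual upload call:

```javascript
// Sketch: serialize the in-progress computation so it can be sent
// (or stored) when the page is about to close, letting the server
// hand the unit out again from the saved position.

function makeCheckpoint(task) {
  return JSON.stringify({
    unitId: task.unitId,   // which work unit this belongs to
    index: task.index,     // how far the loop has come
    partial: task.partial  // accumulated partial result
  });
}

function restoreCheckpoint(text) {
  return JSON.parse(text);
}

// In the page itself, something like:
// window.onunload = function () { sendToServer(makeCheckpoint(task)); };
```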

All in all I think this is surely something worth evaluating. The slower computation compared to native, standalone applications could be compensated by the much smaller entry barrier and therefore a much larger pool of possible participants. The application of such a system is probably limited to smaller mathematical problems rather than big data crunching, but I may be wrong on that.

I think AJAX techniques combined with embedded Java applets on such a page provide the greatest flexibility and platform independence compared to other solutions.

