On Nov 21, 2009, at 11:23 AM, bbulkow wrote:

> How does v8 support threading? What locks would prevent multiple
> threads of execution from attaining good parallelism? How much
> parallelism has been seen in the wild?

I am not a V8 expert by any means, but I've been working with it in Chrome for 
a few months.

My understanding is that V8 itself doesn't have any multi-thread support; all 
API calls have to be made on the same native thread. The JavaScript language 
has never had any mechanisms for handling concurrency, not even emulated 
("green") threads or coroutines, because the Web browser runtime model has 
always been single-threaded.

I am not sure whether or not it's possible to run multiple independent V8 
contexts in different threads; this comes down to a question of whether the V8 
implementation uses any global/static variables. Multiple V8 contexts share a 
lot of state for efficiency, but I don't know whether that state is managed 
with regular static variables or per-thread globals.

Even if contexts on multiple threads are allowed, they wouldn't be able to 
interact with each other because they'd have no shared state. This is of course 
awesome for efficient concurrency :) but makes it impossible for any individual 
JS program to scale beyond a single core. You could of course use the extension 
API to implement your own JavaScript primitives — a message-passing pattern 
like the Actor model would be ideal because it minimizes data sharing, so the 
only overhead would be copying the message contents (maybe as JSON?).
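To make that concrete, here's a minimal sketch of the idea in Python rather than against the V8 extension API (the thread/queue/JSON plumbing is illustrative, not any particular engine's interface): two threads act as isolated "contexts" that share no state and exchange only copied JSON messages.

```python
import json
import queue
import threading

# Each "actor" owns a mailbox and private state; the only way in or out is
# a JSON-encoded message, mimicking isolated JS contexts that copy data.
def actor(mailbox, reply_to):
    total = 0                                 # private state, never shared
    while True:
        msg = json.loads(mailbox.get())       # copy in: decode the message
        if msg["op"] == "stop":
            reply_to.put(json.dumps({"total": total}))  # copy out
            return
        total += msg["value"]

inbox, outbox = queue.Queue(), queue.Queue()
worker = threading.Thread(target=actor, args=(inbox, outbox))
worker.start()

for v in (1, 2, 3):
    inbox.put(json.dumps({"op": "add", "value": v}))
inbox.put(json.dumps({"op": "stop"}))

result = json.loads(outbox.get())
worker.join()
print(result["total"])  # the only data that crossed threads was JSON text
```

The serialization cost is real, but it buys you freedom from locks on the actors' internal state.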

Have you looked at Apple's JS interpreter, SquirrelFish? It's part of the 
WebKit project. It has to run on multiple threads to support HTML5 worker 
tasks, but again it probably won't allow shared state between threads.


Back to your bigger issue...

> I am evaluating languages and implementations for embedding, and the
> key feature I need is great multi-core support.

This has been a weak spot of most dynamic language interpreters. They tend to 
take the easy way out and use a lot of global state, then slap a big Global 
Interpreter Lock around the entire interpreter to manage context switching. The 
traditional Python and Ruby interpreters both do this. (MacRuby is a very 
interesting new implementation that's fully threadable, but as the name implies 
it's very targeted to OS X and would probably be hard to port.)
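For what it's worth, the GIL effect is easy to sketch in CPython (an illustration only; I've left timing out): the two threads below both make progress and compute correct results, but since only one can execute Python bytecode at a time, the CPU-bound work gains nothing from a second core.

```python
import threading

# Two CPU-bound threads under CPython's Global Interpreter Lock: they
# interleave rather than run in parallel, so this is no faster than a
# single thread, though each thread's own work is computed correctly.
def count_up(n, results, slot):
    total = 0
    for i in range(n):
        total += i
    results[slot] = total

results = [0, 0]
threads = [threading.Thread(target=count_up, args=(200_000, results, s))
           for s in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results)  # both totals computed, just not in parallel
```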

Lua has a really interesting design that's optimized for cooperative 
multitasking via coroutines. (And it's great for embedding, as the whole 
runtime fits in about 100k of code. That's why it's used a lot in games.) The 
whole interpreter only runs in one native thread, though.
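The cooperative style Lua encourages can be sketched in Python, with generators standing in for Lua's coroutine.create/coroutine.yield (the round-robin scheduler here is a toy of my own, not Lua's API):

```python
# Cooperative multitasking in the Lua style, sketched with Python
# generators: each task runs until it yields, and a tiny scheduler
# resumes tasks round-robin on a single native thread.
def task(name, steps, log):
    for i in range(steps):
        log.append(f"{name}:{i}")
        yield                  # hand control back, like coroutine.yield()

log = []
tasks = [task("a", 2, log), task("b", 2, log)]
while tasks:
    current = tasks.pop(0)
    try:
        next(current)          # resume, like coroutine.resume()
        tasks.append(current)  # still alive: back of the queue
    except StopIteration:
        pass                   # task finished

print(log)  # tasks interleave deterministically: a, b, a, b
```

Because a task only loses control where it explicitly yields, no locks are needed, which is exactly why this model doesn't extend across native threads.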

Modern mainstream Java VMs have very good multithreading, but might be too big 
for embedding purposes. There are of course smaller JVMs used for embedding but 
I don't know how good their concurrency is. There are implementations of JS, 
Ruby and Python for the JVM, which I think would all take advantage of the 
underlying concurrency support.

Erlang's an interesting language that's designed around massive parallelism 
with extremely lightweight tasks that can scale to multiple processors and 
multiple computers. I don't know how large the runtime is, though. (It's also a 
functional language, which can be hard to get your head around if you're not 
used to such things.)

—Jens
--~--~---------~--~----~------------~-------~--~----~
v8-users mailing list
[email protected]
http://groups.google.com/group/v8-users
-~----------~----~----~----~------~----~------~--~---
