Hey Andre, I really appreciate the detailed response. I get what you are saying… and that seems like it will work under a small load. Although I didn't say so, my biggest concern was that only one process would be able to do a path dive on that big table at a time. If a dive takes any significant time at all, the other web requests will be blocked until that first dive finishes. So it looks like I will have to instantiate at least a few read-only copies of this thing so that several web requests can share them.
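In other words, something like this rough sketch: each read-only copy of the table runs as its own process on its own port, and the front end just spreads the path queries across them. The port numbers, the /shortest URL and the query_path() helper are all made up for illustration; the HTTP call itself is the approach Andre describes in the quoted reply below.

    -- Hypothetical sketch only: spread path queries across a few read-only
    -- copies of the table process, each running as a separate OS process
    -- listening on its own port.
    local http = require("socket.http")

    local backends = { 8081, 8082, 8083 }            -- assumed: one port per copy
    local next_backend = 1

    local function query_path(from_node, to_node)
      local port = backends[next_backend]
      next_backend = (next_backend % #backends) + 1  -- simple round robin
      local body, code = http.request(
        ("http://127.0.0.1:%d/shortest?from=%s&to=%s"):format(port, from_node, to_node))
      if code == 200 then return body end
    end

Of course this only buys anything if the copies really are separate OS processes, since a single Copas process multitasks collaboratively, as Andre points out below.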
Thanks again for your help.

Dewey

On Mar 11, 2010, at 1:29 PM, Andre Carregal wrote:

> Dewey,
>
> On Wed, Mar 10, 2010 at 10:37 PM, Dewey Gaedcke <de...@minggl.com> wrote:
>> I can tell you are really busy and so I really hate to keep bugging you….
>
> You certainly aren't bugging me... :o)
>
>> A friend who's a more advanced programmer than me has indicated that it's difficult (outside of true multi-threading) for one process to get access (even read-only) to the memory of another process.
>
> That's probably true, and I wasn't proposing that kind of data sharing. What I had in mind was to split your system into two parts:
>
> 1) One part responsible for a web interface that shows some view or interpretation of the data managed by (2).
>
> 2) One part responsible for access to the data structure (your big Lua table). This is what I'm assuming you have ready and running.
>
> My points about the inconveniences of CGI were related to the fact that it could be used for (1), but certainly not for (2).
>
> My points about Copas and Xavante were related to the fact that one way to make (1) and (2) talk to each other would be to use a socket channel between them, in particular using HTTP as the protocol.
>
> In that scenario, (2) would have to be extended in order to listen to the HTTP conversation, while (1) would have to act as a gateway between two HTTP conversations: one with the browser, acquiring parameters for example and responding with an HTML page, and one with (2), using HTTP as a sort of RPC to obtain part of the whole data structure and then present it to the user.
>
> I'm not sure in which part you would like to calculate the shortest paths, but the architecture would be the same either way.
>
>> So where I'm still lost is that I don't understand how Copas, Xavante, FastCGI or any other technology is able to read (in parallel) a Lua table that exists in my dedicated persistent Lua memory process.
>
> Here is where HTTP could help you. On (2) you could use any socket dispatcher (for example Copas) to implement a listener (a.k.a. a server) and a protocol handler. If that protocol was HTTP, you could then use Xavante as the handler, since it naturally works with Copas, so you would only have to write the "web part" of the problem.
>
> Note that you would be talking about two "web parts": one between the client/browser/user and your site (1), and another between (1) and the back end (2).
>
> So, assuming (1) was implemented as, say, an Orbit application over Apache with FastCGI, and (2) was implemented by embedding Xavante in your current application, the communication would go something like:
>
> Client -> HTTP -> Apache -> WSAPI -> Orbit -> Web App (1)
>
> This would get the client request into your public web application in (1), where you could extract parameters and then use LuaSocket's http.request() to call the web server in (2) with those parameters:
>
> http.request -> HTTP -> Xavante -> WSAPI -> Web App (2)
>
> Now your parameters have been transported through HTTP to the internal server (Xavante), embedded in your application (2). There you would access your table, find the shortest path for the received parameters and then return the results using wsapi.response, still on (2). This would be the data received by the http.request call made by (1) in the previous step.
>
> With that data now in (1), you could process the results somehow and generate your HTML response using wsapi.response in (1). This would follow through HTTP to the client.
>
> Confusing? :o)
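To make the shape of this concrete, here is a rough sketch of the two halves. To keep it short it stands in a bare Copas listener for the embedded Xavante/WSAPI server Andre describes, and it speaks only enough HTTP for the example; graph, load_big_table() and shortest_path() are hypothetical names for the big table and the code that walks it, so treat this as an illustration of the flow rather than working Kepler code.

    -- (2) The process that owns the big Lua table, reachable over HTTP.
    -- A bare Copas listener stands in here for the embedded Xavante/WSAPI
    -- server; load_big_table() and shortest_path() are hypothetical.
    local socket = require("socket")
    local copas  = require("copas")

    local graph  = load_big_table()             -- the huge in-memory table

    local server = assert(socket.bind("127.0.0.1", 8081))

    copas.addserver(server, function(skt)
      skt = copas.wrap(skt)
      local request = skt:receive("*l") or ""   -- e.g. "GET /shortest?from=A&to=B HTTP/1.0"
      repeat                                    -- skip the remaining request headers
        local line = skt:receive("*l")
      until line == nil or line == ""
      local from, to = request:match("from=(%w+)&to=(%w+)")
      local answer = tostring(shortest_path(graph, from, to))
      skt:send("HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\n\r\n" .. answer)
    end)

    copas.loop()

    -- (1) Inside the public web app (e.g. an Orbit/WSAPI handler): forward
    -- the parameters extracted from the browser request to (2) and use the
    -- reply; from_node and to_node are hypothetical parameter names.
    local http = require("socket.http")

    local body, code = http.request(
      "http://127.0.0.1:8081/shortest?from=" .. from_node .. "&to=" .. to_node)
    if code == 200 then
      -- build the HTML page for the browser from `body`
    end

Whichever way the listener in (2) is written (embedded Xavante or bare Copas), the point is the same: only that one process ever touches the table, and the public web application just talks to it over a local socket.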
>> Please forgive my attempt at clarification if you feel you have already addressed that point, but I just wanted to confirm that we're on the same page before I do a deep dive to explore the products you have pointed me to.
>
> No need for apologies, I'm also having a hard time explaining things...
>
>> If you can confirm that either Copas or Xavante would (with customization) be able to gain parallel, read-only access to this huge, in-process Lua table, then I'll dig into those products and figure out the details to make my shortest-path code run against just one memory instance of this data.
>
> Note that while I'm proposing using Xavante (or just Copas), this would not involve direct access to the table. Xavante (since it uses Copas) can handle multiple requests at the same time, but it multitasks collaboratively.
>
>> I'm not a low level programmer so perhaps it's easy stuff and I just don't realize how easy it is?
>
> I never said it was easy! :o)
>
>> Thanks very much for pointing me in the right direction.
>
> You are welcome, let me know if this made any sense.
>
> André

_______________________________________________
Kepler-Project mailing list
Kepler-Project@lists.luaforge.net
http://lists.luaforge.net/cgi-bin/mailman/listinfo/kepler-project
http://www.keplerproject.org/