Author: adrian.chadd
Date: Wed Jun 17 00:08:27 2009
New Revision: 14098

Modified:
    wiki/LuscaArchitecture.wiki

Log:
Edited wiki page through web user interface.

Modified: wiki/LuscaArchitecture.wiki
==============================================================================
--- wiki/LuscaArchitecture.wiki (original)
+++ wiki/LuscaArchitecture.wiki Wed Jun 17 00:08:27 2009
@@ -6,7 +6,7 @@

  It uses POSIX threads for disk IO and external processes for a variety of tasks (logfile writing, ACL helpers, authentication helpers, URL rewriting, etc.)

-This architecture overview t is not designed to be completely thorough; it is meant to introduce the general structure and data flow of Lusca.
+This architecture overview is not designed to be completely thorough; it is meant to introduce the general structure and data flow of Lusca. The aim is to document the various APIs using HeaderDoc in the source code itself; this documentation just acts as a higher level introduction to what goes where.

  = Sections =

@@ -32,6 +32,13 @@
    * [LuscaArchitectureStoreDisk] - the store disk layer

   * [LuscaArchitectureStoreShortcomings] - shortcomings in the current storage layer
+
+== Network framework ==
+
+  * [LuscaArchitectureNetworkIntroduction] - the basic network and communication overview
+  * [LuscaArchitectureNetworkTransparentInterception] - transparent interception related changes
+  * [LuscaArchitectureNetworkReadingWriting] - reading and writing network data
+  * [LuscaArchitectureNetworkCloseHandlers] - the processing which occurs on socket and file descriptor close


  == HTTP processing framework ==
