Hi David.
Here are a few patches. The first should be good; I sent it to you before but
got no reply.
The last one is my effort to rework the URL module to support multithreading.
The actual download is done in a separate thread, and the result is reported
back through MVars. I would like comments on both the general idea and the
implementation. The patch was done on top of my previous "not so good" patches,
and I am not sure what the best way to fix this is. Any advice?
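To show the idea in isolation, here is a stripped-down sketch of the pattern
(the real patch uses forkOS, a Chan of requests and a Map of notification
MVars; the names below, fetchInBackground and download, are only for
illustration, not functions from the patch):

  import Control.Concurrent ( forkIO )
  import Control.Concurrent.MVar ( newEmptyMVar, putMVar, takeMVar, MVar )

  -- Stand-in for the real curl/libwww call:
  -- Nothing on success, Just err on failure.
  download :: String -> FilePath -> IO (Maybe String)
  download _url _file = return Nothing

  -- Start the download in a separate thread;
  -- the MVar carries the result back to the caller.
  fetchInBackground :: String -> FilePath -> IO (MVar (Maybe String))
  fetchInBackground u f = do
      resultVar <- newEmptyMVar
      _ <- forkIO $ download u f >>= putMVar resultVar
      return resultVar

  -- The caller blocks on the MVar only when it actually needs the file.
  waitFetch :: MVar (Maybe String) -> IO ()
  waitFetch v = do
      r <- takeMVar v
      case r of
        Nothing  -> return ()
        Just err -> putStrLn ("download failed: " ++ err)

The point is that copyUrl and friends only queue work, and waitUrl blocks on
the per-URL MVar, so downloads overlap with the rest of darcs.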
Also, "make unit" fails to compile src/Darcs/Patch/Test.lhs:
src/Darcs/Patch/Test.lhs:195:49:
    No instance for (Arbitrary Char)
      arising from a use of `unempty'
      at src/Darcs/Patch/Test.lhs:195:49-55
    Possible fix: add an instance declaration for (Arbitrary Char)
    In the fifth argument of `liftM4', namely `unempty'
    In the expression: liftM4 patchinfo unempty unempty unempty unempty
    In the definition of `arbpi':
        arbpi = liftM4 patchinfo unempty unempty unempty unempty
So I cannot run the unit tests, but "make test" passes all tests.
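In case it helps, the usual workaround with QuickCheck 1 is to add an orphan
instance in the test module; a rough sketch (the generator range is just a
guess on my part, not what Test.lhs should necessarily use):

  -- Hypothetical orphan instance for QuickCheck 1; not from the darcs source.
  instance Arbitrary Char where
      arbitrary     = choose ('!', '~')            -- printable ASCII
      coarbitrary c = variant (fromEnum c `rem` 16)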
Regards,
Dmitry
Sat Aug 9 19:48:34 MSD 2008 Dmitry Kurochkin <[EMAIL PROTECTED]>
* Add --debug-http flag to enable curl and libwww debug at run-time instead
of compile-time.
Sun Aug 10 02:17:55 MSD 2008 Dmitry Kurochkin <[EMAIL PROTECTED]>
* Handle error case with empty URL in URL.waitNextUrl function.
Sun Aug 10 13:28:10 MSD 2008 Dmitry Kurochkin <[EMAIL PROTECTED]>
* Add thread synchronization to URL module and resume select() if interrupted
by signal in curl module.
Tue Aug 12 02:12:09 MSD 2008 Dmitry Kurochkin <[EMAIL PROTECTED]>
* Rework URL module for multi threading.
New patches:
[Add --debug-http flag to enable curl and libwww debug at run-time instead of compile-time.
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080809154834] hunk ./configure.ac 524
-
- # Check if we want curl debug enabled...
- AC_MSG_CHECKING([whether to do libcurl debugging])
- AC_ARG_ENABLE(curl-debug,
- AS_HELP_STRING([--enable-curl-debug],[enable libcurl debug mode]),
- curl_debug=$enableval, curl_debug=no)
- AC_MSG_RESULT($curl_debug)
- if test "$curl_debug" = "yes"; then
- CPPFLAGS="$CPPFLAGS -DCURL_DEBUG"
- fi
hunk ./configure.ac 539
-
- # Check if we want libwww debug enabled...
- AC_MSG_CHECKING([whether to do libwww debugging])
- AC_ARG_ENABLE(libwww-debug,
- AS_HELP_STRING([--enable-libwww-debug],[enable libwww debug mode]),
- libwww_debug=$enableval, libwww_debug=no)
- AC_MSG_RESULT($libwww_debug)
- if test "$libwww_debug" = "yes"; then
- CPPFLAGS="$CPPFLAGS -DLIBWWW_DEBUG"
- fi
hunk ./src/Darcs/Arguments.lhs 332
---debug
+--debug, --debug-http
hunk ./src/Darcs/Arguments.lhs 336
-would not be interesting.
+would not be interesting. Option \verb!--debug-http! makes darcs output debugging
+info for curl and libwww.
hunk ./src/Darcs/Arguments.lhs 345
+ DarcsNoArgOption [] ["debug-http"] DebugHTTP
+ "give debug output for curl and libwww",
hunk ./src/Darcs/Commands.lhs 57
+import URL ( setDebugHTTP )
hunk ./src/Darcs/Commands.lhs 341
+ when (DebugHTTP `elem` os) setDebugHTTP
hunk ./src/Darcs/Flags.lhs 28
- | Timings | Debug | DebugVerbose | Verbose | NormalVerbosity | Quiet
+ | Timings | Debug | DebugVerbose | DebugHTTP
+ | Verbose | NormalVerbosity | Quiet
hunk ./src/URL.hs 2
-module URL ( copyUrl, copyUrlFirst, waitUrl,
+module URL ( copyUrl, copyUrlFirst, setDebugHTTP, waitUrl,
hunk ./src/URL.hs 12
-import Autoconf ( have_libwww )
+import Autoconf ( have_libcurl, have_libwww )
hunk ./src/URL.hs 173
+setDebugHTTP :: IO ()
+setDebugHTTP | have_libwww = libwww_enable_debug
+ | have_libcurl = curl_enable_debug
+ | otherwise = debugMessage ("URL.setDebugHttp works only with curl and libwww")
+
hunk ./src/URL.hs 187
+
+foreign import ccall "hscurl.h curl_enable_debug"
+ curl_enable_debug :: IO ()
hunk ./src/URL.hs 202
+
+curl_enable_debug :: IO ()
+curl_enable_debug = no_curl
hunk ./src/URL.hs 216
+
+foreign import ccall "hslibwww.h libwww_enable_debug"
+ libwww_enable_debug :: IO ()
hunk ./src/URL.hs 231
+
+libwww_enable_debug :: IO ()
+libwww_enable_debug = no_libwww
hunk ./src/hscurl.c 44
+static int debug = 0;
hunk ./src/hscurl.c 140
-#ifdef CURL_DEBUG
- error = curl_easy_setopt(easy, CURLOPT_VERBOSE, 1);
- if (error != CURLE_OK)
- return curl_easy_strerror(error);
-#endif
+ if (debug)
+ {
+ error = curl_easy_setopt(easy, CURLOPT_VERBOSE, 1);
+ if (error != CURLE_OK)
+ return curl_easy_strerror(error);
+ }
hunk ./src/hscurl.c 289
+void curl_enable_debug()
+{
+ debug = 1;
+}
+
hunk ./src/hscurl.h 9
+void curl_enable_debug();
+
hunk ./src/hslibwww.c 45
+static BOOL debug = NO;
hunk ./src/hslibwww.c 95
-#ifdef LIBWWW_DEBUG
- HTSetTraceMessageMask("sop");
-#endif
+ if (debug == YES)
+ HTSetTraceMessageMask("sop");
hunk ./src/hslibwww.c 179
+void libwww_enable_debug()
+{
+ debug = YES;
+}
+
hunk ./src/hslibwww.h 9
+void libwww_enable_debug();
+
[Handle error case with empty URL in URL.waitNextUrl function.
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080809221755] hunk ./src/URL.hs 112
- else when (not $ null url) $ case Map.lookup url p of
+ else case Map.lookup url p of
[Add thread synchronization to URL module and resume select() if interrupted by signal in curl module.
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080810092810] hunk ./src/URL.hs 11
+import Control.Concurrent.MVar ( newMVar, putMVar, takeMVar, MVar )
+import Control.Exception ( bracket_ )
hunk ./src/URL.hs 70
+urlMutex :: MVar ()
+urlMutex = unsafePerformIO $ newMVar ()
+
hunk ./src/URL.hs 74
-copyUrlWithPriority prio url file c = do
+copyUrlWithPriority prio url file c = withUrlMutex $ do
hunk ./src/URL.hs 102
-waitNextUrl = do st <- readIORef urlState
- let l = pipeLength st
- if l > 0
- then do err <- waitNextUrl'
- url <- lastUrl'
- let p = inProgress st
- new_st = st { inProgress = Map.delete url p,
- pipeLength = l - 1
- }
- if null err
- then
- case Map.lookup url p of
- Just f -> do renameFile (f++"-new") f
- debugMessage $ "URL.waitNextUrl succeeded: "++url++" "++f
- Nothing -> bug $ "Possible bug in URL.waitNextUrl: "++url
- else case Map.lookup url p of
- Just f -> do removeFileMayNotExist (f++"-new")
- debugMessage $ "URL.waitNextUrl failed: "++
- url++" "++f++" "++err
- Nothing -> bug $ "Another possible bug in URL.waitNextUrl: "++url++" "++err
- when (not $ null url) $ do
- writeIORef urlState new_st
- checkWaitToStart
- return (url, if null err then Nothing else Just err)
- else return ("", Nothing)
+waitNextUrl = withUrlMutex $ do
+ st <- readIORef urlState
+ let l = pipeLength st
+ if l > 0
+ then do err <- waitNextUrl'
+ url <- lastUrl'
+ let p = inProgress st
+ new_st = st { inProgress = Map.delete url p,
+ pipeLength = l - 1
+ }
+ if null err
+ then case Map.lookup url p of
+ Just f -> do renameFile (f++"-new") f
+ debugMessage $ "URL.waitNextUrl succeeded: "++url++" "++f
+ Nothing -> bug $ "Possible bug in URL.waitNextUrl: "++url
+ else case Map.lookup url p of
+ Just f -> do removeFileMayNotExist (f++"-new")
+ debugMessage $ "URL.waitNextUrl failed: "++
+ url++" "++f++" "++err
+ Nothing -> bug $ "Another possible bug in URL.waitNextUrl: "++url++" "++err
+ when (not $ null url) $ do
+ writeIORef urlState new_st
+ checkWaitToStart
+ return (url, if null err then Nothing else Just err)
+ else return ("", Nothing)
hunk ./src/URL.hs 183
+withUrlMutex :: IO a -> IO a
+withUrlMutex j = bracket_ (takeMVar urlMutex) (putMVar urlMutex ()) j
+
hunk ./src/hscurl.c 6
+#include <errno.h>
hunk ./src/hscurl.c 101
- if (select(max_fd + 1, &fd_read, &fd_write, &fd_except, &tval) < 0)
- return error_strings[RESULT_SELECT_FAIL];
+ while (select(max_fd + 1, &fd_read, &fd_write, &fd_except, &tval) < 0)
+ if (errno != EINTR)
+ return error_strings[RESULT_SELECT_FAIL];
[Rework URL module for multi threading.
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080811221209] hunk ./src/URL.hs 6
-import Data.IORef
hunk ./src/URL.hs 10
-import Control.Concurrent.MVar ( newMVar, putMVar, takeMVar, MVar )
-import Control.Exception ( bracket_ )
+import Control.Concurrent ( forkOS )
+import Control.Concurrent.Chan ( isEmptyChan, newChan, readChan, writeChan, Chan )
+import Control.Concurrent.MVar ( modifyMVar, modifyMVar_, newEmptyMVar, newMVar, putMVar, takeMVar, MVar )
hunk ./src/URL.hs 14
+import Control.Monad.Trans ( liftIO )
+import Control.Monad.State ( evalStateT, get, put, StateT )
hunk ./src/URL.hs 30
+data UrlRequest = UrlRequest { url :: String
+ , file :: FilePath
+ , cachable :: Cachable
+ , priority :: Priority
+ , notifyVar :: MVar String }
+
hunk ./src/URL.hs 38
-data UrlState = UrlState { inProgress :: Map String FilePath,
- waitToStart :: Q (String, String, Cachable),
- pipeLength :: Int
- }
+data UrlState = UrlState { inProgress :: Map String (FilePath, (MVar String))
+ , waitToStart :: Q (String, FilePath, Cachable)
+ , pipeLength :: Int }
hunk ./src/URL.hs 64
+nullQ :: Q a -> Bool
+nullQ (Q [] []) = True
+nullQ _ = False
+
hunk ./src/URL.hs 78
-urlState :: IORef UrlState
-urlState = unsafePerformIO $ newIORef (UrlState Map.empty emptyQ 0)
+urlNotifications :: MVar (Map String (MVar String))
+urlNotifications = unsafePerformIO $ newMVar Map.empty
hunk ./src/URL.hs 81
-urlMutex :: MVar ()
-urlMutex = unsafePerformIO $ newMVar ()
+urlChan :: Chan UrlRequest
+urlChan = unsafePerformIO $ do
+ ch <- newChan
+ forkOS (urlThread ch)
+ return ch
hunk ./src/URL.hs 87
-copyUrlWithPriority :: Priority -> String -> String -> Cachable -> IO ()
-copyUrlWithPriority prio url file c = withUrlMutex $ do
- debugMessage ("URL.copyUrlWithPriority ("++url++"\n"++
- " -> "++file++")")
- st <- readIORef urlState
- let p = inProgress st
- w = waitToStart st
- new_w = case prio of
- High -> pushQ (url, file, c) w
- Low -> insertQ (url, file, c) w
- new_st = st { inProgress = Map.insert url file p,
- waitToStart = new_w
- }
- if Map.member url p
- then if prio == High && (url, file, c) `elemQ` w
- then do writeIORef urlState (st { waitToStart = pushQ (url, file, c) (deleteQ (url, file, c) w) })
- debugMessage $ "Moving "++url++" to head of download queue."
- checkWaitToStart
- else debugMessage "Ignoring copyUrlWithPriority of file that's already queued."
- else do writeIORef urlState new_st
- checkWaitToStart
+urlThread :: Chan UrlRequest -> IO ()
+urlThread ch = evalStateT urlThread' (UrlState Map.empty emptyQ 0)
+ where urlThread' = do
+ empty <- liftIO $ isEmptyChan ch
+ st <- get
+ let l = pipeLength st
+ w = waitToStart st
+ reqs <- if not empty || (nullQ w && l == 0)
+ then readAllRequests
+ else return []
+ mapM_ addReq reqs
+ checkWaitToStart
+ waitNextUrl
+ urlThread'
+ readAllRequests = do r <- liftIO $ readChan ch
+ dbg $ "URL.urlThread ("++(url r)++"\n"++
+ " -> "++(file r)++")"
+ empty <- liftIO $ isEmptyChan ch
+ reqs <- if not empty
+ then readAllRequests
+ else return []
+ return (r:reqs)
+ addReq r = do st <- get
+ let p = inProgress st
+ w = waitToStart st
+ e = (url r, file r, cachable r)
+ new_w = case priority r of
+ High -> pushQ e w
+ Low -> insertQ e w
+ new_st = st { inProgress = Map.insert (url r) (file r, notifyVar r) p
+ , waitToStart = new_w }
+ if Map.member (url r) p
+ then if e `elemQ` w
+ then if priority r == High
+ then do put $ st { waitToStart = pushQ e (deleteQ e w) }
+ dbg $ "Moving "++(url r)++" to head of download queue."
+ else dbg "Ignoring UrlRequest of file that's already queued."
+ else bug $ "URL.urlThread: same URLs with different parameters "++(url r)++" "++(file r)
+ else put new_st
+
+checkWaitToStart :: StateT UrlState IO ()
+checkWaitToStart = do
+ st <- get
+ let l = pipeLength st
+ when (l < maxPipeLength) $ do
+ let w = waitToStart st
+ case readQ w of
+ Just ((u,f,c),rest) -> do
+ put $ st { waitToStart = rest
+ , pipeLength = l + 1 }
+ err <- liftIO $ requestUrl u f c
+ when (not $ null err) $
+ (dbg $ "Failed to start download URL "
+ ++u++": "++err)
+ checkWaitToStart
+ _ -> return ()
+ where requestUrl u f cache =
+ withCString u $ \ustr ->
+ withCString (f++"-new") $ \fstr -> do
+ atexit $ removeFileMayNotExist (f++"-new")
+ debugMessage ("URL.requestUrl ("++u++"\n"++
+ " -> "++f++")")
+ let fn = if have_libwww then libwww_request_url
+ else curl_request_url
+ err <- fn ustr fstr (cachableToInt cache) >>= peekCString
+ when (null err) (debugMessage "URL.requestUrl succeeded")
+ return err
hunk ./src/URL.hs 155
-copyUrlFirst :: String -> String -> Cachable -> IO ()
+copyUrlFirst :: String -> FilePath -> Cachable -> IO ()
hunk ./src/URL.hs 158
-copyUrl :: String -> String -> Cachable -> IO ()
+copyUrl :: String -> FilePath -> Cachable -> IO ()
hunk ./src/URL.hs 161
-waitNextUrl :: IO (String, Maybe String)
-waitNextUrl = withUrlMutex $ do
- st <- readIORef urlState
+copyUrlWithPriority :: Priority -> String -> String -> Cachable -> IO ()
+copyUrlWithPriority p u f c = do
+ v <- newEmptyMVar
+ let r = UrlRequest u f c p v
+ writeChan urlChan r
+ let fn _ old_val = old_val
+ modifyMVar_ urlNotifications (return . (Map.insertWith fn u v))
+
+waitNextUrl :: StateT UrlState IO ()
+waitNextUrl = do
+ st <- get
hunk ./src/URL.hs 173
- if l > 0
- then do err <- waitNextUrl'
- url <- lastUrl'
- let p = inProgress st
- new_st = st { inProgress = Map.delete url p,
- pipeLength = l - 1
- }
- if null err
- then case Map.lookup url p of
- Just f -> do renameFile (f++"-new") f
- debugMessage $ "URL.waitNextUrl succeeded: "++url++" "++f
- Nothing -> bug $ "Possible bug in URL.waitNextUrl: "++url
- else case Map.lookup url p of
- Just f -> do removeFileMayNotExist (f++"-new")
- debugMessage $ "URL.waitNextUrl failed: "++
- url++" "++f++" "++err
- Nothing -> bug $ "Another possible bug in URL.waitNextUrl: "++url++" "++err
- when (not $ null url) $ do
- writeIORef urlState new_st
- checkWaitToStart
- return (url, if null err then Nothing else Just err)
- else return ("", Nothing)
- where waitNextUrl' = do let fn = if have_libwww then libwww_wait_next_url
- else curl_wait_next_url
- err <- fn >>= peekCString
- return err
- lastUrl' = let fn = if have_libwww then libwww_last_url
- else curl_last_url
- in fn >>= peekCString
+ when (l > 0) $ do
+ e <- liftIO $ waitNextUrl'
+ u <- liftIO $ lastUrl'
+ let p = inProgress st
+ new_st = st { inProgress = Map.delete u p
+ , pipeLength = l - 1 }
+ if null e
+ then case Map.lookup u p of
+ Just (f, v) -> do liftIO $ renameFile (f++"-new") f
+ liftIO $ putMVar v e
+ dbg $ "URL.waitNextUrl succeeded: "++u++" "++f
+ Nothing -> bug $ "Possible bug in URL.waitNextUrl: "++u
+ else case Map.lookup u p of
+ Just (f, v) -> do liftIO $ removeFileMayNotExist (f++"-new")
+ liftIO $ putMVar v e
+ dbg $ "URL.waitNextUrl failed: "++
+ u++" "++f++" "++e
+ Nothing -> bug $ "Another possible bug in URL.waitNextUrl: "++u++" "++e
+ when (not $ null u) $ put new_st
+ where waitNextUrl' = let fn = if have_libwww then libwww_wait_next_url
+ else curl_wait_next_url
+ in fn >>= peekCString
+ lastUrl' = let fn = if have_libwww then libwww_last_url
+ else curl_last_url
+ in fn >>= peekCString
hunk ./src/URL.hs 200
-waitUrl u = do st <- readIORef urlState
- when (u `Map.member` inProgress st) waitUrl'
- debugMessage ("URL.waitUrl "++u++" succeeded")
- where waitUrl' = do (url, merr) <- waitNextUrl
- if u /= url
- then waitUrl'
- else case merr of
- Just err -> debugFail $ "Failed to download URL "++url++": "++err
- Nothing -> return ()
+waitUrl u = do let fn m = let (a, b) = Map.updateLookupWithKey (\_ _ -> Nothing) u m
+ in (b, a)
+ r <- modifyMVar urlNotifications (return.fn)
+ debugMessage $ "waitUrl "++u
+ case r of
+ Just var -> takeMVar var
+ _ -> bug $ "Possible bug in URL.waitUrl: "++u
+ return ()
hunk ./src/URL.hs 209
-checkWaitToStart :: IO ()
-checkWaitToStart = do st <- readIORef urlState
- let l = pipeLength st
- when (l < maxPipeLength) $
- do let w = waitToStart st
- case readQ w of
- Just ((u,f,c),rest) -> do
- let new_st = st { waitToStart = rest,
- pipeLength = l + 1
- }
- writeIORef urlState new_st
- err <- requestUrl u f c
- when (not $ null err) (debugFail $ "Failed to start download URL "
- ++u++": "++err)
- checkWaitToStart
- _ -> return ()
- where requestUrl u f cache = withCString u $ \ustr ->
- withCString (f++"-new") $ \fstr -> do
- atexit $ removeFileMayNotExist (f++"-new")
- debugMessage ("URL.requestUrl ("++u++"\n"++
- " -> "++f++")")
- let fn = if have_libwww then libwww_request_url
- else curl_request_url
- err <- fn ustr fstr (cachableToInt cache) >>= peekCString
- when (null err) (debugMessage "URL.requestUrl succeeded")
- return err
+dbg :: String -> StateT a IO ()
+dbg = liftIO . debugMessage
hunk ./src/URL.hs 222
-withUrlMutex :: IO a -> IO a
-withUrlMutex j = bracket_ (takeMVar urlMutex) (putMVar urlMutex ()) j
-
Context:
[Print a warning when the remote end does not have darcs 2.
Eric Kow <[EMAIL PROTECTED]>**20080811100933
Two reasons:
(1) right now people get a scary warning from ssh when it can't fetch
some non-essential files (it used to be that we would send stderr from ssh
to /dev/null, but that has other problems...)
(2) darcs transfer-mode more widely deployed could help a lot of people
wrt darcs performance
]
[Added a beware note to the unrecord command
[EMAIL PROTECTED]
[Fixed typo
[EMAIL PROTECTED]
[make Convert.lhs compile.
David Roundy <[EMAIL PROTECTED]>**20080810201725]
[improve type safety of Darcs.Repository.Internal.
Jason Dagit <[EMAIL PROTECTED]>**20080810051109]
[Refactor `darcs convert' warning at kowey's request.
Trent W. Buck <[EMAIL PROTECTED]>**20080810110014]
[Expand formats text based in part on suggestions from darcs-users
Max Battcher <[EMAIL PROTECTED]>**20080809184043]
[Fixes to global cache text based on darcs-users suggestions
Max Battcher <[EMAIL PROTECTED]>**20080809181424]
[Add user-focused documentation of repository format options
Max Battcher <[EMAIL PROTECTED]>**20080807195429]
[Highlight the global cache as a best practice
Max Battcher <[EMAIL PROTECTED]>**20080807193918]
[Describe best practice in `darcs convert --help'.
Trent W. Buck <[EMAIL PROTECTED]>**20080810110615]
[add type witnesses to Population
Jason Dagit <[EMAIL PROTECTED]>**20080808053252]
[add type witnesses to CommandsAux
Jason Dagit <[EMAIL PROTECTED]>**20080808052738]
[Add type witnesses to more modules, rounding out Darcs/Repository/*
Jason Dagit <[EMAIL PROTECTED]>**20080808050947]
[fixed a bug in identity_commutes property
Jason Dagit <[EMAIL PROTECTED]>**20080808023025
In the right identity check the patch order should have gone from
(identity :> p) to (p2 :> i2). I added a rigid type context too
so that ghc 6.8 and newer would type the definition.
]
[Make Darcs.Repository.Internal compile with type witnesses.
Jason Dagit <[EMAIL PROTECTED]>**20080808015343]
[Better debug messages in URL module.
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080809215247]
[UF8.lhs: remove unusued functions/imports/docs
[EMAIL PROTECTED]
[Resolve issue974 : do not pass both -optc-g and -opta-g to GHC
Eric Kow <[EMAIL PROTECTED]>**20080807073620]
[make this test more cross-platform
Simon Michael <[EMAIL PROTECTED]>**20080807103433]
[document how to run unit tests
Simon Michael <[EMAIL PROTECTED]>**20080807030416]
[move (most) failing tests to bugs for clean test output
Simon Michael <[EMAIL PROTECTED]>**20080806191336]
[fix an old spelling error
Simon Michael <[EMAIL PROTECTED]>**20080806170432]
[make searching for "test:" in makefile work
Simon Michael <[EMAIL PROTECTED]>**20080805222241]
[run only normal (expected to pass) tests by default
Simon Michael <[EMAIL PROTECTED]>**20080805222108]
[Downplay quantum mechanics link.
Eric Kow <[EMAIL PROTECTED]>**20080806124109
Besides, darcs has far more than 3 users by now.
]
[Make patch theory intro more inviting to math people.
Eric Kow <[EMAIL PROTECTED]>**20080806123411]
[cleanup and slight rewrite of the test docs
Simon Michael <[EMAIL PROTECTED]>**20080806165949]
[make order of running tests consistent
Simon Michael <[EMAIL PROTECTED]>**20080806172123]
[small makefile refactoring: allow just the normal tests to be run, without bugs/*
Simon Michael <[EMAIL PROTECTED]>**20080805203242]
[Rectify dist help
[EMAIL PROTECTED]
Removed the "make dist" suggestion, the manual is a better place for that.
Instead, make clear that it operates on a clean copy of the tree, and
mention the "predist" functionality.
]
[website: explain that darcs 2 is required to get the darcs source.
Simon Michael <[EMAIL PROTECTED]>**20080803181216]
[Canonize Gaetan Lehmann and Daniel Buenzli.
Eric Kow <[EMAIL PROTECTED]>**20080730104357
(for Daniel B, avoid an accent in his name)
]
[configure: check for packages needed with split base.
Eric Kow <[EMAIL PROTECTED]>**20080730103840
Now that all packages must be used explicitly.
]
[fix type witness compile errors specific to ghc 6.8
Jason Dagit <[EMAIL PROTECTED]>**20080722182729]
[avoid import of unused function fromMaybe.
David Roundy <[EMAIL PROTECTED]>**20080729172825]
[configure: suggest regex-compat before text
Eric Kow <[EMAIL PROTECTED]>**20080725095336]
[configure: mention Haskell in 'try installing' suggestion
Eric Kow <[EMAIL PROTECTED]>**20080725095015]
[Typo (Text.Regex)
Eric Kow <[EMAIL PROTECTED]>**20080715121708]
[Use haskeline to have a readline-like behavior when asking something to the user
[EMAIL PROTECTED]
Unlike the implementations using readline or editline packages, this code
code doesn't break the Ctrl-C behavior.
]
[Improve generic rules for English plurals.
Eric Kow <[EMAIL PROTECTED]>**20080604123728]
[add configure check for Network.URI.
David Roundy <[EMAIL PROTECTED]>**20080711011914]
[add -hide-all-packages to default GHCFLAGS.
David Roundy <[EMAIL PROTECTED]>**20080711010952]
[add support for outputting patch numbers in darcs changes.
David Roundy <[EMAIL PROTECTED]>**20080710011211]
[add support for matching single patches by index.
David Roundy <[EMAIL PROTECTED]>**20080710004512]
[add support for matching ranges of patches (counting back from present).
David Roundy <[EMAIL PROTECTED]>**20080710003225]
[Better avoid silly manpage error.
Trent W. Buck <[EMAIL PROTECTED]>**20080704024920
It turned out only initialize's help string used 'quotes', so just
remove them. This makes init's docstring consistent with the others.
]
[Missing period at end of sentence.
Trent W. Buck <[EMAIL PROTECTED]>**20080704024232]
[darcs --overview no longer works, so don't document it.
Trent W. Buck <[EMAIL PROTECTED]>**20080704030804]
[Avoid silly manpage error.
Trent W. Buck <[EMAIL PROTECTED]>**20080703010733
man (nroff) treats an apostrophe in the first column specially,
resulting in a syntax error without this patch.
Ideally, all cases of 'foo' in the manpage (i.e. docstrings) should
become `foo', since man -Tps turns ` and ' into left and right single
quotes respectively.
]
[obliterate whitespace in Darcs.Commands.Get
[EMAIL PROTECTED]
'twas causing lhs/haddock difficulties where a \end{code} wasn't getting recognized.
]
[rm haddock CPP business
[EMAIL PROTECTED]
Try as I might, I can't see any reason to special-case some Haddock CPP logic to deal with some *commented-out guards*, unless CPP magically restores and uncomments the code if Haddock isn't being run.
]
[make pull less verbose when --verbose flag is given.
David Roundy <[EMAIL PROTECTED]>**20080624170035]
[fix makefile to remember to regenerate version information after running configure.
David Roundy <[EMAIL PROTECTED]>**20080624170001]
[TAG 2.0.2
David Roundy <[EMAIL PROTECTED]>**20080624012041]
Patch bundle hash:
40677319d869bab1699a76fbf687d1ee3b37c59a
_______________________________________________
darcs-users mailing list
[email protected]
http://lists.osuosl.org/mailman/listinfo/darcs-users