Hello.

These patches implement a new API for the Curl module. Libwww.hs and
Curl.hs are very similar, so I merged them into URL.hs.

I also added a --with-curl-pipelining configure option.
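
A sketch of how the new option is used, assuming the usual autoconf workflow:

```shell
# Hypothetical build session: enable curl pipelining in darcs.
# configure checks `curl-config --vernum` and leaves CURL_PIPELINING=False
# (with a message) if libcurl is missing or older than 7.12.0.
./configure --with-curl-pipelining
make
```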

And libwww now follows redirects; this fixes issue621.

These are pretty big changes, and they touch well-tested curl code. But it
seems to work well for me, so I hope I managed not to break everything :)

Regards,
  Dmitry

Sun Jan 27 18:17:56 MSK 2008  Dmitry Kurochkin <[EMAIL PROTECTED]>
  * Rework libcurl module: use multi interface, support pipelining.

Sun Jan 27 18:27:10 MSK 2008  Dmitry Kurochkin <[EMAIL PROTECTED]>
  * Cleanup libwww module, better error handling, follow redirects (closes 
issue621).

Sun Jan 27 21:15:48 MSK 2008  Dmitry Kurochkin <[EMAIL PROTECTED]>
  * Fix darcs version in libwww user agent.

Sun Jan 27 21:20:01 MSK 2008  Dmitry Kurochkin <[EMAIL PROTECTED]>
  * Merge Curl and Libwww to URL module.

New patches:

[Rework libcurl module: use multi interface, support pipelining.
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080127151756] {
hunk ./GNUmakefile 160
+ifeq ($(CURL_PIPELINING),True)
+GHCFLAGS += -DCURL_PIPELINING
+CPPFLAGS += -optc-DCURL_PIPELINING
+endif
hunk ./autoconf.mk.in 30
+CURL_PIPELINING := @CURL_PIPELINING@
hunk ./configure.ac 421
+
+AC_ARG_WITH(curl_pipelining,
+            AS_HELP_STRING([--with-curl-pipelining],[use curl pipelining (requires libcurl >= 7.12.0)]))
+
+CURL_PIPELINING=False
+
+if test "$with_curl_pipelining" == "yes"; then
+ AC_MSG_CHECKING([for libcurl pipelining])
+ if test "$HAVE_LIBCURL" == "True"; then
+   vernum=`curl-config --vernum`
+   if test $vernum -ge 071200; then
+     CURL_PIPELINING=True
+     AC_MSG_RESULT([yes])
+   else
+     AC_MSG_RESULT([no])
+     AC_MSG_RESULT([requires libcurl version >= 7.12.0])
+   fi
+ else
+   AC_MSG_RESULT([no])
+   AC_MSG_WARN([requires libcurl])
+ fi
+fi
+
+AC_SUBST(CURL_PIPELINING)
hunk ./src/Curl.hs 2
-module Curl ( copyUrl, Cachable(Cachable, Uncachable, MaxAge) )
+module Curl ( copyUrl, waitNextUrl, lastUrl,
+              Cachable(Cachable, Uncachable, MaxAge) )
hunk ./src/Curl.hs 6
-import System.IO
-import Foreign.C.Types ( CInt )
hunk ./src/Curl.hs 7
-import Foreign.C.String ( withCString, CString )
-import System.Environment (getEnv)
-import Autoconf (darcs_version)
+import Foreign.C.Types ( CInt )
+import Foreign.C.String ( withCString, peekCString, CString )
+import Monad ( when )
hunk ./src/Curl.hs 15
-copyUrl :: String -> String -> Cachable -> IO ()
+copyUrl :: String -> String -> Cachable -> IO String
hunk ./src/Curl.hs 18
-  withCString darcs_version $ \vstr ->
hunk ./src/Curl.hs 20
-  ppwd <- getProxyUserPwd
-  withCString ppwd $ \pstr -> do
hunk ./src/Curl.hs 22
-  err <- get_curl vstr pstr fstr ustr (cachableToInt cache)
-  if (err /= 0) then fail $ "Failed to download URL "++ u ++ " : " ++ curl_e err
-                else debugMessage "Curl.copyUrl succeeded"
-      where curl_e 1 = "unsupported protocol"
-            curl_e 3 = "malformed URL"
-            curl_e 5 = "couldn't resolve proxy"
-            curl_e 6 = "couldn't resolve host"
-            curl_e 7 = "couldn't connect to host"
-            curl_e 22 = "HTTP error (404?)"
-            curl_e 23 = "libcurl write error"
-            curl_e 46 = "bad password"
-            curl_e err = "libcurl error code: "++show err
+  err <- curl_request_url ustr fstr (cachableToInt cache) >>= peekCString
+  when (null err) (debugMessage "Curl.copyUrl succeeded")
+  return err
hunk ./src/Curl.hs 29
+waitNextUrl :: IO String
+#ifdef HAVE_CURL
+waitNextUrl = do err <- curl_wait_next_url >>= peekCString
+                 when (null err) (debugMessage "Curl.waitNextUrl succeeded")
+                 return err
+#else
+waitNextUrl = fail "There is no libcurl!"
+#endif
+
+lastUrl :: IO String
+#ifdef HAVE_CURL
+lastUrl = curl_last_url >>= peekCString
+#else
+lastUrl = fail "There is no libcurl!"
+#endif
+
hunk ./src/Curl.hs 51
-foreign import ccall "hscurl.h get_curl"
-  get_curl :: CString -> CString -> CString -> CString -> CInt -> IO CInt
+foreign import ccall "hscurl.h curl_request_url"
+  curl_request_url :: CString -> CString -> CInt -> IO CString
+
+foreign import ccall "hscurl.h curl_wait_next_url"
+  curl_wait_next_url :: IO CString
hunk ./src/Curl.hs 57
-getProxyUserPwd :: IO String
-getProxyUserPwd = do
- getEnv "DARCS_PROXYUSERPWD" `catch` (\_ -> return "")
+foreign import ccall "hscurl.h curl_last_url"
+  curl_last_url :: IO CString
hunk ./src/Darcs/External.hs 57
-import qualified Curl ( copyUrl )
hunk ./src/Darcs/External.hs 204
-copyRemoteNormal u v cache = if have_libwww
+copyRemoteNormal u v cache = if have_libwww || have_libcurl
hunk ./src/Darcs/External.hs 206
-                             else if have_libcurl
-                               then Curl.copyUrl u v cache
-                               else if HTTP.exists
-                                 then HTTP.copyUrl u v cache
-                                 else copyRemoteCmd u v
+                             else if HTTP.exists
+                                  then HTTP.copyUrl u v cache
+                                  else copyRemoteCmd u v
hunk ./src/Darcs/External.hs 270
-    if have_libwww
+    if have_libwww || have_libcurl
hunk ./src/Darcs/External.hs 273
-    else if (have_libcurl || have_HTTP)
+    else if have_HTTP
hunk ./src/URL.hs 11
+import qualified Curl as Curl ( copyUrl, waitNextUrl, lastUrl )
hunk ./src/URL.hs 13
+import Autoconf ( have_libwww )
hunk ./src/URL.hs 23
-maxPipeLength = 5
+maxPipeLength = if have_libwww then 5
+#ifdef CURL_PIPELINING
+                               else 5
+#else
+                               else 1
+#endif
hunk ./src/URL.hs 57
-                    then do err <- WWW.waitNextUrl
-                            url <- WWW.lastUrl
+                    then do err <- if have_libwww then WWW.waitNextUrl
+                                                  else Curl.waitNextUrl
+                            url <- if have_libwww then WWW.lastUrl
+                                                  else Curl.lastUrl
hunk ./src/URL.hs 67
-                            when (not $ null err) $ fail err
+                            when (not $ null err) (fail $ "Failed to download URL "
+                                                          ++url++" : "++err)
hunk ./src/URL.hs 89
-                                       err <- WWW.copyUrl u f c
-                                       when (not $ null err) $ fail err
+                                       err <- if have_libwww then WWW.copyUrl u f c
+                                                             else Curl.copyUrl u f c
+                                       when (not $ null err) (fail $ "Failed to start download URL "
+                                                                     ++u++" : "++err)
hunk ./src/hscurl.c 1
-#include <curl/curl.h>
-#include <curl/easy.h>
+#include "hscurl.h"
hunk ./src/hscurl.c 3
-#include <unistd.h>
-#include <stdio.h>
+#include <curl/curl.h>
hunk ./src/hscurl.c 7
-#include "hscurl.h"
+enum RESULT_CODES
+  {
+    RESULT_OK = 0,
+    RESULT_MALLOC_FAIL,
+    RESULT_SELECT_FAIL,
+    RESULT_MULTI_INIT_FAIL,
+    RESULT_EASY_INIT_FAIL,
+    RESULT_SLIST_APPEND_FAIL,
+    RESULT_NO_RUNNING_HANDLES,
+    RESULT_MULTI_INFO_READ_FAIL,
+    RESULT_UNKNOWN_MESSAGE,
+    RESULT_FILE_OPEN_FAIL
+  };
+
+static const char *error_strings[] =
+  {
+    "",
+    "malloc() failed",
+    "select() failed",
+    "curl_multi_init() failed",
+    "curl_easy_init() failed",
+    "curl_slist_append() failed",
+    "curl_multi_perform() - no running handles",
+    "curl_multi_info_read() failed",
+    "curl_multi_info_read() returned unknown message",
+    "fopen() failed"
+  };
+
+struct UrlData
+{
+  char *url;
+  FILE *file;
+  struct curl_slist *headers;
+};
+
+static const char user_agent[] =
+  "darcs/" PACKAGE_VERSION " libcurl/" LIBCURL_VERSION;
+static const char *proxypass;
+static int init_done = 0;
+static CURLM *multi = NULL;
+static int msgs_in_queue = 0;
+static char *last_url = NULL;
+
+static const char *perform()
+{
+  int error;
+  int running_handles, running_handles_last;
+  fd_set fd_read, fd_write, fd_except;
+  int max_fd;
+  long timeout;
+  struct timeval tval;
+
+  error = curl_multi_perform(multi, &running_handles);
+  if (error != CURLM_OK && error != CURLM_CALL_MULTI_PERFORM)
+    return curl_multi_strerror(error);
+  if (running_handles == 0)
+    return error_strings[RESULT_NO_RUNNING_HANDLES];
+
+  running_handles_last = running_handles;
+  while (1)
+    {
+      while (error == CURLM_CALL_MULTI_PERFORM)
+        error = curl_multi_perform(multi, &running_handles);
hunk ./src/hscurl.c 71
-static CURL *c = NULL;
-static int curl_init_done = 0;
-static char* user_agent = NULL;
-static char user_agent_fmt[] = "darcs/%s (libcurl/%s)";
+      if (error != CURLM_OK)
+        return curl_multi_strerror(error);
hunk ./src/hscurl.c 74
-// curl_easy_setopt requires that char * parameters must exist
-// between invocations. hence, we strdup the parameters, and keep
-// them here until a new value is used.
-char * persisted_user_agent = NULL;
-char * persisted_proxypass = NULL;
-char * persisted_url = NULL;
+      if (running_handles < running_handles_last)
+        break;
hunk ./src/hscurl.c 77
-CURLcode safe_curl_easy_setopt_str(CURL *c, CURLoption o, const char *val, char ** persisted_val) {
-  int err;
-  char * val_copy = NULL;
+      FD_ZERO(&fd_read);
+      FD_ZERO(&fd_write);
+      FD_ZERO(&fd_except);
hunk ./src/hscurl.c 81
-  if ( val != NULL ) {
-    val_copy = strdup(val);
-  }
+      error = curl_multi_fdset(multi, &fd_read, &fd_write, &fd_except, &max_fd);
+      if (error != CURLM_OK)
+        return curl_multi_strerror(error);
hunk ./src/hscurl.c 85
-  err = curl_easy_setopt(c, o, val_copy);
+      error = curl_multi_timeout(multi, &timeout);
+      if (error != CURLM_OK)
+        return curl_multi_strerror(error);
hunk ./src/hscurl.c 89
-  if ( *persisted_val != NULL )
-    free (*persisted_val);
+      if (timeout == -1)
+        timeout = 100;
hunk ./src/hscurl.c 92
-  *persisted_val = val_copy;
+      tval.tv_sec = timeout / 1000;
+      tval.tv_usec = timeout % 1000 * 1000;
hunk ./src/hscurl.c 95
-  return err;
+      if (select(max_fd + 1, &fd_read, &fd_write, &fd_except, &tval) < 0)
+        return error_strings[RESULT_SELECT_FAIL];
+
+      error = CURLM_CALL_MULTI_PERFORM;
+    }
+
+  return NULL;
hunk ./src/hscurl.c 104
-// get_curl returns an error code
-// cache_time is -1 for default cachability
-int get_curl(const char *darcs_version, const char *proxypass, const char *filename, const char *url, int cache_time) {
-  CURLcode err;
-  FILE *f;
-  struct curl_slist *headers = NULL;
+const char *curl_request_url(const char *url,
+                             const char *filename,
+                             int cache_time)
+{
+  int error;
+
+  if (init_done == 0)
+    {
+      error = curl_global_init(CURL_GLOBAL_ALL);
+      if (error != CURLE_OK)
+        return curl_easy_strerror(error);
+      proxypass = getenv("DARCS_PROXYUSERPWD");
+      init_done = 1;
+    }
+
+  if (multi == NULL)
+    {
+      multi = curl_multi_init();
+      if (multi == NULL)
+        return error_strings[RESULT_MULTI_INIT_FAIL];
+#ifdef CURL_PIPELINING
+      error = curl_multi_setopt(multi, CURLMOPT_PIPELINING, 1);
+      if (error != CURLM_OK)
+        return curl_multi_strerror(error);
+#endif
+    }
+
+  CURL *easy = curl_easy_init();
+  if (easy == NULL)
+    return error_strings[RESULT_EASY_INIT_FAIL];
+
+#if 0
+  error = curl_easy_setopt(easy, CURLOPT_VERBOSE, 1);
+  if (error != CURLE_OK)
+    return curl_easy_strerror(error);
+#endif
+
+  struct UrlData *url_data = malloc(sizeof(struct UrlData));
+  if (url_data == NULL)
+    return error_strings[RESULT_MALLOC_FAIL];
hunk ./src/hscurl.c 145
-  if(!curl_init_done) {
-      curl_global_init(CURL_GLOBAL_ALL);
-      curl_init_done=1;
-  }
-  if (c == NULL) c = curl_easy_init();
-  if (c == NULL) return -2;
-  if (NULL == user_agent) {
-    unsigned int uasize = strlen(user_agent_fmt) + strlen(darcs_version) + strlen(curl_version());
-    user_agent = malloc(uasize);  /* only allocated once; let program exit clean it up */
-    if (NULL != user_agent)
-      sprintf(user_agent, user_agent_fmt, darcs_version, curl_version());
-  }
+  url_data->url = strdup(url);
+  if (url_data->url == NULL)
+    return error_strings[RESULT_MALLOC_FAIL];
hunk ./src/hscurl.c 149
-  f = fopen(filename,"wb");
-  if (f == NULL) return -1;
+  url_data->file = fopen(filename,"wb");
+  if (url_data->file == NULL)
+    return error_strings[RESULT_FILE_OPEN_FAIL];
hunk ./src/hscurl.c 153
-  err = safe_curl_easy_setopt_str(c, CURLOPT_URL, url, &persisted_url);
-  /*curl_easy_setopt(c, CURLOPT_NOSIGNAL, NULL);*/
+  error = curl_easy_setopt(easy, CURLOPT_PRIVATE, url_data);
+  if (error != CURLE_OK)
+    return curl_easy_strerror(error);
+
+  error = curl_easy_setopt(easy, CURLOPT_URL, url_data->url);
+  if (error != CURLE_OK)
+    return curl_easy_strerror(error);
hunk ./src/hscurl.c 161
-  err += safe_curl_easy_setopt_str(c, CURLOPT_USERAGENT, (user_agent ? user_agent : "darcs"), &persisted_user_agent);
hunk ./src/hscurl.c 162
-  err += curl_easy_setopt(c, CURLOPT_WRITEDATA, f);
+  error = curl_easy_setopt(easy, CURLOPT_WRITEDATA, url_data->file);
hunk ./src/hscurl.c 164
-  err += curl_easy_setopt(c, CURLOPT_FILE, f);
+  error = curl_easy_setopt(easy, CURLOPT_FILE, url_data->file);
hunk ./src/hscurl.c 166
-  err += curl_easy_setopt(c, CURLOPT_FOLLOWLOCATION, 1);
-  err += curl_easy_setopt(c, CURLOPT_FAILONERROR, 1);
-  err += curl_easy_setopt(c, CURLOPT_HTTPAUTH, CURLAUTH_ANY);
+  if (error != CURLE_OK)
+    return curl_easy_strerror(error);
+
+  error = curl_easy_setopt(easy, CURLOPT_USERAGENT, user_agent);
+  if (error != CURLE_OK)
+    return curl_easy_strerror(error);
+
+  error = curl_easy_setopt(easy, CURLOPT_FOLLOWLOCATION, 1);
+  if (error != CURLE_OK)
+    return curl_easy_strerror(error);
+
+  error = curl_easy_setopt(easy, CURLOPT_FAILONERROR, 1);
+  if (error != CURLE_OK)
+    return curl_easy_strerror(error);
+
+  error = curl_easy_setopt(easy, CURLOPT_HTTPAUTH, CURLAUTH_ANY);
+  if (error != CURLE_OK)
+    return curl_easy_strerror(error);
+
hunk ./src/hscurl.c 188
-  headers = curl_slist_append(headers, "Accept: */*");
-  if(cache_time == 0) {
-      headers = curl_slist_append(headers, "Pragma: no-cache");
-      headers = curl_slist_append(headers, "Cache-Control: no-cache");
-  } else if(cache_time > 0) {
+  url_data->headers = curl_slist_append(NULL, "Accept: */*");
+  if(cache_time == 0)
+    {
+      url_data->headers =
+        curl_slist_append(url_data->headers, "Pragma: no-cache");
+      url_data->headers =
+        curl_slist_append(url_data->headers, "Cache-Control: no-cache");
+    }
+  else if(cache_time > 0)
+    {
hunk ./src/hscurl.c 200
-      snprintf(buf, 40, "Cache-Control: max-age=%d", cache_time);
-      headers = curl_slist_append(headers, "Pragma:");
-      headers = curl_slist_append(headers, buf);
-  } else {
-      headers = curl_slist_append(headers, "Pragma:");
-      headers = curl_slist_append(headers, "Cache-Control:");
-  }
-  err += curl_easy_setopt(c, CURLOPT_HTTPHEADER, headers);
+      snprintf(buf, sizeof(buf), "Cache-Control: max-age=%d", cache_time);
+      buf[sizeof(buf) - 1] = '\n';
+      url_data->headers = curl_slist_append(url_data->headers, "Pragma:");
+      url_data->headers = curl_slist_append(url_data->headers, buf);
+    }
+  else
+    {
+      url_data->headers = curl_slist_append(url_data->headers, "Pragma:");
+      url_data->headers = curl_slist_append(url_data->headers, "Cache-Control:");
+    }
+  if (url_data->headers == NULL)
+    return error_strings[RESULT_SLIST_APPEND_FAIL];
+
+  error = curl_easy_setopt(easy, CURLOPT_HTTPHEADER, url_data->headers);
+  if (error != CURLE_OK)
+    return curl_easy_strerror(error);
hunk ./src/hscurl.c 217
-  /* don't set the proxy user and password to an empty string */
hunk ./src/hscurl.c 218
-    err += safe_curl_easy_setopt_str(c, CURLOPT_PROXYUSERPWD, proxypass, &persisted_proxypass);
+    {
+      error = curl_easy_setopt(easy, CURLOPT_PROXYUSERPWD, proxypass);
+      if (error != CURLE_OK)
+        return curl_easy_strerror(error);
+    }
+
+  error = curl_multi_add_handle(multi, easy);
+  if (error != CURLM_OK)
+    return curl_multi_strerror(error);
+
+  return error_strings[RESULT_OK];
+}
+
+const char *curl_wait_next_url()
+{
+  if (last_url != NULL)
+    {
+      free(last_url);
+      last_url = NULL;
+    }
+
+  if (msgs_in_queue == 0)
+    {
+      const char *error = perform();
+      if (error != NULL)
+        return error;
+    }
+
+  CURLMsg *msg = curl_multi_info_read(multi, &msgs_in_queue);
+  if (msg == NULL)
+    return error_strings[RESULT_MULTI_INFO_READ_FAIL];
+
+  if (msg->msg == CURLMSG_DONE)
+    {
+      CURL *easy = msg->easy_handle;
+      CURLcode result = msg->data.result;
+      struct UrlData *url_data;
+      int error = curl_easy_getinfo(easy, CURLINFO_PRIVATE, &url_data);
+      if (error != CURLE_OK)
+        return curl_easy_strerror(error);
hunk ./src/hscurl.c 259
-  if(!err)
-      err += curl_easy_perform(c);
-  curl_slist_free_all(headers);
-  fclose(f);
-  return (int)err;
+      last_url = url_data->url;
+      fclose(url_data->file);
+      curl_slist_free_all(url_data->headers);
+      free(url_data);
+
+      error = curl_multi_remove_handle(multi, easy);
+      if (error != CURLM_OK)
+        return curl_multi_strerror(error);
+      curl_easy_cleanup(easy);
+
+      if (result != CURLE_OK)
+        return curl_easy_strerror(result);
+    }
+  else
+    return error_strings[RESULT_UNKNOWN_MESSAGE];
+
+  return error_strings[RESULT_OK];
hunk ./src/hscurl.c 278
+const char *curl_last_url()
+{
+  return last_url != NULL ? last_url : "";
+}
hunk ./src/hscurl.h 1
-int get_curl(const char *darcs_version, const char *proxyuserpass, const char *filename, const char *url, int cache_time);
+const char *curl_request_url(const char *url,
+                             const char *filename,
+                             int cache_time);
hunk ./src/hscurl.h 5
+const char *curl_wait_next_url();
+
+const char *curl_last_url();
}

[Cleanup libwww module, better error handling, follow redirects (closes issue621).
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080127152710] {
hunk ./src/Libwww.hs 5
-import System.IO
hunk ./src/Libwww.hs 8
-import System.Environment ( getEnv )
-import Autoconf ( darcs_version )
+import Monad ( when )
hunk ./src/Libwww.hs 16
-  withCString darcs_version $ \vstr ->
hunk ./src/Libwww.hs 18
-  ppwd <- getProxyUserPwd
-  withCString ppwd $ \pstr -> do
hunk ./src/Libwww.hs 20
-  err <- libwww_request_url vstr pstr ustr fstr (cachableToInt cache)
-  if err /= 0
-     then return $ "Failed to start download URL " ++ u ++ " : " ++ libwwwError err
-     else do debugMessage ("Libwww.copyUrl succeeded")
-             return ""
+  err <- libwww_request_url ustr fstr (cachableToInt cache) >>= peekCString
+  when (null err) (debugMessage "Libwww.copyUrl succeeded")
+  return err
hunk ./src/Libwww.hs 29
-waitNextUrl = do err <- libwww_wait_next_url
-                 if (err /= 0)
-                    then do u <- lastUrl
-                            return $ "Failed to download URL " ++ u ++ " : " ++ libwwwError err
-                    else do debugMessage ("Libwww.waitNextUrl succeeded")
-                            return ""
+waitNextUrl = do err <- libwww_wait_next_url >>= peekCString
+                 when (null err) (debugMessage "Libwww.waitNextUrl succeeded")
+                 return err
hunk ./src/Libwww.hs 44
-libwwwError :: CInt -> String
-libwwwError 1000 = "HTRequest_new()"
-libwwwError 1001 = "HTRequest_addAfter() failed"
-libwwwError 1002 = "HTRequest_addCacheControl() failed"
-libwwwError 1003 = "HTLoadToChunk() failed"
-libwwwError 1004 = "fopen() failed"
-libwwwError 1005 = "fwrite() failed"
-libwwwError 1006 = "malloc() failed"
-libwwwError 1007 = "HTList_new() failed"
-libwwwError 1008 = "HTList_appendObject() failed"
-libwwwError err  = "libwww error code: " ++ show err
-
hunk ./src/Libwww.hs 50
-  libwww_request_url :: CString -> CString -> CString -> CString -> CInt -> IO CInt
+  libwww_request_url :: CString -> CString -> CInt -> IO CString
hunk ./src/Libwww.hs 53
-  libwww_wait_next_url :: IO CInt
+  libwww_wait_next_url :: IO CString
hunk ./src/Libwww.hs 57
-
-getProxyUserPwd :: IO String
-getProxyUserPwd = do
- getEnv "DARCS_PROXYUSERPWD" `catch` (\_ -> return "")
hunk ./src/hslibwww.c 1
+#include "hslibwww.h"
+
hunk ./src/hslibwww.c 11
-    // Start from 1000 to avoid conflict with libwww codes.
-    // libwww codes are defined in HTUtils.h
-    RESULT_REQUEST_NEW_FAIL = 1000,
-    RESULT_REQUEST_ADD_AFTER_FAIL,
+    RESULT_REQUEST_NEW_FAIL,
+    RESULT_NET_ADD_AFTER_FAIL,
hunk ./src/hslibwww.c 14
-    RESULT_LOAD_TO_CHUNK_FAIL,
-    RESULT_FILE_OPEN_FAIL,
-    RESULT_FILE_WRITE_FAIL,
+    RESULT_LOAD_TO_FILE_FAIL,
hunk ./src/hslibwww.c 17
-    RESULT_LIST_APPEND_OBJECT_FAIL
+    RESULT_LIST_APPEND_OBJECT_FAIL,
+    RESULT_EVENTLIST_NEW_LOOP
+  };
+
+static const char *error_strings[] =
+  {
+    "",
+    "HTRequest_new() failed",
+    "HTNet_addAfter() failed",
+    "HTRequest_addCacheControl() failed",
+    "HTLoadToFile() failed",
+    "malloc() failed",
+    "HTList_new() failed",
+    "HTList_appendObject() failed",
+    "HTEventList_newLoop() failed"
hunk ./src/hslibwww.c 40
+static const char user_agent[] = "darcs";
hunk ./src/hslibwww.c 42
-static const char *const user_agent = "darcs";
hunk ./src/hslibwww.c 45
+static char libwww_error[80];
+static const char libwww_error_fmt[] = "libwww error code: %i";
hunk ./src/hslibwww.c 53
-  HTChunk *const chunk = HTRequest_context(request);
-
hunk ./src/hslibwww.c 66
-          if (status == HT_LOADED)
-            {
-              completed->error = RESULT_OK;
-              FILE *const f = fopen(param, "wb");
-              if (f != NULL)
-                {
-                  if (HTChunk_size(chunk) > 0)
-                    {
-                      if (fwrite(HTChunk_data(chunk),
-                                 HTChunk_size(chunk), 1, f) != 1)
-                        completed->error = RESULT_FILE_WRITE_FAIL;
-                    }
-                  fclose(f);
-                }
-              else
-                completed->error = RESULT_FILE_OPEN_FAIL;
-            }
-          else
-            completed->error = status;
-
-          completed->url = HTAnchor_address((HTAnchor *)HTRequest_anchor(request));
+          completed->error = status;
+          completed->url = HTRequest_context(request);
hunk ./src/hslibwww.c 71
-  HTChunk_delete(chunk);
hunk ./src/hslibwww.c 72
-  free(param);
-
hunk ./src/hslibwww.c 75
-  return HT_ALL;
+  return HT_ERROR;
hunk ./src/hslibwww.c 78
-int libwww_request_url(const char *darcs_version,
-                       const char *proxypass,
-                       const char *url,
-                       const char *filename,
-                       int cache_time)
+const char *libwww_request_url(const char *url,
+                               const char *filename,
+                               int cache_time)
hunk ./src/hslibwww.c 82
-  error = RESULT_OK;
hunk ./src/hslibwww.c 86
-      HTProfile_newNoCacheClient(user_agent, darcs_version);
+      HTProfile_newNoCacheClient(user_agent, PACKAGE_VERSION);
hunk ./src/hslibwww.c 100
-        return error = RESULT_LIST_NEW_FAIL;
+        return error_strings[RESULT_LIST_NEW_FAIL];
hunk ./src/hslibwww.c 105
-    error = RESULT_REQUEST_NEW_FAIL;
+    return error_strings[RESULT_REQUEST_NEW_FAIL];
hunk ./src/hslibwww.c 107
-  result = HTRequest_addAfter(request, terminate_handler, NULL, strdup(filename),
-                              HT_ALL, HT_FILTER_LAST, YES);
+  HTRequest_setContext(request, strdup(url));
+
+  result = HTNet_addAfter(terminate_handler, NULL, NULL, HT_ALL, HT_FILTER_LAST);
hunk ./src/hslibwww.c 111
-    return error = RESULT_REQUEST_ADD_AFTER_FAIL;
+    return error_strings[RESULT_NET_ADD_AFTER_FAIL];
hunk ./src/hslibwww.c 113
-  HTRequest_setOutputFormat(request, WWW_SOURCE);
hunk ./src/hslibwww.c 125
-  if (result == FALSE)
-    return error = RESULT_REQUEST_ADD_CACHE_CONTROL_FAIL;
+  if (result == NO)
+    return error_strings[RESULT_REQUEST_ADD_CACHE_CONTROL_FAIL];
hunk ./src/hslibwww.c 128
-  HTChunk *const chunk = HTLoadToChunk(url, request);
-  if (chunk == NULL)
-    return error = RESULT_LOAD_TO_CHUNK_FAIL;
-  else
-    HTRequest_setContext(request, chunk);
+  result = HTLoadToFile(url, request, filename);
+  if (result == NO)
+    return error_strings[RESULT_LOAD_TO_FILE_FAIL];
hunk ./src/hslibwww.c 132
-  return error;
+  return error_strings[RESULT_OK];
hunk ./src/hslibwww.c 135
-int libwww_wait_next_url()
+const char *libwww_wait_next_url()
hunk ./src/hslibwww.c 145
-      HTNet_isEmpty() == NO){
-    HTEventList_newLoop();
-  }
+      HTNet_isEmpty() == NO &&
+      HTEventList_newLoop() != HT_OK)
+    return error_strings[RESULT_EVENTLIST_NEW_LOOP];
hunk ./src/hslibwww.c 152
-      error = completed->error;
+      if (completed->error == HT_LOADED)
+        libwww_error[0] = '\0';
+      else
+        {
+          snprintf(libwww_error, sizeof(libwww_error),
+                   libwww_error_fmt, completed->error);
+          libwww_error[sizeof(libwww_error) - 1] = '\0';
+        }
hunk ./src/hslibwww.c 163
+
+      return libwww_error;
hunk ./src/hslibwww.c 167
-  return error;
+  return error_strings[error];
hunk ./src/hslibwww.c 170
-char *libwww_last_url()
+const char *libwww_last_url()
hunk ./src/hslibwww.h 1
-int libwww_request_url(const char *darcs_version,
-                       const char *proxyuserpass,
-                       const char *url,
-                       const char *filename,
-                       int cache_time);
+const char *libwww_request_url(const char *url,
+                               const char *filename,
+                               int cache_time);
hunk ./src/hslibwww.h 5
-int libwww_wait_next_url();
+const char *libwww_wait_next_url();
hunk ./src/hslibwww.h 7
-char *libwww_last_url();
+const char *libwww_last_url();
}

[Fix darcs version in libwww user agent.
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080127181548] {
hunk ./src/hslibwww.c 5
+
+static const char darcs_version[] = PACKAGE_VERSION;
+
hunk ./src/hslibwww.c 43
-static const char user_agent[] = "darcs";
hunk ./src/hslibwww.c 88
-      HTProfile_newNoCacheClient(user_agent, PACKAGE_VERSION);
+      HTProfile_newNoCacheClient("darcs", darcs_version);
}

[Merge Curl and Libwww to URL module.
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080127182001] {
hunk ./src/Curl.hs 1
-{-# OPTIONS -fffi #-}
-module Curl ( copyUrl, waitNextUrl, lastUrl,
-              Cachable(Cachable, Uncachable, MaxAge) )
-where
-
-#ifdef HAVE_CURL
-import Foreign.C.Types ( CInt )
-import Foreign.C.String ( withCString, peekCString, CString )
-import Monad ( when )
-import Darcs.Progress ( debugMessage )
-#endif
-
-data Cachable = Cachable | Uncachable | MaxAge !CInt
-
-copyUrl :: String -> String -> Cachable -> IO String
-#ifdef HAVE_CURL
-copyUrl u f cache =
-  withCString u $ \ustr ->
-  withCString f $ \fstr -> do
-  debugMessage ("Curl.copyUrl ("++u++"\n"++
-                "              -> "++f++")")
-  err <- curl_request_url ustr fstr (cachableToInt cache) >>= peekCString
-  when (null err) (debugMessage "Curl.copyUrl succeeded")
-  return err
-#else
-copyUrl _ _ _ = fail "There is no libcurl!"
-#endif
-
-waitNextUrl :: IO String
-#ifdef HAVE_CURL
-waitNextUrl = do err <- curl_wait_next_url >>= peekCString
-                 when (null err) (debugMessage "Curl.waitNextUrl succeeded")
-                 return err
-#else
-waitNextUrl = fail "There is no libcurl!"
-#endif
-
-lastUrl :: IO String
-#ifdef HAVE_CURL
-lastUrl = curl_last_url >>= peekCString
-#else
-lastUrl = fail "There is no libcurl!"
-#endif
-
-#ifdef HAVE_CURL
-cachableToInt :: Cachable -> CInt
-cachableToInt Cachable = -1
-cachableToInt Uncachable = 0
-cachableToInt (MaxAge n) = n
-
-foreign import ccall "hscurl.h curl_request_url"
-  curl_request_url :: CString -> CString -> CInt -> IO CString
-
-foreign import ccall "hscurl.h curl_wait_next_url"
-  curl_wait_next_url :: IO CString
-
-foreign import ccall "hscurl.h curl_last_url"
-  curl_last_url :: IO CString
-#endif
rmfile ./src/Curl.hs
hunk ./src/Libwww.hs 1
-{-# OPTIONS -fffi #-}
-module Libwww ( copyUrl, waitNextUrl, lastUrl )
-where
-
-#ifdef HAVE_LIBWWW
-import Foreign.C.Types ( CInt )
-import Foreign.C.String ( withCString, peekCString, CString )
-import Monad ( when )
-import Darcs.Progress ( debugMessage )
-#endif
-import Curl ( Cachable(..) )
-
-copyUrl :: String -> String -> Cachable -> IO String
-#ifdef HAVE_LIBWWW
-copyUrl u f cache =
-  withCString u $ \ustr ->
-  withCString f $ \fstr -> do
-  debugMessage ("Libwww.copyUrl ("++u++"\n"++
-                "              -> "++f++")")
-  err <- libwww_request_url ustr fstr (cachableToInt cache) >>= peekCString
-  when (null err) (debugMessage "Libwww.copyUrl succeeded")
-  return err
-#else
-copyUrl _ _ _ = fail "There is no libwww!"
-#endif
-
-waitNextUrl :: IO String
-#ifdef HAVE_LIBWWW
-waitNextUrl = do err <- libwww_wait_next_url >>= peekCString
-                 when (null err) (debugMessage "Libwww.waitNextUrl succeeded")
-                 return err
-#else
-waitNextUrl = fail "There is no libwww!"
-#endif
-
-lastUrl :: IO String
-#ifdef HAVE_LIBWWW
-lastUrl = libwww_last_url >>= peekCString
-#else
-lastUrl = fail "There is no libwww!"
-#endif
-
-#ifdef HAVE_LIBWWW
-cachableToInt :: Cachable -> CInt
-cachableToInt Cachable = -1
-cachableToInt Uncachable = 0
-cachableToInt (MaxAge n) = n
-
-foreign import ccall "hslibwww.h libwww_request_url"
-  libwww_request_url :: CString -> CString -> CInt -> IO CString
-
-foreign import ccall "hslibwww.h libwww_wait_next_url"
-  libwww_wait_next_url :: IO CString
-
-foreign import ccall "hslibwww.h libwww_last_url"
-  libwww_last_url :: IO CString
-#endif
rmfile ./src/Libwww.hs
hunk ./GNUmakefile 48
-	URL.hs Curl.hs Libwww.hs HTTP.hs DateMatcher.lhs \
+	URL.hs HTTP.hs DateMatcher.lhs \
hunk ./src/HTTP.hs 3
-import Curl(Cachable(..))
+import URL(Cachable(..))
hunk ./src/URL.hs 1
+{-# OPTIONS -fffi #-}
hunk ./src/URL.hs 11
-import qualified Libwww as WWW ( copyUrl, waitNextUrl, lastUrl )
-import qualified Curl as Curl ( copyUrl, waitNextUrl, lastUrl )
-import Curl ( Cachable(..) )
hunk ./src/URL.hs 12
+import Foreign.C.Types ( CInt )
+import Foreign.C.String ( withCString, peekCString, CString )
+import Darcs.Progress ( debugMessage )
+
+#if !defined(HAVE_CURL) || !defined(HAVE_LIBWWW)
+import Foreign.Ptr ( nullPtr )
+#endif
+
+data Cachable = Cachable | Uncachable | MaxAge !CInt
hunk ./src/URL.hs 64
-                    then do err <- if have_libwww then WWW.waitNextUrl
-                                                  else Curl.waitNextUrl
-                            url <- if have_libwww then WWW.lastUrl
-                                                  else Curl.lastUrl
+                    then do err <- waitNextUrl'
+                            url <- lastUrl'
hunk ./src/URL.hs 73
-                                                          ++url++" : "++err)
+                                                          ++url++": "++err)
hunk ./src/URL.hs 76
+    where waitNextUrl' = do let fn = if have_libwww then libwww_wait_next_url
+                                                    else curl_wait_next_url
+                            err <- fn >>= peekCString
+                            when (null err) (debugMessage "URL.waitNextUrl succeeded")
+                            return err
+          lastUrl' = let fn = if have_libwww then libwww_last_url
+                                             else curl_last_url
+                     in fn >>= peekCString
hunk ./src/URL.hs 102
-                                       err <- if have_libwww then WWW.copyUrl u f c
-                                                             else Curl.copyUrl u f c
+                                       err <- copyUrl' u f c
hunk ./src/URL.hs 104
-                                                                     ++u++" : "++err)
+                                                                     ++u++": "++err)
hunk ./src/URL.hs 107
+    where copyUrl' u f cache = withCString u $ \ustr ->
+                               withCString f $ \fstr -> do
+                               debugMessage ("URL.copyUrl ("++u++"\n"++
+                                             "          -> "++f++")")
+                               let fn = if have_libwww then libwww_request_url
+                                                       else curl_request_url
+                               err <- fn ustr fstr (cachableToInt cache) >>= peekCString
+                               when (null err) (debugMessage "URL.copyUrl succeeded")
+                               return err
+
+cachableToInt :: Cachable -> CInt
+cachableToInt Cachable = -1
+cachableToInt Uncachable = 0
+cachableToInt (MaxAge n) = n
+
+#ifdef HAVE_CURL
+foreign import ccall "hscurl.h curl_request_url"
+  curl_request_url :: CString -> CString -> CInt -> IO CString
+
+foreign import ccall "hscurl.h curl_wait_next_url"
+  curl_wait_next_url :: IO CString
+
+foreign import ccall "hscurl.h curl_last_url"
+  curl_last_url :: IO CString
+#else
+no_curl :: IO ()
+no_curl = fail "There is no libcurl!"
+
+curl_request_url :: CString -> CString -> CInt -> IO CString
+curl_request_url _ _ _ = no_curl >> return nullPtr
+
+curl_wait_next_url :: IO CString
+curl_wait_next_url = no_curl >> return nullPtr
+
+curl_last_url :: IO CString
+curl_last_url = no_curl >> return nullPtr
+#endif
+
+#ifdef HAVE_LIBWWW
+foreign import ccall "hslibwww.h libwww_request_url"
+  libwww_request_url :: CString -> CString -> CInt -> IO CString
+
+foreign import ccall "hslibwww.h libwww_wait_next_url"
+  libwww_wait_next_url :: IO CString
+
+foreign import ccall "hslibwww.h libwww_last_url"
+  libwww_last_url :: IO CString
+#else
+no_libwww :: IO ()
+no_libwww = fail "There is no libwww!"
+
+libwww_request_url :: CString -> CString -> CInt -> IO CString
+libwww_request_url _ _ _ = no_libwww >> return nullPtr
+
+libwww_wait_next_url :: IO CString
+libwww_wait_next_url = no_libwww >> return nullPtr
+
+libwww_last_url :: IO CString
+libwww_last_url = no_libwww >> return nullPtr
+#endif
}

Context:

[resolve conflict with Eric on controlMasterPath.
David Roundy <[EMAIL PROTECTED]>**20080125203903] 
[More concise backup warning.
Eric Kow <[EMAIL PROTECTED]>**20071105012930] 
[Remove now obsolete wrapper for Map (we now require GHC >= 6.4).
Eric Kow <[EMAIL PROTECTED]>**20071105192636] 
[Modernise Data.Map import.
Eric Kow <[EMAIL PROTECTED]>**20071105192530] 
[Give ssh CM socket a unique name for each darcs process.
Eric Kow <[EMAIL PROTECTED]>**20071105021956
 Delete the socket in the unlikely event that a previous darcs had a socket
 with the same name left over.
] 
[Create ssh CM socket in $HOME/.darcs if possible.
Eric Kow <[EMAIL PROTECTED]>**20071105015525] 
[Refactor y/n prompts.
Eric Kow <[EMAIL PROTECTED]>**20071019213307] 
[issue578: steve and monica test for rolling back a rollback
Mark Stosberg <[EMAIL PROTECTED]>**20080118031606] 
[eliminate lazy parsing of patches, which gives bad error messages (issue364)
David Roundy <[EMAIL PROTECTED]>**20080125191836] 
[[issue492] Check that context file actually exists in darcs get.
Eric Kow <[EMAIL PROTECTED]>**20080125183741] 
[[issue227] Platform-independent absolute paths in get --context
Eric Kow <[EMAIL PROTECTED]>**20080125181702] 
[make uniqueoptions.sh test give friendlier output.
David Roundy <[EMAIL PROTECTED]>**20080125183430] 
[fix code to avoid duplicate --verbose in --help (so tests will pass).
David Roundy <[EMAIL PROTECTED]>**20080125183420] 
[Make verbosity flags advanced options universally.
Eric Kow <[EMAIL PROTECTED]>**20080125181005] 
[adding File::Temp 0.20 to tree for more consistent test results. It is GPL-licensed.
Mark Stosberg <[EMAIL PROTECTED]>**20080124033049] 
[update restrictive perms test to run a temporary directory and clean up after itself.
Mark Stosberg <[EMAIL PROTECTED]>**20080123000417
     Running in a tru temporary directory allows the potential to run tests in parallel.
] 
[update some ChangeLog entries to also credit those who contributed through bug reporting, test writing or feedback. 
Mark Stosberg <[EMAIL PROTECTED]>**20080122235435] 
[ issue602: part 1: Always prefer our private copy of Test::More over the system-wide one for more consistent results
Mark Stosberg <[EMAIL PROTECTED]>**20080124005407] 
[ issue602, part 2: freshen our versions of Test::More and Test::Builder
Mark Stosberg <[EMAIL PROTECTED]>**20080123013642] 
[More error messages for libwww.
Dmitry Kurochkin <[EMAIL PROTECTED]>**20080124092600] 
[issue608: a new test for 'mv', following Zooko's bug report
Mark Stosberg <[EMAIL PROTECTED]>**20080124013856] 
[report progress in writing the inventory out for hashed repos.
David Roundy <[EMAIL PROTECTED]>**20080125172017] 
[make empty key case of progress reporting fast.
David Roundy <[EMAIL PROTECTED]>**20080125171859] 
[fix issue where we overwrote prompt with progress info.
David Roundy <[EMAIL PROTECTED]>**20080125164609] 
[fix bug where we used show on an exception (and thus printed "User error").
David Roundy <[EMAIL PROTECTED]>**20080125164209
 This partially addresses issue168 by improving the error message.
] 
[add gnulib sha1.c file as a faster sha1 option.
David Roundy <[EMAIL PROTECTED]>**20080123212502] 
[fix embarrassing bug in External.
David Roundy <[EMAIL PROTECTED]>**20080125152329
 (which demonstrates that I didn't compile before pushing)
] 
[for now, print progress reports to stdout.
David Roundy <[EMAIL PROTECTED]>**20080125152105
 My hope is that this will alleviate some of the issues with progress
 reports overwriting prompts.
] 
[revamp progress reporting, making it more efficient and adding more output.
David Roundy <[EMAIL PROTECTED]>**20080125151540
 Note that there is still at least one time sink that remains to be identified.
] 
[avoid creating darcs-ssh if we aren't using ControlMaster. (issue613)
David Roundy <[EMAIL PROTECTED]>**20080125150846] 
[fix bug where darcs-ssh got even worse name (issue613).
David Roundy <[EMAIL PROTECTED]>**20080125150355] 
[provide more detailed progress reports in HashedIO.
David Roundy <[EMAIL PROTECTED]>**20080124145156] 
[print additional debug data in Progress.
David Roundy <[EMAIL PROTECTED]>**20080124145114] 
[add a few more debug messages in Repository.Internal.
David Roundy <[EMAIL PROTECTED]>**20080124144829] 
[fix incorrect report that we were reading patches.
David Roundy <[EMAIL PROTECTED]>**20080124125040] 
[reenable mandatory sha1 checks, now that we can link with a faster sha1.
David Roundy <[EMAIL PROTECTED]>**20080123203104] 
[remove (broken) git support and add openssl sha1 support.
David Roundy <[EMAIL PROTECTED]>**20080123202025
 These two changes got merged together as I was introducing the configure.ac
 changes to support openssl as a sha1 alternative to our Haskell code.
 
 (Yes, I'm lazy.)
] 
[remove redundant hash checks in hashed IO code.
David Roundy <[EMAIL PROTECTED]>**20080123173022] 
[output nicer progress in convert.
David Roundy <[EMAIL PROTECTED]>**20080123170428] 
[output timings when --timings is specified.
David Roundy <[EMAIL PROTECTED]>**20080123170314] 
[remove inaccurate message in convert.
David Roundy <[EMAIL PROTECTED]>**20080123170243] 
[use debugMessage in HashedIO.
David Roundy <[EMAIL PROTECTED]>**20080123160835] 
[add --timings flag (that as yet does nothing).
David Roundy <[EMAIL PROTECTED]>**20080123154931] 
[Major Perl test suite clean-up.
Mark Stosberg <[EMAIL PROTECTED]>**20080120035651
     The primary purpose of this patch was make sure all the tests are executed in
     randomly named directories, which allows us to run Perl tests in parallel, 
     without the directory names collided. 
 
     This isn't enabled by default for "make test", but it is there to play with.
     In the test directory, you can now do:
 
     ./bin/prove -j9 *.pl 
     
     to run 9 tests in parallel. There is also "--fork"
     option which should be a win on multi-CPU computers. 
     See "perldoc ./bin/prove" for details. 
 
     As part of this, a lot of boiler-plate code at the top and bottom of the
     scripts could be eliminated, and I made few other minor style clean-ups
     while I had the files open. 
 
     There should be no functional changes to the tests. 
] 
[Take advantage of new Perl testing infrastructure by eliminating needless --ignore-time mentions
Mark Stosberg <[EMAIL PROTECTED]>**20080120005242] 
[Take advantage of updated Perl testing infrastructure by removing needless author mentions in tests
Mark Stosberg <[EMAIL PROTECTED]>**20080120004503] 
[use --ignore-time in tests instead of "sleep", for faster, more reliable results
Mark Stosberg <[EMAIL PROTECTED]>**20080118030241] 
[Issue395: avoid single letter patch names in the test suite.  
Mark Stosberg <[EMAIL PROTECTED]>**20080118020634] 
[add regression test for amend-record removed file
Tommy Pettersson <[EMAIL PROTECTED]>**20080122223231] 
[use UTC in date matching test untill match handles time zones
Tommy Pettersson <[EMAIL PROTECTED]>**20080122134322] 
[fix bug with timestamps and obliterate.
David Roundy <[EMAIL PROTECTED]>**20080122224607] 
[Test: unpull may hide changes when using timestamp optimisation.
[EMAIL PROTECTED] 
[avoid printing totals that are less than our current progress.
David Roundy <[EMAIL PROTECTED]>**20080122210546] 
[TAG 2.0.0pre3
David Roundy <[EMAIL PROTECTED]>**20080122200612] 
Patch bundle hash:
967faa9f28e724d40f637685e335a61693d89ae3
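For readers skimming the merged URL module above: the Cachable-to-CInt convention it passes across the FFI boundary (-1 for cache indefinitely, 0 for uncachable, n for a max age in seconds) can be sketched as a standalone snippet. The type and helper mirror the patch's `Cachable` and `cachableToInt`, but this sketch omits the FFI imports and is only illustrative, not part of the bundle:

```haskell
import Foreign.C.Types (CInt)

-- Mirrors the patch's Cachable type; the CInt result is what
-- curl_request_url/libwww_request_url receive as their third argument.
data Cachable = Cachable | Uncachable | MaxAge !CInt

cachableToInt :: Cachable -> CInt
cachableToInt Cachable   = -1   -- cache indefinitely
cachableToInt Uncachable = 0    -- bypass the cache
cachableToInt (MaxAge n) = n    -- cache for n seconds

main :: IO ()
main = print (map cachableToInt [Cachable, Uncachable, MaxAge 600])
```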