It sounds good, but I wonder whether download performance will be impaired. :'-(
It should be tested much more thoroughly.
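
For reference, here is how I read Gustavo's suggestion below. This is only a
rough sketch under my own assumptions, not a tested patch: the standalone
helper and its name are made up for illustration, while in the real code the
loop sits inline in ecore_con_url.c right after curl_multi_add_handle().

#include <Ecore.h>
#include <curl/curl.h>

/* Sketch only: budget a fraction of one frame for curl_multi_perform(),
 * so the main loop keeps enough time for rendering and input handling.
 * _curlm and still_running mirror the names used in the patches below. */
static void
_url_perform_budgeted(CURLM *_curlm, int *still_running)
{
   double start = ecore_time_get();
   /* leave ~30% of each frame for the rest of the main loop */
   double budget = 0.7 * ecore_animator_frametime_get();

   while (curl_multi_perform(_curlm, still_running) == CURLM_CALL_MULTI_PERFORM)
     {
        if ((ecore_time_get() - start) > budget)
          break; /* hand control back; Ecore_Con_Url's idler handler resumes the work later */
     }
}

If download throughput does drop, that 0.7 factor is the obvious knob to tune.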

On Thu, Sep 22, 2011 at 8:13 PM, Gustavo Sverzut Barbieri <[email protected]> wrote:

> Better to use a fraction of the frametime, otherwise the user will try to do
> some work and there will not be enough time left for it. Something like
> 0.7 * ecore_animator_frametime_get()
>
> On Thursday, September 22, 2011, Cedric BAIL <[email protected]> wrote:
> > On Thu, Sep 22, 2011 at 12:51 AM, Kim Yunhan <[email protected]> wrote:
> >> Thank you!
> >> Ecore_Con_Url already has a solution for this with _ecore_con_url_idler_handler,
> >> so I just break out of the while loop if it takes too long.
> >>
> >> ==================================================================
> >> --- src/lib/ecore_con/ecore_con_url.c (revision 63520)
> >> +++ src/lib/ecore_con/ecore_con_url.c (working copy)
> >> @@ -1357,15 +1357,21 @@
> >>    int fd_max, fd;
> >>    int flags, still_running;
> >>    int completed_immediately = 0;
> >> +   double start;
> >>    CURLMcode ret;
> >>
> >>    _url_con_list = eina_list_append(_url_con_list, url_con);
> >>
> >>    url_con->active = EINA_TRUE;
> >>    curl_multi_add_handle(_curlm, url_con->curl_easy);
> >> -   /* This one can't be stopped, or the download never start. */
> >> -   while (curl_multi_perform(_curlm, &still_running) == CURLM_CALL_MULTI_PERFORM) ;
> >>
> >> +   start = ecore_time_get();
> >> +   while (curl_multi_perform(_curlm, &still_running) == CURLM_CALL_MULTI_PERFORM)
> >> +     if ((ecore_time_get() - start) > ecore_animator_frametime_get())
> >> +       {
> >> +          break;
> >> +       }
> >> +
> >>    completed_immediately = _ecore_con_url_process_completed_jobs(url_con);
> >>
> >>    if (!completed_immediately)
> >>
> >>
> >> It works well for me.
> >> How about this code?
> >> Please review again.
> >
> > Sounds good to me. If nobody applies it, I will in a few hours.
> >
> > Thanks,
> >
> >> Thank you once again.
> >>
> >> On Thu, Sep 22, 2011 at 4:46 AM, Cedric BAIL <[email protected]> wrote:
> >>
> >>> On Wed, Sep 21, 2011 at 6:18 PM, Kim Yunhan <[email protected]> wrote:
> >>> > Thank you for your advice.
> >>> >
> >>> > libcurl already supports asynchronous DNS lookups (e.g. via c-ares).
> >>> > Ecore_Con_Url is integrated with libcurl.
> >>> > But I think the code below blocks libcurl's asynchronous mechanism:
> >>> > while (curl_multi_perform(_curlm, &still_running) == CURLM_CALL_MULTI_PERFORM) ;
> >>> >
> >>> > I want to keep the fix simple. :)
> >>>
> >>> Agreed, I haven't looked at that code in months or years, but why do
> >>> we have a 'while' here? Shouldn't we just go back to the main loop
> >>> and be magically called again? Did you try that solution? If that
> >>> works, it would be a much better fix in my opinion.
> >>>
> >>> > On Thu, Sep 22, 2011 at 12:48 AM, Nicolas Aguirre <[email protected]> wrote:
> >>> >
> >>> >> 2011/9/21 Kim Yunhan <[email protected]>:
> >>> >> > Hello!
> >>> >> >
> >>> >> > elm_map uses Ecore_Con, which uses libcurl.
> >>> >> > I tested elm_map many times on my device.
> >>> >> > But sometimes UI interaction stalls when the data connection is poor.
> >>> >> > So I tried to debug it, and I found that this code blocks the Ecore
> >>> >> > main loop.
> >>> >> >
> >>> >> > ----------------------------------------------
> >>> >> > In ecore_con_url.c
> >>> >> >
> >>> >> > while (curl_multi_perform(_curlm, &still_running) == CURLM_CALL_MULTI_PERFORM) ;
> >>> >> > ----------------------------------------------
> >>> >> >
> >>> >> > curl_multi_perform() is libcurl's asynchronous API,
> >>> >> > but the code above blocks the Ecore main loop.
> >>> >> > When libcurl takes a long time, UI interaction is delayed.
> >>> >> >
> >>> >> > For example, if you have a poor data connection,
> >>> >> > libcurl tries to resolve DNS in this step
> >>> >> > but has to wait until the timeout.
> >>> >> > During that time the application looks locked up.
> >>> >> >
> >>> >> > So I wrote a patch to fix it.
> >>> >> >
> >>> >> > Index: ecore_con_url.c
> >>> >> > ===================================================================
> >>> >> > --- ecore_con_url.c (revision 63518)
> >>> >> > +++ ecore_con_url.c (working copy)
> >>> >> > @@ -1364,7 +1364,10 @@
> >>> >> >    url_con->active = EINA_TRUE;
> >>> >> >    curl_multi_add_handle(_curlm, url_con->curl_easy);
> >>> >> >    /* This one can't be stopped, or the download never start. */
> >>> >> > -   while (curl_multi_perform(_curlm, &still_running) == CURLM_CALL_MULTI_PERFORM) ;
> >>> >> > +   while (curl_multi_perform(_curlm, &still_running) == CURLM_CALL_MULTI_PERFORM)
> >>> >> > +     {
> >>> >> > +        ecore_main_loop_iterate();
> >>> >> > +     }
> >>> >> >
> >>> >> >    completed_immediately = _ecore_con_url_process_completed_jobs(url_con);
> >>> >> >
> >>> >> > I am not sure that this patch is right because I don't understand
> >>> >> > Ecore and libcurl deeply.
> >>> >> > So I need your help. :)
> >>> >> > Please review this patch.
> >>> >> >
> >>> >> > Best regards,
> >>> >> > Yunhan Kim (spbear)
> >>> >> >
> >>> >> >
> >>> >>
> >>>
>
> >
>
> --
> Gustavo Sverzut Barbieri
> http://profusion.mobi embedded systems
> --------------------------------------
> MSN: [email protected]
> Skype: gsbarbieri
> Mobile: +55 (19) 9225-2202
