Gitweb links:

...log 
http://git.netsurf-browser.org/libhubbub.git/shortlog/0580bd8eed161949889506e792979de2458852db
...commit 
http://git.netsurf-browser.org/libhubbub.git/commit/0580bd8eed161949889506e792979de2458852db
...tree 
http://git.netsurf-browser.org/libhubbub.git/tree/0580bd8eed161949889506e792979de2458852db

The branch, master, has been updated
       via  0580bd8eed161949889506e792979de2458852db (commit)
      from  9199df40bc95d4d54a954d2d2d4d0ec74b0e5163 (commit)

Those revisions listed above that are new to this repository have
not appeared in any other notification email, so we list those
revisions in full below.

- Log -----------------------------------------------------------------
commitdiff 
http://git.netsurf-browser.org/libhubbub.git/commit/?id=0580bd8eed161949889506e792979de2458852db
commit 0580bd8eed161949889506e792979de2458852db
Author: Daniel Silverstone <[email protected]>
Commit: Daniel Silverstone <[email protected]>

    Consume insert_buf when resuming a parse
    
    Signed-off-by: Daniel Silverstone <[email protected]>

diff --git a/src/tokeniser/tokeniser.c b/src/tokeniser/tokeniser.c
index a7e67a1..2d9c4ed 100644
--- a/src/tokeniser/tokeniser.c
+++ b/src/tokeniser/tokeniser.c
@@ -393,6 +393,24 @@ hubbub_error hubbub_tokeniser_setopt(hubbub_tokeniser *tokeniser,
                } else {
                        if (tokeniser->paused == true) {
                                tokeniser->paused = false;
+                               /* When unpausing, if we have had something
+                                * akin to document.write() happen while
+                                * we were paused, then the insert_buf will
+                                * have some content.
+                                * In this case, we need to prepend it to
+                                * the input buffer before we resume parsing,
+                                * discarding the insert_buf as we go.
+                                */
+                               if (tokeniser->insert_buf->length > 0) {
+                                       parserutils_inputstream_insert(
+                                               tokeniser->input,
+                                               tokeniser->insert_buf->data,
+                                               tokeniser->insert_buf->length);
+                                       parserutils_buffer_discard(
+                                               tokeniser->insert_buf, 0,
+                                               tokeniser->insert_buf->length);
+                               }
+
                                err = hubbub_tokeniser_run(tokeniser);
                        }
                }


-----------------------------------------------------------------------

Summary of changes:
 src/tokeniser/tokeniser.c |   18 ++++++++++++++++++
 1 file changed, 18 insertions(+)

-- 
HTML5 parser library

_______________________________________________
netsurf-commits mailing list
[email protected]
http://listmaster.pepperfish.net/cgi-bin/mailman/listinfo/netsurf-commits-netsurf-browser.org
