It looks like html_link_find allocates an 8,200-byte block at crawler.c:440, and
the sprintf at crawler.c:452 then writes 1 byte just past the end of that block
("0 bytes after a block of size 8,200"). Most likely the buffer is sized one
byte too small for the formatted string, so the terminating '\0' lands beyond
the allocation.
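I can't see the exact code at crawler.c:440/452, but a common pattern that
produces exactly this report is sizing the buffer for the characters of the
string and forgetting the trailing '\0' that sprintf appends. A minimal sketch
of what that might look like (function and variable names here are
illustrative, not taken from your crawler.c):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical buggy version: the allocation covers the characters of the
 * formatted string but not sprintf's terminating '\0', so the last byte is
 * written one past the end of the block -- the "Invalid write of size 1 ...
 * 0 bytes after a block" that Memcheck reports. */
char *build_url(const char *base, const char *path)
{
    size_t len = strlen(base) + strlen(path);
    char *url = malloc(len);                  /* one byte too small */
    sprintf(url, "%s%s", base, path);         /* writes len + 1 bytes */
    return url;
}

/* Fixed version: allocate room for the '\0' and let snprintf enforce the
 * bound. */
char *build_url_fixed(const char *base, const char *path)
{
    size_t len = strlen(base) + strlen(path);
    char *url = malloc(len + 1);
    if (url != NULL)
        snprintf(url, len + 1, "%s%s", base, path);
    return url;
}

If the allocation at crawler.c:440 is a fixed 8,200-byte block, the same thing
happens whenever the formatted link reaches 8,200 characters; switching the
sprintf to snprintf with the buffer size will at least turn the overflow into
truncation you can detect. The later assertion failure and the bogus "hi" size
are just the heap metadata getting trampled by that overflow, as the Memcheck
message says, so fixing the invalid write should make them disappear.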

On Thu, Jun 4, 2020, 22:28 James Read <jamesread5...@gmail.com> wrote:

> Here is my valgrind output that I don't understand:
>
> ==319842== Invalid write of size 1
> ==319842==    at 0x48436E4: mempcpy (in
> /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
> ==319842==    by 0x50CD1D8: _IO_default_xsputn (genops.c:386)
> ==319842==    by 0x50CD1D8: _IO_default_xsputn (genops.c:370)
> ==319842==    by 0x50B227B: __vfprintf_internal (vfprintf-internal.c:1688)
> ==319842==    by 0x50C0278: __vsprintf_internal (iovsprintf.c:95)
> ==319842==    by 0x509D047: sprintf (sprintf.c:30)
> ==319842==    by 0x10B88F: html_link_find (crawler.c:452)
> ==319842==    by 0x10BD6F: html_parse (crawler.c:536)
> ==319842==    by 0x10C2CB: check_multi_info (crawler.c:678)
> ==319842==    by 0x10C3DA: event_cb (crawler.c:706)
> ==319842==    by 0x10D828: crawler_init (crawler.c:1154)
> ==319842==    by 0x10DAE8: main (crawler.c:1207)
> ==319842==  Address 0xf107d18 is 0 bytes after a block of size 8,200
> alloc'd
> ==319842==    at 0x483B7F3: malloc (in
> /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
> ==319842==    by 0x10B736: html_link_find (crawler.c:440)
> ==319842==    by 0x10BD6F: html_parse (crawler.c:536)
> ==319842==    by 0x10C2CB: check_multi_info (crawler.c:678)
> ==319842==    by 0x10C3DA: event_cb (crawler.c:706)
> ==319842==    by 0x10D828: crawler_init (crawler.c:1154)
> ==319842==    by 0x10DAE8: main (crawler.c:1207)
> ==319842==
>
> valgrind: m_mallocfree.c:305 (get_bszB_as_is): Assertion 'bszB_lo ==
> bszB_hi' failed.
> valgrind: Heap block lo/hi size mismatch: lo = 8272, hi =
> 3625731377157460067.
> This is probably caused by your program erroneously writing past the
> end of a heap block and corrupting heap metadata.  If you fix any
> invalid writes reported by Memcheck, this assertion failure will
> probably go away.  Please try that before reporting this as a bug.
>
> The code this pertains to can be found at
> https://github.com/JamesRead5737/webcrawler
>
> Any help in understanding what this error means would be greatly
> appreciated.
>
> James Read
>