Thank you, Kyleman123, for stating that so eloquently. I also find it somewhat hypocritical when someone calls me arrogant and then corrects me in a tone that's just as arrogant, if not outright insulting. That doesn't help the situation at all; it only adds fuel to the fire.

What I think Kyleman123 means is how the transmitted bytes get handled by the application. The larger the buffer, the more data you take in per read, and the more junk or nonsense you end up holding and processing at once. Such nonsense could include SQL injection payloads, machine code, etc. To be clear, the buffer size is irrelevant to whether that stuff gets inserted into the stream in the first place; most, if not all, insertions happen at the client, through transmission from the client, or through an intermediary such as a MITM. But if you cap the buffer at 1024 or 2048 bytes, you only ever read that much at a time, and you can discard the rest of the nonsense. If what you read contains unrecognizable data? No issue; discard it and move on. Allocate a buffer of 20000-40000 elements instead, and that's 20000-40000 more bytes to process per read. That takes time and memory, and considering that the buffer is constantly being allocated and released... well, it's probably possible to crash a system that way. I really don't know; I'm not dumb enough to try.
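To illustrate the capped-read idea, here's a minimal Python sketch (the names `MAX_CHUNK` and `read_bounded`, and the socketpair demo, are my own illustration, not anyone's actual server code). The point is just that `recv(limit)` hands you at most `limit` bytes no matter how much the other side sent:

```python
import socket

MAX_CHUNK = 2048  # cap on how many bytes we accept per read

def read_bounded(conn: socket.socket, limit: int = MAX_CHUNK) -> bytes:
    # recv() returns at most `limit` bytes; anything beyond that stays
    # in the kernel buffer, to be read (or discarded) later on our terms.
    return conn.recv(limit)

# Demo with a local socket pair standing in for a real client connection.
a, b = socket.socketpair()
a.sendall(b"A" * 5000)          # the "client" sends more than we want
chunk = read_bounded(b)          # we only ever hold up to 2048 bytes at once
print(len(chunk) <= MAX_CHUNK)   # True
a.close()
b.close()
```

This doesn't stop a client from sending junk, of course; it just bounds how much of it your application holds and processes at any one moment.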
-- Audiogames-reflector mailing list Audiogamesfirstname.lastname@example.org https://sabahattin-gucukoglu.com/cgi-bin/mailman/listinfo/audiogames-reflector