(Sorry, my mail server rejected the reply, so I had to pull this from the archive and respond with broken references)
On 24.06.23, Antonio Borneo wrote:

> > What is the benefit of including zlib and code to manage decompression
> > versus just loading the data from a file?
>
> Valid question!
> But I either don't see a good reason to distribute the FPGA bitstream
> in the original form as a big file,

"If it ain't broken, don't fix it" comes to mind. If someone does
development on such a device (e.g. modifying the FPGA, the loader, or
whatever is embedded), it adds another build step. If that step
(converting the bitstream to an embedded array) is done automatically by
the build system, it may be fine, but it nevertheless adds complexity
and duplicates functionality that is already present (in the file system
and the C library). If it is a manual build step, it gets really
cumbersome.

> Some of the flashing binaries are not small, so compression can be
> welcome there too (maybe we should also investigate why they are so
> big and if they can be optimized).

That is a valid argument; however, OpenOCD is quite small (it runs fine
on a Raspberry Pi 1), so I do not see much pressure to shave off a few
kilobytes here (unless you are thinking about running on even smaller
targets). I am not totally against adding compression, but currently I
see added complexity and a really small benefit.

> Embedding the binaries is an easy way to deploy OpenOCD without a
> complete packaging; I can just copy openocd statically linked (or with
> its lib dependencies) plus the few scripts I need.

That sounds valid; however, you can do the same by just copying along
the files needed for your adapter and specifying the path to them. Since
in most cases OpenOCD needs the supplied scripts anyway, having one or
two fewer files to bundle seems irrelevant. Note that I am mostly a
Linux user who compiles his own OpenOCD binary, so Windows users might
have different needs.

> Keeping separate the loadable scripts is instead welcome: they are
> good as examples, are easy to edit/change, the user can load its own
> script. This does not (always) apply to the binaries.

Correct, but we need the scripts anyway, so we are just talking about a
small difference here.

> We offer command-line flags and TCL commands to set the search path
> and to find the scripts in the path. Today the only external binary is
> searched at a fixed path. If we allow external binaries we should add
> some features to find them.

Agreed. There should be a standard mechanism for all modules that need
to access external files, and a common way to specify their location.

> I'm thinking I would try to add zlib and use it for one of the big
> flashers, to see how much mess it adds. It could end up that it
> doesn't make sense just for the extra code to be maintained...

My suggestion would be to add zlib but leave the FPGA bitstreams etc. as
external files, only compressed.

> And I ask again: any other use case that could benefit from compression?

Reading/writing compressed binary files from/to flash might be useful.
At work, we usually have 256 MB of NAND flash, but the images mostly
fill only ~30 MB, so compression would help there.

> By the way, someone is going to extend OpenOCD to load external binary
> flashers from ARM CMSIS .FLM files
> https://open-cmsis-pack.github.io/Open-CMSIS-Pack-Spec/main/html/flashAlgorithm.html

Nice!

> These are often proprietary binaries with either custom and/or
> incompatible licenses; they cannot be embedded in OpenOCD, probably
> even cannot be distributed in the same package.
> They must be loaded from external files, but here the reason why is
> clear.

Correct. However, if we need some files to be external anyway, there is
no big benefit in embedding the remaining ones (yes, embedding smaller
flash algos makes sense; I am only arguing against the notion that
*everything* should be embedded).

cu
Michael

--
Some people have no respect of age unless it is bottled.
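P.S. To make the "external but compressed" suggestion concrete, here is a minimal sketch. Python is used only for brevity and the payload is made up; inside OpenOCD this would be zlib's uncompress()/inflate() called from C. The second half shows chunked decompression via a zlib stream object, which is what would matter for the ~30 MB image case where you do not want the whole inflated image in RAM at once.

```python
import zlib

# Stand-in for a real FPGA bitstream; contents are made up.
bitstream = bytes(range(256)) * 64            # fake 16 KiB payload

# Build step: compress once and ship the result as an external file.
compressed = zlib.compress(bitstream, 9)

# Load step: inflate in one go before downloading to the target.
assert zlib.decompress(compressed) == bitstream

# For large images, inflate in chunks so the whole compressed input
# never has to be processed at once (zlib's streaming inflate in C).
d = zlib.decompressobj()
chunks = [d.decompress(compressed[i:i + 4096])
          for i in range(0, len(compressed), 4096)]
restored = b"".join(chunks) + d.flush()
assert restored == bitstream
```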