2013/7/25 Peter FELECAN <[email protected]>

> As an aside, this brings up the question: why do we need to compile at
> installation time and not deliver the pre-compiled bits; is the p-code
> not portable on different architectures?
The Debian Python policy doesn't provide the reasons why they do it that way:

http://www.debian.org/doc/packaging-manuals/python-policy/ch-module_packages.html#s-byte_compilation

According to my own web searching, the *.pyc files are Python-version-dependent, per this entry on Stack Overflow:

http://stackoverflow.com/questions/2263356/are-python-2-5-pyc-files-compatible-with-python-2-6-pyc-files

They don't provide sources, so I couldn't quickly verify it. I did see an example where the Python version has been shown to be encoded inside a *.pyc file.

I'm not sure how Python handles version mismatches. I expect that it reads the file, deserializes it, checks the version and, if a mismatch is detected, discards the *.pyc file and compiles the *.py file. This would mean that a *.pyc file is always useless for one of the two installed Python versions.

I looked at the Debian pyshared solution by poking an installed system. The quick summary is:

- /usr/share/pyshared contains the *.py files and is not a member of sys.path
- for each Python version, a /usr/lib/pythonX.Y/dist-modules directory exists and is a member of sys.path
- when a module is installed, symlinks are made from /usr/lib/pythonX.Y/dist-modules to the respective entries in /usr/share/pyshared; they are exclusively file symlinks, never directory symlinks
- the *.py files are compiled once per Python version, and the *.pyc files live in the /usr/lib/pythonX.Y/dist-modules directory tree

Sounds very reasonable to me. Somebody has to implement it for us, though.

Maciej

_______________________________________________
maintainers mailing list
[email protected]
https://lists.opencsw.org/mailman/listinfo/maintainers
.:: This mailing list's archive is public. ::.
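[Editor's note, not part of the original mail: the version tag mentioned above can be checked directly. On Python 3, each interpreter exposes its *.pyc "magic number" as `importlib.util.MAGIC_NUMBER`, and the first four bytes of any *.pyc it produces are that same value; a mismatch is what makes the interpreter discard the file and recompile. A minimal sketch, using only standard-library modules:]

```python
# Sketch: show that a compiled *.pyc embeds a Python-version-specific
# magic number, which is why *.pyc files are version-dependent.
import importlib.util
import os
import py_compile
import tempfile

# Write a trivial module and byte-compile it.
tmpdir = tempfile.mkdtemp()
src = os.path.join(tmpdir, "hello.py")
with open(src, "w") as f:
    f.write("GREETING = 'hello'\n")
pyc = py_compile.compile(src)  # returns the path of the generated *.pyc

# The first four bytes of the *.pyc header are the magic number of the
# interpreter that compiled it.
with open(pyc, "rb") as f:
    header = f.read(4)

print(header == importlib.util.MAGIC_NUMBER)
```

A different interpreter version reading this file would see a foreign magic number in those four bytes and fall back to compiling the *.py source, matching the discard-and-recompile behaviour described above.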
