Kristján Valur Jónsson added the comment:

Just came across this when running Hadoop jobs. Hadoop takes your folder of .py scripts, puts each file into its own cache folder, and then recreates the script folder by populating it with individual symlinks. When run like this, the scripts can no longer import each other, because sys.path[0] is set to the "real" location of the file rather than the directory it was invoked from.
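As a workaround sketch (my assumption, not something the interpreter does for you): each script can prepend the apparent script directory, derived from sys.argv[0], which is left exactly as typed on the command line and is not symlink-resolved, so siblings symlinked into the invoked directory stay importable:

    import os
    import sys

    # os.path.abspath() does not resolve symlinks (unlike os.path.realpath()),
    # so this yields the "apparent" directory the script was invoked from
    # (/home/kristjan/pydir in the repro below), not the symlink target's
    # directory that ends up in sys.path[0].
    script_dir = os.path.dirname(os.path.abspath(sys.argv[0]))
    if script_dir not in sys.path:
        sys.path.insert(0, script_dir)

    # After this, sibling scripts symlinked into the same invoked directory
    # can import each other again.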
I just reproduced this with Python 2.7.3 on a new Ubuntu system:

    mkdir pydir
    mkdir pydir/lnk
    echo 'import sys; print ">", sys.path[0]' >> pydir/lnk/test.py
    ln -s lnk/test.py pydir/test.py
    python pydir/test.py
    > /home/kristjan/pydir/lnk

You would have expected "/home/kristjan/pydir", since that is the apparent directory of the file. When "pydir" contains many .py files, each residing in its own unique real target directory, they cannot import each other.

----------
nosy: +kristjan.jonsson
status: closed -> open
versions: +Python 2.6

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue1387483>
_______________________________________