Nick Timkovich wrote:
> I actually gave a talk along these lines at the Chicago Python (ChiPy)
> meetup last week: slides
> https://docs.google.com/presentation/d/1v5z4f-FQkS-bQYE-Xv5SvA6cyaTiqlxs2w2C...
Nice presentation. I've adapted the examples in the "how to parent" section
to illustrate the potential gains in expressiveness when using ush.

Example 1, with subprocess:

    >>> import os
    >>> import subprocess
    >>> result = subprocess.run(
    ...     ["git", "rev-parse", "HEAD"],
    ...     cwd="/home/nick/Code/toss",
    ...     env={"A": "B", **os.environ},
    ...     input=b"asdf",
    ...     capture_output=True,
    ... )

and with ush:

    >>> from ush.sh import echo, git
    >>> str(echo(b'asdf') | git('rev-parse', 'HEAD',
    ...                         cwd="/home/nick/Code/toss",
    ...                         env={"A": "B"}))

Example 2 (with a different way of passing env/cwd to children, useful for
spawning multiple commands with the same env/cwd), with the standard library:

    >>> import os
    >>> os.chdir("/home/nick")
    >>> os.execvpe("git",
    ...     args=["git", "rev-parse", "HEAD"],
    ...     env={"A": "B", **os.environ}
    ... )

and with ush:

    >>> from ush.sh import chdir, setenv, git
    >>> with chdir("/home/nick"), setenv({"A": "B"}):
    ...     git("rev-parse", "HEAD")()

Note that the chdir/setenv context managers don't pollute anything in the
host Python process. The temporary changes to the env/cwd are managed by the
Shell instance, which is not only a factory for Command objects but also
stores default option values. That makes it possible to have multiple Shell
instances in separate threads running commands in completely different
environments (though this will be more useful once I add asyncio support).
There are a few features I haven't documented yet (chdir/setenv are
examples).

My goal was to create a full shell scripting DSL right inside Python, while
also:

- Not using shell=True under the hood, which makes the Python script depend
  on the underlying shell.
- Not relying on native code while still creating full OS-level pipelines.
  Some alternatives like http://amoffat.github.io/sh/ implement piping by
  manually transferring data between processes; ush creates OS pipes and
  lets the OS handle the data transfer, which is how shells like bash do it.
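To make the OS-pipe point concrete, here is a standard-library sketch of the
same technique (this is my own illustration, not code from ush, and the two
child commands are made-up stand-ins run via `sys.executable -c` for
portability): the parent wires the first child's stdout directly into the
second child's stdin, so the kernel moves the bytes through a real pipe and
the parent never copies any data itself.

```python
import subprocess
import sys

# Producer process: writes one line to its stdout (a stand-in for `echo asdf`).
producer = subprocess.Popen(
    [sys.executable, '-c', "print('asdf')"],
    stdout=subprocess.PIPE)

# Consumer process: uppercases whatever arrives on stdin. Its stdin *is* the
# producer's stdout pipe, so the OS handles the data transfer directly.
consumer = subprocess.Popen(
    [sys.executable, '-c',
     "import sys; sys.stdout.write(sys.stdin.read().upper())"],
    stdin=producer.stdout,
    stdout=subprocess.PIPE)

# Close the parent's copy of the pipe's read end so the consumer sees EOF
# as soon as the producer exits.
producer.stdout.close()

out, _ = consumer.communicate()
producer.wait()
print(out)
```

This is essentially how bash builds `a | b`: fork both processes with the
pipe ends as their stdio, then get out of the way.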
- Being cross-platform (Windows/Unix supported).
- Supporting Python 2/3 (though Python 2 support is less of a priority now).

> Part of the argument was about using pure standard library so a
> self-contained script/repo could run anywhere Python is in order to (e.g.)
> bootstrap other testing environments and/or work within restricted ones,
> just like your average shell script. A gigantic step up from there is using
> anything not in the stdlib because you may need to copy a bunch of files
> (venv creation), and download executable things from the Internet
> (horrific to some).

One of the reasons I implemented ush in one file without dependencies was to
make it simple to bootstrap into any project. It is as easy as adding
something like this to the top of your script:

    import pathlib

    script_dir = pathlib.Path(__file__).parent
    ush_path = script_dir / 'ush.py'
    if not ush_path.exists():
        print('downloading ush.py')
        from urllib.request import urlretrieve
        urlretrieve(
            'https://raw.githubusercontent.com/tarruda/python-ush/master/ush.py',
            str(ush_path))
    (script_dir / '__init__.py').touch(exist_ok=True)

    from ush.sh import git, grep  # import whatever commands you need

Though it would be much better to have a library like ush built into the
standard library ;)

_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-le...@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/27CYRDRWY6Z6ZUKFF2SNRSOOUIBLUSYF/
Code of Conduct: http://python.org/psf/codeofconduct/