Python is often used as a shell scripting language because it is more robust than traditional Unix shells for managing complex programs, but it is significantly more verbose than Bash and friends for common shell scripting tasks such as writing pipelines and redirections.
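For comparison, here is roughly what a two-command pipeline looks like with the subprocess module today (a minimal sketch; the commands, flags and variable names are just examples):

    import os
    import subprocess

    env = dict(os.environ, LANG='C.UTF-8')
    ls = subprocess.Popen(['ls', '--hide=*.txt'], stdout=subprocess.PIPE,
                          env=env)
    sort = subprocess.Popen(['sort', '--reverse'], stdin=ls.stdout,
                            stdout=subprocess.PIPE, env=env)
    ls.stdout.close()  # let ls receive SIGPIPE if sort exits early
    output, _ = sort.communicate()
    statuses = (ls.wait(), sort.returncode)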
I propose adding an idiomatic shell scripting API that uses Python operator overloading to allow invoking external commands in a Bash-like way. A fully working implementation (which I've been using in my personal scripts) can be found at https://github.com/tarruda/python-ush/. Here I will describe the basics of the API; more details are in the GitHub README.rst (which also serves as a doctest for the project).

Here's how one can create commands:

>>> from shell import Shell
>>> sh = Shell()
>>> cat = sh('cat')
>>> ls = sh('ls')
>>> sort = sh('sort')

# or something like this:
>>> cat, ls, sort = sh('cat', 'ls', 'sort')

The Shell class is the entry point of the API and is used as a factory for Command objects. For convenience, we can add a builtin Shell instance which is also a module from which commands can be quickly imported (an idea taken from sh.py: http://amoffat.github.io/sh/):

>>> from shell.sh import cat, ls, sort

We can construct more complex commands by calling a command with arguments:

>>> ls('-l', '-a', env={'LANG': 'C.UTF-8'})

And we can invoke the command by calling it without arguments:

>>> ls()
# or
>>> ls('-l', '-a', env={'LANG': 'C.UTF-8'})()

Invoking a command returns a tuple with the status code:

>>> ls()
(0,)

There are more ways to invoke a command than calling it without arguments. One example is iterating through it, which automatically takes care of piping the output back to Python:

>>> files = []
>>> for line in ls:
...     files.append(line)

Pipelines can be easily created with the `|` operator:

>>> ls = ls('--hide=*.txt', env={'LANG': 'C.UTF-8'})
>>> sort = sort('--reverse')
>>> ls | sort
ls --hide=*.txt (env={'LANG': 'C.UTF-8'}) | sort --reverse

Everything that can be done with Command objects can also be done with Pipeline objects:

>>> (ls | sort)()
(0, 0)

# single commands return a tuple to make the return value compatible
# with pipelines
>>> list(ls | sort)
[u'tests', u'setup.cfg', u'pytest.ini', u'bin', u'README.rst']

We also use the pipeline syntax for redirecting input/output:

# piping to a filename string redirects output to the file
>>> (ls | sort | '.stdout')()
(0, 0)
>>> str(cat('.stdout'))
'tests\nsetup.cfg\npytest.ini\nbin\nREADME.rst\n'

# piping from a filename string redirects input from the file
>>> str('setup.cfg' | cat)
'[metadata]\ndescription-file = README.rst\n\n[bdist_wheel]\nuniversal=1\n'

This is the basic idea. The current implementation only supports the classic subprocess API, but I can probably make it compatible with asyncio (to create async shells which spawn commands compatible with `await`).

Thoughts? Is it worth writing a PEP for this?

Best regards,
Thiago.
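P.S. For anyone wondering how the `|` overloading could be implemented, here is a minimal sketch (this is not the actual python-ush code; the class names and details are simplified for illustration):

    import subprocess

    class Command(object):
        def __init__(self, *argv):
            self.argv = list(argv)

        def __or__(self, other):
            # cmd | cmd builds a pipeline; cmd | 'file' redirects stdout
            return Pipeline([self]) | other

        def __ror__(self, other):
            # 'file' | cmd redirects stdin from the file
            if isinstance(other, str):
                return Pipeline([self], stdin=other)
            return NotImplemented

    class Pipeline(object):
        def __init__(self, commands, stdin=None, stdout=None):
            self.commands = commands
            self.stdin = stdin
            self.stdout = stdout

        def __or__(self, other):
            if isinstance(other, str):
                return Pipeline(self.commands, self.stdin, stdout=other)
            if isinstance(other, Command):
                return Pipeline(self.commands + [other], self.stdin)
            return NotImplemented

        def __call__(self):
            procs = []
            prev_out = open(self.stdin, 'rb') if self.stdin else None
            last = len(self.commands) - 1
            for i, cmd in enumerate(self.commands):
                if i == last:
                    # last command: redirect to a file or inherit our stdout
                    out = open(self.stdout, 'wb') if self.stdout else None
                else:
                    out = subprocess.PIPE  # feed the next command
                proc = subprocess.Popen(cmd.argv, stdin=prev_out, stdout=out)
                if prev_out is not None:
                    prev_out.close()  # parent doesn't need this end of the pipe
                if i == last and self.stdout:
                    out.close()
                prev_out = proc.stdout
                procs.append(proc)
            # one status code per command, like the (0, 0) tuples above
            return tuple(p.wait() for p in procs)

With that, `(Command('ls') | Command('sort', '--reverse') | '.stdout')()` spawns both processes connected by a pipe and returns their status codes.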
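P.P.S. The asyncio direction mentioned above could build on asyncio's own subprocess support; a rough sketch of what an awaitable invocation might wrap (again, illustrative names only):

    import asyncio

    async def invoke(argv):
        # roughly what an async Command.__call__ could do
        proc = await asyncio.create_subprocess_exec(*argv)
        return (await proc.wait(),)

    asyncio.get_event_loop().run_until_complete(invoke(['ls', '-l']))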