Ideally it should be a list of commands that I want to execute, all of them via a single subprocess call. I was able to do something similar by storing all the commands in a shell script and calling that script with subprocess, but I want a pure Python solution. I will be executing the commands with shell=True, and yes, I understand the risks.
1 Answer
Use a semicolon to chain them if they're independent.
For example, (Python 3)
>>> import subprocess
>>> result = subprocess.run('echo Hello ; echo World', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
>>> result
CompletedProcess(args='echo Hello ; echo World', returncode=0, stdout=b'Hello\nWorld\n')
But technically that's not a pure Python solution, because of shell=True: the argument processing is actually done by the shell. (You may think of it as executing /bin/sh -c "$your_arguments".)
If you want a somewhat purer solution, you'll have to use shell=False and loop over your commands yourself. As far as I know, there is no way to start multiple subprocesses directly with a single subprocess module call.
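A minimal sketch of that loop, with shell=False and each command given as an argument list (the commands here are just placeholders):

```python
import subprocess

# Each command is an argument list, so no shell is involved in parsing it.
commands = [
    ["echo", "Hello"],
    ["echo", "World"],
]

for cmd in commands:
    result = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    if result.returncode != 0:
        # Stop at the first failing command -- finer control than a
        # single semicolon-joined shell string gives you.
        break
```

Because Python drives the loop, you can inspect each CompletedProcess, log output per command, or abort early.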
2 Comments
Skaperen
Something must loop through the list of commands. A shell can do this; Python can do this.
Pavel Gurkov
@Skaperen Exactly. It's just that if Python does the looping and the exec() syscall, I'd consider it more pure, which is what the OP wants.
Comments on the question
Use &, ;, or && to chain the commands together? Why a single subprocess call, though - why not a loop, which would give you more control over, for example, terminating early?
Use ; to put them in sequence: subprocess.call("do_A ; do_B ; do_C", shell=True). Programmatically: call(' ; '.join(commands), shell=True).
Or feed them to one shell process: p = Popen("bash", stdin=PIPE); p.communicate(b"\n".join(commands)).
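Another option raised in the comments is to feed the whole command list to a single shell process over its stdin. A minimal sketch, assuming bash is available and using subprocess.Popen (subprocess.call returns only an exit code, not a process object); note that communicate() expects bytes in Python 3:

```python
import subprocess

commands = ["echo Hello", "echo World"]

# Start one shell and pipe the commands to it, one per line.
p = subprocess.Popen("bash", stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out, _ = p.communicate(b"\n".join(c.encode() for c in commands))
```

This is still one subprocess call from Python's point of view, but as with shell=True, the shell does the actual sequencing.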