I am a bit new to bash, and I need to run a short command several hundred times in parallel while printing the output sequentially. The command prints fairly short output to stdout that I do not want to lose, or to have garbled/mixed with the output of another instance. Is there a way in Linux to run several commands in parallel (e.g. no more than N at a time) so that all command outputs are printed sequentially (in any order, as long as they don't overlap)?
Current bash script (full code here):
declare -a UPDATE_ERRORS=( )

function pull {
    git pull    # Assumes current dir is set
    if [[ $? -ne 0 ]]; then
        UPDATE_ERRORS+=("error message")
    fi
}

for f in extensions/*; do
    if [[ -d $f ]]; then
        ########## This code should run in parallel, but output of each thread
        ########## should be cached and printed sequentially one after another
        ########## pull function also updates a global var that will be used later
        pushd "$f" > /dev/null
        pull
        popd > /dev/null
    fi
done

if [[ ${#UPDATE_ERRORS[@]} -ne 0 ]]; then
    :   # print errors again
fi
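A plain-bash sketch of the behavior asked for, with no external tools. Here `task`, the argument list, and `N` are placeholders standing in for the real command and workload: each job runs in the background with its output redirected to a private temp file, concurrency is throttled to batches of N, and the files are printed one after another so outputs never interleave.

```shell
#!/usr/bin/env bash
# Sketch: up to N background jobs at once, each caching its output in
# its own temp file; files are then printed sequentially.
N=4
tmpdir=$(mktemp -d)

task() {
    # Placeholder for the real command (e.g. git pull in a directory).
    echo "result for $1"
}

i=0
for arg in a b c d e f; do
    task "$arg" > "$tmpdir/job$i" &
    ((++i % N == 0)) && wait    # crude throttle: wait after each batch of N
done
wait                            # wait for the final partial batch

result=$(cat "$tmpdir"/job*)    # job order, never interleaved
printf '%s\n' "$result"
rm -rf "$tmpdir"
```

The batch-style `wait` is simple but coarse: a slow job stalls its whole batch. Replacing it with `wait -n` (bash 4.3+) would start a new job as soon as any one finishes.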
Two points: 1) Add -k to your invocation of GNU Parallel to keep the outputs in order. 2) Define a function, and be sure to export it, then pass the function to GNU Parallel to execute; inside the function, append the error message to your array. gnu.org/software/parallel/…
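A sketch of the commenter's suggestion, assuming GNU Parallel is installed. The `pull` function here is a stub (an echo in place of the real `git pull`) so the export pattern is visible, and the directory names are illustrative:

```shell
#!/usr/bin/env bash
# Stub standing in for the real pull; real code would: cd "$1" && git pull
pull() {
    echo "pulling in $1"
}
export -f pull    # child shells spawned by parallel must see the function

# With GNU Parallel installed, one would run something like:
#   printf '%s\n' extensions/*/ | parallel -j 8 -k pull
# where -j caps concurrency and -k keeps outputs in input order, so each
# job's output prints whole and in sequence.
```

One caveat worth noting: GNU Parallel runs each job in a separate process, so appending to `UPDATE_ERRORS` inside `pull` cannot modify the array in the parent shell; errors would instead have to be collected from the jobs' output or exit codes (e.g. `parallel --halt never; echo $?` reports how many jobs failed).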