Is it really so bad? It's a bit more verbose, but also more readable, and it can be plenty short and sweet for me. I probably wouldn't choose Python here myself, since this is the kind of thing shell scripting is tailor-made for, but I'd at least be more comfortable maintaining or extending this version over that one:
    from subprocess import Popen, PIPE

    CMD = ("printf", "x:hello:67:ugly!\nyy$:bye:5:ugly.\n")
    OUT = "something.report"
    ERR = "err.log"

    def beautify(line_bytes):
        # Decode the subprocess output and do the substitution.
        return line_bytes.decode().replace("ugly", "beautiful")

    def pick(line, *index):
        # Keep only the requested 1-based colon-delimited fields.
        parts = line.split(":")
        return " ".join(parts[i - 1] for i in index)

    with open(OUT, "w") as out, open(ERR, "w") as err:
        proc = Popen(CMD, stdout=PIPE, stderr=err)
        for line_bytes in proc.stdout:
            out.write(pick(beautify(line_bytes), 2, 4))
I would agree, though, that if this is a one-off need, where you have a specific dataset to chop up and aren't concerned with recreating or tweaking the process, bash can likely get it done faster.
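For comparison, here's a rough sketch of the shell version I have in mind. Untested, and the sed + awk split is just one of several ways to slice it; it mirrors the same printf input, field picks, and output files as the Python above:

    printf 'x:hello:67:ugly!\nyy$:bye:5:ugly.\n' 2>err.log \
        | sed 's/ugly/beautiful/g' \
        | awk -F: '{print $2, $4}' > something.report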
Edit: this is proving very difficult to format on mobile, sorry if it's not perfect.