scripting suggestion: how to make this command shorter

gs_stoller at
Sun Jun 28 08:43:12 UTC 2009

On 6/27/09, Zhang Weiwu <zhangweiwu at> wrote:
> Hello. I wrote this one-line command to fetch a page from a long URI,
> parse it twice (first to get the subject, then to get the content), and
> send the result to me as email.
> $ w3m -dump
> ',w6y4zzjaxxymvjomxy----------------40--commen
> | grep -A 100 对比 | mail -a 'Content-Type: text/plain; charset=UTF-8' -s
> '=?UTF-8?B?'`w3m -dump
> ',w6y4zzjaxxymvjomxy----------------40--commen
> | grep 找到.*件 | base64 -w0`'?=' zhangweiwu at
> The stupid part of this script is that it fetches the page twice and
> parses it twice, which makes the command very long. If I could write the
> command so that the URI appears only once, it would be easier to
> maintain. I plan to put it in cron, and I want to avoid having to modify
> two places when the URI changes (and it does!).
> How do you suggest optimizing the one-liner?
               Whenever I have to look through a long file more than once, I copy the relevant sections into another file (a RAM-backed file, if it is short enough and I have the RAM) and then parse that file as many times as I need to.
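A minimal sketch of that approach applied to the original one-liner: fetch the page once into a temporary file, then run both greps against the local copy. The URL and recipient address below are placeholders (the real ones are truncated or munged in the post), and the grep patterns are the de-garbled versions of the originals.

```shell
#!/bin/sh
# Fetch once, parse twice. Placeholders: url and the recipient
# address stand in for the values elided in the original post.
url='http://example.com/page'        # placeholder URI
page=$(mktemp) || exit 1
trap 'rm -f "$page"' EXIT

w3m -dump "$url" > "$page"

# First parse: build the base64-encoded UTF-8 subject
# (RFC 2047 encoded-word, as in the original command).
subject=$(grep '找到.*件' "$page" | base64 -w0)

# Second parse: extract the body and mail it.
if [ -n "$subject" ]; then
    grep -A 100 '对比' "$page" \
      | mail -a 'Content-Type: text/plain; charset=UTF-8' \
             -s "=?UTF-8?B?${subject}?=" zhangweiwu@example.com
fi
```

Now the URI appears exactly once, so a cron job only needs one line edited when it changes; the trap cleans up the temporary file on exit.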

More information about the freebsd-questions mailing list