shell/awk scripting help - parsing directories to gain user/file information for commands

Dave [Hawk-Systems] dave at hawk-systems.com
Tue May 27 05:24:28 PDT 2003


I have a number of uses for this; I am trying to get away from maintaining
lengthy static files that contain all the statically entered commands a cron
job has to run...

For example:

- need to run webalizer on a number of user websites and directories
- can run webalizer without a customized conf file, but need to provide the
hostname, output dir, and other such variables on the command line (see the
example invocation after this list)
- can list all the log files, which give the appropriate information
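
For reference, a single run would look something like this (-n sets the
hostname used in the report, -o the output directory; the stats path is just
how I have things laid out):

# webalizer -n user1domain1.com -o /www/user1/stats /www/user1/logs/user1domain1.com.access_log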

# ls /www/*/logs/*.access_log

generates...

/www/user1/logs/user1domain1.com.access_log
/www/user1/logs/user1domain2.com.access_log
/www/user2/logs/user2domain1.com.access_log
/www/user2/logs/user2domain2.com.access_log
/www/user3/logs/user3domain1.com.access_log
/www/user3/logs/user3domain2.com.access_log
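
Each path carries both pieces webalizer needs; from what I can tell, plain
/bin/sh parameter expansion can pull them apart, e.g.:

i=/www/user1/logs/user1domain1.com.access_log
tmp=${i#/www/}                    # user1/logs/user1domain1.com.access_log
user=${tmp%%/*}                   # user1
file=${i##*/}                     # user1domain1.com.access_log
domain_name=${file%.access_log}   # user1domain1.com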

What I am trying to script is something that does:

<pseudo code>
for i in /www/*/logs/*.access_log; do
	extract $user and $domain_name from $i
	webalizer -n $domain_name -o /www/$user/stats/
done
</pseudo code>

...as this would eliminate the human error of maintaining a file containing
the appropriate lines to handle this from cron every night.
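
The closest I have gotten in plain /bin/sh, assuming the layout above, is
something like:

#!/bin/sh
# rebuild webalizer stats for every access log found under /www
for i in /www/*/logs/*.access_log; do
	tmp=${i#/www/}                    # user1/logs/user1domain1.com.access_log
	user=${tmp%%/*}                   # user1
	file=${i##*/}                     # user1domain1.com.access_log
	domain_name=${file%.access_log}   # user1domain1.com
	webalizer -n "$domain_name" -o "/www/$user/stats/" "$i"
done

Run from cron, that should pick up new sites automatically as soon as their
logs appear.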

This is one example; there are a slew of other similar applications I would
use this for. I have played with awk as well, but just can't wrap my head
around this (not a shell scripting person by trade).
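
The furthest I got with awk was having it generate the command lines and feed
them to sh (same layout assumed; dropping the trailing "| sh" previews what
would be run), but I am not sure this is the right approach:

ls /www/*/logs/*.access_log |
awk -F/ '{ path = $0
           sub(/\.access_log$/, "", $NF)       # strip suffix -> domain name
           printf "webalizer -n %s -o /www/%s/stats/ %s\n", $NF, $3, path
         }' | sh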

Any guidance or insight would be appreciated.

Dave



