curl question - not exactly on-topic
Dan Nelson
dnelson at allantgroup.com
Wed Feb 10 05:05:20 UTC 2010
In the last episode (Feb 09), Kurt Buff said:
> Actually, it's not merely a curl question, it's a "curl and squid"
> question.
>
> I'm trying to determine the cause of a major slowdown in web browsing on
> our network, so I've put curl on the squid box, and am using the following
> incantations to see if I can determine the cause of the slowdown:
>
> curl -s -w "%{time_total}\n" "%{time_namelookup}\n" -o /dev/null http://www.example.com
>
> and
>
> curl -s -w "%{time_total}\n" "%{time_namelookup}\n" -o /dev/null -x 192.168.1.72 http://www.example.com
>
> The problem arises with the second version, which uses the proxy. The
> first incantation just returns the times, which is exactly what I want.
>
> However, when I use the -x parameter, to use the proxy, I get html
> returned as well as the times, which is a pain to separate out.
Your problem is what's after -w. You want one argument:
"%{time_total}\n%{time_namelookup}\n", not two. With your original command,
"%{time_namelookup}\n" is treated as another URL to fetch. With no proxy
option, curl immediately recognizes that it's not a URL and skips to the
next argument on the command line - http://www.example.com. With a proxy,
curl has to send each URL to the proxy for processing. The proxy probably
returns a "400 Bad Request" error for the first (invalid) URL, whose body
is written to /dev/null. The next URL doesn't have its own -o, so curl
falls back to printing it to stdout.
Adding -v to the curl command line will help you diagnose problems like this.
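In other words, merging the two format strings into a single -w argument should make the proxied request print only the timings, with the page body still discarded. A sketch using the proxy address and example host from your message:

```shell
# One -w argument holding both format variables; the HTML body goes to
# /dev/null, so only the two timing lines reach stdout.
curl -s -w "%{time_total}\n%{time_namelookup}\n" -o /dev/null \
    -x 192.168.1.72 http://www.example.com
```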
--
Dan Nelson
dnelson at allantgroup.com