curl question - not exactly on-topic

Kurt Buff kurt.buff at
Wed Feb 10 17:35:10 UTC 2010

On Tue, Feb 9, 2010 at 21:05, Dan Nelson <dnelson at> wrote:
> In the last episode (Feb 09), Kurt Buff said:
>> Actually, it's not merely a curl question, it's a "curl and squid"
>> question.
>> I'm trying to determine the cause of a major slowdown in web browsing on
>> our network, so I've put curl on the squid box, and am using the following
>> incantations to see if I can determine the cause of the slowdown:
>>   curl -s -w "%{time_total}\n" "%{time_namelookup}\n" -o /dev/null
>> and
>>   curl -s -w "%{time_total}\n" "%{time_namelookup}\n" -o /dev/null -x
>> The problem arises with the second version, which uses the proxy. The
>> first incantation just returns the times, which is exactly what I want.
>> However, when I use the -x parameter, to use the proxy, I get html
>> returned as well as the times, which is a pain to separate out.
> Your problem is what's after -w.  You want one argument:
> "%{time_total}\n%{time_namelookup}\n", not two.  With your original command,
> "%{time_namelookup}\n" is treated as another URL to fetch.  With no proxy
> option, curl realizes it's not an url immediately and skips to the next
> argument on the commandline.  With a proxy, curl
> has to send each url to the proxy for processing.  The proxy probably
> returns a "400 Bad Request" error on the first (invalid) url, which is
> redirected to /dev/null.  The next url doesn't have another -o so it falls
> back to printing to stdout.
> Adding -v to the curl commandline will help you diagnose problems like this.
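The fix Dan describes is to fold both write-out variables into a single -w format string. A sketch of the corrected invocations (the URL and the squid host:port are placeholders, not taken from the thread):

```shell
# Both format variables in ONE -w argument, so nothing is
# mistaken for a second URL.
curl -s -w "%{time_total}\n%{time_namelookup}\n" -o /dev/null \
     http://example.com/

# Same request, but routed through the squid proxy
# (squidhost:3128 is a placeholder):
curl -s -w "%{time_total}\n%{time_namelookup}\n" -o /dev/null \
     -x http://squidhost:3128/ http://example.com/
```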

Thanks for that, though it's unfortunate.

I would really like a better understanding of the times, to help
further diagnose the problem, and 'man curl' says that multiple
invocations of '-w' will result in the last one winning, which I've
confirmed in testing.

Do you have any suggestions for a way to get the timing of these
operations without resorting to tcpdump?
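One option that stays within curl: its -w write-out supports several documented timing variables (time_namelookup, time_connect, time_starttransfer, time_total), so a single format string can break a request into phases and may localize the slowdown without tcpdump. A sketch, with the URL and proxy as placeholders:

```shell
# Each value is cumulative, measured from the start of the transfer:
#   lookup  - DNS resolution done
#   connect - TCP connection established
#   ttfb    - first byte of the response received
#   total   - transfer complete
FMT='lookup:%{time_namelookup} connect:%{time_connect} ttfb:%{time_starttransfer} total:%{time_total}\n'

# Direct fetch:
curl -s -w "$FMT" -o /dev/null http://example.com/

# Through the squid proxy (placeholder host:port):
curl -s -w "$FMT" -o /dev/null -x http://squidhost:3128/ http://example.com/
```

Comparing the two lines side by side shows which phase the proxy inflates: a large jump in connect points at the proxy itself, while a jump in ttfb suggests the proxy's upstream fetch or cache lookup.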


More information about the freebsd-questions mailing list