Anyone else think it's about time to beat a WEB server to death?

Terry Lambert terry at
Fri Nov 10 11:07:25 PST 1995

> I frequently get asked the question: "How many users can I run off a
> FreeBSD WEB server?" and I'm naturally tempted to ask in response
> "How long is a piece of string?"
> However, I check myself with the knowledge that it's not an entirely
> unreasonable thing to want to know, and I merely wish that I had more
> data on this subject to provide in response.  It's obviously
> impossible to come up with one number that fits all situations, but
> various guesstimates can be derived from existing data so that given a
> link speed of x, a PC of macho-factor y and the "average" user doing
> z, you can come up with a performance projection of n users.
> The only problem is that I don't *have* any existing data worth
> mentioning.

If the httpd is started from inetd, then the limit is no more than 256
requests in any 60 second period, unless you override this at inetd
startup time by raising the number of requests allowed per 60 seconds
with the -R option when you start inetd in the rc file.
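The override might look like this in the rc file (the rc location and
the value 1024 are illustrative only; check inetd(8) on your system for
the exact semantics of -R):

```shell
# /etc/rc (illustrative): start inetd allowing up to 1024 invocations
# of any one service per 60-second interval instead of the default 256.
inetd -R 1024
```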

If you don't override this, your test will fail in short order with
"server failing (looping), service terminated".

You can actually kill most BSD inetd based FTP servers this way now,
using a -d 0 on an ncftp retry when the server is already loaded.  I
saw one go down the other day when an associate stupidly kicked the
retry delay to 0.

By default, it takes 10 minutes for the thing to reset and start serving
connection requests again (#define RETRYTIME (60*10) in inetd.c).
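The counting-and-shutoff behavior described above can be sketched as
follows.  This is a simulation modeled on the historical inetd.c logic,
not inetd's actual code; the class and method names are illustrative,
and only the constants (256 per 60 seconds, 10-minute retry) come from
the post:

```python
CNT_INTVL = 60        # length of the counting window, in seconds
TOOMANY   = 256       # default invocation cap per window (-R overrides it)
RETRYTIME = 60 * 10   # seconds before a disabled service is retried

class Service:
    """Illustrative model of inetd's per-service loop detection."""

    def __init__(self):
        self.count = 0       # invocations seen in the current window
        self.start = 0.0     # when the current window began
        self.offtime = None  # when the service was shut off, if ever

    def request(self, now):
        """Record one incoming request; return True if the service
        should be shut off as "failing (looping)"."""
        self.count += 1
        if self.count == 1:
            self.start = now
        elif self.count >= TOOMANY:
            if now - self.start <= CNT_INTVL:
                self.offtime = now   # service terminated
                return True
            # window expired without tripping: start counting over
            self.count = 1
            self.start = now
        return False

    def may_retry(self, now):
        """True once RETRYTIME has elapsed since the shutoff."""
        return self.offtime is not None and now - self.offtime >= RETRYTIME
```

The point of the sketch is that the cap is on the *rate*, not the total:
256 requests inside one 60-second window disable the service for ten
minutes, while the same requests spread over a longer period never trip
the limiter.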

I first saw this problem with a machine tftp serving fonts, color
tables, and configuration files to a very large number of X terminals
back in the 1.x days, and posted the equivalent of the '-R' option
addition at that time.

					Terry Lambert
					terry at
Any opinions in this posting are my own and not those of my present
or previous employers.

More information about the freebsd-announce mailing list