xfb52 at dial.pipex.com
Fri Jun 30 12:00:47 UTC 2006
Olivier Nicole wrote:
>2) as there are many connections coming from search engine spiders
> (90% of all the established connections), I'd like to limit the
> resources that spiders are using. One way would be through IPFW,
> but are there better ways? Is there a way to limit/prioritize in
> Apache (not that I know any).
Google "robots.txt", which ought to limit what the spiders look at
(though it consequently reduces what they index as well).
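As a sketch, a minimal robots.txt served from the site's document root
might look like the following. The paths here are purely illustrative,
and Crawl-delay is a non-standard extension: some crawlers honour it,
others (Google among them) ignore it.

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /search
Crawl-delay: 10
```

Well-behaved spiders fetch /robots.txt before crawling and skip the
listed paths; it does nothing against crawlers that ignore the
convention, which is where IPFW would still help.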
Overall, though, your problem sounds more like a piece of software
bloating as it runs; the longer it runs the more memory it consumes.
Does the machine end up swapping? Try tracking memory usage.
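One low-effort way to track it is to snapshot per-process memory use
periodically. A sketch, assuming the Apache processes are named "httpd"
(the usual name on FreeBSD; adjust the pattern if yours differs):

```shell
#!/bin/sh
# Snapshot per-process memory for Apache workers: RSS (resident set)
# and VSZ (virtual size), both in KiB.
# NR == 1 keeps the ps header line; /[h]ttpd/ matches httpd processes
# without the bracket trick matching this script's own pipeline.
ps axo pid,rss,vsz,command | awk 'NR == 1 || /[h]ttpd/'
```

Run it from cron (or in a loop with sleep) and compare snapshots over
time; a steadily growing RSS in long-lived children points at the
bloat described above. On FreeBSD, swapinfo(8) will also show whether
the box is actually dipping into swap.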
More information about the freebsd-questions mailing list