Brandon Erhart berhart at ErhartGroup.COM
Sun Apr 4 15:07:31 PDT 2004

Yes, it pays attention to /robots.txt.

But I am writing my own -- I don't want to use rsync, wget, or anything 
like that. This is part of an archiving project, and it uses so many FDs 
because it has tons of connections open to DIFFERENT servers at different 
times .. not just one site.

Any advice on the timeouts? I don't really care about the RFC, honestly 
:-P. Like I said, I'm going for sheer speed.


At 04:02 PM 4/4/2004, you wrote:
>Brandon Erhart wrote:
>>I am writing a network application that mirrors a given website (such as 
>>a souped-up "wget"). I use a lot of FDs, and was getting connect() errors 
>>when I would run out of local_ip:local_port tuples. I lowered the MSL so 
>>that TIME_WAIT would time out very quickly (yes, I know, this is "bad", 
>>but I'm going for sheer speed here), and it alleviated the problem a bit. 
>>However, I have run into a new problem: a good number of connections get 
>>stuck in FIN_WAIT_1, FIN_WAIT_2, or LAST_ACK and stick around for a long 
>>while. I have been unable to find much information on a timeout for these 
>>states.
>Well, these are defined in RFC-793 (aka STD-7).
>If you want to mirror the content of a given website rapidly, a good 
>approach would be to use a tool like rsync and duplicate the changed 
>portions at the filesystem level rather than mirroring via HTTP requests.
>It would also be the case that using HTTP/1.1 pipelining ought to greatly 
>reduce the number of new connections you need to open, which ought to 
>speed up your program significantly while reducing load on the servers 
>you're mirroring.
>Since I've given some helpful advice (or so I think :-), perhaps you'll be 
>willing to listen to a word of caution: if your client is pushing so hard 
>that it exhausts the local machine's resources, you're very probably doing 
>something that reasonable website administrators would consider to be 
>abusive and you may cause denial-of-service conditions for other users of 
>that site.
>Does your tool pay attention to /robots.txt?

More information about the freebsd-net mailing list