Fetching directories including subdirectories on HTTP server via fetch or other FreeBSD-own tools?

O. Hartmann ohartman at zedat.fu-berlin.de
Thu Apr 2 02:43:07 PDT 2009


Oliver Fromme wrote:
> O. Hartmann <ohartman at zedat.fu-berlin.de> wrote:
>  > I have run into a problem I cannot solve. I need to fetch a whole
>  > directory tree from a public remote site. The top-level directory and
>  > its subdirectories are accessible via ftp:// and http://, so I tried
>  > fetch, but fetch only retrieves data on a per-file basis and does not
>  > copy a whole directory tree recursively. The remote site does not
>  > offer sftp/sshd for that purpose.
>  > 
>  > Is there a simple way to perform such a task with FreeBSD's own tools
>  > (I try to avoid installing 'wget' and its siblings)? I need to keep it
>  > simple; the task should be performed via a cron job.
> 
> I'm afraid you can't do that with FreeBSD base tools.
> 
> An alternative to wget would be "omi" (ports/ftp/omi)
> which is a simple FTP mirroring tool, written in C
> without any dependencies.  Usage is simple:
> 
> $ omi -s server.name.com -r /remote/dir -l ./local/dir
> 
> Note that, by default, it tries to synchronize the local
> dir perfectly, i.e. if the remote dir is empty, it will
> wipe out the local dir.  (The option "-P 0" will prevent
> omi from removing anything.)
> 
> Best regards
>    Oliver
> 

Thanks for so many answers.

I tried 'omi', but I find that the tool does not traverse deeper into a 
directory than level one, so subdirectories seem to be left out. I will 
try wget, although that tool would not be my first choice; roughly along 
the lines of the sketch below.
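For the record, here is the rough, untested sketch I have in mind, assuming
wget is installed from ports (ftp/wget) and reusing the placeholder server
and directory names from Oliver's omi example:

$ wget --mirror -np -nH -P ./local/dir http://server.name.com/remote/dir/

--mirror recurses and only re-fetches files that have changed, -np keeps it
from climbing above the start directory, -nH drops the host name from the
local path, and -P sets the local target directory. For the cron job I would
add -q so that cron only mails when something actually fails, e.g.

0 3 * * * /usr/local/bin/wget -q --mirror -np -nH -P /local/dir http://server.name.com/remote/dir/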


Thanks,
Oliver

