Data management (devel) and network size increase
iaccounts at ibctech.ca
Tue Nov 20 15:40:33 PST 2007
I know there are quite a few members here who have grown through the ranks
and through network-size increases since I joined this list, so I'll ask
my question here. Pardon the length, but if you read through, I'm
certain I'll get good feedback.
Our network has grown exponentially in the last few years (without much
forward-looking thought), and I have development work, data files, etc.
scattered across every server that has been deployed. My goals:
- Overhaul the entire network from Layer 1 up.
- Be able to access data very quickly no matter what box I am on, and
  to do it securely.

More in depth:
Let's assume I have 30 servers, all within one PoP. Instead of having to
SSH into server 'A' from server 'C' to look for a file I need for a
program I have written (or just a normal data file), I want a way to
have one source for ALL my data, and then, from ALL servers, be able to:
# find /home/steve -name file.name
Instead, I have to either remember (yeah, good luck) or guess which
server the file is on, look for it remotely, then scp it over.
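What that single find buys can be demonstrated locally. Below is a
self-contained sketch where a temporary directory stands in for the
consolidated tree (the paths and file name are illustrative stand-ins,
not real server paths):

```shell
#!/bin/sh
# Simulate the consolidated layout: one shared tree, one local find.
# A mktemp directory stands in for the real mounted /home/steve.
root=$(mktemp -d)
mkdir -p "$root/home/steve/project"
touch "$root/home/steve/project/file.name"

# Once every server mounts the same tree, a single local find suffices:
find "$root/home/steve" -name file.name

rm -rf "$root"
```

The point of consolidation is exactly this: the search is always local,
no matter which of the 30 boxes you happen to be on.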
For development, I had CVS set up at one point, but I found it to be too
much effort for my simple tasks (I was probably using it beyond what I
needed). I know about NFS, but I've heard it is a hassle to configure
initially and relatively insecure. (This is not my own position, and I'm
willing to be corrected.)
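For what it's worth, a basic NFS setup on FreeBSD is only a few lines of
configuration. This is a sketch under stated assumptions: the server
name `fileserver` and the 192.168.1.0/24 subnet are placeholders, not
anything from the original setup:

```shell
# On the file server, export the tree in /etc/exports,
# restricted to the PoP's own subnet (placeholder network):
#   /home/steve -network 192.168.1.0 -mask 255.255.255.0
#
# Enable the NFS daemons in /etc/rc.conf:
#   rpcbind_enable="YES"
#   nfs_server_enable="YES"
#   mountd_enable="YES"
#
# On each client, mount the export (run as root):
mount -t nfs fileserver:/home/steve /home/steve
```

The usual security caveat is that plain NFS trusts the network, which is
why the export above is limited to a single subnet; whether that is
acceptable depends on how isolated the PoP's internal network is.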
Does anyone have recommendations on how I can consolidate my data,
including development work (almost all Perl), so that it can be accessed
as I would a local directory tree? Beyond that, I'd also welcome
suggestions specifically for the development files. If CVS is the way to
go for those, I'll do it, but file management/access is more important.
A key factor will be the ease of integrating a new server into the
existing setup. I have no problem throwing up a box with GELI, but it's
gaining access to the data as if it were a local drive that I want to
make easy.
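On the ease-of-integration point: if something like NFS is chosen,
folding a new box in can be one mount point plus one /etc/fstab line.
A sketch, again with `fileserver` as a placeholder name (run as root on
a fresh client):

```shell
# One-time setup on a newly installed client:
mkdir -p /home/steve

# Make the mount permanent across reboots (fstab(5) format):
#   fileserver:/home/steve  /home/steve  nfs  rw  0  0
echo 'fileserver:/home/steve /home/steve nfs rw 0 0' >> /etc/fstab

# Mount it now, using the fstab entry just added:
mount /home/steve
```

After that, the box sees /home/steve exactly as if it were a local
directory tree, which is the access pattern being asked for.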
More information about the freebsd-questions mailing list