Restricting users to their own home directories / not letting users view other users' files...?

Paul Schmehl pauls at utdallas.edu
Thu Feb 12 09:44:10 PST 2009


--On Thursday, February 12, 2009 10:04:59 -0600 Keith Palmer 
<keith at academickeys.com> wrote:

>
>
> Your other proposed solution results in the same situation, correct? No
> matter what, Apache needs read-access to any and all files, so no matter
> what PHP will have access to read any user's files. There's no way around
> that for a shared hosting situation that I know of...
>
> If you remove the groups write privs, then PHP scripts can't really do any
> damage at least.
>
>
> Your solution doesn't work because the user "keith" could still do a "ls
> /home/shannon/public_html/" and get the directory listing (shannon's
> public_html directory is 0755, per your suggestion). Unless I'm missing
> something...?
>

If you set the world-readable bit, you break the entire scheme.  To make it 
work, world must have no access - not even directory search access.  So you set 
u=rwx,g=rxs,o= (or 2750) on the homedirs and u=rw,g=r,o= (or 640) on the files.  
To maintain the scheme you would also need to change the users' umask to 027 
(or script a periodic permission change to remove the world bits from new 
files).
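
As a rough sketch (assuming the web server runs as group www and the account is 
shannon - adjust names and paths to your own setup), the initial tightening 
might look like this:

  # group www needs to own the tree so the g= bits mean anything
  chgrp -R www /home/shannon
  # directories: owner full, group read/search, setgid, world nothing (2750)
  find /home/shannon -type d -exec chmod 2750 {} \;
  # files: owner read/write, group read, world nothing (640)
  find /home/shannon -type f -exec chmod 640 {} \;
  # new files should come up without the world bits as well
  echo 'umask 027' >> /home/shannon/.profile

On FreeBSD you can also set the umask per login class in /etc/login.conf 
instead of touching each user's shell profile.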

If you want to get more granular, you can set the homedirs and all their 
subdirs to owner:owner and only set the public_html dir and its subdirs to 
owner:www.  The key is to remove world access from the homedirs and everything 
under them, set the group on the web content to www, set the setgid bit on the 
directories and change the umask.  Once you've done that, it's pretty much 
maintenance-free.  It wouldn't hurt to script something that crawls the 
homedirs periodically looking for permission problems, just in case something 
crops up.
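
For the more granular layout, something along these lines would do it (again 
assuming user shannon and web server group www; purely illustrative):

  # the homedir and everything in it stays owner:owner, no world access
  chown -R shannon:shannon /home/shannon
  chmod -R o-rwx /home/shannon
  # only the web content gets the www group, with setgid on the dirs so
  # new files and subdirs inherit it
  chgrp -R www /home/shannon/public_html
  find /home/shannon/public_html -type d -exec chmod 2750 {} \;
  find /home/shannon/public_html -type f -exec chmod 640 {} \;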

The webserver only needs read access to files (unless the application you're 
running has some special requirements).  You can make a Perl script (or PHP, 
Python, Tcl, you name it) read-only and then configure Apache so it's run from 
within Apache (via mod_perl, mod_php and so on) rather than executed directly 
off the disk.
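
For example, with mod_php the script only has to be readable by the server, 
never executable by anyone (a sketch - paths and directives will vary with 
your install):

  # the .php files themselves are chmod 640, owner:www -- no execute bit
  <Directory "/home/shannon/public_html">
      Options -Indexes -ExecCGI
      AllowOverride None
      AddType application/x-httpd-php .php
  </Directory>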

Most application vendors tend to "err" on the side of too-loose perms, 
demanding rwx for everything when that's really not needed.  You can play 
around with the perms and see what breaks, then roll the new set out once 
you've figured out what's needed.  But, if you do it right, world doesn't need 
any access at all, and that's going to be a requirement going forward to keep 
others from seeing the files.  If world has access, anyone on the server has 
access.
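
A quick way to see where the world bits still are while you're testing (a 
sketch using FreeBSD's find syntax; adjust the path to your layout):

  # list anything under the homedirs that world can read, write or execute
  find /home -perm +0007 -ls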

On the webserver I maintain, world has no access at all.  Individual dirs may 
have differing access rights depending on who needs to get into them, but 
world is excluded.  This means an attacker has to become root or the webserver 
user before he can even see the web stuff, and only root would have more than 
read access.

If the webserver has read-only access to the files, then an attacker is 
limited to exploiting vulnerabilities in the webserver or the applications 
running on it.

I strongly suggest you install and use mod_security (if you're not already) to 
protect against that sort of attack.  It's very lightweight and works quite 
well.  There's an active user community, and you can protect against known 
vulnerabilities with the right filters in place.
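
A minimal mod_security 2.x setup might look something like this (illustrative 
only - the module paths follow FreeBSD's Apache 2.2 layout, and in practice 
you'd load a published rule set such as the Core Rules rather than one 
hand-written filter):

  # mod_security 2.x needs mod_unique_id loaded as well
  LoadModule unique_id_module libexec/apache22/mod_unique_id.so
  LoadModule security2_module libexec/apache22/mod_security2.so
  <IfModule security2_module>
      SecRuleEngine On
      SecRequestBodyAccess On
      # example filter: reject obvious remote-file-include attempts in arguments
      SecRule ARGS "https?://" "deny,status:403,msg:'possible RFI attempt'"
  </IfModule>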

-- 
Paul Schmehl (pauls at utdallas.edu)
Senior Information Security Analyst
The University of Texas at Dallas
http://www.utdallas.edu/ir/security/
