Virtual Host Insecurity

by Neil Fraser, April 2002

The vast majority of websites do not run on dedicated servers. They run on virtual hosts, with hundreds of websites all sharing one server. Each site has its own account, and most operating systems are quite capable of keeping one account from interfering with another. The problem is that most web servers are not capable of making this distinction.

This document examines the security problems of virtually hosting two accounts under Apache on a Unix system. Similar problems may (or may not) occur under other configurations.

World writable

Let's consider the case of a website that has a guestbook, discussion forum, hit counter or some other simple web application that requires one or more data files. All the files in this website would be owned by the site's account (let's call it good_client). Since the shared web server runs under its own account (let's call it www), it doesn't have permission to update good_client's data files when it needs to. The usual solution is to chmod the data files "o+w", meaning that everyone has write access to them. It doesn't take much thought to realise that if an enemy of good_client wants to severely mess up the data, they can sign up for an account (let's call it bad_client) on the same server, then happily rewrite the data files as much as they like. World-writable files have no security whatsoever. Nevertheless, that is what most hosting companies (in my experience) advise their clients to do.
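Here is a minimal sketch of the problem (the file names, paths and ls output are illustrative):

    # good_client makes the guestbook data world-writable so that www can update it:
    good_client$ chmod o+w guestbook.dat
    good_client$ ls -l guestbook.dat
    -rw-r--rw-    1 good_client good_client    4096 Apr 14 12:00 guestbook.dat

    # But "other" means every account on the machine, not just www:
    bad_client$ echo "vandalised" > /home/good_client/www/guestbook.dat

The trailing "rw-" in the permission string is the giveaway: any local user can read or rewrite the file directly, no web server required.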

Group writable

The data files should really be editable only by the web server, not by everyone. So the obvious solution is to chgrp the data files to www and then chmod them "g+w". Now it is impossible for bad_client to edit them directly. Unfortunately bad_client has the same CGI access that good_client has, so they can write a simple CGI script that performs whatever edit they want. All they need to do is point a web browser at their script, and the web server will execute it using its own permissions. Everyone on the server can execute scripts with www's authority. No security there either.
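In a sketch (names and paths are again illustrative), the tightened permissions look like this:

    good_client$ chgrp www guestbook.dat
    good_client$ chmod g+w,o-w guestbook.dat

and this is all the CGI script that bad_client needs in order to get around them:

    #!/bin/sh
    # Served from bad_client's own site, but executed by the web server as www,
    # which has group write access to good_client's data file.
    echo "Content-type: text/plain"
    echo ""
    echo "vandalised" > /home/good_client/www/guestbook.dat
    echo "Done."

One visit to the script's URL and the web server vandalises the file on bad_client's behalf.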

SetUID

What we really want is to execute our CGI programs with our own permissions, not www's. And you can do exactly that: just chmod your scripts "u+s". The first problem is that most FTP clients and servers are unable to do this, so you'd usually need shell access. The second problem is that every time you edit the script, the setUID bit is lost, so you have to chmod it over and over. The third problem is that Perl gets very nervous about running setUID (it switches on taint checking automatically) and will start raising warnings all over the place. However, if you can resolve these pesky issues, your scripts run under your own permissions and your data files can be "go-rw", secure from direct reading and writing.
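A sketch of the dance involved (the script name is invented):

    good_client$ chmod u+s guestbook.cgi
    good_client$ ls -l guestbook.cgi
    -rwsr-xr-x    1 good_client good_client    2048 Apr 14 12:00 guestbook.cgi

    # ...edit or re-upload guestbook.cgi, and the "s" silently reverts to "x"...
    good_client$ chmod u+s guestbook.cgi

The "s" in place of the owner's execute bit is what tells the kernel to run the program as its owner (good_client) rather than as the user who invoked it (www).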

Unfortunately most back-end maintenance scripts (such as those that edit data files or report credit card numbers) rely on .htaccess files to authenticate the user. Bad_client can bypass the .htaccess directives by writing their own CGI script that simply calls your 'protected' CGI script. Thus bad_client can use your own administrative system to read or edit your data.
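For example, suppose the admin scripts sit behind an .htaccess along these lines (a hypothetical setup):

    AuthType Basic
    AuthName "Administration"
    AuthUserFile /home/good_client/.htpasswd
    Require valid-user

Those directives are only enforced on requests that arrive through Apache. A trivial CGI script in bad_client's own account:

    #!/bin/sh
    echo "Content-type: text/plain"
    echo ""
    # Invoke good_client's setUID admin script directly, never touching .htaccess:
    /home/good_client/www/admin/edit_data.cgi

executes the 'protected' program without the password ever being asked for.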

Multiple daemons

So how does Apache suggest you solve this problem?

    Use multiple daemons when: There are security partitioning issues, such as company1 does not want anyone at company2 to be able to read their data except via the web. In this case you would need two daemons, each running with different User, Group, Listen, and ServerRoot settings.

This is perfectly sound advice: run a separate web server for each account. Unfortunately it defeats the point of virtual hosting, since running hundreds of web servers is going to start eating up memory alarmingly quickly. If you are going to go this route, you are getting very close to co-location.
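To sketch what this looks like (ports and paths invented), each client runs their own httpd from their own configuration:

    # /home/good_client/apache/conf/httpd.conf
    User good_client
    Group good_client
    Listen 8001
    ServerRoot /home/good_client/apache

    # /home/bad_client/apache/conf/httpd.conf
    User bad_client
    Group bad_client
    Listen 8002
    ServerRoot /home/bad_client/apache

Each daemon, along with its pool of child processes, carries its own memory footprint; multiply that by hundreds of accounts and the arithmetic stops working.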

CGIWrap

CGIWrap is a great way to run scripts under your own permissions instead of the web server's. And the best part is that other users can't run your scripts or trick the web server into running them. The downside is that .htaccess files are completely ignored. From the web server's point of view, it is only ever calling one program: cgiwrap. So it is incapable of applying different access restrictions to the end scripts. This means you'd need a separate installation of cgiwrap for each mode (one for public scripts, one for private scripts, one for member scripts, etc.) and a separate script directory for each one. It works, and is secure, but isn't very convenient. It also requires root to install each instance.
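To give a flavour of how it works (the host, user and script names are placeholders), scripts are requested through the wrapper rather than directly:

    http://www.example.com/cgi-bin/cgiwrap/good_client/guestbook.cgi

cgiwrap is installed setUID root; it verifies that the requested script really lives in good_client's designated CGI directory, drops its privileges to good_client, and only then executes the script. From Apache's perspective every one of those requests is a call to the same program, which is why per-script .htaccess restrictions can't be applied.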

suEXEC

suEXEC is exactly the same idea as CGIWrap except that it is internal to Apache. And that makes a huge difference. Unlike all external wrappers, suEXEC has the ability to honour .htaccess directives. suEXEC provides perfect isolation for every user of the server. This is the way things should be. The only unfortunate part is that for various reasons the Apache group decided to actively deter people from using it, by leaving it out of the default Apache installation and littering the documentation with scary warnings. If you are shopping for a Unix/Apache hosting provider for your website, make sure that they have suEXEC installed. Not only will your data be secure, but it is also an indication of competence.
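Under Apache 1.3, enabling it is pleasantly anticlimactic: once the server is built with suEXEC support, a virtual host gets its own identity from its User and Group directives (a sketch, with an invented hostname):

    <VirtualHost *>
        ServerName www.good-client.example
        DocumentRoot /home/good_client/www
        User good_client
        Group good_client
    </VirtualHost>

CGI scripts under that virtual host are handed to the suexec wrapper, which runs a battery of ownership and permission checks before executing them as good_client rather than as www.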

-------------------------------------

Disclaimer: Although I am a server administrator at Digital Routes, this document is a personal rant, not a professional one. If my advice results in unrecoverable loss of wetware, sue me, not my employer. Naturally I'm very interested in receiving corrections or updates to this document.

Last modified: 14 April 2002