.htaccess file for /var/www

Discussion in 'Installation/Configuration' started by GoremanX, May 16, 2010.

  1. GoremanX

    GoremanX New Member

    Using ISPConfig on Ubuntu 10.04 based on the Perfect Server guide.

    I'm not sure where I'm supposed to set AllowOverride to be able to use an .htaccess file in /var/www. There are threads about this on the forum, but the configuration files seem to have changed since those threads were solved.

    In addition, I'd like to be able to use an .htaccess file for all requests to port 8080. How would I go about doing that?

    What are the minimum permissions needed by an .htpasswd file?

  2. mike_p

    mike_p Member

    An .htaccess file needs to be in a folder that's actually looked at by Apache,
    e.g. in the document root of a website.

    If you want a global directive instead, put it in your httpd.conf.
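    For example, a server-wide directive in httpd.conf (on Debian/Ubuntu the equivalent file is /etc/apache2/apache2.conf) might look like this — a sketch, assuming the default /var/www document root:

    ```apache
    # Allow .htaccess files to override settings for everything under /var/www
    <Directory /var/www/>
        AllowOverride All
    </Directory>
    ```

    Remember to reload Apache afterwards (e.g. /etc/init.d/apache2 reload) for the change to take effect.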
  3. GoremanX

    GoremanX New Member

    But /var/www is the document root of a website. It's the server's root website. If I type myserver.com or my server's IP address into the browser, I get the index.html from /var/www.

    I have an .htaccess file there, but it does nothing. I tried adding the following to httpd.conf:

    <Directory /var/www>
    AllowOverride All
    Order Deny,Allow
    Deny from all
    </Directory>
    and restarted Apache. It didn't work. The .htaccess file still does nothing.
  4. ChaosRealm

    ChaosRealm New Member


    If you're using Debian/Ubuntu, the reason is most likely that /etc/apache2/sites-enabled/000-default already configures /var/www as follows:
            <Directory /var/www/>
                    Options Indexes FollowSymLinks MultiViews
                    AllowOverride None
                    Order allow,deny
                    allow from all
            </Directory>
    which overrides your changes in httpd.conf.
    To allow processing of .htaccess directives for files inside /var/www, you need to change the AllowOverride line in that file to something other than None.
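    Once AllowOverride is set, a minimal .htaccess dropped into /var/www makes it easy to check that the file is actually being read — a sketch:

    ```apache
    # /var/www/.htaccess — deny everything as a quick test
    Order Deny,Allow
    Deny from all
    ```

    If Apache now returns 403 Forbidden instead of serving index.html, the override is working.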


    Last edited: May 16, 2010
  5. GoremanX

    GoremanX New Member

    That's the file I was looking for! Thank you!

    Instead of using an .htaccess file, I just changed /etc/apache2/sites-enabled/000-default like this:

            <Directory /var/www/>
                    Options Indexes FollowSymLinks MultiViews
                    AllowOverride All
                    Order deny,allow
                    deny from all
            </Directory>
    And now the spam/scraper/harvester bots are hitting a "permission denied" page rather than a "file not found" page.
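    On the .htpasswd question from the first post: the file only needs to be readable by the Apache worker user (www-data on Ubuntu), so mode 640 with group www-data, or 644, is enough. A sketch of using it from an .htaccess file — the AuthUserFile path here is just an example, keep the file outside the document root:

    ```apache
    # Basic authentication via .htaccess; the .htpasswd must be
    # readable by the Apache user (www-data), e.g. chmod 640
    AuthType Basic
    AuthName "Restricted"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user
    ```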
