.htaccess
.htaccess is a configuration file stored in a directory, commonly on Unix-like operating systems such as Linux, that grants or denies users or groups access rights to that directory. On a Unix-like operating system, this file should have its permissions set to 640 using chmod. The main .htaccess file for a site is usually located in the root public_html directory, though one can be placed in any directory it should affect. Its contents may look similar to the examples listed below.
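For example, assuming your site's document root is /home/user/public_html (the path is illustrative), the following commands set the recommended permissions:

cd /home/user/public_html
chmod 640 .htaccess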
Full .htaccess example and explanation
Below is a full breakdown of each major segment of a .htaccess file. Each segment can be incorporated into your .htaccess file depending on your account needs.
Be sure to test your web page after implementing any of the below changes. These changes can restrict or redirect your visitors in a way you may not have anticipated.
Lines that begin with # are comments or nonexecutable statements. Also, many examples below use regular expressions to help match characters or files in the URL string.
Set the default character set
#Set the charset for the pages
AddDefaultCharset UTF-8
In the first example, the character set of each page is set to UTF-8. Although this can be specified in a meta tag (e.g., <meta charset="utf-8">), setting it in .htaccess applies it to every document.
Redirect matches found in the URL
#Redirect M$soft and Hacking attempts
RedirectMatch (.*MSOffice)$ /error.htm
RedirectMatch (.*httpodbc\.dll)$ /error.htm
RedirectMatch (.*root\.exe)$ /error.htm
RedirectMatch (.*nt)$ /error.htm
RedirectMatch (.*comments.php)$ /error.htm
In the example above, each RedirectMatch directive redirects any URL matching its regular expression to the error.htm page. These requests can also be forwarded to a script to log matches or handle the visitor differently. For example, the first line matches any URL (uniform resource locator) ending in MSOffice and forwards the request to the error.htm page.
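As a sketch of the logging approach, assuming a hypothetical log.php script in your document root, a match could be sent to that script instead of the static error page, with the matched URL passed along as a query string:

#Forward matched attempts to a logging script (log.php is illustrative)
RedirectMatch ^(.*root\.exe)$ /log.php?req=$1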
Redirect the user with a 410 error
#HTTP 410 don't log files
Redirect gone /crossdomain.xml
Redirect gone /labels.rdf
The next example responds with a 410 error, which means the page the visitor is looking for is gone, is never going to return, and has no forwarding address. A 410 response is a great way to handle requests for pages that aren't on your server but are frequently requested, filling your error log with 404 errors.
Custom error document pages
#Error pages
ErrorDocument 400 /error.php?400
ErrorDocument 401 /error.php?401
ErrorDocument 403 /error.php?403
ErrorDocument 404 /error.php?404
ErrorDocument 405 /error.php?405
ErrorDocument 410 /error.php?410
ErrorDocument 500 /error.php?500
ErrorDocument 501 /error.php?501
In the example above, each of these HTTP (hypertext transfer protocol) errors is directed to a PHP (PHP: Hypertext Preprocessor) script that displays the error to the user and logs it for the webmaster. See our HTTP definition for a full listing of HTTP errors if you need more than what is listed above. Your site may need nothing more than a custom 404 response.
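If a custom 404 page is all you need, a single directive is enough; this sketch assumes a static 404.html file exists in your document root:

ErrorDocument 404 /404.html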
Create a 301 redirect
#HTTP 301 redirect computerhope.com to www.computerhope.com
Options +FollowSymlinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^computerhope\.com [NC]
RewriteRule ^(.*)$ https://www.computerhope.com/$1 [L,R=301,NC]
In the example above, we created a 301 redirect that sends https://computerhope.com to https://www.computerhope.com. The rule uses "L,R=301,NC" as its flags. The "L" is short for "last"; it tells Apache to run no more rewrite rules. "R=301" performs the 301 redirect, and "NC" is short for "no case," making the rule case-insensitive. Creating this type of redirect helps prevent your web pages from getting listed multiple times in search engines and keeps everything consistent. We've also added the Options +FollowSymlinks line to follow symlinks (symbolic links), helping prevent errors from occurring if a file or directory is linked to and isn't an actual file or directory.
RewriteCond %{HTTP_HOST} ^www\.(.*) [NC]
RewriteRule ^(.*) http://%1/$1 [R=301,L]
The example above is another way to create a 301 redirect. In this example, we're directing any www address to a non-www address. So, if implemented, https://www.computerhope.com would become https://computerhope.com. Here, the wildcard pattern .* matches any domain instead of specifying computerhope.com, and the %1 in the rule refers back to whatever that pattern captured.
Secure .htaccess file
# Secure htaccess file
<Files .htaccess>
Order Allow,Deny
Deny from all
</Files>
In this next example, a rule is created preventing anyone from viewing your .htaccess file and the rules you have listed there. These extra lines add additional protection to the .htaccess file.
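The Order and Deny directives above come from Apache 2.2's access control; on Apache 2.4 and later, assuming the mod_authz_core module, the equivalent block is:

# Secure htaccess file (Apache 2.4 syntax)
<Files .htaccess>
Require all denied
</Files>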
Disable directory indexing
# disable directory browsing
Options All -Indexes
In the example above, this security rule prevents anyone from browsing directories on your server. For example, if you have a directory called /files that does not contain an index.html file, that directory's file listing is shown to anyone who requests it. If that directory holds sensitive files (e.g., passwords), the person browsing that folder can view or save any of them, which is a security risk.
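If directory listings are still wanted in one specific folder, a separate .htaccess file placed in that folder can turn them back on; a minimal sketch:

# Re-enable directory browsing for this directory only
Options +Indexes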
Make HTML files act as SSI files
#Allow files with chmod +x to be SSI files
XBitHack on
By turning on XBitHack, you can allow any HTML (hypertext markup language) file with executable permissions (e.g., set with chmod +x) to be treated as an SSI (server-side include) file. This addition is useful for anyone running a website as static HTML files who needs one or more of those pages to use SSI.
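For example, assuming a static page named about.html (the file name is illustrative), the following command marks it for SSI processing:

chmod +x about.html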
Enable website caching
# Month cache
<FilesMatch "\.(gif|jpg|jpeg|pdf|png|ico)$">
Header set Cache-Control "max-age=2592000"
</FilesMatch>

# Week cache
<FilesMatch "\.(js|css|ch|txt)$">
Header set Cache-Control "max-age=604800"
</FilesMatch>

# Day cache
<FilesMatch "\.(html|htm)$">
Header set Cache-Control "max-age=86400"
</FilesMatch>
In the example above, caching is set up to help improve how fast your pages load and to decrease the demand on your server. In the first block, image files and other files are given a max-age of 30 days; if a visitor requests a file they've already viewed once, it is loaded from their computer instead of the server for that period. Next, files such as JavaScript files and CSS (cascading style sheets) files are given a one-week max-age, and finally the HTML files a one-day limit. These values can all be adjusted depending on how often you update these types of files.
The age is represented in seconds: there are 86,400 seconds in one day (60 × 60 × 24), 604,800 in a week, and 2,592,000 in 30 days.
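An alternative approach, assuming the mod_expires module is enabled on your server, sets similar lifetimes by MIME type rather than by file extension; a sketch:

# Equivalent caching with mod_expires (sketch)
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/html "access plus 1 day"
</IfModule>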
Deny visitors based on USER_AGENT
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]
There are several tools and services that index your site looking for e-mail addresses or copy your complete pages. If used improperly, these services are a drain on your server and can also be used maliciously. If you notice these user agents in your visitor logs, they can be blocked using rules similar to the above; the [F] flag returns a 403 Forbidden response to the matching client.
Deny visitors based on IP address
Order Allow,Deny
Deny from 178.239.58.144
Allow from all
In the example above, these lines deny the listed IP address access to your pages. Banning an IP using this method blocks every request from that address to your website.
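To block an entire network range rather than a single address, the Deny directive also accepts CIDR notation; the range below is illustrative:

Deny from 178.239.58.0/24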
Creating a password-protected directory
AuthUserFile /home/directory/.passfile
AuthGroupFile /dev/null
AuthName "Access For Valid Users"
AuthType Basic
<Limit GET>
require valid-user
</Limit>
The AuthUserFile directive points to the file containing the user names and passwords that are granted access to the directory protected by this .htaccess file.
To create a passfile, enter the following command at the prompt.
htpasswd -c .passfile username
After entering the command above, a prompt to enter a password for the username appears.
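To add more users to an existing passfile, run htpasswd without the -c flag, which would otherwise overwrite the file; the user name below is illustrative:

htpasswd .passfile seconduser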
The passfile should also be set to 640 permissions.
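For example, from the directory containing the file:

chmod 640 .passfile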