This post shows how to add a first security layer to your Apache web server, based on an Apache 2.4 installation on CentOS.
All changes are made in the “/etc/httpd/conf/httpd.conf” file, and the httpd service must be reloaded after they are applied.
1- Avoid showing the Apache version
By default, HTTP error responses (403, 404, etc.) reveal the version of Apache running on your server.
To avoid that, add the following two lines to the end of the “/etc/httpd/conf/httpd.conf” file:
```apache
ServerTokens Prod
ServerSignature Off
```
After reloading the httpd service, the responses should identify the server only as ‘Apache’:
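You can verify the change with a quick client request. As a local stand-in (no Apache instance is assumed to be running here), the sketch below uses Python’s built-in web server, which likewise advertises its version in the `Server` header; against a hardened Apache the same check should print just “Apache”:

```python
# Inspect the "Server" response header of a running web server.
# Assumption: Python's http.server stands in for Apache on a local port.
import http.client
import http.server
import threading

# Start a throwaway local server on a random free port.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("HEAD", "/")
response = conn.getresponse()
# The stdlib server reveals its version here, e.g. "SimpleHTTP/0.6 Python/3.x";
# with ServerTokens Prod, Apache would answer simply "Apache".
print(response.getheader("Server"))
server.shutdown()
```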
2- Restrict the HTTP Method of the requests
Most web applications only need the GET, POST and HEAD methods. Allowing the TRACE method is dangerous, since it can be used in cross-site tracing attacks to bypass XSS protections and steal session cookies: a TRACE response echoes back to the client the request exactly as the server received it, headers included. Here is an example of how it could be used to steal cookies (modern browsers reject TRACE in XMLHttpRequest, but the method should still be disabled server-side):
```html
<script>
var x = new XMLHttpRequest();
x.open('TRACE', 'http://example.com/', false);
x.setRequestHeader("Max-Forwards", "0");
x.send();
if(200 == x.status) alert(x.responseText);
</script>
```
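To see why TRACE is risky, here is a toy Python server (an assumption for illustration, not Apache’s actual TRACE implementation) that reflects the request back the way TRACE does, so any `Cookie` header becomes readable response text:

```python
# Toy TRACE endpoint: echoes the request line and headers back to the client,
# which is exactly how TRACE exposes Cookie values (even HttpOnly ones).
import http.client
import http.server
import threading

class TraceHandler(http.server.BaseHTTPRequestHandler):
    def do_TRACE(self):
        # Reflect the request, as TRACE is defined to do.
        body = (self.requestline + "\r\n" + str(self.headers)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "message/http")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), TraceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("TRACE", "/", headers={"Cookie": "SESSIONID=secret"})
body = conn.getresponse().read().decode()
print("SESSIONID=secret" in body)  # True -- the cookie is reflected
server.shutdown()
```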
To disable TRACE requests, add the following line to the httpd.conf file:
```apache
TraceEnable off
```
To check whether the TRACE method is enabled on your server, run the following command:
```shell
curl -X TRACE 81.171.19.31
```
If you get a “405 Method Not Allowed” response, TRACE is disabled. You can also use the nmap tool to check which HTTP methods are enabled:
```shell
nmap -p 80 --script http-methods example.com
```
Finally, you can restrict the allowed methods to an explicit list (disallowing PUT, OPTIONS, etc.) with the AllowMethods directive in httpd.conf; note that this directive requires the mod_allowmethods module to be loaded:
```apache
<Location "/">
    AllowMethods GET POST HEAD
</Location>
```
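The logic behind such an allow-list can be sketched in a few lines of Python (a toy illustration, not Apache code): anything outside the explicit set is refused with 405 Method Not Allowed.

```python
# Toy model of HTTP method allow-listing, as AllowMethods enforces it.
ALLOWED_METHODS = {"GET", "POST", "HEAD"}

def status_for(method: str) -> int:
    """Return 200 for an allowed method, 405 otherwise."""
    return 200 if method.upper() in ALLOWED_METHODS else 405

print(status_for("GET"))    # 200
print(status_for("TRACE"))  # 405
print(status_for("PUT"))    # 405
```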
3- Deny CGI script execution
If your website doesn’t need to run any CGI scripts on the server side, you should deny access to the default CGI binaries path:
```apache
<Directory "/var/www/cgi-bin">
    AllowOverride None
    Options None
    Require all denied
</Directory>
```
You should also comment out the following line, which maps the /cgi-bin/ URL path to that directory of executable scripts:
```apache
#ScriptAlias /cgi-bin/ "/var/www/cgi-bin/"
```
4- Disable default options
If you are not using CGI scripts or Server Side Includes (SSI), you should disable those options so Apache does not interpret such commands in .shtml or .cgi files. In addition, directory indexing is enabled by default; disable it too so visitors cannot browse through your website’s directory listings:
```apache
Options -Includes -Indexes -ExecCGI
```
5- Enable HttpOnly flag
HttpOnly is an additional flag included in the Set-Cookie HTTP response header. When a cookie carries it, client-side JavaScript cannot read that cookie, so a session cookie stays protected even if an XSS flaw is exploited (provided the browser supports the flag).
You should add the following line to httpd.conf:
```apache
Header edit Set-Cookie ^(.*)$ $1;HttpOnly
```
Reload the httpd service and check that HttpOnly is now being appended to the Set-Cookie headers.
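As a rough illustration of what that `Header edit` substitution does to each outgoing Set-Cookie value (Python’s `re` module is used here as an analogue; Apache uses its own regex engine, but for this pattern the behavior matches):

```python
# Simulate `Header edit Set-Cookie ^(.*)$ $1;HttpOnly` on one cookie value.
import re

cookie = "SESSIONID=abc123; Path=/"
# Capture the whole value and append the HttpOnly flag.
hardened = re.sub(r"^(.*)$", r"\1;HttpOnly", cookie)
print(hardened)  # SESSIONID=abc123; Path=/;HttpOnly
```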
6- Deny direct HTTP requests against the server IP
If you serve a domain from Apache without using <VirtualHost> directives, anyone who sends a request directly to the server IP (http://X.X.X.X) will most likely reach your website just as if they were browsing the domain normally.

This is an unwanted situation, since every Internet bot probing your server IP will get a 200 response code. To avoid it, you can deny all requests addressed to the bare IP by splitting the traffic with <VirtualHost> directives:
```apache
<VirtualHost *:80>
    ServerName your_server_IP
    DocumentRoot /var/www/html
    <Location />
        Require all denied
    </Location>
</VirtualHost>

<VirtualHost *:80>
    ServerName your_domain.com
    DocumentRoot /var/www/html/your_domain
</VirtualHost>
```
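The dispatch rule this pair of virtual hosts implements can be sketched as a toy Python function (an illustration only, not Apache code; the IP below is a placeholder from the TEST-NET-3 range): requests whose Host header names the real domain are served, while anything else, including the bare server IP, lands in the deny-all virtual host.

```python
# Toy model of name-based virtual host dispatch with a deny-all fallback.
SERVER_IP = "203.0.113.10"  # placeholder server address

def dispatch(host_header: str) -> int:
    """Return the HTTP status a request with this Host header would get."""
    vhosts = {
        SERVER_IP: 403,          # first <VirtualHost>: Require all denied
        "your_domain.com": 200,  # second <VirtualHost>: the real site
    }
    # Anything that matches no named vhost falls back to the deny-all one.
    return vhosts.get(host_header.lower(), 403)

print(dispatch("your_domain.com"))  # 200
print(dispatch(SERVER_IP))          # 403
```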
7- Block requests based on User-Agent
The Internet is full of unwanted bots, hacking tools and vulnerability scanners that identify themselves with their own User-Agent header in HTTP requests. You can deny all requests coming from those User-Agents to prevent them from gathering information about your website or exploiting it.
Here is an example mod_rewrite configuration for httpd.conf that blocks those User-Agents:
```apache
# Requires mod_rewrite to be loaded.
RewriteEngine On

# Non-standard User-Agent beginnings
RewriteCond %{HTTP_USER_AGENT} ^-?$ [NC,OR]
RewriteCond %{HTTP_USER_AGENT} "^\(" [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^\' [NC,OR]
RewriteCond %{HTTP_USER_AGENT} "^\ " [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^\" [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^- [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^= [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^\.$ [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^\\$ [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^\d [NC,OR]

# Bad bots blocking
RewriteCond %{HTTP_USER_AGENT} ^$ [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^binlar [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Bolt\ 0 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^BOT\ for\ JCE [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^clshttp [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^cmsworldmap [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^ContactBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^DIIbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^dotbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^extract [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^FHscan [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^g00g1e [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^grab [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^harvest [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InternetSeer\.com [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Java [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^libwww [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^masscan [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^miner [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^muhstik-scan [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^nutch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^psbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^purebot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^pycurl [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Rippers\ 0 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SeaMonkey$ [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^sitecheck\.internetseer\.com [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^sucker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^ZmEu [NC,OR]
RewriteCond %{HTTP_USER_AGENT} AhrefsBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} CazoodleBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} discobot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ecxi [NC,OR]
RewriteCond %{HTTP_USER_AGENT} HTTP::Lite [NC,OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} httplib [NC,OR]
RewriteCond %{HTTP_USER_AGENT} IDBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} IRLbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Microsoft\ URL\ Control [NC,OR]
RewriteCond %{HTTP_USER_AGENT} MJ12bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} PECL::HTTP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} PHPCrawl [NC,OR]
RewriteCond %{HTTP_USER_AGENT} PleaseCrawl [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Snoopy [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Sogou [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Steeler [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Surf15a [NC,OR]
RewriteCond %{HTTP_USER_AGENT} URI::Fetch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Python-urllib [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Web\ Sucker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} XoviBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} zermelo [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ZyBorg [NC,OR]

# Block known vulnerability scanners
RewriteCond %{HTTP_USER_AGENT} ^Acunetix [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WPScan [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^w3af.sourceforge.net [NC,OR]
RewriteCond %{HTTP_USER_AGENT} dirbuster [NC,OR]
RewriteCond %{HTTP_USER_AGENT} nikto [NC,OR]
RewriteCond %{HTTP_USER_AGENT} wpscan [NC,OR]
RewriteCond %{HTTP_USER_AGENT} SF [NC,OR]
RewriteCond %{HTTP_USER_AGENT} sqlmap [NC,OR]
RewriteCond %{HTTP_USER_AGENT} fimap [NC,OR]
RewriteCond %{HTTP_USER_AGENT} nessus [NC,OR]
RewriteCond %{HTTP_USER_AGENT} whatweb [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Openvas [NC,OR]
RewriteCond %{HTTP_USER_AGENT} jbrofuzz [NC,OR]
RewriteCond %{HTTP_USER_AGENT} libwhisker [NC,OR]
# The last condition must use [NC] without [OR] so the chain terminates correctly.
RewriteCond %{HTTP_USER_AGENT} webshag [NC]
RewriteRule ^.* - [F,L]
```
Now, if we try to scan our website with a vulnerability scanner such as ‘sqlmap’ or ‘wpscan’, we won’t get any information (unless we spoof the User-Agent).
Example 1 (Sqlmap)
Example 2 (WPscan)
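The matching semantics of those RewriteCond patterns can be mirrored in Python for quick experimentation (an illustration with a small subset of the patterns above, not Apache code): `[NC]` corresponds to case-insensitive matching, and patterns without a leading `^` match anywhere in the User-Agent string, just as in mod_rewrite.

```python
# Mimic a few of the mod_rewrite User-Agent conditions with Python regexes.
import re

BLOCKED_PATTERNS = [r"^sqlmap", r"^WPScan", r"nikto", r"HTTrack"]

def is_blocked(user_agent: str) -> bool:
    """True if any pattern matches, case-insensitively (like [NC])."""
    return any(re.search(p, user_agent, re.IGNORECASE) for p in BLOCKED_PATTERNS)

print(is_blocked("sqlmap/1.4.7#stable (http://sqlmap.org)"))  # True
print(is_blocked("Mozilla/5.0 (X11; Linux x86_64)"))          # False
```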
8- Disable all unnecessary Apache modules
It is recommended to disable all unnecessary Apache modules in the configuration files under “/etc/httpd/conf.modules.d/”. Simply comment out the LoadModule lines for the modules you want to disable and reload the httpd service.
For example, if you are not using your Apache server as a proxy, you can disable the following modules:
```
proxy_module
proxy_balancer_module
proxy_ftp_module
proxy_http_module
proxy_ajp_module
proxy_connect_module
```
I hope these guidelines help you harden your server. I will post more security tips for hardening Apache as well as other services.