I am letting my brother-in-law run a non-profit club website for US military folk. His site has been slowing down. From the logs, one of the main culprits is a bot/spider from China called Baidu. I have blocked it using robots.txt, but it doesn't heed the disallow.
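For what it's worth, the robots.txt entry is nothing exotic, roughly this (Baiduspider just ignores it):

# current robots.txt entry - Baiduspider keeps crawling regardless
User-agent: Baiduspider
Disallow: /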
Anyway, he has several gigabytes of usage per month just from Baidu and some Russian bot. Oddly enough, all the other overseas bots combined don't take up as much bandwidth as Baidu and this Russian bot do, lol.
His military folks could very well be overseas, but they do NOT need to find the site via a search engine. And since my server mainly hosts family and friends' sites, I want to block these bots server-wide rather than altering the .htaccess file on each site.
So I read this page:
The article mentions adding the following directives to my httpd.conf file, which I can find easily enough.
SetEnvIfNoCase User-Agent "^Baiduspider" bad_bots
SetEnvIfNoCase User-Agent "^Sogou" bad_bots
SetEnvIf Remote_Addr "212\.100\.254\.105" bad_bots
Allow from all
Deny from env=bad_bots
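From what I've read, those Allow/Deny lines only take effect inside a directory context and with an Order directive, so I'm guessing the complete block would look something like this (the /var/www path is just my guess at the document root, and the env variable name has to match the SetEnvIf lines):

# flag the bandwidth-hogging spiders by User-Agent, plus one by IP
SetEnvIfNoCase User-Agent "^Baiduspider" bad_bots
SetEnvIfNoCase User-Agent "^Sogou" bad_bots
SetEnvIf Remote_Addr "212\.100\.254\.105" bad_bots

<Directory "/var/www">
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bots
</Directory>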
But httpd.conf on my server is currently empty. Is there a different file I should put this into for ISPConfig 3?
Are there any better suggestions for banning these bots?
Thanks as always, folks,