Need to Stop Bots from Killing my Webserver

You could set your script to throw a 404 error based on the user-agent string provided by the bots - they'll quickly get the hint and leave you alone. (The example below goes a step further and 301-redirects Googlebot back to Google.)

if (isset($_SERVER['HTTP_USER_AGENT'])) {
   $agent = $_SERVER['HTTP_USER_AGENT'];
} else {
   $agent = ''; // no user-agent header sent at all
}

// Googlebot's UA string is usually "Mozilla/5.0 (compatible; Googlebot/2.1; ...)",
// so match anywhere in the string rather than anchoring at the start.
if (preg_match('/googlebot/i', $agent)) {
   header("HTTP/1.1 301 Moved Permanently");
   header("Location: http://www.google.com/");
   exit;
}

Pick through your logs and reject Bingbot, etc. in a similar manner - it won't stop the requests, but it might save some bandwidth - give Googlebot a taste of its own medicine - Mwhahahahaha!
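If you end up rejecting several crawlers, a small pattern list keeps things tidy. This is only a sketch - the names below are common crawler user-agent substrings, but build your own list from what actually shows up in your logs:

// Illustrative list only - fill it from your own logs.
$blockedBots = array('googlebot', 'bingbot', 'slurp', 'baiduspider');

$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

foreach ($blockedBots as $bot) {
   if (stripos($agent, $bot) !== false) {
       header("HTTP/1.1 404 Not Found");
       exit;
   }
}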

Updated

Looking at your code, I think your problem is here:

if (preg_match("/$bot[1]/i",$hostname) && $ipaddress == $iphostname)

If they are malicious bots then they could be coming from anywhere; take that $ipaddress clause out and throw a 301 or 404 response at them.
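Roughly like this - a sketch only, assuming $bot and $hostname already exist in your loop as in the code you posted:

// Sketch: drop the IP comparison and match on the hostname pattern alone.
if (preg_match("/{$bot[1]}/i", $hostname)) {
   header("HTTP/1.1 404 Not Found");
   exit;
}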

Thinking right up by the side of the box

  1. Googlebot never accepts cookies, so it can't store them. In fact, if you require cookies for all users, that's probably going to keep the bot from accessing your page.
  2. Googlebot doesn't understand forms or JavaScript, so you could dynamically generate your links or have the users click a button to reach your code (with a suitable token attached) - see the server-side sketch after this list.

    <a href="#" onclick="document.location='rss2html.php?validated=29e0-27fa12-fca4-cae3';">Rss2html.php</a>

    • rss2html.php?validated=29e0-27fa12-fca4-cae3 - human
    • rss2html.php - bot
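A minimal server-side check for that token might look like the sketch below. The literal token and the validated parameter name are just the example values from above - in a real setup you'd generate the token per session rather than hard-code it:

// rss2html.php - sketch of validating the token the JavaScript link appends.
$expected = '29e0-27fa12-fca4-cae3'; // example value from above; generate per session in practice

if (!isset($_GET['validated']) || $_GET['validated'] !== $expected) {
   // No token, or the wrong one: almost certainly a bot following the bare URL.
   header("HTTP/1.1 403 Forbidden");
   exit;
}

// ...normal rss2html.php output for human visitors continues here...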

If rss2html.php is not being used directly by the client (that is, if it's always PHP using it rather than it being a link or something), then forget trying to block bots. All you really have to do is define a constant or something in the main page, then include the other script. In the other script, check whether the constant is defined, and spit out a 403 error or a blank page or whatever if it's not defined.

Now, in order for this to work, you'll have to use include rather than file_get_contents, as the latter will either just read the file in as text (if you're using a local path) or run it in a completely separate request (if you're using a URL). But it's the method that stuff like Joomla! uses to prevent a script from being included directly. And use a file path rather than a URL, so that the PHP code isn't already executed before you try to run it.
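As a sketch of that pattern (the guard constant's name is arbitrary here):

// mainpage.php - define a guard constant, then include the worker script by file path.
define('IN_MAIN_PAGE', true);
include __DIR__ . '/rss2html.php';

// rss2html.php - refuse to run when it's requested directly.
if (!defined('IN_MAIN_PAGE')) {
   header("HTTP/1.1 403 Forbidden");
   exit;
}
// ...rest of rss2html.php...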

Even better would be to move rss2html.php out from under the document root, but some hosts make that difficult to do. Whether that's an option depends on your server/host's setup.
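If your host does allow it, that can be as simple as keeping the script one level above the web root and including it by path - a sketch, with the private/ directory name purely illustrative:

// mainpage.php (inside the document root); rss2html.php lives outside it,
// so it can never be requested directly over HTTP.
include dirname(__DIR__) . '/private/rss2html.php';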


PHP Limit/Block Website requests for Spiders/Bots/Clients etc.

Here is a PHP function I wrote that can block unwanted requests to reduce your website traffic. It's good for spiders, bots, and annoying clients.

CLIENT/Bots Blocker

DEMO: http://szczepan.info/9-webdesign/php/1-php-limit-block-website-requests-for-spiders-bots-clients-etc.html

CODE:

/* Function which can Block unwanted Requests
 * @return array of error messages
 */
function requestBlocker()
{
        /*
        Version 1.0 11 Jan 2013
        Author: Szczepan K
        http://www.szczepan.info
        me[@] szczepan [dot] info
        ###Description###
        A PHP function which can block unwanted requests to reduce your website traffic.
        Good for spiders, bots and annoying clients.

        */

        # Before using this function you must 
        # create & set this directory as writeable!!!!
        $dir = 'requestBlocker/';

        $rules   = array(
                #You can add multiple rules in an array like this one here
                #Notice that large "sec definitions" (like 60*60*60) will blow up your client file
                array(
                        //if >5 requests in 5 Seconds then Block client 15 Seconds
                        'requests' => 5, //5 requests
                        'sek' => 5, //5 requests in 5 Seconds
                        'blockTime' => 15 // Block client 15 Seconds
                ),
                array(
                        //if >10 requests in 30 Seconds then Block client 20 Seconds
                        'requests' => 10, //10 requests
                        'sek' => 30, //10 requests in 30 Seconds
                        'blockTime' => 20 // Block client 20 Seconds
                ),
                array(
                        //if >200 requests in 1 Hour then Block client 10 Minutes
                        'requests' => 200, //200 requests
                        'sek' => 60 * 60, //200 requests in 1 Hour
                        'blockTime' => 60 * 10 // Block client 10 Minutes
                )
        );
        $time    = time();
        $blockIt = array();
        $user    = array();

        #Set Unique Name for each Client-File 
        $user[] = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : 'IP_unknown';
        $user[] = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        $user[] = strtolower(gethostbyaddr($user[0]));

        # Notice that I use files because bots do not accept Sessions
        $botFile = $dir . substr($user[0], 0, 8) . '_' . substr(md5(join('', $user)), 0, 5) . '.txt';


        if (file_exists($botFile)) {
                $file   = file_get_contents($botFile);
                $client = unserialize($file);

        } else {
                $client                = array();
                $client['time'][$time] = 0;
        }

        # Set/Unset Blocktime for blocked Clients
        if (isset($client['block'])) {
                foreach ($client['block'] as $ruleNr => $timestampPast) {
                        $elapsed = $time - $timestampPast;
                        if (($elapsed ) > $rules[$ruleNr]['blockTime']) {
                                unset($client['block'][$ruleNr]);
                                continue;
                        }
                        $blockIt[] = 'Block active for Rule: ' . $ruleNr . ' - unlock in ' . ($rules[$ruleNr]['blockTime'] - $elapsed) . ' Sec.';
                }
                if (!empty($blockIt)) {
                        return $blockIt;
                }
        }

        # log/count each access
        if (!isset($client['time'][$time])) {
                $client['time'][$time] = 1;
        } else {
                $client['time'][$time]++;

        }

        #check the Rules for Client
        $min = array(
                0
        );
        foreach ($rules as $ruleNr => $v) {
                $i            = 0;
                $tr           = false;
                $sum[$ruleNr] = 0;
                $requests     = $v['requests'];
                $sek          = $v['sek'];
                foreach ($client['time'] as $timestampPast => $count) {
                        if (($time - $timestampPast) < $sek) {
                                $sum[$ruleNr] += $count;
                                if ($tr == false) {
                                        #remember the index of the first timestamp still inside this rule's window
                                        $min[] = $i;
                                        unset($min[0]);
                                        $tr = true;
                                }
                        }
                        $i++;
                }

                if ($sum[$ruleNr] > $requests) {
                        $blockIt[]                = 'Limit : ' . $ruleNr . '=' . $requests . ' requests in ' . $sek . ' seconds!';
                        $client['block'][$ruleNr] = $time;
                }
        }
        $min = min($min) - 1;
        #drop timestamps that no longer fall inside any rule's window
        $j = 0;
        foreach ($client['time'] as $k => $v) {
                if ($j <= $min) {
                        unset($client['time'][$k]);
                }
                $j++;
        }
        file_put_contents($botFile, serialize($client));


        return $blockIt;

}


if ($t = requestBlocker()) {
        echo "don't pass here!";
        print_r($t);
} else {
        echo "go on!";
}
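
To protect the page the bots are hammering, you could put the check at the very top of rss2html.php and send a real 403 instead of just echoing text. The requestBlocker.php filename below is only an assumption about where you saved the function:

// Top of rss2html.php - reject over-eager clients before doing any real work.
require_once __DIR__ . '/requestBlocker.php'; // assumed filename for the function above

if (requestBlocker()) {
        header("HTTP/1.1 403 Forbidden");
        exit;
}
// ...normal rss2html.php processing continues here...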