Get Google Search Results With PHP – Google AJAX API And The SEO Perspective

If you’ve ever tried to write a program that fetches search results from Google, you’ll no doubt be familiar with the excruciating annoyances of parsing the results and getting blocked periodically. Run a couple of hundred queries in a row and bam! – your script is banned until proven innocent by entering a captcha. Even that would provide only a short reprieve, as you’d soon get blocked again.

Luckily there’s an official Google search API that will let you avoid that hassle. In this post you’ll find an example PHP script and a (mainly) SEO-oriented review of the API.

Using the AJAX API in PHP

I must confess that until yesterday I didn’t know you could use the Google AJAX search API in languages other than JavaScript. The documentation didn’t even mention the possibility when the API was first released. Well, it does now, and PHP is among the supported languages. Oh, the joy.

The API is already pretty well documented, so I won’t waste your time with another lengthy tutorial. Instead, here’s a simple example of how you could use it in PHP:

/**
 * google_search_api()
 * Query the Google AJAX Search API.
 *
 * @param array $args URL arguments. For most endpoints only "q" (query) is required.
 * @param string $referer Referer to send in the HTTP header (must be a valid URL).
 * @param string $endpoint API endpoint. Defaults to 'web' (web search).
 * @return object|null Decoded response object, or NULL on failure.
 */
function google_search_api($args, $referer = 'http://localhost/test/', $endpoint = 'web'){
	$url = "http://ajax.googleapis.com/ajax/services/search/".$endpoint;
	
	// The API version argument is required, so default it to 1.0.
	if ( !array_key_exists('v', $args) )
		$args['v'] = '1.0';
	
	$url .= '?'.http_build_query($args, '', '&');
	
	$ch = curl_init();
	curl_setopt($ch, CURLOPT_URL, $url);
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
	// Note that the referer *must* be set.
	curl_setopt($ch, CURLOPT_REFERER, $referer);
	$body = curl_exec($ch);
	curl_close($ch);
	
	if ( $body === false )
		return null; // the HTTP request itself failed
	
	// Decode and return the response (json_decode() returns NULL on malformed JSON).
	return json_decode($body);
}

$rez = google_search_api(array(
	'q' => 'antique shoes',
));

print_r($rez);

That’s it for the programming part.

So should we really throw away our lovingly crafted SERP scrapers and embrace the “official” API? Perhaps not. There are some peculiar things I’ve noticed after trying out the new API.

The Good

Let’s start with the positive aspects. First, it looks like you can indeed safely use the API without getting blocked – I successfully ran about 1800 API queries in ~2 hours. Due to my crappy connection I was unable to test how it would behave if you turn it up to eleven and send hundreds of requests per second, but the rate limiter is definitely more lenient on API users than on plain SERP scrapers. This is a major plus for people who don’t like throttling their software to one request per minute or hunting for working proxies to get around bans.

The API also makes it easy to parse the results. All queries return JSON-encoded data, so you just json_decode() it and go. No need to invent complicated regexps that must be rewritten every time Google changes the HTML structure of the search results page.
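To illustrate, here’s what decoding a response looks like – the field names below follow the API’s JSON format, but this particular response body is a trimmed, invented sample, not real output:

```php
// A trimmed, invented sample of the JSON structure the API returns.
$body = '{"responseData":{"results":[{"unescapedUrl":"http://example.com/","titleNoFormatting":"Example Site"}],"cursor":{"estimatedResultCount":"42"}},"responseStatus":200}';

$rez = json_decode($body);

// Check the status code before touching the payload.
if ( $rez !== null && $rez->responseStatus == 200 ) {
	foreach ($rez->responseData->results as $result) {
		// prints: Example Site - http://example.com/
		echo $result->titleNoFormatting, ' - ', $result->unescapedUrl, "\n";
	}
}
```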

The Bad

Of course, with a cliché megacorporation like Google it’s never all fun and games. You can only get 8 search results at a time, and no more than 64 results in total for any particular keyword. Whether this is a problem depends on what you intend to do with the API, but it’s certainly an unpleasant limitation.
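If you do want to squeeze out the full 64 results, you have to page through them. Here’s a hypothetical helper that does so with the rsz and start arguments, reusing the google_search_api() function from above (the 8-per-page and 64-total figures come from the limits just described):

```php
// Hypothetical helper: collect up to the 64-result cap, 8 at a time.
// Reuses google_search_api() defined earlier in this post.
function google_search_all($query) {
	$all = array();
	for ($start = 0; $start < 64; $start += 8) {
		$rez = google_search_api(array(
			'q'     => $query,
			'rsz'   => 'large', // 8 results per request instead of the default 4
			'start' => $start,  // offset into the result set
		));
		if ( empty($rez->responseData->results) )
			break; // fewer results than the cap, or an error – stop paging
		$all = array_merge($all, $rez->responseData->results);
	}
	return $all;
}
```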

The really peculiar – nay, insidious – thing is how the search results returned by the API differ from normal SERPs. A site that is #10 in a normal Google search may suddenly turn up as #1 in the API results. The typical #5 result may be moved to the second page. Basically, the API results look like they’ve been shuffled around a bit – the same URLs are returned but in slightly different order. Also, the “estimated result count” provided by the API is consistently much lower than what a normal search shows. All this makes the API useless for rank checking and similar SEO applications.

According to my tests, you can’t just write off these discrepancies as a side effect of geo-targeting.

It Depends

Overall, the API is either great or it kind-of sucks, depending on what you want to do with it.

At the risk of sounding like a conspiracy theorist, I must say the API seems to be cleverly engineered to be useful for “normal” purposes and somewhat useless for SEO. After all, only SEO workers really need accurate ranking data and more than 64 results per keyword phrase. Typical search engine users rarely move beyond the first page of results, so the limitations don’t hurt them. The various mashup makers that cater to the common user are also unaffected. It’s only the SEOs (and the rare academic researcher) that would be dissatisfied with the imposed constraints.

Of course, I’m sure you can still imagine a few interesting uses for the API 😉


86 Responses to “Get Google Search Results With PHP – Google AJAX API And The SEO Perspective”

  1. White Shadow says:

    The function returns an object that contains the search results and all kinds of additional information. If you want just the URLs, you could get them by iterating over the $rez->responseData->results array and grabbing the unescapedUrl field from each result. Like this :

    $rez = google_search_api( array('q' => 'antique shoes') );
    foreach ($rez->responseData->results as $result){
    	echo $result->unescapedUrl, "\n";
    }
  2. Nilay says:

    That was fast… and accurate! Thanks a lot

  3. Nilay says:

    It seems that the results are always from a country-specific domain, the default being ‘us’. Is it possible to get results from google.com instead of google.country ?

  4. White Shadow says:

    It doesn’t appear to be possible, at least not when querying the API from PHP. See the gl argument in the API docs.

  5. Nilay says:

    The example given in the gl argument states “google.loader.GoogleLocale = ‘www.google.com’;”. Does this mean that there is a way to override the domain, and can it be used in your script ? I tried adding this verbatim but doesn’t help. I appreciate your time.

  6. White Shadow says:

    As I said, I don’t see any way to get non-locale-specific results via this particular API. It seems you can set the locale to other countries (e.g. gl=uk for United Kingdom), but not turn it off – it would default to “us”.
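    For what it’s worth, setting the locale through the function from this post would look like this (a sketch; ‘uk’ as discussed above):

    ```php
    // Sketch: request UK-locale results via the gl argument.
    // There's no documented way to turn the locale off entirely.
    $rez = google_search_api(array('q' => 'antique shoes', 'gl' => 'uk'));
    ```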

  7. neel says:

    Nice article. I will definitely try this on a website where I want to display the website rankings.

  8. Nilay says:

    Hello. Two more last things
    – What would be ideal to write as ‘referer’ – It seems that anything works, even the ‘localhost’ as mentioned in your script. Wanted to know if it will have any effect.
    – The object ‘rez’ prints info about the url like title, meta description, but not meta keywords. Is there any way so that even the meta keywords are fetched ?

    Thanks

  9. White Shadow says:

    I don’t think “referer” is very significant, it just needs to be set to something valid. And no, the API doesn’t return meta keywords. If I remember correctly, the returned description is also not guaranteed to be the meta description – it could be an automatically generated page excerpt instead.

  10. […] ø Get Google Search Results With PHP – Google AJAX API And The SEO Perspective | W-Shadow.com ø (tags: todo google screenscraping api) […]

  11. mans says:

    Hey!
    Thank you for this explanation. The JS API is quite powerful, but the arguments that can be used in queries from other languages are a bit limited.
    After reading the documentation from Google I cannot see how I could apply a site restriction from PHP.
    Do you know how I could make a query with a site restriction from PHP? I want to find term A in z.com, y.com and x.com…
    Thanks in advance!

  12. White Shadow says:

    For a single site, you could simply add ” site:example.com ” to your search query. I don’t know about restricting the query to multiple sites; AFAIK that’s only possible with custom search engines.
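    A minimal sketch of that single-site case, reusing the google_search_api() function from the post (example.com is a placeholder domain):

    ```php
    // The site: operator travels inside the regular "q" argument.
    $rez = google_search_api(array('q' => 'term A site:example.com'));
    ```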

  13. saeed says:

    Hi
    thanks for your script, I’m currently using it but the output is just a bunch of assorted text. please help.
    for more info see the page on my website @

    link:
    http://saeed-x.co.cc/qq.php


  14. White Shadow says:

    That is the expected result. That “bunch of text” actually represents the PHP object that contains the search results. To format the output more readably, add “echo '<pre>';” somewhere above the “print_r($rez)” line.

    Long story short, the $rez->responseData->results array is probably what you’re after – it contains the actual results. Each array item is an object, with several fields like “title”, “url”, “content” (and so on) that describe the search result.

  15. Ciaran says:

    Well done, very useful article. Thank you!

  16. saeed says:

    hello,
    I have been using your script for a few months now and it’s been really helpful, but since a few days ago the script has stopped working – no results are returned, just a white page, see below:

    http://www.saeedx.danagig.ir/nn.php

    any help is greatly appreciated.

  17. White Shadow says:

    Still works fine here.

    Why don’t you try adding some “echo” statements in strategic places to see where the script fails? That would probably be more helpful for debugging than a blank page.

  18. saeed says:

    thank you very much for your answer,

    I checked again, and eventually found out that it was Google banning my IP address from its service. I run a website with an average of 20,000 visitors per day, and I think this (the large number of requests per day) is causing the ban. I also added the userip and referer arguments, but now I get temporary bans instead. Google’s documentation in this regard is not really helpful. I’d really like to hear your opinion on this.

    thank you very much.

  19. White Shadow says:

    As far as I know, the only semi-reliable way to avoid the bans and still be able to send a large number of requests is to use lots and lots of proxies.
