views: 349
answers: 2
I'm looking to find an API/program/interface to get the following information.

  1. a term's overall popularity - a la Google Trends
  2. where a website ranks for said term(s) - a la googlesearchpositionfinder - and how many results the term(s) return in a standard Google search, e.g. searching for foobar shows Urban Dictionary at position 5 of 9,000,000

I would like to see how many times a particular search term was used, its weekly/monthly/yearly popularity breakdown, and where a particular page ranks for it.

I've found googlesearchpositionfinder.com and google.com/trends, but I have 5000 terms, so searching by hand is not happening. I've also found www.juiceanalytics.com/openjuice/programmatic-google-trends-api, but it doesn't let me do a breakdown over a period of 2 years.

Basically, I'm trying to build a ranking of search phrases, the weeks (periods) in which they were most popular, and how a particular site (e.g. Urban Dictionary) showed up in Google's search rankings for those terms. See above (1-2).

Also, this doesn't have to be in Python; that's just what I've found to build with...

Latest edit: Both answers below helped.

I ended up using curl against Google directly and then parsing the results with a C# program.
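For anyone taking the same route, once you have the result URLs scraped out, the rank check reduces to finding the first result that belongs to the target domain. A minimal sketch in Python (the asker's parsing was done in C#; the result list below is hypothetical sample data, not real Google output):

```python
from urllib.parse import urlparse

def find_rank(result_urls, domain):
    """Return the 1-based position of the first result whose host
    is `domain` or a subdomain of it, or None if the site is absent."""
    for position, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc.lower()
        if host == domain or host.endswith('.' + domain):
            return position
    return None

# Hypothetical scraped results for the query "foobar"
results = [
    'http://en.wikipedia.org/wiki/Foobar',
    'http://www.catb.org/jargon/html/F/foobar.html',
    'http://www.urbandictionary.com/define.php?term=foobar',
]
print(find_rank(results, 'urbandictionary.com'))  # -> 3
```

Matching on the parsed host rather than the raw URL string avoids false positives when a domain name merely appears in another result's path or query string.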

+2  A: 

Google Trends does not allow a search over a range of two years, only one year at a time.
Using pyGTrends.py you could do:

from pyGTrends import pyGTrends
from csv import DictReader

r = pyGTrends(username, password)
r.download_report(('stackoverflow',), date='2009')
export1 = DictReader(r.csv().split('\n'))
r.download_report(('stackoverflow',), date='2010')
export2 = DictReader(r.csv().split('\n'))

Then you could join export1 and export2 to suit your needs.
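Joining the two yearly exports can be as simple as concatenating the parsed rows. A sketch with inline sample data (the real Trends export has more header sections and columns than this simplified CSV assumes):

```python
import csv
import io
from itertools import chain

# Simplified stand-ins for the two yearly downloads
csv_2009 = "Week,stackoverflow\n2009-01-04,45\n2009-01-11,48\n"
csv_2010 = "Week,stackoverflow\n2010-01-03,61\n2010-01-10,63\n"

export1 = csv.DictReader(io.StringIO(csv_2009))
export2 = csv.DictReader(io.StringIO(csv_2010))

# Both readers share the same header, so chaining them yields
# one continuous sequence of rows across the two years.
combined = list(chain(export1, export2))
print(len(combined))  # -> 4
```

Because each DictReader maps column names to values, the chained rows stay self-describing and can be sorted or filtered by the 'Week' key afterwards.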

Or, even better:

You could download the report without a date filter and do the dirty work in Python.
Have a look at the following script and adjust date_MIN/date_MAX as needed.

from pyGTrends import pyGTrends
import csv
import datetime

date_MIN = datetime.datetime.strptime('2007/01/01', "%Y/%m/%d")
date_MAX = datetime.datetime.strptime('2009/03/01', "%Y/%m/%d")

r = pyGTrends('username', 'password')
r.download_report(('stackoverflow',))
csv_reader = csv.reader(r.csv().split('\n'))
with open('gtrends_report.csv', 'w') as csv_out:
    csv_writer = csv.writer(csv_out)
    for count, row in enumerate(csv_reader):
        if count == 0:
            # keep the header row as-is
            csv_writer.writerow(row)
        else:
            date = datetime.datetime.strptime(row[0], "%b %d %Y")
            if date_MIN <= date <= date_MAX:
                csv_writer.writerow(row)
systempuntoout
Thanks, at least we're on the same page. I've been doing that, but I want finer granularity and am looking to see if there is a way to do a date range. Right now I'm forced to join a bunch of months together. I've also been looking at Insights, as it looks similar to Google Trends with more control... but I haven't gotten it to spit out the right information yet...
Drakkhen
@Drakkhen Updated, works for me.
systempuntoout
Thanks for that, I've used it and gotten a bit further, but I'm still trying to get the first part, which is how close to the top of Google's search results a page appears.
Drakkhen
@Drakkhen your request is not clear; if you want help, please update your question with more details.
systempuntoout
The detail was there, I just made it clearer. Thank you though!
Drakkhen
A: 

I had the same problem, and I just wrote a small class for checking the ranking via the Google AJAX Search API; you can download it here:

http://bohuco.net/blog/2010/07/google-ranking-checker-class-in-php/
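The same idea carries over to Python: the AJAX Search API (since retired by Google) returned JSON, and the rank extraction can be sketched without the network call. The response layout below is a simplified assumption, not the full API schema:

```python
def rank_from_ajax_response(response, domain):
    """Scan the `results` list of an AJAX-Search-API-style response
    and return the 1-based rank of the first hit on `domain`."""
    for position, hit in enumerate(response.get('results', []), start=1):
        if domain in hit.get('visibleUrl', ''):
            return position
    return None

# Simplified sample of the JSON the API returned for a query
sample = {'results': [
    {'visibleUrl': 'en.wikipedia.org'},
    {'visibleUrl': 'www.urbandictionary.com'},
]}
print(rank_from_ajax_response(sample, 'urbandictionary.com'))  # -> 2
```

Keeping the response parsing in a pure function like this makes it easy to test against saved JSON fixtures before wiring up the actual HTTP request.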

DerFichtl