Over the past few months Google keyword data has been getting harder and harder to acquire.

For anyone looking to do keyword research, I'd always recommended Google Keyword Planner over any third-party tool, primarily because it was free and the paid tools previously didn't offer much on top of what was already available.

How times have changed!

With Google now limiting the amount of data it provides, keyword research is close to being 100% reliant on third-party tools that (for the time being) still have access.

At MRE we don't do keyword research in half measures. For many clients we've used keyword research not only as a means of guiding SEO strategy but also as a business intelligence tool; for one client, we were able to dispel a long-held belief about the peak periods of their business year, leading them to change their entire marketing strategy, not just at a search level.

To get hold of this bulk data we'd previously relied on a tool by Alec Bertram, VolumeAPI: a cloud-based system that was extremely easy to use, affordable, and an efficient means of accessing the AdWords API.

When we found out keyword data was gradually becoming heavily restricted, we (and, I imagine, the entire SEO community) were quite worried. Only a few weeks later, it turned out that VolumeAPI was closing due to the same difficulties faced by the wider community. We then spent a panicked few days looking for the next best way to access search volumes, testing a few different APIs on the market. SEMrush came out on top by a mile for speed and reliability.

The only downside was that there didn't seem to be much documentation to help novice API users (I group myself into this category). After a few fraught hours of trial and error, we were able to put together a Python script to access the API and carry out bulk requests.

As of this moment it's pretty basic, but it can make light work of tens of thousands of keyword requests, usually taking only a few hours to run through them all (we would, however, recommend keeping an eye on your available API units – you can burn through these quite quickly!).
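On that note, SEMrush provides a simple endpoint for counting your remaining API units. The URL below is taken from their API documentation at the time of writing, so it's worth double-checking it's still current before relying on it. A minimal sketch:

import requests

# Check your remaining SEMrush API unit balance before kicking off a big batch.
# NOTE: this countapiunits URL is from SEMrush's API docs at the time of writing -
# verify it against the current documentation before relying on it.
API_key = ""  # your SEMrush API key
balance = requests.get("http://www.semrush.com/users/countapiunits.html?key=" + API_key).text
print "API units remaining: " + balance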

It’s been a real time saver for us, allowing us to continue to carry out granular search analysis for our clients. Here at MRE we’re firm believers in supporting the community, so we’ve made our script publicly available – hopefully the team at SEMrush can share this with the rest of their users and potentially help us with a few improvements!

Here’s the script and a few steps to get it running; we’d recommend some basic knowledge of Python and the command line:

  1. Install Python 2.7 (see this tutorial for a detailed installation).
  2. Create a folder for the script. E.g. ‘semrush bulk api’.
  3. Copy the script into your favoured text editor (we use Sublime), then enter your API key and the path to your queries file into the script.
  4. Save as “SEMrush_bulk_SV.py” into your folder.
  5. Save your chosen keywords into a text file “queries.txt” (this must be saved within the folder).
  6. Open up command prompt, navigate to the ‘semrush bulk api’ folder and then run the script – “python SEMrush_bulk_SV.py”
  7. The text file ‘SEMRush.txt’ will be created in the folder and the keyword data will be added in. The output is comma-separated, so it opens neatly in Excel as a CSV (see the example below).
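
To illustrate the two file formats, here’s a hypothetical run – every number below is made up purely for illustration:

queries.txt:

mortgage calculator
remortgage deals
first time buyer mortgage

SEMRush.txt after the script finishes:

Keyword,Search Volume,CPC,Competition,No. of Results
mortgage calculator,246000,2.15,0.24,11400000
remortgage deals,1900,6.80,0.85,2300000
first time buyer mortgage,4400,5.10,0.77,5100000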

The script:



import requests

############### ENTER YOUR DETAILS ###############

# Enter the path of your queries text file below. Make sure you include the entire path including the holding drive e.g. C:\
queries_file = "queries.txt"

# Enter your API key below, you can find this here - https://www.semrush.com/api-use/
API_key = ""

############### ENTER YOUR DETAILS ###############

# Adds queries from the text file into a list
queries = [line.rstrip('\n') for line in open(queries_file)]

# Creates the end file
the_file = open("SEMRush.txt", "w")

# Writes the column headers into the file
the_file.write("Keyword,Search Volume,CPC,Competition,No. of Results\n")

# Loops through all of the queries
for query in queries:

	# Creates a list to collate the SEMRush data for the individual query
	the_list = [query]

	# Build and make the API request. Passing the parameters as a dict lets
	# requests handle the URL encoding of the phrase (spaces, special characters).
	# "database" selects the regional database - change "uk" if you need another country.
	params = {
		"type": "phrase_this",
		"key": API_key,
		"export_columns": "Ph,Nq,Cp,Co,Nr",
		"phrase": query,
		"database": "uk",
	}
	result = requests.get("http://api.semrush.com/", params=params).text.split(';')

	# Add the SEMRush data to the list. The response is a two-line,
	# semicolon-separated report (a header row then a value row), so after
	# splitting, indices 5-8 hold the volume, CPC, competition and result count.
	try:
		the_list.append(result[5].strip())
		the_list.append(result[6].strip())
		the_list.append(result[7].strip())
		the_list.append(result[8].strip())

	# If the request fails, or there is no search volume, use "ERROR"
	except IndexError:
		the_list.append("ERROR")

	# Write the individual query's row to the file
	the_file.write(",".join(the_list) + "\n")

	print the_list

# Close the file
the_file.close()
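
If you’d like to sanity-check your API key before kicking off a full batch, you can paste a single request straight into your browser. A minimal example (the phrase here is just a placeholder – swap in your own):

http://api.semrush.com/?type=phrase_this&key=YOUR_API_KEY&export_columns=Ph,Nq,Cp,Co,Nr&phrase=mortgage+calculator&database=uk

You should get back a two-line, semicolon-separated response (a header row followed by the values), which is exactly what the script above splits apart.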


