I used to dream about a “minion” that would give me an overview of client KPIs every morning. Thanks to Python, I now have autogenerated KPI reports that are sent through Slack every morning. I just need to sit back and sip my coffee. Alfredo (that’s the name I gave my Slack bot) takes care of the rest. Here’s how I did it and how you can too.
The way the Slack Bot works is simple. Once you call it through a dedicated Slack channel, a script will run in the background, fetching the data from Google Search Console and Google Analytics and crunching it to come up with a small report of the pages that increased AND decreased the most in a specified time range. Here’s a more visual representation of the process:
- Slack Request
- Gather data via API
- Compare the data between 2 date ranges.
- Send Report with TOP increasing & TOP dropping pages. (It can be clicks, Sign Ups, Leads…)

In this article I’ll show you how to create a similar Slack Bot step by step. We’ll start with the Python script that gathers the data and then move on to the specifics of connecting everything together.
1. Setting up a Cloud Console project
If you’ve ever used a Google API, chances are you already know how to set up a Google Cloud project, so feel free to skip this part.
If you are new to this, I suggest you take a look at the following article from JC Chouinard. IMO one of the best articles to quickly set up a project. Otherwise, here’s how you can do it:
1. Create a Google Cloud project.
2. Enable the Google Search Console and Google Analytics Reporting APIs.
3. Create a service account and download its API keys.
4. Give the service account permission to view/read data in both your Search Console and Google Analytics accounts. Each service account has an email address attached to it.
ALTERNATIVELY, and in case you don’t want to add the service account to each of your Search Console accounts, you can use delegated credentials. You just need to grant the service account domain-wide authority, and the clients won’t need to do anything. Here are the steps to do it:
- From your Google Workspace domain’s Admin console, go to Main menu > Security > API Controls.
- In the Domain wide delegation pane, select Manage Domain Wide Delegation.
- Click Add new.
- In the Client ID field, enter the service account’s Client ID. (You can find it under service accounts in GCP)
- In the OAuth scopes (comma-delimited) field, enter the list of scopes that your application should be granted access to.
- You need to check the box that says “Overwrite existing client ID”
- Click Authorize.
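For this project, the comma-delimited scope list you paste would typically be the read-only Search Console and Analytics scopes (verify these against the APIs you enabled in your own project):

```
https://www.googleapis.com/auth/webmasters.readonly,https://www.googleapis.com/auth/analytics.readonly
```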
2. Getting Search Console Data
Now onto the fun part! We will create some basic scripts for demonstration purposes, to show you how to start gathering data and manipulating it.
We’ll define a function to authenticate with our credentials and store the resulting service object in a variable.
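The helper itself isn’t reproduced in this excerpt, so here’s a minimal sketch of what it could look like, keeping the `initialize__delegated_google_reporting` name used later in the article. It assumes the `google-auth` and `google-api-python-client` packages; the scope and API-version mappings are my assumptions:

```python
# Assumed mapping from the service names used in this article to Google
# API scopes and discovery names. Adjust to your own project.
SCOPES = {
    "gsc_service": ["https://www.googleapis.com/auth/webmasters.readonly"],
    "ga_service": ["https://www.googleapis.com/auth/analytics.readonly"],
}
API_VERSIONS = {
    "gsc_service": ("searchconsole", "v1"),
    "ga_service": ("analyticsreporting", "v4"),
}

def initialize__delegated_google_reporting(service_name, credentials_file, subject=None):
    """Build an authenticated Google API client from a service account key
    file, optionally impersonating `subject` via domain-wide delegation."""
    # Local imports so the config above can be inspected without the Google
    # client libraries installed (pip install google-api-python-client google-auth).
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    credentials = service_account.Credentials.from_service_account_file(
        credentials_file, scopes=SCOPES[service_name]
    )
    if subject:
        # Domain-wide delegation: act on behalf of this Workspace user.
        credentials = credentials.with_subject(subject)
    api_name, version = API_VERSIONS[service_name]
    return build(api_name, version, credentials=credentials)
```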
Now we can define some functions to get search console data and compare it. Of course if you already have a report in mind that you’d like to send via Slack you can skip this section. This is just for demonstration purposes.
The following function takes the service object that we created, the site URL, and a start and end date, and fetches Search Console data for that period. As you can see, you can pass as many parameters as you want in the request, so I suggest you take a look at the Search Console API documentation in case you want to build a more advanced query.
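The function body isn’t included in this excerpt, so here’s a hedged sketch of what it could look like. `search_analytics_to_df` is a hypothetical helper I’ve split out so the response parsing can be seen (and tested) in isolation:

```python
import pandas as pd

def get_search_console_report(service, start_date, end_date, site_url,
                              dimensions=("page",), row_limit=25000):
    """Query the Search Analytics endpoint for a date range."""
    body = {
        "startDate": start_date,   # "YYYY-MM-DD"
        "endDate": end_date,
        "dimensions": list(dimensions),
        "rowLimit": row_limit,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return search_analytics_to_df(response, dimensions)

def search_analytics_to_df(response, dimensions=("page",)):
    """Flatten the API response rows into a tidy DataFrame."""
    records = []
    for row in response.get("rows", []):
        # Each row carries its dimension values in "keys", in request order.
        record = dict(zip(dimensions, row["keys"]))
        record.update(
            clicks=row["clicks"],
            impressions=row["impressions"],
            ctr=row["ctr"],
            position=row["position"],
        )
        records.append(record)
    return pd.DataFrame(records)
```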
This function will return the following Data Frame.

So we are now able to get Search Console data via API! 🎉
3. Getting Google Analytics Data
Now we will create a function to get data from the Google Analytics Reporting API. I added some extra “dimensionFilterClauses” for the sake of demonstrating how those work, but feel free to get rid of those in your report.
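Again, the original function isn’t shown in this excerpt; here’s a sketch of what it could look like against the Reporting API v4. The `dimensionFilterClauses` values (`/blog/`) are purely illustrative, and `ga_response_to_df` is a hypothetical helper for the parsing step:

```python
import pandas as pd

def get_analytics_report(service, view_id, start_date, end_date, metrics, dimensions):
    """Run a single GA Reporting API v4 request and return a DataFrame."""
    body = {
        "reportRequests": [{
            "viewId": view_id,
            "dateRanges": [{"startDate": start_date, "endDate": end_date}],
            "metrics": [{"expression": m} for m in metrics],
            "dimensions": [{"name": d} for d in dimensions],
            # Example filter only -- remove this clause if you don't need it.
            "dimensionFilterClauses": [{
                "filters": [{
                    "dimensionName": "ga:landingPagePath",
                    "operator": "BEGINS_WITH",
                    "expressions": ["/blog/"],
                }]
            }],
        }]
    }
    response = service.reports().batchGet(body=body).execute()
    return ga_response_to_df(response)

def ga_response_to_df(response):
    """Flatten a v4 response into one row per dimension combination."""
    report = response["reports"][0]
    header = report["columnHeader"]
    dim_names = [d.replace("ga:", "") for d in header.get("dimensions", [])]
    metric_names = [m["name"].replace("ga:", "")
                    for m in header["metricHeader"]["metricHeaderEntries"]]
    records = []
    for row in report.get("data", {}).get("rows", []):
        record = dict(zip(dim_names, row["dimensions"]))
        record.update(zip(metric_names,
                          (float(v) for v in row["metrics"][0]["values"])))
        records.append(record)
    return pd.DataFrame(records)
```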
You can pass in the following variables (for illustration purposes):
ga_view_id="your-view-id"
ga_metrics=['ga:sessions', 'ga:newUsers', 'ga:avgTimeOnPage', 'ga:goal18Completions', 'ga:goal18ConversionRate']
ga_dimensions=['ga:landingPagePath']
ga_service=initialize__delegated_google_reporting('ga_service',credentials_file_location)
get_analytics_report(ga_service,ga_view_id,main_start,main_end,ga_metrics,ga_dimensions)
The output should be something like this:

To be honest, I’m not a huge expert on the Google Analytics API, but here are some resources that should help you query the data you need:
Google Analytics Metrics Explorer
JC Chouinard Guide on Google Analytics Reporting API
Now that we know how to retrieve a Data Frame both from Search Console and Google Analytics, we will create a “summary” Data Frame that compares the data between 2 different date ranges. We can do that using the following function.
- df_last is the Data Frame with data from the most recent period (for example, last 30 days) and df_previous is the Data Frame with data from the previous period (previous 30 days). Both data frames follow the same structure.
- kpi_metrics: This is a list of the metrics that we want to “compare”. In the case of Search Console we normally pass in ["clicks","impressions"], because those are the KPIs that we want to monitor.
- join: This is a little dirty, I know. But basically you need to specify which column should be used as the join key. For Search Console Data Frames the value of join will be "page", and when the Data Frames contain Google Analytics data the value of join will be "landingPagePath".
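The function itself isn’t reproduced in this excerpt; here’s a minimal sketch of what `create_comparison` could look like, using an outer pandas merge (the `_previous` suffix and the zero-division guard are my assumptions):

```python
import pandas as pd

def create_comparison(df_last, df_previous, kpi_metrics, join):
    """Merge two period DataFrames on `join` and add absolute and relative
    change columns for every KPI in `kpi_metrics`."""
    merged = df_last.merge(df_previous, on=join, how="outer",
                           suffixes=("", "_previous")).fillna(0)
    for kpi in kpi_metrics:
        merged[f"{kpi}_abs_diff"] = merged[kpi] - merged[f"{kpi}_previous"]
        # Guard against division by zero for pages new in this period.
        previous = merged[f"{kpi}_previous"].replace(0, float("nan"))
        merged[f"{kpi}_rel_diff"] = merged[f"{kpi}_abs_diff"] / previous
    return merged
```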
Here’s how we execute the function on Data Frames that contain Search Console data:
from datetime import datetime, timedelta

## Define some Global Variables
gsc_kpis=["impressions","clicks"]
window=30 #comparison window in days
start=datetime.today()-timedelta(days=window+2)
main_start=start.strftime("%Y-%m-%d")
end=datetime.today()-timedelta(days=3)
main_end=end.strftime("%Y-%m-%d")
previous_start=(start-timedelta(window)).strftime("%Y-%m-%d")
previous_end=(end-timedelta(window)).strftime("%Y-%m-%d")
url="your-url.com"
credentials_file_location="your-creds.json"

## Create the Auth Object
gsc_service=initialize__delegated_google_reporting("gsc_service",credentials_file_location)

## Get the Reports
df_last=get_search_console_report(gsc_service,main_start,main_end,url)
df_previous=get_search_console_report(gsc_service,previous_start,previous_end,url)

## Create the Comparison
create_comparison(df_last,df_previous,gsc_kpis,join="page")
The output is something like this:

As you can see, the function merges both Data Frames, comparing the kpi_metrics we have specified. So now we get the following calculated columns:
- impressions_abs_diff
- impressions_rel_diff
- clicks_abs_diff
- clicks_rel_diff
We can also execute this function on GA Data. Let’s try:
## Define Global Variables
ga_view_id="your-view-id"
ga_metrics=['ga:sessions', 'ga:newUsers', 'ga:avgTimeOnPage', 'ga:goal18Completions', 'ga:goal18ConversionRate']
ga_dimensions=['ga:landingPagePath']
ga_kpis=['newUsers', 'goal18Completions', 'goal18ConversionRate']

## Create the Data Frames
ga_last=get_analytics_report(ga_service,ga_view_id,main_start,main_end,ga_metrics,ga_dimensions)
ga_previous=get_analytics_report(ga_service,ga_view_id,previous_start,previous_end,ga_metrics,ga_dimensions)

## Create the Comparison
create_comparison(ga_last,ga_previous,ga_kpis,join="landingPagePath")
The output is something like this:

Again, in this case we specified the following KPIs:
['newUsers', 'goal18Completions', 'goal18ConversionRate']
So the script will create comparison columns (both relative and absolute change) for each of those KPIs.
4.1 Create the Slack Application
Creating a Slack application is really easy. You just need to follow these steps:
1. Sign in to Slack and then navigate to the Slack API site.
2. Click on “Create an App” -> From Scratch.
3. Set permissions for your app: go to the “OAuth & Permissions” section of the sidebar and add the scopes your bot needs.
4. After that, you can add the application to your workspace by clicking on “Install to Workspace”.
5. Once you’ve done that, a bot user OAuth token will be generated. You need to copy that into your Python script.
6. Go to the app in your installed Slack apps and add it to a channel. Then copy the ID of the channel where the app is added.


4.2. Creating the code and Sending it via Slack
Alright so let’s recap a bit.
- We have created a project in Google Cloud Console to connect to Search Console and Analytics APIs.
- We’ve got 2 different sets of Data Frames from Search Console and Google Analytics, and created a function to compare them.
- We now have a Data Frame that has a comparison of the data.
What we want now is to extract the highlights from that Data Frame and turn them into a nicely written report that is sent through Slack. The report should look like this:

Basically, what the code does is:
- Create a “comparison data frame” with data from Search Console OR Google Analytics.
- Extract the highlights of that Data Frame with .nlargest or .nsmallest
- Create a written report
- Send it via Slack
Here’s the final code:
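The full gist isn’t reproduced in this excerpt, so here’s a hedged sketch of the two remaining pieces: extracting highlights with `.nlargest`/`.nsmallest` and posting the result with `slack_sdk`. The function names, the bullet formatting, and `top_n` are my assumptions:

```python
import pandas as pd

def build_report_text(comparison_df, kpi, join, top_n=5):
    """Turn a comparison DataFrame into a short, Slack-ready text report."""
    winners = comparison_df.nlargest(top_n, f"{kpi}_abs_diff")
    losers = comparison_df.nsmallest(top_n, f"{kpi}_abs_diff")
    lines = [f"*Top {top_n} pages by {kpi} growth*"]
    for _, row in winners.iterrows():
        lines.append(f"• {row[join]}: {row[f'{kpi}_abs_diff']:+.0f} {kpi}")
    lines.append(f"*Top {top_n} pages by {kpi} decline*")
    for _, row in losers.iterrows():
        lines.append(f"• {row[join]}: {row[f'{kpi}_abs_diff']:+.0f} {kpi}")
    return "\n".join(lines)

def send_slack_report(text, channel_id, bot_token):
    """Post the report to the channel the bot was added to.
    Requires the chat:write scope on the bot token."""
    from slack_sdk import WebClient  # local import: only needed when sending
    client = WebClient(token=bot_token)
    client.chat_postMessage(channel=channel_id, text=text)
```

You would call it as `send_slack_report(build_report_text(df, "clicks", "page"), channel_id, bot_token)` after creating the comparison Data Frame.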
We are just pulling data from 2 different sources at the moment, but once you’ve set up the infrastructure for this, the possibilities are limitless. Here are some more ideas I have for future versions of the tool:
- Nightwatch API / other rank trackers: Moving forward, my plan is to connect to the Nightwatch API, which is the rank-tracking tool I use to monitor visibility. This way I should be able to combine visibility data with clicks to get possible answers when something changes.
- Google Indexing API
- HubSpot API.
I still haven’t had the time to write PART 2 of this series, but once you’ve got this script, you’ll probably want to host it in the cloud. I’m preparing another article in which I’ll explain how I fully automated this by hosting the script in the cloud, and how I can call it via Slack so it gives me a specific report.