🚀 Executive Summary

TL;DR: This guide provides a Python script to automate the tedious process of manually organizing Google Meet transcripts. It scans Google Drive for new transcripts and automatically creates structured Notion pages, significantly improving productivity and knowledge management.

🎯 Key Takeaways

  • The automation leverages Python with `notion-client` and `google-api-python-client` to integrate Google Drive and Notion.
  • Notion setup requires creating an integration with ‘Read, Update, Insert’ capabilities, a database with ‘Meeting Title’, ‘Meeting Date’, and ‘Transcript Link’ properties, and obtaining an ‘Internal Integration Token’ and ‘Database ID’.
  • Google Drive API access is secured via a GCP Service Account with ‘Viewer’ permissions, enabling the Drive API, generating a `credentials.json` key, and explicitly sharing the target Google Drive folder with the service account’s email.
  • The Python script queries Google Drive for new Google Docs containing ‘Meeting transcript’ in a specified folder, created within the last 24 hours, and includes a check to prevent duplicate Notion page creation.
  • Scheduling the script can be achieved using cron jobs on Linux or Task Scheduler on Windows, with serverless functions like AWS Lambda or Google Cloud Functions recommended for production environments.

Automate Meeting Notes: Google Meet Transcript to Notion Page

Hey there, Darian Vance here. As a Senior DevOps Engineer at TechResolve, I’m constantly looking for ways to claw back time from repetitive tasks. One of my biggest time sinks used to be managing meeting follow-ups. I’d have a dozen Google Meet transcripts scattered in a Drive folder, and I’d spend a solid hour or two every week manually organizing them into our Notion knowledge base. It was tedious, and I knew there had to be a better way.

That’s when I built this little Python script. It automatically scans my Google Drive for new meeting transcripts and creates a neat, organized page for each one in a Notion database. It’s been a game-changer for my team’s productivity and our shared knowledge. Today, I’m going to walk you through how to set it up yourself. Let’s get that time back.

Prerequisites

Before we dive in, make sure you have the following ready to go:

  • A Notion Account & Workspace: You’ll need permissions to create a new integration and a database.
  • A Google Workspace Account: This is required for Google Meet’s automatic transcript generation feature. The transcripts must be saved to a specific Google Drive folder.
  • Python 3 Environment: You should have Python 3 installed on the machine where you plan to run this script.
  • Google Cloud Platform (GCP) Project: We’ll need this to get API credentials for Google Drive. If you don’t have one, setting one up is free.

The Guide: Step-by-Step

Step 1: Configure Your Notion Workspace

First, we need to give our script a place to put the notes and the permissions to do so.

  1. Create a Notion Integration: Go to Notion’s integrations page. Click “New integration,” give it a name like “Google Meet Sync,” and associate it with your workspace. For capabilities, ensure it can “Read,” “Update,” and “Insert” content. Submit and copy the “Internal Integration Token”. Treat this like a password.
  2. Create the Notion Database: In your Notion workspace, create a new full-page database. I recommend the following properties to start:
    • Meeting Title (Title property, the default)
    • Meeting Date (Date property)
    • Transcript Link (URL property)
  3. Connect the Integration: Open your new database, click the three-dot menu (…) in the top-right corner, and go to “Add connections”. Search for and select the “Google Meet Sync” integration you just created.
  4. Get the Database ID: The Database ID is in the URL of your Notion database. It’s the long string of characters between your workspace name and the question mark. For example, in `notion.so/your-workspace/[DATABASE_ID]?v=…`, you need to copy that middle part.

Keep your Integration Token and Database ID handy. We’ll need them for the script.
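If you want to sanity-check the ID you copied, a small helper can pull it out of the URL for you. This is just a sketch of mine (`extract_database_id` is not part of any library); it assumes the standard Notion URL shape, where the ID is the last path segment, optionally preceded by a hyphenated page slug:

```python
def extract_database_id(url: str) -> str:
    """Return the 32-character database ID from a Notion database URL."""
    # Drop query parameters (everything from '?v=...' onward) and trailing slashes
    path = url.split('?')[0].rstrip('/')
    # The ID is the last path segment; renamed databases look like
    # 'My-Meetings-<id>', so keep only the part after the last hyphen
    return path.split('/')[-1].split('-')[-1]

print(extract_database_id(
    "https://www.notion.so/your-workspace/0123456789abcdef0123456789abcdef?v=abc123"
))  # prints 0123456789abcdef0123456789abcdef
```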

Step 2: Set Up Google Drive API Access

Next, we need to grant our script permission to read files from Google Drive. We’ll use a Service Account, which is a secure way for an application to authenticate without a user being present.

  1. Create a GCP Project: Head to the Google Cloud Console, create a new project, and give it a name.
  2. Enable the Google Drive API: In your new project, navigate to “APIs & Services” > “Library”. Search for “Google Drive API” and enable it.
  3. Create a Service Account: Go to “APIs & Services” > “Credentials”. Click “Create Credentials” and choose “Service account”. Give it a name, and for roles, grant it “Viewer” access for now—we only need to read files.
  4. Generate a Key: After creating the service account, find it in the list, click on it, go to the “KEYS” tab, click “Add Key,” and choose “Create new key.” Select JSON as the key type. A `credentials.json` file will download. Guard this file carefully!
  5. Share Your Drive Folder: Find the email address of the service account you just created (it looks like `…gserviceaccount.com`). Go to the Google Drive folder where your Meet transcripts are saved, click “Share,” and paste in this email address, giving it at least “Viewer” access.

Pro Tip: I highly recommend creating a dedicated folder in Google Drive just for the transcripts you want to sync. This prevents the script from accidentally pulling in other documents and keeps your permissions clean and specific.
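Once the folder is shared, it's worth confirming the service account can actually see it before wiring up the full script. Here's a rough helper I'd use for that (the function name and structure are mine, not part of the Google client library); it takes an already-built Drive v3 service object and the folder ID:

```python
def check_drive_access(service, folder_id):
    """List up to 5 non-trashed files visible to the account in the folder.

    `service` is a Drive v3 service object, built the same way as in the
    main script below. An empty result usually means the folder was never
    shared with the service account's email address.
    """
    resp = service.files().list(
        q=f"'{folder_id}' in parents and trashed = false",
        pageSize=5,
        fields="files(id, name)",
    ).execute()
    return [f['name'] for f in resp.get('files', [])]
```

If this returns an empty list even though the folder has files in it, go back and re-check the sharing step before debugging anything else.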

Step 3: The Python Script

Now for the fun part. I’ll skip the standard virtualenv setup since you likely have your own workflow for that. Let’s jump straight to the Python logic. First, install the required libraries: `pip install google-api-python-client google-auth-httplib2 google-auth-oauthlib notion-client python-dotenv`.

Create a file named `config.env` to store our secrets. It should look like this:


NOTION_TOKEN="your_internal_integration_token_here"
NOTION_DATABASE_ID="your_notion_database_id_here"
DRIVE_FOLDER_ID="your_google_drive_folder_id_here"

Now, create your main Python file, let’s call it `sync_notes.py`:


import os
import datetime
from dotenv import load_dotenv
from notion_client import Client
from google.oauth2 import service_account
from googleapiclient.discovery import build

# --- CONFIGURATION ---
load_dotenv('config.env')
NOTION_TOKEN = os.getenv('NOTION_TOKEN')
NOTION_DATABASE_ID = os.getenv('NOTION_DATABASE_ID')
DRIVE_FOLDER_ID = os.getenv('DRIVE_FOLDER_ID')
GOOGLE_CREDS_FILE = 'credentials.json' # Path to your downloaded service account key
SCOPES = ['https://www.googleapis.com/auth/drive.readonly']

# --- INITIALIZE CLIENTS ---
# Initialize Notion client
notion = Client(auth=NOTION_TOKEN)

# Initialize Google Drive client
def get_drive_service():
    """Authenticates and returns a Google Drive service object."""
    try:
        creds = service_account.Credentials.from_service_account_file(
            GOOGLE_CREDS_FILE, scopes=SCOPES)
        service = build('drive', 'v3', credentials=creds)
        print("Successfully connected to Google Drive API.")
        return service
    except Exception as e:
        print(f"Error connecting to Google Drive API: {e}")
        return None

# --- CORE LOGIC ---
def find_new_transcripts(service):
    """Finds Google Docs created in the last 24 hours in the specified folder."""
    if not service:
        return []
    
    # Calculate the RFC 3339 timestamp for 24 hours ago (the Drive API expects UTC with a 'Z' suffix)
    yesterday = (datetime.datetime.now(datetime.timezone.utc)
                 - datetime.timedelta(days=1)).strftime('%Y-%m-%dT%H:%M:%SZ')
    
    # In my production setups, I use a more robust query. This is a great starting point.
    query = (
        f"'{DRIVE_FOLDER_ID}' in parents and "
        f"mimeType = 'application/vnd.google-apps.document' and "
        f"name contains 'Meeting transcript' and "
        f"createdTime > '{yesterday}' and "
        "trashed = false"
    )
    
    # Page through the results so a busy day with many transcripts isn't silently truncated
    files = []
    page_token = None
    while True:
        results = service.files().list(
            q=query,
            pageSize=100,
            fields="nextPageToken, files(id, name, webViewLink, createdTime)",
            pageToken=page_token
        ).execute()
        files.extend(results.get('files', []))
        page_token = results.get('nextPageToken')
        if not page_token:
            break
    return files

def create_notion_page(file_info):
    """Creates a new page in the Notion database for a given transcript file."""
    meeting_title = file_info['name']
    meeting_date_iso = file_info['createdTime'].split('T')[0] # Extracts YYYY-MM-DD
    transcript_link = file_info['webViewLink']

    try:
        # Check if a page with this title already exists to avoid duplicates
        existing_pages = notion.databases.query(
            database_id=NOTION_DATABASE_ID,
            filter={"property": "Meeting Title", "title": {"equals": meeting_title}}
        )
        if existing_pages['results']:
            print(f"Skipping duplicate: '{meeting_title}' already exists in Notion.")
            return

        # Create the new page
        notion.pages.create(
            parent={"database_id": NOTION_DATABASE_ID},
            properties={
                "Meeting Title": {"title": [{"text": {"content": meeting_title}}]},
                "Meeting Date": {"date": {"start": meeting_date_iso}},
                "Transcript Link": {"url": transcript_link}
            }
        )
        print(f"Successfully created Notion page for: '{meeting_title}'")
    except Exception as e:
        print(f"Error creating Notion page for '{meeting_title}': {e}")

# --- MAIN EXECUTION ---
if __name__ == '__main__':
    print("Starting sync process...")
    drive_service = get_drive_service()
    if drive_service:
        transcripts = find_new_transcripts(drive_service)
        if not transcripts:
            print("No new transcripts found in the last 24 hours.")
        else:
            print(f"Found {len(transcripts)} new transcript(s). Adding to Notion...")
            for transcript in transcripts:
                create_notion_page(transcript)
    print("Sync process finished.")

Step 4: Schedule the Automation

A script is only useful if it runs automatically. In my production setups, I’d deploy this to a serverless function like AWS Lambda or Google Cloud Functions for reliability. However, for a personal or small team setup, a simple cron job is perfectly fine.

On a Linux machine, you could schedule this to run every day at 2 AM. One catch: the script loads `config.env` and `credentials.json` by relative path, and cron jobs don’t start in your script’s directory, so change into it first and capture the output in a log:

0 2 * * * cd /path/to/your && python3 sync_notes.py >> sync_notes.log 2>&1

If you’re on Windows, you can use the built-in Task Scheduler to achieve the same result.

Where I Usually Mess Up (Common Pitfalls)

  • Forgetting to Share the Folder: This is my number one mistake. I set up all the APIs correctly, but the script finds zero files because I forgot to share the Google Drive folder with the service account’s email. If your script runs but finds nothing, check your sharing settings first.
  • Incorrect Notion Database ID: Double-check that you’ve copied the full database ID from the URL and not just a page ID. The script will fail with a 404 (`object_not_found`) error if this is wrong.
  • API Scope Errors: The Google API will return a 403 Forbidden error if your service account doesn’t have the right scopes enabled or if you forgot to enable the Drive API in the GCP console. Ensure the scope in the script (`…/auth/drive.readonly`) is sufficient.

Conclusion

And that’s it. With a bit of initial setup, you’ve built a robust bridge between two essential tools. This automation doesn’t just save you time copying and pasting; it creates a reliable, up-to-date knowledge base from your team’s discussions. From here, you could extend it to use an AI API to summarize the notes, extract action items, and add them directly to the Notion page body. The possibilities are endless. Happy automating!

Darian Vance

Lead Cloud Architect & DevOps Strategist

With over 12 years in system architecture and automation, Darian specializes in simplifying complex cloud infrastructures. An advocate for open-source solutions, he founded TechResolve to provide engineers with actionable, battle-tested troubleshooting guides and robust software alternatives.


🤖 Frequently Asked Questions

❓ How do I automate Google Meet transcript uploads to Notion?

You can automate this using a Python script that leverages the Google Drive API to find new Google Meet transcripts and the Notion API to create corresponding database pages, populating them with the meeting title, date, and a link to the transcript.

❓ How does this Python script solution compare to manual methods or third-party integration tools?

This Python script offers a highly customizable, cost-effective solution, providing more control over data flow than manual copy-pasting and potentially avoiding subscription fees associated with third-party integration platforms, though it requires initial technical setup and maintenance.

❓ What is a common implementation pitfall when setting up this Google Meet to Notion automation?

A common pitfall is forgetting to share the specific Google Drive folder containing the Meet transcripts with the Google Cloud Platform service account’s email address, which prevents the script from accessing the files and results in no transcripts being found.
