🚀 Executive Summary
TL;DR: Manually syncing VS Code extensions across multiple machines is tedious and error-prone. This guide provides a scriptable, Gist-based workflow using Python and GitHub Personal Access Tokens to automate exporting and installing extensions, saving significant setup time.
🎯 Key Takeaways
- Leverage the VS Code command-line tool `code --list-extensions` to export your current extension list into a simple `extensions.txt` file.
- Utilize a GitHub Personal Access Token (PAT) with only the `gist` scope enabled to securely authenticate Python scripts for Gist creation and updates.
- Employ Python scripts with the `requests` library to interact with the GitHub Gist API and the `subprocess` module to programmatically install extensions using `code --install-extension`.
Syncing VS Code Extensions across machines using Gist
Hey there, Darian here. Look, we all have multiple machines: a work laptop, a personal desktop, maybe a cloud instance for heavy lifting. The single most frustrating part of a new setup for me was always the “death by a thousand cuts” process of remembering and reinstalling every single VS Code extension. I’d only realize one was missing right when I needed it most, completely breaking my flow. It felt like I was wasting hours every quarter just on environment setup.
That’s why I landed on this Gist-based workflow. It’s a simple, scriptable way to keep my extension list in a central, version-controlled spot. Now, setting up a new machine’s editor takes about two minutes. Let me show you how it’s done.
Prerequisites
Before we dive in, make sure you have a few things ready:
- A GitHub account.
- A GitHub Personal Access Token (PAT) with the “gist” scope enabled.
- Python 3 installed on your machines.
- VS Code installed, with its command-line tool available in your PATH.
The Guide: A Step-by-Step Breakdown
Step 1: Get Your GitHub Personal Access Token (PAT)
First, you need to give our script permission to talk to GitHub. The safest way is with a PAT.
- Navigate to your GitHub account settings, then go to Developer settings > Personal access tokens > Tokens (classic).
- Click “Generate new token”.
- Give it a descriptive name like “VSCode Sync”.
- Crucially, check the box for the `gist` scope. This is the only permission it needs.
- Click “Generate token”. Copy the token immediately and save it somewhere secure. You won’t see it again.
We’ll store this token in a configuration file. I recommend creating a file named `config.env` in your project directory. Inside that file, add these two lines:

```
GITHUB_PAT="paste_your_token_here"
GIST_ID=""
```

Leave `GIST_ID` blank for now; we’ll fill that in after our first upload.
Step 2: Capture Your Current Extensions
On your main, fully configured machine, export a list of your installed extensions. The VS Code command-line tool makes this incredibly easy: open your terminal and run `code --list-extensions > extensions.txt`. This creates a simple text file, with one extension ID per line.
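If you’d rather script this step too, here’s a minimal Python sketch that shells out to the same command and writes `extensions.txt`. This helper is my own addition, not part of the scripts below, and it assumes the `code` CLI is on your PATH:

```python
import subprocess


def clean_ids(raw: str) -> list[str]:
    """Drop blank lines so the file contains only extension IDs."""
    return [line.strip() for line in raw.splitlines() if line.strip()]


def export_extensions(path: str = "extensions.txt") -> list[str]:
    # Ask VS Code for the installed extension IDs (one per line).
    result = subprocess.run(
        ["code", "--list-extensions"],
        check=True, capture_output=True, text=True,
    )
    ids = clean_ids(result.stdout)
    with open(path, "w") as f:
        f.write("\n".join(ids) + "\n")
    return ids
```

Calling `export_extensions()` on your configured machine produces the same `extensions.txt` the upload script expects.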
Step 3: The “Upload” Script
Now for the fun part. We’ll write a Python script to read that `extensions.txt` file and upload its contents to a GitHub Gist. I’ll skip the standard virtualenv setup since you likely have your own workflow for that. Just make sure you install the necessary libraries with `pip install requests python-dotenv`.
Here’s the script; let’s call it `upload_extensions.py`:
```python
import os
import requests
import json
from dotenv import load_dotenv


def sync_gist():
    load_dotenv('config.env')
    github_pat = os.getenv("GITHUB_PAT")
    gist_id = os.getenv("GIST_ID")

    if not github_pat:
        print("Error: GITHUB_PAT not found in config.env")
        return

    try:
        with open('extensions.txt', 'r') as f:
            content = f.read()
    except FileNotFoundError:
        print("Error: extensions.txt not found. Please generate it first.")
        return

    headers = {
        "Authorization": f"token {github_pat}",
        "Accept": "application/vnd.github.v3+json"
    }
    data = {
        "description": "My VS Code Extensions List",
        "public": False,
        "files": {
            "extensions.txt": {
                "content": content
            }
        }
    }

    if gist_id:
        # Update existing Gist
        url = f"https://api.github.com/gists/{gist_id}"
        print("Updating existing Gist...")
        response = requests.patch(url, headers=headers, data=json.dumps(data))
    else:
        # Create new Gist
        url = "https://api.github.com/gists"
        print("Creating new Gist...")
        response = requests.post(url, headers=headers, data=json.dumps(data))

    if response.status_code in [200, 201]:
        response_data = response.json()
        new_gist_id = response_data.get('id')
        print(f"Success! Gist available at: {response_data.get('html_url')}")
        if not gist_id:
            print(f"IMPORTANT: Add this GIST_ID to your config.env file: {new_gist_id}")
    else:
        print(f"Failed to sync Gist. Status: {response.status_code}")
        print(f"Response: {response.text}")


if __name__ == "__main__":
    sync_gist()
```
The Logic: This script loads your PAT and Gist ID from the `config.env` file. It reads your `extensions.txt`. If a `GIST_ID` is present, it performs a PATCH request to update the existing Gist. If not, it POSTs to create a new one. After the first run, it will print a new Gist ID; you must copy this and add it to your `config.env` file for future updates.
Pro Tip: In the script, I’ve set `"public": False`. This creates a “secret” Gist, meaning it’s not searchable but is accessible via its direct URL. For something like an extension list, this is usually what you want.
Step 4: The “Download and Install” Script
On your new machine, you need a way to pull that list down and install everything. This second script handles that. Let’s call it `install_extensions.py`.
```python
import os
import requests
import subprocess
from dotenv import load_dotenv


def install_from_gist():
    load_dotenv('config.env')
    # A PAT is needed for secret Gists; public ones can be fetched without it.
    github_pat = os.getenv("GITHUB_PAT")
    gist_id = os.getenv("GIST_ID")

    if not gist_id:
        print("Error: GIST_ID not found in config.env. Cannot fetch extensions.")
        return

    headers = {"Accept": "application/vnd.github.v3+json"}
    if github_pat:
        headers["Authorization"] = f"token {github_pat}"

    url = f"https://api.github.com/gists/{gist_id}"
    response = requests.get(url, headers=headers)
    if response.status_code != 200:
        print(f"Failed to fetch Gist. Status: {response.status_code}")
        return

    data = response.json()
    content = data['files']['extensions.txt']['content']
    extensions = content.strip().split('\n')
    print(f"Found {len(extensions)} extensions to install.")

    for extension in extensions:
        if extension:  # Skip empty lines
            print(f"Installing {extension}...")
            try:
                # Constructs and runs the 'code --install-extension' command
                command = ["code", "--install-extension", extension]
                subprocess.run(command, check=True, capture_output=True, text=True)
                print(f"Successfully installed {extension}")
            except subprocess.CalledProcessError as e:
                print(f"Failed to install {extension}. Error: {e.stderr}")
            except FileNotFoundError:
                print("Error: 'code' command not found. Is VS Code in your PATH?")
                return

    print("Extension installation complete.")


if __name__ == "__main__":
    install_from_gist()
```
The Logic: This one is simpler. It uses your Gist ID to fetch the Gist’s content. It then splits the text into a list of extension IDs and iterates through them, calling the VS Code command-line tool (`code --install-extension`) for each one. The `subprocess` module is a clean way to run external commands from Python.
Here’s Where I Usually Mess Up
A couple of common pitfalls to watch out for:
- Forgetting to save the PAT. I’ve done this more times than I care to admit. GitHub shows it to you exactly once. If you lose it, you have to generate a new one.
- Incorrect PAT scope. If you forget to check the `gist` box, the API will return a 404 (Not Found) or 403 (Forbidden) error, which can be confusing because you’ll swear the Gist ID is correct.
- Not updating the `GIST_ID` in `config.env`. After your first successful upload, the script will print the ID. If you forget to add it to your config file, you’ll just keep creating new Gists instead of updating the one you care about.
Conclusion
And that’s it. You now have a robust, scriptable system for keeping your development environments perfectly in sync. On your main machine, you can run the upload script whenever you add or remove an extension. On any other machine, just pull your project and run the install script. In my production setups, I even have the upload script running on a weekly schedule just to keep the Gist fresh. A simple cron job like `0 2 * * 1 python3 upload_extensions.py` does the trick. It’s a small bit of automation that pays huge dividends in time and sanity.
Hope this helps you reclaim some focus. Happy coding.
– Darian Vance
🤖 Frequently Asked Questions
❓ How can I automate the synchronization of my VS Code extensions across different machines?
Automate VS Code extension syncing by exporting your current extensions to an `extensions.txt` file, then using Python scripts to upload this file to a GitHub Gist and later download and install them on new machines via the `code --install-extension` command.
❓ How does this Gist-based method compare to manual extension installation?
The Gist-based method centralizes your extension list in a version-controlled, scriptable location, eliminating the ‘death by a thousand cuts’ of manual reinstallation and ensuring consistency across environments. Manual installation is time-consuming, prone to forgetting essential extensions, and lacks automation.
❓ What is a common implementation pitfall when setting up Gist-based VS Code extension syncing?
A common pitfall is forgetting to set the `gist` scope for your GitHub Personal Access Token (PAT). This will result in 403 (Forbidden) or 404 (Not Found) errors when the script attempts to interact with the GitHub Gist API, preventing successful upload or download of extensions.