🚀 Executive Summary
TL;DR: Manually checking daily cloud costs across AWS, Azure, and GCP is a time-consuming task. This guide provides a Python script solution to automate fetching daily cost data from all three major cloud providers and sending a unified report directly to a Slack channel.
🎯 Key Takeaways
- The solution leverages cloud-specific SDKs: `boto3` for AWS Cost Explorer (`ce:GetCostAndUsage`), `azure-mgmt-costmanagement` for Azure Cost Management, and `google-cloud-billing-budgets` for the GCP Billing Budgets API.
- Secure credential management is crucial, utilizing `python-dotenv` for environment variables (e.g., `SLACK_BOT_TOKEN`, `AZURE_CLIENT_SECRET`) and avoiding hardcoding secrets.
- Automation is achieved via a cron job, scheduling the Python script to run daily and deliver the consolidated cost report to a designated Slack channel using the `slack-sdk`.
Daily Cloud Cost Report: AWS vs Azure vs GCP to Slack
Hey team, Darian here. Let’s talk about something that used to be a real thorn in my side: daily cost checks. I’d start my morning by logging into the AWS console, then hopping over to Azure’s portal, and finally navigating the GCP billing dashboard. It was a manual, tedious process that ate up at least 20 minutes a day. That’s nearly two hours a week I wasn’t getting back. I finally said, “enough is enough,” and built a simple, unified report that lands in my team’s Slack channel every morning. It’s a game-changer for visibility and catching unexpected spikes before they become a problem.
Today, I’m going to walk you through how to build the exact same thing. We’ll write a Python script that pulls cost data from all three major clouds and sends a neat summary to Slack. Let’s get that time back.
Prerequisites
Before we dive in, make sure you have the following ready. Getting the permissions right is half the battle.
- Python 3.x installed on the machine where you’ll run the script.
- A Slack Workspace where you have permissions to create an app.
- AWS: An IAM user or role with `ce:GetCostAndUsage` permissions.
- Azure: A Service Principal with `Reader` access to your subscription or management group.
- GCP: A Service Account with the `Billing Account Viewer` role and its JSON key file.
You’ll also need a few Python libraries. I’ll skip the standard `virtualenv` setup since you likely have your own workflow for that. Just make sure to install these packages using pip: `boto3`, `azure-identity`, `azure-mgmt-costmanagement`, `google-cloud-billing-budgets`, `slack-sdk`, and `python-dotenv`.
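If it helps, here’s the install one-liner I use. Note that the Budgets client ships in the `google-cloud-billing-budgets` package:

```shell
pip install boto3 azure-identity azure-mgmt-costmanagement \
    google-cloud-billing-budgets slack-sdk python-dotenv
```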
The Guide: Building Your Report
Step 1: Configure Your Slack Bot
First, we need a way to post messages. The cleanest way is with a Slack App.
- Navigate to api.slack.com/apps and create a new app “From scratch”.
- In the “Add features and functionality” section, select “Bots”.
- Go to “OAuth & Permissions” in the sidebar. You need to add a Bot Token Scope. Click “Add an OAuth Scope” and add `chat:write`. This allows your bot to post messages.
- Scroll up and click “Install to Workspace”. Authorize it.
- You’ll now see a “Bot User OAuth Token” that starts with `xoxb-`. This is your golden ticket. Copy it and keep it safe.
- Finally, go to the Slack channel you want to post in, and invite your newly created app to it.
We’ll store this token and other secrets in a file named `config.env`. Never commit this file to source control!
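For reference, here’s a `config.env` skeleton with placeholder values. The variable names match what the script reads later; every value below is a stand-in you’ll replace with your own:

```
SLACK_BOT_TOKEN=xoxb-your-token-here
SLACK_CHANNEL_ID=C0123456789
AZURE_SUBSCRIPTION_ID=your-subscription-id
AZURE_TENANT_ID=your-tenant-id
AZURE_CLIENT_ID=your-client-id
AZURE_CLIENT_SECRET=your-client-secret
GCP_BILLING_ACCOUNT_ID=XXXXXX-XXXXXX-XXXXXX
GCP_BUDGET_DISPLAY_NAME=monthly-budget
GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
```

Remember to add `config.env` to your `.gitignore` right away.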
Step 2: Fetching AWS Costs
AWS has a great API called the Cost Explorer. We’ll use `boto3`, the AWS SDK for Python, to query it. The logic is simple: we ask for the unblended cost for a specific time period (yesterday).
Here’s the function I use:
import boto3
from datetime import datetime, timedelta

def get_aws_cost():
    # Assumes your AWS credentials are configured (e.g., via environment variables)
    client = boto3.client('ce', region_name='us-east-1')
    # Cost Explorer treats 'End' as exclusive, so yesterday-to-today covers exactly yesterday
    end_date = datetime.now().strftime('%Y-%m-%d')
    start_date = (datetime.now() - timedelta(days=1)).strftime('%Y-%m-%d')
    try:
        response = client.get_cost_and_usage(
            TimePeriod={
                'Start': start_date,
                'End': end_date
            },
            Granularity='DAILY',
            Metrics=['UnblendedCost']
        )
        total = response['ResultsByTime'][0]['Total']['UnblendedCost']
        return f"{float(total['Amount']):.2f} {total['Unit']}"
    except Exception as e:
        print(f"Error fetching AWS cost: {e}")
        return "Error"
Pro Tip: In my production setups, I add a `GroupBy` clause to the `get_cost_and_usage` call. You can group by `SERVICE` or by a specific `TAG` to get a more granular breakdown, like seeing how much your S3 buckets cost versus your EC2 instances. It’s incredibly powerful for spotting which service is causing a cost spike.
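To illustrate what that looks like downstream, here’s a rough sketch of how I digest a grouped response. The `summarize_groups` helper is my own illustration, and the sample payload is made up, but the dict shape mirrors what `get_cost_and_usage` returns when you pass `GroupBy=[{'Type': 'DIMENSION', 'Key': 'SERVICE'}]`:

```python
def summarize_groups(response):
    """Return (service, cost) pairs sorted by cost, highest first."""
    groups = response['ResultsByTime'][0]['Groups']
    totals = [
        (g['Keys'][0], float(g['Metrics']['UnblendedCost']['Amount']))
        for g in groups
    ]
    return sorted(totals, key=lambda pair: pair[1], reverse=True)

# Made-up sample mirroring a GroupBy-by-SERVICE Cost Explorer response
sample = {
    'ResultsByTime': [{
        'Groups': [
            {'Keys': ['Amazon S3'],
             'Metrics': {'UnblendedCost': {'Amount': '4.10', 'Unit': 'USD'}}},
            {'Keys': ['Amazon EC2'],
             'Metrics': {'UnblendedCost': {'Amount': '12.55', 'Unit': 'USD'}}},
        ]
    }]
}
print(summarize_groups(sample))  # EC2 first, then S3
```

Sorting by cost means the biggest spender always tops the Slack message, which is exactly where your eyes go during a spike.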
Step 3: Fetching Azure Costs
For Azure, we use their Cost Management SDK. The setup requires a Service Principal, and you’ll need the Tenant ID, Client ID, and Client Secret. Store these securely in your `config.env` file. The concept is similar: we query for a date range.
Here’s the Azure cost function:
import os
from azure.identity import ClientSecretCredential
from azure.mgmt.costmanagement import CostManagementClient
from datetime import datetime, timedelta

def get_azure_cost():
    # Load credentials from config.env (handled by python-dotenv)
    subscription_id = os.getenv('AZURE_SUBSCRIPTION_ID')
    credential = ClientSecretCredential(
        tenant_id=os.getenv('AZURE_TENANT_ID'),
        client_id=os.getenv('AZURE_CLIENT_ID'),
        client_secret=os.getenv('AZURE_CLIENT_SECRET')
    )
    client = CostManagementClient(credential)
    scope = f"/subscriptions/{subscription_id}"
    # Get yesterday's date range
    end_date = datetime.now()
    start_date = end_date - timedelta(days=1)
    try:
        result = client.query.usage(
            scope=scope,
            parameters={
                "type": "ActualCost",
                "timeframe": "Custom",
                "time_period": {"from": start_date, "to": end_date},
                "dataset": {
                    "granularity": "None",
                    "aggregation": {
                        "totalCost": {
                            "name": "Cost",
                            "function": "Sum"
                        }
                    }
                }
            }
        )
        # Each result row holds the values; the headers live in result.columns.
        # With this aggregation, a row is [cost, currency].
        cost = result.rows[0][0]
        currency = result.rows[0][1]
        return f"{cost:.2f} {currency}"
    except Exception as e:
        print(f"Error fetching Azure cost: {e}")
        return "Error"
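One thing that bit me with this API: results come back as bare `rows` alongside a separate list of `columns`, and hard-coding positions is fragile. A small helper that zips the column names onto the row makes the extraction explicit. `parse_azure_result` is my own (hypothetical) helper, and the sample data is made up, but it follows the usual `Cost`/`Currency` layout of a `totalCost` aggregation:

```python
def parse_azure_result(column_names, rows):
    """Map column names onto the first result row and format 'cost CUR'."""
    if not rows:
        return "0.00"
    row = dict(zip(column_names, rows[0]))
    return f"{row['Cost']:.2f} {row['Currency']}"

# Made-up sample mirroring a totalCost aggregation result
columns = ['Cost', 'Currency']
rows = [[42.5, 'USD']]
print(parse_azure_result(columns, rows))  # 42.50 USD
```

If Azure ever reorders the columns, the lookup by name keeps working while a positional index would silently grab the wrong field.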
Step 4: Fetching GCP Costs
GCP is the odd one out: there’s no single API call that returns yesterday’s cost. The accurate route is exporting billing data to BigQuery and querying it there, but for a lightweight daily report I lean on the Budgets API instead. Be aware that it returns the *configured* budget amount, not live consumption, so what lands in Slack is the monthly ceiling you’re tracking against rather than actual spend.
First, create a monthly budget with a fixed amount in the GCP Console for your billing account. Then, use this function with your Service Account JSON file.
import os
from google.cloud import billing_budgets_v1

def get_gcp_cost():
    # Set the GOOGLE_APPLICATION_CREDENTIALS env var to the path of your JSON key file
    billing_account_id = os.getenv('GCP_BILLING_ACCOUNT_ID')
    budget_display_name = os.getenv('GCP_BUDGET_DISPLAY_NAME')  # The name you gave your budget in the console
    client = billing_budgets_v1.BudgetServiceClient()
    parent = f"billingAccounts/{billing_account_id}"
    try:
        # The Budgets API exposes the configured budget, not live spend.
        # We report the monthly budget amount as a reference point.
        for budget in client.list_budgets(parent=parent):
            if budget.display_name == budget_display_name:
                amount = budget.amount.specified_amount
                total = amount.units + amount.nanos / 1e9
                return f"{total:.2f} {amount.currency_code} (Monthly Budget)"
        return "Budget not found"
    except Exception as e:
        print(f"Error fetching GCP cost: {e}")
        return "Error"
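Since Google’s APIs return amounts as a `google.type.Money` (integer `units` plus `nanos`, i.e. billionths of a unit), I keep a tiny helper around for the conversion. `money_to_float` is my own name for it:

```python
def money_to_float(units, nanos=0):
    """Combine a Money value's integer units and nanos into a float amount."""
    return units + nanos / 1e9

# 12 units + 500,000,000 nanos = 12.5
print(f"{money_to_float(12, 500_000_000):.2f}")  # 12.50
```

Dropping `nanos` silently truncates sub-unit amounts, which matters when a daily figure is only a few dollars.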
Step 5: Tying It All Together and Posting to Slack
Now we create a main script to orchestrate everything. It will call each function, format the results into a nice message, and use the `slack_sdk` to post it.
Here is the main execution block:
import os
from datetime import datetime, timedelta
from dotenv import load_dotenv
from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError

# Make sure you have a config.env file with your secrets
load_dotenv('config.env')

def send_slack_message(message):
    client = WebClient(token=os.getenv('SLACK_BOT_TOKEN'))
    channel_id = os.getenv('SLACK_CHANNEL_ID')
    try:
        client.chat_postMessage(channel=channel_id, text=message)
        print("Message sent to Slack successfully.")
    except SlackApiError as e:
        print(f"Error sending to Slack: {e.response['error']}")

if __name__ == "__main__":
    # Assumes get_aws_cost, get_azure_cost, and get_gcp_cost are defined above
    # (or imported from the modules you put them in)
    print("Fetching cloud costs...")
    aws_cost = get_aws_cost()
    azure_cost = get_azure_cost()
    gcp_cost = get_gcp_cost()
    report_date = (datetime.now() - timedelta(days=1)).strftime('%Y-%m-%d')
    # :aws:, :azure:, and :gcp: are custom emoji; add them to your workspace
    # or swap in built-ins like :cloud:
    message = (
        f"*Daily Cloud Cost Report for {report_date}*\n\n"
        f":aws: *AWS:* {aws_cost}\n"
        f":azure: *Azure:* {azure_cost}\n"
        f":gcp: *GCP:* {gcp_cost}"
    )
    send_slack_message(message)
Step 6: Automating with a Cron Job
The last step is to make this run automatically. A simple cron job is perfect for this. We’ll set it to run every morning, say at 8 AM.
You can edit your cron table with `crontab -e` and add the following line to run the script at 8:00 AM every day. Cron jobs run with a minimal environment and an unpredictable working directory, so use absolute paths for both the interpreter and the script (adjust the paths below to your setup), and redirect output to a log file so failures are easy to debug:
0 8 * * * /usr/bin/python3 /path/to/get_costs.py >> /var/log/cloud-costs.log 2>&1
Where I Usually Mess Up
I’ve set this up a dozen times, and here are the traps I still fall into:
- Permissions, Permissions, Permissions: 90% of the time, when a script fails, it’s because the IAM role or Service Principal is missing a specific permission. Double-check that `ce:GetCostAndUsage` is there for AWS or that the Azure role is applied at the correct scope.
- Timezone Mismatches: The AWS Cost Explorer API operates in UTC. If you’re in a different timezone, your “yesterday” might not align with their “yesterday,” leading to confusing numbers. Be aware of this when you’re debugging.
- Forgetting to Invite the Bot: You can set up the Slack app perfectly, but if you forget to `/invite @your-bot-name` to the channel, it will fail to post messages. It’s an easy one to miss.
- Committing Secrets: In a rush, it’s tempting to hardcode a key. Don’t do it. Always use a `config.env` file and add it to your `.gitignore` immediately. It’s a discipline that will save you a massive headache later.
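On that timezone point: computing “yesterday” in UTC keeps your date range aligned with what Cost Explorer reports, regardless of the machine’s local timezone. A minimal sketch:

```python
from datetime import datetime, timedelta, timezone

def utc_yesterday():
    """Return yesterday's date in UTC as YYYY-MM-DD."""
    return (datetime.now(timezone.utc) - timedelta(days=1)).strftime('%Y-%m-%d')

print(utc_yesterday())
```

Swap this in for the naive `datetime.now()` calls in the cost functions if your server doesn’t already run in UTC.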
Conclusion
And there you have it. With a single Python script and a bit of setup, you’ve replaced a tedious manual task with a reliable, automated report. This not only saves you time but also gives your entire team immediate visibility into cloud spending. From here, you can expand it with cost breakdowns, budget alerts, or even trend analysis.
Automating the small, repetitive tasks is what DevOps is all about. Now, go enjoy that extra 20 minutes of your morning!
– Darian
🤖 Frequently Asked Questions
❓ How can I automate daily cloud cost reporting across multiple providers?
You can automate daily cloud cost reporting across AWS, Azure, and GCP by developing a Python script that uses `boto3`, Azure’s Cost Management SDK, and Google Cloud’s Billing API to fetch cost data, then consolidates and sends this information to a Slack channel via the `slack-sdk`.
❓ How does this custom Python script solution compare to commercial cloud cost management tools?
This custom Python script provides a free, highly customizable, and lightweight solution for daily cost visibility directly in Slack. Commercial cloud cost management tools typically offer more advanced features like detailed analytics, forecasting, anomaly detection, and optimization recommendations, but come with associated costs and potentially less direct control over specific reporting formats.
❓ What are common implementation pitfalls for this multi-cloud cost reporting script?
Common pitfalls include incorrect IAM permissions (e.g., `ce:GetCostAndUsage` for AWS, `Reader` for Azure, `Billing Account Viewer` for GCP), timezone mismatches affecting ‘yesterday’s cost’ calculations, forgetting to invite the Slack bot to the target channel, and insecurely committing API keys or secrets instead of using `config.env` files.