Announcing the AWS Sustainability console: Programmatic access, configurable CSV reports, and Scope 1–3 reporting in one place
AWS announces the Sustainability console, a new standalone service that consolidates carbon emissions reporting and resources, giving sustainability teams independent access to Scope 1, 2, and 3 emissions data without requiring billing permissions.
AWS News Blog
As many of you are, I’m a parent. And like you, I think about the world I’m building for my children. That’s part of why today’s launch matters for many of us. I’m excited to announce the launch of the AWS Sustainability console, a standalone service that consolidates all AWS sustainability reporting and resources in one place.
With The Climate Pledge, Amazon set a goal in 2019 to reach net-zero carbon across our operations by 2040. That commitment shapes how AWS builds its data centers and services. AWS is also committed to helping you measure and reduce the environmental footprint of your own workloads. The AWS Sustainability console is the latest step in that direction.
The AWS Sustainability console builds on the Customer Carbon Footprint Tool (CCFT), which lives inside the AWS Billing console, and introduces a new set of capabilities that you’ve been asking for.
Until now, accessing your carbon footprint data required billing-level permissions. That created a practical problem: sustainability professionals and reporting teams often don’t have (and shouldn’t need) access to cost and billing data. Getting the right people access to the right data meant navigating permission structures that weren’t designed with sustainability workflows in mind. The AWS Sustainability console has its own permissions model, independent of the Billing console. Sustainability professionals can now get direct access to emissions data without requiring billing permissions to be granted alongside it.
The console includes Scope 1, 2, and 3 emissions attributed to your AWS usage and shows you a breakdown by AWS Region and by service, such as Amazon CloudFront, Amazon Elastic Compute Cloud (Amazon EC2), and Amazon Simple Storage Service (Amazon S3). The underlying data and methodology haven’t changed with this launch; they are the same as those used by the CCFT. What has changed is how you can access and work with the data.
As sustainability reporting requirements have grown more complex, teams need more flexibility accessing and working with their emissions data. The console now includes a Reports page where you can download preset monthly and annual carbon emissions reports covering both market-based method (MBM) and location-based method (LBM) data. You can also build a custom comma-separated values (CSV) report by selecting which fields to include, the time granularity, and other filters.
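Once you have a custom CSV export, it’s straightforward to aggregate it in code. Here’s a minimal Python sketch that sums emissions per service; note that the column names (`period`, `service`, `region`, `mbm_mtco2e`, `lbm_mtco2e`) and the sample rows are illustrative assumptions, since the actual columns in your export depend on the fields you select when building the report.

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample rows; real column names depend on the fields you
# selected when building the custom report in the console.
SAMPLE_CSV = """\
period,service,region,mbm_mtco2e,lbm_mtco2e
2025-03,AmazonEC2,us-east-1,0.05,0.40
2025-03,AmazonS3,us-east-1,0.01,0.08
2025-04,AmazonEC2,eu-west-1,0.04,0.30
"""

def total_by_service(csv_text, column="lbm_mtco2e"):
    """Sum emissions (in MTCO2e) per service from a custom CSV report."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["service"]] += float(row[column])
    return dict(totals)

print(total_by_service(SAMPLE_CSV))
```

Swapping `column` to `mbm_mtco2e` gives you the market-based view from the same export.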
If your organization’s fiscal year doesn’t align with the calendar year, you can now configure the console to match your reporting period. When that is set, all data views and exports reflect your fiscal year and quarters, which removes a common friction point for finance and sustainability teams working in parallel.
You can also use the new API or the AWS SDKs to integrate emissions data into your own reporting pipelines, dashboards, or compliance workflows. This is useful for teams that need to pull data for a specific month across a large number of accounts without setting up a data export or for organizations that need to establish custom account groupings that don’t align with their existing AWS Organizations structure.
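To illustrate the custom-grouping use case, here’s a short Python sketch that rolls per-account totals up into business groupings that don’t match your AWS Organizations structure. The account IDs, group names, and emissions figures are all made up for illustration; in practice the per-account numbers would come from the Sustainability API.

```python
from collections import defaultdict

# Hypothetical mapping of account IDs to internal business groupings.
ACCOUNT_GROUPS = {
    "111111111111": "platform",
    "222222222222": "platform",
    "333333333333": "data-science",
}

def rollup_by_group(per_account_mtco2e, groups):
    """Aggregate per-account emissions (MTCO2e) into custom groupings."""
    totals = defaultdict(float)
    for account_id, value in per_account_mtco2e.items():
        totals[groups.get(account_id, "ungrouped")] += value
    return dict(totals)

# Example per-account totals (illustrative values only).
per_account = {"111111111111": 0.4, "222222222222": 0.2, "333333333333": 0.1}
print(rollup_by_group(per_account, ACCOUNT_GROUPS))
```

Accounts missing from the mapping land in an `ungrouped` bucket, which makes gaps in the mapping easy to spot.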
You can read about the latest feature releases and methodology updates directly on the Release notes page under the Learn more tab.
Let’s see it in action
To show you the Sustainability console, I opened the AWS Management Console and searched for “sustainability” in the search bar at the top of the screen.
The Carbon emissions section gives an estimate of your carbon emissions, expressed in metric tons of carbon dioxide equivalent (MTCO2e). It shows the emissions by scope, expressed in both the MBM and the LBM. On the right side of the screen, you can adjust the date range or filter by service, Region, and more.
For those unfamiliar: Scope 1 includes direct emissions from owned or controlled sources (for example, data center fuel use); Scope 2 covers indirect emissions from the production of purchased energy (with MBM accounting for energy attribute certificates and LBM using average local grid emissions); and Scope 3 includes other indirect emissions across the value chain, such as server manufacturing and data center construction. You can read more about this in our methodology document, which was independently verified by Apex, a third-party consultant.
I can also use the API or the AWS Command Line Interface (AWS CLI) to pull the emissions data programmatically.
aws sustainability get-estimated-carbon-emissions \
    --time-period='{"Start":"2025-03-01T00:00:00Z","End":"2026-03-01T23:59:59.999Z"}'

The command returns output like this (truncated):

{
    "Results": {
        "TimePeriod": {
            "Start": "2025-03-01T00:00:00+00:00",
            "End": "2025-04-01T00:00:00+00:00"
        },
        "DimensionsValues": {},
        "ModelVersion": "v3.0.0",
        "EmissionsValues": {
            "TOTAL_LBM_CARBON_EMISSIONS": {
                "Value": 0.7,
                "Unit": "MTCO2e"
            },
            "TOTAL_MBM_CARBON_EMISSIONS": {
                "Value": 0.1,
                "Unit": "MTCO2e"
            }
        }
    },
    ...
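If you’d rather work with that response in code, here’s a small Python sketch that parses the JSON payload and prints each emissions total. The field names follow the sample CLI output in this post; treat the structure as an assumption when adapting it to your own responses.

```python
import json

# Sample response body, reproduced from the CLI output in this post.
response = json.loads("""{
  "Results": {
    "TimePeriod": {
      "Start": "2025-03-01T00:00:00+00:00",
      "End": "2025-04-01T00:00:00+00:00"
    },
    "ModelVersion": "v3.0.0",
    "EmissionsValues": {
      "TOTAL_LBM_CARBON_EMISSIONS": {"Value": 0.7, "Unit": "MTCO2e"},
      "TOTAL_MBM_CARBON_EMISSIONS": {"Value": 0.1, "Unit": "MTCO2e"}
    }
  }
}""")

# Walk the EmissionsValues map and print each metric with its unit.
emissions = response["Results"]["EmissionsValues"]
for name, entry in emissions.items():
    print(f"{name}: {entry['Value']} {entry['Unit']}")
```

The same traversal works on the output of the AWS SDKs, which return the response as a native dictionary rather than a JSON string.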
The combination of the visual console and the new API gives you two additional ways to work with your data, alongside Data Exports, which remains available. You can now explore and identify hotspots in the console and automate the reporting you want to share with stakeholders.
The Sustainability console is designed to grow. We plan to keep releasing new features as we expand the console’s capabilities alongside our customers.
Get started today
The AWS Sustainability console is available today at no additional cost. You can access it from the AWS Management Console. Historical data is available going back to January 2022, so you can start exploring your emissions trends right away.
Get started on the console today. If you want to learn more about the AWS commitment to sustainability, visit the AWS Sustainability page.
— seb