After understanding what it means to be eco-conscious in the world of QA and recognizing the regulatory and corporate pressures pushing us toward more sustainable practices, the inevitable question arises: how do we actually implement it? The transition to a sustainable testing model requires structure, methodology, and, above all, a clear objective.
In this second article of our Green Quality Assurance (GQA) series, we leave behind the “why” and move into the “how.” If in the first article we established that measuring quality in watts and CO2 is just as important as ensuring software works properly, now it is time to build the foundations that will support this new way of working. It is not just about reducing the energy consumption of our tests or running fewer test cases; it is about completely reimagining our approach to software quality.
The software industry must understand that technical excellence and environmental responsibility are not mutually exclusive goals. In fact, the most innovative organizations are discovering that sustainable QA practices often lead to more efficient processes, more productive teams, and, surprisingly, better final product quality.
But to achieve this balance, we need a robust framework that allows us to assess, implement, and continuously improve our practices.
The journey toward Green QA is evolutionary. We cannot expect an organization to go from zero to one hundred overnight. That is why, in the following sections, we explain a gradual approach that recognizes different maturity levels, provides concrete tools for each stage, and enables us to measure our progress.
Framework layers
Addressing cultural change in an organization regarding quality is usually an evolutionary process in which awareness plays a major role. The shift proposed by GQA (Green QA) introduces a new dimension to that awareness: the goal of doing things in the greenest way possible. But are we really aware of what must change to make this happen? Let’s explore it.
Governance
First of all, it is important to have governance that is appropriate for this context. This is the organization’s official “mandate” and the basis of the entire process. Without it, Green QA remains an isolated initiative. Governance addresses the following points:
- Sustainable quality policy
This is not just a document; it is about defining the “Green Acceptance Threshold.” It establishes the sustainability goals the company is pursuing, setting targets by resource type. Once these goals are defined, everything else is aimed at meeting them. For example: “No production deployment may increase the energy consumption of the microservice by more than 5%.”
- Responsibility matrix
This defines what each member of the organization involved in the delivery lifecycle is responsible for.
- QA: designs efficiency test cases, defines baseline metrics, and is responsible for executing measurements in each cycle.
- Architecture: validates that design decisions do not introduce structural energy debt before reaching the testing phase.
- DevOps / Platform Engineering: ensures that the instrumentation needed to measure consumption is available in test environments. Without observability infrastructure, QA cannot measure.
- Sustainability: provides energy-to-CO2 conversion factors and ensures data traceability into the ESG reporting system.
- Compliance: verifies that data collection, processing, and reporting comply with current regulations, particularly the CSRD.
- Product Owner: formally accepts that acceptance criteria include efficiency metrics, not only functionality and traditional performance.
- Alignment with ESG and corporate strategy: in the previous post we discussed ESG and its connection to Green QA at length; that alignment must be reflected explicitly in governance.
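As a sketch of how the 5% “Green Acceptance Threshold” from the policy example above could be enforced, here is a minimal check; the baseline figure and function names are illustrative, not part of any real tooling:

```python
# Hypothetical enforcement of a "Green Acceptance Threshold":
# block a deployment if measured energy use grows more than 5%
# over the recorded baseline. Values and names are illustrative.

BASELINE_JOULES = 1200.0  # energy measured for the last accepted release
THRESHOLD = 0.05          # 5% allowed increase, per the policy example

def green_acceptance(measured_joules: float,
                     baseline_joules: float = BASELINE_JOULES,
                     threshold: float = THRESHOLD) -> bool:
    """Return True when the measurement stays within the green threshold."""
    increase = (measured_joules - baseline_joules) / baseline_joules
    return increase <= threshold

print(green_acceptance(1230.0))  # +2.5%: within threshold
print(green_acceptance(1320.0))  # +10%: deployment blocked
```

In practice the baseline would be stored per microservice and refreshed on every accepted release, so the threshold always compares against the latest approved state.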
Processes
This layer establishes how Green QA is integrated into the project’s day-to-day work.
- Shift-Left Green: integrating environmental criteria during the Refinement phase. If a feature would transfer or process unnecessary amounts of data, it is rejected before development begins.
- Green Gateways in the Pipeline: adding “quality gates” into CI/CD. If automated tests detect an unusual spike in CPU or RAM usage, the build fails. Environmental controls must be present in design, development, testing, and deployment.
- Suppliers: evaluating whether our service providers (Cloud, SaaS) operate using renewable energy.
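The “Green Gateway” idea above can be sketched as a small CI step that trips when resource usage drifts beyond a tolerated margin over the baseline; the metric names, numbers, and margin below are illustrative assumptions:

```python
# Hypothetical CI "green gateway": the build fails when the test
# run's CPU time or peak RAM drifts beyond an agreed margin over
# the recorded baseline. Metric names and numbers are illustrative.

BASELINE = {"cpu_seconds": 540.0, "ram_peak_mb": 2048.0}
MARGIN = 0.10  # 10% drift tolerated before the gate trips

def gate_violations(current, baseline=BASELINE, margin=MARGIN):
    """Return the metrics that exceeded baseline * (1 + margin)."""
    return [k for k, v in current.items() if v > baseline[k] * (1 + margin)]

run = {"cpu_seconds": 610.0, "ram_peak_mb": 2100.0}  # measured in this build
violations = gate_violations(run)
print("FAIL" if violations else "PASS", violations)
# A real pipeline would exit non-zero here to break the build.
```
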
It is important that governance also defines the consequences of non-compliance with the policy, whether by blocking a release, creating technical debt, or establishing the appropriate mechanisms to prevent bypassing the defined goals.
Data and metrics
This framework layer analyzes the quality of the data that will support decision-making.
- Data traceability: ensuring that ESG data has a clear lineage, from the sensor or server log to the annual report.
- QA Data Cleaning: a process for deleting obsolete test environments, temporary databases, and old execution logs that bloat storage and consume energy unnecessarily.
- Accuracy vs. estimation: defining which data is measured (real) and which is estimated (mathematical models), applying different QA approaches to each.
- Definition of environmental and ESG KPIs
- Data quality controls (accuracy, completeness, traceability)
- Audit and reporting
Technology
This is the layer of tools. Green QA needs technical eyes to “see” energy. It is therefore advisable to have tools to:
- Measure the impact of programming languages (for example, comparing Python vs. Rust consumption in critical processes).
- Optimize test suites so they do not run 2,000 tests when only 2 lines of code changed (risk-based test selection).
- Configure the framework so test environments “self-destruct” immediately after execution.
- Measure energy, carbon, and resource usage; automate green tests; and optimize infrastructure (cloud, hardware).
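The risk-based test selection mentioned above can be sketched as a mapping from changed files to impacted tests; the mapping and file names below are purely illustrative:

```python
# Minimal sketch of risk-based test selection: run only the tests
# mapped to the files a commit touched instead of the full suite.
# The mapping and the file names are purely illustrative.

TEST_MAP = {
    "src/billing.py": ["tests/test_billing.py"],
    "src/auth.py": ["tests/test_auth.py", "tests/test_session.py"],
}

def select_tests(changed_files):
    """Return the subset of tests impacted by the changed files."""
    if any(path not in TEST_MAP for path in changed_files):
        return {"tests/"}  # unknown impact: fall back to the full suite
    selected = set()
    for path in changed_files:
        selected.update(TEST_MAP[path])
    return selected

print(select_tests(["src/auth.py"]))
```

Real tools derive this mapping from coverage data or dependency analysis rather than a hand-maintained dictionary, but the principle is the same: unchanged code paths do not burn energy re-proving themselves.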
Continuous improvement
Within the framework layers, it is advisable to invest in culture and continuous improvement so that the framework becomes circular rather than linear.
Useful elements include a “Retro-Green,” where each sprint retrospective includes a question such as “What process or code did we make more efficient this month?”; gamification through rankings of the development/QA teams that have reduced their digital carbon footprint the most; and regular updates to standards.
As ESG laws evolve, this layer changes and is typically reviewed quarterly. In this way, the framework continues to comply with new regulations and allows organizations to establish measurable reduction goals, whether quantitative or qualitative.
The path toward Zero-waste Testing
One of the points mentioned above is measurement. It is essential to know at every moment where we stand relative to the objectives so that we can take the necessary actions and decisions within the defined governance. If you are familiar with TMMi, the idea is similar: a model that assesses an organization’s testing maturity across defined levels.
In this regard, below is an analysis of existing levels in relation to GQA, so that we can understand where we are and what the goals should be to consolidate or advance through them. Each of these levels should be further developed to make evaluations more concrete.
Level 1: initial (we start walking)
At this level, the company is not aware of the environmental impact of its testing. Any efficiency is driven by cost savings, not environmental purpose. Compliance is reactive, without metrics.
At this early stage, tests are executed in environments where we have no visibility into consumption and which generally remain on 24/7.
Level 2: basic (awareness has awakened)
Stakeholders understand what Green QA is, and the first manual efforts begin to document impact. Manual controls and basic reporting appear.
At this intermediate level, assets such as servers and tools are identified in order to request sustainability reports from providers, creating an inventory of digital assets. Testing begins to be handled more consistently with this policy, for example by deleting old test data.
At this stage, actions still depend on people rather than automated processes.
Level 3: defined (the green standard)
Sustainability is officially integrated into the quality manual. The first standardized processes and KPIs appear.
At this level, having a green production readiness checklist should be one of the goals. To support this, green KPIs are defined (for example, watts per test suite).
At the same time, this level seeks to ensure that the QA team has sufficient training in Green Coding and energy efficiency.
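The “watts per test suite” KPI mentioned at this level can be derived from measured energy and wall-clock duration; a minimal sketch, with illustrative numbers:

```python
# Illustrative derivation of a "watts per test suite" KPI: the
# average power a suite draws, from measured energy (joules) and
# wall-clock duration (seconds). 1 watt = 1 joule per second.

def average_watts(energy_joules: float, duration_seconds: float) -> float:
    return energy_joules / duration_seconds

# A suite that used 90,000 J over 30 minutes (1,800 s):
print(average_watts(90_000, 1_800))  # 50.0 W on average
```
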
Level 4: managed (automated and measurable quality)
A set of metrics has been established, and continuous reporting automation through pipelines is in place.
Once certain aspects are consolidated, including cultural ones, the goal becomes the use of “real-time carbon dashboards” integrated into tools like Jira or Grafana, so teams can see their daily impact.
The challenge is integrating this data with the rest of the company (ESG). CI/CD pipelines include tools that automatically measure the CPU/RAM consumption of tests. If a test is inefficient, an alert is generated.
Level 5: optimized (Green QA as DNA)
GQA is no longer an “extra”; it is the only way of working. It is aligned with the company’s overall strategy.
Now the company uses AI to predict and minimize the energy consumption of tests. The carbon savings achieved by the QA team are reported directly in the company’s annual sustainability report (ESG). The challenge is maintaining innovation and leading standards in the industry.
Aspects such as the “circular economy of data,” where test data is intelligently reused to avoid generating new loading processes, begin to be taken seriously in order to consolidate green goals.
Strategy and methodology
The strategy is the decision-making framework in which the elements needed to address GQA in practice are defined, aligned with all the framework layers described above.
The methodology describes how the defined strategy is implemented: the concrete way GQA is executed day to day.
Once a strategy aligned with the objectives has been established, the tools and frameworks for implementing it are selected. In the context of GQA (or sustainability in the software lifecycle), these tools do not only measure emissions; they also integrate into the quality process to ensure that software is efficient and complies with environmental regulations (ESG). Below are some examples.
Measuring software efficiency (Green Testing)
This is where QA has direct control. The energy consumption of a process or test suite is measured. These measurements are used to establish the baseline.
Before optimizing a single line of code, QA needs to know how many grams of CO2 the hardware running the application generates. Without that, we cannot measure improvement after an optimization.
Tool: Scaphandre (energy metrology)
An open-source power consumption metrics agent designed for Kubernetes and bare-metal servers.
- What it is used for
It measures exactly how many watts a specific process consumes (for example, your Selenium suite or a microservice under load).
- Setup and use
- Installation: it is installed as a binary or Docker container on the server where tests run.
- Usage: it exposes metrics in Prometheus format.
- Green QA Step: configure a Grafana dashboard that correlates CPU usage with power draw in watts. If, after a code optimization, the tests take the same time but consume fewer watts, Green QA has succeeded.
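As a sketch of wiring this up, the snippet below queries a Prometheus server for Scaphandre’s per-process power metric; the metric name `scaph_process_power_consumption_microwatts`, the `exe` label, and the URL are assumptions to verify against your own Scaphandre and Prometheus setup:

```python
# Sketch of reading Scaphandre's per-process power metric through
# the Prometheus HTTP API. The metric name, the "exe" label, and
# the URL are assumptions to verify against your own setup.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def power_query(process_regex: str) -> str:
    """PromQL summing a process's power and converting microwatts to watts."""
    return ('sum(scaph_process_power_consumption_microwatts'
            f'{{exe=~"{process_regex}"}}) / 1e6')

def process_power_watts(prom_url: str, process_regex: str) -> float:
    params = urlencode({"query": power_query(process_regex)})
    with urlopen(f"{prom_url}/api/v1/query?{params}") as resp:
        data = json.load(resp)
    results = data["data"]["result"]
    return float(results[0]["value"][1]) if results else 0.0

# e.g. watts drawn by the browsers of a running Selenium suite:
# process_power_watts("http://localhost:9090", "chrome.*")
```
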
Tool: Eco-Code / SonarQube (Green Rules)
- What it is used for
Static code analysis focused on energy efficiency.
- Setup
- Installation: add the "Green IT" or "Eco-Code" plugin to your SonarQube instance.
- Usage: QA defines quality gates. If the code contains patterns that wake up the CPU unnecessarily (inefficient loops, redundant API calls), the quality gate fails.
Tool: SimaPro and GaBi
These are industrial standards that can be adapted to the QA ecosystem in the following way:
- What they mean for IT environments
They allow us to model the impact of our digital infrastructure (servers, test mobile devices, or networks). They do not only measure energy expenditure, but also the “carbon debt” of the hardware supporting our software.
- Their strategic use in Green QA
They are used to establish the baseline: without knowing how many grams of CO2 the supporting hardware generates, we cannot measure improvement after an optimization.
- Setup and data sources
To control the environmental footprint of our applications, Green QA maps:
- Hardware inventories: CPUs, RAM, and storage systems used in staging and production environments.
- Environmental databases: sources such as Ecoinvent are integrated to calculate the impact of the energy mix (it is not the same to run a test on a server in Norway powered by hydroelectric energy as in a coal-dependent region).
- Practical decision example
Thanks to these tools, the QA team can make comparisons based on real data:
Case study: is it more sustainable to run our regression suite on old on-premise servers or migrate testing to a cloud instance with Energy Star certification and auto-scaling? Green QA uses LCA to demonstrate that migration reduces carbon footprint by X%.
Measuring Cloud Carbon Footprint (CCF)
If you do not want to rely on the cloud providers’ native tools (which are sometimes opaque), Cloud Carbon Footprint is an open-source alternative with a transparent methodology.
Tool: Cloud Carbon Footprint (CCF)
- What it is used for
To visualize emissions from AWS, Azure, and GCP in one place with a transparent calculation methodology.
- Setup
- Connection: you need read permissions for billing files (CUR in AWS, Billing Export in GCP).
- Usage: it allows the QA team to compare regions.
- Technical decision: QA can demonstrate that moving the staging environment from a coal-based region (for example, Virginia, US-East-1) to one powered by cleaner energy (for example, Sweden or France) instantly reduces carbon footprint without changing a single line of code.
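A back-of-the-envelope version of that region comparison can be sketched as follows; the grid-intensity figures are illustrative placeholders, not authoritative values for these regions:

```python
# Back-of-the-envelope region comparison in the spirit of CCF.
# The grid-intensity figures (gCO2e per kWh) are illustrative
# placeholders, not authoritative values for these regions.

GRID_INTENSITY = {
    "us-east-1": 379.0,   # fossil-heavy mix (illustrative)
    "eu-north-1": 13.0,   # mostly hydro/nuclear (illustrative)
}

def staging_emissions_g(kwh_per_month: float, region: str) -> float:
    """Monthly emissions in grams CO2e for a given region."""
    return kwh_per_month * GRID_INTENSITY[region]

before = staging_emissions_g(500, "us-east-1")
after = staging_emissions_g(500, "eu-north-1")
print(f"Reduction: {100 * (before - after) / before:.1f}%")
```

With factors in this range, the same workload in the cleaner region emits a small fraction of what it did before, which is the kind of evidence QA can attach to the migration decision.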
Tools: Watershed, Persefoni, Plan A
These are SaaS platforms for measuring corporate carbon footprint.
- What they are
They automate the calculation of Scope 1, 2, and 3 emissions.
- Usage in Green QA
Software and cloud usage typically fall under Scope 3 (indirect emissions). These tools collect data from your electricity bills and cloud providers.
- Setup
They connect via API to your inventory systems. The QA team reports the energy consumption of test server farms here so the company has real data on IT department impact.
Measuring sustainability in the Frontend
GQA also measures the impact on the end-user device.
Tool: GreenFrame.io or Lighthouse (Carbon Indicator)
- What it is used for
To measure the carbon footprint of a user session in the browser.
- Setup and use
- CI/CD Integration: it integrates with GitHub Actions or Jenkins.
- Usage: every time a visual regression test is launched, GreenFrame estimates the grams of CO2 produced by loading the page (data transfer + JS execution on the client).
- QA Metric: “this new Home version weighs 2 MB more and generates 0.5 g of extra CO2 per visit.” This is reported as a Sustainability Bug.
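A rough per-visit estimate of this kind can be derived from page weight with a transfer-based model in the spirit of Sustainable Web Design; both conversion factors below are illustrative assumptions, not authoritative figures:

```python
# Rough per-visit CO2 estimate from page weight, using a
# transfer-based model. Both conversion factors below are
# illustrative assumptions, not authoritative figures.

KWH_PER_GB = 0.81      # assumed energy per GB transferred
G_CO2_PER_KWH = 442.0  # assumed average grid intensity

def co2_grams_per_visit(page_mb: float) -> float:
    return (page_mb / 1024) * KWH_PER_GB * G_CO2_PER_KWH

# the extra 2 MB from the example above:
print(f"{co2_grams_per_visit(2.0):.2f} g CO2e extra per visit")
```
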
QA and audit: green quality management
This is where you connect data with the testing process.
Jira and TestRail
- Usage in Green QA: they do not measure carbon by themselves, but they are configured to manage green requirements.
- Setup
- Jira: create custom fields such as Estimated Carbon Cost in User Stories.
- TestRail: create a “Green Test Cases” section where you validate that the app enters power-saving mode or does not make unnecessary API requests.
ESG data platforms (Environmental, Social, and Governance)
- What they are
Repositories where all evidence for legal audits is stored.
- Usage in Green QA
The result of your green tests is uploaded here as proof of regulatory compliance (for example, to comply with the CSRD directive in Europe).
- Pipeline configuration
For this to be true “Green QA,” the flow must be:
- Define thresholds: in Jira, set an energy consumption limit per feature.
- Measure: during execution (Cucumber/Playwright), monitor consumption with tools such as Scaphandre or Intel Power Gadget.
- Visualize: cross that data with the Azure Emissions Dashboard.
- Audit: export reports to Persefoni or Plan A for annual accounting.
Conclusions
The transition process within an organization to embrace GQA involves a series of unavoidable steps and an adaptation process at every level.
It requires involvement from both the business and technical sides, beginning with cultural change and continuing through methodological and strategic adaptation.
All these changes will help the company comply with the legal framework already mentioned in the previous post.
Tell us what you think.