```{mermaid}
flowchart TD
  A[Develop Scope] -->|Assemble Team| B[Understand System]
  B --> C[Identify Climate Hazards and Exposure]
  C -->|Collect Climate Data| D[Vulnerability Assessment]
  D -->|No Vulnerability| G[Institutionalize Decision]
  D --> E[Risk Assessment]
  E -->|Risk Not Acceptable| F[Adaptation Plan]
  E -->|Risk Acceptable| H[Institutionalize Decision]
```
Climate Vulnerability and Risk Assessments
Climate vulnerability and risk assessments are tools for understanding how climate change may impact electricity sector activities. These types of analysis can be scaled to meet the needs of the sector activity. They could be conducted in a very limited way for an individual component or could be expanded to an entire generating station, transmission corridor or other asset classes.
For a vulnerability and risk assessment to be most successful, it should be integrated into existing processes. For example, the assessment could be conducted as part of a dam safety review or as part of a planning exercise for a new transmission line. The overall process will be very similar in either case.
A generic framework for conducting a climate change risk assessment is shown below. In keeping with the scope and purpose of this guide, the greatest detail is given to the use of climate data and, therefore, to the vulnerability and risk assessment elements themselves. The previously mentioned guidance documents are good sources of sector-specific guidance on framing an assessment and understanding a system.
Clear and transparent documentation of a vulnerability and risk assessment is important. The whole process should be documented well enough that it could be easily repeated by an independent group and either come to the same conclusions or clearly understand how conclusions were drawn. This should include any assumptions (implicit and explicit), rationale for choices made and conclusions reached.
Establish the Scope and Context
The scope and context of the assessment should first be established. This step sets the foundation for the full climate vulnerability and risk assessment by defining what is being assessed, why it is being assessed, and how results will be used.
The starting point is a clear understanding of the application for the assessment, including the purpose, intended decisions, and the problem the assessment is intended to address. This step also clarifies what “unacceptable” looks like for the system by confirming performance requirements (if they exist), and vulnerability/risk evaluation criteria. Teams should use this information to choose a practical assessment approach, including procedures for testing potential solutions in later steps, and to set the depth of analysis to match the system and decision context (for example, a high-level screening assessment, a spreadsheet-based analysis, or a more detailed modelling exercise).
This step should be integrated into existing processes rather than recreating them. The problem statement, risk evaluation criteria, and critical thresholds are typically available through standard governance, processes, and documentation, such as risk registers, a Business Requirements Document (BRD), design basis documents, historical performance information, or input from system owners and subject matter experts.
Where relevant and current, previous assessment findings may be used to reduce effort in later steps by providing existing system descriptions, hazard screens, or vulnerability findings; however, teams should first confirm the earlier assessment fits the current objective and application, and confirm that key inputs (such as climate datasets, assumptions, and performance criteria) remain up to date. For example, a previous assessment could have been completed on a system for which a climate vulnerability assessment already exists. This may reduce the effort needed, and the objective of the new assessment may instead focus on improving the climate vulnerability or risk understanding of the system and strengthening adaptation planning and implementation.
Objectives and Evaluation Criteria
Teams should confirm or establish the objective of the assessment in practical terms (e.g., inform an investment decision, support a design change, prioritize maintenance actions, update emergency planning) and then define the criteria used to evaluate climate-related impacts. Objectives and evaluation criteria describe what the assessment is intended to achieve and how results will be interpreted to support decisions.
Criteria are most useful when they connect to decision needs and are stated in measurable or clearly described terms, such as safety impacts, service delivery impacts, environmental impacts, financial impacts, compliance impacts, and reputational impacts. Where possible, teams may also confirm critical thresholds and unacceptable outcomes, such as conditions that could lead to worker or public harm, loss of essential function, exceedance of design or operating limits, or prolonged outage beyond acceptable recovery time.
Typical inputs for this step may include the decision statement and intended use of results, existing corporate or project objectives, risk criteria used, performance requirements and operating limits and any existing risk register entries or business requirements.
Regulatory Context
Teams should identify applicable laws, permits, regulatory guidance, and engineering codes early so the assessment can support compliance while remaining decision useful. Regulatory drivers may exist and influence the scope of the assessment through requirements that can restrict the approach, specify minimum design elements, or impose other constraints.
Typical inputs for this step may include applicable regulatory requirements and code obligations for the system or activity, the approval or assessment processes being triggered, and any regulatory guidance on climate-related expectations.
Codes and standards often set minimum design or performance requirements; these minimums may be effective in the near term but may not fully account for future climate conditions, particularly where requirements are based on historical or retrospective design values. The climate assessment process presented in this standard supports forward-looking, prudent decisions to manage systems and assets, where the analysis often goes beyond minimum codes and requirements.
System, Geographic and Time Boundaries
System, geographic and time boundaries define where and when climate conditions are evaluated, and they help ensure climate information is fit for the system and decision being supported.
The system boundary should define the functional extent of the system and should identify whether the boundary is limited to the system's form (the physical system itself) or extends to its function (how it operates). It should also state whether the assessment considers a system, structure, or component (SSC) in isolation or whether interactions with other systems upstream or downstream are to be considered.
The geographic boundary should define the spatial representativeness and scale of influence of the system, recognizing that some decisions are driven by very local conditions (e.g., a single piece of equipment) while others require a broader scale (e.g., a watershed-level approach for a dam system). Geographic boundaries should reflect how the SSC functions in practice, including dependencies such as site access, staffing, supply chains, and requirements for human intervention during extreme events.
Time boundaries (time horizons) should reflect at least the expected lifespan of the system or the period during which climate impacts may become critical. Multiple time horizons or scenarios may be used to capture a range of plausible futures.
Assessing beyond the lifespan of the system may be useful if decisions made could result in lock-in or the exclusion of options in the future.
Climate analyses commonly use multi-decade periods of at least 30 years to reflect how climate statistics are derived and interpreted. The selected horizon(s) should also match the decision type, such as near-term operational planning versus long-term asset management.
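As a sketch of aligning decision horizons with multi-decade climate periods, the mapping below centres a 30-year window on each horizon. The horizon labels, centre years, and window width are illustrative assumptions, not values prescribed by this guide.

```python
# Hypothetical mapping of decision horizons to 30-year climate periods. The
# horizon labels, centre years, and window width are illustrative assumptions.

def climate_window(centre_year: int, width: int = 30) -> tuple[int, int]:
    """Return (start, end) years of a climate period centred on a horizon."""
    half = width // 2
    return (centre_year - half, centre_year + half - 1)

# Multi-decade horizons are often labelled by decade, e.g. "2030s", "2050s".
horizons = {"2030s": 2035, "2050s": 2055, "2080s": 2085}
windows = {label: climate_window(y) for label, y in horizons.items()}
print(windows)  # {'2030s': (2020, 2049), '2050s': (2040, 2069), '2080s': (2070, 2099)}
```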
Establish a Project Team
The size and composition of the project team should reflect the scope and complexity of the assessment. For simple or low-complexity systems, a single individual may undertake this work. For broader, higher-risk, or safety-critical systems, teams are typically cross-functional and may include project leads, system engineers, operations and maintenance staff, consultants, asset owners, risk practitioners, and other subject matter experts. Depending on scope, agreements, or regulatory obligations, participation may also include community or stakeholder representatives. The project team should include at least one person with practical knowledge of how climate conditions affect the type of systems being assessed (e.g., how heat, icing, wind, flooding, low water, or wildfire can change performance, access, or reliability). For complex assessments, teams may include climate science expertise to support data selection, scenario interpretation, and uncertainty communication.
Understand the System, Structure, Component(s)
This step builds a clear, practical understanding of the system being assessed, in the context of climate change. Teams should review how the system is designed, built, operated, and maintained, and how it influences other systems to deliver its intended service. This understanding helps the assessment stay focused on the parts of the system that matter most for safety, performance, and continuity, and it provides the system information needed to assess climate hazards and exposure. The level of detail should match the scope and objective confirmed previously. For a simple assessment, the system may be a single component. For a system-wide or higher-risk assessment, the system may include multiple critical structures, components, sub-components, and supporting services.
Typical inputs for this step include engineering drawings and design basis information, system specifications, operating procedures, maintenance strategies, condition and inspection reports, operational experience, and prior risk or event documentation (such as near misses).
Hazard and Exposure Assessment
This step identifies which climate hazards are relevant to the system and determines which critical aspects of the system are exposed to those hazards. Hazard and exposure work is often described together, but they serve different purposes. Hazard identification asks, “What climate-related events or long-term changes occur here?”, whereas exposure assessment asks, “Which parts of the SSC could be affected, either directly or indirectly, if that hazard occurs?”. Keeping exposure separate from vulnerability helps avoid jumping too early to conclusions about whether the system will fail or perform poorly; that deeper evaluation is developed in later steps using thresholds, system behaviour, and risk criteria.
In many assessments, teams identify relevant hazards first and then complete an exposure analysis for each hazard. In practice, hazard identification and exposure screening may also be iterative or completed in parallel, particularly when uncertainty exists about the hazard list, the system boundaries, or how it depends on supporting infrastructure. This step builds on scope, objectives, evaluation criteria, and boundaries. It produces documented inputs that can be carried forward consistently. Hazards and exposures may also be identified concurrently when developing an understanding of the system.
Details on selecting relevant climate hazards, completing an exposure analysis, and selecting appropriate indicators and data are presented below.
Hazard & Climate Driver Identification
Climate hazards that may affect the SSC within the defined geographic boundary and time horizon(s) should be documented. Hazard identification typically focuses on the site and regional setting (e.g., watershed conditions, coastal or riverine influences, terrain, and known weather patterns) and then uses available evidence to confirm which hazards are relevant to the SSC.
Teams should draw on internal and external sources, including previous climate assessments (especially those for the same SSC or nearby sites), regional studies, and sector guidance. Information gathered previously is also a key input, particularly design and operating thresholds, known sensitivities, historical disruptions, and “live experience” observations from station staff.
Hazard selection is often straightforward for most applications, and in some cases, hazards are partly predetermined by regulatory expectations, assessment types, or technology-specific programs. Where this occurs, required hazards may be used as a starting point and then supplemented based on SSC-specific conditions and dependencies.
Exposure Assessment
Aspects of the SSC that are exposed to each relevant climate hazard should be documented. Screened-out hazards or SSC aspects that do not require further evaluation may also be documented. Exposure may be direct (the hazard physically affects the SSC) or indirect (the SSC is not physically exposed but depends on other SSCs or services that are exposed).

::: {.callout-tip collapse="true"}
For example, equipment inside a building may not be directly exposed to wind-driven rain, but may still be indirectly exposed if power, controls, access routes, cooling water, or communications are impacted by the same hazard.
:::
A practical exposure screening approach typically includes identifying direct and indirect exposures for each hazard; grouping SSCs with similar exposure profiles to reduce repetitive screening where appropriate; using system information (design parameters, technical specifications, condition, and OPEX) to confirm exposure pathways; and documenting a preliminary impact concern where exposure exists. Where no exposure pathway exists, the SSC aspect (or hazard interaction) can be screened out. Documentation of screened-out exposure pathways, and the rationale for why they were excluded, may be required or considered best practice depending on the purpose of the assessment (e.g., assessments completed for Impact Assessments). During exposure screening, teams may identify additional hazards or dependencies that were not captured initially; where this occurs, the hazard and exposure assessment should be updated. Hazard and exposure may also be revisited at the initial stages of vulnerability analysis.
Early impact notes are optional, but they often help clarify why an interaction matters and provide a useful bridge into vulnerability and risk analysis.
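The screening approach above can be sketched as a simple table of hazard–aspect interactions, each recording an exposure pathway (or its absence) and a rationale or early impact note. All entries below are hypothetical examples, not findings from this guide.

```python
# Illustrative exposure screen: each (hazard, SSC aspect) interaction records
# a pathway ("direct", "indirect", or None if screened out) and a rationale.
# All rows are hypothetical examples.

screen = [
    ("extreme heat", "transformer", "direct", "ambient derating above rating"),
    ("wind-driven rain", "indoor switchgear", "indirect",
     "depends on exposed access routes and power supply"),
    ("riverine flooding", "hilltop comms tower", None,
     "no plausible exposure pathway"),
]

carried_forward = [row for row in screen if row[2] is not None]
screened_out = [row for row in screen if row[2] is None]

print(len(carried_forward), "interactions carried into vulnerability analysis")
print(len(screened_out), "screened out, with rationale documented")
```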
Selecting Appropriate Indicators
One or more climate indicators should be selected to represent each relevant hazard in a consistent, measurable way. Indicators translate a hazard into information that can be screened, compared, and, where needed, quantified. Indicators may include monthly or annual aggregations (e.g., annual average temperature or annual maximum temperature), indices (e.g., number of days above 30℃), or threshold/probabilistic or return-period indicators (e.g., 1:100-year rainfall or probability of exceedance). The indicator set is typically built from the hazard list and the exposure pathways identified earlier, and it is refined to match the depth of assessment.

::: {.callout-tip collapse="true" title="Screening"}
Screening-level assessments typically use indicators that are widely understood and easy to interpret (e.g., the presence of heat spells or the temperature on the hottest day of the year can serve as indicators for extreme heat). Detailed analysis may require more specific variables, higher-resolution datasets, or time series to evaluate thresholds and operational constraints.
:::
Indicator selection works best when it focuses on climate drivers linked to how the SSC performs. Some common hazards represent multiple drivers and may need to be expressed using a combination of indicators. For example, "thunderstorm" is a multi-hazard that may involve heavy precipitation and strong winds, with the potential for lightning or hail. Climate models often provide precipitation and wind variables directly, but not a single "thunderstorm" variable. In these cases, the assessment should document which indicators are used to represent the hazard and whether the drivers can be evaluated separately or whether co-occurrence is important for the SSC.
More than one climate indicator is often selected for the same hazard because different SSC aspects may interact with different parts of that hazard. For example, one component may be susceptible to extreme heat (e.g., maximum temperature), while another may be susceptible to longer-duration heat conditions (e.g., cooling degree days or heat spells). In this example, both indicators for the same climate hazard would need to be assessed. Selecting multiple indicators for a single hazard is typically appropriate when it reflects distinct exposure pathways and performance concerns.
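The three indicator types named above can be sketched from a daily temperature series, as below. The data, thresholds, and the crude empirical return-level estimate are purely illustrative.

```python
# Minimal sketch of three indicator types, computed from a short synthetic
# daily-maximum-temperature series. Data and thresholds are illustrative only.

daily_tmax = [22.0, 31.5, 29.9, 30.1, 33.4, 28.0, 30.0]  # °C, illustrative

annual_max = max(daily_tmax)                            # aggregation indicator
days_above_30 = sum(1 for t in daily_tmax if t > 30.0)  # index indicator

# Return-period style indicator from a (hypothetical) sample of annual maxima:
annual_maxima = sorted([28.1, 30.4, 29.7, 33.2, 31.0,
                        34.5, 30.9, 32.8, 29.5, 35.1])
ten_year_level = annual_maxima[-1]  # crude empirical 1-in-10-year level

print(annual_max, days_above_30, ten_year_level)
```

In practice, return levels are fitted with extreme-value statistics rather than taken as the largest observed value; the point here is only the distinction between aggregation, index, and return-period indicators.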
Performance Criteria
Performance criteria should be defined for each selected climate indicator so the assessment can interpret climate conditions in SSC terms. Performance criteria are the benchmarks used to judge whether the SSC remains reliable and fit for service under a given climate stress. Criteria are most useful when they connect the indicator to an operational limit, a design constraint, a safety requirement, or an acceptable level of degraded performance. This sub-step uses information gathered during the SSC review and exposure assessment, including design and operating parameters, dependencies, known sensitivities, condition information, and operational experience.
Performance criteria may include a critical threshold that defines the point at which performance becomes unacceptable when exposed to a climate indicator. Where practical, teams may also define a margin (or trigger level) that signals the SSC is approaching a critical threshold and may warrant monitoring, operational changes, or planned upgrades. These thresholds and margins become the reference points used later when comparing projected climate conditions in the vulnerability assessment and risk assessment.
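The critical-threshold and trigger-level idea can be sketched as a simple classification of a projected indicator value. The numeric limits below are hypothetical stand-ins for real design or operating values.

```python
# Hypothetical threshold/margin classification for one climate indicator.
# Both limits are illustrative; real values come from design or operating data.

CRITICAL_THRESHOLD = 35.0  # e.g. °C above which performance is unacceptable
TRIGGER_LEVEL = 32.0       # margin signalling the SSC is approaching the limit

def classify(projected: float) -> str:
    """Classify a projected indicator value against margin and threshold."""
    if projected >= CRITICAL_THRESHOLD:
        return "unacceptable"
    if projected >= TRIGGER_LEVEL:
        return "monitor / plan upgrades"
    return "acceptable"

print(classify(30.0), "|", classify(33.0), "|", classify(36.0))
```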
Two approaches are commonly used:
A performance-based threshold approach uses established design or functional limits where these are available and meaningful, and it helps keep results actionable. For climate assessments, it is important to represent unacceptable levels of performance or reliability.
A risk-informed approach is often used when a clear performance threshold does not exist or when partial functionality is acceptable. In those cases, teams define what level of performance reduction is acceptable (e.g., allowable derating) and for what duration, consistent with the decision context and the organization's risk tolerance. This approach relies on SMEs' experience with the SSC or supporting literature to inform the performance criteria selection.
Many assessments use both approaches because some indicators have well-defined limits (e.g., extreme precipitation thresholds in certain designs) while others may not (e.g., changes in freeze-thaw cycles).
Gather and Validate Climate Data
This sub-step gathers and validates the climate data needed to evaluate selected indicators over the relevant geographic area and time horizon(s). Data selection supports two needs: representing historical conditions to provide context and support reasonableness checks, and representing future conditions to test how hazards may change over the SSC life.
The dataset should cover a broad enough range of plausible climate conditions to meaningfully test the SSC against the performance criteria.
For more information on climate data and projections, refer to the overview on climate datasets.
Assessing Data Suitability
Data suitability should be assessed before analysis begins to ensure that:
- The range of climate drivers is broad enough to encompass all plausible conditions that could impact the SSC. This should include collecting climate data for all relevant hazards and climate indicators identified, under one or more climate scenarios (e.g., SSP2-4.5, SSP5-8.5). Refer to Core Concepts: Climate Data and the Climate Data Decision Tree for more details on climate data.
- Data is collected for all relevant time horizons (e.g., 2030s, 2050s, 2080s) to represent the time boundaries identified in System, Geographic and Time Boundaries. The climate data time horizons selected should capture the life of the SSC and/or should align with the purpose and objective of the assessment.
- One or more appropriate future value or range criteria are selected to represent the future climate conditions against which the SSC performance criteria will be compared for a given climate scenario and time horizon. This may be a consistent future value or range criterion across climate hazards and indicators, or it may vary depending on the specific climate hazard exposure interactions and their criticality to SSC performance or operations.
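A simple completeness check of the assembled dataset can catch gaps before analysis begins. The indicator, scenario, and horizon labels below are example values, not prescribed choices.

```python
# Illustrative suitability check: confirm climate data exists for every
# required (indicator, scenario, horizon) combination. Labels are examples.

required_scenarios = ["SSP2-4.5", "SSP5-8.5"]
required_horizons = ["2030s", "2050s", "2080s"]
indicators = ["annual_max_temp", "days_above_30C"]

needed = {(i, s, h) for i in indicators
          for s in required_scenarios for h in required_horizons}

collected = set(needed)
collected.discard(("days_above_30C", "SSP5-8.5", "2080s"))  # simulate one gap

missing = needed - collected
print(missing)  # combinations to obtain before proceeding
```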
Climate Vulnerability Assessment
This step identifies how vulnerable the critical aspects of the SSC are to the climate hazards and indicators identified in the previous step. In some climate assessments, the term vulnerability is erroneously used interchangeably with exposure or risk, but the concepts are different (see Core Concepts - Understanding Adaptation). Exposure describes whether the SSC interacts with a climate driver (directly or indirectly). Vulnerability describes how the SSC performs when exposed, based on its sensitivity (how strongly it responds to a stress) and its adaptive capacity (how well it can cope using measures that already exist). Risk adds likelihood or probability and is addressed later. The core question is: "How sensitive is the SSC to the relevant climate drivers, and how much existing capacity does it have to cope under those drivers?"
The Intergovernmental Panel on Climate Change (IPCC) adopts the following definition of vulnerability:

\[ V = f(S, AC) \]

where \(V\) is vulnerability, \(S\) is sensitivity, and \(AC\) is adaptive capacity.
Vulnerability assessments should be as quantitative as practical, given available information and the decision context. This step uses outputs from the exposure and hazard assessment to estimate vulnerability for each critical SSC aspect and exposure interaction, and to identify what follow-up actions or decisions are needed. The following sections provide detail on how these concepts are completed and used to estimate climate vulnerability.
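One simple, hypothetical instantiation of \(V = f(S, AC)\) is a scored reduction: sensitivity on an ordinal scale, reduced by an adaptive-capacity score and floored at the minimum level. A real assessment would define the function \(f\) and the scales to suit its own evaluation criteria.

```python
# Hypothetical instantiation of V = f(S, AC): sensitivity scored 0-5 is
# reduced by an adaptive-capacity score, floored at 1. The scales and the
# subtraction rule are illustrative assumptions, not prescribed by the guide.

def vulnerability(sensitivity: int, adaptive_capacity: int) -> int:
    """Both inputs scored 0-5; higher sensitivity means a stronger response,
    higher adaptive capacity means more existing coping measures."""
    return max(1, sensitivity - adaptive_capacity)

print(vulnerability(5, 1))  # 4: highly sensitive, little existing capacity
print(vulnerability(3, 3))  # 1: existing capacity offsets sensitivity
```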
Sensitivity Assessment
Sensitivity should be evaluated by comparing projected climate conditions (for each selected indicator, scenario and time horizon) against the SSC performance criteria established in Performance Criteria. A sensitivity assessment analyzes how susceptible the critical aspects of the SSC are to a particular climatic driver.
Sensitivity may be estimated using different industry methods depending on data availability, complexity, and purpose. Common approaches may include:
- Simple design-margin-based screening (sometimes described as "bounding" analysis). This approach determines whether the design envelope covers the range of potential climate conditions and assumes sensitivity is not an issue if the SSC is designed to handle those requirements;
- Manufacturer or design data, where available, used to relate SSC performance to climate drivers;
- Fragility-style relationships that link stress levels to probability of failure, developed through empirical data or laboratory testing;
- Stress-testing methods that use modelling or expert-judgement approaches to evaluate system response under increasing stress from climate drivers.
::: {#stress .callout collapse="true" title="Conducting a Stress Test"}

### Conducting a Stress Test

A stress test systematically alters climate variables or other external drivers to assess how an SSC responds under increasingly stressful conditions. The goal is to determine where performance criteria are no longer met. Stress testing can be performed at various levels of complexity, depending on available data, models, and the confidence required in the results.
Types of Stress Tests
Stress testing is an industry-recognized method that can be used to perform quantitative and semi-quantitative sensitivity analyses. Stress testing can also be used to estimate SSC vulnerability, accounting for existing adaptive capacity measures, as well as in adaptation measure analysis and selection. The type of stress test performed will vary depending on the available data. Different stress test methods are presented below depending on whether quantitative or semi-quantitative data is available.
Stress Test Example
The figure below illustrates a hypothetical single-metric stress test where a climate driver is systematically increased – here using an ensemble of climate model data – to evaluate SSC response, showing progressive performance degradation until a threshold is reached beyond which the SSC no longer performs acceptably. In some cases, equivalent stress-test information (e.g., performance or fragility curves) may already exist from manufacturers or other sources, provided it spans a sufficient range of conditions.
The guidance describes three stress-test levels mainly for clear communication and suggests applying them sequentially (increasing complexity) until confidence is adequate, though analysts can also select any level upfront based on available, trusted models or data. In practice, hybrid/continuous levels of complexity are common. The central requirement is having enough evidence to confidently assess SSC sensitivity or vulnerability, while exercising caution when extrapolating beyond the conditions for which the model or data were developed.
The stress test shown above could be completed with multiple climate hazards or multiple performance measures, so long as there is a model or method to map those drivers to performance. For multiple climate hazards, this would be done by varying combinations of two inputs to determine the system response. For example, an external powerline could be subject to loading from both wind and ice accretion. Combinations of both climate hazards could be tested in a loading model to see how they interact to affect line loading.
:::
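The single-metric stress test described above can be sketched as follows: a climate driver is increased step by step until the performance criterion is no longer met. The linear derating model and all numeric values are hypothetical stand-ins for a calibrated SSC model.

```python
# Hypothetical single-metric stress test: increase a climate driver until the
# performance criterion fails. The derating model is an illustrative stand-in.

def performance(temp_c: float) -> float:
    """Illustrative derating: full output up to 30 °C, then 2 % lost per °C."""
    return max(0.0, 1.0 - 0.02 * max(0.0, temp_c - 30.0))

MIN_ACCEPTABLE = 0.80  # performance criterion (fraction of rated output)

breaking_point = None
for temp in range(20, 61):  # stress the driver across a plausible range, °C
    if performance(float(temp)) < MIN_ACCEPTABLE:
        breaking_point = float(temp)
        break

print(breaking_point)  # first tested temperature at which the criterion fails
```

A two-hazard version (e.g., wind plus ice accretion on a powerline, as noted above) would iterate over combinations of both drivers against a loading model instead of a single loop.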
This guide describes two approaches for sensitivity assessment: a preferred quantitative approach which uses data to link system response to a climate driver and a semi-quantitative approach where data is limited, and expert judgement needs to be considered.
Modifications to the sensitivity of the system due to operational procedures or temporary modifications are considered adaptive capacity and are reflected in the overall sensitivity of the system after adaptive capacity has been considered.
Option 1: Quantitative Data Approaches
The quantitative approach allows for the visualization of the SSC sensitivity using lines or other functions that map system performance against climate indicators under one or more climate scenarios. This is done using quantitative data (measured or modelled SSC performance) to map SSC performance under a specific climate driver or indicator.
The type of quantitative data (simple or complex modelling) used for this approach will depend on the data available at the time of the assessment, the level of analysis required to satisfy the purpose of the assessment, and the complexity of the climate interaction with the SSC. However, the outputs should include some quantified expression of sensitivity and should be calibrated/validated to ensure the approach is fit for simulating the SSC response to the climate condition. This validation may require multiple experts, particularly in situations where no formal model exists or it is difficult to capture SSC performance when exposed to the climate driver.
These relationships may be linear, non-linear, or exhibit step changes (e.g., systems with automatic shutdowns when thresholds are exceeded). While simple examples are one-dimensional, real-world cases may require multi-dimensional analysis with multiple climate hazards, climate scenarios and performance criteria.
This approach should use performance data provided by a manufacturer or another credible entity, information from laboratory testing or the results from empirical or numerical stress testing to establish how the SSC will respond to climate drivers.
The range of conditions in both the quantitative and semi-quantitative sensitivity approaches should be plausible (i.e., encompass all historic and potential future conditions for all scenarios). The goal of the vulnerability assessment is simply to understand the SSC response under various circumstances and to identify the circumstances under which the SSC fails to perform. It is therefore likely that the vulnerability assessment will show the breaking point of the system, even if that point is unlikely to occur in reality. The likelihood/probability of the event occurring is assessed in the risk analysis.
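The quantitative mapping described above can be sketched by interpolating SSC performance between a few manufacturer-style data points. All data points below are hypothetical.

```python
# Sketch of mapping SSC performance against a climate indicator from a few
# manufacturer-style points via linear interpolation. Points are hypothetical.

points = [(25.0, 1.00), (35.0, 0.95), (45.0, 0.80), (55.0, 0.50)]  # (°C, output)

def performance_at(temp: float) -> float:
    """Linearly interpolate output fraction at a given temperature."""
    if temp <= points[0][0]:
        return points[0][1]
    if temp >= points[-1][0]:
        return points[-1][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= temp <= x1:
            return y0 + (y1 - y0) * (temp - x0) / (x1 - x0)

print(performance_at(40.0))  # interpolated between 0.95 and 0.80
```

A non-linear or step-change response (e.g., an automatic shutdown above a threshold) would simply use a different mapping function in place of the interpolation.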
Option 2: Semi-Quantitative Approach
There are situations in which the quantitative Option 1 approach is not possible due to a lack of data or modelling, limitations in climate projection models (e.g., hazards such as tornadoes not captured by models), or complex interactions in the SSC that make modelling impractical. In these situations, a semi-quantitative approach to sensitivity can be used. This approach aims to achieve the same objective as Option 1, where the performance of the system is described (qualitatively) or mapped in a similar way to the quantitative approach, but the performance is determined through OPEX and expert judgement to develop representative proxies.
It is important to note that since this approach requires qualitative inputs, higher uncertainty is likely to exist in the estimation of sensitivity. As a result, this approach may be best implemented as an initial screening approach or in higher-level assessments (e.g., portfolio assessments), and it should be refined once more data is available.
This approach involves identifying proxies for the SSC that can be categorically mapped to the relevant climate indicators. Proxy selection relies on knowledgeable SSC SMEs who understand general SSC behaviour under the relevant climate hazards being assessed. Proxies are often thought of as variables that increase SSC sensitivity when exposed to a climate indicator (e.g., age of the SSC). To identify relevant sensitivity indicators, SSC SMEs will review technical documents, operational reports, and prior system reviews. One or multiple proxies may be selected and assessed for each relevant climate indicator, depending on the exposure assessment results.
Next, climate data that has been collected is used to assess SSC sensitivity across the proxies for each climate indicator. Similar to the Option 1 approach, the climate driver being considered is sampled broadly enough to capture any plausible conditions. The results of this analysis should include some expression or level of sensitivity for each proxy. This is often represented with a categorical ranking of SSC performance. The categorical ranking process often involves developing a tiered scale.
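The tiered categorical scale described above can be sketched as a simple lookup for one proxy. The proxy (asset age) and break points are hypothetical; in practice they would be set by SSC SMEs from OPEX and technical documentation.

```python
# Illustrative tiered scale for the semi-quantitative approach: a single proxy
# (asset age) mapped to a categorical sensitivity level. Break points are
# hypothetical and would be set by SSC SMEs.

def sensitivity_tier(asset_age_years: float) -> str:
    """Map the proxy value onto a low/medium/high categorical ranking."""
    if asset_age_years < 15:
        return "low"
    if asset_age_years < 30:
        return "medium"
    return "high"

print(sensitivity_tier(8), sensitivity_tier(22), sensitivity_tier(40))
```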
This example highlights how higher uncertainty may exist in this approach, since there is subjectivity involved in selecting proxies that represent the performance criteria of the SSC, as well as subjectivity in selecting the sensitivity criteria level (low, medium, high). As such, this approach should be used when data limitations exist and should be revisited once further data is available to refine the assessment findings.
Since expert judgment can be influenced by bias or incomplete system knowledge, the process should include multiple experts, independent review, and thorough documentation of assumptions and decisions to ensure transparency and repeatability.
Adaptive Capacity
The existing adaptive capacity that describes the SSC’s current ability to cope with climate stresses using measures that already exist should be documented. It differs from adaptation measures, which are new actions implemented in response to unacceptable vulnerability or risk. Existing adaptive capacity includes SSC temporary modifications or operational procedures available that help maintain acceptable performance during adverse conditions. If/when new adaptation measures are implemented, they become part of the SSC’s existing adaptive capacity for future climate assessments.
Identifying the existing adaptive capacity of an SSC will involve technical expertise and an overall operational understanding of the SSC. Some systems have existing temporary modifications or operational procedures that allow SSCs to adapt more readily than others, depending on factors such as financial resources, regulatory environments, and organizational flexibility. Adaptive capacity can offset the primary sensitivity of an SSC by ensuring there are existing options to increase resilience and reliability, and the institutional capacity to execute those options.
Estimating Adaptive Capacity
Adaptive capacity should be estimated by critically assessing which of the key considerations (presented above) are applicable and currently exist for the SSC being assessed, in the context of their overall ability to reduce sensitivity.
It may also be useful to assess the sensitivity of the temporary modification or operational procedure itself, to understand its reliability or performance when exposed to the same climate interaction.
Once key adaptive capacity considerations are identified for the SSC, the influence of these adaptive capacity considerations can be estimated to understand how they may reduce sensitivity when exposed to climate hazards and indicators. This process can be subjective and difficult to estimate, where biases can overestimate the perceived adaptive capacity of an SSC to reduce sensitivity. For these reasons, it is important to be objective and challenge findings to ensure adaptive capacity is not being overly credited for reducing a climate vulnerability.
This may involve weighing each of the key considerations available in terms of importance or relevance to reducing the SSC sensitivity. At the end of this evaluation, an expression of reduction to the SSC sensitivity should be established for the temporary modification or operational procedure assessed. This may be informed by quantitative inputs when measured or modelled data exists for the adaptive capacity, as well as semi-quantitative data, based on knowledgeable system SMEs who understand general SSC behavior and the efficacy of the temporary modification or operational procedure assessed.
If further refinement is needed to understand how the temporary modification or operational procedure will reduce SSC sensitivity, key considerations can be mapped or categorically evaluated to refine understanding. Criteria definitions can be developed for each key consideration to describe how it will influence sensitivity, as well as its relevance or importance. For example, for the technical adaptive capacity of cooling fans or a cooling system, a low adaptive capacity may represent a temperature reduction to the SSC of 0-2 ℃, whereas a high adaptive capacity would represent a temperature reduction of 8-10 ℃.
It is also important to consider the calculation method by which the adaptive capacity estimation is completed. Some key considerations identified may be more critical to reducing the overall SSC sensitivity, and weighting the key considerations may be completed to reflect that importance. For example, it may be determined that the cooling fans or system have a high adaptive capacity to reduce SSC temperatures to acceptable levels, but a low adaptive capacity to respond to the climate event (e.g., a long response time). This would mean the ability to cool the SSC exists; however, the SSC sensitivity may remain the same, since the adaptive capacity cannot cool the SSC within an acceptable amount of time to avoid performance entering unacceptable levels.
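The weighting idea above can be illustrated with a simple weighted score. The considerations, scores, and weights below are entirely hypothetical placeholders; a real assessment would derive them from SME judgment and document the rationale.

```python
# Hypothetical weighted scoring of adaptive capacity considerations.
# Scores (0-1, higher = more capacity) and weights are illustrative
# placeholders, not values from any guidance document.

considerations = {
    # consideration: (score 0-1, weight)
    "cooling_effectiveness": (0.9, 0.5),   # can reach acceptable temperatures
    "response_time":         (0.2, 0.3),   # slow to activate when needed
    "institutional_support": (0.6, 0.2),   # resources and procedures in place
}

def adaptive_capacity_score(items):
    """Weighted average of consideration scores."""
    total_weight = sum(w for _, w in items.values())
    return sum(s * w for s, w in items.values()) / total_weight

score = adaptive_capacity_score(considerations)
print(round(score, 2))
```

Note how the low response-time score drags down the overall capacity even though cooling effectiveness is high, mirroring the cooling-fan example in the text.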
The outcomes of the adaptive capacity estimations can be integrated with the sensitivity analysis results to estimate climate vulnerability.
Definitions of adaptive capacity vary across guidance documents, with each source framing the concept differently based on its purpose and context. For more information on adaptive capacity, or to compare an assessment against notable external industry standards and guidance on this topic, consult the previously mentioned guidance documents.
Assess Vulnerability
The overall climate vulnerability should be estimated where sensitivity and adaptive capacity results can be combined to establish vulnerability. How this is completed can vary depending on the approaches taken to estimate SSC sensitivity. The approach for estimating vulnerability should fit the need of the overall purpose or objective of the assessment.
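One simple way to combine sensitivity and adaptive capacity, when both are expressed categorically, is a lookup table. The table below is an illustrative assumption only; the actual combination rule should fit the purpose of the assessment.

```python
# Sketch: combining categorical sensitivity with categorical adaptive
# capacity to estimate vulnerability. The lookup table is illustrative,
# not a prescribed standard.

VULNERABILITY = {
    # (sensitivity, adaptive_capacity): vulnerability
    ("high", "low"):      "high",
    ("high", "medium"):   "high",
    ("high", "high"):     "medium",
    ("medium", "low"):    "medium",
    ("medium", "medium"): "medium",
    ("medium", "high"):   "low",
    ("low", "low"):       "low",
    ("low", "medium"):    "low",
    ("low", "high"):      "low",
}

def vulnerability(sensitivity: str, adaptive_capacity: str) -> str:
    return VULNERABILITY[(sensitivity, adaptive_capacity)]

# higher adaptive capacity reduces the resulting vulnerability
print(vulnerability("high", "high"))
```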
At the conclusion of the vulnerability assessment, there should be a clear and well-documented understanding of how sensitive the SSC is to relevant climate hazards and indicators. This information should be institutionalized – formally recorded – along with a determination of whether the identified vulnerability is acceptable (i.e., not expected to compromise performance) or if further action is required. If the vulnerability is low, then there may be justification to accept the situation and not proceed further or implement a monitoring plan to determine if the situation is changing.
Vulnerability analysis does not assess the likelihood of conditions but focuses on system response should those conditions occur.
Next Steps for Unacceptable Vulnerability
If the assessment identifies unacceptable performance under certain climate conditions, several pathways are available. The choice of next steps should be guided by the specific context and informed discretion:
Proceed to Climate Risk Assessment: Many guidance documents, including those used for regulatory purposes (such as SACC CCR during an Impact Assessment), recommend moving from vulnerability assessment to a risk assessment. This next step adds complexity by estimating the probability that climate hazards will reach levels that could impact performance and helps prioritize adaptation actions. Note that risk assessment may require specialized climate expertise and can be resource intensive.
Consider Adaptation Options Directly: In some cases, it may be more practical to address the identified vulnerability with adaptation measures, rather than conducting a full risk assessment. The CSA S910.1 allows for the direct evaluation of adaptation options when the cost or complexity of a risk assessment is not justified by the potential impact.
Document and Institutionalize Findings: Regardless of the chosen pathway, it is essential to document the assessment results, the acceptability of the vulnerability, and the rationale for any further actions. This ensures transparency, repeatability, and accountability in decision-making.
Not all vulnerabilities require a risk assessment; the decision should be based on the potential impact, cost, and complexity of the issue. Adaptation options can often be implemented directly if they are clearly justified by the vulnerability assessment. For high-impact or complex vulnerabilities, a more detailed risk assessment may be warranted to guide effective prioritization and resource allocation.
Following these steps ensures that climate vulnerability assessments lead to informed, practical, and cost-effective decisions that enhance the resilience of the organization's assets and operations.
Climate Risk Assessment
This optional step estimates climate risk for each relevant climate interaction with the SSC by combining the climate hazard (what could occur), exposure (what parts of the SSC are in direct or indirect contact with the hazard), and vulnerability (how the SSC responds, based on sensitivity and existing capacity to cope). Risk can be described using the IPCC framing and formula:
\[ R = f(H,E,V) \]
where Risk (\(R\)); Hazard (\(H\)); Exposure (\(E\)); and Vulnerability (\(V\)).
This risk concept is beneficial for understanding how different key factors influence the overall risk of an SSC and how modification of one of these factors can reduce the overall risk. See Core Concepts: Understanding Adaptation for more details. It can also be directly mapped to the traditional definition of risk:
\[ R = L \times C \]
where Risk (\(R\)); Likelihood (\(L\)); and Consequence (\(C\)).
In this framing, likelihood is driven by the hazard (for selected scenarios, time horizons, and indicator values), while consequence is driven by the exposure and vulnerability (how the SSC performs if the event occurs). The relationship between the IPCC concept of risk and traditional risk definitions is discussed in Core Concepts: Understanding Adaptation. The relationship between likelihood and consequence to estimate risk is typically visually represented with a risk matrix or heat map and can vary depending on the type of risk assessment that is completed.
Climate risk assessment is often challenging because both likelihood and consequence are uncertain, and uncertainty generally increases further into the future. For that reason, risk may be better represented as a range of results across scenarios and time horizons rather than a single value, and it may be impractical to represent climate risk as a single point on a traditional probability-consequence matrix. The objective of the assessment should be to evaluate multiple climate scenarios to understand a range of possible risks for any given climate interaction with an SSC. Accurately assessing climate risk may require a nuanced understanding of the climate data and specialized climate science expertise to complement engineering teams.
This step supports forward-looking decisions by documenting how likelihood and consequence are estimated, showing how results change across scenarios, and clearly describing confidence and uncertainty so decision-makers can understand what is known, what is uncertain, and what could change the outcome.
The more climate scenarios and time horizons needed for the assessment, the more likelihood values will need to be calculated for each climate indicator. For example, if the SSP2-4.5 and SSP5-8.5 climate scenarios are selected under two future time horizons (the 2050s and 2080s time periods), then four distinct likelihood values will need to be calculated and compared in the analysis. The time window selected should span at least 30 years to capture climate variability, but does not need to extend beyond the SSC design life.
Select the Future Value or Range Criteria: For each scenario, determine the historic and future value or range for which likelihood will be calculated – that is, the future value or range of possibilities within the climate scenario ensembles against which statistical analysis will be performed. Percentiles could also be used to add some conservatism, at the cost of sample size. For example, the number of 75th-percentile-or-greater events that exceed the critical threshold could be compared to the total number of events that exceed the 75th percentile of the climate projection data.
Assess Probability: For each scenario, determine the probability of relevant hazards and indicators. This is completed through statistical analysis to calculate the probability of the future value or range exceeding the selected threshold (performance criteria) established previously. For simple risk assessments, the probability should be calculated following the classical formula, or other established statistical methods, for determining the probability of event \(X\) occurring:
\[ L(X) = \frac{\text{Number of times X occurs}}{\text{Total Number of Events}} \]
This formula is applied with respect to a pre-defined threshold. In the simple climate risk analysis scenario, where a critical threshold has been identified, this should be the number of times the climate data in a 30+ year window exceeds the critical threshold divided by the total number of data points. For complex risk assessments, this probability calculation may vary.
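The classical formula applied to a 30-year window can be sketched directly. The annual values and the 35 ℃ threshold below are made-up illustration data, not climate projections.

```python
# Classical exceedance probability over a 30-year window of annual values.
# The temperatures and the 35 degC threshold are illustrative data only.

def exceedance_probability(values, threshold):
    """P(X > threshold) estimated as exceedance count / sample size."""
    exceed = sum(1 for v in values if v > threshold)
    return exceed / len(values)

# 30 synthetic "annual maximum temperature" values
annual_max_temp = [33.1, 34.8, 36.2, 32.9, 35.5,
                   34.1, 36.8, 33.7, 35.2, 34.4] * 3
p = exceedance_probability(annual_max_temp, 35.0)
print(p)  # fraction of years exceeding the critical threshold
```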
The classical probability formula assumes the data are stationary, meaning there is no long-term trend. This assumption does not strictly hold for climate data, but it can be a useful approximation. Climate data should be compared in 30-40 year windows, where stationarity assumptions are more valid, rather than over longer periods. Alternatively, probabilities may be estimated from fitted probability distributions, some of which can be applied to non-stationary data. For screening and other high-level assessments, the probability may be reasonably estimated visually.
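Computing the exceedance probability in rolling 30-year windows, rather than over the full record, is one way to respect the stationarity caveat above. The data here are synthetic annual values with an artificial upward trend, for illustration only.

```python
# Sketch: exceedance probability in rolling 30-year windows, avoiding the
# assumption of stationarity over the full record. Data are synthetic.

def rolling_exceedance(values, threshold, window=30):
    """Exceedance probability for each consecutive window of `window` years."""
    return [
        sum(1 for v in values[i:i + window] if v > threshold) / window
        for i in range(len(values) - window + 1)
    ]

# 60 synthetic years: a 5-year cycle plus a slow warming trend
years = [30.0 + (i % 5) + i * 0.05 for i in range(60)]
probs = rolling_exceedance(years, 33.0)
print(probs[0], probs[-1])  # later windows exceed the threshold more often
```

The drift between early and late windows is exactly the non-stationary signal a single full-record probability would smear out.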
Mapping Probability: Once the probabilities of a climate indicator exceeding the performance criteria are calculated, map each scenario probability to a likelihood scale or criteria. This is done so that risk can be estimated by combining the likelihood level with the consequence level. This will result in a likelihood value for each scenario and time horizon evaluated for a given climate indicator.
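The probability-to-likelihood mapping can be a simple banded lookup. The five-level scale and band boundaries below are illustrative assumptions; they should come from the organization's own risk framework.

```python
# Hypothetical mapping of exceedance probabilities to a 5-level likelihood
# scale. The band boundaries are illustrative placeholders only.

def likelihood_level(p: float) -> int:
    """Return a likelihood level (1-5) for an exceedance probability."""
    bands = [(0.05, 1), (0.20, 2), (0.50, 3), (0.80, 4), (1.01, 5)]
    for upper, level in bands:
        if p < upper:
            return level

# one likelihood level per scenario / time-horizon probability
scenario_probs = {"SSP2-4.5 2050s": 0.12, "SSP5-8.5 2080s": 0.62}
print({k: likelihood_level(p) for k, p in scenario_probs.items()})
```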
Risk Quantification
The final risk (risk level) is determined quantitatively as the product of the likelihood and the consequence (impact), or may be qualitatively evaluated using the likelihood and consequence results. This is commonly mapped to a risk matrix. Risks should be calculated for each climate interaction, scenario, and time horizon included in the assessment. This may result in multiple risk values for a given interaction if multiple climate scenarios and time horizons are considered.
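A minimal 5×5 risk matrix sketch follows. Both scales and the low/medium/high bucket boundaries are illustrative assumptions, not taken from any particular standard.

```python
# Sketch of a 5x5 risk matrix: risk score as the product of likelihood
# and consequence levels, bucketed into categories. Bucket boundaries
# are illustrative only.

def risk_rating(likelihood: int, consequence: int) -> str:
    """Combine 1-5 likelihood and consequence levels into a risk category."""
    score = likelihood * consequence
    if score <= 4:
        return "low"
    elif score <= 12:
        return "medium"
    else:
        return "high"

# one risk value per scenario / time horizon for a given climate interaction
print(risk_rating(2, 3), risk_rating(4, 4))
```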
When risk, likelihood, and consequence are determined semi-quantitatively, care should be taken to ensure the resulting analysis is meaningful. Categorical rankings tend to cluster around middle values and are often selected subjectively. In these situations, ensure values are unbiased, founded on information, and granular enough to suit the needs of the assessment.
Level of Concern
Recognizing uncertainty and understanding the confidence in the analysis is an important part of a climate risk assessment. There is natural variability in the climate system, and human actions will determine the emissions pathway, altering future climate outcomes. In addition, not all climate model parameters or vulnerability and risk assessment methods offer the same degree of confidence. As a result, it is important to integrate the level of risk together with the uncertainty and confidence in the analysis when finalizing the assessment. Identifying the level of concern allows for the consideration of both risk and uncertainty.
Key Steps for Determining Level of Concern
Risk: Use the risk level from the risk assessment to plot the position on the vertical axis of a Level of Concern diagram for each climate interaction assessed. The vertical axis represents the amount of risk, as determined by the risk assessment.
Where multiple climate risks exist for a single climate interaction (e.g., multiple climate scenarios and time horizons assessed), it is often best practice to select one conservative future risk level to simplify the analysis, depending on the objective of the assessment. The analytical uncertainty axis will guard against overly conservative and costly adaptation actions. The future risk level mapped for the climate interaction should be documented and justified to fit the purpose of the assessment.
Analytical Uncertainty: Assign the analytical uncertainty on the horizontal axis. This value reflects the uncertainty in climate projections, the confidence in the ability of climate models to represent relevant hazards, and the confidence in the vulnerability assessment process. This step is inherently subjective unless there is detailed knowledge of the uncertainties involved. Information on uncertainty and confidence can be found here.
Involve climate science and engineering experts to help assess uncertainty and avoid bias – whether it be overconfidence or underconfidence in the data and models.
By combining risk magnitude with analytical uncertainty, it is possible to identify the overall level of concern for each risk. This approach will help prioritize adaptation actions and ensure decision-makers understand both the risks and the confidence in the underlying analysis.
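A simple combination rule for risk magnitude and analytical uncertainty can be sketched as follows. The rule (raising concern by one level under high uncertainty) is an illustrative assumption, not a prescribed standard.

```python
# Sketch: combining a categorical risk level with analytical uncertainty
# into a level of concern. The combination rule is illustrative only.

def level_of_concern(risk: str, uncertainty: str) -> str:
    order = {"low": 0, "medium": 1, "high": 2}
    # under this assumed rule, high uncertainty raises concern one level
    score = order[risk] + (1 if uncertainty == "high" else 0)
    return ["low", "medium", "high", "high"][score]

print(level_of_concern("medium", "high"))
```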
Adaptation Strategy Quadrants
Building on the level of concern diagram, adaptation strategies for the associated climate risks can be categorized based on the risk level and the confidence in the analysis. A four-quadrant approach is presented, aimed at supporting adaptation strategy development in the next step of this assessment process. Adaptation quadrant descriptions include:
Following this structured approach ensures that adaptation strategies are well-matched to both the level of risk and the confidence in the analysis, supporting resilient and cost-effective decision-making.
Risk Prioritization
If the risk assessment is being done for a larger facility or multiple SSCs, an additional step may be required to prioritize the risks that will be treated. The risk and the level of concern can be used to prioritize the risks to be addressed. Adaptation plans for each risk can then be institutionalized, or an organizational risk tolerance threshold can be applied to determine which risks need to be addressed and in which order.
Adaptation
Adaptation strategies should be developed to address risks or vulnerabilities that have been identified as unacceptable. Adaptation involves taking actions to reduce the impacts of climate change on SSCs, and it can take many forms:
- Reducing Exposure: Some actions lower exposure to hazards, such as relocating an SSC away from areas at risk of flooding.
- Reducing Vulnerability: Most adaptation measures focus on decreasing the system’s vulnerability to climate hazards. This can be achieved by strengthening the SSC or modifying its operation to better withstand adverse conditions.
- Enhancing Reliability and Resilience: Certain adaptation actions improve the reliability of the SSC, ensuring it continues to perform even as climate conditions worsen. Others enhance resilience by minimizing the impact of an event or enabling faster recovery after disruption.
- Transferring Risk: Some elements of risk can be transferred through contractual obligations, rate regulation, or insurance.
::: {.callout collapse="true" title="Types of Adaptation"}
Structural Adaptation: This involves modifying or replacing physical assets or components. These changes are typically implemented through maintenance activities or capital projects. Examples include upgrading equipment, increasing the capacity of turbines, or reinforcing infrastructure.
Operational Adaptation: This focuses on changing how assets are operated rather than altering the assets themselves. Examples include adjusting operating schedules, increasing inspection frequencies, or updating maintenance programs. These changes are usually introduced through revised procedures or work instructions. Where feasible, operational adaptations should be considered before structural adaptations, as they may offer effective, lower-cost solutions.
Both structural and operational adaptation depend on informational adaptation – the provision of climate data, expertise, and monitoring technology to support informed decision-making. :::
The figure illustrates how SSC adaptation measures can be implemented over time to achieve resilience. The solid blue line represents SSC climate vulnerability/risk to a climate indicator without the implementation of adaptation measures. In this situation, the SSC enters unacceptable performance criteria well before its expected design life. The dashed blue line represents SSC climate vulnerability after an adaptation measure has been implemented. Here, the adaptation strategy reduces vulnerability for a period, but as climate hazards intensify, the SSC again enters unacceptable performance criteria before its expected design life. As such, a second adaptation becomes necessary to maintain acceptable performance through the SSC design life. This example highlights the importance of flexible adaptation pathways and planning, as well as the timing and institutionalization of adaptation strategies.
While most climate vulnerability and risk guidance documents emphasize the importance of managing climate risks and implementing adaptation actions, they often lack detailed instructions on how to evaluate and select the best adaptation options. When choosing adaptation measures to address climate risks or vulnerabilities, the model or method used in the assessment (e.g., a stress test) can also be used to assess potential adaptation actions. This ensures that each option is measured against the same performance criteria and under comparable scenarios.
Not all adaptation actions are urgent. The urgency of adaptation should be determined through a risk evaluation process. Often, adaptation can be incorporated into regular asset life-cycle management. For example, if a valve is scheduled for replacement every ten years, it may be cost-effective to wait until the next scheduled replacement to install a climate-resilient model – unless inspections show accelerated deterioration that would require earlier action.
Following these steps systematically identifies, evaluates, and implements adaptation measures that effectively reduce climate-related risks and enhance the long-term performance and resilience of assets.
Identifying Adaptation Options
Effective adaptation strategies should be tailored to the level of risk or vulnerability identified and the confidence placed in the analysis. The goal is to select actions or a range of actions that can reduce the likelihood or consequence of reaching critical performance thresholds. The scope and complexity of these actions should reflect the nature and cost of the problem or system being addressed.
The results from the Level of Concern step of the risk assessment will guide the selection of adaptation actions. Unless one option is a clear winner, first identify multiple possible actions to ensure flexibility and resilience. Options should also be considered with a view to minimizing regret, avoiding actions that would foreclose future opportunities or prove fragile (i.e., ineffective) under certain future conditions.
Where the risk assessment results fall in Quadrant 1, standard methods can be used to address the risk. If findings fall in Quadrant 2, the various adaptation options considered should be evaluated and the best option selected.
Where the risk assessment results fall in Quadrants 3 or 4, robust, or robust and flexible, actions should be considered. Adaptation pathways are a valuable tool for developing a long-term, flexible plan. These pathways identify a sequence of adaptation measures, some of which may not need to be implemented immediately, allowing the organization to respond as conditions evolve.
Adaptation Pathways
An adaptation pathway (Haasnoot et al., 2013) is a strategic tool for evaluating a sequence of adaptation options, along with their associated costs and benefits over time, to manage uncertainty. This approach is particularly valuable for situations characterized by higher uncertainty or risk (Quadrants 3 and 4) but may be used in multiple scenarios. By mapping out a series of potential actions, adaptation pathways help organizations develop flexible, long-term plans to address climate risks as conditions evolve. Adaptation pathways consider multiple pathways rather than a single one. Decisions or actions at some point in time may remove one pathway from consideration or justify a shift to a different path than originally envisioned.
Evaluating Adaptation Pathways
- Compare pathways based on costs, benefits, and co-benefits, including avoided losses.
- Use qualitative methods (e.g., expert judgment, simple scoring) or quantitative models (e.g., detailed financial analyses) as appropriate.
- Consider the time horizons for each action, and how quickly triggers might be reached under different climate scenarios.
Developing adaptation pathways can create resilient, phased strategies that adapt to changing climate risks over time – balancing immediate needs with long-term flexibility and cost-effectiveness.
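The pathway comparison described above can be sketched as a discounted cash-flow calculation. The pathways, cashflows, and discount rate below are hypothetical placeholders; real comparisons would use the organization's financial analysis standards.

```python
# Sketch: comparing adaptation pathways by net present value of costs
# and avoided losses. All figures and the 3% rate are hypothetical.

def npv(cashflows, rate=0.03):
    """Net present value of (year, amount) pairs; costs are negative."""
    return sum(amount / (1 + rate) ** year for year, amount in cashflows)

pathways = {
    # pathway name: (year, cashflow) pairs; adaptation costs negative,
    # avoided losses positive (illustrative $M figures)
    "upgrade now":     [(0, -5.0), (10, 2.0), (20, 2.0)],
    "staged upgrades": [(0, -1.0), (10, -3.0), (10, 1.5), (20, 2.0)],
}

results = {name: round(npv(cf), 2) for name, cf in pathways.items()}
print(results)
```

A qualitative scoring of co-benefits and trigger timing would sit alongside such a calculation rather than replace it.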
Reporting or Institutionalizing the Decision
Once the assessment and decision-making process are complete, the analysis should be documented and the decisions made should be formally institutionalized. The level of detail in the report should match the scope and complexity of the assessment.
::: {.callout collapse="true" title="Key Criteria for Reporting"}
Reproducibility: The analysis and decisions must be documented in a way that ensures reproducibility. Ideally, if the same assessment were conducted independently, it would yield similar results. Future teams should also be able to clearly understand the rationale behind each decision.
Transparency: Clearly outline the methods, data sources, assumptions, and reasoning used throughout the assessment and decision-making process.
Clarity: The report should be organized and written so that it is accessible to both technical and non-technical audiences.
Institutionalization: Ensure that decisions are formally integrated into relevant policies, procedures, or operational plans. This may involve updating risk registers, asset management plans, or standard operating procedures to reflect the chosen adaptation actions.
Budget: The budget for climate adaptation should be considered. In many cases a project or upgrade may be required even in the absence of climate change adaptation (i.e. end of component life). The budget should be recorded based on a base (like for like) case as well as the higher cost of adaptation. This way the organization can better track and manage the incremental costs of climate adaptation. :::
Clear, thorough records help support transparency, enable continuous improvement, and provide a reference for future projects or audits. When documenting the analysis consider the following:
Purpose of the Assessment: Clearly state the objectives and scope.
Methods Used: Describe the methodologies, tools, and data sources.
Key Assumptions: Document any assumptions made during the analysis.
Findings and Conclusions: Summarize the results and key insights.
Recommendations and Actions: Outline any proposed adaptation measures or next steps.
Supporting Evidence: Include relevant data, references, and appendices as needed.
Adapting the level of detail to match the complexity and significance of the activity ensures documentation remains practical and valuable for both current and future stakeholders.
Thoroughly documenting and institutionalizing decisions supports organizational learning, transparency, and the long-term resilience of assets and operations.
Process Integration
OPG manages assets and systems through a series of established processes.
Many adaptation needs can be addressed through operational changes or as part of ongoing asset management activities. Typically, an effective adaptation plan might begin with operational changes, followed by identification of larger projects for inclusion in the Investment Planning process.
Operational Changes: Institutionalize by updating procedures, work instructions, or inspection requirements. These changes may also impact operating budgets as reflected in the Business Plan.
Asset and System Changes: Needs for asset changes typically arise from risk assessments, plant condition assessments, or performance improvement initiatives. Early identification of adaptation needs ensures climate impacts are considered in the design and operation of assets over their entire lifecycle.
Once a project is initiated, the adaptation plan and pathways inform the project team’s solution development, allowing for further refinement as more detailed information becomes available.
Transitioning to Operations
After project close-out, monitoring plans for flexible actions should be updated and transitioned to the teams that will operationalize the plan. This may involve:
- Updating equipment reliability monitoring
- Modifying monitoring plans in risk registers or plant condition assessments
- Revising governance documents and work instructions as necessary
Documentation and Continuity
To ensure continuity and future reference, all adaptation pathways and decisions should be documented in accessible corporate records. Monitoring actions should be integrated into appropriate processes – such as engineering risk assessment program (ERAP), equipment reliability (ER), business unit, or station risk registers.
Long-term Monitoring and Triggers
Longer-term actions should be linked to specific triggers and monitored for opportunities to integrate with complementary projects. For instance, a future adaptation may be triggered by a threshold exceedance or aligned with another project to maximize efficiency and cost-effectiveness. Taking advantage of windows of opportunity – such as co-funding or heightened interest after major events – can also improve outcomes.
When flexible adaptation plans are needed, it is important to institutionalize them. Several processes may be used for this, including equipment reliability, safety analysis, risk registers, asset management, or aging management processes. The performance of the SSC should be regularly reviewed to determine whether it continues to be acceptable. The plan should not exist only in a report from the risk assessment process.
The monitoring plan should also include a trigger value. This value identifies when a decision about adaptation is required. A trigger should be set to occur prior to reaching an unacceptable level of performance so that there is time to take action, including time for OPG processes to implement an action, before unacceptable levels of risk or performance are reached.
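The trigger concept can be sketched as a simple monitoring check. The thresholds and observed values below are illustrative; a real monitoring plan would tie the trigger to documented performance criteria and lead times.

```python
# Sketch of a monitoring trigger check: flag when an observed indicator
# reaches a trigger set below the unacceptable-performance threshold,
# leaving lead time to act. All values are illustrative.

UNACCEPTABLE = 40.0   # performance threshold (e.g., degC)
TRIGGER = 37.0        # set early enough to allow implementation lead time

def check_trigger(observed_values):
    """Return the first year index at which the trigger is reached, or None."""
    for year, value in enumerate(observed_values):
        if value >= TRIGGER:
            return year
    return None

obs = [35.2, 35.9, 36.4, 37.1, 37.8]
print(check_trigger(obs))  # trigger fires before the unacceptable threshold
```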
Monitoring Challenges
Some climate variables, such as extreme conditions (e.g., peak river discharge), can be difficult to monitor due to natural variability and limited data. In such cases, monitoring covariates that are easier to track may provide a practical solution.
Independent Review
Several guiding documents emphasize the importance of periodic and independent reviews of climate risk and adaptation assessments. To facilitate effective review, assessments should be conducted and documented in a manner that allows an independent third party to easily understand the process and reach the same conclusions. When a risk report serves as the institutional record of a decision, both the report and the associated risk process should be reviewed regularly to account for changes in equipment condition, updates to design standards, or new climate data. Periodic reviews are already a well-established practice within safety programs, ensuring that adaptation decisions remain current, relevant, and robust over time.