Leveraging Behavior To Combat Climate Change


By Beth Karlin from the See Change Institute

Climate change is an issue that can no longer be ignored: US emissions alone total more than six billion tons annually, and residential energy use accounts for 20% of this total, making it a prime target for savings (US EPA, 2016). Although physical scientists are working to develop alternative energy sources and efficient appliances, human behavior remains a fundamental roadblock. Social science has a vital role to play in developing and testing interventions for climate mitigation through behavior change.

Dozens of energy reductions within the home can be made in the immediate term, without economic sacrifice or loss of well-being, and nearly every city across the country (and world) has at least one energy efficiency program. These programs range from school competitions to online games, home audits, and marketing efforts. Such programs have led to energy savings ranging from 0% to 6.5% in the residential sector (Illume et al., 2015). However, research suggests that household conservation behavior can lead to 16–20% reductions in energy use (Dietz et al., 2009; Frankel et al., 2013), suggesting there is a great deal more that we can do.
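To put those percentages in rough perspective, here is a back-of-envelope sketch in code, under the simplifying and purely illustrative assumption that household emissions scale in proportion to household energy use (the sources above do not make that claim directly):

```python
# Back-of-envelope estimate of the emissions at stake (illustrative only;
# assumes household emissions scale proportionally with household energy use).
US_EMISSIONS_TONS = 6e9    # ~6 billion tons of US emissions per year (US EPA, 2016)
RESIDENTIAL_SHARE = 0.20   # residential energy use is ~20% of the total

residential_tons = US_EMISSIONS_TONS * RESIDENTIAL_SHARE  # ~1.2 billion tons

for reduction in (0.16, 0.20):  # 16-20% achievable through conservation behavior
    savings = residential_tons * reduction
    print(f"{reduction:.0%} reduction ≈ {savings / 1e6:.0f} million tons per year")

# Roughly 190-240 million tons per year, on the order of the annual emissions
# of several dozen coal plants.
```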

To fully leverage the potential of behavior-based energy programs, we must not only identify successful programs, but also understand the characteristics, or “building blocks,” that make up these programs and contribute to their success. Trying to change behavior by looking at programs without their characteristics is like trying to eat healthily by looking at boxes of cereal without nutritional information. While you can “test” Cheerios, Frosted Mini Wheats, and Kix, what you learn is limited, and questions about the optimal mix of carbohydrates, protein, and fat for a healthy breakfast go unanswered.

Just as with cereal, moving from program categories to characteristics in behavior change opens up a whole new set of opportunities for designing and optimizing programs. For example, while rebates are commonly used to encourage the purchase of energy-efficient appliances, how rebates are framed or delivered to users can vary, and that variation can affect their effectiveness. To keep things simple, we’ve broken the components of behavior change programs down into the building blocks of ABCDE: Audience, Behavior, Content, Delivery, and Evaluation.
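One way to make this building-block framing concrete is to describe a program as a structured record with one field per block. The sketch below is hypothetical (the field names and the example rebate program are illustrative, not drawn from any actual program):

```python
from dataclasses import dataclass

@dataclass
class BehaviorProgram:
    """Illustrative decomposition of a program into ABCDE building blocks."""
    audience: dict    # who is targeted and how treatment is personalized
    behavior: list    # specific, concrete target behaviors
    content: dict     # behavioral strategy and message framing
    delivery: dict    # medium, messenger, frequency, duration, timing
    evaluation: dict  # metrics and experimental design

# A hypothetical appliance-rebate program expressed in these terms:
rebate_program = BehaviorProgram(
    audience={"segment": "homeowners with appliances over 10 years old"},
    behavior=["request a rebate form", "submit the rebate application"],
    content={"strategy": "loss-framed incentive message"},
    delivery={"medium": "email", "messenger": "local utility",
              "timing": "after a residential move"},
    evaluation={"design": "randomized framing test",
                "metric": "rebate redemption rate"},
)
```

Comparing programs field by field, rather than as whole categories, is what lets you ask which combination of blocks is doing the work.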

Audience refers to the target audience for the intervention: both knowing who that audience is and, when appropriate or feasible, personalizing treatment accordingly. Audience comes first because program designers must start the process with the customer: their behavioral profile, their individual needs and experiences. Recent work suggests that behavioral strategies can have different effects depending on the individual or group being targeted. What works in some customer segments or contexts might not work for others. For example, Costa and Kahn (2013) showed that home energy reports were 2–4 times more effective with political liberals than with conservatives.

Behavior refers to the target behavior(s) of the intervention. Specifying target behaviors (e.g., customer submits rebate application) has important benefits. First, it focuses program design, enabling strategies and behavioral components to be matched more closely to the program’s objectives. Second, it encourages cleaner research methods by focusing evaluation and data tracking resources. Finally, it generates clear, concrete, actionable behaviors for participants. Too many calls to action, or vague calls to action, can overwhelm the customer, leading to procrastination or inaction at best and frustration at worst. Rather than encouraging customers to take multiple actions at once, presenting a sequence of actions more accurately reflects models of human behavior. The behaviors can be directly energy-reducing or information-seeking (e.g., getting an audit or downloading a tool).

Content refers to the strategy and message framing of the intervention. While many different programs use similar behavior change strategies (e.g., goal-setting, feedback, competition, games, message framing, and commitment), there is considerable variance across programs in how a particular strategy is applied. The body of behavioral science research is constantly growing, and it continues to confirm the powerful role that human nature plays in the marketplace and, more specifically, in guiding energy-saving behavior. New applications and empirical results continue to shed light on when and how particular strategies are most effective. Thus, a closer integration and application of behavioral theory to meet specific program goals for a particular population can lead to more creative and effective behavior-based strategies. For example, strategies used to encourage opt-in to a given intervention (e.g., an emotional appeal or a default setting) may not be optimal for encouraging follow-through, spillover, or long-term behavior change.

Delivery refers to the way the program is distributed to consumers. It can be broken down further into medium, messenger, frequency, duration, and timing, each of which may affect intervention success. Systematically experimenting with the timing of delivery, media channels, and messengers could reveal how customers respond to a message, how much they trust the information, and how that trust is converted into action. Additionally, program design could capitalize more on delivering interventions during significant life events or key “moments that matter” (such as a residential move), as recent work suggests that customers are most amenable to behavior change when routine and habitual behavior patterns are disrupted (e.g., Wood, Tam & Witt, 2005).

Evaluation refers to the way the effectiveness of the intervention is measured. Evaluating a program to determine how and for whom it worked is vital to understanding the optimal mix of program variables that leads to the most effective design. For example, different segments of a population can be exposed to different versions of a single program, depending on the variable being tested; the differential responses to these variations provide insight into the specific mechanism driving an intervention’s effectiveness. Such real-time testing encourages program designers to match program components (e.g., content, delivery) to the contexts in which they work best (e.g., audience, behaviors). Program design should also consider the program goals and target customer behavior in order to identify the appropriate metrics (e.g., call center feedback, survey data, social media insights) for refining program variables and measuring positive spillover.
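As a minimal sketch of what such segment-by-segment testing could look like in practice (with entirely hypothetical households, program variants, and outcome data, and without the statistical inference a real evaluation would require), the snippet below compares average savings between two program versions within each audience segment:

```python
import random
from collections import defaultdict
from statistics import mean

# Hypothetical households: randomly assign each to a program variant ("A" or "B")
# within its segment, then compare average energy savings by segment and variant.
random.seed(1)
households = [
    {"segment": seg,
     "variant": random.choice("AB"),
     "savings_pct": random.gauss(3.0, 1.5)}  # placeholder outcome data
    for seg in ["renters", "owners"] for _ in range(500)
]

results = defaultdict(list)
for h in households:
    results[(h["segment"], h["variant"])].append(h["savings_pct"])

for segment in ["renters", "owners"]:
    lift = mean(results[(segment, "B")]) - mean(results[(segment, "A")])
    print(f"{segment}: variant B vs. A average savings difference = {lift:+.2f} pct points")
```

The point of the structure, not the numbers, is the takeaway: randomize the component you want to test, record the outcome per segment, and let the differences tell you which building block is driving the effect and for whom.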

This model reinforces the potential for understanding how and for whom behavioral strategies are working, and how strategies may work together across programs to support customers along the energy efficiency journey. This social scientific approach builds on current marketing and design-based behavior change efforts within the climate change space. It is hoped that such research will not only improve individual programs but also lead to a greater understanding of the variables underlying their success, so that the rising tide of social science can continue to lift us toward a more sustainable future.

Read the entire article here.

Subscribe to The Beam here.

The Beam

The Beam Magazine is an independent climate solutions and climate action magazine. It covers the most exciting solutions and makes a concrete contribution to eliminating climate injustices and preserving this planet, in all its diversity and beauty, for all of us. Our cross-country team of editors works with a network of 150 local journalists in 50 countries, talking to change makers and communities. THE BEAM is published in Berlin and distributed in nearly 1,000 publicly accessible locations and to companies, organizations, and individuals in 40 countries across the world, powered by FairPlanet.
