TECHNICAL MEMORANDUM
Water Research Foundation Collaborative Utility Benchmarking in North America Project
Benchmarking Workshop Results Report PREPARED FOR: Water Research Foundation
PREPARED BY: CH2M
DATE: February 4, 2016
Overview
This document summarizes the Benchmarking Workshop conducted as part of the Water Research Foundation’s (WRF) Tailored Collaboration Project “Collaborative (formerly Enhancing) Utility Benchmarking in North America”. This project aims to further develop, refine, and implement a unified industry benchmarking framework, together with a collection of industry tools and a process for North America, that is connected to WSAA’s 2016 Asset Management Customer Value (AMCV) project. This WRF project is intended to enhance the value offered by WSAA’s AMCV, the AWWA Utility Benchmarking Survey, Effective Utility Management (EUM), and other available tools, such as the ISO 55000 framework. This research will evaluate the process and future opportunities for benchmarking within the US water sector. During the course of executing the Collaborative Utility Benchmarking in North America project, an evaluation, business case, and recommendations will be made for potential tool integration and enhancements, association and utility collaboration, and other benchmarking opportunities.
The specific objectives for the Benchmarking Workshop were the following:
present the benchmarking tool and process envisioned for implementation in North America, and facilitate input, discussion and agreement on the final tool and approach;
discuss and solicit input on the questions that will be important to ask the participating utilities at the end of the benchmarking exercise to determine the costs, benefits, and value to participating utilities; and
solicit understanding, input and discussion on the business case development.
The Benchmarking Workshop agenda is provided in Appendix A. The Benchmarking Workshop Day 1 Presentation is provided in Appendix B. The Benchmarking Workshop Day 2 Presentation is provided in Appendix C.
Benchmarking Workshop Attendees Role Name
WRF Project Manager Linda Reekie
PAC Heather Pennington
PAC Kurt Vause
PAC Kevin Campanella
Steering Group Greg Ryan
Steering Group Mike Sweeney
Steering Group Leisa Thompson
Steering Group Frank Roth
Steering Group Sarah Neiderer
Steering Group Jeff Leighton
Steering Group Matt Ries
Steering Group Ken Mercer
Steering Group Stephanie Passarelli
Participant Dave Plank
Participant Zsolt Silberer
Consultant Scott Haskins
Consultant Priscilla Bloomfield
Consultant Terry Brueck
Agenda Discussion, Outcomes, and Action Items – Day 1
AGENDA TOPIC DISCUSSION/OUTCOME ACTION ITEM
INTRODUCTIONS/ OBJECTIVES/ EXPECTATIONS
There was general discussion that this project is an opportunity for the associations with different benchmarking/ self‐improvement tools/frameworks to survey utility participants and evaluate the outcome of the project to determine what opportunities could exist in the future. This workshop was a way to get the Steering Team, PAC, and research team on the same page to better understand the tools and how they fit together. It was also an opportunity to discuss what they could look like going forward and identify the key success criteria for the project. Participant expectations for the workshop were discussed.
None
CONTEXT – WRF TC PROJECT
There was discussion about the scope, schedule, roles and responsibilities (Steering Group, PAC, etc.), and engagement/communication for the WRF TC project. Components of the deliverables were confirmed: Benchmarking Workshop Results Report, Benchmarking Evaluation Report (which includes the business case evaluation), North American Leading Practice Report, and North American Industry Report. The WRF TC project builds upon previous work done by WSAA, AWWA, and others. Various associations will be included and informed as the project progresses. It was confirmed that the project will make recommendations for the future and not produce a final tool.
Contact associations
Provide workshop presentations to participants
AMCV PROJECT AMCV is WSAA’s rebranded Aquamark tool, focused on customer value, which addresses the requirements of ISO 55000 with a focus on utility management. The cost for the 2016 round has been reduced by approximately 40%. 25% of the tool has been updated: new measures have been added in relation to leadership and culture; a customer focus has been ensured in all relevant questions; 40 measures have been added to ensure alignment with ISO 55000; Function 7 (support systems) has been rationalized, reducing it from 18 to 6 Functions, along with consolidation of the measures in Function 7 from 18 to 9; measure weightings have been peer reviewed to ensure they reflect contemporary practice; and the software platform has been completely overhauled, ISO aligned, with integration of AWWA and EUM elements for the North American version. There is a major focus on leading practices and knowledge sharing. The group discussed the AMCV project’s deliverables.
None
AWWA TEC AWWA’s Benchmarking Committee and Asset Management Committee submitted a TEC proposal. The budget may be increased to $30,000, and the scope elements are to be determined. The project will be released as an RFP. The focus of the project is on developing a BCE for AWWA about making their benchmarking program stronger, which aligns with the WRF TC project’s evaluation and BCE component.
None
ASSOCIATED INDUSTRY FRAMEWORKS AND FEATURES ‐ AMCV
Information on the 2016 AMCV project can be accessed via http://amcv.wsaa.asn.au/AMCV. Improvements include simpler data entry; more interactive online engagement with discussion boards, dialogues, etc.; two assessment options (self or facilitated); and alignment with ISO, and possibly with EUM, AWWA, etc. There are various potential options for the future. A live online demonstration was provided. There was dialogue about the tool’s functionality and various project aspects.
Provide WSAA ISO decoder to participants
Confirm that the 2012 measure comments (Aquamark) can be imported into the 2016 version (AMCV)
Include ISO and EUM mapping in the utility and industry reports if feasible
MAPPING There was a discussion of mapping ISO, EUM, AWWA metrics, and others to the AMCV framework and adding new measures to fill gaps in the AMCV tool. Metrics will be shown at the Subprocess level in a manner similar to a measure. A definition and calculation will be provided. Users will enter a current and target value. This data would be provided by the utility separately from the AWWA 2016 Utility Benchmarking Survey, which will continue as a separate exercise, as in previous years. There was a discussion about the potential mapping of AWWA Utility Management Standards to AMCV. AWWA Utility Management Standards could be a resource for a utility to help close their gaps.
CONSENSUS: Pursue mapping other frameworks to AMCV for possible inclusion, with the caveat that such mapping must add value to the industry and be cost effective.
Complete mapping EUM to AMCV
Map AWWA metrics to AMCV if cost effective to do so (see discussion below)
Secure copy of AWWA Utility Management Standards
Investigate level of effort to map/include AWWA Utility Management Standards in AMCV
Investigate level of effort to map EUM to ISO directly
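The mapping exercise described above amounts to building a crosswalk between frameworks. As a purely illustrative sketch, assuming a simple table structure — the measure IDs, attribute names, and clause numbers below are hypothetical placeholders, not actual AMCV, EUM, or ISO identifiers — such a crosswalk and a gap check over it might look like:

```python
# Hypothetical crosswalk from AMCV measures to related EUM attributes and
# ISO 55000 clauses. All identifiers are invented for illustration.
CROSSWALK = {
    "AMCV-1.2": {"eum": ["Product Quality"], "iso": ["7.5"]},
    "AMCV-2.4": {"eum": ["Financial Viability"], "iso": ["6.1", "9.1"]},
    "AMCV-3.1": {"eum": [], "iso": ["8.2"]},  # no EUM attribute maps here
}

def unmapped(framework: str) -> list:
    """Return AMCV measures with no mapping in the given framework."""
    return [m for m, links in CROSSWALK.items() if not links[framework]]

print(unmapped("eum"))  # ['AMCV-3.1']
```

A table of this kind would also make the “level of effort” action items concrete, since unmapped measures can be enumerated directly.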
ASSOCIATED INDUSTRY FRAMEWORKS AND FEATURES ‐ EUM
A steering group is currently working to update EUM based on changes in the industry in the last 10 years. The 10 attributes and 5 keys to success are expected to stay the same with some changes to their descriptions. The changes are centered around automated and smart systems and data integration, climate variability and extremes, customer expectations and public awareness, employee recruitment and retention, resource recovery, regulatory requirements and operating conditions, and stormwater and watershed management/one water. The updates are expected to be finalized by spring 2016.
Continue to stay abreast of EUM updates
ASSOCIATED INDUSTRY FRAMEWORKS AND FEATURES – AWWA SURVEY
AWWA’s Utility Benchmarking Survey is now conducted annually. Improvements for 2016 include ease of use (pre‐population of data, definitions, etc.), data quality, confidence levels (guess, audited number, etc.), new indicators and adjustments to existing metrics, and feedback from users and utilities. It has also been mapped to EUM. Utilities answer questions in an Excel file and get the metrics as the output, which AWWA compiles and reports, both specific to the utility and generalized for the industry.
CONSENSUS: Pursue incorporating AWWA metrics into AMCV.
Pursue coordination between AWWA and WSAA on inclusion of AWWA metrics in AMCV
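The survey workflow described above — utilities enter data, metrics come out — can be sketched as a simple calculation. The indicator, field names, and confidence label below are assumptions for illustration, not actual AWWA survey definitions:

```python
# Illustrative only: one efficiency-style indicator computed from
# utility-entered data. Field names and the metric itself are hypothetical.
def mgd_per_employee(total_mgd_treated: float, fte_count: float) -> float:
    """Example indicator: million gallons per day treated per FTE."""
    if fte_count <= 0:
        raise ValueError("FTE count must be positive")
    return total_mgd_treated / fte_count

# Each answer carries a confidence level, as described for the 2016 survey
# (guess, audited number, etc.).
response = {"total_mgd_treated": 120.0, "fte_count": 240.0, "confidence": "audited"}
indicator = mgd_per_employee(response["total_mgd_treated"], response["fte_count"])
print(round(indicator, 2))  # 0.5
```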
MCES CASE STUDY MCES piloted the proposed 2016 AMCV project approach and tool integration and presented the results. The scope and process were discussed, along with innovations. Key themes were breaking down silos, engagement, and translating a vision and benchmarking results into prioritized actions. There were many questions and answers. There were suggestions as to how to further improve the process, especially for repeat participants.
None
ASSOCIATED INDUSTRY FRAMEWORKS AND FEATURES – NACWA
NACWA conducts a comprehensive annual survey that is mostly financial and gathers data (versus metrics). There is some overlap with the AWWA Utility Benchmarking Survey. In discussions, NACWA has been supportive of this WRF TC project. It is anticipated that NACWA would like to contribute to the project with measures around the utility of the future for potential addition to AMCV.
CONSENSUS: Collaborate with NACWA if possible.
Follow up with NACWA on involvement/ contribution
ASSOCIATED INDUSTRY FRAMEWORKS AND FEATURES – MATT RIES PHD
Matt Ries is writing his PhD thesis on sustainability indicators for urban water utilities, with a focus on defining the drivers and attributes. The US does not have much data, so Matt interviewed 12 utility leaders and conducted an online survey with water professionals. The results indicated 8 top practices (Education and communication, Community ROI, Bond rating/financial management, Resource recovery, Green infrastructure, Asset management, etc.) and 6 attributes (leadership, political will, training, board support, etc.), as well as drivers (public demand, political will, tools, vision, regulations) and barriers (resources, lack of incentive, lack of definition). Based on this work, Matt developed a set of sustainability measures. Measures that are not currently addressed may be incorporated into AMCV.
CONSENSUS: Pursue incorporating Matt’s PhD work into AMCV.
Pursue potential integration of sustainability measures into AMCV
UTILITY PROFILE AND UTILITY DRIVERS
AMCV – The utility profile contains basic utility information such as size and sector. The drivers include Regulatory, Financial, Sustainability, Customer expectations and demand, Knowledge management (staff), Asset lifecycle management, Industry movements, Competition, and Technology advancements. The top 5 are selected by each utility.
AWWA – Drivers are a new part of the Utility Benchmarking Survey and are being developed. The draft list includes Regulatory, Water supply, Rates, Aging infrastructure, Expected efficiency, Growth, Security, Strategic planning, Economic climate, Continual improvement, and Workforce/succession planning/training. Utilities rate each driver as high, medium, or low.
In the past, knowing the drivers has been helpful information to compare and contrast utilities and in formulating improvement initiatives. This also aligns with Importance and Urgency.
CONSENSUS: There should be alignment between AMCV and the Utility Benchmarking Survey drivers if possible.
WSAA and AWWA to work to get alignment between drivers and add an open ended category
METRICS There was a discussion of the possible inclusion of metrics into AMCV. The group decided to include AWWA metrics as well as other metrics (to be identified). Jeff Leighton offered to support identifying additional metrics. [Subsequent to the workshop, Jeff provided some sources of potential additional metrics.] The 2016 AMCV project could provide offline support for developing customized metrics for a utility, which would not be included as a standard approach in AMCV.
Identify additional metrics for possible inclusion in AMCV
CONSENSUS: AWWA metrics and potentially additional metrics should be included in AMCV.
IMPORTANCE AND URGENCY
There was a discussion around the inclusion of an Importance and Urgency rating system in AMCV to help prioritize gaps, which would be applied at the Sub‐process level. The group discussed the definitions and differences between Importance and Urgency. In general, the group felt that Importance and Urgency were an important enhancement for AMCV. These elements will be incorporated into the 2016 AMCV deliverables.
CONSENSUS: Include Importance and Urgency in AMCV.
Include Importance and Urgency in AMCV
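One common way to turn paired Importance and Urgency ratings into a gap priority order is to rank by their product. This is a minimal sketch under assumed 1–5 ratings and invented sub-process names; the workshop notes do not specify the actual scale or aggregation rule:

```python
# Hypothetical gap list with Importance/Urgency ratings at the
# sub-process level (names and 1-5 scale are assumptions).
gaps = [
    {"subprocess": "Risk management", "importance": 5, "urgency": 3},
    {"subprocess": "Data governance", "importance": 4, "urgency": 5},
    {"subprocess": "Succession planning", "importance": 3, "urgency": 2},
]

# Rank by the product of the two ratings, highest first.
ranked = sorted(gaps, key=lambda g: g["importance"] * g["urgency"], reverse=True)
for g in ranked:
    print(g["subprocess"], g["importance"] * g["urgency"])
```

Ranking by the product (rather than, say, the sum) pushes gaps that are both important and urgent well ahead of those strong on only one dimension.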
EVALUATION There was a discussion around the evaluation component of the WRF TC project, which includes a survey for the 2016 AMCV participants and a business case evaluation (BCE). A list of questions and considerations for the survey was developed. These include how the project was done, how much time the project took, the value proposition and benefits, engagement, and survey timing. This component of the project will be discussed in more detail on Day 2.
None
ADDITIONAL ITEMS There was a discussion around the clarity of the roles of PAC, the Steering Group, and the associations.
The 2016 AMCV project will be called the 2016 North American AMCV.
Provide 2016 AMCV marketing materials
PARKING LOT Define the criteria for success for the project
Discuss what happens after the project ends in a year
Discuss WRF TC project name ideas
Address items on Day 2 if necessary
Agenda Discussion, Outcomes, and Action Items – Day 2
AGENDA TOPIC DISCUSSION/OUTCOME ACTION ITEM
DAY 1 UPDATES WSAA and AWWA came to agreement on the list of drivers to be used in the 2016 AMCV and the Utility Benchmarking Survey. The same definitions will be used as well.
List is:
Regulation and standards
Financial
Sustainability
Customer expectations
Resources and service demand
Workforce evolution
Asset lifecycle management
Security
Efficiency and continual improvement
Technology
Other (fill in)
None
EVALUATION The participants divided into 2 facilitated breakout groups, with equal representation from utilities, associations, and PAC members, to further develop the evaluation questions for the survey. One group focused on the project experience, and the other group focused on future and tool improvements. The groups met for over an hour and reported results back to the entire group. Overall there was general alignment on the survey questions between the two groups.
The breakout groups’ work products are provided in Appendix D.
Further develop the survey questions
BUSINESS CASE EVALUATION
The group discussed important features of the BCE, and the various stakeholders described their interests as discussed below.
Develop the BCE
WSAA WSAA is interested in whether the process has value to the users; whether to continue on this path; whether developing a consolidated tool is the right thing and, if so, what the enhancements are; the options for the future (one tool or many); the market for the tool(s); the likely frequency and approach; appropriate pricing; risks and benefits; a SWOT analysis by association; and the association roles (as part of this WRF TC project or separately).
None
AWWA AWWA is interested in whether to integrate or coordinate their survey going forward; whether the survey is meeting their needs; the best way to deliver the survey; targeting leading-edge utilities versus the whole water industry; reaching the utilities they want to reach in the ways they want; linking the results from this process across the entire spectrum of utilities; how to build on existing content and partner with other organizations; the best way forward to meet industry needs; and how to translate the aggregate results into something meaningful to apply to affect regulatory changes or legislative interests. The Water Utility Council is interested in knowing whether there is opportunity for consolidation and what is needed for the entire sector; how to translate aggregate results into something meaningful (regulatory changes or a legislative initiative); how to take the industry report and turn it into meaningful actions; and exploring opportunities for consolidation/integration (metrics, TEC surveys (2 per year), partnered surveys (1‐2 per year), non‐partnered surveys, the rates survey, the state of the water industry, and the compensation survey).
None
WEF WEF has no benchmarking system, but they have recognition programs (utility of the future, stormwater, etc.) where benchmarking practices can be helpful. Their interest is in sharing information, best practices, and innovative approaches.
None
PAC The PAC is focused on utility improvement. They would like to have broad participation by utilities in the 2016 AMCV and beyond. They want the BCE to provide the value proposition for the utilities to participate in benchmarking.
Provide the value proposition for the utilities to participate in benchmarking in the report
UTILITIES The utilities are focused on: continuous improvement; gap analysis as to what to do differently to expand the program; monitoring; exploring the capacity for benchmarking at a greater frequency; demonstrating the value through the initial study; addressing survey fatigue; considering the do-nothing option; quantifying benefits, opportunity cost, and avoided cost; incorporating TBLI (triple bottom line plus Infrastructure); exploring ways to increase participation, since informing the sector affects the benefits, cost, and broad appeal; and using case studies in the BCE. Their questions include: why, and why now; what are the value and return on investment; what is the internal cost; what can it augment; what is the time commitment for internal resources, and what falls off the plate; what is needed to accomplish it; and how does it support the organizations that also represent the utility. For AMCV and other tools, they want a description of the tools and where they fit, the values they add, the costs (internal and external), ROI, etc.; the boundaries around the tools; and competition for use of the tools. The business case could compare and contrast the tools in terms of cost and resources: what is the business case moving forward, and do we use all or some, combined or separate?
A potential future project could be mapping each of the currently available tools, the gaps they fill, and when they should be applied and used, and identifying the different programs and the value they add and the cost. This could include how these tools might be better rationalized; variation across utilities, regions, scale, and scope; and comparing and contrasting all the tools and the resources required and costs (internal and external).
None
SUCCESS CRITERIA The group defined the success criteria for the WRF TC project. These are listed below.
Comprehensive BCE with elements from the discussion with recommendation for a path forward. Would be good if utilities understood the options – what is the depth and breadth of the approach.
High response rate to the evaluation and clear responses to the evaluation, i.e., we understand what people want; it needs to be comprehensive. Are we going to set goals for the feedback response rate on the survey? We need to encourage people to complete the survey, but not set a metric.
Ability to have the AWWA Utility Management Standards mapped and implemented – we have committed to look at level of effort only at this point.
The evaluation report will culminate the effort and suggest next steps from the perspective of what was learned. The BCE could take that into account, i.e., the evaluation will inform the BCE.
Catalyst for change within the utility with utility report and possibly the industry as a whole with industry report. Ability to pull the data together and determine what is happening across the sector could be a catalyst for change if there are common areas.
Lessons learned to build on for next steps. The number of survey questions is appropriate for this first stage to get as much feedback as possible. Can we do this in a way that doesn’t result in survey fatigue? Several options are needed. Success = being able to effectively gather data from the utilities without causing survey fatigue.
Collect feedback in a variety of ways. Expect survey results from at least the project coordinator and encourage survey feedback from others as well. If survey has the right questions, the feedback at the end could be really helpful in the evaluation.
Include Canadians as well.
Workshop report that documents the outcomes and discussions.
None
NEXT STEPS The decision was made to hold the next meeting in conjunction with the 2016 Utility Management Conference in San Diego. The meeting is tentatively planned for Tuesday, February 23 from 3 pm to 5 pm EST. Call in information will be provided to those who are not attending the conference. Location to be determined.
Set up the meeting for February 23
Another group meeting may be held in conjunction with AWWA’s ACE in June in Chicago.
PROJECT NAME The group decided to name the WRF TC project “Collaborative Utility Benchmarking in North America”. Formerly the title was “Enhancing Utility Benchmarking in North America”.
None
WRAP UP The Benchmarking Workshop Results Report will be provided by December 31, 2015.
The group participated in a discussion of items to consider in moving forward with the WRF TC project. Items included communication, schedule and milestones, incorporating feedback, and focus on value propositions.
Develop the Benchmarking Workshop Results Report
Appendix A: Benchmarking Workshop Agenda
Appendix A includes the Benchmarking Workshop agenda for both days.
Day 1 Agenda
Time Topic Presenter
6:00 – 8:00 am Breakfast provided at hotel
8:10 am Shuttle from hotel to AWWA, meet in hotel lobby
8:30 am Introductions/objectives/expectations
Objectives:
present the benchmarking tool and process envisioned for implementation in North America, and facilitate input, discussion and agreement on the final tool and approach
discuss and solicit input on the questions that will be important to ask the participating utilities at the end of the benchmarking exercise to determine the costs, benefits, and value to participating utilities
solicit understanding, input and discussion on the business case development
Linda Reekie
8:45 am Context – overall scope, objectives, schedule and deliverables for the project:
WaterRF Tailored Collaboration aspects
Benchmarking process and approach
AWWA TEC (If funded)
Scott Haskins, Greg Ryan, Ken Mercer
9:30 am Associated industry frameworks and features
Asset Management Customer Value (AMCV) ‐ highlighting changes from 2012, detailed explanation of tool
ISO 55000
Greg Ryan
10:15 am Break
10:30 am Associated industry frameworks and features (continued)
EUM
AWWA Survey
NACWA Survey
PhD Dissertation
Matt Ries, Terry Brueck, Stephanie Passarelli, Scott Haskins, Matt Ries
11:15 am Presentation and discussion of MCES case study Leisa Thompson
12:00 pm Lunch provided by AWWA
Afternoon sessions on details of the process. Throughout the process we are seeking to achieve:
1) Understanding of the benchmarking tool and the benchmarking process as presented below
2) Confirm you are comfortable with the approach
3) Fatal flaws
1:00 pm Presentation of utility profile and utility drivers methodology alignment and how this will be used as part of the project
Greg Ryan/ Terry Brueck
1:30 pm Presentation and discussion of the approach used to map AMCV and EUM, metrics, ISO 55001
Validation of concept
Greg Ryan
2:15 pm Metrics – presentation of the approach and process
Validation of concept
Scott Haskins
3:15 pm Break
3:30 pm Importance and urgency– presentation of the approach and process
Validation of concept
Scott Haskins
4:00 pm Evaluation – major components for Experience and Future Opportunities
Validation of Process and Tools
Scott Haskins
4:30 pm Closeout
How did the day go?
Day 2 plans
Linda Reekie
5:00 pm Finish Day 1
5:15 pm Shuttle from AWWA to hotel, meet in AWWA lobby
6:30 pm Dinner – Bonefish Grille provided by WRF
Day 2 Agenda
Time Topic Presenter
6:00 – 8:00 am Breakfast provided at hotel
8:10 am Shuttle from hotel to AWWA, meet in hotel lobby
8:30 am Evaluation – break out exercise incorporating the outcomes from Day 1.
Form two break out groups, to describe how the final survey will be designed to ensure a robust evaluation of the project outcomes in terms of:
Participants’ experience of the benchmarking process and tools
Future opportunities arising from the process
All
Facilitated by Scott Haskins, Priscilla Bloomfield
9:30 am Group 1 report out and feedback All
10:00 am Group 2 report out and feedback All
10:30 am Break
10:45 am Business Case Evaluation (BCE) – what are the key elements of the future business case from associations’ and utilities’ perspective? Presentations to frame the discussion
AWWA
WSAA
Utilities
Discussion and input into the final structure of the BCE
Ken Mercer
Greg Ryan
Utilities
Facilitated by Scott Haskins
11:45 am Feedback/next steps Scott Haskins/Linda Reekie
12:00 pm Lunch provided by AWWA
12:30 pm Close
1:00 pm Shuttle to airport, meet in AWWA lobby
Appendix B: Benchmarking Workshop Day 1 Presentation
Appendix B includes the Benchmarking Workshop Day 1 Presentation.
1. Form a North American Steering Group that is made up of six utilities/leaders and industry association representatives from AWWA, WSAA, WEF, and possibly others. – Y
2. Evaluate potential tools and recommend a process that can be applied to the execution of the 2016 Utility Benchmarking project. – Y
3. Develop a base case and options for an implementation approach, a suggested tool and process methodology, and availability of materials that can be reviewed and lead to recommendations. – Y
4. Conduct a Benchmarking Workshop with the Steering Group that incorporates items 1, 2, and 3 above. – In progress
5. Document the workshop and outcomes in a Benchmarking Workshop Results Report. – N
6. During the course of the execution of the 2016 Utility Benchmarking project, develop an evaluation, business case, and recommendation. – N
7. Document the evaluation results in a Benchmarking Evaluation Report. – N
8. Provide a Leading Practice Report from the 2016 Utility Benchmarking project. – N
Steering Group: WSAA – Greg Ryan; AWWA – Ken Mercer; WEF – Matt Ries; plus 5 utilities.
CH2M: Scott Haskins, PI; Priscilla Bloomfield, PM & Lead Analyst.
Steering Group Utilities: Portland Water Bureau, DC Water and Sewer, Toho Water, Metropolitan Council Environmental Services (Minneapolis-St Paul), Albuquerque Bernalillo County Water Utility Authority
• Development of the pilot model for converting 2012 AMCV (Aquamark) scores to ISO55001 compliance scores, and providing a tool for scoring gaps and partial ISO gaps in Aquamark
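The score-conversion idea in this bullet can be sketched as a lookup from Aquamark measure scores to ISO 55001 clause status. The mapping, threshold, and identifiers below are invented for illustration and are not the pilot model’s actual values:

```python
# Hypothetical Aquamark-measure -> ISO 55001-clause mapping and an assumed
# maturity threshold for "compliant"; none of these values come from the
# actual pilot model.
AQUAMARK_TO_ISO = {"AM-4.1": "6.2.2", "AM-5.3": "8.1"}
COMPLIANT_AT = 3.0

def iso_gaps(scores: dict) -> dict:
    """Classify each mapped ISO clause as compliant, partial gap, or gap."""
    result = {}
    for measure, clause in AQUAMARK_TO_ISO.items():
        s = scores.get(measure, 0.0)
        if s >= COMPLIANT_AT:
            result[clause] = "compliant"
        elif s > 0:
            result[clause] = "partial gap"
        else:
            result[clause] = "gap"
    return result

print(iso_gaps({"AM-4.1": 3.5, "AM-5.3": 1.5}))
```

The three-way classification mirrors the bullet’s distinction between scored gaps and partial ISO gaps.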
New – Additional AMCV measures
Tweaks – Modifications to existing AMCV measures
• Workgroup – association representatives:
— Shellie Chard-McClary, Oklahoma DEQ
— Andrew Clarkson, American Water
— Lisa Daniels, Pennsylvania Bureau of Water
— Ken Fischer, Southwest Water Company
— Dan Hartman, West Palm Bay, FL
— George Martin, Greenwood, SC
— Diane Taniguchi-Denis, Clean Water Services, OR
— Tyler Richards, Gwinnett Co., GA
— Steve Schneider, St. Paul, MN
— Tom Sigmund, NEW Water (Green Bay, WI)
— John Sullivan, Boston Water & Sewer Commission
— Tim Wilson, Marshalltown, IA
• Automated and “Smart” Systems and Data Integration
• Climate Variability and Extremes
• Customer Expectations and Public Awareness
• Employee Recruitment and Retention
• Resource Recovery
• Regulatory Requirements and Operating Conditions
• Stormwater and Watershed Management/One Water
1 What do you think about using the “triple bottom line-plus” framework, with the plus being infrastructure, as a water utility sustainability framework?
2 What do you believe are the most important economically sustainable practices for U.S. urban water utilities?
3 What do you believe are the most important environmentally sustainable practices for U.S. urban water utilities?
4 What do you believe are the most important socially sustainable practices for U.S. urban water utilities?
5 What do you believe are the most important infrastructure-related sustainability practices for U.S. urban water utilities?
6 What do you see as the most significant barriers to more widespread adoption of sustainability indicators?
7 Do you currently, or do you plan to, publicly report your utility’s sustainability performance, either through Global Reporting Initiative (GRI) formats or others?
Question 1
Indicate if you primarily interact with (choose one):
• Water utilities
• Wastewater utilities
• Both or combined water/wastewater utilities
Question 2
Provide up to 20 brief responses for the following. “LIST EXAMPLES OF SUSTAINABLE PRACTICES FOR U.S. URBAN WATER UTILITIES.” Do not research the answers. Rather, simply provide answers in the order they come to mind.
Indicator example
Practice 2: Education & Communication
Indicator 2.1: Does your utility have a public education program about its sustainability efforts?
Guidance: A public education program is externally‐focused and designed to build support for and awareness of utility operations and sustainability efforts.
1 – This activity is not practiced at our utility
2 – This activity is implemented, but only occasionally or without uniformity
3 – This activity is implemented, but there is room for substantial improvement
4 – This activity is largely implemented, but there is room for improvement
5 – This activity is fully implemented at our utility
Indicator 2.2: Does your utility have an effective communications plan that surveys stakeholders and engages them in dialogues?
Guidance: A communications plan solicits responses from and engages stakeholders before, during, and after service events and infrastructure activities.
1 – This activity is not practiced at our utility
2 – This activity is implemented, but only occasionally or without uniformity
3 – This activity is implemented, but there is room for substantial improvement
4 – This activity is largely implemented, but there is room for improvement
5 – This activity is fully implemented at our utility
7 Functions and teams
Function | ESET Lead | Team Members
Corporate Policy and Business Planning | Leisa Thompson | Leisa Thompson, Karen Neis, Bryce Pickart, Mike Mereness, Jim Schmidt, Sam Paske, Larry Rogacki, Jason Willett
Capability Forward Planning | Ned Smith | George Sprouse, Kyle Colvin, Dave Simons, Judy Sventek
Acquisition | Bryce Pickart | Deborah Peterson, Scott Dentz, Paul Dietz, Pat Oates, Jim Schmidt
Operations | Mike Mereness | Craig Edlund, Dan Fox, Rene Heflin, Dan Frey, Girma Yismaw, Dave Gardner, Mary Gail Scott, Lynn Schneider
Maintenance | Jim Schmidt | Nick Davies, Dan White, Tim Maranda, John Peick, Dave Quast, Jim Sailer, Tim Keegan
Rehabilitation and Replacement | Sam Paske | Jim Nally, Nick Davern, Adam Gordon, Jim Wawra, Dennis Lindeke
Business Support Systems | Larry Rogacki | Ricky Arora, Martina Nelson, Sara Landgreen, Dawn Ellis, Terrie O’dea, Roger Knuteson, Cammy Johnson, Laura Fletcher, Dan Vaaler, Judy Sventek, Matt Strickland, Matt Gsellmeier
Importance
1 – Very low importance to the utility's desired results; does not align with business drivers, strategic vision priorities, or major industry directions
2 – Low importance to the utility's desired results; does not align closely with business drivers, strategic vision priorities, or major industry directions
3 – Medium importance to the utility's desired results; aligns closely with business drivers, strategic vision priorities, or major industry directions
4 – High importance to the utility's desired results; aligns with top business drivers, strategic vision priorities, or major industry directions
5 – Very high importance to the utility's desired results; aligns closely with top business drivers, strategic vision priorities, or major industry directions
Urgency
1 – Can be delayed with very limited negative impacts to desired results
2 – Do when time allows with limited negative impacts to desired results
3 – Do later with minor negative impacts to desired results
4 – Do soon because of significant impacts/risks on desired results
5 – Do now because of major impacts/risks on desired results
Results
Successful and efficient process. Added to past industry best practice:
– Integration of metrics with practices
– Urgency and Importance scale
– Using 1-5 cards
– Cross-functional participation
– Mapped to Strategic Vision (be a leader in water sustainability)
– Included multiple tools (EUM, ISO 55000, NACWA, …)
• Applied at the subprocess level in AMCV
• Helps to prioritize gaps to address
Score: 1 2 3 4 5
Importance
1 – Very low importance to the utility's desired results; does not align with business drivers, strategic vision priorities, or major industry directions
2 – Low importance to the utility's desired results; does not align closely with business drivers, strategic vision priorities, or major industry directions
3 – Medium importance to the utility's desired results; aligns closely with business drivers, strategic vision priorities, or major industry directions
4 – High importance to the utility's desired results; aligns with top business drivers, strategic vision priorities, or major industry directions
5 – Very high importance to the utility's desired results; aligns closely with top business drivers, strategic vision priorities, or major industry directions
Urgency
1 – Can be delayed with very limited negative impacts to desired results
2 – Do when time allows with limited negative impacts to desired results
3 – Do later with minor negative impacts to desired results
4 – Do soon because of significant impacts/risks on desired results
5 – Do now because of major impacts/risks on desired results
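The Importance and Urgency scales above are used to prioritize which assessment gaps to address first. As a minimal sketch of one way this could be done, the example below ranks hypothetical gaps by multiplying the two 1-5 scores; the memorandum does not prescribe a combination formula, so the multiplication rule, tie-breaking, and example gap names are illustrative assumptions only.

```python
def prioritize_gaps(gaps):
    """Sort gaps from highest to lowest priority.

    Each gap is a (name, importance, urgency) tuple, with scores on
    the 1-5 Importance and Urgency scales defined above. Combining
    the scores by multiplication is an assumption for illustration.
    """
    for name, importance, urgency in gaps:
        if not (1 <= importance <= 5 and 1 <= urgency <= 5):
            raise ValueError(f"{name}: scores must be on the 1-5 scale")
    # Priority = importance x urgency (assumed); ties broken by urgency.
    return sorted(gaps, key=lambda g: (g[1] * g[2], g[2]), reverse=True)

# Hypothetical gaps for illustration (not from the assessment results).
gaps = [
    ("Asset condition data", 4, 5),
    ("Customer engagement plan", 3, 2),
    ("Energy recovery options", 5, 3),
]
ranked = prioritize_gaps(gaps)
print([g[0] for g in ranked])
# → ['Asset condition data', 'Energy recovery options', 'Customer engagement plan']
```

A utility could substitute any other combination rule (for example, treating urgency as a tie-breaker only) without changing the overall approach of scoring each subprocess gap on both scales.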
Drivers Update
• Regulation and Standards
• Financial
• Sustainability
• Customer Expectations
• Resources and Service Demand
• Workforce Evolution
• Asset Lifecycle Management
• Security
• Efficiency and Continual Improvement
• Technology
• Other
• How did you undertake the process?
— Who led the process and what was their title?
— Who did you involve to do the ratings?
• How much time did you spend undertaking the process?
• How do you feel the process went?
• What did you like/not like about the current process?
• What did people want reported, what gave the most value and why?
• What did you like least about this process?
• How are you going to share the results from this process with stakeholders – the project and the assessment?
• Comment on the ability to undertake the action plan? How …
Appendix D: Benchmarking Workshop Day 2 Evaluation Survey Questions
Appendix D includes the Benchmarking Workshop Evaluation Survey Questions developed in the breakout groups.
Straw Proposition of Questions for Past Experience
Can questions help compare the outcomes or satisfaction of a utility using the facilitated vs. not facilitated process?
Assess before and after attitudes about importance of various components of the survey
PROCESS
o What option did you pick? Solo or facilitated
Why did you pick the option?
Would you do it the same way again? [does this question have value?]
o How did you undertake the process?
Who led the process and what was their title?
Who did you involve to do the ratings?
Did you involve stakeholders outside of your utility? (boards, consultants, customers, advisory committees: multiple choice)
Identify the types of staff involved in the project. (front‐line staff, management)
Did you involve engineering, operations, maintenance, HR, customer service? (Include an estimate of the hours involved.) (Reference AWWA survey for job categories.)
How much time did staff spend undertaking the process (total person‐hours)?
o How did you feel about each element of the project (1‐5)? (training, assessment, validation, leading practices conferences, networking/sharing, deliverables, separate and combined)
o Did the process meet your expectations?
o What are your suggestions to improve the process?
o Was it helpful (did it add value) to have the ability to map to EUM and ISO? (1‐5)
o How did you feel about having metrics and measures? (Were they relevant/meaningful to your utility?)
o Was it helpful (did it add value) to have the ability to rate importance and urgency?
ACTIONS
o What implementation plans do you have? (add time periods and the number of initiatives: 0‐6 months, 7‐12 months, etc.)
o Comment on the ability to undertake the action plan in the utility report (plan as recommended), as well as what you decided to do outside of the report.
o How will you use the outcomes of the process/tool? What else could have been provided that wasn’t? What would make the tool more complete?
VALUE
o What did you like/not like about the current process?
Would you recommend it to others (consortia, individual)?
Would you do it again, and if so, when (yearly, every 4 years, etc.) (consortia, individual)?
How much would you be willing to pay (consortia, individual)?
What would you carry forward?
Enhance?
Add?
Drop?
o Have you used other benchmarking or self‐assessment or survey tools this year?
o Did the project provide additional value over previous benchmarking experiences?
What aspects provided the additional value? (provide list of options)
COMMUNICATION
o Do you plan to share the results from this process with stakeholders (provide a list of different stakeholders for selection) – the project and the assessment?
Straw Proposition ‐ Future Opportunities
Free form responses with examples
1. Actions and opportunities to go forward using the results (VERY HIGH)
a. Ways to address the gaps
b. What do you feel is the value proposition for the utility going forward
i. Consortia
ii. Stand alone
c. Contacts and connections to others – leading practices…?
i. Follow‐in activities post‐benchmarking
ii. Experiences of others
iii. Pointers to other resources – e.g. AWWA, WSAA, WEF,…
d. Vehicles for Sharing with others?
i. Conferences, regional workshops, other venues
e. Analysis for patterns on gaps?
i. Ways to share with peers
ii. Creation of new content
iii. Trends over multiple benchmarking results of gap closure, support for the future
2. Cost and value proposition (VERY HIGH)
a. What was the learning/value of individual vs group process?
• Impact of the process?
— Quantification of benefits
b. Initiatives justification
c. Cost and value for money
• Cost Threshold (barrier, willingness to pay)
• Facilitated vs self‐assessment – comparison of benefit, time, etc.
• Time and effort spent vs value
3. Frequency (MEDIUM)
What would you like to see for the future frequency of the process?
a. Consortia (group process)
b. Individually (per utility)
4. Mapping to other tools, processes and resources (HIGH)
a. How much flexibility is desired, both consortia and individual?
b. Links to Strategic plan
c. Add or modify measure
d. Add new topics
— Add additional metrics
— Mapping of benchmarking tool linkage to other tools?
— Continue or expand?
— Link to utility specifics (e.g. strategic plan, etc.)
— What level of integration of tools is desired in the future?
• Separate
• Partial
• Full
Straw Proposition for Evaluation of the Tools
1. How did you feel about the functionality of the Framework and Tools? (HIGH)
a. Tools/frameworks used?
b. Level of desired integration? More? Less? (e.g. metrics and practices, SAM Gap/SIMPLE, EUM)
c. Additions to the tool content? Expectations?
d. What outputs would make the tool more useful? Format, content, post processing
i. Examples of tool use and results
ii. How was the output used?
iii. What was missing?
2. Would you undertake this again? (MEDIUM)
a. What would help or hinder you?
b. Tools – ease of use, reporting, etc.?
3. How often would you undertake it? (MEDIUM)
a. Industry consortia benchmarking
b. Internal assessment
4. Evaluation of components of AMCV (HIGH)
a. Functions 1‐6 vs 7?
b. Importance and urgency
c. Rating scale – how to improve?
d. What do you feel are the elements of a valuable tool?