PERFORMANCE HANDBOOK
Improving Performance at the County of San Mateo
County of San Mateo
Revised August 23, 2016
“Never neglect an opportunity for improvement.”
– Sir William Jones
CUSTOMER SURVEYING
Surveying to Improve Performance
County of San Mateo
Revised August 23, 2016
“Your staff may have the skills and know-how to
interact with your customers. But what
organizational strategies can you employ to
please customers? Practice proactive customer
service by making your customers happy
before they come to you with problems.”
– Survey Monkey
Contents
San Mateo County strives to provide excellent service to the public
About this Chapter
Surveying to Improve Department Performance
County Manager’s Office Commitment
General Survey Information
Encourage Customer Feedback
Know Your Customer
Enterprise-wide Survey Account
Improving Survey Response Rates
Why are you conducting the Survey?
The appropriate person should complete the survey.
Keep the survey short and the questions simple.
The Survey should be convenient for the customer
Survey timing and frequency
Send Reminders
Offer incentives to customers for filling out your survey
Survey Methodology
Survey Formatting
Question Wording
Types of Responses
Types of Questions
Advanced Topic - Sampling
Strategies for Reporting and Using Customer Service Data to Improve Performance
Staff Meetings
Performance Dashboards
Department Customers
Resources
1. County Manager required Overall Satisfaction Question
2. Survey Monkey
3. Examples of Department Survey Improvements
4. Examples of Likert Scaled Responses Used in Data-Gathering
San Mateo County strives to provide excellent service to the public
County departments provide a varied range of services that target and reach
different populations and communities. Residents obtain or receive County services
through different modalities (in-person, by phone, online, by mail), in different
locations, and with different levels of frequency and duration. The County values the
feedback of residents and the clients/customers we serve as an important source of
information for continuing to improve our services.
The County Manager’s Office (CMO) encourages ALL departments to adopt an
approach to obtaining meaningful feedback from clients/customers that can be used
to inform departments’ service improvement efforts.
The CMO is a resource to departments in obtaining meaningful feedback from clients/
customers as a component of departments’ broader performance improvement
efforts. The CMO also supports departments’ adoption of sector-specific tools that
provide meaningful client/ customer satisfaction feedback, and welcomes the
opportunity to learn from those tailored tools to inform efforts in other arenas.
About this Chapter
This resource guide is the result of a workgroup made up of multiple departments that
looked closely at how the County of San Mateo could improve customer service.
Based on the FY 2014-15 Year End Performance Survey, customer service was
determined to be an area that needs improvement. The County currently measures
customer service performance across all departments with one measure:
“Percent of responses rating customer service as good or better”
Countywide Range: 72% - 100%
Countywide Average: 91%
While the customer service countywide average is high, many departments reported
that they need assistance developing surveys that are meaningful and capture
customer experience information that can be used to improve performance. In
addition, departments indicated that they would like to improve the number of
responses received.
Surveying to Improve Department Performance
This resource guide was developed to encourage and support department-driven
efforts that aim to enlist customer feedback through surveys with the goal to inform
ongoing improvement efforts and improve department performance.
In order to impact performance that will ultimately improve the customer experience,
it is important for departments to have a good understanding of the existing customer
experience. Surveying customers to enlist customer feedback is one way to capture
this information. The County Manager's Office recommends that departments survey
their customers and use the information collected to continuously improve service.
This toolkit provides survey development guidance to departments in the following
areas:
Development of surveys that ask meaningful questions so that information
captured can be used to improve department performance
Increasing survey response rates
Creating online surveys that are convenient for the customer
County Manager’s Office Commitment
In order to promote and support Departments’ efforts to collect and use client
feedback in driving performance improvement, the CMO will:
Bi-annually (during the “off year” of the County’s 2-year budget) ask
departments to relay their methods for collecting and using client feedback in
promoting ongoing improvement
Learn promising practices from departments that adopt department-specific
client feedback tools as well as approaches for using client feedback data to
drive ongoing improvement and capture it in this Resource Guide
Meet with departments upon request to provide survey technical assistance
Tip:
“83% of companies who see themselves as successful actively
measure customer satisfaction.” – Survey Monkey
General Survey Information
Encourage Customer Feedback
In order to improve customer service and department performance, it is important to
understand your customer experiences, both positive and negative. Surveying your
customers and providing a way for them to give feedback makes it easier to learn
where a department is succeeding and where improvements are needed. Providing
customers with a way to express their displeasure through a survey can also help
prevent them from voicing their frustrations on social media pages.
Know Your Customer
As stewards of taxpayer dollars, the County of San Mateo has an interest in improving
customer service. Given that the County offers a variety of services to different types
of customers, it makes sense for each department to conduct surveys that are
targeted to their specific customers. If the department provides multiple services and
each service has different customers, separate surveys should be considered for each
customer base.
Every department has a customer. While the majority of County departments have
external customers that are members of the public, it is important to note that all
departments have internal customers, which are often other County departments.
Since County business often relies upon the collaboration of many internal partners, it is
important to survey internal customers in addition to external customers. The feedback
that is obtained from these surveys can be used to improve collaboration and provide
a mechanism for fellow co-workers and colleagues to offer insight into how processes
can be improved.
Enterprise-wide Survey Account
The County of San Mateo is expecting to implement an enterprise-wide survey
account that is available to all departments. The expected roll-out date is January
2017. To obtain your log-in, contact the Information Services Department.
Log-in here: https://www.surveymonkey.com/?ut_source=header
There are a number of benefits to using the enterprise-wide survey account, including:
Survey Monkey offers templates and tools for survey development:
A collaboration tool allows employees to create, edit, and analyze surveys as a
team and manage group projects.
You can share data and results with other employees
Custom County templates are available
There is a library for logos, documents, and more
You do not have to use your personal credit card for access to Survey Monkey,
and you can skip the step of getting reimbursed from Purchasing
Survey results are analyzed for you
Text analysis is available for open-ended questions
Improving Survey Response Rates
There are a number of ways to improve survey response rates. Below are a few
suggestions that can impact the number of survey responses departments receive. For
more information, visit the Survey Monkey website which has a comprehensive listing
of ways to improve response rates.
Why are you conducting the Survey?
The first thing to do when developing a survey is to articulate the purpose of the
survey. What information do you want to know, and for what purpose? Answering
these questions will help you focus the survey to get the information you are looking
for. For example, perhaps you would like to conduct a survey to get the public’s
reaction to a new business process recently implemented by your department. The
information gathered through the survey may help you determine whether or not you
want to continue that process, or the feedback may suggest areas where the process
can be improved. For this example, your questions should be focused on the process.
The appropriate person should complete the survey.
It may seem obvious – the person receiving the service should also take the survey.
However, we found that emailed surveys in particular did not always go to the person
who received the service. For example, if an Administrative Assistant is arranging for a
service to be performed for someone else, the survey should be sent to the person who
actually receives the service, not the Administrative Assistant.
Keep the survey short and the questions simple.
When developing a survey, it is natural to want to ask as many questions as possible of
the customer while you have a captive audience. Unfortunately, this often results in
surveys that are too long and unfocused, and in data that is never used to improve
performance. By asking as few questions as possible, you will likely see the response
rate increase. In addition, you will have a manageable amount of data and feedback
that can be easily summarized and used to improve department performance.
Keep the questions simple. Avoid using complicated language and acronyms that are
not easily understood by the public.
Note: Remember to ask the overall satisfaction question that must be reported to the
CMO twice per year. See the Resources section at the end of this document for the
question wording.
The Survey should be convenient for the customer
The County has a long history of surveying clients through paper surveys available at
service counters. While there is still a need for paper surveys, most people prefer to
respond to surveys on their mobile devices. Therefore, to ensure that your survey is
convenient for the customer, easy to access, and to improve your response rate,
consider sending your customers an email with a link to an online survey. This will
allow your customers to access the survey wherever they are, making it more likely
that they will respond. If you would like a template for an email that contains a survey,
see the Resources section of this guide. If you are not collecting your customers’ email
addresses, now is a great time to start!
Survey timing and frequency
In order to maximize response rates, some thought and consideration should be put
into the timing of conducting your survey. For example, if the customers that you are
surveying are tax preparers, you probably don’t want to survey them during the
months of February, March or April when they are in the middle of tax season.
Surveys should also be sent as soon as possible after the service is provided so that the
customer experience is fresh in the customer’s mind. Common practice is to survey
once or twice a year. That level of frequency may be appropriate, for example, for
long term projects that span over the course of a few years. However, for departments
that provide service on an on-going basis, surveying customers once per year may not
be often enough. It would be very difficult for a customer to recall their experience
with a County service if they are being surveyed 6 months after they received the
service. The appropriate level of frequency for surveying can vary. If you would like
assistance determining how often to survey, contact the County Manager’s Office for
assistance.
Send Reminders
Where appropriate, send out survey reminders to those who have not responded. This
can increase response rates.
Offer incentives to customers for filling out your survey
The County Manager’s Office sees value in providing small incentives to customers for
responding to surveys – within reason. For example, a department could offer a small
token of appreciation to customers for filling out a survey. The Planning and Building
Department is considering offering tape measures with the County logo to customers
that fill out their paper surveys at the Planning and Building service counter. Entering
respondents’ names into an opportunity drawing for a small prize is also acceptable.
According to Survey Monkey, incentives can boost response rates by 50%.
Tip:
“Keep the incentive appropriate in scope. Overly
large incentives can lead to undesirable
behavior, for example, people lying about
demographics in order to not be screened out
from survey.” – Survey Monkey
Survey Methodology
Below are notes to assist in developing or revising surveys, including how to
format a survey, question wording, types of responses, types of questions, and an
advanced topic section on how to properly sample your population for best results.
Survey Formatting
Distinguish any instructions from the questions themselves with a different font
face, such as bold or italics
Do not break text between pages – keep question and answers together on
same page
Use the most general question first (e.g., “How satisfied were you with the overall
service?”)
Start with questions that are easy to read, important topics of interest and close-
ended
Place any demographic information at the end of the survey
Keep questions on similar topics together
Use transitional statements between different topic areas to help guide the
respondent through the survey
Keep the survey as short as possible – only essential questions
Question Wording
The wording of questions should be very simple and written at a 5th grade reading
level at the highest.
Do NOT use:
Leading – Do not lead a participant to your preferred response, for example: “We think
our customer service representatives are really awesome. How awesome do you think
our customer service representatives are?” This would pressure respondents into
answering more favorably than they actually feel.
Compound – Only ask one topic at a time. For example, do not ask, “Was the staff
responsive and professional?” You will be unable to determine which part of the
question the customer is evaluating in their answer.
Unclear/Ambiguous – Questions should be specific. Avoid words that can have
multiple meanings to different individuals. For example: “Do you think there should be
more strict environmental rules?” The word strict could have different meanings
depending on who reads the question. Or “How awesome do you think our customer
service representatives are?” The word awesome is a vague generalization.
Invasive/Personal – Questions about personal information, such as income, can be
hard to phrase in a non-intrusive way. A solution is to present categories rather
than asking for a specific number (e.g., $20k–$30k).
“Not” – Avoid negative questions. Using the word “not” in a question will require
you to reverse-code the responses – a “strongly agree” would have to be coded as a
“strongly disagree,” and vice versa. For example, “My needs were not met by my
interaction today” would be better asked as “My needs were met by my interaction
today.” (A reverse-coding sketch appears at the end of this list.)
Unfamiliar Words/Abbreviations – Avoid using technical jargon. Spell out any
acronyms.
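As promised above, here is a minimal sketch of reverse coding in Python; the 5-point scale and the example responses are illustrative assumptions, not a County standard:

```python
# Reverse-code responses to a negatively worded question on a 5-point
# Likert scale (1 = strongly disagree ... 5 = strongly agree) so that
# higher numbers always mean a more positive experience.
SCALE_MAX = 5

def reverse_code(score: int) -> int:
    """Map 1 -> 5, 2 -> 4, 3 -> 3, 4 -> 2, 5 -> 1."""
    return SCALE_MAX + 1 - score

# Hypothetical answers to "My needs were NOT met by my interaction today"
responses = [5, 4, 2, 1, 3]
recoded = [reverse_code(r) for r in responses]
print(recoded)  # [1, 2, 4, 5, 3]
```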
Types of Responses
Open-Ended – Questions that do not place restrictions on the answers provided
Use when you are unsure of the ranges of responses and hope to conduct a
preliminary exploration of a topic
Pros
o Yield more varied responses
o Can highlight unanticipated responses
o Useful for piloting, to obtain proper range of responses
Cons
o Time to read through and code responses
o Difficult to report results for the entire group
o Difficult to determine respondents’ interpretation of the question
Close-Ended – Respondents must choose among specific response options
Use when you know the specific information needed to answer a question and
when a single frame of reference is required among respondents
Pros
o Allow for the same frame of reference for all participants when choosing
an answer
o More specific than open-ended questions
o More likely to produce consistency in understanding the question and
responses
o Faster to code responses
o Allows for more systematic analysis of data collected
Cons
o No room for respondents to express responses in their own words
Avoid Yes/No Response Choices – Rarely will a respondent feel comfortable
giving a black and white answer choice – a continuum better reflects actual
feelings
Likert Scale – Used to rate each item on a response scale
o A 5 point scale is very common and would include the items: “strongly
agree,” “agree,” “neutral,” “disagree,” and “strongly disagree.”
o See Attachment E in the Resources section for more examples
Types of Questions
Many of the current survey questions for the County are focused on service quality
rather than the impact of services on a participant’s knowledge, skills and abilities
(KSA), and behavior. Service quality is a much easier concept to measure in surveys.
There are two ways to measure impact: actual impact or the participant’s perception
of impact. Of the two, perception of impact is far easier to measure via surveys.
Actual Impact
Actual impact requires a hard test, involving both pre- and post-testing. For
example, if the goal of a program is to increase a customer’s knowledge, you
would have to administer a test before the service and after the service to
determine whether the score improved.
Perception
Alternatively, you can ask a participant whether they felt their KSA or other
intended impact has improved as a result of the services. This still provides some
measure of impact and can be easier and more efficient to obtain through a
survey.
Structure of Questions
o One way to structure these questions is a simple statement coupled with a
Likert Scale response: “My knowledge has increased as a result of the
services provided today” with response options of “strongly agree,”
“agree,” “neutral,” “disagree,” and “strongly disagree.”
o You can also survey respondents at the end of a program or service and
have the customer rate their KSA at the present time and how they would
have rated themselves before the program. [Example rating layout omitted.]
o A similar method that may be helpful for those who speak other languages,
or have a lower comprehension level, is a visual representation using a step
scale, where a “B” indicates the level before the service and an “A” the level
after receiving the service. [Step-scale illustration omitted.]
Advanced Topic - Sampling
Sampling is incredibly important when it comes to surveying. Without proper sampling
techniques, the degree to which the sample differs from the population of customers is
unknown. In other words, if you don’t sample properly, you will only know information
about those from whom you collected your survey data – you will not be able to
extrapolate those findings to your entire customer population.
Some quick terminology first. Population refers to all of the people who receive your
services. A sample is a subset or small representation of that population. In some
cases, your population may be small enough that sending a survey to the entire
population is feasible – usually around 500 people or less. With very large populations,
sampling is often a more cost-efficient way to obtain the customer satisfaction
information you seek.
Sample Size. To determine how many surveys you need to send out in order to have
results that precisely reflect the target population, there are some simple online
calculators:
http://www.surveysystem.com/sscalc.htm
For this calculator, leaving the confidence level at 95% is fine. This means you are 95%
confident that the true percentage of the population lies within the confidence
interval. The confidence interval, also known as the margin of error, is the plus-or-minus
figure you frequently hear reported. For example, if you have a confidence interval
of 5%, and 30% of your sample picks a response, you can be 95% confident that
between 25% and 35% of the entire population would have picked that answer.
Finally, this calculator requires that you have some idea of how large your population is.
Example: For a 95% confidence level, with a confidence interval of 5% and a total
population of 750, your needed sample size is 254 people. This is how many responses
you need. If you normally have a 40% response rate for your survey, divide the sample
size by the response rate: 254 / 0.40 = 635 surveys need to be sent out to obtain a
sample of 254 responses.
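If you prefer to check the arithmetic yourself, here is a minimal Python sketch of the standard sample-size formula with a finite population correction. It reproduces the 254 figure above; the exact rounding convention used by the online calculator is an assumption:

```python
import math

def sample_size(population: int, margin: float = 0.05, z: float = 1.96,
                p: float = 0.5) -> int:
    """Sample size needed to estimate a proportion at a given margin of error.

    Computes n0 = z^2 * p * (1 - p) / margin^2, then applies the finite
    population correction n = n0 / (1 + (n0 - 1) / population).
    z = 1.96 corresponds to a 95% confidence level, and p = 0.5 is the
    most conservative assumption about how the population will answer.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return round(n0 / (1 + (n0 - 1) / population))

needed = sample_size(750)           # 254 responses, as in the example
to_send = math.ceil(needed / 0.40)  # 635 surveys at a 40% response rate
print(needed, to_send)
```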
There is a second tool on the website with which you can determine the confidence
interval based on survey results you may already have. This requires the confidence
level you want (again, 95% is fine), your sample size, your population, and the
percentage – which refers to what percentage of your survey respondents selected
the answer for that item of the question.
Example: Suppose you received 49 surveys, your entire population is 850, and you are
interested in a question where 51% of the respondents chose “strongly agree”. The
output of this calculator says your confidence interval is 13.6%, which means you can
be 95% confident that between 37.4% and 64.6% of the population would have chosen
“strongly agree”.
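The same calculation can be sketched in Python; this uses the usual standard-error formula with a finite population correction and matches the 13.6% figure above (the calculator’s exact internals are an assumption):

```python
import math

def confidence_interval(sample: int, population: int, pct: float,
                        z: float = 1.96) -> float:
    """Margin of error, in percentage points, for an observed proportion.

    Computes z * sqrt(p * (1 - p) / n) with the finite population
    correction sqrt((N - n) / (N - 1)); z = 1.96 gives 95% confidence.
    """
    p = pct / 100.0
    se = math.sqrt(p * (1 - p) / sample)
    fpc = math.sqrt((population - sample) / (population - 1))
    return round(z * se * fpc * 100, 1)

moe = confidence_interval(sample=49, population=850, pct=51)
print(moe)                 # 13.6
print(51 - moe, 51 + moe)  # 37.4 64.6, as in the example
```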
Now that you know if you need to send out your survey to the entire population, or to
a sample of your population, how do you know how to pick the people to send the
survey to?
There are two ways to sample:
Probability Sampling means that each member of the population has a known, non-
zero chance of being selected. Non-Probability Sampling occurs when members are
selected from the population in a non-random way. Without getting too complicated –
you should try to use probability sampling techniques and avoid non-probability
techniques.
A common and easy-to-implement sampling method to use is systematic sampling.
This is where you would survey every 10th (or any number: 3rd, 5th, 22nd, etc.) person who
comes in for a service. This can also be applied retroactively if you have a list of
customers: you would choose every nth customer from the list to send a survey. You
determine what number to use based on your population and your desired sample
size. For example, if you have a list of 200 clients and you want to sample 50, you take
200 divided by 50 and would select every 4th person on the list to survey, as sketched below.
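Here is a minimal Python sketch of systematic sampling from a client list. The random starting offset is a common refinement we have assumed; the client names are placeholders:

```python
import random

def systematic_sample(clients: list, sample_size: int) -> list:
    """Select every nth client from a list, starting at a random offset.

    n is the population size divided by the desired sample size
    (200 clients / 50 desired = every 4th client, as in the example).
    """
    step = len(clients) // sample_size
    start = random.randrange(step)  # random start avoids always picking the 1st
    return clients[start::step][:sample_size]

clients = [f"client_{i}" for i in range(1, 201)]  # hypothetical list of 200
chosen = systematic_sample(clients, 50)
print(len(chosen))  # 50
```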
A common and easy to implement sampling method to avoid is convenience
sampling. The sample is selected because of their convenient accessibility and
proximity to the surveyor. For example, you have some clients that are served in office
and some that are served in the community. Convenience sampling would be
choosing to survey only those that come into the office.
Keeping in mind how the people you are surveying differ from the entire population
you serve is good practice.
Tip:
Simple surveys get more responses. For best results, the survey should take less than 11
minutes to complete and should be as short as possible – preferably 45 questions or
fewer.
https://www.surveymonkey.com/blog/2012/04/12/good-surveys-and-bad-surveys/
Strategies for Reporting and Using Customer Service Data to Improve Performance
You conducted your survey and received data and information via the survey
responses – now what? This data and information can be used to improve department
performance. However, the information must be shared with the staff that can
implement change. Data and information gathered through these surveys should be
shared regularly with executives, department directors, managers and staff. The data
should be used to help managers identify where customers are satisfied with County
services and where improvement is needed.
Staff Meetings
It is good practice to include, on a regular basis, survey results as a discussion topic on
your staff meeting agenda. Sharing survey results at staff meetings is a great way to
ensure that all staff members are aware of the department’s overall customer service
performance. Putting the item on the agenda also ensures that time will be set aside
to discuss with staff how feedback can be used to improve overall customer
satisfaction and department performance. Keeping employees informed of customer
feedback on an ongoing basis allows them to see customer satisfaction improvement
over time as their suggestions are operationalized. Keeping them in the loop will also
help keep them motivated to improve.
Performance Dashboards
Performance dashboards are used by the County to track the performance of all
County departments and programs and to provide those results to the public. The
County Manager’s Office requires departments to report program performance at
least twice per year through the dashboards. At least one question that is reported on
dashboards to the County Manager’s Office comes directly from customer surveying.
Department Customers
Survey results can be shared with customers to show how the County plans to improve
services. Let customers know that you have heard and responded to their concerns by
circling back to them after they have completed a survey to let them know how you
have addressed their concerns. It is also good practice to thank them for their positive
feedback and let them know that the department will continue the best practice that
was commented upon.
Resources
1. County Manager required Overall Satisfaction Question
The County Manager’s Office requires programs to report on customer service
feedback twice per year. Those results are aggregated to show overall
department performance.
Please ensure that you include the “overall customer satisfaction” question in
your survey – the County Manager’s Office will ask for the results.
Although each department may ask the customer satisfaction question a bit
differently, here is one example of how the overall customer satisfaction
question can be asked (a tallying sketch follows the response options):
Overall, how would you rate the quality of your customer service
experience?
o Excellent
o Good
o Neutral
o Poor
o Very poor
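To show how such responses roll up into the countywide measure (“Percent of responses rating customer service as good or better”), here is a minimal Python sketch; the tally logic and sample responses are illustrative assumptions:

```python
from collections import Counter

# Hypothetical responses on the five-point scale shown above.
responses = ["Excellent", "Good", "Neutral", "Good", "Poor", "Excellent"]

counts = Counter(responses)
good_or_better = counts["Excellent"] + counts["Good"]
pct = 100 * good_or_better / len(responses)
print(f"{pct:.0f}% of responses rated customer service as good or better")  # 67%
```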
2. Survey Monkey https://www.surveymonkey.com/?ut_source=header
3. Examples of Department Survey Improvements
There are many ways for a department to capture its customer service
experience. Below we have provided some examples of how County of San
Mateo Departments are surveying their customers.
Email Survey – Attachment A
Because business is frequently conducted on mobile devices, sending a link to
an online survey via email is an excellent way to obtain customer feedback.
Using Survey Monkey, you can upload your contacts into Survey Monkey
Contacts. You can then create email lists and custom invitations. By sending the
link in an email, you can track responses and send reminders to those that have
not responded.
Public Works sends out some of its surveys through email with a link to an online
survey. See attachment A for an example of an email that contains a link to an
online survey.
Kiosk Survey – Attachment B
The Parks Department uses iPads to conduct surveys to gain feedback in real
time.
iPads are used to survey park visitors and participants in programs such as Take
A Hike and educational programs. The benefit of surveying participants in this
manner is obtaining real-time evaluation of experiences at one point in time
and potentially engaging more participants in surveys.
The abbreviated Visitor In-Park Survey (Attachment B) is the tool that was
developed in partnership with San Francisco State University’s Department of
Recreation, Parks and Tourism. The Parks Department asked visitors to complete
surveys that were loaded onto tablets. A small incentive was offered to people
to complete the survey (one-day park pass). The attached survey example is
long and is not recommended for routine surveying. As with other surveys, shorter is
better. Depending on participants’ comfort level with electronics, it took from 7 to 9
minutes to complete.
Website Survey
The County’s Information Services Department can help you put a survey on
your department’s website. This is an effective way to survey customers that
conduct business on the website. If your customers do not go to the website to
conduct business, this method may provide another opportunity for customers
to offer feedback, but it may not yield many responses.
Social Media
Online surveys can be tweeted on Twitter or posted on Facebook. More
information about how to do that with Survey Monkey can be found here:
https://www.surveymonkey.com/mp/tour/responsecollection/
Paper Survey – Attachment C
While most surveys are now web based, the paper survey is still relevant for the
County. For example, the Agriculture, Weights and Measures department has
implemented best practices while using paper surveys. When conducting
inspections, Department staff frequently visit customers in person. The
Department has recently revised its survey to focus questions on customers
that receive in-person visits. The survey, which is printed on postcard-sized paper
with postage paid, has only five focused questions. Department staff can easily
and frequently hand the survey to the customer after an interaction. Thanking
the customer for filling out the card and returning it is good customer service
practice and encourages a response.
For an example of a paper survey, see Attachment C.
Department Specific Surveys – Attachment D
The Health System’s San Mateo Medical Center (SMMC) has adopted a robust
client satisfaction measurement tool that yields frequent and meaningful
data on SMMC’s patients’ experiences in obtaining services. This tool, from a
vendor named Press Ganey, is used by healthcare organizations across the
country and therefore also allows SMMC to benchmark its performance with
those of peer institutions. The tool also allows departments within SMMC to
measure whether specific actions focused on improving client satisfaction result
in “moving the needle” on overall satisfaction levels.
A specific effort SMMC adopted, based on its leaders’ regular review of Press
Ganey measurement results and reading of all patient comments, is the “WE
CARE” program. WE CARE stands for Welcome with a smile, Explain who you are,
Communicate clearly, Ask how you can help, Respond respectfully, and Express
gratitude.
SMMC leaders noticed a significant number of comments from patients related
to courtesy of service, acknowledgment of patients by the front-line staff who
interact with them, and not feeling “listened to”. SMMC determined that it
needed to set an expectation of respectful, empathetic and courteous service
across the organization, invest in training all staff on how to meet that
expectation, and validate whether such training is being adhered to after it is
delivered. Between January and April of 2016, the WE CARE effort has:
Trained 580 SMMC staff in a 1-hour in-person training conducted at the
medical center and clinic worksites, across all three shifts in which SMMC staff
work (about 50% of staff trained so far);
Enlisted patients, family members (our Improvement Partners), and staff
involved in a Family Advisory Council to validate the training by following up
at sites in which the training was delivered to document adherence to key
topics covered in the training, such as making eye contact with the patient,
acknowledging their needs, repeating back their service request… (see
Attachment D, “Validation Tool.”) SMMC has validated more than 300
interactions in 6 months.
Organized required follow-up trainings for any worksites/teams that scored
lower than 80% in the validation review conducted by the Advisory
Council members.
There are some promising indications that this approach – a robust client
satisfaction tool used by leaders to drive improvement – is working. SMMC
is seeing a decrease in the number of patient complaints related to customer
service – from 16 of 47 (34%) client complaints in March 2016 to 10 of 44 (24%) in
April 2016. SMMC leaders continue to read every patient comment that is
submitted with the Press Ganey survey results to monitor key trends and inform
further actions.
Attachment A – Public Works
Email Survey
Subject: Completion of Service Request [Auto Insert #]

Dear Customer,
The Construction Services Team has completed your request for the following service: [Auto
Insert- Brief Description of Work]
The Team would be grateful if you could take 30 seconds to complete a customer survey:
[Insert Link]
We are always looking for ways to improve our service and we value your feedback.
Facilities appreciates the opportunity to serve you and hopes that we have completed
the requested work to your satisfaction.
Thank you,
Facilities Customer Service
This is an auto-generated message from an unattended mailbox. Please do not reply to this
message. To contact Facilities Customer Service please call 650-363-4444.
Attachment B - Parks
iPad Kiosk Survey
Help San Mateo County Parks and Receive a Free Parks Day Pass
The San Mateo County Parks Department would like to know about your experiences
in this park today to help them serve you and other visitors better in the future. Upon
completing this brief survey you will receive a complimentary county parks day pass.
Your responses will be kept confidential. You are one of the few persons taking the
survey so your feedback is very important. San Francisco State University is providing
technical and analytical support in this effort.
1) What is the name of this San Mateo County (SMC) park, preserve or trail you are in now?
Check just one name.
( ) Coyote Point Marina ( ) Coyote Point Recreation Area
( ) Crystal Springs Regional Trail ( ) Devil's Slide Trail
( ) Edgewood Park & Natural Preserve ( ) Fitzgerald Marine Reserve
( ) Flood Park ( ) Friendship Park
( ) Huddart Park ( ) Junipero Serra Park
( ) Memorial Park ( ) Mirada Surf
( ) Moss Beach Park ( ) Pescadero Creek Park
( ) Pillar Point Bluff ( ) Quarry Park
( ) Sam McDonald Park ( ) San Bruno Mountain State & County Park
( ) San Pedro Valley Park ( ) Wunderlich Park
2) Including today, how many times have you visited this park in the last 12 months? ___
3) On this visit, what kind of personal group (alone, family and/or friends, not a guided group
or other organized group) are you with today? Check just one type.
( ) Alone
( ) Family
( ) Friends
( ) Family and friends
( ) Other - Describe: _________________________________________________
4) How many people are in your personal group today at this park, including yourself?
__ Number of persons in group
5) We are interested in knowing the number of persons in your group and their ages. Enter the
number of people in your personal group within each of the following age categories. Do not
enter your actual age.
__ Under 6 __ 6-12 __13-18 __19-24 __25-34
__35-44 __45-54 __55-64 __65-74 __75+
43) Are you willing to provide your email address so we can send you a follow-up survey and
be entered into a drawing for a $100 prize or annual county pass and help improve San
Mateo County parks?
* SFSU and SMCP will not share your email address with anyone.*
( ) Yes. If Yes, continue with next question
( ) No. If No, go to end of survey
44) Please provide your name so we can contact you if you win.
_________________________________________________
45) Provide your email address so you can be entered in a drawing to win a prize after
completing the follow-up survey.
If you would like to see the entire survey, please contact Carla Schoof at (650) 549-1306.
Attachment C – Agriculture, Weights and Measures
Paper Survey
[Postcard survey image omitted.]
Attachment D – Health System
WECARE Improvement Partner Observation/Validation

Observer Name:                         Unit/Dept.:
Employee Name:                         Date/Time:

For each WECARE competency element below, the observer records whether the
employee demonstrated each behavior (Validate*) and adds notes (feedback or
observation).

Welcome with a smile – Did the employee…
o Establish eye contact and smile
o Acknowledge patients in a courteous way (e.g., help lost patients, say
“please”/“thank you”)
o Ask and confirm the patient’s name and address them by their preferred
name/title
o Keep a warm, calm, and welcoming voice
o Keep their badge visible/front-facing

Explain who you are
o Introduce themselves by name and explain their role
o Explain what they are going to do

Communicate clearly
o Repeat information to confirm accuracy
o Respond with empathy to patient concerns
o Describe what’s going to happen; explain the process in everyday language
o Inform patients about delays and check back
o Use the patient’s preferred language

Ask how you can help
o Ask if the patient needs anything else and if their needs are taken care of

Respond respectfully
o Answer questions clearly and respectfully
o Act and speak positively of other staff
o Be mindful of the patient’s privacy
o Check for understanding of next steps
o Be mindful of nonverbal communication

Express gratitude
o Thank the patient
o Ask if there are any other questions and/or inform the patient of follow-ups

*Met = √ or NA; Not met = X
Attachment E
4. Examples of Likert Scaled Responses Used in Data-Gathering
A variety of methods are available to assist evaluators in gathering data. One of those methods involves the use of a scale. One of the most common scale types is a Likert scale. A Likert scale is commonly used to measure attitudes, knowledge, perceptions, values, and behavioral changes. A Likert-type scale involves a series of statements that respondents may choose from in order to rate their responses to evaluative questions (Vogt, 1999).
[Likert-scale response examples omitted.]
TRAINING
Improving Performance at the County of San Mateo
County of San Mateo
Revised August 23, 2016
“Job seekers from entry-level to executive are more
concerned with opportunities for learning and
development than any other aspect of a prospective job.
This makes perfect sense, since continuous learning is a key
strategy for crafting a sustainable career.”
– Harvard Business Review
Training, development and growth opportunities are key to recruiting and retaining a highly engaged
and energized workforce. With pension reform and an emerging labor force characterized by frequent
job changes in search of new experiences and challenges, the public sector will struggle to
compete for talent in the near future. Developing an organization that values and promotes continuous
learning will be a key strategy in the acquisition and retention of talented workers.
The County’s 20-hour training target was implemented following an organizational review conducted in
2006 by Management Partners Inc., to promote and reinforce the County’s core values of learning and
development in order to strengthen and prepare the workforce for current and future changes and
needs. The target is now a policy, and employees are expected to complete at least 20 hours of
training annually, after a 2016 workgroup project recommended a stronger emphasis on employee
training and development and the development of new policies and resources to help employees
meet this important requirement. The 2006 and 2016 reports are available below:
2006 Report - Management Partners Report
2016 Report - Performance Priorities Workgroup on Training
The County’s Learning Management System (“LMS”) helps departments and staff access, track and
deliver training for employees. The following resources are available to help departments utilize the LMS
to support staff in meeting their training goals and to educate staff on and promote the 20-hour training
policy:
Support resources for the LMS User:
Each department has at least one LMS Administrator who can assist staff with accessing and navigating the LMS – LMS Administrator Contact List
LMS User Reference Guides
LMS External Training User Guides
Support and Tools for Supervisors and Managers:
LMS training video for managers and supervisors (10 minutes): Accessible at www.smcgov.org/LMS by searching for “LMS for Supervisors”
LMS Manager Delegation Guide
User Guide for Running Your Team’s 20-Hour Training Report for Supervisors and Managers
Sample Form to Capture Internal Training Events and Activities
20-Hour Training Policy Educational Flyer
20-Hour Training Policy Promotional/Email Graphic
Sample goals:
A) Enhance my job/career development by meeting the County’s 20-hour training target by enrolling in the following classes (name of classes here) by June 20XX; or
B) Meet the County’s 20-hour training target by capturing both my external and internal training related to my job and career activities in the LMS by June 20XX.
Support and tools for LMS Administrators:
LMS administrator user guides
o How to Proxy Enroll
o Creating Events and Sessions
Instructor-led training for LMS administrators
o LMS Basics
o LMS Advanced
LMS Power User Group In-Service Meetings (every 2 months)
BENCHMARKS AND THE BENCHMARKING PROCESS
Benchmarking to Improve Performance
County of San Mateo
Revised August 23, 2016
“Benchmarking is a method of measuring and
improving our organizational performance by
comparing ourselves with the best.”
– Tim Stapenhurst
Contents
About this Chapter and Workgroup
Benchmarking Policy
Benchmarking Steps
Step 1. Set Reasonable Expectations
Step 2. Establish a Goal for the Benchmark
Step 3. Get Buy-in
Step 4. Define What You Are Trying to Measure
Step 5. Identify an Appropriate Comparison
Step 6. Review the Data in Context
Benchmarking Guidelines
How Do You Find Benchmarks?
Determination of the Benchmark
Application of Benchmarks
Adjustments to Existing Benchmark Targets
Human Services Agency’s Performance Benchmarks
Peer Group Benchmark
“Best in Class” Benchmark
Presenting Data and Measuring Against Benchmarks
Glossary of Terms
Frequently Asked Questions
Resources
Budget/Performance Contact Information
Department Contact Information
Budget and Performance Information
About this Chapter and Workgroup
The Performance Priority Workgroup Team, “Raising the Bar,” was formed in order to
build the County’s capacity to effectively use benchmarks when measuring
performance. Based on the FY 2014-15 Year End Performance Survey, benchmarks was
a performance area that the County determined to need improvement. Results from
the specific countywide measure, “Percent of supervisors indicating programs are
performing effectively in comparison to regional benchmarks or state/national
standards,” were disappointing, as over 50% of respondents answered “No” or “I Do Not
Know.”
Similarly, the “Raising the Bar” or Benchmarks Workgroup surveyed users of the County’s
Department Performance dashboards site in February 2016. The goal of the survey was
to determine what specific issues certain departments encountered in the areas of
creating benchmarks and understanding the benchmarking process. The results
led the Workgroup to focus on developing a useful toolkit that will be shared on
E.R.I.N., the County’s intranet.
Overall, the group is hopeful that, as a result of this resource, the next Performance
Survey’s results will show improvements in the abovementioned measure. The following
individuals and departments contributed to this chapter of the Performance Handbook:
Deanna Haskell – Department of Public Works
Jason Escareno – County Manager’s Office
Mark Hertz – County Counsel’s Office
Marnita Garcia-Fulle – Human Services Agency
Rolando Jorquera – County Manager’s Office
Scott Gruendl – Behavioral Health & Recovery Services
William Harven – Human Services Agency
This chapter will include the following:
Benchmarking policy;
Index of all County benchmark data;
Summary of best practices;
Information on finding, presenting and calculating benchmarks data; and
Benchmarking resources
Benchmarking Policy
Benchmarking is an important part of the County’s goal to create a culture of
performance with an emphasis on accountability, transparency and improvement. The
County’s success in being able to provide high quality services and becoming a high-
performing organization is dependent on being able to measure ourselves against local
government and industry peers in order to guide data-driven decision making.
The County’s benchmarking guidelines are to use comparator counties, industries,
agencies, and empirical evidence that best mirror the County’s business models and
service delivery, or that are recognized as best in class. Preference is given to Bay Area
counties unless more appropriate comparators exist. The Bay Area counties include the
following: (1) Alameda; (2) Contra Costa; (3) Marin; (4) Napa; (5) San Francisco; (6)
Santa Clara; (7) Solano; and (8) Sonoma.
Each department or program will establish a series of appropriate benchmarks against
which performance will be compared on a regular basis. The benchmarks shall be
reflective of the actual mission and vision of the department or program.
Each benchmark provides a frame of reference for comparing and evaluating
performance. A good benchmark shapes and reflects the strategy of the department
or program.
Benchmarking Steps
Benchmarks are useful high-level comparisons. They can be a starting point for
discussions regarding continuous process improvement or a way to identify successful
programs. Benchmarking is often challenging. Issues of comparability, usefulness, and
methodology often arise when trying to establish a benchmarking program. Developing
meaningful benchmarks can take time and some trial and error. Below are suggested
steps to help you develop your benchmark program.
Step 1. Set Reasonable Expectations
Developing a good benchmark does not happen overnight. Recognize that you may
be reaching out to other organizations to obtain data. They may have constraints that
are hindering their ability to provide you with data. Focus on creating a solid foundation
for your benchmark and build upon your efforts over time.
Step 2. Establish a Goal for the Benchmark
Establish the purpose of the benchmark. Consider what is important or what you are
trying to learn from the data. Track information that is available and meaningful to you
and your counterparts. When choosing what to measure, consider how easy it will be
for you and those who you may be contacting to provide the data.
Step 3. Get Buy-in
Secure agreement from executive management and program staff. Benchmarking
often takes a team and requires cooperation from staff across your department. Clearly
define each team member’s role.
Step 4. Define What You Are Trying To Measure
Develop clear definitions for the data components that will be used for calculations.
Consider if there is other general information that will need to be gathered to interpret
the data that is received from outside sources. For example, if you ask another
jurisdiction for the “acres of parkland maintained per full-time employee,” do you want
to know what percentage of those acres are playgrounds and open space in order to
determine the general level of effort necessary to maintain those acres? An apples-to-
apples comparison may not always be possible; however, setting parameters around what
your organization is attempting to measure will help obtain more accurate comparisons.
Example Measure: Response Time to an Emergency
If you asked another jurisdiction to provide you with data relating to this
measure, how would they know what an emergency is? How would they
(and you) define response time? Is it initial contact with the person who
reported the emergency? Is it when personnel arrived on-site to address
the emergency?
Example Data Definitions
Response Time: The number of minutes from when County personnel
received the call to when County personnel arrived on-site to resolve the
issue.
Emergency: An emergency is an event in which people or County property
are in imminent danger.
Example Measure: Cost of Custodial Services per Square Foot
Without data definitions, if you were to survey all the Bay Area Counties and
ask them to provide this data to you, the basis of the data that you receive
may not be the same. Request your data in components and clearly define
each component. See below.
Component of Measure: Cost of Custodial Services
Definition: Please include the cost to perform X, Y, Z services. Include services that
are provided by contractors.
Data: (to be filled in by the responding jurisdiction)

Component of Measure: Square Footage Maintained
Definition: Include only the square footage actively maintained (where service is
provided).
Data: (to be filled in by the responding jurisdiction)

Table 1. Example Measure of the Cost of Custodial Services
Step 5. Identify an Appropriate Comparison
Below is a list of possible comparators. The nature of the program that you are trying to
develop a benchmark for will influence the type of comparator you use. For example,
if your program is unique, a historical average of program performance may be the
best comparator. If you have a highly regulated program, you may be able to reach
out to peer counties governed by the same regulations and develop a comparator.
Associations and Industry Standards: Large reputable associations often gather
data to publish industry standards or complete benchmarking surveys that may
provide useful comparators.
Peer Comparable Organizations: Select government agencies that have
programs that are similar to yours. Consider the demographics of the potential
comparator and the structure of the program to determine if it is an appropriate
comparator.
Historical Averages: If you cannot find a comparable jurisdiction or industry
standard, start by comparing to the historical performance of your program.
Consider developing a comparator equivalent to a three-year average of
historical performance. This may help you to identify a shift in performance that
requires further investigation.
Step 6. Review the Data in Context
Consider the data in context. Not all jurisdictions run programs in a similar manner. A
benchmark is a helpful tool for comparison only. Benchmarking can help identify where
a program is performing well and where there may be opportunity for improvement.
Involve program staff in the analysis of data and research best practices for your
program that may be appropriate to implement.
Benchmarking Guidelines
Private, public and non-profit entities use measurements to evaluate changes in results,
achievement of goals and objectives, and peer-to-peer rankings. This enables them to
track their performance and to make needed adjustments so that they can efficiently
achieve their goals. Measurements are an integral component of this process and one
that often takes the form of benchmarks.
Benchmarks are metrics tied to measurements. A metric can represent an aspect or
dimension of utility, such as miles of paved roads, hours of services delivered, or
property tax receipts. This can be useful for ranking one entity against another, or for
tracking progress against a goal. However, a raw metric is not normalized to each
entity's characteristics. Normalizing the metric against factors or variables describing
the entity's characteristics will help in making reasonable comparisons; examples
include miles of road per dollar of business sales receipts, or hours of services delivered
per enrolled client. Some metrics encompass a "model" of measurement, such as the
Work Participation Rate (WPR) or Quality of Life.
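To make the normalization step concrete, the sketch below ranks two hypothetical entities by a normalized metric (hours of services delivered per enrolled client) rather than by the raw total; all names and figures are invented:

```python
# Hypothetical raw metrics and normalizing variables for two entities.
entities = {
    "Agency A": {"service_hours": 52_000, "enrolled_clients": 4_000},
    "Agency B": {"service_hours": 18_000, "enrolled_clients": 1_100},
}

# Normalized metric: hours of services delivered per enrolled client.
normalized = {
    name: e["service_hours"] / e["enrolled_clients"]
    for name, e in entities.items()
}

# Ranking by the normalized metric; Agency B ranks first despite the
# smaller raw total, because it serves fewer clients.
for name, per_client in sorted(normalized.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {per_client:.1f} service hours per enrolled client")
```

Note how the ranking by the normalized metric can differ from a ranking by the raw total, which is the point of normalizing.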
How Do You Find Benchmarks?
Determining a benchmark often starts with the logic model or value proposition of a
service or product. A logic model consists of inputs, processes or systems, outputs, and
outcomes. Mapping the relationships among these will help in establishing the right
components of the desired measurement. When defining the outcomes, it is important
to distinguish between proximal and distal outcomes.
Example of a Logic Model
Adults in families with insufficient incomes need short-term financial
assistance. The assistance is an incentive toward participation in work-
related activities. With a concerted effort at achieving job readiness, the
client will move into unsubsidized employment when job opportunities
occur.
Model Node: Possible Determinants

Distal Outcome (Goal): Self-sufficiency
Proximal Outcome: Unsubsidized employment
Outputs: Training certificates obtained, family budget implemented, internships completed, behavior modification accomplished, problem-solving skills
Processes: Classes and training, case management, counseling and clinical interventions, sanctions
Inputs: Eligible low-income households, program policies and regulations, staff and staff competency, facilities and suppliers, supporting infrastructure

Table 2. Example Logic Model for Employment Services Program
Work Participation Rate (WPR) is a benchmark based on the outputs of this
model. It focuses on the accumulation of hours of participation. A different
type of benchmark would focus on an assessment of job readiness after a
certain period of participation. Other measures might look at the proximal
outcome of unsubsidized employment.
Determination of the Benchmark
Sources of existing benchmarks can be located among the following:
Academic and clinical research and national and regional studies
Professional peer groups, associations and societies
Federal, state and local governments and non-governmental organizations
Surveys, budgets and audit reports
When researching existing benchmarks, use discretion in selecting the measures.
Variables and data components should all be well defined. If the exact calculation for
a measure has not been provided, discern the prominent data components, such as
the data that make up the numerators and denominators. Think of it as reverse
engineering.
Application of Benchmarks
At this step, the target will be defined. Initially, aspects of the internal program, service
or product will reveal helpful information. Refer to the logic model below.
Figure 1. Relationship Map of Logic Model for Public Education Program¹
Answering the following questions about the program, service or product will help to
reveal helpful information:
1. What outcomes are desired? Prioritize them.
2. What outputs have strong relationships to the desired outcomes?
3. Which outputs require the most resources?
4. What performance level is expected from the processes, activities or systems
applied to the inputs?
5. What primary, secondary, and tertiary effects are expected of the outputs?
6. Are there other mandates or requirements?
¹ Carey, J. and Martin, I. (2016). "Development of a Logic Model to Guide Evaluations of the ASCA National Model for School Counseling Programs," TPCJournal.nbcc.org.
Adjustments to Existing Benchmark Targets
Determine peer group characteristics. What contrasting elements would impact the
results? Usually there are some design differences in the processes or systems that could
affect whether the target should be adjusted.
Compare inputs, demographics and normalizing factors.
Human Services Agency’s Performance Benchmarks
HSA's performance benchmarks are listed in Table 3 below. The following sections
describe certain types of benchmarks that are applied to HSA's performance measures.
Peer Group Benchmark
The benchmark for “percent of staff who met annual training requirements (20 hours or
more)” constitutes a useful measure for staff development as well as professional
certification upkeep. See measure #300 for more information. The Agency faced
training deficiencies and needed to correct its training regimen. New programs will be
implemented to ensure that refresher training is offered periodically, covering all line
workers on recent regulations and model practices. In addition, Social Workers with
certification from national associations are required to attend 40 hours of service-related
training every two years.
Previously, HSA used a benchmark target equal to the County department average
(peer group). However, the Agency Director determined that a different target was
needed to meet the goal of excellent staff competency. For this reason, as well as to
keep the level of training utilization high, the target was raised from 62% to 80%. This
measure can also be considered an input toward meeting other compliance
benchmarks.
“Best In Class” Benchmark
The benchmark for “percent of Service Connect participants in 550Jobs! Program that
secure employment” has a foundation in employment outcomes as well as
rehabilitation activities. See measure #530 for more information. Recall that Table 2
presents a typical logic model for an employment services program. The distal outcome
for typical employment services is self-sufficiency. For the formerly incarcerated, the
outcomes might be lifestyle adjustments and integration into a community setting, as
well as self-sufficiency.
Other national programs have conducted evaluations on the outputs and outcomes of
participants. Most prominent is the Center for Employment Opportunities (CEO). The CEO
evaluation observed program and control groups from the perspectives of participation,
job placement and long-term recidivism. The study found that 38.4% of the program
participants that spent up to 12 weeks in transitional jobs went on to get unsubsidized
employment. The target of 38.4% might well suit as the benchmark for 550Jobs! given
the program similarities.
But the context for the benchmark needs to account for the job market, the economy
and demographic characteristics of the participants. While the 550Jobs! measure has
remained constant, the populations entering the program have changed. A proxy was
identified that has remained somewhat constant and serves a population with
employment challenges. The final benchmark was set at 55%, which is the employment
placement rate of San Mateo County VRS participants.
Presenting Data and Measuring Against Benchmarks
Generally, a benchmark represents a constant threshold of performance that an
organization should monitor closely. Benchmarks are typically grounded in statistics or
empirical research. When based on statistics, we expect the benchmark to reside in an
upper echelon or percentile of the population. An example would be "Overall customer
satisfaction rated good or better": superior firms and institutions will make this an
extremely challenging benchmark, setting positive satisfaction at 90%-95%.
Empirical research can yield either a "best in class" metric or a derivation from any
number of phenomena. The "industry leader" benchmark informs peers in the same
class about highly successful processes, approaches, methods, etc. that transform
products and services. Manufacturing processes engineered to fine constraints of
variability will provide benchmarks for similar industries, and queuing theory and
large-team design would provide benchmarks for call center operators. Empirical
research can also delve into common metrics such as unemployment rate, recidivism,
cancer rates, and educational attainment levels, among others. For example, a job
training program may benchmark job placement against the employment rate for
industries matching the skills-development opportunities offered to clients.
What happens if actual performance results differ markedly from the benchmarks?
If the measurements are reported frequently, say monthly, then after a sufficient time
interval, patterns of gradual movement can emerge. The rate of change can provide
an option for targeting performance to the next level. A running rate of 3% can be
calculated into the target for the next measurement period. The internal target will
require recalibration for each reporting period that follows, while the benchmark
remains pointed to a desired state of performance. Charting both the benchmark and
the internal target has been a standard reporting practice on the County’s SMC
Performance Dashboards website.
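Under one reading of this approach (a hypothetical sketch, not the County's actual dashboard calculation), recalibrating the internal target each period by a 3% running rate while the benchmark stays fixed might look like:

```python
# Hypothetical values; the benchmark stays fixed while the internal
# target is recalibrated each reporting period by the running rate.
benchmark = 0.90        # desired state of performance (fixed)
running_rate = 0.03     # observed 3% rate of change per period
internal_target = 0.78  # hypothetical starting performance level

for period in range(1, 7):
    # Next period's internal target: prior target grown by the running
    # rate, capped so it never exceeds the benchmark.
    internal_target = min(internal_target * (1 + running_rate), benchmark)
    print(f"Period {period}: internal target {internal_target:.1%} "
          f"(benchmark {benchmark:.0%})")
```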
Table 3. HSA Benchmark Measures

ID: 421
Measure Title: Customer Satisfaction rating of good or better
Benchmark Value: 90%
Benchmark Rationale: Countywide benchmark
Data Source: Cares! Surveys

ID: 432
Measure Title: Percent of employees describing their experience working for the County as "Good" or "Very Good"
Benchmark Value: 78%
Benchmark Rationale: Average of County Departments
Data Source: Annual Employee Engagement survey

ID: 300
Measure Title: Percent of staff who met annual training requirements (20 hours or more)
Benchmark Value: 80%
Benchmark Rationale: Internal benchmark, but connected to HSA's initiative to provide ongoing program training to line staff. Also, all Social Workers must receive 40 hours of in-service training related to their job function every 2 years to maintain their certification.
Data Source: LMS report for in-service training; New Worker Training Unit log sheets compiled by Staff Development and Training

ID: 409
Measure Title: Percent of public assistance applications that are processed within State standards for timeliness
Benchmark Value: 90%
Benchmark Rationale: State QC standard
Data Source: Monthly CalWIN System reports

ID: 315
Measure Title: Overall satisfaction rated good or better for all VRS services
Benchmark Value: 90%
Benchmark Rationale: Internal benchmark, but based on Countywide benchmark for customer satisfaction
Data Source: Manual calculation of surveys

ID: 312
Measure Title: Percent of Welfare-to-Work families meeting requirements in federal Work Participation Rate (WPR) based on State measurement
Benchmark Value: 50%
Benchmark Rationale: Federal program requirement measured on a fiscal year basis
Data Source: Monthly samples of cases reviewed by State

ID: 423
Measure Title: Rate of child abuse reports per 1,000 children
Benchmark Value: 37.30
Benchmark Rationale: Average of past 3 years for Bay Area Counties
Data Source: Calendar Year results from CWS/CMS system, calculated by UC Berkeley

ID: 424
Measure Title: Rate of allegations substantiated per 1,000 children
Benchmark Value: 4.70
Benchmark Rationale: Average of past 3 years for Bay Area Counties
Data Source: Calendar Year results from CWS/CMS system, calculated by UC Berkeley

ID: 530
Measure Title: Percent of Service Connect participants in 550Jobs! Program that secure employment
Benchmark Value: 55%
Benchmark Rationale: Internal benchmark based on VRS client experience
Data Source: Vtrack System

ID: 529
Measure Title: Percent of clients residing in homeless transitional shelters (HUD funded) that are connected to mainstream services and benefits
Benchmark Value: 77%
Benchmark Rationale: Internal benchmark, based on expected performance as stated in HSA's HUD CSBG grant application
Data Source: Clarity/HMIS
Glossary of Terms²
Average - a number expressing the central or typical value in a set of data, in particular
the mode, median, or (most commonly) the mean, which is calculated by dividing the
sum of the values in the set by their number.
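For example, the three measures named above can be computed with Python's standard statistics module (the data set is hypothetical):

```python
from statistics import mean, median, mode

data = [2, 3, 3, 5, 7]  # hypothetical data set
print(mean(data))    # 4 -> sum of the values divided by their number
print(median(data))  # 3 -> middle value when the data are sorted
print(mode(data))    # 3 -> most frequently occurring value
```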
Benchmarking - the process of comparing one's business processes and performance
metrics to industry bests and best practices from other companies. Dimensions typically
measured are quality, time and cost. In the process of best practice benchmarking,
management identifies the best firms in their industry, or in another industry where similar
processes exist, and compares the results and processes of those studied (the "targets")
to one's own results and processes. In this way, they learn how well the targets perform
and, more importantly, the business processes that explain why these firms are
successful.
Benchmarking is used to measure performance using a specific indicator (cost per unit
of measure, productivity per unit of measure, cycle time of x per unit of measure or
defects per unit of measure) resulting in a metric of performance that is then compared
to others.
Also referred to as "best practice benchmarking" or "process benchmarking", this
process is used in management and particularly strategic management, in which
organizations evaluate various aspects of their processes in relation to best practice
companies' processes, usually within a peer group defined for the purposes of
comparison.
Best in Class – a practice or item that serves as the premier example in the field for which
the item or practice has been developed.
Best Practice – a method or technique that has been generally accepted as superior to
any alternatives because it produces results that are superior to those achieved by other
means or because it has become a standard way of doing things, e.g., a standard way
of complying with legal or ethical requirements.
Best practices are used to maintain quality as an alternative to mandatory legislated
standards and can be based on self-assessment or benchmarking. Best practice is a
feature of accredited management standards such as ISO 9000 and ISO 14001.
² The information presented in this Glossary is derived from the following sources: The Business Dictionary; Merriam-Webster Dictionary; Wikipedia; and the University of California.
Some consulting firms specialize in the area of best practice and offer pre-made
'templates' to standardize business process documentation. Sometimes a "best
practice" is not applicable or is inappropriate for a particular organization's needs. A key
strategic talent required when applying best practice to organizations is the ability to
balance the unique qualities of an organization with the practices that it has in common
with others.
Characteristic - a distinguishing trait, quality, or property.
Denominator - the number below the line in a common fraction; a divisor; a figure
representing the total population in terms of which statistical values are expressed.
Distal - distant or farther away from a point of reference.
Goal - a desired result that a person or a system envisions, plans and commits to
achieve: a personal or organizational desired end-point in some sort of assumed
development. Goal setting may involve establishing specific, measurable, achievable,
relevant, and time-bounded (SMART) objectives, but not all researchers agree that
these SMART criteria are necessary.
Inputs - something that is put in, or the act or process of putting in; for example, your
ideas and comments (your input) may be needed before a decision can be reached.
Logic Model - a tool used by funders, managers, and evaluators of programs to
evaluate the effectiveness of a program. They can also be used during planning and
implementation. Logic models are usually a graphical depiction of the logical
relationships between the resources, activities, outputs and outcomes of a program.
While there are many ways in which logic models can be presented, the underlying
purpose of constructing a logic model is to assess the "if-then" (causal) relationships
between the elements of the program.
Normalizing - Adjusting values measured on different scales to a notionally common
scale; may refer to more sophisticated adjustments where the intention is to bring the
entire probability distributions of adjusted values into alignment; allow the comparison
of corresponding normalized values for different datasets in a way that eliminates the
effects of certain gross influences; some types of normalization involve only a rescaling,
to arrive at values relative to some size variable.
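A minimal sketch of two of the adjustments described above: a min-max rescaling to a notionally common 0-1 scale, and a rescaling relative to a size variable (the per-1,000 form used by measures 423 and 424 in Table 3). All values below are hypothetical:

```python
values = [12, 45, 30, 8]  # hypothetical measurements on one scale

# Min-max normalization: rescale to a common 0-1 scale.
low, high = min(values), max(values)
scaled = [(v - low) / (high - low) for v in values]
print([round(s, 2) for s in scaled])  # [0.11, 1.0, 0.59, 0.0]

# Rescaling relative to a size variable, e.g. reports per 1,000 children.
reports, children = 2_450, 65_700  # hypothetical counts
print(f"{reports / children * 1_000:.2f} reports per 1,000 children")
```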
Numerator - the number above the line in a common fraction showing how many of the
parts indicated by the denominator are taken, for example, 2 in 2/3.
Outcomes - something that happens as a result of an activity or process.
Outputs - the act of turning out; production (e.g., a factory's output of cars, or artistic
output); the quantity or amount produced in a given time; the material produced, yield,
or product.
Performance Measurement - the process of collecting, analyzing and/or reporting information
regarding the performance of an individual, group, organization, system or component.
It can involve studying processes/strategies within organizations, or studying engineering
processes/parameters/phenomena, to see whether outputs are in line with what was
intended or should have been achieved. Performance measurement estimates the
parameters under which programs, investments, and acquisitions are reaching the
targeted results.
Performance Indicator or Key Performance Indicator (KPI) – a type of performance
measurement, KPIs evaluate the success of an organization or of a particular activity in
which it engages. Often success is simply the repeated, periodic achievement of some
levels of operational goal (e.g. zero defects, 10/10 customer satisfaction, etc.), and
sometimes success is defined in terms of making progress toward strategic
goals. Accordingly, choosing the right KPIs relies upon a good understanding of what is
important to the organization. 'What is important' often depends on the department
measuring the performance - e.g. the KPIs useful to finance will differ from the KPIs
assigned to sales. Since there is a need to understand well what is important, various
techniques to assess the present state of the business, and its key activities, are
associated with the selection of performance indicators. These assessments often lead
to the identification of potential improvements, so performance indicators are routinely
associated with 'performance improvement' initiatives. A very common way to choose
KPIs is to apply a management framework such as the balanced scorecard.
Performance Metric - determines an organization's behavior and performance.
Performance metrics measure an organization's activities and performance. They should
support a range of stakeholder needs, from customers and shareholders to employees. While
traditionally many metrics are finance based, inwardly focusing on the performance of
the organization, metrics may also focus on the performance against customer
requirements and value.
In project management, performance metrics are used to assess the health of the
project and consist of measuring seven criteria: safety, time, cost, resources,
scope, quality, and actions. In call centers, performance metrics help capture internal
performance and can include productivity measurements and the quality of service
provided by the customer service advisor. These metrics can include: Calls Answered,
Calls Abandoned, Average Handle Time and Average Wait Time.
Developing performance metrics usually follows a process of: 1) Establishing critical
processes/customer requirements; 2) Identifying specific, quantifiable outputs of work;
and 3) Establishing targets against which results can be scored.
A criticism of performance metrics is that when the value of information is computed
using mathematical methods, it shows that even performance metrics professionals
choose measures that have little value. This is referred to as the "measurement inversion".
For example, metrics seem to emphasize what organizations find immediately
measurable — even if those are low value — and tend to ignore high value
measurements simply because they seem harder to measure (whether they are or not).
To correct for the measurement inversion other methods, like applied information
economics, introduce the "value of information analysis" step in the process so that
metrics focus on high-value measures. Organizations where this has been applied find
that they define completely different metrics than they otherwise would have and,
often, fewer metrics.
For projects, the effort to collect a metric has to be weighed against its value as projects
are temporary endeavors performed with finite resources. There are a variety of ways in
which organizations may react to results, from triggering specific activity relating to the
results (e.g., an improvement plan) to using the data merely for statistical information.
Often closely tied to outputs, performance metrics should usually encourage
improvement, effectiveness and appropriate levels of control.
The Department of Energy has promulgated a set of Total Quality Management
guidelines that indicate that performance metrics should lead to a quantitative
assessment of gains in: Customer Satisfaction; Organizational Performance; and
Workforce Excellence. The key elements of the performance metrics to these guidelines
should address: Alignment with Organizational Mission; Cost Reduction and/or
Avoidance; Quality of Product; Cycle Time Reduction; Meeting Commitments; Timely
Delivery; and Customer Satisfaction.
The first step in developing performance metrics is to involve the people who are
responsible for the work to be measured because they are the most knowledgeable
about the work. Once these people are identified and involved, it is necessary to:
1. Identify critical work processes and customer requirements.
2. Identify critical results desired and align them to customer requirements.
3. Develop measurements for the critical work processes or critical results.
4. Establish performance goals, standards, or benchmarks.
The establishment of performance goals can best be specified when they are defined
within three primary levels:
Objectives: Broad, general areas of review. These generally reflect the end
goals based on the mission of a function.
Criteria: Specific areas of accomplishment that satisfy major divisions of
responsibility within a function.
Measures: Metrics designed to drive improvement and characterize
progress made under each criterion. These are specific, quantifiable goals
based on individual expected work outputs.
The SMART test is frequently used to provide a quick reference to determine the quality
of a particular performance metric:
S = Specific: clear and focused to avoid misinterpretation. Should include
measure assumptions and definitions and be easily interpreted.
M = Measurable: can be quantified and compared to other data. It should
allow for meaningful statistical analysis. Avoid "yes/no" measures except in
limited cases, such as start-up or systems-in-place situations.
A = Attainable: achievable, reasonable, and credible under conditions
expected.
R = Realistic: fits into the organization's constraints and is cost-effective.
T = Timely: doable within the time frame given.
Quality performance metrics allow for the collection of meaningful data for trending
and analysis of rate-of-change over time. Examples are:
-Trending against known standards: the standards may come from either
internal or external sources and may include benchmarks.
-Trending with standards to be established: usually this type of metric is
used in conjunction with establishing a baseline.
-Milestones achieved.
Yes/No metrics are used in certain situations usually involving establishing trends,
baselines, or targets, or in start-up cases. Because there is no valid calibration of the
level of performance for this type of measure, it should be used sparingly. Examples are:
establish/implement a system; reporting achieved (without analyses); system is in place
(without regard to effectiveness); threshold achieved (arbitrary standards); and analysis
performed (without criteria).
The following questions serve as a checklist to determine the quality of the performance
metrics that have been defined.
1. Is the metric objectively measurable?
2. Does the metric include a clear statement of the end results expected?
3. Does the metric support customer requirements, including compliance issues
where appropriate?
4. Does the metric focus on effectiveness and/or efficiency of the system being
measured?
5. Does the metric allow for meaningful trend or statistical analysis?
6. Have appropriate industry or other external standards been applied?
7. Does the metric include milestones and/or indicators to express qualitative
criteria?
8. Are the metrics challenging but at the same time attainable?
9. Are assumptions and definitions specified for what constitutes satisfactory
performance?
10. Have those who are responsible for the performance being measured been fully
involved in the development of this metric?
11. Has the metric been mutually agreed upon by you and your customers?
Process - Sequence of interdependent and linked procedures which, at every stage,
consume one or more resources (employee time, energy, machines, and money) to
convert inputs (data, material, parts, etc.) into outputs. These outputs then serve as
inputs for the next stage until a known goal or end result is reached.
Proximal - near or closer to a point of reference.
Proxy - something serving to replace or substitute for another thing.
Requirement - a singular documented physical and functional need that a particular
design, product or process must be able to perform. It is most commonly used in a formal
sense in systems engineering, software engineering, or enterprise engineering. It is a
statement that identifies a necessary attribute, capability, characteristic, or quality of a
system for it to have value and utility to a customer, organization, internal user, or other
stakeholder. A requirement specification (often imprecisely referred to as the spec,
because there are different sorts of specifications) refers to an explicit set
of requirements to be satisfied by a material, design, product, or service.
In the classical engineering approach, sets of requirements are used as inputs into the
design stages of product development. Requirements are also an important input into
the verification process, since tests should trace back to specific requirements.
Requirements show what elements and functions are necessary for the particular
project. This is reflected in the waterfall model of the software life-cycle. However,
when iterative methods of software development or agile methods are used, the
system requirements are incrementally developed in parallel with design and
implementation.
Standard - level of quality, achievement, etc., that is considered acceptable or
desirable.
Utility – a result that cannot be measured or observed directly but that can be measured
through the development of metrics which, when applied to the specific instance to be
measured, allow for a measured result that can be compared to the results of similar or
like entities.
Value - the value of a mathematical expression is the result of the computation
described by this expression when the variables and constants in it are replaced by
some numbers. The value of a function is the number implied by the function as a result
of a particular number being assigned to its argument (also called the variable of the
function).
In marketing, value (also known as customer-perceived value) is the difference between a
prospective customer's evaluation of the benefits and costs of one product when
compared with others. Value may also be expressed as a straightforward relationship
between perceived benefits and perceived costs: Value = Benefits / Cost.
The basic underlying concept of value in marketing is human needs. The basic human
needs may include food, shelter, belonging, love, and self-expression. Both culture and
individual personality shape human needs in what is known as wants. When wants are
backed by buying power, they become demands. With their wants and resources
(financial ability), consumers demand products and services with benefits that add up
to the most value and satisfaction.
The four types of value include: functional value, monetary value, social value,
and psychological value. The sources of value are not equally important to all
consumers; how important a value is depends on the consumer and the purchase.
Values should always be defined through the "eyes" of the consumer.
Functional Value: This type of value is what an offering does; it is the solution the offering
provides to the customer.
Monetary Value: The function of the price paid relative to an offering's perceived worth.
This value invites a trade-off between other values and monetary costs.
Social Value: The extent to which owning a product or engaging in a service allows the
consumer to connect with others.
Psychological Value: The extent to which a product allows consumers to express
themselves or feel better.
For a firm to deliver value to its customers, it must consider what is known as the "total
market offering." This includes the reputation of the organization, staff representation,
product benefits, and technological characteristics as compared to competitors'
market offerings and prices. Value can thus be defined as the relationship of a firm's
market offerings to those of its competitors.
Value in marketing can be defined by both qualitative and quantitative measures. On
the qualitative side, value is the perceived gain composed of an individual's emotional,
mental and physical condition plus various social, economic, cultural and
environmental factors. On the quantitative side, value is the actual gain measured in
terms of financial numbers, percentages, and dollars.
For an organization to deliver value, it has to improve its value-to-cost ratio. When an
organization delivers high value at a high price, the perceived value may be low; when
it delivers high value at a low price, the perceived value may be high. The key to
delivering high perceived value is attaching value to each individual or organization:
making them believe that what you are offering is beyond expectation, helping them
solve a problem, offering a solution, giving results, and making them happy.
Value can change based on time, place and people in relation to changing
environmental factors. It is a creative energy exchange between people and
organizations in the marketplace.
Value Proposition - A value proposition is a statement which clearly identifies benefits
consumers get when buying a particular product or service. It should convince
consumers that this product or service is better than others on the market. This
proposition can lead to a competitive advantage when consumers pick that particular
product or service over other competitors because they receive greater value.
Variable - something that may or does vary or change; a variable feature or factor; a
symbol for an unspecified member of a class of things or statements.
Frequently Asked Questions
Why do we need to benchmark? Benchmarking is a common practice across industries
for measuring organizational performance. In San Mateo County, the practice helps to
inform the public, the department, and County management of performance in key
areas of your department. Benchmarks often allow for comparison between your
department and other entities as a way to gauge how well your department is doing.
How are my department's benchmarks established? Benchmarks are created in a
number of ways. First, a few benchmarks are selected for your department that best
represent how well your department performs overall; these may have been selected
as part of Measure A, imposed by a funding source or oversight entity, or negotiated
between your department and County management. Second, benchmarks are
typically ones that are commonly used at the federal or state level. Third, benchmarks
can also be a common measure, such as cost per capita, that is compared against
like and similar entities, such as other Bay Area counties.
When are benchmarks required to be used? An oversight entity or funding source may
require benchmarks as part of the agreement they have with your department for
funding or program delivery. All County of San Mateo departments have benchmarks
so that the public, other County departments, and County management can monitor
performance. These benchmarks are commonly used in conjunction with the County’s
open data platform Socrata or as part of Measure A if your department receives funding
from the voter-approved measure. Benchmark data is collected throughout the year
and is often reported on a quarterly, semi-annual, or annual basis.
Who is responsible for benchmarking? Each County department has identified an
individual as the department's point person on benchmarking. However, there may be
several persons in a department who have responsibilities for benchmarking, because
measures are often program specific. You can ask your department's leadership
who in your department has the main responsibility for benchmarking.
Can my department change its benchmarks? Yes, departments can change their
benchmarks, especially when they have become outdated or are no longer relevant.
Performance Measure Workgroups meet to discuss issues such as this and a formal policy
on benchmarking changes is in the works. In the meantime, departments can work
through their executive leadership and County management to modify, add, or delete
benchmarks. If the benchmark is related to Measure A, modifications, additions, or
deletions are taken up on an annual basis with the Measure A Oversight Committee. If
the benchmark is related to the State or Federal government or some other funding or
program oversight entity, you will need to find out what the specific steps are from those
organizations.
What are goals and/or targets? Within a benchmark there will be a goal or target that
establishes what is to be achieved for that measure in a specific time period. Goals and
targets are a way to make the benchmark relevant to the time period being measured;
they establish the point to be reached by the department in that period. In effect, it is
the goal line for that measure in that time period.
How are goals and targets established? Goals and targets are usually tied to a standard,
such as a state or federal standard for the activity being measured, but they can also be
tied to previous performance. Most importantly, goals and targets move; they are not
static. They move based on changes to standards or the performance of the organizations
that drive the standard. They also change based on performance and can be raised
when your organization is able to achieve them or lowered when it is not.
What is Socrata? Socrata is a data platform used by the County of San Mateo for a
number of purposes; generally, it ties goals, budget, and performance together for
presentation in documents and publication to the Internet.
Resources
Bay Area County Information
County Contact Information
*Updated March 2016*
Budget/Performance Contact Information
1. Alameda County
http://www.acgov.org/cao/contactus.htm
2. Contra Costa County
http://www.cccounty.us/120/Budget-Finance
3. Marin County
http://www.marincounty.org/depts/ad/contact-us
4. Napa County
http://www.countyofnapa.org/Pages/DepartmentContent.aspx?id=4294968608#Budget
5. San Francisco County
http://sfmayor.org/index.aspx?page=882
6. San Mateo County
http://cmo.smcgov.org/sites/cmo.smcgov.org/files/FY%202015-16_Analyst%20Assignments.pdf
7. Santa Clara County
https://www.sccgov.org/sites/ceo/oba/Pages/Office-of-Budget-and-Analysis.aspx
8. Solano County
https://www.solanocounty.com/depts/county_admin/contact_us.asp
9. Sonoma County
http://sonomacounty.ca.gov/CAO/Contact-Us/
Department Contact Information
1. Alameda County
http://www.acgov.org/government/departments.htm
2. Contra Costa County
http://www.co.contra-costa.ca.us/8/Departments
3. Marin County
http://www.marincounty.org/depts
4. Napa County
http://www.countyofnapa.org/departments/
5. San Francisco County
http://www6.sfgov.org/index.aspx?page=40
6. San Mateo County
http://www.smcgov.org/departments
7. Santa Clara County
https://www.sccgov.org/sites/scc/Pages/Search.aspx?svtyp=Organizations
8. Solano County
http://solanocounty.com/depts/default.asp
9. Sonoma County
http://sonomacounty.ca.gov/Departments-and-Agencies/
Budget and Performance Information
1. Alameda County
-Budget and Performance Documents and Reports
http://acgov.org/MS/OpenBudget/BudgetReports.aspx#.Vvr5fuIrLIU
-Budget Overview
http://acgov.org/MS/OpenBudget/BudgetOverview.aspx#.Vvr64uIrLIU
-Financial Documents and Reports
http://www.acgov.org/cao/financial.htm
2. Contra Costa County
-Budget Documents and Reports
http://www.cccounty.us/770/Budget-Documents
-Budget Overview
http://www.cccounty.us/757/Budget-Information
-Financial Documents and Reports
http://www.cccounty.us/756/Financial-Information
-Performance Documents and Reports
http://www.cccounty.us/798/Performance-Report-by-Department
3. Marin County
-Budget Overview, Documents, and Reports
http://www.marincounty.org/depts/ad/divisions/management-and-budget/budget-overview
-Financial Documents and Reports
http://www.marincounty.org/depts/df/financial-information
-Performance Documents and Reports
http://www.marincounty.org/depts/ad/divisions/management-and-budget/performance-management
4. Napa County
-Budget and Performance Overview, Documents, and Reports
http://www.countyofnapa.org/Pages/DepartmentContent.aspx?id=4294969218
-Financial Documents and Reports
http://www.countyofnapa.org/auditor/fiscal/
5. San Francisco County
-Budget Documents and Reports
http://www.sfmayor.org/index.aspx?page=981
http://www.sfmayor.org/index.aspx?page=873
-Budget Overview
http://www.sfmayor.org/index.aspx?page=870
-Financial Documents and Reports
http://sfcontroller.org/index.aspx?page=275
-Performance Documents and Reports
http://www.sfcontroller.org/index.aspx?page=75
6. San Mateo County
-Budget and Performance Overview, Documents, and Reports
http://cmo.smcgov.org/budget-and-performance
https://performance.smcgov.org/
-Financial Documents and Reports
http://controller.smcgov.org/information-and-reports
7. Santa Clara County
-Budget Overview, Documents, and Reports
https://www.sccgov.org/sites/scc/countygovernment/Pages/Budget-and-Finance.aspx
-Financial Documents and Reports
https://www.sccgov.org/sites/fin/Controller-Treasurer%20Department/CAFR%20Report/Pages/Comprehensive-Annual-Financial-Report-(CAFR).aspx
8. Solano County
-Budget Overview, Documents, and Reports
https://www.solanocounty.com/depts/county_admin/budget_documents/default.asp
-Financial Documents and Reports
https://www.solanocounty.com/depts/auditor/finance_reports.asp
9. Sonoma County
-Budget and Financial Documents and Reports
http://www.sonoma-county.org/auditor/financial_reports.htm
http://sonomacounty.ca.gov/_templates_portal/LandingPage.aspx?id=2147503505
-Budget Overview
http://sonomacounty.ca.gov/CAO/Public-Reports/About-Sonoma-County/The-Budget/
EMPLOYEE ENGAGEMENT
Engaging Employees to Improve Performance
County of San Mateo
Revised August 23, 2016
“Employee engagement lies at intersection of
maximum contribution for the business and
maximum satisfaction for employees. It’s a
sustainable level of high performance that
benefits both the company and the
employee.”
– BlessingWhite
Contents
About this Chapter and Workgroup .......................................................................................... 1
What is Employee Engagement? .............................................................................................. 3
Why is Employee Engagement Important? ............................................................................ 3
Creating a Culture of Engagement in San Mateo County ................................................... 3
Resources ....................................................................................................................................... 4
About this Chapter and Workgroup
In the Fall of 2015, the County Manager’s Office conducted a survey of all Managers
and Supervisors inquiring about their performance practices. Based on the survey results,
six teams were formed to address the areas where Managers/Supervisors indicated they
could benefit from additional resources or improvement: Dashboards, Customer Service,
Benchmarks, Training, Employee Engagement, and Employee Feedback and Evaluation.
In December 2015, an Employee Engagement Workgroup was formed to identify the
resources and/or improvements needed to assist Managers/Supervisors in fostering
higher levels of engagement in our workforce.
The Engagement Performance Workgroup joined forces with the Employee
Engagement Champions, under the auspices of the Employee Engagement Committee,
in order to accomplish their collective goals. The contributors from the various
workgroups are listed below.
Name Department
Michelle Durand County Manager’s Office
Bill Dean Human Services Agency
Chad Kempel Health System
Diana Chung Health System
Diane Tom Health System
Gladys Balmas Health System
Jei Africa Health System
Joy Cheechov Department of Public Works
Karen Pugh Health System
Kristine Averilla Health System
Leilani Chua Health System
Natalie Kwong Lloyd Assessor Clerk, Recorder, Elections
Nicole Pasini Library
Patty Artega Child Support Services
Samantha DalPorto Sheriff’s Office
Sheree Calhoun Probation
Anne Weiss Human Resources
Christina Thompson Information Services
Danielle Lee Office of Sustainability
Eun-Soo Lim Office of Sustainability
Felicia Flores Human Resources
Laura Williams Treasurer-Tax Collector
Rochelle Kiner Department of Public Works
Angela Sajuthi Health System
Donna Vaillancourt Human Resources
Marissa King Human Resources
Rocio Kiryczun Human Resources
Rolando Jorquera County Manager’s Office
Shawn Yu Information Services
The goals of the workgroup, and their current status, are outlined below.

Promote the availability of the Employee Engagement Portals and Reports through
targeted outreach efforts such that at least 60% of Managers/Supervisors access their
Employee Engagement Portal by 6/30/16.
Thanks to extensive effort on the part of County Leadership/Department Heads and
Engagement Champions, as of June 30, 2016, 65% (570 of 875) of Managers/Supervisors
had accessed their Engagement Portal.
The availability of the Engagement Portals, the Survey Results (Reports) and how to
interpret them, as well as what resources are available in the Portal, were shared through
various communication channels (e.g., Engagement Champions, Yammer,
Departmental/Division/Team presentations, email, etc.) and most directly during
Engagement Workshops for Managers/Supervisors.
In order to educate Managers/Supervisors on the tools/resources available to create
more great days at work, host at least 20 countywide Engagement Workshops.
Again, due in large part to the commitment of Engagement Champions, 27
Engagement Workshops were conducted between November 2015 and March 2016.
During the three-hour workshop, Managers/Supervisors were able to reflect on their
"great day(s) at work"; watch videos describing BlessingWhite's Engagement Model
and who owns engagement; discuss engagement in their work environment; read and
review the Summary Analysis Report (SAR) for their Department, Budget Unit, Rolled-Up
Group, and/or Team; and learn about the tools and resources available to them to
create a more engaged environment, some of which are listed below.
What is Employee Engagement?
Until recently, the County defined Employee Engagement as “the degree to which
employees are connected and committed to their work, their colleagues, and the
purpose of the organization. Engagement is encouraged and demonstrated in the way
the organization and employee work to support each other’s success.”
In 2015, through the partnership with BlessingWhite, the County adopted a more
structured and standardized definition:
"The alignment of maximum satisfaction for the individual
with maximum contribution for the organization."

"Engaged employees are not just committed. They are not just passionate or proud.
They have a line-of-sight on their own future and on the organization's mission and
goals. They are 'enthused' and 'in gear,' using their talents and discretionary effort
to make a difference in their employer's quest for sustainable business success."

– BlessingWhite Research: The Employee Engagement Report

Why is Employee Engagement Important?
Engaged employees experience greater meaning, satisfaction and success in their
work, and their organizations experience higher levels of customer satisfaction, service
quality, innovation, and productivity, as well as lower rates of absenteeism, turnover
and accidents (BlessingWhite).

Creating a Culture of Engagement in San Mateo County
Realizing the value of an engaged workforce, the County began surveying Employee
Engagement in October 2011. The Employee Engagement Survey was designed to be
an ongoing opportunity to listen to employees throughout the organization, stay in
touch with emerging issues, identify opportunities for improvement and take action.
Conducting the survey annually also allowed for tracking of information over time.
Following the first survey, an Employee Engagement Committee was formed. The results
of the first survey, and subsequent ones, helped shape and create the Essential
Supervisory Skills series, the Employee Engagement Guide, the Supervisor Online
Support website (available only on the County Intranet), and more.
While significant advances were being made, realizing a deeper, more sustained
impact on engagement required doing more in the immediate work environment. This
could only be accomplished with the assistance of Managers/Supervisors.
Thus, in October 2015 the County partnered with BlessingWhite to administer the annual
survey and further disseminate results. Through this partnership, Employee Engagement
results were provided to every Manager/Supervisor in the organization through a
personalized Engagement Portal.
The Engagement Portal contained Summary Analysis Reports (SARs) for the Manager/
Supervisor's Department, Budget Unit, Roll-Up/Team, and Direct Reports, provided there
were at least six survey respondents in each group. The Portal also hosted a number of
resources to assist Managers/Supervisors in understanding the concept of engagement,
their report(s), and how to take action.
Resources
Resource: San Mateo County Employee Engagement Website
Description: Website containing results, resources, tools, FAQ, a list of Champions, etc. relating to the 2015 Engagement Survey
Location: http://hr.smcgov.org/employee-engagement

Resource: Great Day Resource Guide
Description: Guide that provides questions to ask and actions to take based on Survey questions
Location: Attached, and will be posted on ERIN, the new Intranet site

Resource: Portal Contents
Description: Document that contains all the tools and resources within the portal. Ideal for persons who recently became (or will become) Managers/Supervisors and who do not have their own portal
Location: To be posted on ERIN, the new Intranet site

Resource: Engagement Self-Assessment
Description: Engagement Worksheet
Location: Attached

Resource: Manager's Daily Engagement Guide
Description: A one-page tool designed to help Managers/Supervisors explore others' levels of engagement in various interactions
Location: Attached
Videos

Name of Video: Engagement Model
Content: An explanation of the X Model and BlessingWhite's Engagement Model
Location: Main Menu; left-hand panel of the Home Page; Engagement Report, page 3 (link here)
Length: 4:25

Name of Video: View and Interpret Your Report
Content: Video to help you better understand your Snapshot Analysis Report (SAR). As you view it, take notes on your printed report.
Location: Review Your Report: Interpret Your Report tab; Engagement Report, page 16 (link here)
Length: 9:34

Name of Video: IME Model (Shared Responsibility)
Content: Addresses the roles of individuals, managers and executives in building a culture of engagement.
Location: Engagement Report, page 3 (link here)
Length: 2:48
Review and Analyze Countywide Employee Engagement Survey Results
The Employee Engagement Committee met monthly in 2016 to review and analyze the
data from the 2015 Employee Engagement Survey. The Committee compared the
County to 71,000 other public sector organizations (our benchmark) and to the top
quartile of the last one million global responses to BlessingWhite's survey.
In addition, the committee compared data across demographic categories (e.g. age,
tenure, position, union representation, etc.) and evaluated the overall results from all
Departments, as well as their results on two specific questions: Would you recommend the
County to a family member or friend? And how would you rate your overall experience
working for the County?
The 3,778 responses received (63% of staff) indicate that 67% (2,531) of survey
respondents are Engaged (42%) or Almost Engaged (25%); another 9% are Satisfied
and Not Contributing, 12% are Contributing but Not Satisfied, and 11% are Disengaged.
San Mateo County had an Overall Favorability Score of 71%. This means that 71% of staff
who took the survey responded favorably (Strongly Agree or Agree) to the Core 22
survey questions. The favorability score on the Individual Index (I Index) was higher, at
81%, while scores were lower on the Manager Index (M Index, 69%) and the Executive
Index (57%).
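As a rough illustration of how a favorability score of this kind can be computed (a hypothetical sketch; BlessingWhite's actual methodology may differ, and the counts below are invented):

```python
# Hypothetical response counts aggregated across the Core 22 questions.
responses = {
    "Strongly Agree": 1100,
    "Agree": 1050,
    "Neutral": 400,
    "Disagree": 300,
    "Strongly Disagree": 178,
}

# Favorability: share of responses that were Strongly Agree or Agree.
favorable = responses["Strongly Agree"] + responses["Agree"]
favorability = favorable / sum(responses.values())
print(f"Overall Favorability Score: {favorability:.0%}")
```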
The Workgroup/Committee found that the greatest areas for growth are:
Creating avenues for staff to explore career opportunities
Working with Managers/Supervisors to provide regular, specific feedback and to
recognize and reward achievements
Building trust, and opening the lines of honest communication
Creating an environment that drives high performance
Identify Internal and External Best Practices, and Spearhead Organizational
Approaches to Creating More "Great Days at Work"
Because the question with the most opportunity for growth relates to Career
Development, the Workgroup/Committee has chosen to focus its efforts on assisting
with personal and professional development opportunities, which are planned and/or
currently underway.
The Workgroup/Committee has also formed a sub-committee, which has created a
targeted training on "Owning Your Own Engagement" that will be taught by
Engagement Champions and Workgroup/Committee Members and made available
to all staff in late Summer/early Fall.
The training is designed to foster a greater understanding of Engagement and its
importance in the workplace, reinforce that we are all responsible for our own
engagement, and promote the availability of the Employee Engagement Portals,
Reports, resources and tools.
In addition to the Owning Your Own Engagement training, the Employee Engagement
Committee is creating an incentive program that will reward teams whose
Managers/Supervisors conduct a Team Meeting and/or Individual Engagement
Conversations focused on the Engagement Survey Results.
Lastly, a strong showing of Engagement Champions and Committee members
attended Engage 2016, a GuideSpark conference focused on engagement and
communication. Many attendees walked away with new ideas, and following the
conference they met to formulate those ideas to share with the larger Committee
as possible future initiatives.
EMPLOYEE FEEDBACK AND EVALUATION
Ongoing Conversations to Improve Performance
County of San Mateo
Revised August 23, 2016
“We all need people who will give us feedback.
That is how we improve.”
– Bill Gates
Contents
About this Chapter and Workgroup .......................................................................................... 1
Performance Management in San Mateo County ................................................................ 2
Key Findings .................................................................................................................................... 2
Focus Group/Survey ..................................................................................................................... 3
External Best Practices ................................................................................................................. 4
Outreach ....................................................................................................................................... 4
Recommendations ....................................................................................................................... 4
Supervisor Resources for Employee Feedback, Goal Setting
and Evaluations ............................................................................................................................ 6
Upcoming ...................................................................................................................................... 7
About this Chapter and Workgroup
In the Fall of 2015, the County Manager's Office conducted a survey of all supervisors
and managers inquiring about their performance practices. Based on the survey results,
employee feedback and evaluation was identified as an area needing improvement.
In December 2015, the Employee Feedback and Evaluation (EFEs) Workgroup was
formed. The goals of the workgroup included the following:
Increase communication between employees and supervisors/managers
Increase the number of employees developing/completing goals that align to
program/department/County priorities
Increase the number of evaluations completed
The workgroup focused on the following:
Identifying Baseline Information
Conducting Internal Best Practices – Focus Group and Survey
Conducting External Best Practices – Public and Private Sectors
Identifying Opportunities to Promote Employee Feedback and Evaluation /
Conducting Outreach, Education, and Training
Developing Recommendations
The following individuals and departments contributed to this chapter of the
Performance Handbook:
Name Department
Rocio Kiryczun Human Resources Department
Theresa Rabe Human Resources Department
Carol Clancy Health System
Laurel Finnegan Parks Department
Felicia Flores Human Resources Department
Michael Leach Health System
Michelle Kuka (SME) Human Resources Department
Performance Management in San Mateo County
The County strives to promote a work environment that offers a systematic and
ongoing process of communication and collaboration between a supervisor and an
employee during a year-long performance management cycle that aligns individual
performance to the organization’s strategic priorities and supports individual learning
and development.
There are two major approaches currently used in the County:
Collaborative Performance Management System (CPMS)
Traditional (one-page performance evaluation)
The traditional one-page evaluation form focuses on prior-year results and sets goals for
the next cycle, while CPMS is more comprehensive and focuses on ongoing
performance conversations between supervisors and employees. The CPMS cycle
includes goal setting, feedback and coaching, and performance review (focusing on
goals and performance factors).
Best practices show that a solid performance management process includes frequent
communication between supervisor and employee relating to performance and
development goals, as well as overall feedback and coaching. While there are more
departments utilizing the traditional form, efforts are underway to expand the CPMS
model.
Key Findings
Baseline Information – About 44%-50% of evaluations are completed countywide
annually. Departments either utilize the traditional evaluation form (or some modified
version) or CPMS. The departments utilizing CPMS include Human Resources, County
Manager’s Office-Budget and Performance, Library, Health IT, Controller’s Office, and
Housing. This represents about 10% of the workforce. Employees in these
departments/divisions have performance and development goals, are having regular
performance conversations with their supervisors/managers, and have current
evaluations that focus on goals.
Focus Group/Survey
In February 2016, a focus group was conducted. Twelve (12) department
representatives participated in the discussion that focused on departmental
management practices. In addition, in April 2016 a survey was sent to all supervisors
and managers asking about their practices relating to holding regular meetings
with employees (1:1s), providing feedback and coaching, completing goals, and
conducting evaluations. There were 245 responses from 23 departments. The feedback
from the focus group and survey respondents included the following:
Challenges:
Having enough time for performance management – completing multiple
evaluations takes time (43% noted a challenge completing evaluations on time)
Providing quality evaluations (narrative) that are meaningful – keeping
information relevant and fresh
Ratings may not be meaningful, especially the “competent” rating
Ensuring documentation is happening throughout the year
Having different forms – not all evaluations are in the system or in the same format
Limited triggers – few notifications to indicate when evaluations are due
Consistency between supervisors/managers (relating to ratings and narrative)
Aligning goals to division/department
Needing more coaching resources
Having new technology/tools
Opportunities:
Simplify processes (to address time constraints)
Revamp ratings
Promote accountability – set the expectation for supervisors/managers that
meeting with staff and conducting evaluations are required (e.g., as a goal)
Provide outreach/education/training in the following areas:
o Workday – all aspects relating to goals, evaluations
o Performance evaluation process
o Goal setting and monitoring
Offer incentives/competition to complete goals/evaluations
Offer additional self-service/reporting – Workday
Review performance evaluation cycle/timing
Identify a method of tracking when evaluations are due
Provide resources to assist with coaching, 1:1 meetings, and documentation
The most common themes were time, accountability, quality evaluations, training,
triggers/reminders, documentation, the rating system, and resources.
External Best Practices
For the most part, the trend in the private sector has been to either do away with the
annual evaluation or eliminate ratings. The focus is more on performance discussions
relating to goals. These companies include Deloitte, Netflix, HubSpot, Accenture, and
Google, among others. We found that two counties are moving in this direction as well:
Riverside County and Pinellas County. Riverside is starting to focus on capacity building
– centering on development and growth. Pinellas County’s performance management
process is called FACE (Feedback, Ask Questions, Conversation, Explore Options). The
emphasis centers on setting and monitoring performance goals, ongoing feedback
and coaching, growth and development, and observing, noting, and summarizing
conversations. These discussions occur regularly and are documented. There are no
ratings.
The workgroup also reviewed other local agencies, including cities. It was observed that
many of these agencies have higher performance evaluation completion rates than
the County. This could be due to the size of the organizations. All of the local agencies
surveyed were using traditional evaluation forms and ratings.
Outreach
The team identified existing resources relating to employee feedback, goal-setting and
evaluations. Team members have also conducted Workday roadshows providing
information about the tool used in the performance evaluation process. Additional training
and outreach efforts are included as part of the team’s recommendations.
Recommendations
The workgroup has identified several short-term and long-term approaches to promote
employee/supervisor performance conversations, including feedback and coaching,
goal-setting and evaluations. CPMS offers all of these components; however, the
workgroup has determined that the CPMS materials need to be streamlined and
simplified before an outreach effort can occur. Additional recommendations focus on
training and revamping the rating system. The recommendations are as follows:
Phase I - Short-term Priorities (2016):
1. Promote existing resources pertaining to feedback and evaluation (e.g., email,
NEW, online Onboarding video)
a. Supervisors Online Support (SOS) site
b. Essential Supervisors Skills
c. HR Sessions
d. Training Matrix
e. Workday Training, Videos and Resources
f. Collaborative Performance Management System (CPMS) site/resources
2. Establish goal for supervisors/managers relating to 1:1s, coaching/feedback,
evaluations, training:
On an ongoing basis, set clear goals and expectations with direct reports, and
provide effective coaching and feedback so that staff can achieve their
performance and development goals: 1) collaborate with direct reports to
prepare plans with performance and development goals, 2) conduct one-on-
one meetings with direct reports on a bi-weekly/monthly basis to provide
feedback and support goal achievement, 3) ensure all staff complete 20 hours of
training by end of fiscal year, and 4) complete performance evaluations for all
direct reports.
3. Update/simplify Collaborative Performance Management System (CPMS)
materials and website, incorporating best practices, and market/promote them,
emphasizing goals and performance conversations
Phase II - Long-term Priorities (2017)
4. Form new subcommittee to:
a. Develop and offer online onboarding session for new supervisors/managers
relating to feedback and evaluation, e.g., conducting performance
conversations, developing and monitoring SMART goals, etc.
b. Revamp/Simplify performance evaluation ratings
c. Update all material and offer training, as needed
5. Work with County Manager’s Office to identify incentives for
departments/programs that have the highest number of employees with goals
Note: These recommendations will be presented to Executive Council in July 2016.
Supervisor Resources for Employee Feedback, Goal
Setting and Evaluations
SOS: Supervisors’ Online Support – Performance Management Page
Your go-to place for general performance resources and information
http://www.co.sanmateo.ca.us/hr/sos/perfmanagement.html
Collaborative Performance Management System (CPMS) Page
A page dedicated to the CPMS process. To learn about how CPMS differs from
traditional evaluations, click on the link below.
http://hr.smcgov.org/collaborative-performance-management-system-cpms
Contains a wealth of information including:
Performance Planning and Goal Setting; Performance Feedback and Coaching;
and Performance Review and Evaluation
Additional CPMS Resources http://hr.smcgov.org/cpms-resources
Training for Goal Setting and Traditional Performance Evaluations in Workday
Quick Reference Cards and Videos for Goals and Traditional Performance Evaluations
http://intranet.co.sanmateo.ca.us/workday/performance-2/
Full training video available on Yammer’s Workday Group
https://www.yammer.com/smcgov.org/#/groups/3170613/files
Online Trainings through LMS
Browse the catalog and check out the courses in Leadership Advantage 2.0
https://sanmateocounty.csod.com/client/sanmateocounty/default.aspx
In Person Training Opportunities
Communicating Expectations
Effective 1:1 Meetings
HR Basics
ESS – Setting Goals to Achieve Performance and Development (Required
Supervisor Course)
ESS – Providing Effective Feedback to Guide Performance (Required Supervisor
Course)
ESS - Preparing Meaningful Performance Evaluations (Required Supervisor Course)
ESS – Coaching for Performance and Development (Required Supervisor Course)
Upcoming
Training for CPMS Performance Reviews in Workday
Quick Reference Card and Video for CPMS
In Person Manager Workday Training
Training will cover several topics, including performance evaluation components
Additional Resources - Employee Feedback and Evaluation Workgroup
Recommendations
PERFORMANCE TRACKING
Visualization and Automation to Improve Performance
County of San Mateo
Revised August 23, 2016
“If you can’t measure it, you can’t manage it.”
– Peter Drucker
Contents
About this Chapter and Workgroup ...................................................................................................................................... 1
SMC Performance: Perspectives ............................................................................................................................................ 2
Data Entry Spreadsheet Redesign ......................................................................................................................................... 4
Data Automation ...................................................................................................................................................................... 6
About this Chapter and Workgroup
The Performance Priority Workgroup Team “Performance Matters” was formed with the goal of improving
countywide performance through more dynamic storytelling and visualizations in SMC Performance, as well as
automating data to reduce manual data entry.
Based on the FY 2014-15 Year-End Performance Survey, the use of SMC Performance to track and
report performance was identified as an area that needed improvement.
The Performance Matters workgroup approached this need in three different ways: (1) piloting a new
performance storytelling product in SMC Performance called Perspectives; (2) reimagining the way that data is
entered into SMC Performance; and (3) automating data to reduce or eliminate manual data entry for
performance measures, when feasible.
The following individuals and departments contributed to this chapter of the Performance Handbook:
Name | Department
Mary-Claire Katz | Office of Sustainability
John Ridener | Information Services Department
Bill Harven | Human Services Agency
Gina Wilson | Health System
Rolando Jorquera | County Manager’s Office
Alison Holt | County Manager’s Office
Jim Saco | County Manager’s Office
Michelle Durand | County Manager’s Office
Vanita Narayan | Information Services Department
Yvonne Ho | Department of Housing
This chapter will include the following sections:
Perspectives Introduction;
Revised data entry method; and
Data automation plan.
SMC Performance: Perspectives
San Mateo County is committed to data-driven performance. While numbers and charts are essential to
measuring performance, it is the story behind the performance that gives the full picture and engages the public
in the County’s work.
SMC Performance is a web-based platform that gives County departments the opportunity to combine charts
and data with narrative, pictures, and video to tell the whole story behind their performance. Perspectives is
the latest SMC Performance product, and it makes it easy to create a modern, responsive, and dynamic story
for colleagues and the public.
In the past, County departments have used a product from SMC Performance called Reports to tell their stories.
These reports were fairly standardized, static, and did not give departments much opportunity to tell an engaging
story with their performance data. Here is an example of a Report from the Office of Sustainability.
By moving performance information into Perspectives, departments will have the opportunity to create dynamic,
creative, and media-rich stories. Here is an example of a Story from the Office of Sustainability:
In addition to being less cluttered and easier to read, Stories can be turned directly into a PowerPoint-like
slideshow, eliminating the extra step of copying and pasting information into PowerPoint to get the slideshow
functionality that is preferable for public presentations.
Stories also have the ability to embed HTML, video, social media, and a Goal Tile (which will link back to the full
Goal Dashboard)—all new features not available in Reports.
Creating a Story is extremely user-friendly. It is as simple as dragging and dropping a component block (a
pre-loaded text box or visualization element) into your Story, then typing or adding media content. Stories
auto-save as you work, so you don’t have to worry about clicking save after every edit.
Follow this link [https://evergreen.data.socrata.com/stories/s/s6ah-d5t2] for step-by-step instructions on
creating a Story in Perspectives. The tutorial shows you everything you need to get started, from adding
text and changing fonts and colors to embedding Goal Tiles and visualizations such as videos and charts.
The Budget, Policy and Performance Unit in the County Manager’s Office, as well as the Open Data Coordinator
in the Information Services Department, will be available to assist you in creating your Stories.
Data Entry Spreadsheet Redesign
Departments are required to report their performance data to the County Manager’s Office, and they do
this through a dataset in the Open Data Portal.
Adding department performance data into a Socrata dataset is currently done by uploading one Excel
spreadsheet per Program. This spreadsheet was designed by the Budget, Policy and Performance Unit (BPP)
with both fixed drop-down menus and manual entry fields, for a total of 24 columns.
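For context, the sketch below shows one way such an upload could be scripted against the Socrata API rather
than performed through the portal’s web interface. It is a minimal, hypothetical sketch: the portal domain,
dataset ID, credentials, and column names are placeholders, and only four of the 24 columns are checked for
illustration.

# A minimal sketch (Python) of scripting the per-Program spreadsheet upload.
# The portal domain, dataset ID, credentials, and column names below are
# hypothetical placeholders, not the County's actual values.
import pandas as pd          # reading .xlsx files also requires openpyxl
from sodapy import Socrata   # Socrata Open Data API client

# Hypothetical subset of the 24 columns, used only to illustrate validation.
REQUIRED_COLUMNS = ["Department", "Program", "Measure", "Value"]

def upload_program_spreadsheet(path: str, dataset_id: str) -> None:
    df = pd.read_excel(path)  # one Excel spreadsheet per Program
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Spreadsheet is missing expected columns: {missing}")
    # Writing to a dataset requires publisher credentials and an app token.
    client = Socrata("data.smcgov.org", "APP_TOKEN",
                     username="publisher@example.com", password="...")
    client.upsert(dataset_id, df.to_dict(orient="records"))

upload_program_spreadsheet("program_measures.xlsx", "abcd-1234")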
Since creating the spreadsheet, the BPP Unit has learned that not all 24 columns are used by departments. With
this in mind, the Performance Matters Workgroup distributed a survey requesting feedback on these columns
from the Human Services Agency, the Information Services Department, the Health Department, and the BPP
Unit.
These surveys confirmed that many of these columns are not used by departments at all. A revised data entry
spreadsheet has been created based on these survey responses (Attachment 1). In addition to revising the
current data entry spreadsheet, the Performance Matters Workgroup discussed reimagining the way data is
entered into SMC Performance. In consultation with the County’s Open Data Coordinator, a new data
entry method was created, with the Human Services Agency and the Health System being the first
departments to pilot it.
Below is an example of the reimagined data entry method.
The 24 columns have been reduced to 6, and the metadata attached to each dataset in SMC
Performance fills in the information that the eliminated columns used to carry.
Additionally, instead of one dataset per department program, each performance measure has its own
dataset, which eliminates the need for complex filtering within a dataset.
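As an illustration only (the pilot’s actual six column names are not reproduced here), rows in a per-measure
dataset might look like the sketch below, with everything descriptive carried in the dataset’s metadata
rather than in columns:

# Hypothetical illustration (Python) of the reimagined per-measure dataset.
# The six column names are invented for this example; the pilot's actual
# columns may differ. Descriptive details (department, program, measure
# definition, units) live in the dataset's metadata, not in the rows.
rows = [
    {"fiscal_year": "FY 2015-16", "period": "Q1", "target": 90,
     "actual": 87, "status": "On Track", "notes": "Pilot quarter"},
    {"fiscal_year": "FY 2015-16", "period": "Q2", "target": 90,
     "actual": 92, "status": "Met", "notes": ""},
]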
When both the Human Services Agency and the Health System have piloted this new data entry method, the
Workgroup recommends expanding the pilot to other departments in order to continue gathering feedback,
with the ultimate goal of using this new data entry method during the next Performance Cycle year.
In the interim, it is recommended that the BPP Unit use the revised spreadsheet in Attachment 1 to eliminate
the unused columns and streamline data entry for departments.
Data Automation
Data automation is the process of connecting two or more applications so that data can be transferred
directly to SMC Performance without any manual data entry or uploading.
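In practice this usually takes the form of a small scheduled job that reads from the source application and
publishes to the dataset. The sketch below shows the general pattern, assuming (hypothetically) that the
source system exposes its data through a SQL database; the database, table, credentials, and dataset ID are
placeholders, not any actual County system.

# A minimal sketch (Python) of the data automation pattern. The database,
# table, credentials, and dataset ID are hypothetical placeholders.
import sqlite3
from sodapy import Socrata

def publish_latest_measures() -> None:
    # 1. Pull current values from the source application's database.
    conn = sqlite3.connect("source_system.db")
    cursor = conn.execute(
        "SELECT fiscal_year, period, target, actual FROM performance_measures"
    )
    columns = [d[0] for d in cursor.description]
    rows = [dict(zip(columns, r)) for r in cursor.fetchall()]
    conn.close()

    # 2. Push the rows directly to the SMC Performance dataset, with no
    #    manual spreadsheet entry or upload step.
    client = Socrata("data.smcgov.org", "APP_TOKEN",
                     username="publisher@example.com", password="...")
    client.upsert("abcd-1234", rows)

# Run on a schedule (e.g., nightly via cron) so the dataset stays current.
publish_latest_measures()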
SMC Performance has the capacity to publish data via automated processes, but the management of
department and program performance data has not yet taken advantage of this feature. The Performance
Matters Workgroup had a goal of automating a dataset and Workgroup members provided potential datasets
from their respective departments that were good candidates for automation.
The Open Data Coordinator began meeting with departments to see whether their data sources could be
automated. The Human Resources Department became the primary candidate for automation, as the
new Workday application collects employee evaluations, which is a performance measure that every County
department is required to report on.
The Information Services Department and the Human Resources Department have begun coordinating this
automation process, with the goal of completing automation in August 2016. Once this data has been
automated, this handbook will be updated with the process, best practices that emerge, and suggested
candidates for the next data automation.
The BPP Unit is in the process of choosing a new budget system, which is an ideal candidate for data
automation. A group of representatives from the Information Services Department and the BPP Unit has been
organized to discuss incorporating data automation practices along with the new budget system
implementation.
County Manager’s Office of Budget, Policy and Performance
Updated 08/01/2016
* Fiscal Officer Assignment   ** Project Manager Assignment
Name | Departmental Assignments | Phone # | Fax # | Pony #
Jim Saco | Budget Director (Unit Manager); Revenue / Budget Forecasting; Debt Financing; Budget System Replacement**; BRASS Administrator** | 363-4439 | 363-1916 | CMO105
Heather Ledesma | Principal Management Analyst – Performance, Health; Health System; Center for Continuous Process Improvement** | 363-4174 | 556-1751 | CMO105
Matthew Chidester | Principal Management Analyst – Community Services; Construction Funds; Department of Public Works; Capital Projects / 5-Year Capital Improvement Plan** | 363-4326 | 556-1751 | CMO105
Danae Ramirez | Management Analyst – Prosperous and Healthy Communities; County Support of the Courts; Department of Child Support Services; First 5 Commission; Human Services Agency; Private Defender Program; Big Lift**; SMC Saves**; Children’s Budget** | 363-4131 | 556-1751 | CMO105
Alison Holt | Management Analyst – Criminal Justice; Coroner’s Office; District Attorney; Message Switch; Probation Department; Sheriff / Office of Emergency Services; Measure A**; AB109** | 363-4957 | 556-1751 | CMO105
Sophie Mintier (Term) | Management Analyst – Admin/Community Services; Board of Supervisors; County Manager’s Office; Information Services Department; Office of Sustainability; Planning and Building Department; Real Property Services; Retirement (SamCERA) | | | CMO105
Jason Escareno (Fellow) | Agricultural Commissioner / Sealer; Assessor-County Clerk-Recorder; County Library; Department of Housing; Human Resources Department; Public Safety Communications; Menlo Park Fire District Analysis** | 599-1245 | 556-1751 | CMO105
Rolando Jorquera (Fellow) | Controller’s Office; County Counsel; Fire Protection / CSA #1*; Grand Jury*/**; LAFCo*; Parks Department, Parks Funds, Coyote Point; Treasurer-Tax Collector; Menlo Park Fire District Analysis**; Student Consultants Program** | 363-4396 | 556-1751 | CMO105
Joy Limin (Works in OOS) | Accountant; Trial Court Funding; Fiscal support for CMO/BOS/Non-Departmental/Office of Sustainability | 363-1941 | |
Divina Nicdao (Works in OOS) | Extra Help Fiscal Office Specialist; Fiscal support for Fire, CSA #1 and Coroner’s Office | 363-4137 | |