Page 1: Plan Evaluation & Implementation

HENRY JOHN N. NUEVA
ME 201: Strategic Management of Engineering Enterprise
Master in Management Engineering, Pangasinan State University

PLAN EVALUATION & IMPLEMENTATION

Strategic Planning

Page 2: Plan Evaluation & Implementation

At the end of this lecture and presentation, we will be able to:

• Verify the importance of Plan Evaluation and its concept as applied to project proposals;
• Be aware of the procedures and composition of the Evaluation Study Committee;
• Be able to identify the activities involved in conducting a Plan Evaluation;
• Understand the importance of evaluation results for a wider spectrum of development, and the effect of the feedback process.

Page 3: Plan Evaluation & Implementation

• The Concept of Evaluation
• The Plan Evaluation
• Primary Purpose
• Reasons for neglectful conduct
• Plan, Program & Project Evaluation
• Organization of the Evaluation Committee
• Preparing the Evaluation Proposal
• Implementing the Evaluation
• Data Gathering and Processing
• Presentation and Analysis of Data
• Findings & Conclusions
• Plan Update

Page 4: Plan Evaluation & Implementation
Page 5: Plan Evaluation & Implementation

EVALUATION

Evaluation, as applied to project proposals and planning management, is described as the "PROCESS OF ANALYZING PROJECT INPUTS, TRANSFORMATION TECHNIQUES, AND THE EFFECT AND IMPACT OF OUTPUTS AGAINST DEFINITE, STATED GOALS AND OBJECTIVES."

Hence,

In its simplest terms, evaluation is defined as the "SYSTEMATIC DETERMINATION OF THE WORTH AND SIGNIFICANCE OF SOMETHING USING CRITERIA AGAINST A SET OF STANDARDS."

Page 6: Plan Evaluation & Implementation

Plan Evaluation: a quality determination of Strategic Planning

Why, over the last twenty-five years, has Plan Evaluation been so grossly disregarded by planners and managers? Three reasons account for this neglectful conduct:

1. Planners' and managers' general opinion is that their main task is simply to put the project in place, in the hope that the expected results will follow;

2. They are reluctant to have their projects evaluated by an outside group, because their motivation, integrity, and competence are placed under scrutiny;

3. They presume that evaluation has no practical value, because whatever results emerge would not be put to effective use.

Page 7: Plan Evaluation & Implementation

But what is the importance of Plan Evaluation as applied to a project?

CONTROLLING BUDGET - During the planning process, the production manager is forced to create a production plan that fits the budget offered by the company. The budget should be evaluated and assessed each time a decision is made in the planning stages. Each decision may cost money, whether in labor costs or equipment fees. Examining the changing budget in the planning stages helps the production manager stay in control of spending.

Page 8: Plan Evaluation & Implementation

But what is the importance of Plan Evaluation as applied to a project?

ADDRESS RISKS UPFRONT - Another important reason why evaluation and assessment should be done during any planning stage is the set of risks associated with a given project. Each project performed by a company may carry risks, such as a lack of operational equipment, sick or absent employees, or the lack of a flexible budget. Each risk the production manager faces should have a set of solutions, so the risks are prevented upfront.

Page 9: Plan Evaluation & Implementation

But what is the importance of Plan Evaluation as applied to a project?

TIME FRAME - Without a steady and solid plan for a project, the timeline can be extremely flexible. The timeline may not have a deadline or monthly goals if a plan or schedule has not been created. Once a deadline has been set by the board of executives or the business owner, the production manager must evaluate the tasks to determine whether the project can be completed within the time frame provided. The tasks must be assessed to ensure the schedule is realistic.

Page 10: Plan Evaluation & Implementation

But what is the importance of Plan Evaluation as applied to a project?

QUALITY CONTROL - Once the planning is complete, the manager must go back and assess the schedule in terms of the quality produced in the given time. If the schedule is too packed, it may affect the quality of the production. The assessment and evaluation of the planning process are important to ensure the quality of the product, as the manager will be held responsible if it is not satisfactory.

Page 11: Plan Evaluation & Implementation

Plan Evaluation

PRIMARY PURPOSE:

"To determine the quality of a program by formulating a judgment"

Page 12: Plan Evaluation & Implementation

Two functions to consider in the purpose of an evaluation

FORMATIVE EVALUATIONS

SUMMATIVE EVALUATIONS

Page 13: Plan Evaluation & Implementation

FORMATIVE EVALUATIONS

An evaluation that is performed during the development or implementation of a project to help developers modify or improve the project.

Example: 

An example of formative evaluation includes mid-term surveys and focus groups asking students about a new technology that has been introduced. During a class, an evaluator gave the students a brief survey and engaged them in a discussion on specific topics. The results of this research led to an increased awareness of some of the problems students were having and suggested some short-term changes in how the programs were implemented, as well as some larger, longer-term changes in the way the courses were designed.

Page 14: Plan Evaluation & Implementation

SUMMATIVE EVALUATIONS

An evaluation that is performed near, at, or after the end of a project or a major section of a project.

Examples of summative evaluations range from relatively simple and common IDQ (Instructor Designed Questionnaire) results to a careful study that compares two sections of the same class using different methods of instruction. Careful planning is key to a successful summative evaluation, because determining the possible outcomes and developing criteria must occur well in advance of the completion of the project to be evaluated. ATL can provide support for planning and implementing this type of evaluation.

Page 15: Plan Evaluation & Implementation

2 Stages of Conducting Plan Evaluation

(Annual Project Basis)

PLAN, PROGRAM AND PROJECT EVALUATION

"Plan Evaluation, as a component of the strategic plan, provides the key to making strategic planning a cyclical and continuous process."

Page 16: Plan Evaluation & Implementation

[Figure: Annual Project/Program Timeline - Q1, Q2, Q3, Q4 to year end, with the Mid-Period Evaluation marked.]

In reference to the above model, the results of the Mid-Period Evaluation would have the specific program or project as its frame of reference.

Page 17: Plan Evaluation & Implementation

[Figure: Annual Project/Program Timeline - Q1, Q2, Q3, Q4 to year end, with the Final Plan Evaluation at year end.]

In reference to the above model, the results of the Final Plan Evaluation would partly tell whether the mission and the vision of the plan have been achieved or not. Accomplishment reports are then integrated and consolidated by the planner in reference to the set objectives.

Page 18: Plan Evaluation & Implementation

Major Effects of EVALUATION RESULTS

• It would identify conclusively whether program/project objectives are adequately and responsively attained or not.

• It conclusively resolves whether the plan's mission and vision are realized or not.

Page 19: Plan Evaluation & Implementation

What use will be made of the results (the outputs and outcomes, in terms of effects and impacts)?

1. Planners can use these results for research and study, since they are eventually recycled;

2. They may be used as feedback, or as an input into the planning process.

Page 20: Plan Evaluation & Implementation

Organization of the EVALUATION COMMITTEE

Since a medium- or long-term strategic development plan requires periodic evaluation, a committee of evaluators is highly recommended. It is tasked to perform in-depth reviews of selected evaluation issues, strategies, and methodologies.

The Evaluation Committee also discusses selected evaluation reports and makes suggestions for including evaluations of particular interest in the annual work program. It is also suggested that the committee be composed of individuals with a multi-disciplinary orientation, or of experts aligned with the project in focus.

Page 21: Plan Evaluation & Implementation

Elements in Preparing EVALUATION PROPOSAL

OVERVIEW OF THE EVALUATION

• All experts are briefed, orally or in writing, before the evaluation in order to inform them of the general evaluation guidelines and the objectives of the research area under consideration.

•Each proposal is evaluated against the applicable criteria independently by experts who fill in individual evaluation forms giving marks and providing comments.

•For each proposal a consensus report is prepared. The report faithfully reflects the views of the independent experts referred to in Step 2.

• A panel discussion may be convened, if necessary, to examine and compare the consensus reports and marks in a given area, to review the proposals with respect to each other, and to make recommendations on a priority order and/or on the possible clustering or combination of proposals.

Page 22: Plan Evaluation & Implementation

Elements in Preparing EVALUATION PROPOSAL

THE EVALUATION CRITERIA.

•In all circumstances, proposals are evaluated against the criteria for the instrument for which they are submitted. In clear-cut cases a proposal may be ruled out of scope by the Commission without referring it to experts.

• Any proposal for an indirect action which contravenes fundamental ethical principles, or which does not fulfil the conditions set out in the call, shall not be selected and may be excluded from the evaluation and selection procedure at any time.

•Any particular interpretations of the criteria to be used for evaluation are set out in the work programme, in particular the way in which they translate into the issues to be examined.

Page 23: Plan Evaluation & Implementation

Elements in Preparing EVALUATION PROPOSAL

PROPOSAL MARKING.

• Evaluators examine the individual issues comprising each block of evaluation criteria and, in general, mark the blocks on a six-point scale from 0 to 5 (or another marking scheme). An example of score markings is as follows:

0 - the proposal fails to address the issue under examination, or cannot be judged against the criterion due to missing or incomplete information
1 - poor
2 - fair
3 - good
4 - very good
5 - excellent

• Where appropriate, half marks may be given. If appropriate, evaluators may also be asked to give a mark to each of the individual issues comprising the blocks of criteria. Only the marks for the blocks of criteria are taken into account (after applying any weightings) for the overall score of the proposal.

Page 24: Plan Evaluation & Implementation

Elements in Preparing EVALUATION PROPOSAL

THRESHOLDS AND WEIGHTINGS.

•Thresholds may be set for some or all of the blocks of criteria, such that any proposal failing to achieve the threshold marks will be rejected. The thresholds to be applied to each block of criteria as well as any overall threshold are set out in the call. If the proposal fails to achieve a threshold for a block of criteria, the evaluation of the proposal may be stopped. The reasons will be detailed in the consensus report. It may be decided to divide the evaluation into several steps with the possibility of different experts examining different aspects. Where the evaluation is carried out in several successive steps, any proposal failing a threshold mark may not progress to the next step. Such proposals may immediately be categorised as rejected.

•According to the specific nature of the instruments and the call, it may be decided to weight the blocks of criteria. The weightings to be applied to each block of criteria are set out in the call.
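To make the marking, weighting, and threshold rules above concrete, here is a minimal sketch in Python; the block names, weights, and threshold values are assumptions chosen for illustration, not figures from any particular call or work programme.

    # Minimal sketch of block marking with weightings and thresholds.
    # Block names, weights and thresholds below are assumed example values.
    BLOCKS = {
        # block name: (weight, threshold on the 0-5 scale)
        "scientific_quality": (0.4, 3.0),
        "implementation":     (0.3, 3.0),
        "impact":             (0.3, 2.5),
    }
    OVERALL_THRESHOLD = 3.5  # assumed overall threshold on the weighted score

    def score_proposal(marks):
        """Return (weighted_score, passed, reasons) for one proposal.

        `marks` maps each block name to its consensus mark on the 0-5 scale;
        half marks are allowed.
        """
        reasons = []
        weighted = 0.0
        for block, (weight, threshold) in BLOCKS.items():
            mark = marks[block]
            if mark < threshold:
                reasons.append(f"failed threshold for {block}: {mark} < {threshold}")
            weighted += weight * mark
        if weighted < OVERALL_THRESHOLD:
            reasons.append(f"failed overall threshold: {weighted:.2f} < {OVERALL_THRESHOLD}")
        return weighted, not reasons, reasons

    if __name__ == "__main__":
        score, passed, reasons = score_proposal(
            {"scientific_quality": 4.0, "implementation": 3.5, "impact": 2.0}
        )
        print(score, passed, reasons)
        # -> 3.25, False: "impact" falls below its block threshold and the
        #    weighted score falls below the assumed overall threshold

In this sketch a proposal that scores well on some blocks can still be rejected for failing a single block threshold, mirroring the rule that a proposal failing a threshold may be stopped and categorised as rejected.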

Page 25: Plan Evaluation & Implementation

IMPLEMENTING THE EVALUATION

"Implementation of the evaluation is set and ready if and only if an organized team, an approved proposal, a released budget, and validated evaluation instruments are all prepared."

"Evaluation: EVIDENCE in the Program Development Process"

Page 26: Plan Evaluation & Implementation

EVALUATION IMPLEMENTATION MODEL

Evaluation Focus → Collect Data → Analyze & Interpret → Report

Page 27: Plan Evaluation & Implementation

Evaluation Focus

• The following guidelines on utility considerations will help determine the correct evaluation focus.

• What is the purpose of the evaluation?

Purpose refers to the general intent of the evaluation. A clear purpose serves as the basis for the evaluation questions, design, and methods. Some common purposes:

• Gain new knowledge about program activities
• Improve or fine-tune existing program operations (e.g., program processes or strategies)
• Determine the effects of a program by providing evidence concerning the program's contributions to a long-term goal
• Affect program participants by acting as a catalyst for self-directed change (e.g., teaching)

Page 28: Plan Evaluation & Implementation

Evaluation Focus

• The following guidelines on utility considerations will help determine the correct evaluation focus.

•Who will use the evaluation results?

Users are the individuals or organizations that will employ the evaluation findings in some way. The users will likely have been identified during Step 1 during the process of engaging stakeholders. In this step, you need to secure their input into the design of the evaluation and the selection of evaluation questions. Support from the intended users will increase the likelihood that the evaluation results will be used for program improvement.

Page 29: Plan Evaluation & Implementation

Evaluation Focus

• The following guidelines on utility considerations will help determine the correct evaluation focus.

• How will they use the evaluation results?

Uses describe what will be done with what is learned from the evaluation; many insights on use will have been identified in Step 1. The information collected may have varying uses, which should be described in detail when designing the evaluation. Some examples of uses of evaluation information:

• To document the level of success in achieving objectives
• To identify areas of the program that need improvement
• To decide how to allocate resources
• To mobilize community support
• To redistribute or expand the locations where the intervention is carried out
• To improve the content of the program's materials
• To focus program resources on a specific population
• To solicit more funds or additional partners

Page 30: Plan Evaluation & Implementation

Collect Data


Collecting data is a major part of any evaluation, but we need to take note that the method follows purpose.

• SOURCES OF EVALUATION INFORMATION

A variety of information sources exist from which to gather your evaluative data; in a major program evaluation, we may need more than one source.

The information sources we select will depend upon what is available and what answers the evaluation questions effectively. The most common sources of evaluative information fall into three categories:

1. EXISTING INFORMATION
2. PEOPLE
3. PICTORIAL RECORDS AND OBSERVATIONS

Page 31: Plan Evaluation & Implementation

Collect Data

EXISTING INFORMATION
Might make use of:
• Program documents
• Log-books
• Minutes of meetings
• Accomplishment reports
• Media releases
• Local statistics
• Agency data
• Etc.

PEOPLE
Think about who can best answer the questions via:
• Participants or beneficiaries (direct or indirect)
• Nonparticipants, proponents, critics, victims
• Experts and specialists
• Collaborators and policy makers

PICTORIAL RECORDS & OBSERVATIONS
Data collection via:
• Visual accounts
• Pictures and photographs
• Direct observation of situations
• Behaviors
• Program activities and outcomes

Page 32: Plan Evaluation & Implementation

Collect Data

Major Methods for collecting information about an Evaluation.

SURVEY - collecting standardized information through structured questionnaires to generate quantitative data. Surveys may be mailed, administered online through web pages, completed on-site, or administered through interviews conducted face to face, by telephone, or electronically.

Sample surveys use probability sampling, which allows us to generalize findings to a larger population; informal surveys do not.
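As a small illustration of that distinction, the sketch below (Python; the sampling frame of 500 program beneficiaries is a made-up assumption) draws a simple random sample in which every unit has an equal, known chance of selection, which is what permits generalizing the findings to the larger population.

    import random

    # Hypothetical sampling frame: 500 program beneficiaries identified by ID.
    sampling_frame = [f"beneficiary-{i:03d}" for i in range(1, 501)]

    random.seed(42)  # fixed seed only so the example draw is reproducible
    sample = random.sample(sampling_frame, k=50)  # simple random sample, n = 50

    print(len(sample), sample[:5])

An informal survey, by contrast, takes whoever happens to respond, so no comparable claim about the larger population can be made.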

Page 33: Plan Evaluation & Implementation

Collect Data

Major Methods for collecting information about an Evaluation.

CASE STUDY - an in-depth examination of a particular case: a program, a group of participants, a single individual, a site, or a location.

Case studies rely on multiple sources of information and methods to provide as complete a picture as possible.

Page 34: Plan Evaluation & Implementation

Collect Data

Major Methods for collecting information about an Evaluation.

• INTERVIEWS - information collected by talking with and listening to people. Interviews range along a continuum from those that are tightly structured (as in a survey) to those that are free-flowing and conversational.

Page 35: Plan Evaluation & Implementation

Collect Data

Major Methods for collecting information about an Evaluation.

• GROUP & PEER ASSESSMENT - collecting evaluation information through the use of group processes such as the nominal group technique, focus groups, brainstorming, and community forums.

Page 36: Plan Evaluation & Implementation

Analyze & Interpret

What is meant by ANALYZING DATA?

•Analyzing data involves examining it in ways that reveal the relationships, patterns, trends, etc. that can be found within it.

•That may mean subjecting it to statistical operations that can tell you not only what kinds of relationships seem to exist among variables, but also to what level you can trust the answers you’re getting. 

• It may mean comparing your information to that from other groups to help draw some conclusions from the data. The point, in terms of evaluation, is to get an accurate assessment in order to better understand the work, its effects, and the overall situation.

Page 37: Plan Evaluation & Implementation

Analyze & Interpret

Two types of data, and how to analyze them as applied to planning:

• QUANTITATIVE DATA
• QUALITATIVE DATA

Page 38: Plan Evaluation & Implementation

QUANTITATIVE DATA - information that is collected as, or can be translated into, numbers, which can then be displayed and analyzed mathematically.

Examples include:

• Frequencies
• Test scores
• Survey results
• Numbers or percentages

These data allow us to compare changes to one another, to changes in another variable, or to changes in another population. They can tell us, at a particular degree of reliability, whether those changes are likely to have been caused by your intervention or program, or by another factor, known or unknown. And they can identify relationships among different variables, which may or may not mean that one causes another.

Page 39: Plan Evaluation & Implementation

QUALITATIVE DATA

- Data collected as descriptions, anecdotes, opinions, quotes, interpretations, etc., which generally either cannot be reduced to numbers or are considered more valuable or informative if left as narratives.

The challenges of translating qualitative into quantitative data have to do with the human factor.  Even if most people agree on what 1 (lowest) or 5 (highest) means in regard to rating “satisfaction” with a program, ratings of 2, 3, and 4 may be very different for different people. 

Page 40: Plan Evaluation & Implementation

Analyze & Interpret

How to analyze & interpret gathered data?

•Record data in the agreed-upon ways.  These may include pencil and paper, computer (using a laptop or handheld device in the field, entering numbers into a program, etc.), audio or video, journals, etc.

•Score any tests and record the scores appropriately

•Sort your information in ways appropriate to your interest.  This may include sorting by category of observation, by event, by place, by individual, by group, by the time of observation, or by a combination or some other standard.

•When possible, necessary, and appropriate, transform qualitative into quantitative data.  This might involve, for example, counting the number of times specific issues were mentioned in interviews, or how often certain behaviors were observed.  
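As one small sketch of that last step (Python; the interview excerpts and the issue keywords are invented for illustration), counting how often specific issues are mentioned turns qualitative notes into simple frequencies:

    from collections import Counter

    # Hypothetical interview excerpts and issue keywords, invented for illustration.
    interviews = [
        "The budget was released late and the equipment arrived late as well.",
        "Training was helpful, but the budget limits made the schedule difficult.",
        "We lacked equipment for the field work during the second quarter.",
    ]
    issues = ["budget", "equipment", "training", "schedule"]

    counts = Counter()
    for note in interviews:
        text = note.lower()
        for issue in issues:
            if issue in text:
                counts[issue] += 1  # count each issue at most once per interview

    print(counts.most_common())
    # e.g. [('budget', 2), ('equipment', 2), ('training', 1), ('schedule', 1)]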

Page 41: Plan Evaluation & Implementation

Analyze & Interpret

How to analyze & interpret gathered data?

•Simple counting, graphing and visual inspection of frequency or rates of behavior, events, etc., over time.

•Calculating the mean (average), median (midpoint), and/or mode (most frequent) of a series of measurements or observations.

• Finding patterns in qualitative data. If many people refer to similar problems or barriers, these may be important in understanding the issue, determining what works or does not work and why, and more.

• Comparing actual results to previously determined goals or benchmarks. One measure of success might be meeting a goal for planning or program implementation, for example.
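As a small worked example of the second and fourth items above (Python; the monthly participation counts and the benchmark are assumed values for illustration only):

    import statistics

    # Hypothetical monthly participant counts for one program year.
    monthly_participants = [42, 38, 45, 51, 38, 47, 44, 50, 38, 46, 49, 52]
    benchmark = 45  # assumed target: an average of at least 45 participants per month

    mean = statistics.mean(monthly_participants)      # average
    median = statistics.median(monthly_participants)  # midpoint
    mode = statistics.mode(monthly_participants)      # most frequent value

    print(f"mean={mean:.1f}, median={median}, mode={mode}")
    print("benchmark met" if mean >= benchmark else "benchmark not met")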

Page 42: Plan Evaluation & Implementation

Report

Report: the final stage of Evaluation Implementation.

• Once we have organized the results and run them through whatever statistical or other analysis we planned, it is time to figure out what they mean for the evaluation. Probably the most common question that evaluation research is directed toward is whether the program being evaluated works or makes a difference. Depending on the nature of the research or project, results may be statistically significant or simply important or unusual; they may or may not be socially significant.

Page 43: Plan Evaluation & Implementation

Report

"What were the effects of the independent variable (the program, intervention, etc.) on the dependent variable(s) (the behavior, conditions, or other factors it was meant to change)?"

• Findings in the report should be stated in a clear, straightforward, and objective fashion.

• They should also be in agreement with the facts presented, briefly stated in answer to the questions raised, and preferably arranged sequentially in accordance with the order of the problems or objectives of the project.

• In the report, conclusions should be presented in a more detailed manner and should follow directly from the findings or tested hypotheses, if any.

Page 44: Plan Evaluation & Implementation

PLAN UPDATE

Recommendations advanced and proposed should be further verified and substantiated in light of the study's findings and conclusions. Once validated, these recommendations provide useful inputs to planners and managers in the planning and decision-making processes. Such inputs not only update the plan but also make the programs and projects more responsive and relevant.

Page 45: Plan Evaluation & Implementation

Website Sources:

en.wikipedia.org/wiki/Evaluation

http://www.ehow.com/info_8013194_importance-planning-evaluation-assessments.html

http://webxtc.extension.ualberta.ca/research/evaluation//evalModel3a.cfm?&subsectionid=3&sectionid=1&level3=6&sublevel3=17

http://www.er.undp.org/Procurement/docs/undp_procurement_evaluation.pdf

http://cordis.europa.eu/documents/documentlibrary/66623291EN6.pdf

http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html

http://learningstore.uwex.edu/assets/pdfs/G3658-4.pdf

http://ctb.ku.edu/en/tablecontents/chapter37/section5.aspx

Page 46: Plan Evaluation & Implementation

End of Presentation

HENRY JOHN N. NUEVA
PLAN EVALUATION & IMPLEMENTATION
Master in Management Engineering

THANK YOU