The Incremental Commitment Model process patterns for rapid‐fielding
projects
PhD Qualifying Examination Proposal
Supannika Koolmanojwong
November 2009
Viterbi School of Engineering
University of Southern California
Table of Contents
List of Figures
List of Tables
Abstract
Chapter 1 Introduction
1.1. Motivation
1.2. Research Questions
1.3. Intended Research Contribution
Chapter 2 Background and Related Work
2.1. The Incremental Commitment Model (ICM)
2.2. Software Development Processes
2.3. Process Patterns of Software Development Projects
2.4. USC Software Engineering Course
Chapter 3 Proposed Methodology
3.1. Design of Empirical Experiment
3.2. The Incremental Commitment Model – Electronic Process Guide (ICM EPG)
3.3. Process Decision Drivers
3.4. Hypothesis
3.5. Data Collection and Analysis
3.6. Threats to Validity
Chapter 4 Research Plan
4.1. Overview of Research Plan
4.2. Progress of the ICM Electronic Process Guide (EPG)
4.3. Status of Data Collection
References
Appendix A - Survey Form
Appendix B - Effort Category
Appendix C - ICM EPG: Roles, Activities, Work Products, and Delivery Process
Appendix D - Example of Decision Driver
Appendix E - CSCI 577 Projects
Appendix F - Client Feedback Form
Appendix G - Qualitative Interview Form
List of Figures
Figure 1. Net Centric Services Usage in Software Development Projects in USC Software Engineering Class
Figure 2. Statistics about Mashups created and listed at ProgrammableWeb.com (accessed 11/9/09)
Figure 3. Overview of the Incremental Commitment Model
Figure 4. Different Risk Patterns yield Different Processes
Figure 5. The Incremental Commitment Model Electronic Process Guide (EPG)
Figure 6. An Example of Using Decision Drivers to map with an NDI-Intensive Project
Figure 7. Research Timeline
Figure 8. Student Background Information Survey - Page 1
Figure 9. Student Background Information Survey - Page 2
Figure 10. Student Background Information Survey - Page 3
Figure 11. Student Background Information Survey - Page 4
Figure 12. Welcome Page of ICM EPG
Figure 13. List of Roles in ICM EPG
Figure 14. List of Practices in ICM EPG
Figure 15. Practice Page in ICM EPG
Figure 16. A task page in ICM EPG
Figure 17. A role and responsibilities page in ICM EPG
Figure 18. A delivery process page in ICM EPG
Figure 19. List of work products in ICM EPG
Figure 20. An Architected Agile team project status with the Architected Agile Decision Pattern
Figure 21. An Architected Agile team project status with the Use Single NDI Decision Pattern
Figure 22. An Architected Agile team project status with the NDI-Intensive Decision Pattern
Figure 23. An Architected Agile team project status with the Services-Intensive Decision Pattern
Figure 24. A Services-intensive team project status with the Services-Intensive Decision Pattern
Figure 25. A Services-intensive team project status with the Architected Agile Decision Pattern
Figure 26. A Services-intensive team project status with the Use Single NDI Decision Pattern
Figure 27. A Services-intensive team project status with the NDI-Intensive Decision Pattern
Figure 28. An NDI-intensive team project status with the NDI-Intensive Decision Pattern
Figure 29. An NDI-intensive team project status with the Architected Agile Decision Pattern
Figure 30. An NDI-intensive team project status with the Use Single NDI Decision Pattern
Figure 31. An NDI-intensive team project status with the Services-Intensive Decision Pattern
List of Tables
Table 1. Differences between NDI and Net-Centric Services
Table 2. Differences between NDI and NCS
Table 3. Characteristics of the Risk-Driven Process Patterns of the ICM
Table 4. Process Decision Drivers
Table 5. Status of Process Guidelines used in USC Software Engineering class
Table 6. Status of Data Collection
Table 7. Effort Categories in Effort Reporting System
Table 8. List of Projects and their process in Fall 2008 - Spring 2009
Table 9. List of Projects and their process in Fall 2009 - Spring 2010
Abstract
To provide better services to customers and to remain competitive, organizations can draw on a wide
variety of ready-to-use software and technologies to build software systems at a very fast pace. Rapid
fielding therefore plays a major role in developing software systems that respond quickly to
organizational needs. This research investigates the appropriateness of current software development
processes and develops new software development process guidelines, focusing on four process
patterns: Use Single NDI, NDI-Intensive, Services-Intensive, and Architected Agile. Currently, no single
software development process model is applicable to all four process patterns, but the Incremental
Commitment Model (ICM) can help a new project converge on a process that fits its process scenario.
The output of this research will be implemented as an Electronic Process Guide that USC Software
Engineering students will use as a guideline for developing real-client course projects. An empirical
study will be conducted to verify the suitability of the newly developed process against results data
from previous course projects.
Chapter 1 Introduction
1.1. Motivation
The growing diversity of software systems (requirements‐driven, NDI‐driven, services‐driven, learning‐
driven, qualities‐driven, systems of systems) has made it clear that there are no one‐size‐fits‐all
processes for the full range of software systems. Some process models are being developed that provide
specific evidence‐based and risk‐based decision points. One of the most thoroughly elaborated of these
models is the Incremental Commitment Model (ICM). The ICM with its risk‐driven nature and its process
decision table can help new projects converge on a process that fits their process drivers and
circumstances. Selecting and following the appropriate process pattern helps a development team finish
the project faster and more efficiently. A set of decision criteria and the ICM decision points have been
defined on a general‐experience basis. Quantitative evidence is lacking on the ability of the criteria to
produce a viable process decision early in the life cycle.
The USC real-client MS-level team projects provide a significant number of projects that fit four of the
ICM process patterns: Architected Agile, Use Single NDI, NDI-Intensive, and Services-Intensive. The
research will test the hypothesis that the decision criteria can determine a viable process during the
first stage of the life cycle. Architected Agile exemplifies a scalable balance between plan-driven and
agile approaches [Boehm and Turner 2004]. NDI offers software developers a way to reduce
development time and cost while increasing software quality and productivity via software reuse [Basili
and Boehm 2001; Li et al. 2006]. Use Single NDI provides the option of a ready-to-use product as either
a complete project solution or a partial development solution.
Figure 1. Net Centric Services Usage in Software Development Projects in USC Software Engineering Class
For Services-intensive projects, Figure 1 shows an increasing trend in the use of net-centric services in
real-client software development projects in USC's Software Engineering class. The trend is not limited
to the academic environment: 80% of the world economy provides services in various forms [CMMI for
Services 2009]. As shown in Figure 2, about 3 new mashups or web service extensions are created and
listed at programmableweb.com each day [Programmableweb.com 2009]. The users who consume
these services need to know how to select the right service and utilize it properly (a small illustrative
sketch follows Figure 2). Moreover, our preliminary study of the Fall'08-Spring'09 projects found that
for some projects a pure COTS-Based Development process does not fit the Net-Centric Services case well.
Figure 2. Statistics about Mashups created and listed at ProgrammableWeb.com (accessed 11/9/09)
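The basic pattern behind these mashups, calling a remote service over the web and combining its output with local data, can be made concrete with a short sketch. The endpoint URL and the JSON field names below are illustrative assumptions, not a real API.

```python
# Minimal mashup-style client: call a remote net-centric service and
# combine its output with local data. The endpoint and the JSON fields
# ("lat", "lon") are illustrative assumptions, not a real API.
import json
import urllib.parse
import urllib.request

SERVICE_URL = "https://api.example.com/geocode?address="  # hypothetical endpoint

def geocode(address):
    """Ask the (hypothetical) service for the coordinates of an address."""
    with urllib.request.urlopen(SERVICE_URL + urllib.parse.quote(address)) as resp:
        data = json.load(resp)  # assumed payload: {"lat": ..., "lon": ...}
    return data["lat"], data["lon"]

# The "mashup" step: join remote results with locally held records.
local_addresses = ["642 W 34th St, Los Angeles", "3650 McClintock Ave, Los Angeles"]
pins = [geocode(addr) for addr in local_addresses]  # e.g., to plot on a map widget
```

The point of the sketch is that the consumer writes only glue code; the service's functionality, availability, and upgrade schedule remain under the provider's control, which is exactly what the process guidelines must address.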
This research mainly experiments with new software development processes for Architected Agile and
Services‐Intensive and extends Yang and Boehm’s COTS‐Based Application Development guidelines
(CBD) [Yang 2006] by using the risk‐driven approach of the Incremental Commitment Model and
incorporating feedback from Bhuta’s empirical analysis on COTS interoperability assessment [Bhuta
2007].
1.2. Research Questions
Research questions are developed around the concepts of process modeling and process improvement.
RQ 1. How does the Incremental Commitment Model fit in each process pattern?
RQ 2. What are the decision criteria that branch a project to each process pattern?
RQ 3. What activities, roles, responsibilities, and work products should exist for each process pattern?
RQ 4. In what ways do the process patterns improve project outcomes?
RQ 5. In what ways could the process patterns be improved?
1.3. Intended Research Contribution
The research is intended to provide the following contributions:
• Current Software Development Process Investigation and Analysis
• Decision Criteria of Process Deployment
• Software Development Process Guidelines for the four process patterns
• The ICM Electronic Process Guide
Chapter 2 Background and Related Work
2.1. The Incremental Commitment Model (ICM)
The ICM [Boehm and Lane 2007; Pew and Mavor 2007] is a new generation process model. ICM covers
the full system development life cycle consisting of the Exploration phase, Valuation phase, Foundations
phase, Development phase, and Operation phase. ICM has been evaluated to be a reasonably robust
framework for system development. The core concepts of the ICM include 1) commitment and
accountability of system sponsors, 2) success‐critical stakeholder satisficing, 3) incremental growth of
system definition and stakeholder commitment, 4) concurrent engineering, 5) iterative development
cycles, and 6) risk‐based activity levels and milestones. One of the main focuses of the ICM is feasibility
analysis; evidence must be provided by the developer and validated by independent experts. The ICM
combines the strengths of various current process models and limits their weaknesses. Like the
V-Model [V-Model 2009], the ICM emphasizes early verification and validation, but it allows a
multiple-increment interpretation and relaxes the reliance on sequential development. Compared to the
Spiral Model [Boehm 1988], the ICM also focuses on risk-driven activity prioritization, but offers an
improvement by adding well-defined in-process milestones. Like RUP and MBASE [Boehm 1996], the ICM
performs concurrent engineering that stabilizes the process at anchor point milestones, and it
additionally supports integrated hardware-software-human factors development. Compared with agile
methods [Agile 2009], the ICM embraces adaptability to unexpected change while at the same time
allowing scalability.
Figure 3. Overview of the Incremental Commitment Model
2.2. Software Development Processes
The proposed process patterns in the ICM are developed by combining strengths from several existing
development processes. The Architected Agile case balances a plan-driven approach to building a stable
architecture with an agile approach of iterative, incremental, and frequent delivery, as practiced in
Scrum [Rising 2000] or the Agile Unified Process [Ambler 2009]. The Software Engineering Institute (SEI)
CMMI-COTS [CMMI-COTS 2009] and the USC-CSSE COTS-Based Development (CBD) Guidelines [Yang
2007] provide strong foundations for Use NDI, NDI-Intensive, or COTS-Based Systems (CBS). For the
Services-Intensive case, most existing processes, including CMMI-SVC [CMMI-SVC 2009], cover only how
to develop and maintain web services; none of them addresses how to select and use available online
services. Although the Services-Intensive case is similar to the NDI-Intensive case, we found that the
differences between NDI and NCS shown in Table 1 and Table 2 below make the CBD guidelines an
imperfect fit for a services-based development process.
Having properly defined software process models is essential, but the ability to effectively communicate
those models to the software engineers is also important. At USC‐CSSE, we have been using the IBM
Rational Method Composer to develop an Electronic Process Guide (EPG) for the ICM. Currently, the ICM
EPG covers the guidelines for all four process patterns.
Table 1. Differences between NDI and Net-Centric Services
(NDI: Non-Developmental Item, including open source and customer-furnished software; NCS: Net-Centric Services)

Payment
• NDI: Non-commercial items usually have no monetary cost; expensive initial costs, moderate recurring fee, training fee; licensing-arrangement-dependent
• NCS: Not all services are free, mostly pay per transaction; low initial costs, moderate marginal cost; duration-dependent license

Platform
• NDI: Specific and limited to a particular platform / language; generally supported on a subset of platforms, or multiple platforms but with different editions
• NCS: Platform and language independent; server and client can work on different platforms; interaction between machines over a network

Integration
• NDI: Generally more tightly coupled; not very flexible with existing legacy systems when a proprietary standard is used; difficult when platform-dependent and different technologies are involved; detailed documentation and extensive on-site support
• NCS: Generally more loosely coupled; common web standards, flexible, easy to integrate; requires internet access; support forums and API documentation available; integration can be done merely in code, without additional installation of external components

Changes
• NDI: Able to freeze the version, under user control; designed for specific use, so costly for customization and change; a change on the server side does not impact the client side; major releases once in a while; requires end-user intervention to upgrade
• NCS: Changes are out of the developers' control; not easy to predict change, cannot avoid upgrades; the end user has the latest version of the service; a change on the server side can affect the client side; minor releases frequently (through patching); does not require end-user intervention

Extensions
• NDI: Only if source is provided and the license permits; extension must be delivered to and performed at the end user's site; custom extensions may not be portable across COTS or compatible with future releases
• NCS: Extension is limited to data provided by the web services; in-house extensions such as wrappers or mashups; little control over performance overhead

Evaluation Criteria
• NDI: Maintenance, extensibility, scalability, reliability, cost, support, usability, dependency, ease of implementation, maintainability, upgrades, size, access to source and code-escrow considerations; upfront costs as opposed to subscription; platform compatibility; feature controllability
• NCS: Reliability, availability, cost, available support, speed, predicted longevity of the service provider, release cycle, bandwidth; recurring costs to use the service and future functionality offered; standards compatibility; feature and data controllability

Support Services
• NDI: Vendor support for integration, training, and tailoring/modification sometimes available for a fee; help topics or FAQs would likely not be updated after installation; upgrades/patches and data migration support; sometimes can be customized for a specific user; upgrade through purchasing new releases, self-install
• NCS: Support for tailoring/modification and training generally not available; help topics generally updated frequently, self-learning; usually not customized for a specific user; patching on the service provider's side, mostly without installation on the client side

Data
• NDI: Data often stored locally, backups generally the responsibility of the user; data access is generally fast; possible variety of proprietary formats; may be inflexible to change but more secure; platform-dependent data format; can process data offline
• NCS: Data stored on the service host's servers, backups by the provider, which introduces privacy and data-retention concerns; data access could be slower since it is internet-based; common XML using web standard protocols; data from different web services can be used by a single client program; processes data online
Table 2. Differences between NDI and NCS

Characteristic: NDI | NCS
• Platform independent: Yes / No | Yes
• Required internet access: Yes / No | Yes
• Common standard: No | Yes
• Option of rejecting next release: Yes | No
• Change / upgrade control: Client's / server's site | Server's site
• End user has the latest version: Yes / No | Yes
• Database ownership: Yes | Yes / No
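One way to put the distinctions in Tables 1 and 2 to work during component appraisal is to encode each characteristic as a field and flag mismatches against project constraints. The sketch below is a minimal illustration; the field names and risk rules are my own assumptions, not part of the course guidelines.

```python
from dataclasses import dataclass

@dataclass
class ComponentProfile:
    """Characteristics adapted from Table 2; field names are illustrative."""
    platform_independent: bool
    requires_internet: bool
    common_standard: bool
    can_reject_next_release: bool

# A typical NCS profile implied by Table 2 (NDI answers are project-specific).
typical_ncs = ComponentProfile(
    platform_independent=True,
    requires_internet=True,
    common_standard=True,
    can_reject_next_release=False,
)

def flag_risks(profile, needs_offline, needs_version_freeze):
    """List appraisal risks for a candidate component given two project constraints."""
    risks = []
    if needs_offline and profile.requires_internet:
        risks.append("requires internet access, but the system must run offline")
    if needs_version_freeze and not profile.can_reject_next_release:
        risks.append("cannot freeze the version; upgrades are under the provider's control")
    return risks

print(flag_risks(typical_ncs, needs_offline=True, needs_version_freeze=True))
```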
2.3. Process Patterns of Software Development Projects
Given a project description, one may not know whether it should follow a general waterfall or agile
development process or whether it needs to adopt a different development process. As shown in Table 3,
in the Exploration phase, once the initial scoping, project risks, and the project's nature are known, the
following process drivers generally help determine which process pattern the project should follow:
the system's size and complexity; its rate of change; its mission criticality; the extent of NDI support
for desired capabilities; and the available organizational and personnel capability for developing the
system.
Table 3. Characteristics of the Risk-Driven Process Patterns of the ICM

Architected Agile (example: business data processing)
• Size, complexity: Med
• Change rate: 1-10 %/month
• Criticality: Med-High
• NDI support: Good; most in place
• Organizational and personnel capability: Agile-ready, Med-high
• Time/build; time/increment: 2-4 weeks; 2-6 months

Use Single NDI (example: small accounting)
• NDI support: Complete

NDI-Intensive (example: supply chain management)
• Size, complexity: Med-High
• Change rate: 0.3-3 %/month
• Criticality: Med-Very High
• NDI support: NDI-driven architecture
• Organizational and personnel capability: NDI-experienced; med-high
• Time/build; time/increment: SW: 1-4 weeks; systems: 6-18 months

Services-Intensive (example: community services or special interest group)
• Size, complexity: Low-Med
• Change rate: 0.3-3 %/month
• Criticality: Low-Med
• NDI support: Tailorable service elements
• Organizational and personnel capability: NDI-experienced
• Time/build; time/increment: <= 1 day; 6-12 months
Moreover, as shown in Figure 4, different risk patterns yield different software processes. For example,
in the second pattern, if the developers spend a good amount of effort during the Exploration phase to
find a perfect NDI that satisfies all the win conditions, the development team can skip the Valuation and
Foundations phases. In the third and fourth patterns, if the development team finds possible NDIs/NCSs
in the Exploration phase, the team can spend more effort and time evaluating the NDIs/NCSs and
prioritizing their win conditions or constraints. On the other hand, since the NDIs/NCSs will provide the
majority of the end-product features, the development team can spend less time on Foundations and
Development-related efforts. Processes in the NDI and NCS cases might look similar, but they differ in
the details, as noted in Table 3.
Figure 4. Different Risk Patterns yield Different Processes
For the small e-services projects developed in our project course, four of the 12 process patterns of the
ICM [Boehm et al. 2009] predominate:
Architected Agile – For teams of fewer than 80 agile-ready people and a fairly mature technology
project, agile methods can be scaled up using an Architected Agile approach that emphasizes early
investment in a change-prescient architecture and team building among all success-critical stakeholders
[Boehm and Turner 2004]. The Valuation and Foundations phases can be brief. A Scrum of Scrums
approach can be used in the Development phase.
Use NDI – When an appropriate NDI (COTS, open source, reuse library, customer-furnished package)
solution is available, the options are to use the NDI, to develop a perhaps-better version oneself, or to
outsource such a development; the latter two generally incur more expense and take longer to begin
capitalizing on the benefits. On the other hand, an NDI may come with high volatility, complexity, or
incompatibility, in which case major effort will be spent on appraising the NDI.
NDI-Intensive – An NDI-intensive system is one in which at least 30% of the end-user functionality is
provided by NDI [Yang 2006]. A great deal of attention goes into appraising the functionality and
interoperability of the NDI, the effort spent on NDI tailoring and integration, and NDI upgrade
synchronization and evolution [Li et al. 2006; Morisio et al. 2000].
Services‐Intensive –Net Centric Services support community service organizations in their online
information processing services such as donation, communication or their special interest group
activities such as discussion boards, file sharing, and cloud computing. As in the NDI-Intensive case, the
focus goes to appraising the functionality of the available services and tailoring them to meet needs.
2.4. USC Software Engineering Course
In the keystone two‐semester team project graduate software engineering course sequence CS577ab
[USC CSCI577 2008] at USC, students learn through experience how to use good software engineering
practices to develop software systems from the Exploration Phase to the Operation Phase, all within a
24‐week schedule. Six on‐campus and two off‐campus students team up to develop real‐client software
system products. Based on the nature of the course projects, all teams will follow the ICM Exploration
Phase guidelines to determine their most appropriate process pattern. Most of the clients are
neighborhood non‐profit organizations, small businesses or USC departments. Examples of the projects
are an accounting system, an art gallery web portal, a theatre script online database, and an EBay search
bot. Because of the semester break between the Fall and Spring semesters, we added a short
Rebaselined Foundations phase to the course process to accommodate possible changes.
Chapter 3 Proposed Methodology
3.1. Design of Empirical Experiment
Experiment Preparation: I applied the ICM process to itself by first exploring the problems of current
processes and the opportunities to improve them. Besides an extensive literature review, I collected data,
feedback, and input from the CSCI 577 software engineering class in Fall 2008-Spring 2009, where the
first version of the ICM EPG for Architected Agile and the USC-CSSE CBD guidelines were used. The
collected data are: a) students' individual critiques, mainly focused on process improvement; b) effort
reports; c) weekly risk reports; d) grades; e) interviews with the service-based development teams that
used the CBD guidelines and with the CBD teams, covering feedback on the guidelines and criteria and
incidents of process-pattern decision changes; and f) feedback from clients. These data are being
analyzed and will be used as feedback and input to develop the EPG for the four process patterns.
Pre‐experiment: Before the project starts, there will be a questionnaire to collect students’ general
information and appraise their knowledge about general software engineering, NDI and NCS.
Experiment: The experiment will start once the students form their teams and select their projects.
Teams will follow the ICM Exploration phase process to determine their most appropriate process
patterns. Tutorials, lectures, assignments, and extra readings, similar to previous years, will help
students climb the software engineering learning curve. In addition, they will use material on the
four process patterns. During the experiment, the following tools and activities will be used to collect
data and to ensure project success: the ICM EPG, the effort reporting system, a risk analysis tool, the
NDI interoperability assessment tool [Bhuta 2007], architecture review boards, verification and
validation by off-campus students and teaching staff, and team mentoring by Software Engineering PhD students.
Post-Experiment: After the experiment, the following information will be collected and analyzed:
a) follow-up questionnaires; b) individual critiques; c) clients' feedback; d) effort; e) risk items;
f) incidence of direction changes at reviews relative to outcomes; and g) team performance based on team grading.
3.2. The Incremental Commitment Model – Electronic Process Guide (ICM EPG)
Effectively communicating the software process model to the software engineers is essential in enabling
them to understand the overall process as well as specific areas of focus. To satisfy the objective of
helping students learn the software processes, the ICM EPG is developed by using the IBM Rational
Method Composer (RMC) [IBM RMC 2008]. The ICM EPG, as shown in Figure 5, describes the software
development process by providing guidance in multiple-view representations: role-based,
activity-based, chronological event-based, and artifact-based. Moreover, additional artifact templates
and supplementary guides have been shown to shorten the users' learning curve and to support them in
their development process [Koolmanojwong 2007; Phongpaibul 2007]. Samples of the ICM EPG can be
found in Appendix C.
Figure 5. The Incremental Commitment Model Electronic Process Guide (EPG)
3.3. Process Decision Drivers
In order to select the appropriate process pattern, the development team will use the 16 decision
drivers presented in Table 4 to evaluate the project status and map it against the maximum-minimum
boundary range of each candidate process pattern.
Table 4. Process Decision Drivers
(Each criterion also carries an Importance level of 1-Low, 2-Medium, or 3-High; the project-status ranges
are given in the order Architected Agile | Use NDI | NDI-Intensive | Services-Intensive.)

Alternatives
• More than 30% of features available in NDI/NCS: 0-1 | 2-3 | 3-4 | 3-4
• Has a single NDI/NCS that satisfies a complete solution: 0-1 | 4 | 2-3 | 2-3
• Very unique / inflexible business process: 2-4 | 0-1 | 0-1 | 0-1

Life Cycle
• Need control over upgrade / maintenance: 2-4 | 0-1 | 0-1 | 0-1
• Rapid deployment; faster time to market: 0-1 | 4 | 2-4 | 2-3

Architecture
• Critical on compatibility: 2-4 | 3-4 | 1-3 | 2-4
• Internet connection independence: 0-4 | 0-4 | 0-4 | 0
• Need high level of services / performance: 0-4 | 0-3 | 0-3 | 0-2
• Need high security: 2-4 | 0-4 | 0-4 | 0-2
• Asynchronous communication: 0-4 | 0-4 | 0-4 | 0
• Access data anywhere: 0-4 | 0-4 | 0-4 | 4

Resources
• Critical mass schedule constraints: 0-1 | 3-4 | 2-3 | 2-4
• Lack of personnel capability: 0-2 | 3-4 | 2-4 | 2-3
• Little to no upfront costs (hardware and software): 0-2 | 2-4 | 2-4 | 3-4
• Low total cost of ownership: 0-1 | 0-3 | 0-3 | 2-4
• Not-so-powerful local machines: 1-4 | 1-3 | 0-4 | 3-4
The “Importance” attribute, whose values are 1-Low, 2-Medium, and 3-High, acts as a tie breaker to
support the selection of the best-fit process pattern. Project-status values range over 0-Very Low,
1-Low, 2-Moderate, 3-High, and 4-Very High. Figure 6 shows an example of a team that is developing a
simple website. The team found a possible content management system NDI, but it does not satisfy all
of the capability win conditions. The team rated the project status on the 16 decision drivers; the result
is shown as the blue line. The background block diagram is the max-min boundary of an NDI-Intensive
project. The red underline represents a High importance level, while the green dashed underline
represents a Low importance level. As a result, the decision drivers show that this team could follow
the NDI-Intensive software development process (a sketch of this mapping logic follows Figure 6). More
examples can be found in Appendix D.
Figure 6. An Example of Using Decision Drivers to map with an NDI‐Intensive Project
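The mapping in Figure 6 can be read as an interval-containment check: each of the 16 ratings either falls inside a pattern's min-max band from Table 4 or misses it, with the Importance level weighting the misses. The sketch below shows that logic for two of the drivers; the scoring rule (pick the pattern with the lowest importance-weighted out-of-band count) is an assumed formalization of the tie-breaking, not the exact procedure used in the course.

```python
# Sketch of the decision-driver mapping in Figure 6, for two of the 16
# drivers in Table 4. The fit rule (lowest importance-weighted count of
# out-of-band ratings) is an assumption, not the course's exact procedure.
PATTERNS = ["Architected Agile", "Use NDI", "NDI-Intensive", "Services-Intensive"]

# (driver, importance 1-3, {pattern: (min, max) band from Table 4})
DRIVERS = [
    ("More than 30% of features available in NDI/NCS", 3,
     {"Architected Agile": (0, 1), "Use NDI": (2, 3),
      "NDI-Intensive": (3, 4), "Services-Intensive": (3, 4)}),
    ("Need control over upgrade / maintenance", 2,
     {"Architected Agile": (2, 4), "Use NDI": (0, 1),
      "NDI-Intensive": (0, 1), "Services-Intensive": (0, 1)}),
]

def misfit(ratings, pattern):
    """Importance-weighted count of ratings outside the pattern's bands."""
    total = 0
    for (driver, importance, bands), rating in zip(DRIVERS, ratings):
        low, high = bands[pattern]
        if not low <= rating <= high:
            total += importance
    return total

ratings = [4, 1]  # project status per driver: 0 (Very Low) .. 4 (Very High)
best = min(PATTERNS, key=lambda p: misfit(ratings, p))
print(best)  # "NDI-Intensive" (tie with Services-Intensive broken by list order)
```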
3.4. Hypothesis
The ICM characteristics in each process pattern
• H1a: Commitment and Accountability exist in each process pattern
• H1b: Success‐critical stakeholder satisficing exists in each process pattern
• H1c: Incremental growth of system definition and stakeholder commitment exists in each process
pattern
• H1d: Concurrent engineering exists in each process pattern
• H1e: Iterative development cycles exist in each process pattern
• H1f: Risk-based activity levels and milestones exist in each process pattern
Decision driver and each process pattern
• H2a: if Evaluating Alternative is part of the decision criteria, the development team can effectively
select the appropriate process pattern
• H2b: if Control Over Project Life Cycle is part of the decision criteria, the development team can
effectively select the appropriate process pattern
• H2c: if Architectural Performance is part of the decision criteria, the development team can
effectively select the appropriate process pattern
• H2d: if Project Resources is part of the decision criteria, the development team can effectively select
the appropriate process pattern
Project outcome improvement
• H3: The development teams following the correct process pattern would outperform others using
traditional processes
Process patterns improvement
• H4: Gaps identified between the process guidelines and the project outcomes (from RQ3 and RQ4) can
be used to further improve the process patterns
3.5. Data Collection and Analysis
Data collection will be carried out to answer the research questions.
RQ 1. How does the Incremental Commitment Model fit in each process pattern?
The 6 key concepts of the ICM are Commitment and accountability, Success‐critical stakeholder
satisficing, Incremental growth of system definition and stakeholder commitment, Concurrent
engineering, Iterative development cycles, and Risk-based activity levels and milestones. Empirical
analysis will be performed to verify whether each key concept is present in each process pattern.
Validation will use concrete examples.
RQ 2. What are the decision criteria that branch a project to each process pattern?
The initial decision criteria or decision drivers have been developed based on experience and the
literature review. The development teams are currently using the decision drivers to help them select the
appropriate process. Qualitative interviews and the incidence of direction changes will be used to refine
the decision criteria.
RQ 3. What activities, roles, responsibilities, and work products should exist for each process pattern?
Based on the current CBD guidelines [Yang 2007] and various literature reviews, activities, roles,
responsibilities, and work products will be defined and developed into the ICM Electronic Process Guide.
At the end of the semester, qualitative interviews and students' individual critiques will be used as the
main sources for improving the process guidelines.
RQ 4. In what ways do the process patterns improve project outcomes?
The following measurements will be used to identify whether the proposed process patterns have
improved project outcomes (a sketch of one possible statistical comparison follows the list):
‐ Personnel effort
‐ Number of defects
‐ Team performance
‐ Client satisfaction
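Because these measures will be compared against results data from earlier semesters with small, likely non-normal samples, a nonparametric test is one reasonable choice. A minimal sketch, assuming per-team defect counts for two cohorts (the numbers below are placeholders, not actual course data):

```python
# Sketch of comparing one outcome measure between cohorts with a
# nonparametric test. The defect counts are placeholders, not CSCI577 data.
from scipy.stats import mannwhitneyu

defects_prior = [14, 9, 17, 11, 13, 16, 10]  # e.g., Fall'08-Spring'09 teams
defects_new = [8, 12, 7, 10, 6, 9, 11]       # e.g., Fall'09-Spring'10 teams

# One-sided test: do teams using the new process patterns report fewer defects?
stat, p_value = mannwhitneyu(defects_new, defects_prior, alternative="less")
print(f"U = {stat}, p = {p_value:.3f}")  # a small p would support hypothesis H3
```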
RQ 5. In what ways could the process patterns be improved?
Based on the results from RQ3 and RQ4, gaps will be identified for further improvement.
3.6. Threats to Validity
This section discusses possible validity threats and ways in which the threats could be reduced.
‐ Inconsistent Effort Reporting: Reported effort may be inaccurate, so two forms of effort reporting
will be used: the classroom effort reporting system and a post-experiment questionnaire.
‐ Non-representativeness of subjects: Based on the history of the software engineering class, the
participants, with an average of not more than 2 years of industrial experience and an average 12-hour,
non-collocated work week, are not representative of software engineers in industry. However,
the clients and off-campus students are full-time working professionals.
‐ Learning Curve: The possibility of an imbalanced team and the threat of a learning curve in using
these process models can be mitigated by providing tutorials and discussion sessions to build common
foundations for all participants.
‐ Non‐representativeness of projects: Although the target projects need to conform to fixed
semester schedules, this is to some degree representative of fixed‐schedule industry projects. The
projects are small e‐services applications, but will be developed for real clients with diverse
domains, and will use the same COTS or services that are used in industry projects. Moreover, the
process guidelines and decision drivers will be cross checked with experts in the field for enterprise‐
level compatibility.
Chapter 4 Research Plan
4.1. Overview of Research Plan
Figure 7 shows the overall research timeline, starting from 2007. The first group, painted in pink,
represents the timeline of the CSCI 577 Software Engineering class. The green bars represent the work
involved in this research. The blue bar represents a related work, the use of the USC CSSE CBD
guidelines [Yang and Boehm 2007]. The red markings represent the dissertation milestones.
Figure 7. Research Timeline
4.2. Progress of the ICM Electronic Process Guide (EPG)
One of the main contributions of this research is the ICM EPG. From Fall 2005 to Spring 2008, there were
two kinds of software development process guidelines: the LeanMBASE guidelines for non-COTS-based
development projects and the USC CSSE CBA guidelines for COTS-based development projects. In
Summer 2008, the IBM Rational Method Composer was selected as the tool to implement the EPG. At
the same time, the Incremental Commitment Model was selected to replace the LeanMBASE approach.
In Fall 2008, the first version of the ICM EPG for Architected Agile teams was available for developers
to follow; NDI/NCS teams followed the USC CSSE CBA guidelines. Based on feedback on the first version
of the ICM EPG for Architected Agile from Fall 2008-Spring 2009, and on discrepancies and outdated
data observed in the USC CSSE CBA guidelines for NDI/NCS teams, the second version of the ICM EPG
for the Architected Agile process and the first version of the ICM EPG for NDI/NCS were developed in
Summer 2009. Currently, four process guidelines for the four process patterns are being used by the
development teams in the CSCI577 Software Engineering class. The progress of the ICM EPG is
summarized in Table 5.
Table 5. Status of Process Guidelines used in USC Software Engineering class
Architected Agile: Fall '05-Spring '08, used LeanMBASE guidelines; Summer '08, developed ICM EPG AA V1; Fall '08-Spring '09, deployed ICM EPG AA V1; Summer '09, revised to V2; Fall '09-Spring '10, deploying V2.
NDI-Intensive: Fall '05-Spring '09, used USC CSSE COTS-Based Application guidelines; Summer '09, developed ICM EPG NDI-Intensive V1; Fall '09-Spring '10, deploying V1.
Services-Intensive: Fall '05-Spring '09, used USC CSSE COTS-Based Application guidelines; Summer '09, developed ICM EPG Services-Intensive V1; Fall '09-Spring '10, deploying V1.
Use NDI: Fall '05-Spring '09, used USC CSSE COTS-Based Application guidelines; Summer '09, developed ICM EPG Use Single NDI V1; Fall '09-Spring '10, deploying V1.
4.3. Status of Data Collection
Currently the experiment is running in the USC CSCI577 Software Engineering class. Table 6 shows the
projects' important milestones and the key dates for data collection.
Table 6. Status of Data Collection
Fall 2009
• October 19-23: First project milestone review
• November 30 - December 4: Second project milestone review
• December 7: Follow-up questionnaire
• December 9: Client feedback, individual critique

Spring 2010
• February 10-11: Third project milestone review
• April 14-15: Fourth project milestone review
• May 7: Project deliverables
• May: Interview, client feedback, individual critique
References
Agile, Principles behind the agile manifesto, http://agilemanifesto.org/principles.html. accessed on 8/4/2009.
Amazon Payment Services, https://payments.amazon.com/sdui/sdui/index.htm
Ambler, S.W., Agile Unified Process, http://www.ambysoft.com/unifiedprocess/agileUP.html Accessed 8/4/2009
Basili, V., Boehm, B., "COTS‐Based Systems Top 10 List," Computer, Volume 34, Number 5, May, 2001, pp. 91‐93
Bhuta, J., “A Framework for Intelligent Assessment and Resolution of Commercial‐Off‐The‐Shelf Product Incompatibilities” PhD Dissertation, Department of Computer Science, University of Southern California, August 2007
Boehm B, "A Spiral Model of Software Development and Enhancement", IEEE Computer, 21(5):61‐72, May 1988
Boehm B, Anchoring the Software Process, IEEE Software, v.13 n.4, p.73‐82, July 1996
Boehm, B. and Bhuta, J., "Balancing Opportunities and Risks in Component‐Based Software Development," IEEE Software, November‐December 2008, Volume 15, Issue 6, pp. 56‐63
Boehm, B. and Lane, J., “Using the Incremental Commitment Model to Integrate Systems Acquisition, Systems Engineering, and Software Engineering,” CrossTalk, October 2007, pp. 4‐9
Boehm, B. and Lane, J., "Guide for Using the Incremental Commitment Model (ICM) for Systems Engineering of DoD Projects" USC CSSE Tech Report 2009‐500
Boehm, B., Lane, J. and Koolmanojwong, S., "A Risk‐Driven Process Decision Table to Guide System Development Rigor," Proceedings of the 19th International Conference on Software Engineering, Singapore, July 2009.
Boehm, B. and Turner, R. 2004. Balancing agility and discipline: a guide for the perplexed. Addison‐Wesley, Boston.
CMMI for COTS‐based Systems, http://www.sei.cmu.edu/publications/documents/03.reports/03tr022.html
CMMI for Services Version1.2, ftp://ftp.sei.cmu.edu/pub/documents/09.reports/09tr001.doc
CMMI for Services flyer, www.sei.cmu.edu/cmmi/models/CMMI‐for‐Services‐one‐pager‐20090320.doc
CrossTalk, The Journal of Defense Software Engineering, http://www.stsc.hill.af.mil/index.html, accessed 08/24/09
DOD 5000.02 Operation of the Defense Acquisition System [ACC] http://www.dtic.mil/whs/directives/corres/pdf/500002p.pdf accessed 09/11/09
Google Map, http://maps.google.com/
IBM Rational Method Composer http://www‐01.ibm.com/software/awdtools/rmc/
Instructional ICM‐Software Electronic Process Guide, http://greenbay.usc.edu/IICMSw/index.htm
Koolmanojwong, S., et al., "Comparative Experiences with Software Process Modeling Tools for the Incremental Commitment Model," USC CSSE Technical Report 2007‐824
Koolmanojwong, S., et al., "Incremental Commitment Model Process Guidelines for Software Engineering Class," USC CSSE Technical Report 2008‐832
Lenth, R. V., “Some practical guidelines for effective sample size determination”, American Statistician, 2001, 55, 187‐193.
Li, J., et al., “An empirical study of variations in COTS‐Based Software Development Processes in Norwegian IT industry,” Journal of Empirical Software Engineering vol. 11, no.3, 2006, pp 433‐461
Morisio, M., C. B. Seaman , A. T. Parra , V. R. Basili , S. E. Kraft , S. E. Condon, Investigating and improving a COTS‐based software development, Proceedings of the 22nd international conference on Software engineering, p.32‐41, June 04‐11, 2000, Limerick, Ireland
Ning, http://www.ning.com/, accessed 08/24/08
Pew, R. W., and Mavor, A. S. , “Human‐System Integration in the System Development Process: A New Look”. 2007, National Academy Press.
Phongpaibul, M., Koolmanojwong, S., Lam, A., and Boehm, B. "Comparative Experiences with Electronic Process Guide Generator Tools," ICSP 2007, pp. 61‐72
ProgrammableWeb http://www.programmableweb.com/mashups accessed on 11/9/2009
Rising, L., & Janoff, N. The Scrum Software Development Process for Small Teams. IEEE Software, July/August 2000
Scaffidi, C.,” Topes: Enabling End‐User Programmers to Validate and Reformat Data”, PhD Dissertation, Technical Report CMU‐ISR‐09‐105, Institute for Software Research (ISR), Carnegie Mellon University, May 2009.
USC – CSCI577 Software Engineering Class I Website, http://greenbay.usc.edu/csci577/fall2008/site/index.html accessed on 08/24/08
V‐Model Lifecycle Process Model http://www.v‐modell.iabg.de/kurzb/vm/k_vm_e.doc accessed on 10/9/2009
Yahoo Developer Network, http://developer.yahoo.com/
Yang, Y., "Composable Risk‐Driven Processes for Developing Software Systems from Commercial‐Off‐The‐Shelf (COTS) Products," PhD Dissertation, Department of Computer Science, University of Southern California, December 2006
Yang Y. and Boehm B., COTS‐Based Development Process guidelines, http://greenbay.usc.edu/csci577/spring2007/site/guidelines/CBA‐AssessmentIntensive.pdf Accessed 08/24/07
Appendix A Survey form
Figure 8. Student Background Information Survey ‐ Page 1
Figure 9. Student Background Information Survey ‐ Page 2
Figure 10. Student Background Information Survey ‐ Page 3
Figure 11. Student Background Information Survey ‐ Page 4
Appendix B Effort Category
Table 7. Effort Categories in Effort Reporting System

Operational Concept Development
• Analyze Current System
• Identify Shared Vision
• Establish New Operational Concept
• Identify System Transformation
• Identify Organizational and Operational Transformation
• Identify Objectives, Constraints and Priorities
• Assess Operational Concept
• Documenting of OCD

System and Software Requirements Development
• Set up WinWin negotiation context
• Negotiate (during meeting)
• Negotiation using WikiWinWin tool (after meeting)
• Identify win conditions
• Identify issues
• Negotiate options
• Develop Requirements Definition
• Assess requirements definition
• Documenting of SSRD
• Documenting of WinWin Negotiation Report

System and Software Architecture Development
• Analyze the Proposed System
• Define Technology-Independent Architecture
• Define Technology-Dependent Architecture
• Specify Architecture Styles, Patterns and Frameworks
• Assess System Architecture
• Documenting of SSAD

Life Cycle Planning
• Identify Milestones and Products
• Identify Responsibilities and Skills
• Identify Life Cycle Management Approach
• Estimate Project Effort and Schedule using COCOMO II
• Estimate Project Effort and Schedule using COCOTS
• Assess Life Cycle Content
• Detail Project Plan
• Record Project Progress
• Record Project Individual Effort
• Identify Development Iteration
• Assess Development Iteration
• Perform Core Capabilities Drive-Through
• Documenting of LCP
• Documenting Iteration-related documents
• Track problem and report closure

Feasibility Evidence Description
• Analyze Business Case
• Identify, assess, and manage risks
• Provide Architecture Feasibility Evidence
• Provide Process Feasibility Evidence
• Assess Feasibility Evidence
• Assess/Evaluate NDI/Services Candidate
• Check components Interoperability
• Documenting of FED

Implementation
• Explore and evaluate alternatives
• Analyze and prioritize capabilities to prototype
• Acquire NDI/Services
• Prepare development / operational environment
• Develop prototype
• Develop component
• Assess Prototype / component
• Tailor NDI/Services
• Integrate Components
• Transition the system
• Develop Transition Plan
• Develop Support Plan
• Provide Training
• Develop User Manual
• Documenting of Prototyping

Testing
• Identify Test Cases
• Identify Test Plan
• Identify Test Procedures
• Record Test Results
• Identify Regression Test Package
• Documenting Test-related documents
• Perform custom component test
• Perform NDI/service component test
• Perform integration test
• Perform acceptance test
• Perform regression test
• Perform unit test

Project Administration (currently not in ICM EPG)
• Create and maintain project website
• Interact with Clients
• Interact between team members
• Learn about Incremental Commitment Model
• Learn about Application domain
• Prepare transition site
• Manage Code configuration
• Attend ARB
• Planning and control
• Control Project Performance

Quality Management
• Gather Definitions
• Construct Traceability Matrix
• Identify Configuration Management Strategy
• Identify Quality Management Strategy
• Verify and Validate Work Products
• Assess Quality Management Strategy
• Documenting of QMP
• Documenting of SID
• Documenting of review-related document
Appendix C ICM EPG: Roles, Activities, Work products, and Delivery Process
Figure 12. Welcome Page of ICM EPG
Figure 13. List of Roles in ICM EPG
Figure 14. List of Practices in ICM EPG
Figure 15. Practice Page in ICM EPG
Figure 16. A task page in ICM EPG
Figure 17. A role and responsibilities page in ICM EPG
Figure 18. A delivery process page in ICM EPG
Figure 19. List of work products in ICM EPG
Appendix D Example of Decision driver
Figure 20. An Architected Agile team project status with the Architected Agile Decision Pattern
Figure 21. An Architected Agile team project status with the Use Single NDI Decision Pattern
Figure 22. An Architected Agile team project status with the NDI‐Intensive Decision Pattern
Figure 23. An Architected Agile team project status with the Services‐Intensive Decision Pattern
Figure 24. A Services-intensive team project status with the Services-Intensive Decision Pattern
Figure 25. A Services-intensive team project status with the Architected Agile Decision Pattern
Figure 26. A Services-intensive team project status with the Use Single NDI Decision Pattern
Figure 27. A Services-intensive team project status with the NDI-Intensive Decision Pattern
Figure 28. An NDI‐intensive team project status with the NDI‐Intensive Decision Pattern
Figure 29. An NDI‐intensive team project status with the Architected Agile Decision Pattern
Figure 30. An NDI‐intensive team project status with the Use Single NDI Decision Pattern
Figure 31. An NDI-intensive team project status with the Services-Intensive Decision Pattern
Appendix E CSCI 577 projects
Table 8. List of Projects and their process in Fall 2008 ‐ Spring 2009
P1 Master Pattern: Architected Agile
P2 Housing Application Tracking System: Architected Agile
P3 Revamping Proyecto Pastoral: Architected Agile
P4 UNO Web Tool: Architected Agile
P5 Hunter-gatherer interactive research database: Architected Agile
P6 The IGM On-Line Art Gallery: COTS-Based Development
P7 EZBay: Architected Agile
P8 AAA Petal Pushers Remote R&D: Architected Agile
P9 Information Organization System: COTS-Based Development
P10 The Roots of Inspiration web site: Architected Agile
P11 Web-based Service for TPC Foundation: COTS-Based Development
P12 Online Peer Review System for Writing Program: Architected Agile
P13 Acme Research Engine for USC-WB Archives: Architected Agile
P14 The Virtual Assistant Living and Education Program: Architected Agile
P15 Data Base for III, Inc.: COTS-Based Development
P16 Theatre Script Online Database: Architected Agile
Table 9. List of Projects and their process in Fall 2009 ‐ Spring 2010
P1 Online DB support for CSCI 511: Architected Agile
P2 SHIELDS for Family: Architected Agile
P3 Theater Stage Manager Program: Architected Agile
P4 Growing Great Online: Services-Intensive
P5 SPC Website Automation Enhancement: NDI-Intensive
P6 VALE Information Management System: Services-Intensive
P7 LANI D-Base: Architected Agile
P8 Freehelplist.org: Services-Intensive
P9 Early Medieval East Asian Timeline: Architected Agile
P10 BHCC Website Development: Architected Agile
P11 AI Client Case Management Database: Architected Agile
P12 AI Website Development: Services-Intensive
P13 Healthcare The Rightway: Services-Intensive
P14 AROHE Web Development: Architected Agile
Appendix F Client Feedback Form
Appendix F1: Client Feedback Form – CSCI577a

Project Name: __________________________________________ Team # _______

Please provide a ranking where indicated. Use a 1-5 scale where 1 is low and 5 is high. Where comments are requested, please include any descriptive statements you wish to add.

FOR THE FIRST TWO ITEMS, consult your team's Development Commitment Package documentation by clicking on the project name in the table shown on the course webpage http://greenbay.usc.edu/csci577/fall2008/site/projects/index.html

1. Operational Concept Description (especially Sections 2, Shared Vision, and 3, System Transformation): How well did the team capture your ideas of what the new system should do?
Ranking: (1-5) Comments:

2. Team Helpfulness: Did the team suggest new ideas about your project? Did the team support you in any project complexity?
Ranking: (1-5) Comments:

FOR THE NEXT ITEMS, base your responses on your interactions with the project team and their product.

3. Team Responsiveness: Was the team responsive to you and to your requirements? Did the team answer any questions you might have had? How successful was the requirements negotiation between you and your team?
Ranking: (1-5) Comment:

4. Project Results: How satisfied are you with the prototype? How well does the proposed system meet the need identified by your project?
Ranking: (1-5) Comment:

5. Project Future: Do you think this project should be carried forward for development in CS577b? Has the team adequately identified and managed the potential risks and complications associated with the project? Do you foresee difficulties in transitioning this project into the development and implementation phase?
Ranking: (1-5) Comment:

6. Team Communication: How effective was the team in communicating with you? Do you feel you had enough meetings with them? Did the team make effective use of your time and expertise? What means did you use to communicate?
Ranking: (1-5) Comment:

7. Tools: Regarding software tools students used in the class:
7a. Did the team share WinWin negotiation results with you? If so, how?
Comment:
7b. Did the team mention or describe any other software engineering tools?
Comment:

8. Your Learning: Did you gain a better understanding of software engineering and information technology by participating in this project? Please provide specifics where possible.
Ranking: (1-5) Comment:

9. Suggestions about the course processes from your (a client's) perspective.
Comment:

10. Overall Value: Did you feel your participation was worthwhile? Would you participate again? Did you find participation valuable as either a teaching or research activity?
Ranking: (1-5) Comment:

If you have any other comments that you would like to offer, please feel free. We may get back to you in the future on these for clarification.
40
Appendix F2: Client Feedback Form – CSCI577b

577B Client Evaluation
Team # _______________________ Project ______________________

The items listed below will ask for either a ranking or a yes/no response. For rankings, use a 1-5 scale where 1 is low and 5 is high. For yes/no items, please indicate your choice. Additional comments will be very helpful for evaluation and planning purposes. Please use the questions in parentheses as starting points for your comments.

Documentation
1. User Manual: Rank your level of satisfaction with the User Manual. (Is it well written? Will it reasonably answer both user and administrator questions? Did your team ask you to evaluate the User Manual? Did they incorporate your comments?)
Ranking: (1 to 5) Comments:
2. System Documents: Rank your level of satisfaction with the As Built system documents.
Ranking: (1 to 5) Comments:

Team Interaction
3. Team Responsiveness: How responsive was the team to your needs as a client? Did they address your project objectives and concerns?
Ranking: (1 to 5) Comments:
4. Team Communication: How satisfied were you with the team's communication with you? (Did they tell you what you needed to know when you needed to know it? Did they explain their work and the issues in terms you could understand?)
Ranking: (1 to 5) Comments:

System Preparation and Testing
5. How effective were the students at installing the software and providing any necessary set-up and configuration support to enable you to get started? (Had they anticipated and prepared for any set-up issues or problems?) Were the students responsive and effective in handling any configuration adjustments that might have been needed once you began using the system?
Ranking: (1 to 5) Comments:
6. Did the students help to adequately prepare you to handle ongoing support and maintenance of the system?
Yes/No: Comments:
7. Training: Did the team provide appropriate training on how to use the system? (How was this done?)
Yes/No: Comments:
8. Training Quality: (Answer only if you received training for the system.) How adequate was your system training?
Ranking: (1 to 5) Comments:
9A. Software Testing: Did the team ask you to test the final product?
Yes/No:
9B. Software Testing: If yes, did you test all of the system's features?
Yes/No:
9C. Software Testing: How much time did you spend doing system testing?
10. Software Test Results: (Answer only if you participated in software testing.) Please rate how close the functions you used came to meeting your expectations. Did they work as you expected?
Ranking: (1 to 5) Comments:

Implementation
11. Is the system up and running in full release, or in a beta release? Is any additional programming required to achieve sufficient basic functionality for public release?
Yes/No: Comments:
12. Hardware & Software: Do you need additional hardware and/or software to implement the system?
Yes/No: Comments:
13. Other implementation issues: Are there any other issues which impact whether or not the system can be implemented?
Yes/No: Comments:

Overall Value
14. Rate how successful the product is, as delivered, at achieving your objectives and desired functionality. Has the team made the right choices and trade-offs?
Ranking: (1 to 5) Comments:
15. Rate the value / anticipated benefits of the product you've received. Is it worth the time you've spent working with the team over the past 2 semesters? Will the product provide sufficient utility? Does it have long-term potential or applicability beyond the need outlined in your original proposal?
Ranking: (1 to 5) Comments:

Summary
16. Your Learning: Did you learn anything new this semester?
Yes/No: Comments:
17. Teaching: Did you feel you made a contribution to the education of graduate CSCI students at USC?
Yes/No: Comments:
18. Research: Did you feel you made a research contribution by participating in this project?
Yes/No: Comments:
19. Recommendation for Others: Would you recommend participation in future projects to your colleagues? (Are there specific projects, either your own or projects of your colleagues, which you would recommend for future courses?)
Comments:
20. Feel free to provide any other comments or suggestions for future course improvement.
Appendix G – Qualitative Interview Form
Part 1: Information about COTS/Services in your project

Section 1: General info about your project
1. Name, team:
2. What is your client's computer technology level?
[ ] Low [ ] Medium [ ] High
3. What is your client's role in this project? (check all that apply)
[ ] End user [ ] Maintainer/Administrator [ ] Policy Planner [ ] Acquirer [ ] Other:
4. What are the COTS/Services you finally selected?
5. What are the functionalities of your selected COTS/Service?
6. What were your choices?
7. What are the cost(s) of the selected C/S?
8. Does your selected C/S satisfy all the project goals?
9. If not, what are the part(s) left to develop?
10. Who introduced these C/Ss?
[ ] Client [ ] Development Team [ ] Other: ___________________________
11. If the client was not the one who introduced the C/S, what was his/her first reaction?
[ ] Oppose [ ] Hesitate [ ] Support [ ] Other: ___________________________________
12. When did you introduce your C/S?
[ ] Exploration Phase [ ] Valuation Phase [ ] Foundations Phase
13. When did you finalize your C/S?
[ ] Exploration Phase [ ] Valuation Phase [ ] Foundations Phase
14. What were the process / steps in finalizing the C/S selection?

Section 2: Criteria in selecting COTS/Services
1. What kind of information did you use to select the COTS/Services?
[ ] Core capabilities in OCD [ ] Business workflow in OCD [ ] Requirements in SSRD [ ] Use case in SSAD [ ] Current system infrastructure [ ] Proposed system infrastructure [ ] Prototype [ ] Business Case Analysis
Anything else?
3. Where did you get information about these COTS/Services?

Section 3: Activities in each phase

Section 3.1: Exploration Phase
1. Was C/S part of the project proposal?
a. If yes, did your client have a specific C/S in mind at the beginning?
b. If no, did you plan to develop everything from scratch at the beginning? Why?
2. When you read the project proposal, did you know that there is/are possible C/Ss available for this project?
3. Any problem regarding C/S in this phase?

Section 3.2: Valuation Phase
1. In this phase, did you switch to a C/S team or still follow the traditional development guidelines?
2. Did you prioritize or reprioritize your criteria?
3. Did you build any prototype?
4. If your C/S is selected, did you tailor the C/S?
5. If your C/S is selected, did you write any glue code for the C/S?

Section 3.3: Foundations Phase
1. In this phase, did you switch to a C/S team or still follow the traditional development guidelines?
2. Did you reprioritize your criteria?
3. Did you drop any criteria?
4. Did you build any prototype?
5. If your C/S is selected, did you tailor the C/S?
6. If your C/S is selected, did you write any glue code for the C/S?

Section 4: Hypothetical Situations

Part 2: EPG for COTS/Services
1. Are there any activities/tasks that you perform but that are not listed/explained in the CBA Guidelines?
2. Is there any role(s) that should be added?
3. Is there any work product(s) that should be added?
4. Is there any guideline(s) that should be added?