COMPLEX ACQUISITION REQUIREMENTS ANALYSIS: Using a Systems Engineering Approach
Col Richard M. Stuckey, USAF (Ret.), Shahram Sarkani, and Thomas A. Mazzuchi
The technology revolution over the last several decades has compounded system complexity with the integration of multispectral sensors and interactive command and control systems, making requirements development more challenging for the acquisition community. The imperative to start programs right with effective requirements is becoming more critical. Research indicates the Department of Defense lacks consistent knowledge as to which attributes would best enable more informed trade-offs. This research examines prioritized requirement attributes to account for program complexities, using the expert judgment of a diverse and experienced panel of acquisition professionals from the Air Force, Army, Navy, industry, and additional government organizations. This article provides a guide for today's acquisition leaders to establish effective and prioritized requirements for complex and unconstrained systems needed for informed trade-off decisions. The results found the key attribute for unconstrained systems is "achievable" and verified a list of seven critical attributes for complex systems.
DOI: https://doi.org/10.22594/dau.16-755.24.02
Keywords: Bradley-Terry methodology, complex systems, requirements attributes, system of systems, unconstrained systems
268 Defense ARJ, April 2017, Vol. 24 No. 2 : 266–301
Complex Acquisition Requirements Analysis http://www.dau.mil
Recent Government Accountability Office (GAO) reports outline concerns with requirements development. One study found that unachievable requirements cause program managers to trade away performance, and that informed trade-offs between cost and capability establish better defined requirements (GAO, 2015a, 2015b). In another key report, the GAO noted that the Department of Defense could benefit from ranking or prioritizing requirements based on significance (GAO, 2011).
Establishing a key list of prioritized attributes that supports requirements development enables the assessment of program requirements and increases focus on priority attributes that aid in requirements and design trade-off decisions. The focus of this research is to define and prioritize requirements attributes that support requirements development across a spectrum of system types for decision makers. Some industry and government programs are becoming more connected and complex while others are geographically dispersed yet integrated, thus creating the need for more concentrated approaches to capture prioritized requirements attributes.
The span of control of the program manager can range from low programmatic authority to highly dependent systems control. For example, the program manager for a national emergency command and control center typically has low authority to influence cost, schedule, and performance at the local, state, and tribal level, yet must enable a broader, national unconstrained systems capability. On the opposite end of the spectrum are complex, dependent systems. The F-35 Joint Strike Fighter's program manager has highly dependent control of that program, and the program is complex as DoD is building variants for the U.S. Air Force, Navy, and Marine Corps, as well as multiple foreign countries.
Complex and unconstrained systems are becoming more prevalent. Increased focus on requirements attribute development and prioritization for complex and unconstrained systems is needed to develop a full range of dynamic requirements for decision makers. In our research, we use the terms
systems, complex systems, and unconstrained systems, and their associated attributes. All of these categories are explored, developed, and expanded with prioritized attributes. The terms systems and complex systems are used in the acquisition community today. We uniquely developed a new category called unconstrained systems, and we distinctively define unconstrained and complex systems as:
Unconstrained System
A collection of component systems, simple or complex, that is managed, operated, developed, funded, maintained, and sustained independently of its overarching principal system that creates a new capability.
Complex System
A collection of large, multifaceted, and interrelated component systems that is dependent on the entirety of the principal system for management, operations, development, funding, maintenance, and sustainment. Complex systems are nondeterministic, adaptive, holistic, and have nonlinear interfaces between attributes.
We derived a common set of definitions for requirements, systems, unconstrained systems, and complex systems using an exhaustive list from government, industry, and standards organizations. Using these definitions, we then developed and expanded requirements attributes to provide a select group of attributes for the acquisition community. Lastly, experts in the field prioritized the requirements attributes by their respective importance.
We used the Bradley-Terry (Bradley & Terry, 1952) methodology, as amplified in Cooke (1991), to elicit and codify the expert judgment to validate the requirements attributes. This methodology, using a series of repeatable surveys with industry, government, and academic experts, applies expert judgment to validate and order requirements attributes, and to confirm the attributes lists are comprehensive. This approach provides an important suite of valid and prioritized requirements attributes for systems, unconstrained systems, and complex systems for acquisition and systems engineering decision makers' consideration when developing requirements and informed trade-offs.
Terms Defined and Attributes Derived

We performed a literature review from a broad base of reference material, reports, and journal articles from academia, industry, and government. Currently, a wide variety of approaches defines requirements and the various forms of systems. For this analysis, we settle on a single definition to complete our research. Using our definitions, we further derive the requirements attributes for systems, unconstrained systems, and complex systems (American National Standards Institute/Electronic Industries Alliance [ANSI/EIA], 1999; Ames et al., 2011; Butterfield, Shivananda, & Schwartz, 2009; Chairman, Joint Chiefs of Staff [CJCS], 2012; Corsello, 2008; Customs and Border Protection [CBP], 2011; Department of Defense [DoD], 2008, 2013; Department of Energy [DOE], 2002; Department of Homeland Security [DHS], 2010 [Pt. 1], 2011; Department of Transportation [DOT], 2007, 2009; Institute of Electrical and Electronics Engineers [IEEE], 1998a, 1998b; International Council on Systems Engineering [INCOSE], 2011; International Organization for Standardization/International Electrotechnical Commission [ISO/IEC], 2008; International Organization for Standardization/International Electrotechnical Commission/Institute of Electrical and Electronics Engineers [ISO/IEC/IEEE], 2011; ISO/IEC/IEEE, 2015; Joint Chiefs of Staff [JCS], 2011; JCS, 2015; Keating, Padilla, & Adams, 2008; M. Korent (e-mail communication via Tom Wissink, January 13, 2015, Advancing Complex Systems Manager, Lockheed Martin); Madni & Sievers, 2013; Maier, 1998; National Aeronautics and Space Administration [NASA], 1995, 2012, 2013; Ncube, 2011; U.S. Coast Guard [USCG], 2013).
Requirements

Literature research from government and standards organizations reveals varying definitions for system requirements. In our study, we use the IEEE's requirements definition, which provides a broad, universal, and vetted foundation that can be applied to industry, government, and academia, and also aligns with DoD definitions (IEEE, 1998a; JCS, 2015).
Requirement
1. A condition or capability needed by a user to solve a problem or achieve an objective.
2. A condition or capability that must be met or possessed by a system or system component to satisfy a contract, stan-dard, specification, or other formally imposed document.
3. A documented representation of a condition or capability as in (1) or (2).
Systems

The definitions of systems are documented by multiple government organizations at the national and state levels, and by standards organizations. Our literature review discovered at least 20 existing approaches to defining a system. For this research, we use the more detailed definition presented by IEEE (1998a); based on our research, it aligns with DoD and federal approaches.
Systems
An interdependent group of people, objects, and procedures constituted to achieve defined objectives or some operational role by performing specified functions. A complete system includes all of the associated equipment, facilities, material, computer programs, firmware, technical documentation, services, and personnel required for operations and support to the degree necessary for self-sufficient use in its intended environment.
Various authors and organizations have defined attributes to develop requirements for systems (Davis, 1993; Georgiadis, Mazzuchi, & Sarkani, 2012; INCOSE, 2011; Rettaliata, Mazzuchi, & Sarkani, 2014). Davis was one of the earliest authors to frame attributes in this manner, though his primary approach concentrated on software requirements. Subsequently, researchers have adapted and applied attributes more broadly for use with all systems, including software, hardware, and integration. In addition, Rettaliata et al. (2014) provided a wide-ranging review of attributes for materiel and nonmateriel systems.
The attributes provided in Davis (1993) consist of eight attributes for content and five attributes for format. As a result of our research with government and industry, we add a ninth and critical content attribute of ‘achievable’ and expand the existing 13 definitions for clarity. INCOSE and IEEE denote the ‘achievable’ attribute, which ensures systems are attainable to be built and operated as specified (INCOSE, 2011; ISO/IEC/IEEE, 2011). The 14 requirements attributes, with our enhanced definitions, are listed in Table 1 (Davis, 1993; INCOSE, 2011; ISO/IEC/IEEE, 2011; Rettaliata et al., 2014).
TABLE 1. SYSTEM REQUIREMENTS ATTRIBUTES

Correct (Content): Correct if and only if every requirement stated therein represents something required of the system to be built.

Unambiguous (Content): Unambiguous if and only if every requirement stated therein has only one interpretation and includes only one requirement (unique).

Complete (Content): Complete if it possesses these qualities: (1) everything it is supposed to do is included; (2) definitions of the responses of software to all situations are included; (3) all pages are numbered; (4) no sections are marked "To be determined"; (5) is necessary.

Verifiable (Content): Verifiable if and only if every requirement stated therein is verifiable.

Consistent (Content): Consistent if and only if (1) no requirement stated therein is in conflict with other preceding documents, and (2) no subset of requirements stated therein conflict.

Understandable by Customer (Content): Understandable by the customer if there exists a complete, unambiguous mapping between the formal and informal representations.

Achievable (Content): Achievable if the designer has the expertise to assess the achievability of the requirements, including subcontractors, manufacturing, and customers/users, within the constraints of the cost and schedule life cycle.

Design Independent (Content): Design independent if it does not imply a specific architecture or algorithm.

Concise (Content): Concise if, given two requirements for the same system, each exhibiting identical levels of all previously mentioned attributes, the shorter one is better.

Modifiable (Format): Modifiable if its structure and style are such that any necessary changes to the requirement can be made easily, completely, and consistently.

Traced (Format): Traced if the origin of each of its requirements is clear.

Traceable (Format): Traceable if it is written in a manner that facilitates the referencing of each individual requirement stated therein.

Annotated (Format): Annotated if there is guidance to the development organization, such as relative necessity (ranked) and relative stability.

Organized (Format): Organized if the requirements contained therein are easy to locate.
While there are many approaches to gathering requirements attributes, for our research we use these 14 attributes to encompass software, hardware, interoperability, and achievability. These attributes align with government and DoD requirements directives, instructions, and guidebooks, as well as with a recent GAO report in which the DoD Service Chiefs stress their concerns about the achievability of requirements (GAO, 2015b). We focus our research on the nine content attributes. While the five format attributes are necessary, the nine content attributes are shown to be more central to ensuring quality requirements (Rettaliata et al., 2014).
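A checklist sketch can make the content attributes concrete. The fragment below is illustrative only: the attribute names come from Table 1, but the screening heuristics, marker list, and function name are our own assumptions, not anything the article prescribes. It flags requirement statements that visibly violate two machine-checkable attributes, "complete" (no sections marked "To be determined") and "unambiguous" (one requirement per statement, no vague wording):

```python
# Illustrative screen for two machine-checkable content attributes from
# Table 1; the marker list and function name are our own assumptions.
VAGUE_MARKERS = (" and/or ", " etc.", " as appropriate")

def screen_requirement(text):
    """Return the names of content attributes the statement may violate."""
    findings = set()
    if "TBD" in text or "to be determined" in text.lower():
        findings.add("complete")        # no sections marked "To be determined"
    if any(marker in text.lower() for marker in VAGUE_MARKERS):
        findings.add("unambiguous")     # vague wording invites multiple readings
    if text.lower().count(" shall ") > 1:
        findings.add("unambiguous")     # more than one requirement per statement
    return sorted(findings)

print(screen_requirement(
    "The interface format is TBD; the system shall display tracks and/or imagery."))
# → ['complete', 'unambiguous']
```

A real assessment would rely on expert review for attributes such as correct, consistent, and achievable, which no textual heuristic can decide.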
Unconstrained Systems

The acquisition and systems engineering communities have attempted to define 'system of systems' for decades. Most definitions can be traced back to Mark W. Maier's (1998) research, which provided an early definition
and set of requirements attributes. As programs became larger with more complexities and interdependencies, the definitions of system of systems expanded and evolved.
In some programs, the program manager's governance authority can be low or independent, creating 'unconstrained systems'—a term that, while similar to system of systems, focuses attention on the challenges facing program managers with low governance authority between a principal system and its component systems. Unconstrained systems center on the relationship between the principal system and the component systems, the management and oversight of stakeholder involvement, and the program manager's level of governance between users of the principal system and the component systems. This increased focus and perspective enables greater requirements development fidelity for unconstrained systems.
An example is shown in Figure 1 where a program manager of a national command and communications program can have limited governance authority to influence independent requirements on unconstrained systems with state and local stakeholders. Unconstrained systems do not explicitly depend on a principal system. When operating collectively, the component systems create a unique capability. In comparison to the broader definition for system of systems, unconstrained systems require a more concentrated approach and detailed understanding of the independence of systems under a program manager’s purview. We uniquely derive and define unconstrained systems as:
Unconstrained Systems
A collection of component systems, simple or complex, that is managed, operated, developed, funded, maintained, and sustained independently of its overarching principal system that creates a new capability.
The requirements attributes for unconstrained systems are identical to the attributes for systems as listed in Table 1. However, a collection of unconstrained systems that is performing against a set of requirements in conjunction with each other has a different capability and focus than a singular system, set of dependent systems, or a complex system. This perspective, though it shares a common set of attributes with a singular or simple system, can develop a separate and different set of requirements unique to an unconstrained system.
[FIGURE 1. UNCONSTRAINED AND COMPLEX SYSTEMS. The figure contrasts the two system types along a governance-authority axis. Unconstrained system example: a National Operations & Communications Center principal system with independent component systems (local, state & tribal law enforcement; local & tribal fire department; local hospitals). Complex system example: the Space Shuttle principal system with dependent component systems (external tank, solid rocket boosters, orbiter) and an external focus (Congress, space station, international partners, astronauts & training).]
Complex Systems

The systems engineering communities from industry and government have long endeavored to define complex systems. Some authors describe attributes that complex systems demonstrate rather than offering a singular definition. Table 2 provides a literature review of complex system attributes.
TABLE 2. LITERATURE REVIEW OF COMPLEX SYSTEM ATTRIBUTES

Adaptive: Components adapt to changes in others, as well as to changes in personnel, funding, and application; shift from being static to dynamic systems (Chittister & Haimes, 2010; Glass et al., 2011; Svetinovic, 2013)

Aspirational: To influence design, control, and manipulate complex systems to solve problems to predict, prevent or cause, and to define decision, robustness of decision, and enabling resilience (Glass et al., 2011; Svetinovic, 2013)

Boundary Liquidity: Complex systems do not have a well-defined boundary. The boundary and boundary criteria for complex systems are dynamic and must evolve with new understanding (Glass et al., 2011; Katina & Keating, 2014)

Contextual Dominance: A complex situation can exhibit contextual issues that can stem from differing managerial worldviews and other nontechnical aspects stemming from the elicitation process (Katina & Keating, 2014)

Emergent: Complex systems may exist in an unstable environment and be subject to emergent behavioral, structural, and interpretation patterns that cannot be known in advance and lie beyond the ability of requirements to effectively capture and maintain (Katina & Keating, 2014)

Environmental: Exogenous components that affect or are affected by the engineering system; that which acts, grows, and evolves with internal and external components (Bartolomei, Hastings, de Neufville, & Rhodes, 2012; Glass et al., 2011; Hawryszkiewycz, 2009)

Functional: Range of fulfilling goals and purposes of the engineering system; ease of adding new functionality or of upgrading existing functionality; the goals and purposes of the engineering system; ability to organize connections (Bartolomei et al., 2012; Hawryszkiewycz, 2009; Jain, Chandrasekaran, Elias, & Cloutier, 2008; Konrad & Gall, 2008)

Holistic: Consider the whole of the system, the role of the observer, and the broad influence of the system on the environment (Haber & Verhaegen, 2012; Katina & Keating, 2014; Svetinovic, 2013)

Interdependencies: A number of systems are dependent on one another to produce the required results (Katina & Keating, 2014)

Multifinality: Two seemingly identical initial complex systems can have different pathways toward different end states (Katina & Keating, 2014)

Process: Processes and steps to perform tasks within the system; methodology framework to support and improve the analysis of systems; hierarchy of system requirements (Bartolomei et al., 2012; Haber & Verhaegen, 2012; Konrad & Gall, 2008; Liang, Avgeriou, He, & Xu, 2010)

Predictive: Proactively analyze requirements arising due to the implementation of the system under development and the system's interaction with the environment and other systems (Svetinovic, 2013)

Social: Social network consisting of the human components and the relationships held among them; social network essential in supporting innovation in dynamic processes; centers on groups that can assume roles with defined responsibilities (Bartolomei et al., 2012; Hawryszkiewycz, 2009; Liang et al., 2010)

Technical: Physical, nonhuman components of the system, including hardware, infrastructure, software, and information; complexity of integration technologies required to achieve system capabilities and functions (Bartolomei et al., 2012; Chittister & Haimes, 2010; Haber & Verhaegen, 2013; Jain et al., 2008)
Complex systems are large and multidimensional with interrelated dependent systems. They are challenged with dynamic, national-level, or international intricacies as social, political, environmental, and technical issues evolve (Bartolomei et al., 2012; Glass et al., 2011). Complex systems with a human-centric and nondeterministic focus are typically large national- and international-level systems or products. Noncomplex systems, or 'systems,' do not have these higher order complexities and relationships. Based on our research with federal, DoD, and industry approaches, we uniquely define a complex system as:
Complex System
A collection of large, multifaceted, and interrelated component systems that is dependent on the entirety of the principal system for management, operations, development, funding, maintenance, and sustainment. Complex systems are nondeterministic, adaptive, holistic, and have nonlinear interfaces between attributes.
It can be argued that complex and unconstrained systems have similar properties; however, for our research we consider them distinct. Complex systems differ from unconstrained systems depending on whether the component systems within the principal system are dependent or independent of the principal system. These differences are shown in Figure 1. Our example is the space shuttle, in which the components of the orbiter, external tank, and solid rocket boosters form one dependent space shuttle complex system. For complex systems, the entirety of the principal system depends on component systems. Thus, the governance and stakeholders of the component systems depend on the principal system.
Complex systems have an additional level of integration with internal and external focuses, as shown in Figure 2. Dependent systems within the inner complex systems boundary condition derive a set of requirements attributes that are typically more clear and precise. For our research, we use the attributes from systems, as shown in Table 1, to define internal requirements. Using the 'space shuttle' example, the internal requirements would focus on the dependent components of the orbiter, external tank, and solid rocket boosters.
[FIGURE 2. COMPLEX SYSTEMS INTERNAL AND EXTERNAL PERSPECTIVES. Dependent systems sit inside the complex system boundary (internal focus); the external focus comprises the adaptive, technical, interdependence, political, holistic, environmental, and social attributes surrounding the boundary.]
Complex systems have a strong external focus. As complex systems interface with their external sphere of influence, another set of requirements attributes is generated, as the outer complex boundary conditions become more qualitative than quantitative. When examining complex systems externally, the boundaries are typically indistinct and nondeterministic. Using the 'space shuttle' example, the external focus could be Congress, the space station, the interface with internationally developed space station modules and international partners, training, management relations, and standards.
Using our definition of complex systems, we distinctly derive and define seven complex system attributes as shown in Table 3. The seven attributes (holistic, social, political, adaptable, technical, interdependent, and environmental) provide a key set of attributes that aligns with federal and DoD approaches to consider when developing complex external requirements. Together, complex systems with an external focus (Table 3) and an internal focus (Table 1) provide a comprehensive and complementary context to develop a complete set of requirements for complex systems.
TABLE 3. COMPLEX SYSTEMS EXTERNAL REQUIREMENTS ATTRIBUTES

Holistic: Holistic considers the following:
• Security and surety; scalability and openness; and legacy systems
• Timing of schedules and budgets
• Reliability, availability, and maintainability
• Business and competition strategies
• Role of the observer, the nature of systems requirements, and the influence of the system environment (Katina & Keating, 2014)

Social: Social considers the following:
• Local, state, national, tribal, and international stakeholders
• Demographics and culture of consumers; culture of the developing organization (Nescolarde-Selva & Uso-Domenech, 2012, 2013)
• Subcontractors, production, manufacturing, logistics, and maintenance stakeholders
• Human resources for program and systems integration (Jain et al., 2008)
• Social network consisting of the human components and the relationships held among them (Bartolomei et al., 2012)
• Customer and social expectations and customer interfaces (Konrad & Gall, 2008)
• Uncertainty of stakeholders (Liang et al., 2010)
• Use of Web 2.0 tools and technologies (e.g., wikis, folksonomies, and ontologies) (Liang et al., 2010)
• Knowledge workers' ability to quickly change work connections (Hawryszkiewycz, 2009)

Political: Political considers the following:
• Local, state, national, tribal, and international political circumstances and interests
• Congressional circumstances and interests, including public law and funding
• Company, partner, and subcontractor political circumstances and interests
• Intellectual property rights, proprietary information, and patents

Adaptable: Adaptability considers the following:
• Shifts from static to being adaptive in nature (Svetinovic, 2013)
• System's behavior changes over time in response to external stimulus (Ames et al., 2011)
• Components adapt to changes in other components, as well as changes in personnel, funding, and application (Glass et al., 2011)

Technical: Technical considers the following:
• Technical readiness and maturity levels
• Risk and safety
• Modeling and simulation
• Spectrum and frequency
• Technical innovations (Glass et al., 2011)
• Physical, nonhuman components of the system, including hardware, software, and information (Bartolomei et al., 2012; Nescolarde-Selva & Uso-Domenech, 2012, 2013)

Interdependent: Interdependencies consider the following:
• System and system components' schedules for developing components and legacy components
• Product and production life cycles
• Management of organizational relationships
• Funding integration from system component sources
• The degree of complication of a system or system component, determined by such factors as the number and intricacy of interfaces, number and intricacy of conditional branches, the degree of nesting, and types of data structures (Jain et al., 2008)
• The integration of data transfers across multiple zones of systems and network integration (Hooper, 2009)
• Ability to organize connections and integration between system units and ability to support changed connections (Hawryszkiewycz, 2009)
• Connections between internal and external people, projects, and functions (Glass et al., 2011)

Environmental: Environmental considers the following:
• Physical environment (e.g., wildlife, clean water protection)
• Running a distributed environment by distributed teams and stakeholders (Liang et al., 2010)
• Supporting integration of platforms for modeling, simulation, analysis, education, training, and collaboration (Glass et al., 2011)
Methodology

We use a group of experts with over 25 years of experience to validate our derived requirements attributes, using the expert judgment methodology originally defined in Bradley and Terry (1952) and later refined in Cooke (1991). We designed a repeatable survey that mitigated expert bias
using the pairwise comparison technique. This approach elicits and combines experts' judgment and beliefs regarding the strength of requirements attributes.
Expert Judgment

Expert judgment has been used for decades to support and solve complex technical problems. Commonly, expert judgment is used when substantial scientific uncertainty has an impact on a decision process (Cooke & Goossens, 2008). Thus, expert judgment allows researchers and communities of interest to reach rational consensus when there is scientific knowledge or process uncertainty (Cooke & Goossens, 2004). In addition, it is used to assess outcomes of a given problem by a group of experts within a field of research who have the requisite breadth of knowledge, depth of multiple experiences, and perspective. Accordingly, this research uses multiple experts from a broad range of backgrounds with in-depth experience in their respective fields to provide a diverse set of views and judgments.
Expert judgment has been adopted for numerous competencies to address contemporary issues such as nuclear applications, the chemical and gas industry, water pollution, seismic risk, environmental risk, snow avalanches, corrosion in gas pipelines, aerospace, banking, information security risks, aircraft wiring risk assessments, and maintenance optimization (Clemen & Winkler, 1999; Cooke & Goossens, 2004, 2008; Goossens & Cooke, n.d.; Lin & Chih-Hsing, 2008; Lin & Lu, 2012; Mazzuchi, Linzey, & Bruning, 2008; Ryan, Mazzuchi, Ryan, Lopez de la Cruz, & Cooke, 2012; van Noortwijk, Dekker, Cooke, & Mazzuchi, 1992; Winkler, 1986). Various methods are employed when applying expert judgment. Our methodology develops a survey for our group of experts to complete in private and allows them to comment openly on any of their concerns.
Bradley-Terry Methodology

We selected the Bradley-Terry expert judgment methodology (Bradley & Terry, 1952) because it uses a proven method for pairwise comparisons to capture data via a survey from experts, and uses it to rank the selected
requirements attributes by their respective importance. In addition to allowing pairwise comparisons of factors by multiple experts, which provides a relative ranking of factors, this methodology provides a statistical means for assessing the adequacy of individual expert responses, the agreement of experts as a group, and the appropriateness of the Bradley-Terry model.
The appropriateness of experts' responses is determined by their number of circular triads. Circular triads, C(e), as shown in Equation (1), occur when an expert (e) ranks objects in a circular fashion, such as A(1) > A(2), A(2) > A(3), and A(3) > A(1) (Bradley & Terry, 1952; Mazzuchi et al., 2008).

C(e) = t(t² − 1)/24 − (1/2) ∑i=1,…,t [a(i,e) − (t − 1)/2]²   (1)

The defined variables for the set of equations are:

e = expert
t = number of objects
n = number of experts
A(1) … A(t) = objects to be compared
a(i,e) = number of times expert e prefers A(i)
a(i,j) = number of experts who prefer A(i) to A(j)
R(i,e) = the rank of A(i) from expert e
V(i) = true values of the objects
V(i,e) = internal value of expert e for object i

The random variable C(e) defined in Equation (1) counts the circular triads an expert produces. Under the hypothesis that an expert answers randomly, the transformation of C(e) shown in Equation (2) has a distribution approximated by a chi-squared distribution with t(t − 1)(t − 2)/(t − 4)² degrees of freedom; this test can be applied to each expert against the alternative hypothesis that a definite preference was followed. Experts for whom the hypothesis of random answering cannot be rejected at the 5 percent significance level are eliminated from the study.

Č(e) = [8/(t − 4)] [(1/4) C(t,3) − C(e) + 1/2]   (2)

where C(t,3) denotes the binomial coefficient "t choose 3."

The coefficient of agreement, U, a measure of consistency of rankings from expert to expert (Bradley & Terry, 1952; Cooke, 1991; Mazzuchi et al., 2008), is defined in Equation (3).

U = 2 [∑i=1,…,t ∑j≠i C(a(i,j), 2)] / [C(n,2) C(t,2)] − 1   (3)
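To make Equations (1) and (2) concrete, the sketch below computes an expert's circular triad count and the associated randomness statistic. This is our own Python illustration, not the Unibalance-11 software the study used, and the function names are ours.

```python
import math

def circular_triads(prefs):
    """Eq (1): count circular triads C(e) for one expert.

    prefs is a t x t matrix with prefs[i][j] = 1 if the expert
    preferred object A(i) to A(j), else 0 (diagonal ignored).
    """
    t = len(prefs)
    # a(i,e): number of times this expert prefers A(i)
    a = [sum(prefs[i][j] for j in range(t) if j != i) for i in range(t)]
    return t * (t**2 - 1) / 24 - 0.5 * sum((x - (t - 1) / 2) ** 2 for x in a)

def randomness_statistic(c, t):
    """Eq (2): statistic compared against a chi-squared distribution with
    t(t-1)(t-2)/(t-4)^2 degrees of freedom to test random answering."""
    stat = (8 / (t - 4)) * (0.25 * math.comb(t, 3) - c + 0.5)
    df = t * (t - 1) * (t - 2) / (t - 4) ** 2
    return stat, df

# A perfectly transitive expert over the nine system attributes
# (A0 > A1 > ... > A8) produces zero circular triads.
t = 9
transitive = [[1 if i < j else 0 for j in range(t)] for i in range(t)]
print(circular_triads(transitive))  # -> 0.0
```

A large circular-triad count drives the Equation (2) statistic toward values consistent with chance, which is the basis for eliminating an expert.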
When the experts agree 100 percent, U attains its maximum of 1. The statistic Ǔ, shown in Equation (4), tests the hypothesis that all agreement among the experts is due to chance (Cooke, 1991; Mazzuchi et al., 2008); it has an approximate chi-squared distribution with C(t,2) n(n − 1)/(n − 2)² degrees of freedom.

Ǔ = [4/(n − 2)] {∑i=1,…,t ∑j≠i C(a(i,j), 2) − (1/2) C(t,2) C(n,2) [(n − 3)/(n − 2)]}   (4)

The sum of the ranks, R(i), is given by:

R(i) = ∑e R(i,e)   (5)

The Bradley-Terry methodology uses true scale values, V(i), to determine rankings, and these are solved for iteratively (Cooke, 1991; Mazzuchi et al., 2008). Additionally, Bradley and Terry (1952) and Cooke (1991) define the factor, F, for the goodness of fit of the model, as shown in Equation (6). To determine whether the model is appropriate (Cooke, 1991; Mazzuchi et al., 2008), a null hypothesis test is used; F approximates a chi-squared distribution with (t − 1)(t − 2)/2 degrees of freedom.

F = 2{∑i=1,…,t ∑j≠i a(i,j) ln R(i,j) − ∑i a(i) ln V(i) + ∑i ∑j>i n ln(V(i) + V(j))}   (6)

where R(i,j) = a(i,j)/n is the observed fraction of experts preferring A(i) to A(j), and a(i) = ∑j≠i a(i,j).
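As a concrete illustration of the iterative solution and of Equation (6), the sketch below fits the scale values V(i) with the classic Bradley-Terry fixed-point update and then evaluates the goodness-of-fit factor. This is our own sketch, not the study's Unibalance-11 implementation, and the function names are illustrative.

```python
import math

def fit_bradley_terry(a, n, iters=500):
    """Iteratively solve for the Bradley-Terry scale values V(i).

    a[i][j] = number of the n experts who preferred A(i) to A(j).
    Uses the standard fixed-point update
        V(i) <- a(i) / sum_{j != i} n / (V(i) + V(j)),
    renormalized each pass so the values sum to 1.
    """
    t = len(a)
    wins = [sum(a[i]) for i in range(t)]  # a(i): total preferences for A(i)
    v = [1.0 / t] * t
    for _ in range(iters):
        v = [wins[i] / sum(n / (v[i] + v[j]) for j in range(t) if j != i)
             for i in range(t)]
        total = sum(v)
        v = [x / total for x in v]
    return v

def goodness_of_fit(a, n, v):
    """Eq (6): F, compared against a chi-squared distribution with
    (t-1)(t-2)/2 degrees of freedom; R(i,j) = a(i,j)/n."""
    t = len(a)
    term1 = sum(a[i][j] * math.log(a[i][j] / n)
                for i in range(t) for j in range(t) if j != i and a[i][j] > 0)
    term2 = sum(sum(a[i]) * math.log(v[i]) for i in range(t))
    term3 = sum(n * math.log(v[i] + v[j])
                for i in range(t) for j in range(i + 1, t))
    return 2 * (term1 - term2 + term3)

# Two objects, 10 experts, 8 of whom prefer A(0): the fitted values
# recover the observed preference proportions.
v = fit_bradley_terry([[0, 8], [2, 0]], n=10)
print([round(x, 3) for x in v])  # -> [0.8, 0.2]
```

With only two objects the model is saturated, so F is zero; with more objects, a large F signals that a single scale value per object cannot reproduce the observed preference proportions.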
Analysis

Survey participants were selected for their backgrounds in acquisition, academia, operations, and logistics. For purposes of this study, each expert (except one) met the minimum threshold of 25 years of combined experience
and training in their respective fields to qualify as an expert. Twenty-five years was the target selected for experts to have the experience, perspective, and knowledge to be accepted as an expert by the acquisition community at large, and to validate the requirements attributes.
Survey Design

The survey contained four sections with 109 data fields. It was designed to elicit impartial and repeatable expert judgment using the Bradley-Terry methodology to capture pairwise comparisons of requirements attributes. In addition to providing definitions of terms and requirements attributes, the survey used a sequence randomizer to present the pairwise comparisons in random order for each respondent, ensuring unbiased and impartial results. The survey and all required documentation were submitted to, and subsequently approved by, the Institutional Review Board in the Office of Human Research at The George Washington University.
Participant Demographic Data

A total of 28 surveys were received from senior personnel in government and industry and used to perform the statistical analysis. Of the experts responding, the average experience level was 33.9 years. Government participants and industry participants each comprise 50 percent of the respondents. Table 4 shows a breakout of experience skill sets from survey participants, with an average of 10.8 years of systems engineering and requirements experience. Participants show a diverse grouping of backgrounds. The government participants represent the Army, Navy, and Air Force; multiple headquarters organizations within the DoD; multiple organizations within the DHS; NASA; and Federally Funded Research and Development Centers. The industry participants represent the aerospace, energy, information technology, security, and defense sectors, and have experience in production, congressional staff work, and small entrepreneurial product companies. We do not note any inconsistencies within the demographic data. Thus, the demographic data verify a senior, experienced, and well-educated set of surveyed experts.
TABLE 4. EXPERTS’ EXPERIENCE (YEARS)
                                      Average   Minimum   Maximum
Overall                                 34.2      13        48
Subcategories
  Program Management                     9.8       3        30
  Systems Engineering / Requirements    10.8       1        36
  Operations                             7.7       2        26
  Logistics                              6.1       1        15
  Academic                               6.7       1        27
  Test and Evaluation                   19.5      10        36
  Science & Technology                   8.3       4        15
  Aerospace Marketing                    4.0       4         4
  Software Development                  10.0      10        10
  Congressional Staff                    5.0       5         5
  Contracting                           13.0      13        13
  System Concepts                        8.0       8         8
  Policy                                 4.0       4         4
  Resource Allocation                    3.0       3         3
  Quality Assurance                      3.0       3         3
Interpretation and Results

Requirements attribute data were collected for systems, unconstrained systems, and complex systems. When evaluating p-values, we consider data from individual experts to be independent between sections; the p-value determines whether an expert is kept or removed from further analysis in the systems, unconstrained systems, and complex systems sections. Using Equation (2), we test each expert at the 5 percent significance level against the null hypothesis of random answers. After removing the individual experts for whom random answering could not be ruled out, we apply the statistic shown in Equation (4) to determine whether group expert agreement is due to chance, at the 5 percent level of significance. A goodness-of-fit test, as defined in Equation (6), is then performed on each overall combined set of expert data to confirm that the Bradley-Terry model is representative of the data set; the null hypothesis is retained at the 5 percent level of significance. After completing this analysis, we capture and analyze data for the overall set of combined experts. We then perform additional analysis by dividing the experts into two subsets with backgrounds in government and industry.
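The group-level screening step can be sketched as follows. This is our own illustration of the coefficient of agreement (Equation (3)) and the chance-agreement test (Equation (4)), not the study's Unibalance-11 tooling, and the unanimous three-object example is hypothetical.

```python
import math

def agreement_coefficient(a, n):
    """Eq (3): U = 2 * sum_{i != j} C(a(i,j), 2) / (C(n,2) C(t,2)) - 1,
    where a[i][j] = number of the n experts preferring A(i) to A(j).
    U attains its maximum of 1 when all experts agree on every pair."""
    t = len(a)
    s = sum(math.comb(a[i][j], 2)
            for i in range(t) for j in range(t) if j != i)
    return 2 * s / (math.comb(n, 2) * math.comb(t, 2)) - 1

def chance_agreement_statistic(a, n):
    """Eq (4): statistic compared against a chi-squared distribution with
    C(t,2) * n(n-1)/(n-2)^2 degrees of freedom, under the hypothesis
    that the observed agreement is due to chance."""
    t = len(a)
    s = sum(math.comb(a[i][j], 2)
            for i in range(t) for j in range(t) if j != i)
    stat = (4 / (n - 2)) * (s - 0.5 * math.comb(t, 2) * math.comb(n, 2)
                            * (n - 3) / (n - 2))
    df = math.comb(t, 2) * n * (n - 1) / (n - 2) ** 2
    return stat, df

# Three objects, 5 experts in unanimous agreement: U attains its maximum.
unanimous = [[0, 5, 5], [0, 0, 5], [0, 0, 0]]
print(agreement_coefficient(unanimous, 5))  # -> 1.0
```

In the screening sequence, a statistic large relative to the chi-squared critical value rejects the chance-agreement hypothesis, so the group ranking can be taken as meaningful.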
While it can be reasoned that all attributes are important to developing sound requirements, we contend that prioritizing requirements attributes helps focus attention and awareness on requirements development and informed design trade-off decisions. The data show the ranking of attributes for each category. The GAO reports recommend ranking or prioritizing requirements for decision makers to use in trade-offs (GAO, 2011, 2015a, 2015b). The data in all categories show natural breaks in the requirements attribute rankings, which requirements and acquisition professionals can use to prioritize their concentration during requirements development.
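The use of a natural break to prioritize can be sketched simply; the attribute values below are hypothetical placeholders for illustration, not the study's fitted values.

```python
def tier_attributes(values, break_at=0.15):
    """Split attributes into top and lower tiers at a natural break.

    values: dict mapping attribute name -> Bradley-Terry scale value
    (normalized to sum to 1). break_at is the observed break point.
    """
    top = {k: v for k, v in values.items() if v >= break_at}
    lower = {k: v for k, v in values.items() if v < break_at}
    return top, lower

# Hypothetical illustration only (not the study's fitted values)
sample = {"achievable": 0.25, "correct": 0.19, "verifiable": 0.16,
          "complete": 0.10, "concise": 0.04}
top, lower = tier_attributes(sample)
print(sorted(top))  # -> ['achievable', 'correct', 'verifiable']
```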
Systems requirements attribute analysis. The combined expert data and the subsets of government and industry experts, with the associated 90 percent confidence intervals, are shown in Figures 3 and 4. They show the values of the nine attributes, which provide their ranking.
FIGURE 3. SYSTEM REQUIREMENTS ATTRIBUTE RANKINGS OF ALL EXPERTS WITH 90% CONFIDENCE INTERVALS
[Bar chart: Value (Ranking) for the nine systems requirements attributes (Achievable, Correct, Unambiguous, Complete, Verifiable, Consistent, Understandable by Customer, Design Independent, Concise); All Experts (n = 25).]
FIGURE 4. SYSTEM REQUIREMENTS ATTRIBUTE RANKINGS OF GOVERNMENT AND INDUSTRY EXPERTS WITH 90%
CONFIDENCE INTERVALS
[Bar chart: Value (Ranking) for the nine systems requirements attributes; Government Experts (n = 12) and Industry Experts (n = 13).]
Overall, the systems requirements attribute values show that the top-tier attributes are achievable and correct, while the bottom-tier attributes are design-independent and concise. This analysis is consistent between the government and industry subsets of experts, as shown in Figure 4.

The 90 percent confidence intervals of all experts and subgroups overlap, which corroborates the data and reinforces the validity of the attribute groupings. This consistency holds for both industry experts and government experts. From Figure 4, the middle-tier attributes from government experts are more evenly assessed, with values between 0.0912 and 0.1617. Industry experts, along with the combination of all experts, show a noticeable break in attribute values at 0.1500, which establishes the top grouping of systems requirements attributes as achievable, correct, and verifiable.
Unconstrained requirements attribute analysis. The overall expert data, along with subgroups for government and industry experts with the associated 90 percent confidence intervals for unconstrained systems, are
shown in Figures 5 and 6. This section has the strongest model goodness-of-fit data, with the null hypothesis, as defined in Equation (6), retained at less than a 1 percent level of significance.
FIGURE 5. UNCONSTRAINED SYSTEMS REQUIREMENTS ATTRIBUTE RANKINGS FOR ALL EXPERTS WITH 90% CONFIDENCE INTERVALS
[Bar chart: Value (Ranking) for the nine unconstrained systems requirements attributes; All Experts (n = 25).]
FIGURE 6. UNCONSTRAINED SYSTEMS REQUIREMENTS ATTRIBUTE RANKINGS OF GOVERNMENT AND INDUSTRY EXPERTS WITH 90%
CONFIDENCE INTERVALS
[Bar chart: Value (Ranking) for the nine unconstrained systems requirements attributes; Government Experts (n = 13) and Industry Experts (n = 12).]
As indicated in Figure 5, the overall top-tier requirements attributes are achievable and correct. These data correlate with the government and industry expert subgroups in Figure 6. The 90 percent confidence intervals of all experts and subgroups overlap, which validates the consistency of the attribute groupings between all experts and subgroups. The bottom-tier attributes are design-independent and concise, and are consistent across all analysis categories. The middle tier (unambiguous, complete, verifiable, consistent, and understandable by the customer) is closely grouped across all subcategories. Overall, the top tier of attributes by all experts remains achievable, with a value of 0.2460, and correct, with a value of 0.1862. There is a clear break in attribute values at the 0.1500 level.
Complex requirements attribute analysis. The combined values for complex systems by all experts and subgroups are shown in Figures 7 and 8 with 90 percent confidence intervals, and provide the values of the seven attributes.
FIGURE 7. COMPLEX SYSTEMS REQUIREMENTS ATTRIBUTE RANKINGS FOR ALL EXPERTS WITH 90% CONFIDENCE INTERVALS
[Bar chart: Value (Ranking) for the seven complex systems requirements attributes (Technical, Interdependent, Holistic, Adaptable, Political, Social, Environmental); All Experts (n = 25).]
FIGURE 8. COMPLEX SYSTEMS REQUIREMENTS ATTRIBUTES FOR GOVERNMENT AND INDUSTRY EXPERTS WITH 90%
CONFIDENCE INTERVALS
[Bar chart: Value (Ranking) for the seven complex systems requirements attributes; Government Experts (n = 13) and Industry Experts (n = 12).]
The 90 percent confidence intervals of all experts and subgroups overlap, confirming the consistency of the data and strengthening the validity of all rankings between expert groups. Data analysis, as shown in Figure 7, shows a group of four top requirements attributes for complex systems: technical, interdependent, holistic, and adaptable. These top four attributes track with the subsets of government and industry experts, as shown in Figure 8. In addition, these top groupings of attributes are all within the 90 percent confidence interval of one another; however, the attribute values within these groupings differ.
Data conclusions. The data from Figures 3–8 show consistent agreement between government, industry, and all experts. Figure 9 shows the combined values with a 90 percent confidence interval for all 28 experts across systems, unconstrained systems, and complex systems. Between systems and unconstrained systems, the experts' rankings are similar, though the values differ. The achievable attribute for systems and unconstrained systems has the highest value in the top tier of attribute groups.
FIGURE 9. COMPARISON OF REQUIREMENTS ATTRIBUTES ACROSS SYSTEMS, UNCONSTRAINED SYSTEMS, AND COMPLEX SYSTEMS
WITH 90% CONFIDENCE INTERVALS
[Bar chart: Value (Ranking) comparing the systems and unconstrained systems requirements attributes (Achievable, Correct, Unambiguous, Complete, Verifiable, Consistent, Understandable by Customer, Design Independent, Concise) and the complex systems requirements attributes (Technical, Interdependent, Holistic, Adaptable, Political, Social, Environmental) across the three system categories.]
Our literature research revealed this specific attribute—achievable—to be a critical attribute for systems and unconstrained systems. Moreover, experts further validate this result in the survey open response sections. Experts state, “Achievability is the top priority” and “You ultimately have to achieve the system so that you have something to verify.” Additionally, experts had the opportunity to comment on the completeness of our requirements attributes in the survey. No additional suggestions were submitted, which further confirms the completeness and focus of the attribute groupings.
While many factors influence requirements and programs, these data show the ability of management and engineering to plan, execute, and make programs achievable within their cost and schedule life cycle is a top priority, regardless of whether the systems are simple or unconstrained. For complex systems, experts clearly value technical, interdependent, holistic, and adaptable as their top priorities. These four attributes are critical to create achievable, successful programs across very large programs with multiple
interfaces. Finally, across all systems types, the requirements attributes provide a validated and comprehensive approach to develop prioritized, effective, and accurate requirements.
Conclusions, Limitations, and Future Work

With acquisition programs becoming more geographically dispersed yet tightly integrated, the challenge to capture complex and unconstrained systems requirements early in the system life cycle is crucial for program success. This study examined previous requirements attributes research and expanded approaches for the acquisition community's consideration when developing a key set of requirements attributes. Our research captured a broad range of definitions for key requirements development terms, refined the definitions for clarity, and subsequently derived vital requirements attributes for systems, unconstrained systems, and complex systems. Using a diverse set of experts, it provided a validated and prioritized set of requirements attributes.
These validated and ranked attributes provide an important foundation and a significant step forward for the acquisition community's use of a prioritized set of attributes for decision makers. This research provides valid requirements attributes for unconstrained and complex systems as new, focused approaches for developing sound requirements that can be used in making requirements and design trade-off decisions. It provides a compelling rationale and an improved approach for the acquisition community to channel and tailor their focus and diligence, and thereby generate accurate, prioritized, and effective requirements.
Our research was successful in validating attributes for the acquisition community; however, there are additional areas in which to continue this research. The Unibalance-11 software, which is used to determine the statistical information for pairwise comparison data, does not accommodate weighting factors for requirements attributes or experts; therefore, this analysis considers all attributes and experts equally. Future research could expand this approach to allow for weighting of key inputs, such as attributes and experts, to provide greater fidelity and to determine the effect of weighting on attribute rankings. A key finding in this research is the importance of the achievable attribute, and we recommend additional research to further define and characterize this vital attribute. We acknowledge that complex systems, their definitions, and their linkages to other factors are embryonic concepts in the systems engineering, program management, and operational communities. As a result, we recommend further exploration of developing complex systems requirements.
ReferencesAmes, A. L., Glass, R. J., Brown, T. J., Linebarger, J. M., Beyeler, W. E., Finley, P. D., &
Moore, T. W. (2011). Complex Adaptive Systems of Systems (CASoS) engineering framework (Version 1.0). Albuquerque NM: Sandia National Laboratories.
ANSI/EIA. (1999). Processes for engineering a system (Report No. ANSI/EIA-632-1998). Arlington, VA: Author.
Bartolomei, J. E., Hastings, D. E., de Neufville, R., & Rhodes, D. H. (2012). Engineering systems multiple-domain matrix: An organizing framework for modeling large-scale complex systems. Systems Engineering, 15(1), 41–61.
Bradley, R. A., & Terry, M. E. (1952). Rank analysis of incomplete block designs I: The method of paired comparisons. Biometrika, 39(3-4), 324–345.
Butterfield, M. L., Shivananda, A., & Schwarz, D. (2009). The Boeing system of systems engineering (SOSE) process and its use in developing legacy-based net-centric systems of systems. Proceedings of National Defense Industrial Association (NDIA) 12th Annual Systems Engineering Conference (pp. 1–20), San Diego, CA.
CBP. (2011). Office of Technology Innovation and Acquisition requirements handbook. Washington, DC: Author.
Chittister, C., & Haimes, Y. Y. (2010). Harmonizing High Performance Computing (HPC) with large-scale complex systems in computational science and engineering. Systems Engineering, 13(1), 47–57.
CJCS. (2012). Joint capabilities integration and development system (CJCSI 3170). Washington, DC: Author.
Clemen, R. T., & Winkler, R. L. (1999). Combining probability distributions from experts in risk analysis. Risk Analysis, 19(2), 187–203.
Cooke, R. M. (1991). Experts in uncertainty: Opinion and subjective probability in science. New York, NY: Oxford University Press.
Cooke, R. M., & Goossens, L. H. J. (2004, September). Expert judgment elicitation for risk assessments of critical infrastructures. Journal of Risk, 7(6), 643–656.
Cooke, R. M., & Goossens, L. H. J. (2008). TU Delft expert judgment data base. Reliability Engineering and System Safety, 93(5), 657–674.
Corsello, M. A. (2008). System-of-systems architectural considerations for complex environments and evolving requirements. IEEE Systems Journal, 2(3), 312–320.
Davis, A. M. (1993). Software requirements: Objects, functions, and states. Upper Saddle River, NJ: Prentice-Hall PTR.
DHS. (2010). DHS Systems Engineering Life Cycle (SELC). Washington, DC: Author.
DHS. (2011). Acquisition management instruction/guidebook (DHS Instruction Manual 102-01-001). Washington, DC: DHS Under Secretary for Management.
DoD. (2008). Systems engineering guide for systems of systems. Washington, DC: Office of the Under Secretary of Defense (Acquisition, Technology and Logistics), Systems and Software Engineering.
DoD. (2013). Defense acquisition guidebook. Washington, DC: Office of the Under Secretary of Defense (Acquisition, Technology and Logistics).
DOE. (2002). Systems engineering methodology (Version 3). Washington, DC: Author.
DOT. (2007). Systems engineering for intelligent transportation systems (Version 2.0). Washington, DC: Federal Highway Administration.
DOT. (2009). Systems engineering guidebook for intelligent transportation systems (Version 3.0). Washington, DC: Federal Highway Administration.
GAO. (2011). DoD weapon systems: Missed trade-off opportunities during requirements reviews (Report No. GAO-11-502). Washington, DC: Author.
GAO. (2015a). Defense acquisitions: Joint action needed by DoD and Congress to improve outcomes (Report No. GAO-16-187T). Testimony Before the Committee on Armed Services, U.S. House of Representatives (testimony of Paul L. Francis). Washington, DC: Author.
GAO. (2015b). Defense acquisition process: Military service chiefs’ concerns reflect need to better define requirements before programs start (Report No. GAO-15-469). Washington, DC: Author.
Georgiadis, D. R., Mazzuchi, T. A., & Sarkani, S. (2012). Using multi criteria decision making in analysis of alternatives for selection of enabling technology. Systems Engineering. Wiley Online Library. doi: 10.1002/sys.21233
Glass, R. J., Ames, A. L., Brown, T. J., Maffitt, S. L., Beyeler, W. E., Finley, P. D., … Zagonel, A. A. (2011). Complex Adaptive Systems of Systems (CASoS) engineering: Mapping aspirations to problem solutions. Albuquerque, NM: Sandia National Laboratories.
Goossens, L. H. J., & Cooke, R. M. (n.d.). Expert judgement—Calibration and combination (Unpublished manuscript). Delft University of Technology, Delft, The Netherlands.
Haber, A., & Verhaegen, M. (2013). Moving horizon estimation for large-scale interconnected systems. IEEE Transactions on Automatic Control, 58(11), 2834–2847.
Hawryszkiewycz, I. (2009). Workspace requirements for complex adaptive systems. Proceedings of the IEEE 2009 International Symposium on Collaborative Technology and Systems (pp. 342–347), May 18-22, Baltimore, MD. doi: 10.1109/CTS.2009.5067499
Hooper, E. (2009). Intelligent strategies for secure complex systems integration and design, effective risk management and privacy. Proceedings of the 3rd Annual IEEE International Systems Conference (pp. 1–5), March 23–26, Vancouver, Canada.
IEEE. (1998a). Guide for developing system requirements specifications. New York, NY: Author.
IEEE. (1998b). IEEE recommended practice for software requirements specifications. New York, NY: Author.
INCOSE. (2011). Systems engineering handbook: A guide for system life cycle processes and activities. San Diego, CA: Author.
ISO/IEC. (2008). Systems and software engineering—Software life cycle processes (Report No. ISO/IEC 12207). Geneva, Switzerland: ISO/IEC Joint Technical Committee.
ISO/IEC/IEEE. (2011). Systems and software engineering—Life cycle processes—Requirements engineering (Report No. ISO/IEC/IEEE 29148). New York, NY: Author.
ISO/IEC/IEEE. (2015). Systems and software engineering—System life cycle processes (Report No. ISO/IEC/IEEE 15288). New York, NY: Author.
Jain, R., Chandrasekaran, A., Elias, G., & Cloutier, R. (2008). Exploring the impact of systems architecture and systems requirements on systems integration complexity. IEEE Systems Journal, 2(2), 209–223.
JCS. (2011). Joint operations (Joint Publication [JP] 3.0). Washington, DC: Author.
JCS. (2015). Department of Defense dictionary of military and associated terms (JP 1-02). Washington, DC: Author.
Katina, P. F., & Keating, C. B. (2014). System requirements engineering in complex situations. Requirements Engineering, 19(1), 45–62.
Keating, C. B., Padilla, J. A., & Adams, K. (2008). System of systems engineering requirements: Challenges and guidelines. Engineering Management Journal, 20(4), 24–31.
Konrad, S., & Gall, M. (2008). Requirements engineering in the development of large-scale systems. Proceedings of the 16th IEEE International Requirements Engineering Conference (pp. 217–221), September 8–12, Barcelona-Catalunya, Spain.
Liang, P., Avgeriou, P., He, K., & Xu, L. (2010). From collective knowledge to intelligence: Pre-requirements analysis of large and complex systems. Proceedings of the 2010 International Conference on Software Engineering (pp. 26-30), May 2-8, Capetown, South Africa.
Lin, S. W., & Chih-Hsing, C. (2008). Can Cooke’s model sift out better experts and produce well-calibrated aggregated probabilities? Proceedings of 2008 IEEE International Conference on Industrial Engineering and Engineering Management (pp. 425–429).
Lin, S. W., & Lu, M. T. (2012). Characterizing disagreement and inconsistency in experts' judgment in the analytic hierarchy process. Management Decision, 50(7), 1252–1265.
Madni, A. M., & Sievers, M. (2013). System of systems integration: Key considerations and challenges. Systems Engineering, 17(3), 330–346.
Maier, M. W. (1998). Architecting principles for systems-of-systems. Systems Engineering, 1(4), 267–284.
Mazzuchi, T. A., Linzey, W. G., & Bruning, A. (2008). A paired comparison experiment for gathering expert judgment for an aircraft wiring risk assessment. Reliability Engineering & System Safety, 93(5), 722–731.
Meyer, M. A., & Booker, J. M. (1991). Eliciting and analyzing expert judgment: A practical guide. London: Academic Press Limited.
NASA. (1995). NASA systems engineering handbook. Washington, DC: Author.
NASA. (2012). NASA space flight program and project management requirements (NASA Procedural Requirements). Washington, DC: Author.
NASA. (2013). NASA systems engineering processes and requirements (NASA Procedural Requirements). Washington, DC: Author.
Ncube, C. (2011). On the engineering of systems of systems: Key challenges for the requirements engineering community! Proceedings of International Workshop on Requirements Engineering for Systems, Services, and Systems-of-Systems (RESS), held in conjunction with the International Requirements Engineering Conference (RE11), August 29–September 2, Trento, Italy.
Nescolarde-Selva, J. A., & Uso-Domenech, J. L. (2012). An introduction to alysidal algebra (III). Kybernetes, 41(10), 1638–1649.
Nescolarde-Selva, J. A., & Uso-Domenech, J. L. (2013). An introduction to alysidal algebra (V): Phenomenological components. Kybernetes, 42(8), 1248–1264.
Rettaliata, J. M., Mazzuchi, T. A., & Sarkani, S. (2014). Identifying requirement attributes for materiel and non-materiel solution sets utilizing discrete choice models. Washington, DC: The George Washington University.
Ryan, J. J., Mazzuchi, T. A., Ryan, D. J., Lopez de la Cruz, J., & Cooke, R. (2012). Quantifying information security risks using expert judgment elicitation. Computers & Operations Research, 39(4), 774–784.
Svetinovic, D. (2013). Strategic requirements engineering for complex sustainable systems. Systems Engineering, 16(2), 165–174.
van Noortwijk, J. M., Dekker, R., Cooke, R. M., & Mazzuchi, T. A. (1992, September). Expert judgment in maintenance optimization. IEEE Transactions on Reliability, 41(3), 427–432.
USCG. (2013). Capability management. Washington, DC: Author.
Winkler, R. L. (1986). Expert resolution. Management Science, 32(3), 298–303.
Author Biographies
Col Richard M. Stuckey, USAF (Ret.), is a senior scientist with ManTech supporting U.S. Customs and Border Protection. Col Stuckey holds a BS in Aerospace Engineering from the University of Michigan, an MS in Systems Management from the University of Southern California, and an MS in Mechanical Engineering from Louisiana Tech University. He is currently pursuing a Doctor of Philosophy degree in Systems Engineering at The George Washington University.
(E-mail address: richstuckey@gwu.edu)
Dr. Shahram Sarkani is professor of Engineering Management and Systems Engineering (EMSE), and director of EMSE Off-Campus Programs at The George Washington University. He designs and administers graduate programs that enroll over 1,000 students across the United States and abroad. Dr. Sarkani holds a BS and MS in Civil Engineering from Louisiana State University and a PhD in Civil Engineering from Rice University. He is also credentialed as a Professional Engineer.
(E-mail address: donald.l.washabaugh.ctr@mail.mil )
Dr. Thomas A. Mazzuchi is professor of Engineering Management and Systems Engineering at The George Washington University. His research interests include reliability, life testing design and inference, maintenance inspection policy analysis, and expert judgment in risk analysis. Dr. Mazzuchi holds a BA in Mathematics from Gettysburg College, and an MS and DSc in Operations Research from The George Washington University.
(E-mail address: mazzu@gwu.edu)