
This version is available at https://doi.org/10.14279/depositonce-9772.2

Copyright applies. A non-exclusive, non-transferable and limited right to use is granted. This document is intended solely for personal, non-commercial use.

Terms of Use

© owner/authors 2020, publication rights licensed to ACM. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive version was published in IEEE/ACM 42nd International Conference on Software Engineering Workshops (ICSEW’20), ACM ISBN 978-1-4503-7963-2, https://doi.org/10.1145/3387940.3392162 Schlutter, Aaron; Vogelsang, Andreas (2020): Knowledge Extraction from Natural Language Requirements into a Semantic Relation Graph. In: First International Workshop on Knowledge Graph for Software Engineering. https://doi.org/10.1145/3387940.3392162

Aaron Schlutter, Andreas Vogelsang

Knowledge Extraction from Natural Language Requirements into a Semantic Relation Graph

Accepted manuscript (Postprint) | Conference paper


Knowledge Extraction from Natural Language Requirements into a Semantic Relation Graph

Aaron Schlutter, Andreas Vogelsang

[email protected], [email protected]
Technische Universität Berlin

Berlin, Germany

ABSTRACT
Knowledge extraction and representation aims to identify information and to transform it into a machine-readable format. Knowledge representations support Information Retrieval tasks such as searching for single statements, documents, or metadata. Requirements specifications of complex systems such as automotive software systems are usually divided into different subsystem specifications. Nevertheless, there are semantic relations between individual documents of the separated subsystems, which have to be considered in further processes (e.g., dependencies). If requirements engineers or other developers are not aware of these relations, this can lead to inconsistencies or malfunctions of the overall system. Therefore, there is a strong need for tool support to detect semantic relations in a set of large natural language requirements specifications. In this work, we present a knowledge extraction approach based on an explicit knowledge representation of the content of natural language requirements as a semantic relation graph. Our approach is fully automated and includes an NLP pipeline to transform unrestricted natural language requirements into a graph. We split the natural language into different parts and relate them to each other based on their semantic relation. In addition to semantic relations, other relationships can also be included in the graph. We envision using a semantic search algorithm like spreading activation to allow users to search for different semantic relations in the graph.

CCS CONCEPTS
• Information systems → Document representation; Information extraction; Language models.

KEYWORDS
knowledge extraction, requirements engineering, natural language processing, semantic relation graph, spreading activation

ACM Reference Format:
Aaron Schlutter and Andreas Vogelsang. 2020. Knowledge Extraction from Natural Language Requirements into a Semantic Relation Graph. In IEEE/ACM 42nd International Conference on Software Engineering Workshops (ICSEW’20),

Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.
ICSEW’20, May 23–29, 2020, Seoul, Republic of Korea
© 2020 Copyright held by the owner/author(s). Publication rights licensed to ACM.
ACM ISBN 978-1-4503-7963-2/20/05. . . $15.00
https://doi.org/10.1145/3387940.3392162

May 23–29, 2020, Seoul, Republic of Korea. ACM, New York, NY, USA, 7 pages. https://doi.org/10.1145/3387940.3392162

1 INTRODUCTION
Complex systems are often developed by distributed teams or even across several companies (e.g., suppliers) that deliver subsystems, which are finally integrated into a product. One part of Requirements Engineering (RE) is to specify the requirements for the subsystems, i.e., write them down for the further system development process. Due to the heterogeneity of the subsystems and their stakeholders, requirements are often spread across several specifications, which contain a set of natural language requirements [8]. A common approach to handle the amount of requirements specifications is to organize them in simple structures like documents, paragraphs, and folders. Since specifications are mostly written in natural language, they contain a lot of semantic content that is not formally stated but important for many engineering tasks.

Such informally noted information may be needed by different stakeholders, e.g., they need to know how two subsystems interact with each other or whether there are other specifications that relate to their current work-in-progress requirement. Several studies have shown that in such distributed development contexts, there is a high chance of developers being unaware of dependencies and important information from related subsystems [6, 27]. The corresponding task that may support developers is called Information Retrieval (IR). There are various information needs in RE, where typically a requirements engineer formulates a search query and a system suggests relevant results (called targets). A generic setup for such IR systems is to extract knowledge from the requirements, represent it in a knowledge base, and provide a search algorithm to execute a query and show the results.

In order to obtain a comprehensive overview of the requirements, the statements of the semantic content must be related to each other. State-of-the-art Information Retrieval approaches use algebraic IR models (e.g., vector space models, Latent Semantic Indexing) or probabilistic models (e.g., Latent Dirichlet Allocation) [2]. More recently, machine-learning approaches have also been applied successfully [21]. Since most approaches only consider single words or small parts of natural language, they rely on the distributional hypothesis [25] to compare semantic similarity and do not recognize semantic statements with their relations. Another disadvantage of such models is that their knowledge base is often specific to a certain kind of search query and therefore hardly reusable for other IR tasks.


We follow a different approach based on an explicit model of the knowledge represented in unrestricted NL requirements. We use a pipeline to translate NL requirements automatically into a semantic relation graph that encodes the terms as vertices and edges. We plan to use the graph with a spreading activation algorithm as a basis for various semantic searches (e.g., full text search, trace link recovery).

For analysis, we applied the pipeline to 7 datasets of different sizes from different domains and compared two of them with simpler knowledge graphs that we extracted in previous work. Compared to our previous study, we show that we have fixed some shortcomings by adding more semantic content and considering more semantic relations.

2 BACKGROUND
2.1 Natural Language Requirements
Requirements specifications are often stated in unstructured natural language (NL) because it is equally available to all stakeholders. While there is usually no consistently applied pattern or template, requirements are expressed via technical terminology, which is characterized by a reduced choice of words and the use of domain-specific terms [9].

NL requirements specifications are typically arranged in documents as a hierarchical structure. A document consists of multi-level paragraphs, each having a headline and specifications. To distinguish the specifications from each other, each has its own identifier. In practice, complex systems are often developed in terms of several loosely coupled subsystems, each of them stated in a single document. Requirements are spread over several specifications, so that a user can only understand them as a whole through the context.

2.2 Knowledge Representations
Knowledge representation focuses on the depiction of information in a way that enables computers to solve complex problems. Borgida et al. [3] noted as early as 1985 that knowledge representation is the basis for requirements engineering.

Dermeval et al. [8] report on the use of ontologies in requirements engineering in their systematic literature review. They reviewed 67 publications from academic and industrial application contexts dealing with different types of requirements. While only 34% reused existing ontologies, most of them specified their own ontology. The largest number of publications rely on textual requirements as the RE modeling style, especially in the specification phase.

Robeer et al. [24] automatically derive conceptual models from user stories. The models enable discussion between stakeholders and show promising accuracy results (precision and recall between 80 and 92%). Because user stories are semi-structured natural language, they use heuristics to analyze them.

In an earlier study [26], we presented an NLP pipeline that extracts knowledge from requirements documents and transforms it into a graph representing RDF1 triples. We applied the approach to 2 datasets, an academic and an industrial one, and used the graph of the academic requirements specification to show the separation of two subsystems. The generated sample graphs were not well connected and yielded only a subset of fully connected vertices in a main graph.

1 https://www.w3.org/TR/2014/REC-rdf11-concepts-20140225/

3 APPROACH: GRAPH CONSTRUCTION
We use several techniques to extract information from the requirements and to build the semantic relation graph. Of course, a graph is not capable of storing every aspect of any kind of information. Our graph meets the conditions of neither a valid ontology nor an RDF graph, but it tries to depict major semantic parts of common NL (e.g., words and phrases within sentences, but also documents and corpora) and their semantic relations to each other. An ontology or an RDF graph would require at least some kind of typing of information, which, from our point of view, is not always achievable in a fully automated way.

The main goal while building the graph is to store all available pieces of information in vertices as small as possible and to connect those vertices with each other based on their relation. First, we use an NLP pipeline for the semantic content of the requirements. Second, we analyze the requirements for given structural characteristics, which should be added to the graph, too. Finally, we combine all information to build the semantic graph as the knowledge base.

Currently, our approach only supports English, mainly because there is a variety of NLP techniques and tools for English that are not available for other languages. Furthermore, English is a relatively easy language, e.g., it has only a small set of part-of-speech tags, which we have to consider in the subsequent process.

3.1 Natural Language Processing Pipeline
Since we consider requirements as NL without any specific template or similar characteristics, the NLP pipeline consists of common components without special adjustments or optimizations for a certain kind of requirements specification and is therefore able to process any kind of text. The core parts of our pipeline, Stanford CoreNLP [18] and DeepSRL [16] for semantic role labeling, are pipelines themselves and are described in detail below.

Stanford CoreNLP2 is a collection of solutions for common NLP tasks, which are assembled and coordinated in a pipeline. We only use parts of it, particularly the tokenization, sentence splitting, part-of-speech (POS) tagging, lemmatizing (morphological analysis), dependency parser (grammatical structure), and coreference resolution (Coref). The first two tasks determine tokens (words, punctuation marks, etc.) and sentences in an NL text. Part-of-speech tagging categorizes these tokens by their grammatical role inside of a sentence, e.g., as noun, verb, or article.

Next, while lemmatizing, each word is annotated with its lemma, a base form which depends on the part of speech. For example, the lemma of “studying” (as a verb) is “(to) study” (also a verb), and in comparison, the lemma of “students” (noun) is “student” (also a noun). In contrast, stemming these words would result in “stud” as the word stem. Therefore, we use lemmatizing instead of stemming to keep the sense of each word.
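The contrast between stemming and lemmatization can be sketched in a few lines. This is a deliberately naive illustration, not CoreNLP's morphological analyzer: the suffix list and the lemma lookup table are hypothetical stand-ins.

```python
# Naive suffix stemmer (illustrative only): strips common endings and
# collapses "studying" and "students" to the same stem "stud".
def crude_stem(word):
    for suffix in ("ying", "ies", "ents", "ing", "ent", "s", "y"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# POS-aware lemma lookup; a real lemmatizer uses morphological rules
# instead of a hand-written table.
LEMMAS = {("studying", "verb"): "study", ("students", "noun"): "student"}

def lemma(word, pos):
    return LEMMAS.get((word, pos), word)

print(crude_stem("studying"), crude_stem("students"))        # stud stud
print(lemma("studying", "verb"), lemma("students", "noun"))  # study student
```

The stemmer loses the distinction between the verb and the noun, which is exactly why the pipeline prefers lemmatization.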

Example 3.1. Barack Obama was born in the fabulous and tropical Hawaii, a small shallow isle in the big pacific ocean.

2https://stanfordnlp.github.io/CoreNLP/


The dependency parser determines the grammatical structure of a sentence. While the result contains much more information, we basically use two features: the identification of noun phrases and the dependencies of coordinating conjunctions. In Example 3.1, amongst others, “Hawaii”, “isle”, and “ocean” are noun phrases (determiners and adjectives removed), and the coordinating conjunction “and” depends on both adjectives “fabulous” and “tropical”.

Example 3.2. Barack Obama was born in Hawaii. He is the president. Obama was elected in 2008.

Lastly, the coreference resolution looks for mentions of the same entities. There are two different kinds of mentions. In Example 3.2, the word “he” refers to “Barack Obama” and is a pronominal reference without any further meaning of the word “he”. In another example, “he” might refer to a totally different person. In contrast, “Obama” (in the last sentence of Example 3.2) also refers to “Barack Obama” but is a nominal reference which contains additional information, i.e., the person Barack Obama is also just called Obama. Due to the interpretation of “is” as an equation, the phrase “the president” is also a coreference for “Barack Obama”.

Semantic role labeling (SRL) [4] is a sentence-based NLP task. At first, all predicates of a sentence are identified. Subsequently, all arguments of each predicate are associated with their roles within this sentence. The role of an argument is represented by its type, e.g., the verb gets V, main arguments are enumerated as A0, A1, etc., and secondary arguments are prefixed by AM- and their concrete function, e.g., AM-MNR for manner. In general, the first argument is called the agent, the second the patient; all other roles depend on the verb.

Example 3.3. If the user pushes the button, the engine is started.

Example 3.3 contains two predicates, “push” and “start”. The verb of the predicate “start” (from the main clause) is [V started], the arguments are [A1 the engine] and [AM-ADV If the user pushes the button]. Because this is a passive clause, there is no first argument, and “the engine” is just the receiver/patient of the start procedure. Also, the whole subordinate clause is labeled as an adverbial argument, as it describes the circumstances of the predicate. The verb and arguments of “push” are [V pushes], [A0 the user], and [A1 the button]. The numbered arguments would stay the same even if the syntax of the sentence changed, e.g., “If [A1 the button] is [V pushed] [A0 by the user], [. . .]”.
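The SRL labels for Example 3.3 can be held in plain data structures. The roles and spans below are taken from the example above; the dictionary layout is our own illustrative choice, not DeepSRL's output format.

```python
# One entry per predicate: the verb plus its labeled arguments.
srl_output = [
    {"verb": "started",
     "args": {"A1": "the engine",
              "AM-ADV": "If the user pushes the button"}},
    {"verb": "pushes",
     "args": {"A0": "the user", "A1": "the button"}},
]

# The patient of "start" and the agent of "push":
print(srl_output[0]["args"]["A1"])   # the engine
print(srl_output[1]["args"]["A0"])   # the user
```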

The semantic roles are predefined in the PropBank3 database. Most of the identified propositions have at least one argument; about two-thirds also have a second argument [5, Table 1]. We use DeepSRL4 as a state-of-the-art implementation for SRL tagging. It uses a deep BiLSTM model to perform SRL and achieves an F1 score of 97.4% for CoNLL 2005 [5].

3 https://verbs.colorado.edu/~mpalmer/projects/ace.html
4 https://github.com/luheng/deep_srl

Figure 1: Graph structure for single SRL predicate
Figure 2: SRL structure for Example 3.3
Figure 3: Graph structure for Example 3.2

3.2 Structural Information
Besides the semantics in NL, requirements specifications often contain additional information. Very common is a hierarchical structure, like single documents for modules, chapters within these documents, and folders arranging the documents. It can also be concluded that two documents in the same folder are more related to each other than completely foreign ones.

Trace links also express some kind of relatedness between two or more requirements, which should be taken into consideration.

3.3 Semantic Relation Graph
As the last step, we build a graph which contains all information and relations we have found in the NL and the additional information. To support common graph-based algorithms like Spreading Activation, the graph must be a regular directed graph without hyperedges (more than 2 vertices connected to a single edge) or hypervertices (a single vertex containing a graph within itself).

The leading structure for NL content is based on SRL predicates, including their verb and arguments. A predicate is represented by the verb as a vertex in the graph (verb vertex) and additional vertices for each argument (argument vertex), as shown in Figure 1. If an argument contains a predicate itself, the graph structure for that predicate is likewise added and connected via an edge between both verb vertices, as shown in Figure 2 for Example 3.3, instead of adding a single argument vertex containing that predicate. If there is a coreference between arguments, both argument vertices are either connected (for a nominal one) or only a single argument vertex is added (for a pronominal one), as shown in Figure 3.
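The predicate-to-graph step for Example 3.3 can be sketched with a plain adjacency map. The vertex and role names follow the paper's Figure 2; the data model and the `add_predicate` helper are our own sketch, not the authors' implementation.

```python
from collections import defaultdict

# Directed adjacency: verb vertex -> list of (role label, target vertex).
edges = defaultdict(list)

def add_predicate(verb, args):
    """Add one labeled edge from the verb vertex to each argument vertex."""
    for role, target in args.items():
        edges[verb].append((role, target))

# Example 3.3: "If the user pushes the button, the engine is started."
add_predicate("push", {"A0": "user", "A1": "button"})
# The subordinate clause is itself a predicate, so "start" is linked to
# the verb vertex "push" rather than to a plain argument vertex.
add_predicate("start", {"A1": "engine", "AM-ADV": "push"})
```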

If an argument occurs more than once, the same argument vertex is used for both occurrences. This only applies to arguments that contain at least one noun; otherwise, simple adjectives (e.g., “down”) as single arguments would lead to a relation, which is undesirable. To support this deduplication of arguments, the information text of arguments is transformed into a simplified presentation. For example, articles are removed, and nouns and adjectives are replaced by their lemma and their simplified POS tag, which only differs between verbs, nouns, and adjectives. For instance, “the engine” is transformed into “engine#n”.

Figure 4: Graph noun phrase structure for Example 3.1
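The argument simplification just described can be sketched as follows. The tiny lemma table and POS-tag mapping are illustrative stand-ins for the CoreNLP annotations; the `simplify` function is our own sketch of the transformation, not the authors' code.

```python
# Canonicalize an argument: drop articles, lemmatize, and append a
# collapsed POS marker (#v / #n / #a; everything else gets #o).
ARTICLES = {"the", "a", "an"}
LEMMAS = {"engines": "engine", "pushes": "push"}
POS_MARK = {"NOUN": "n", "VERB": "v", "ADJ": "a"}

def simplify(tokens):
    """tokens: list of (word, pos) pairs -> canonical argument string."""
    parts = []
    for word, pos in tokens:
        w = word.lower()
        if w in ARTICLES:
            continue  # articles carry no semantic content
        parts.append(LEMMAS.get(w, w) + "#" + POS_MARK.get(pos, "o"))
    return " ".join(parts)

print(simplify([("the", "DET"), ("engine", "NOUN")]))  # engine#n
```

With this, “the engine” and “the engines” map to the same vertex key `engine#n`, enabling the deduplication.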

As with arguments, duplicates may also occur with verbs. Since a verb is on a par with its predicate, the arguments are also taken into account, and a verb vertex is only reused if the lemma of the verb and all argument vertices, except other verb vertices, are equal.

While the arguments in Examples 3.2 and 3.3 only contain simple phrases, an argument may be a much more complex phrase. In Example 3.1, the whole phrase “in the fabulous and tropical Hawaii, a small shallow isle in the big pacific ocean” is the second argument. The resulting argument vertex will not lead to a deduplication if the argument of another predicate is just “Hawaii”. To circumvent this issue, we add additional vertices to the graph based on the given noun phrases and their dependencies, as shown in Figure 4.

If an argument contains a coordinating conjunction that depends on noun phrases, the argument is split, and for each part the corresponding graph structure is added. Otherwise, a separation would lead to undesirable vertices. In Example 3.1, “and” does not depend on a noun phrase and would lead to two vertices for the split arguments “in the fabulous” and “tropical Hawaii, a [. . .]”.

Next to the vertices and edges for NL, the additional information has to be added to the graph, too. This depends on the characteristics of the additional information. For each requirement, a vertex should be added which is connected to all verb vertices whose predicates are contained in the requirement. A hierarchical structure may be added to the graph as a tree, e.g., including vertices for each module and chapter which are connected to the requirement vertices. Trace links can also be interpreted as edges between requirement vertices.
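Attaching structural information can be sketched the same way: a vertex per requirement identifier linked to the verb vertices of its predicates, plus hierarchy edges from module to requirement. The identifier and module names below are illustrative, not taken from a real dataset.

```python
# Structural edges as plain (source, target) pairs.
edges = set()

def add_requirement(req_id, verb_vertices, module):
    """Link a requirement vertex to its predicates and its module."""
    for verb in verb_vertices:
        edges.add((req_id, verb))   # requirement -> predicate verb vertex
    edges.add((module, req_id))     # hierarchy: module -> requirement

add_requirement("FA-17", ["push", "start"], "ACC")
print(("ACC", "FA-17") in edges)  # True
```

A trace link between two requirements would simply be one more edge between their identifier vertices.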

4 ANALYSIS: GRAPH CHARACTERISTICS
In our previous study [26], we introduced knowledge representation graphs whose construction differs from the presented process. We built graphs for two requirements documents, namely Automotive System Cluster (ASC)5 and “Charging system” (for electric vehicles) by our industry partner, both having an automotive background and containing real industrial data.

For our current approach, we reuse these datasets. In addition to the graph construction described in Section 3.3, we add vertices for each requirement identifier, which are connected to all predicates of all sentences in this requirement text. The graphs are shown in Figure 5. The colors in Figure 5a represent the two contained subsystems of ASC: ELC is blue, ACC green, and intersections of both are colored orange. Figure 5b has a different color coding for the single “Charging System”: identifier vertices are blue, SRL

5 https://www.aset.tu-berlin.de/fileadmin/fg331/Docs/ASC-EN.pdf

Table 1: Datasets and corresponding graphs

Dataset             vertices   edges     no trace path
ASC                 1,277      2,320     -
ASC [26]            345        440       -
Charging S.         40,453     107,862   -
Charging S. [26]    13,676     19,769    -
Infusion P.         1,731      3,622     0
CCHIT               9,239      22,669    0
GANNT               824        1,755     0
CM-1                7,616      18,375    0
WARC                1,188      1,975     63

predicate vertices are orange, SRL argument vertices green, and noun phrase arguments salmon.

For comparison, the graphs from [26] are shown in Figure 6. The main difference to the current process is that from SRL we used only the first two arguments of each predicate as vertices and connected them via an edge that represents the predicate. In addition, there were no vertices for identifiers or noun phrases.

In addition to these 2 datasets, we analyze the tracing datasets Infusion Pump, CCHIT, GANNT, CM-1, and WARC from [15]. They cover different domains, like health care, science, and business. Each of them contains source and target requirements that are linked to each other. In the graphs, source and target requirements are not directly linked (i.e., trace links are not considered, and no edges are inserted for them).

Table 1 gives an overview of the semantic knowledge graphs for all datasets, including the number of vertices and edges. For the tracing datasets, we determine how often there is no path between the source and target identifier vertices, regardless of whether a trace link is actually defined.

Figure 7 shows the degree centrality for each dataset, i.e., how many incoming and outgoing edges a vertex type has. Due to some extreme outliers of up to 2,130, the plot is cut off above 50 to keep the boxes clearly visible. We do not differentiate between indegree and outdegree because the direction of an edge is independent of the statement it represents; in most cases, the semantics of an edge may be restated the other way around. While the datasets and graphs vary in size, the average degree centrality is nearly the same.
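Degree centrality as used here (in-degree plus out-degree, without distinguishing direction) reduces to counting both endpoints of every edge. The edge list below is a toy stand-in, not one of the paper's datasets.

```python
from collections import Counter

# Toy directed edge list (source, target).
edges = [("start", "engine"), ("start", "push"),
         ("push", "user"), ("push", "button")]

degree = Counter()
for src, dst in edges:
    degree[src] += 1   # outgoing edge
    degree[dst] += 1   # incoming edge

print(degree["push"])  # 3 (one incoming, two outgoing)
```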

Figure 8 shows the distribution of existing shortest path lengthsbetween each source and target vertex of the tracing datasets. Again,despite the different dataset and graph sizes, the average shortestdistance between a source and a target is nearly the same.
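Since the graph is unweighted, shortest path lengths as in Figure 8 can be computed with a breadth-first search; the paper does not name its algorithm, so BFS here is our assumption, and the vertex names are illustrative.

```python
from collections import deque

# Toy undirected adjacency between a source and a target identifier vertex.
adj = {"REQ-1": ["start"], "start": ["REQ-1", "push"],
       "push": ["start", "REQ-2"], "REQ-2": ["push"]}

def shortest_path_length(adj, source, target):
    """Hop count from source to target via BFS, or None if unreachable."""
    seen, queue = {source}, deque([(source, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

print(shortest_path_length(adj, "REQ-1", "REQ-2"))  # 3
```

A `None` result corresponds to the “no trace path” counts in Table 1.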

5 DISCUSSION
There are mainly two parts of the considered semantic search that will affect the performance results: the graph and the search algorithm, which both also depend on each other. In the absence of a concrete search algorithm, we take a look at the graph characteristics, including its semantic relations, and try to argue which parts might not be optimal.

The graphs provide connections between almost all possible queries and targets, i.e., there are paths between each pair of vertices. The 3 vertices in the lower left corner of Figure 5a are an


Figure 5: Semantic relation graphs for (a) ASC and (b) the “Charging System” dataset

width#n modulation#n be#v

pwm#nvoltage#n

n.#n

prevent#v

push#v

gas#n pedal#n

pull#vbe#v

60#o hz#n

FA-55

AL-141

<not elaborated within the demonstrator>#n

region#n

60#o hz#n

fa#n -78#o

color#n range#n

800#o

390-1000#o nm#n

near#a infrared#a region#n

datum#n

partly#a

cover#v

frame#n rate#npresent#a

AL-123

fa#n -78#o

AL-135

²#n

datum#n

lvds#n

600#o pixel#n

color#n range#n of#o 390-1000#o nm#n

frame#n rate#n of#o camera#n

sense#n

iso#n 26262#o

AL-133

14.5#o v.#n

not#a

active#a

ignition#n key#n

also#a

1:2#o

engine#n

headlamp#n flasher#n

AL-15

al-119#n

al-48#n

0.05#o s.#o

due#a energy#n saving#n reason#n

exceed#vheadlamp#n flasher#n

energy#n saving#n reason#n

back#nengine#n1:2#o

protection#n protection#n from#obe#v

al-119#n

start#vsee#v bright#a dark#a 1:1#o

activate#vAL-125

pull#v

two#o signal#n

integrate#v

deactivate#v

AL-139

beam#n switch#n

pull#v

AL-55

pull#v

engaging/disengaging#v

beam#n headlight#n

recognize#v

deactivate#v

activation#n

push#v

duration#n of#o 1#o second#a gentle#a fade-out#n

pitman#n arm#n

avoid#v

light#n

accept#v

position#n off#o or#o exterior#a light#n on#o

curve#nengage#v

rear#a left#ndeactivation#nlow#a beam#n headlight#n

0.5#o seconds#n

light#a rotary#a switch#n

move#v

position#n auto#n ambient#a light#nreduction#n

adaptive#a high#a beam#n headlight#n

turn#n signal#nturn#v

be#v

deviation#n of#o less#ao than#o 0.05#o s#n

reduction#n of#o pulse#nactivate#v

be#v

steer#v

be#v

pulse#n ratio#n

not#a available#a

exterior#n mirror#n steer#v

AL-121

exterior#n mirror#n

adaptation#n of#o pulse#n ratio#n

wheel#n switch#n

present#a

reduction#n of#o power#n FA-25be#v position#nturn#vignition#n on#o

activate#vmount#vactivate#v strength#n

activate#v

activation#n of#o clamp#n 30#o ignition#n on#o

exterior#n brightness#n pull#vactivate#vactivate#v

dependence#n

position#n b#n

pulse#n

attach#v

leave#vpulse#n ratio#n 1:1#o

AL-122

car#n battery#n

leave#v

ignition#n lock#n

see#v

al-55#n

lock#n

al-55#n

be#v

switch#v

have#v

complete#vfeedback#n

remain#v

follow#v

deflection#n

move#vflash#v

be#v

happen#vflash#v

not#a 100m#n

provide#v

engage#v

temporary#a activation#n

flash#v

engage#v

flash#v

AL-109

AL-120

follow#v

none#n

occur#v

always#aAL-42flash#v

flasher#n

AL-79ignition#n lock#n

be#vseveral#a condition#n

AL-48

back#n

have#v

pull#v

AL-149ignition#n key#n

again#a

activate#vburn#v

activate#v

AL-51

move#v

haptic#a feedback#n

permanent#a activation#n

be#v

rotary#a light#a switch#n

move#v

lock#n

not#a

1#o second#a

condition#n

7#o °#o deflection#n

1000#oAL-72

AL-103

next#a 30#o seconds#ncomplete#v

always#a

7#o °#o deflection#n

flash#vflash#v

30#o seconds#n

occur#v

up#o or#o down#o

not#a consider#vmaximum#n deviation#n of#o pulse#n ratio#n

exterior#n brightness#nAL-146deviation#n of#o pulse#n ratio#n

with#o less#ao than#o 0.5#o %#n

more#a

only#a vehicle#n

AL-112

hazard#n warning#n light#n

shortly#a before#o deactivation#n of#o hazard#n warning#n

be#vAL-119

be#v

switch#v

due#a adjustment#n of#o headlight#n position#n as#a well#a

as#o by#o reduction#n of#o luminous#a strength#n

8.5#o v.#n

corner#n

be#v

AL-147

power#n

figure#n schematically#afollow#v

corner#n

AL-115

FA-62

detection#n

push#v

be#v

AL-80

function#n ambient#a lighting#n

deactivation#n of#o hazard#n warning#n AL-56

AL-134

activate#v

50#o %#nactivate#v

exterior#a light#n on#o

driver#n

AL-54

signal#n

activate#vmore#a

be#vAL-40 corner#v

AL-76

high#a beam#n headlight#n

hold#v

turn#v

increase#v

acceleration#n setting#n

position#v

activate#v

control#n lever#n

activate#v

be#v

activate#v

accept#v

pulse#n width#n modulation#nless#ao than#o 8.5#o push#vposition#n of#o exterior#n lighting#n

element#n of#o vehicle#n show#v

pass#vposition#n automatic#a

intervene#vsubvoltage#n

b#nFA-26subvoltage#n

signal#ndeviation#n

AL-44down#o

AL-108

0.5#o seconds#n

demand#v

camera#n signal#n

AL-64 more#a acceleration#n

observer#ntolerance#n

exterior#n brightness#n outside#o vehicle#n rgb#n

deactivate#v

AL-144maintain#v

control#n lever#nheadlight#n

FA-5cruise#n control#n lever#n

depend#v

car#n battery#n

be#v

away#o

cognitive#a threshold#n of#o human#a observer#n

down#oFA-27

be#vreduce#v

activate#v

demand#vactivate#vAL-136 be#v

adopt#v acceleration#n

perform#vmaximum#n deviation#n

hold#vAL-43

activate#v

not#a active#a more#ao for#o 10#o seconds#n

reduce#v

be#v

activate#vwarn#v

self-test#nactivate#vpush#vFA-20 activate#v

al-48#n

AL-116

0.05#o

empty#a

pulse#n ratio#n AL-118

FA-42

call#v

activate#v

be#vperform#v

accord#v

be#v

activate#v

serve#vpull#vcumulated#a deviation#n

position#n auto#n

1:1#o

0.05#o s#n

AL-78

column#n

clamp#n 30#o

deactivate#v

direction#n

position#n a#nbe#v

position#n tip-blinking#a left#n leave#vtip-blinking#n

run#vnote#vmore#ao than#o 0.5#o seconds#n

remain#v

blinking#a

i.e.#o as#a long#a as#o pitman#n arm#n be#v

leave#v

horizontal#a neutral#a position#n

activate#v

pull#v

run#vactivate#v

AL-45 wheel#n switch#n module#n

position#npitman#n arm#n

less#ao than#o 0.5#o seconds#n

lead#v

position#n tip-blinking#a

activate#v

AL-41

switch#n

tip-blinking#n

AL-67

deactivate#v

flash#v

two#o complete#a adaptation#n

occur#v

AL-148

xenon#n

AL-96

halogen#n

AL-93

be#v

be#v

halogen#n

xenon#n

install#v

AL-63

al-40#n

AL-110

switch#v

AL-88 follow#v

AL-66symbol#n in#o instrument#n cluster#n

design#n

low#a beam#n illuminant#n prioritization#n

visible#a effect#n

deactivate#vlighting#n element#n

3#o seconds#nAL-102

cycle#n

thus#a

or#o direction#n

al-40#n

canada#nblinking#a right#a

darkness#n

lead#v

AL-97

AL-127

usa#nsymbol#n

indicate#v detail#n

point#n 80#o km/h#nacc#n lever#n

select#vpoint#n 54#o km/h#n

point#n 60#o km/h#n

speed#n 57#o km/h#n t#n =#a 2#o seconds#n à#a

speed#n 57#o km/h#n n#n =#a 1#o km/h#n t#n =#a 2#o seconds#n à#a

overload#n

movement#n

point#n 56#o km/h#n

accelerator#n or#o brake#n pedal#n

FA-3

front#n

directive#n 93/92/eec#n

integrate#v

switching#n

lever#n

demand#n

control#n field#n

t#nmenu#n

turn#v

context#n

front#nmaintain#v acc#n lever#nlevel#n

set#vvehicle#n speed#n area#n

FA-60

illumination#n start#vstart#v

example#nFA-92

point#n 56#o km/h#n

start#v

point#n 70#o km/h#n three#o level#ndifference#n be#vcritical#a situation#npoint#n 30#o km/h#n

faster#ao than#o 30#o km/h#n

al-38#n and#o al-16#nsee#v

not#a

al-38#n

al-16#n

hold#v

point#n 50#o km/h#n

point#n 59#o km/h#n

50#o km/h#n

overload#nFA-91

need#v

illumination#n area#n requirement#n

limit#vlast#a start#n of#o motor#n and#o cruise#n control#n lever#n be#v

AL-151

head#n2#o seconds#n

three#o level#n

respect#v

illumination#n area#n requirement#n

n#n 1#o km/h#n AL-126

two#o acoustical#a signal#n 0.1#o seconds#n long#a with#o 0.2#o seconds#n pause#n between#o

light#n setting#n

cornering#n light#nrestore#v

hazard#n warning#n switch#n

hazard#n warning#n light#nAL-145headlight#n position#n

beam#n

AL-75 hold#v24#o bit#nAL-47

run#v

pull#v

deactivate#vaccord#v

beginning#n

fact#n

indicate#v

drive#v

first#a resistance#n level#ndim#v

duration#n fade-out#n

direction#n

direction#n blinking#a

AL-68run#v

lighting#n

instrument#n cluster#n setting#n

AL-46

AL-62

light#n

lower#ao than#o threshold#n s1#n

setting#n

cognitive#a threshold#n of#o human#a be#v

menu#n setting#n vehicle#n setting#n ambient#a lighting#n

position#n c#n

not#a

flash#v

release#v

activate#v

change#n

FA-46

leave#v

AL-65

flash#v

synchronically#acycle#n

activate#v

switch#v

AL-82

flash#v

case#nflash#v

darkness#n switch#n

5#o °#o

flash#v

case#n of#o emergency#n situation#n

case#n

leave#v

illumination#n area#nflash#v

immediately#a

request#v

AL-98

blinking#a cycle#n

see#v

tip-blinking#a cycle#n

100m#n and#o 300m#n

tip-blinking#a

darkness#n

300m#nadditionally#a

deactivate#v

complete#v

left#n

threshold#n s2#n with#o s2#n >#ao s1#n

0.1#o seconds#n

0.2#o seconds#n

hazard#n warning#n

figure#nfade-out#n

four#o direction#n

adjustment#n

start#n

FA-44threshold#n s2#narea#n of#o upper#a control#n

field#n

motor#n

2#o seconds#n

FA-41street#n

high#a beam#n illumination#n

detail#n about#o design#n of#o lighting#n element#n

situation#n regulate#v

determine#v

prioritization#n of#o display#n in#o instrument#n cluster#n

s2#nAL-101

display#n

again#a

pit#n arm#n

or#o down#o

beam#n illuminant#n

direction#n right#n

illumination#n of#o street#ninstrument#n cluster#n

instrument#n cluster#n

directive#n 93/92/eec#n

2.5#o seconds#n

available#a 2#o seconds#n 2.5#o seconds#n and#o 3#o seconds#n

example#n

transmit#v

effect#n

AL-100

illumination#n area#npress#v

beam#n illumination#n

cornering#n light#n

first#a resistance#n level#n beyond#o pressure#n point#n

threshold#n s1#ndirection#n indicator#n

street#n accelerator#n

information#n about#o defective#a illuminant#n

30#o km/h#nilluminant#n

defective#a illuminant#n

information#ndetect#v

illustration#n

fade-out#nlighting#n element#n

give#vdepend#vinstrument#n cluster#n setting#nclose#v

AL-17

leave#vpressure#n point#n

pull#vAL-111

FA-59

control#nspeed#n set#n point#n of#o cruise#n control#n

exterior#n light#n

activate#v

c#n

illuminate#v

switching#n behavior#n of#o indicator#n

pressure#n24#o bit#n 8#o bit#n rgb#n

t#n seconds#n

vehicle#n setting#n

exterior#n lighting#n element#n

lever#n

beam#n headlight#n illumination#n

t#n seconds#n

press#v

beam#n light#n

pressure#n point#nFA-61reaction#n time#n

switching#n behavior#n

limit#v cruise#n control#n lever#n

context#n of#o exterior#n lighting#n system#n

push#v

threshold#n

enable#v

switching#n between#o cruise#n control#n and#o speed#n limit#n

function#n set#v

resistance#n level#n

self-test#n

hazard#n warning#nvehicle#n door#n

advance#v

move#v

ambient#a lighting#n

AL-106

flash#v

low#a beam#n headlight#n as#o ambient#a light#n

change#n of#o pitman#n arm#n from#o tip-blinking#a direction#n

blinking#a or#o back#a

run#v

AL-81

characteristic#a curve#n al-112#n

have#v

mount#v

emergency#n situation#n

operational#a availability#n of#o adaptive#a high#a beam#n

headlight#n

leave#v

AL-89

least#a for#o 3#o seconds#n

i.e.#o without#o gentle#a fade-out#n

AL-107

AL-38

availability#nblinking#a side#n

see#v

duration#n

darkness#n switch#n

active#aleave#v

release#v

run#v

release#vstop#v

AL-84

remain#v

AL-104

AL-129 see#v

AL-105

hold#vactivate#v AL-138

exceed#vposition#n direction#n

left#n

activation#n of#o clamp#n 30#o dependable#a picture#n

information#n

direction#n indicator#nright#nactivate#v

activate#v

cause#v fix#v

direction#n of#o turn#n signal#n tip-blinking#n hazard#n warning#n

light#n

curve#n al-112#n

hazard#n warning#n light#a switch#n

direction#n blinking#a on#o blinking#a side#n

light#a illumination#n area#n of#o high#a beam#n headlight#n

reduce#v

AL-71

activate#v

light#n setting#n

release#v

be#v

(a) Automotive System Cluster (ASC) (b) Charging System

Figure 5: Semantic relation graphs

(a) Automotive System Cluster (b) Charging System

Figure 6: Knowledge representation graphs from [26]

exception, they represent the identifiers AL-141 and FA-55, and their content “<not elaborated within the demonstrator>”. In Figure 5b at the bottom, most of the single vertices represent identifiers without any text content such as a sentence or a noun phrase. Our previous approach [26], in comparison, yielded only a subset of fully connected vertices in a main graph. We achieve better connectivity because we include more information in more graph elements (see Table 1).
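The connectivity property discussed above can be checked programmatically. The following is a minimal sketch, with a hypothetical miniature graph standing in for the extracted semantic relation graph: vertices are lemma#POS tokens and requirement identifiers, and we count connected components (a fully connected graph yields exactly one) and vertex degrees as plotted in Figure 7.

```python
from collections import defaultdict

# Hypothetical miniature relation graph: vertices are lemma#POS tokens
# and requirement identifiers; edges are semantic relations.
edges = [
    ("FA-55", "vehicle#n"),
    ("vehicle#n", "speed#n"),
    ("speed#n", "km/h#n"),
    ("AL-141", "<not elaborated>#n"),  # isolated pair, as in Figure 5
]

# Build an undirected adjacency structure.
adjacency = defaultdict(set)
for u, v in edges:
    adjacency[u].add(v)
    adjacency[v].add(u)

def connected_components(adj):
    """Count connected components via iterative depth-first search."""
    seen, components = set(), 0
    for start in adj:
        if start in seen:
            continue
        components += 1
        stack = [start]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(adj[node] - seen)
    return components

# One main component plus the isolated AL-141 pair.
print(connected_components(adjacency))  # 2

# Degree = number of incident relations per vertex (cf. Figure 7).
print(len(adjacency["vehicle#n"]))      # 2
```

A library such as NetworkX provides the same operations (`nx.connected_components`, `G.degree`) for larger graphs; the plain-dict version above is kept dependency-free for illustration.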

Figure 7 indicates that there are only a few strongly connected vertices, while the majority are connected only to certain other, relevant information. Excluding those strongly connected nodes, the graph is able to depict a meaningful relationship path between


Figure 7: Degree centrality for all datasets (ASC, Charging S., Infusion Pump, CCHIT, GANNT, CM-1, WARC), cut above 50, greatest outlier at 2,130


●●

●●●●

●●●●●●●●●●●●

●●●●●

●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●

●●

●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●

●●●●●●●●●●●●●●

●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●

●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●

●●●

●●

●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●

●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●

●●●●●●●

●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●

●●●●

●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●

●●●●●

●●●●●●●●●●●●●●●●●●●●●●●

●●●

●●●

●●●●

●●

●●

●●●

●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●

●●●●

●●●●●●●

●●

●●●●●●●●●

●●●

●●

●●

●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●

●●●●

●●●●●●●●●●●●●●●●

●●●●●●●

●●●●●●

●●

●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●

●●●●●

●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●

●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●

●●●●●●●●●●●●

●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●

●●●

●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●

●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●

●●●●●

●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●

●●●●●●●

●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●

●●●

●●●●●

●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●

●●●●

●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●

●●●●●●●●●●●●●●●●

●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●

●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●

●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●

●●

●●

●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●

●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●

●●

●●

●●●

●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●

●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●

●●●●●

●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●●

●●●●●●●●●●●●●●●●●●●

[Figure 8: Shortest Path Lengths for Trace Links — path lengths (axis ticks 0, 5, 10, 15, 20) for the Infusion Pump, CCHIT, GANNT, CM−1, and WARC datasets.]

single vertices. Figure 8 shows that the lengths of the shortest relationship paths stay constant, i.e., even in larger datasets such relationships remain comprehensible.
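The path lengths in Figure 8 correspond to shortest paths between graph vertices, which can be computed with a plain breadth-first search. A minimal sketch over a hypothetical miniature relation graph (the vertex names are illustrative, not taken from the evaluated datasets):

```python
from collections import deque

def shortest_path(graph, start, goal):
    """Breadth-first search returning one shortest path between two vertices."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        vertex = path[-1]
        if vertex == goal:
            return path
        for neighbor in graph.get(vertex, ()):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no connecting path exists

# Hypothetical miniature graph: vertices stand for noun phrases/predicates.
graph = {
    "pump": ["deliver", "alarm"],
    "deliver": ["pump", "dose"],
    "dose": ["deliver", "limit"],
    "alarm": ["pump"],
    "limit": ["dose"],
}
print(shortest_path(graph, "pump", "limit"))  # ['pump', 'deliver', 'dose', 'limit']
```

Because BFS explores vertices in order of distance, the first path that reaches the goal is guaranteed to be a shortest one.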

However, there is still room for improvement, as well as for alternative interpretations of how information should be partitioned and connected. For example, we merge noun phrase vertices without considering their adjectives or coreference. In some cases, however, the adjective may crucially impact the meaning, e.g., “pacific ocean” vs. (any) “ocean” in Example 3.1.

Furthermore, we do not merge or connect semantically similar vertices. Two or more words/phrases are semantically similar if they have the same meaning but different spelling. There are different approaches to identify such semantic similarities. A very common approach is word embeddings such as word2vec [20] or GloVe [23]. They rely on the distributional hypothesis that similar words or phrases are used in similar contexts. Such word embeddings are built as high-dimensional mathematical vectors that allow calculating the distance between words, respectively their contexts. This also causes words of opposite meaning to be related to each other, because they occur in the same contexts [22]. Another approach is a database that contains known similarities, e.g., WordNet6. Such databases usually contain only common similarities and are not aware of technical terms.
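Distances between embedding vectors are typically measured by cosine similarity. A minimal sketch with toy three-dimensional vectors (real word2vec/GloVe embeddings have hundreds of dimensions, and the values below are invented for illustration):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy "embeddings"; in a real model these come from training on a corpus.
ocean = [0.9, 0.1, 0.2]
sea   = [0.8, 0.2, 0.1]
pump  = [0.1, 0.9, 0.7]

print(cosine_similarity(ocean, sea))   # high: words used in similar contexts
print(cosine_similarity(ocean, pump))  # low: words used in different contexts
```

Note that this measure only reflects contextual co-occurrence; as discussed above, antonyms can also end up with high similarity because they share contexts.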

Also, there is no identification of common phrases. We plan to use tf-idf to downweight vertices whose phrases are commonly used. While a graph algorithm may also downgrade such vertices (e.g., based on their edge count), it has only a local scope, not a global one (based on general corpora, e.g., newspaper texts) in which these phrases are common even though they are not frequent in our datasets.
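The intended weighting can be sketched with the standard tf-idf formula: a term that occurs in many documents receives a low weight, a rare term a higher one. The requirement fragments below are hypothetical placeholders:

```python
import math

def tfidf(term, doc, corpus):
    """tf-idf of a term in one document relative to a corpus of documents."""
    tf = doc.count(term) / len(doc)          # term frequency in the document
    df = sum(1 for d in corpus if term in d) # number of documents containing the term
    idf = math.log(len(corpus) / df)         # assumes df > 0
    return tf * idf

# Hypothetical requirement fragments, tokenized into word lists.
docs = [
    ["the", "system", "shall", "log", "errors"],
    ["the", "system", "shall", "notify", "the", "operator"],
    ["the", "operator", "confirms", "the", "alarm"],
]
print(tfidf("system", docs[0], docs))  # common term -> low weight
print(tfidf("log", docs[0], docs))     # rare term -> higher weight
```

Computing the idf part on a general corpus (e.g., newspaper texts) instead of `docs` would give exactly the global scope described above.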

6 APPLICATION: SEMANTIC SEARCH

Since the graph includes structural and semantic vertices, a relationship may be found via structural or semantic artifacts. If all vertices in a graph are connected through paths, each vertex has a (depictable) relationship to all others. Different search applications are conceivable, depending on the query, the relationships between artifacts (i.e., graph characteristics), and the targets.

6.1 Spreading Activation

We plan to use Spreading Activation as a semantic search algorithm to find, for a given query, all related information (targets). We will use it to create a candidate list that sorts all (reachable) targets for a query by their relatedness.

Spreading Activation has its origin in the field of psychology. It is a theoretical model of how our mind connects information and tries to find an appropriate context with associated terms for a new word. The basic assumption is that more relevant terms are highly interconnected, while less relevant ones are weakly connected or not connected at all. This model has been applied to various scientific areas such as IR [7].

Spreading Activation works on graphs and consists of three phases. In the initial phase, the start vertices representing the query are activated, i.e., they are assigned an initial activation value. During the spreading phase, this activation is distributed stepwise over the graph, i.e., the activation of a vertex is transferred to related (connected) vertices. These steps are called pulses, and at the end of each pulse a termination condition is checked to stop the pulsation. In the end phase, the activation values are used to order all vertices by relevance. Depending on the type of elements searched, the result list can also be filtered, but the values are not used for classification (e.g., by a threshold), since they are not limited by an upper or lower bound and are not directly comparable.
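The three phases can be sketched as follows. This is a minimal, simplified variant: the graph, the fixed pulse count as termination condition, and the decay factor (one possible restriction, cf. the discussion of Berthold et al. [1] below) are illustrative assumptions, not the configuration used in the paper:

```python
def spread(graph, start, pulses=2, decay=0.5):
    """Pulse-based spreading activation; decay acts as a restriction."""
    activation = {v: 0.0 for v in graph}
    for v in start:                       # initial phase: activate query vertices
        activation[v] = 1.0
    for _ in range(pulses):               # spreading phase, fixed-pulse termination
        incoming = {v: 0.0 for v in graph}
        for v, neighbors in graph.items():
            for n in neighbors:           # transfer attenuated activation to neighbors
                incoming[n] += decay * activation[v]
        for v in graph:
            activation[v] += incoming[v]
    # end phase: rank all vertices by their accumulated activation
    return sorted(activation, key=activation.get, reverse=True)

# Hypothetical graph; "query" is the start vertex representing the query.
graph = {
    "query": ["pump", "alarm"],
    "pump": ["deliver"],
    "alarm": ["pump"],
    "deliver": [],
}
print(spread(graph, start=["query"]))
```

Vertices reachable over several short paths (here "pump") accumulate more activation than vertices on the periphery (here "deliver"), which yields the relevance ordering used for the candidate list.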

6 https://wordnet.princeton.edu/

There are different ways to configure the spreading of the activation values. Berthold et al. [1] proved that restricting the spreading is necessary; otherwise, the approaches equivalently converge to a query-independent state in which the same central vertices are returned as the result for every query.

7 RELATED WORK

Other semantic approaches often deal with semantic distance models between single parts (e.g., words or phrases). Mahmoud and Niu [17] use various semantically enabled IR methods such as VSM including thesaurus or part-of-speech support, LSI, LDA, explicit semantic analysis, and normalized Google distance. They calculate vectors for each word and try to find related documents based on small distances. Their results revealed that explicit semantic methods tend to do a better job than latent methods.

Guo et al. [10] also focus on single words, using word embeddings as a semantic enhancement to train several RNNs in a deep learning approach. They focus on automatically generated trace links and achieve a higher MAP than the existing approaches VSM and LSI on their unpublished dataset.

Guo et al. [11] present a technique to build an ontology semi-automatically. They focus on term mismatches between related documents. Compared to a classification technique, which performs best when sufficient training data is available, the ontology shows improvements because it aims to add semantics that enable higher levels of reasoning. The big disadvantage is the effort to create an ontology. Another benefit of the ontology is that it forms the basis for textual explanations of trace links [12].

Hartig et al. [13, 14] use Spreading Activation to find related hazard analysis and risk assessments (HARA). Initially, they map a given class diagram including information such as classes, properties, instances, and data values to a Web Ontology Language7 model. Using this model, an ontology is automatically created from existing HARA. During the search phase, they first apply Spreading Activation to the ontology to find the relevant subnetwork. In a second step, they filter the results within this subnetwork by the sought-after type, sorted by their assigned activation. Finally, they generate a textual explanation for the user, derived from spread graphs by Michalke et al. [19].

8 CONCLUSION

In this paper, we present a novel approach for knowledge extraction using semantic relations between parts of natural language, stored in a knowledge graph. The approach is fully automated, has no prerequisites for the natural language requirements (except the English language), and scales to various corpus sizes. The graph and its semantic relations are independent of the syntax of the natural language and able to depict user-comprehensible paths between distantly linked statements. We propose to use Spreading Activation as a semantic search algorithm and to support the user by providing an explanation for each result.

7 https://www.w3.org/TR/owl2-overview/

REFERENCES

[1] Michael R. Berthold, Ulrik Brandes, Tobias Kötter, Martin Mader, Uwe Nagel, and Kilian Thiel. 2009. Pure spreading activation is pointless. In Conference on Information and Knowledge Management (CIKM). Association for Computing Machinery (ACM), Beijing, China, 1915–1918. https://doi.org/10.1145/1645953.1646264

[2] Markus Borg, Per Runeson, and Anders Ardö. 2014. Recovering from a decade: A systematic mapping of information retrieval approaches to software traceability. Empirical Software Engineering (EMSE) 19, 6 (2014), 1565–1616. https://doi.org/10.1007/s10664-013-9255-y

[3] Alexander Borgida, Sol Greenspan, and John Mylopoulos. 1985. Knowledge Representation as the Basis for Requirements Specifications. 152–169. https://doi.org/10.1007/978-3-642-70840-4_13

[4] Xavier Carreras and Lluís Màrquez. 2004. Introduction to the CoNLL-2004 Shared Task: Semantic Role Labeling. In Computational Natural Language Learning (CoNLL). Association for Computational Linguistics (ACL), Boston, MA, USA, 89–97.

[5] Xavier Carreras and Lluís Màrquez. 2005. Introduction to the CoNLL-2005 Shared Task: Semantic Role Labeling. In Computational Natural Language Learning (CoNLL). Association for Computational Linguistics (ACL), Ann Arbor, MI, USA, 152–164.

[6] Marcelo Cataldo and James D. Herbsleb. 2011. Factors Leading to Integration Failures in Global Feature-oriented Development: An Empirical Analysis. In International Conference on Software Engineering (ICSE). Association for Computing Machinery (ACM), Waikiki, Honolulu, HI, USA, 161–170. https://doi.org/10.1145/1985793.1985816

[7] Fabio Crestani. 1997. Application of Spreading Activation Techniques in Information Retrieval. Artificial Intelligence Review 11, 6 (1997), 453–482. https://doi.org/10.1023/A:1006569829653

[8] Diego Dermeval, Jéssyka Vilela, Ig Ibert Bittencourt, Jaelson Castro, Seiji Isotani, Patrick Brito, and Alan Silva. 2016. Applications of ontologies in requirements engineering: A systematic review of the literature. In Requirements Engineering (RE). Springer, Beijing, China, 405–437. https://doi.org/10.1007/s00766-015-0222-6

[9] Alessio Ferrari, Giorgio Oronzo Spagnolo, and Stefania Gnesi. 2017. PURE: A Dataset of Public Requirements Documents. In 2017 IEEE 25th International Requirements Engineering Conference (RE). IEEE, Lisbon, Portugal, 502–505. https://doi.org/10.1109/RE.2017.29

[10] Jin Guo, Jinghui Cheng, and Jane Cleland-Huang. 2017. Semantically Enhanced Software Traceability Using Deep Learning Techniques. In International Conference on Software Engineering (ICSE). IEEE, Buenos Aires, Argentina, 3–14. https://doi.org/10.1109/ICSE.2017.9

[11] Jin Guo, Marek Gibiec, and Jane Cleland-Huang. 2017. Tackling the term-mismatch problem in automated trace retrieval. Empirical Software Engineering (EMSE) 22, 3 (2017), 1103–1142. https://doi.org/10.1007/s10664-016-9479-8

[12] Jin Guo, Natawut Monaikul, and Jane Cleland-Huang. 2015. Trace links explained: An automated approach for generating rationales. In Requirements Engineering (RE). IEEE, Ottawa, Canada, 202–207. https://doi.org/10.1109/RE.2015.7320423

[13] Kerstin Hartig. 2019. Entwicklung eines Information-Retrieval-Systems zur Unterstützung von Gefährdungs- und Risikoanalysen. Ph.D. Dissertation. Technische Universität Berlin. https://doi.org/10.14279/depositonce-8408

[14] Kerstin Hartig and Thomas Karbe. 2016. Recommendation-Based Decision Support for Hazard Analysis and Risk Assessment. In Conference on Information, Process, and Knowledge Management (eKNOW). International Academy, Research and Industry Association (IARIA), Venice, Italy, 108–111. https://doi.org/10.14279/depositonce-6974

[15] Jane Huffman Hayes, Jared Payne, and Mallory Leppelmeier. 2019. Toward Improved Artificial Intelligence in Requirements Engineering: Metadata for Tracing Datasets. In International Requirements Engineering Conference Workshops (REW). IEEE, Jeju Island, South Korea, 256–262. https://doi.org/10.1109/REW.2019.00052

[16] Luheng He, Kenton Lee, Mike Lewis, and Luke Zettlemoyer. 2017. Deep Semantic Role Labeling: What Works and What’s Next. In Association for Computational Linguistics (ACL). Vancouver, Canada, 473–483. https://doi.org/10.18653/v1/p17-1044

[17] Anas Mahmoud and Nan Niu. 2015. On the role of semantics in automated requirements tracing. In Requirements Engineering (RE). Springer, Ottawa, Canada, 281–300. https://doi.org/10.1007/s00766-013-0199-y

[18] Christopher D. Manning, Mihai Surdeanu, John Bauer, Jenny Finkel, Steven J. Bethard, and David McClosky. 2014. The Stanford CoreNLP Natural Language Processing Toolkit. In System Demonstrations. Association for Computational Linguistics (ACL), Baltimore, MD, USA, 55–60.

[19] Vanessa N. Michalke and Kerstin Hartig. 2016. Explanation Retrieval in Semantic Networks. In Knowledge Discovery, Knowledge Engineering and Knowledge Management (IC3K). SciTePress, Porto, Portugal, 291–298. https://doi.org/10.14279/depositonce-7136

[20] Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. 2013. Efficient Estimation of Word Representations in Vector Space. In International Conference on Learning Representations (ICLR). Scottsdale, AZ, USA. https://arxiv.org/abs/1301.3781

[21] Chris Mills. 2017. Towards the automatic classification of traceability links. In Automated Software Engineering (ASE). Urbana-Champaign, IL, USA, 1018–1021. https://doi.org/10.1109/ASE.2017.8115723

[22] Nikola Mrkšić, Diarmuid Ó Séaghdha, Blaise Thomson, Milica Gašić, Lina Maria Rojas-Barahona, Pei-Hao Su, David Vandyke, Tsung-Hsien Wen, and Steve J. Young. 2016. Counter-fitting Word Vectors to Linguistic Constraints. In Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Association for Computational Linguistics (ACL), San Diego, CA, USA, 142–148. https://doi.org/10.18653/v1/N16-1018

[23] Jeffrey Pennington, Richard Socher, and Christopher D. Manning. 2014. GloVe: Global Vectors for Word Representation. In Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics (ACL), Doha, Qatar, 1532–1543. https://doi.org/10.3115/v1/d14-1162

[24] Marcel Robeer, Garm Lucassen, Jan Martijn E. M. van der Werf, Fabiano Dalpiaz, and Sjaak Brinkkemper. 2016. Automated Extraction of Conceptual Models from User Stories via NLP. In Requirements Engineering (RE). IEEE, Beijing, China, 196–205. https://doi.org/10.1109/RE.2016.40

[25] Magnus Sahlgren. 2008. The distributional hypothesis. Italian Journal of Linguistics 20, 1 (2008), 33–54.

[26] Aaron Schlutter and Andreas Vogelsang. 2018. Knowledge Representation of Requirements Documents Using Natural Language Processing. In Natural Language Processing for Requirements Engineering (NLP4RE) (CEUR Workshop Proceedings), Vol. 2075. RWTH Aachen, Utrecht, Netherlands. https://doi.org/10.14279/depositonce-7776

[27] Andreas Vogelsang. 2020. Feature dependencies in automotive software systems: Extent, awareness, and refactoring. Journal of Systems and Software (JSS) 160 (2020). https://doi.org/10.1016/j.jss.2019.110458