SERVICE ORIENTED QUALITY REQUIREMENT FRAMEWORK FOR CLOUD COMPUTING
R.M.M.W RATHNAYAKE
SUPERVISED BY DR. W.M.J.I WIJAYANAYAKE
Aug 18, 2015
OVERVIEW
• Introduction
• Objective
• Literature Review
• Conceptual Model
• Data Collection
• Data Analysis
• Conclusion
• Limitation
• Recommendation
• Future Avenue
INTRODUCTION
Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources.
(The NIST Definition of Cloud Computing September, 2011.)
Service-orientation is a way of thinking in terms of services, service-based development, and the outcomes of services.
INTRODUCTION
Service Quality
OBJECTIVE
• Identifying cloud service quality requirements at functional level and Runtime level
• Identifying cloud computing attributes which are supported for value creation
• Identifying most effective quality requirements at each phases (Functional and runtime )
• Based on that, develop a service oriented quality requirement Framework for cloud computing
LITERATURE REVIEW
Models / frameworks and their focus of study:
• SERVQUAL requirement model [S. Negaesh et. al., 2003]
• Yu's quality measurement model for web services [Yu et. al., 2006]: QoWS is categorized into two levels, business and runtime, with separate quality factors identified under each.
• IaaS (Infrastructure as a Service) requirements of cloud users [Rochwerger et. al., 2009]: focused on the main quality requirements for cloud services at the IaaS level.
LITERATURE REVIEW (CONT.)
• NIST model for cloud characteristics: on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service.
• Yu's dimensional model for deploying & managing web services [Yu et. al., 2006]: interoperability, web security/privacy, quality of web service, management.
• Arsanjani's layered SOA architecture [Arsanjani A., 2004]
• ITM Framework on Cloud Computing Environment [Arabalidousti F. et al., 2014]: service management; security, resiliency, performance & consumption; governance of IT.
• Brandis's dimensional model for cloud governance [Brandis K. et. al., 2013]: strategic alignment, value delivery, risk management, resource management, and performance measurement.
CONCEPTUAL MODEL
DIMENSIONS OF SOCC (SERVICE ORIENTED CLOUD COMPUTING)
Dimension: indicators (variable names)
• Interoperable service architecture (INT): systematic interoperability (INT1), semantic interoperability (INT2)
• Cloud service management (CSM): service provisioning management (CSM3), business and operational support management (CSM4)
• Service measurement (SM): service billing (BIL5), service monitoring (SMN6)
• On demand self service (OND): service provisioning capability (SPR7)
• Shared resource pooling (SRP): multi-tenant model (MLT8), capability of assigning different physical and virtual resources dynamically (RES9)
DIMENSIONS OF SOCC (CONT.)
• Broad network access (BNA): access over the networks (NET10), access over client platforms (CLP11)
• Rapid elasticity (RE): number of versions released (V12), availability at any given time (AVI13)
FUNCTIONAL LEVEL QUALITY REQUIREMENT INDICATORS
Dimension: factors (variable names)
• User friendliness (F_UF), the physical features of the system, such as whether the system is appealing and looks good: attractiveness of the application (F_UF1), consistency (F_UF2), understandability (F_UF3)
• Reliability (F_REL), focusing on whether the system is right, useful, and dependable: relevance (F_REL1), dependability (F_REL2), cost benefit (F_REL3), accuracy (F_REL4)
FUNCTIONAL LEVEL QUALITY REQUIREMENT INDICATORS (CONT.)
• Responsiveness (F_RES), the readiness of the service to provide service: service time (F_RES1)
• Assurance (F_AS), the knowledge and courtesy expressed in the system and its ability to inspire trust and confidence in its safety: service transparency / SLAs (F_AS1), reputation (F_AS2), information security (F_AS3), completeness (F_AS4), sufficiency (F_AS5)
• User orientation (F_UO), individualized attention: customization of the application (F_UO1)
QUALITY REQUIREMENT INDICATORS AT RUNTIME
Dimension: factors (variable names)
• User friendliness (R_UF): throughput (R_UF1), number of active sessions / concurrency level (R_UF2), resource allocation (R_UF3), redundancy (R_UF4)
• Reliability (R_REL): dependability (R_REL1), recoverability (R_REL2)
• Responsiveness (R_RES): response time (R_RES1)
• Assurance (R_ASS): availability (R_ASS1), accessibility (R_ASS2), data security (R_ASS3), technical support service (R_ASS4)
• User orientation (R_UO): integrity (R_UO1)
DATA COLLECTION
• Pilot survey: 10 respondents
• Sample: graduates of the 2009 batch from the Department of Industrial Management who have industry exposure and knowledge
• Main survey: 53 respondents
• Online questionnaire
• Focus on IT companies
Likert scale (response: scale value):
Superior: 5, Somewhat satisfactory: 4, About average: 3, Somewhat unsatisfactory: 2, Very poor: 1
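The Likert coding above can be expressed as a simple lookup before analysis; a minimal sketch (the function name and structure are illustrative, not taken from the study's tooling):

```python
# Map the five Likert responses used in the questionnaire to their scale values.
LIKERT_SCALE = {
    "Superior": 5,
    "Somewhat satisfactory": 4,
    "About average": 3,
    "Somewhat unsatisfactory": 2,
    "Very poor": 1,
}

def encode_responses(responses):
    """Convert a list of Likert response labels into numeric scale values."""
    return [LIKERT_SCALE[label] for label in responses]

print(encode_responses(["Superior", "About average", "Very poor"]))  # [5, 3, 1]
```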
DATA ANALYSIS
Company name: respondents
• JKH: 3
• Attune Lanka Consulting: 9
• Pearson Lanka: 2
• Leapset: 5
• YooFoo Technologies: 3
• Electro Scientific Industries (USA): 1
• IFS: 8
• Platform1: 2
• Navantis IT Pvt Ltd: 4
• Rezgateway Pvt Ltd.: 4
• Virtusa: 5
• Hsenid: 4
• Excelsoft Global: 3
• Total: 53
[Pie chart: distribution of respondents by company]
DATA ANALYSIS
Areas of expertise: respondents (%)
• Software Engineering: 43 (81.13%)
• Network systems and data communications analysis: 11 (20.75%)
• Network and computer systems administration: 13 (24.53%)
• Computer support specialization: 22 (41.51%)
• Business Analysis: 7 (13.21%)
• Database administration: 8 (15.09%)
• Software Quality Assurance: 12 (22.64%)
Experience level: respondents (%)
• <= 1 year: 8 (15.09%)
• 2-4 years: 26 (49.06%)
• 5-7 years: 15 (28.30%)
• Above 7 years: 4 (7.55%)
DATA ANALYSIS
[Pie charts: distribution of respondents by area of expertise and by experience level]
DATA ANALYSIS – SERVICE PROVIDERS
Cloud service provider: responses (%)
• Microsoft Azure: 28 (52.83%)
• IBM: 13 (24.53%)
• Google App Engine: 21 (39.62%)
• SAP HANA: 9 (16.98%)
• Amazon Web Services (AWS): 5 (9.43%)
• CISCO: 3 (5.66%)
• Mango Apps: 2 (3.77%)
• Other: 2 (3.77%)
[Pie chart: share of responses by cloud service provider]
DATA ANALYSIS
Service type: responses
• Software as a service: 52
• Platform as a service: 47
• Infrastructure as a service: 16
• Other: 0
[Pie chart: responses by service type]
STATISTICAL ANALYSIS
• The hypothesis starts with a causal model
• The model is tested against the obtained data
• The operationalization then allows testing the relationships between the concepts
Confirmatory modeling: SEM (Structural Equation Modeling)
SEM TECHNIQUE: PLS (PARTIAL LEAST SQUARES)
• Among SEM techniques, Partial Least Squares (PLS) is a well-established technique for estimating path coefficients in structural models and has been widely used in research studies [Gefen et. al., 2000]
• A path model consists of three components:
  • Structural model: the inner (graphical) model
  • Measurement model: the connections between latent variables (LVs) and manifest variables (MVs), referred to as the measurement or outer model
  • Weighting scheme: estimation of the inner weights of the PLS algorithm
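In the measurement (outer) model, each latent variable is approximated as a weighted combination of its manifest variables. A minimal sketch of that idea, with purely illustrative weights and indicator values (not from the study):

```python
def latent_score(indicator_values, outer_weights):
    """Approximate a latent variable score as the weighted sum of its
    standardized manifest (indicator) values."""
    assert len(indicator_values) == len(outer_weights)
    return sum(v * w for v, w in zip(indicator_values, outer_weights))

# Illustrative standardized indicator values and outer weights for one construct.
score = latent_score([0.5, -0.2, 1.1], [0.4, 0.3, 0.3])
print(round(score, 2))  # 0.47
```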
PLS MODEL
PLS MODEL – PARTIAL LEAST SQUARES VALUES
LESS RELIABLE INDICATORS
Churchill (1979) recommends eliminating reflective indicators from measurement models if their outer standardized loadings are smaller than 0.4. This relaxed cut-off applies only when an indicator's reliability is low; otherwise the threshold should be 0.7.
Indicators with low outer loadings, by construct:
• SOCC dimensions: INT1 (0.0713), CSM4 (0.2909), BILL5 (0.1617), SPR7 (0.1107), MLT8 (-0.0936), CLP11 (0.3585), AVI13 (-0.1205)
• Quality requirements at functional level: F_AS2 (0.2788), F_REL2 (0.2455), F_UO1 (-0.0906)
• Quality requirements at runtime: R_ASS4 (0.3550), R_ASS3 (0.2592), R_REL1 (0.0390), R_REL2 (0.3478), R_UF3 (0.2066), R_UF4 (0.1666), R_UO1 (0.3912)
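The Churchill (1979) elimination rule can be sketched as a simple filter over the loadings; the dictionary below reproduces the runtime column of the table above (an illustration, not the study's actual scripts):

```python
# Outer standardized loadings for the runtime-level indicators (from the table above).
runtime_loadings = {
    "R_ASS4": 0.3550, "R_ASS3": 0.2592, "R_REL1": 0.0390,
    "R_REL2": 0.3478, "R_UF3": 0.2066, "R_UF4": 0.1666, "R_UO1": 0.3912,
}

def unreliable_indicators(loadings, threshold=0.4):
    """Return the indicators whose outer standardized loading is below the threshold."""
    return sorted(name for name, loading in loadings.items() if loading < threshold)

# Every runtime indicator listed here falls below the 0.4 cut-off.
print(unreliable_indicators(runtime_loadings))
```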
DERIVED MODEL
RELIABILITY STATISTICS
Before eliminating the less reliable indicators:
• SOCC service performance dimensions: AVE 0.1804, Composite Reliability 0.6280, R Square 0, Cronbach's Alpha 0.4908, Communality 0.1804
• Service Quality Requirement: AVE 0.1961, Composite Reliability 0.8409, R Square 0.0585, Cronbach's Alpha 0.8035, Communality 0.1961
• Functional requirement: AVE 0.2537, Composite Reliability 0.8039, R Square 0.9047, Cronbach's Alpha 0.7438, Communality 0.2537
• Runtime Requirement: AVE 0.1973, Composite Reliability 0.7176, R Square 0.7765, Cronbach's Alpha 0.5931, Communality 0.1973
After elimination (reliability improvement):
• SOCC service dimensions: AVE 0.3629, Composite Reliability 0.7673, R Square 0, Cronbach's Alpha 0.6362, Communality 0.3629
• Service Quality requirement: AVE 0.2570, Composite Reliability 0.8634, R Square 0.0538, Cronbach's Alpha 0.8321, Communality 0.2570
• Functional level requirements: AVE 0.3087, Composite Reliability 0.8282, R Square 0.9146, Cronbach's Alpha 0.7717, Communality 0.3087
• Quality Requirements at Runtime: AVE 0.2761, Composite Reliability 0.7407, R Square 0.7877, Cronbach's Alpha 0.6106, Communality 0.2761
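AVE and composite reliability in the tables above are conventionally computed from the standardized outer loadings. A sketch of the standard formulas, with illustrative loadings (not values from the study):

```python
def ave(loadings):
    """Average Variance Extracted: the mean of the squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each error variance is 1 - loading^2."""
    total = sum(loadings)
    error = sum(1 - l * l for l in loadings)
    return total * total / (total * total + error)

# Illustrative loadings for a three-indicator construct.
example = [0.7, 0.8, 0.6]
print(round(ave(example), 4))                    # 0.4967
print(round(composite_reliability(example), 4))  # 0.7449
```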
T-VALUES
• Confidence level = 90%; significance level = 0.1
• The hypothesized paths between the constructs are considered significant if the t-value is greater than 1.6747
• All t-values > 1.6747
T Statistics (|O/STERR|):
• Cloud Dimension -> Service Quality requirement: 1.8223
• Service Quality requirement -> Functional requirement: 60.523
• Service Quality requirement -> Runtime Requirement: 22.002
• Cloud Dimension -> Functional requirement: 1.8115
• Cloud Dimension -> Runtime Requirement: 1.8363
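The significance test above reduces to comparing each bootstrap t-statistic with the one-tailed critical value of 1.6747 at the 0.1 level; a quick check over the listed paths (t-values copied from the slide):

```python
T_CRITICAL = 1.6747  # one-tailed critical t-value at the 0.1 significance level

# Path t-statistics (|O/STERR|) reported in the study.
path_t_values = {
    "Cloud Dimension -> Service Quality requirement": 1.8223,
    "Service Quality requirement -> Functional requirement": 60.523,
    "Service Quality requirement -> Runtime Requirement": 22.002,
    "Cloud Dimension -> Functional requirement": 1.8115,
    "Cloud Dimension -> Runtime Requirement": 1.8363,
}

significant = {path: t > T_CRITICAL for path, t in path_t_values.items()}
print(all(significant.values()))  # True: every hypothesized path is significant
```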
MAIN PATH COEFFICIENTS
• Significance: p-value < 0.01
• If the path coefficient is > 0.01, the hypothesis is accepted
• Therefore the null hypothesis is rejected
• H1: Runtime level quality requirements are positively related to service quality requirements of users. Path coefficient: 1.3253. Accepted.
• H2: Functional level quality requirements are positively related to service quality requirements of users. Path coefficient: 2.4538. Accepted.
• H3 (main): Service quality requirements are positively related to SOCC service dimensions. Path coefficient: 0.1064. Accepted.
PATH COEFFICIENT OF DERIVED MODEL
[Diagram: SOCC dimensions -> Quality requirement -> Functional level quality requirement and Quality requirement at runtime]
• SOCC dimensions -> Quality requirement: P = 0.1064, T = 1.823
• Quality requirement -> Functional level quality requirement: P = 2.4538, T = 60.523
• Quality requirement -> Quality requirement at runtime: P = 1.3253, T = 22.002
• SOCC dimensions -> Functional level quality requirement: T = 1.8115
• SOCC dimensions -> Quality requirement at runtime: T = 1.8363
Significance: P value < 0.01; T value > 1.674
OTHER FINDINGS
• The individual behaviour of each Y variable was examined against the SOCC dimensions
• The individual t-values show a strong relationship with each of the functional and runtime requirements
• Strength: functional > runtime
CONCLUSION
The impact of functional level quality requirements is higher than that of runtime level quality requirements when determining the service quality of cloud services.
[Diagram: functional level quality requirements and runtime level quality requirements feed into the service quality of cloud services]
CONCLUSION (CONT.)
The importance of quality attributes of cloud services at the functional level, in priority order:
1. Accuracy
2. Relevance (service alignment)
3. Attractiveness of the application
4. Sufficiency
5. Cost benefit
6. Service transparency SLA (support service)
7. Completeness
8. Information security
9. Service time
CONCLUSION (CONT.)
The importance of quality attributes of cloud services at the runtime level, in priority order:
1. Response time
2. Throughput
3. Availability
4. Integrity
5. Accessibility
6. Recoverability
7. Data security
CONCLUSION (CONT.)
• The relationship between functional level quality requirements and the SOCC dimensions is stronger than the relationship between runtime level quality requirements and the SOCC dimensions.
• Therefore functional level quality requirements have a greater impact on the overall service quality of cloud services.
LIMITATIONS
• The perception of service quality requirements can differ across the spectrum of domains in which cloud services are used
• The collected data is geographically biased because wider coverage was not feasible
• The total population of cloud users in Sri Lanka is uncertain
RECOMMENDATIONS
• This framework can be used as a benchmark to assess the service discrepancies of cloud services from both perspectives: provider and user
• Minimize the service gap by standardizing quality so as to improve the satisfaction level of users
• The outcome of the research may remain valid for the next 5 years, since service-oriented cloud technology is continuously improving
FUTURE AVENUE
• Extend the analysis domain-wise (e.g., development, manufacturing, the service sector); the prioritization of variables can then be attributed per domain
• Improve the framework from the perspective of IT governance, including assessment of continuous performance and continuous quality management and improvement
• Thereby it becomes possible to develop an ontology for the quality assurance of cloud implementations
THANK YOU !!
Q&A