Accolades
Firstly, I’d like to thank everyone attending for taking time out of their day and allowing me to present…
This methodology was developed for military application; this presentation is therefore the theory of applying that same methodology to a more consumer-based model…
Cloud Burst Methodology
• Methodology
• Explanations of Cloud Burst Methods
• Network Access Methodologies – Inclusive of Cloud Bursting
• Identity Management Methodologies, Inclusive of Cloud Bursting
• Special Purpose Computing – Cloud Bursting
• Business and Use Cases Applied to Special Purpose Computing
• High-Level Architecture Walkthrough
• Q & A
Cloud Burst Methodology
Cloud Burst Methodology is the practical theory, and application, of forming an autonomous cloud computing fabric: a dedicated fabric, established for a single use, to carry out a process, mission, or directive. Afterward, that fabric’s contents are warehoused and the core destroyed, eliminating any access structure and any access to the data. The data is warehoused in a separate environment with exclusionary access rights, different from and in contrast to the rights assigned within the special purpose fabric.
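The lifecycle above can be sketched in code. This is a minimal, hypothetical illustration; the `Fabric` class, its method names, and the in-memory "warehouse" are assumptions for clarity, not an actual API:

```python
# Hypothetical sketch of the single-use fabric lifecycle: create,
# execute, warehouse the contents, then destroy the core and its keys.
import secrets

class Fabric:
    """A dedicated, single-use compute fabric."""
    def __init__(self, mission: str):
        self.mission = mission
        self.access_key = secrets.token_hex(16)  # spontaneous access key
        self.data: list[str] = []
        self.destroyed = False

    def run(self, record: str) -> None:
        if self.destroyed:
            raise RuntimeError("fabric core destroyed; no access remains")
        self.data.append(record)

    def burst(self, warehouse: list[str]) -> None:
        # Warehouse the contents, then destroy the core and its key,
        # eliminating any remaining access structure.
        warehouse.extend(self.data)
        self.data.clear()
        self.access_key = None
        self.destroyed = True

warehouse: list[str] = []
fabric = Fabric("transaction-processing")
fabric.run("txn-001")
fabric.burst(warehouse)
assert warehouse == ["txn-001"] and fabric.destroyed
```

Note that the warehouse is a separate structure with its own access rights; once `burst` runs, the fabric itself can no longer be used.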
Cloud Burst Methodologies - Explained
The concept of ‘Cloud Burst Methodology’ came from a project named MVII (Vehicle Intelligence Initiative for Mobility) and its use of a technique termed ad-hoc VPN spawning. As one MVII-equipped car approached another, an ad-hoc VPN was created via a spontaneously generated network access key, then piggybacked to the POP (Point of Presence). Because the process was spontaneously created, the chance of penetration or intrusion was almost negligible.
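A rough sketch of the spontaneous-key idea: each vehicle encounter derives a fresh one-time session key that never persists beyond the session. The function name and derivation are illustrative assumptions; a real deployment would use an authenticated key exchange (e.g. ephemeral Diffie-Hellman), not a bare hash:

```python
# Illustrative: a spontaneously created session key per vehicle encounter.
import hashlib
import secrets

def spawn_session_key(vehicle_a: str, vehicle_b: str) -> bytes:
    nonce = secrets.token_bytes(32)           # spontaneously created
    material = f"{vehicle_a}|{vehicle_b}".encode() + nonce
    return hashlib.sha256(material).digest()  # 256-bit session key

k1 = spawn_session_key("MVII-001", "MVII-002")
k2 = spawn_session_key("MVII-001", "MVII-002")
assert k1 != k2  # every encounter yields a fresh, unpredictable key
```

Because the key exists only for the duration of one encounter, there is no standing credential for an attacker to capture and reuse.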
Network Access Methodologies – Explained
The key purpose of network or systems security is to create an environment where security ‘remediation’ is osmotic, establishing a healthy environment where CISOs can rectify issues. The key issue here is the word remediate, meaning to cure a defect or issue. Even with the current behavioral ‘AI’ technology being implemented within standardized network security platforms, we are still ‘remediating’ network penetration issues rather than proactively avoiding them.
Security Implications
In any network access structure, the more perpetual the time, users, and traffic into and out of even a contained network unit, the greater the possibility of penetration or intrusion, whether by internal or external entities.
Access Management Solutions
Cloud Burst methodologies can, by proxy, mitigate possible penetrations and intrusions, not by instantiating new network protocols, but by limiting access to single-use fabrics, disallowing perpetual access by reserving these units for special purpose computing procedures…
Special Purpose Computing Solutions
• Special Purpose (Use) Computing
– Using special purpose fabrics for single-use computing instances
• Such as transactional processing during high-traffic periods
• Limit Network Accessibility
– By limiting access
• Access Control
– By instantiating spontaneous user controls
• Identity Management
Access Management Architecture
Special purpose computing reduces the chances of intrusion by establishing access rights via 1024-bit encrypted keys, spontaneously created at instantiation. These key pairs have a half-life matching the duration of the user’s access capabilities, and are tied to that specific fabric.
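A hedged sketch of these expiring, fabric-bound keys. The `FabricKey` class is an illustrative assumption, and a random token stands in for actual 1024-bit key generation, which would come from a cryptographic library:

```python
# Illustrative: an access key created at instantiation, with a
# half-life (expiry) and a binding to one specific fabric.
import secrets
import time

class FabricKey:
    def __init__(self, fabric_id: str, half_life_s: float):
        self.fabric_id = fabric_id
        self.secret = secrets.token_bytes(128)  # stands in for a 1024-bit key
        self.expires_at = time.monotonic() + half_life_s

    def valid_for(self, fabric_id: str) -> bool:
        # Valid only for its own fabric, and only until the half-life lapses.
        return fabric_id == self.fabric_id and time.monotonic() < self.expires_at

key = FabricKey("fabric-42", half_life_s=0.05)
assert key.valid_for("fabric-42")
assert not key.valid_for("fabric-43")   # tied to that specific fabric
time.sleep(0.06)
assert not key.valid_for("fabric-42")   # expired with its half-life
```

Once the fabric is destroyed, any surviving copy of the key is useless on two counts: it is bound to a fabric that no longer exists, and its half-life has lapsed.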
Cloud Burst Methodology
• Practical Applications
– Business Cases
– Use Cases
• Best Practices
– Methodology
– Architecture
Practical Applications of a Cloud Burst Methodology
• Originally designed around a military application
– Other potential applications
• High-traffic, high-volume shopping seasons
– PCI and SOX compliance packages
• Compliance-driven arenas, such as health care, specifically in disaster situations
– HIPAA, PCI, and SOX compliance packages
• Mergers and Acquisitions
– SOX and other compliance packages
Business Case – Retail
Problem: The majority of credit, checking account, and identity theft occurs during high-traffic seasonal shopping periods
• Christmas, Thanksgiving, Easter, new releases, and other occasions
Solution: Using a derivative of a Cloud Burst Methodology, onboarding compliance packages, you can process transactions, utilize CRM packages, create management protocols, and destroy the fabric after use, thereby eliminating all access control assigned to that fabric, and any other associations…
Business Case – Medical HIPAA
Problem: In 2014, the clinical application of the new HIPAA laws will come into effect. HIPAA will then govern not only patient care, but also the application of new laws surrounding DLP, network security, and systems hardening…
• Patient file storage
• Patient demographic storage
• Medical records storage
• Patient care applications
Solution: Using a derivative of a Cloud Burst Methodology, onboarding compliance packages, you can establish a special purpose cloud to carry out disaster initiatives, such as Hurricane Katrina or Sandy, for the sole purpose of medical care, storage of documents, patient data, and morgue and autopsy data. This would allow the facility to bring a fully functioning, prebuilt fabric up in minutes. This in turn would allow unfettered access, limited by the half-life of the accessor’s credentials, for a certain time period and/or until that fabric’s life has expired; in turn allowing life-saving medical information to be shared safely, in a secure environment, for the time allotted.
Business Case – Financial
Problem: XXX
Solution: XXX
Introduction to the Stack
• Intro to hardware and support infrastructure
• Hardware
– Vendor Blade Servers
• HP, IBM, and Cisco choices; the original theory was to utilize HP equipment, i.e. 25 ‘C’ series HP Blade Servers
• Software
Stack Architecture
• Hardware
– Vendor Blade Servers
• HP, IBM, and Cisco choices; the original theory was to utilize HP equipment, i.e. 3 pools consisting of 24 ‘C’ series HP Blade Servers (8 per chassis per pool), in (3) NetApp 500 TB FlexPod configurations
• Software
– Orchestration and Server Automation
– Access Management, Identity Management
– Management Portal
– User Authentication Portal
– User Environment
Management Environment
• Administration Portal
– Controls Orchestration
• Creation of Flows
• Delivery of Special Purpose Infrastructure
– Access Control
• Management of Access Control
– Identity Management
• Control and input of Identity Management
Environment
Special Purpose Computing Environment Instantiation
• Request would be initialized by internal ticket
– Flow would be built
• Provided the functionality, and in this case the compliance packages, were pre-built, they would then be injected into the flow
– Package retrieval
• Golden (or root) images, originally a mission protocol, built into image form with precise locations, mission status, mission scope, and so on
– Package instantiation
• Flow would be initialized and executed based on predetermined requirements
• Authentication keys are generated based on prerequisites
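The ticket-to-instantiation flow above can be sketched as follows. The function names, ticket fields, and image/package naming scheme are illustrative assumptions, not the actual orchestration product’s API:

```python
# Illustrative orchestration: internal ticket -> flow -> package
# retrieval (golden image) -> instantiation with generated auth keys.
import secrets

def build_flow(ticket: dict) -> dict:
    # Pre-built compliance packages are injected into the flow.
    packages = [f"{p}-package" for p in ticket["compliance"]]
    return {"mission": ticket["mission"], "packages": packages}

def instantiate(flow: dict) -> dict:
    golden_image = f"golden-{flow['mission']}"  # root image for the mission
    auth_key = secrets.token_hex(16)            # generated at instantiation
    return {"image": golden_image, "packages": flow["packages"], "key": auth_key}

ticket = {"mission": "retail-q4", "compliance": ["PCI", "SOX"]}
env = instantiate(build_flow(ticket))
assert env["image"] == "golden-retail-q4"
assert env["packages"] == ["PCI-package", "SOX-package"]
```

The key point is ordering: the flow is fully assembled from pre-built pieces before execution, and the authentication keys exist only from instantiation onward.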
Identity Management
• Access Rights
– Keys would then be assigned to mission handlers, i.e. mission stakeholders, or in this case project stakeholders
• Identity Management
– Keys would then be assigned to mission executors, i.e. operatives, in this case project managers
Pass-through or pass-off
• Administrators would then be reassigned, and ownership passed off to the mission, or in this case project, stakeholders
– Although the pass-off has taken place, administrators still retain some authority for break/fix scenarios
Authentication Process
• Mission, or in this case project, stakeholders take control of the Special Purpose Computing environment
Next Steps – Authentication Process
• Mission, or in this case project, stakeholders assign operators
• Verify requirements
• Identify key processes
• In this case, execute compliance packages
At this point, the half-life of the authentication process has been initiated
Access Rights
• Mission (project) stakeholders initiate operators’ requested identities
– Access is granted to operators
Contained User Authentication Portal
• Mission (project) operators take control of the user environment and execute requested protocols
– This pertains to mission, or project, status and mission, or project, objectives
Repetition
• The same protocols and procedures would be executed, in order, for subsequent instantiations…
• After each mission, or in this case project, concludes, the fabric’s data is warehoused and the core destroyed, taking with it all associated keys as well as all access rights granted…
– Pass-off is then given back to the administrators to access the raw data collected
• Event correlation, data mining, and so on are initiated
• Depending on the department and/or organization, internal handling of the data will differ…
Conclusion
As stated at the beginning of this presentation, this methodology was originally created purely for military application… However, I have seen the need to carry it forward to a more consumer-oriented application, such as fabrics that further a compliance-driven model. That being said, business and use cases would be needed to determine sustainability within that model, and subsequent configuration changes, if need be.
Presentation End
Q & A
Ladies and Gentlemen, thank you for your time and consideration… I look forward to working with you all in the near future. Please feel free to contact me with any questions…
Jonathan Spindel
Email: [email protected]
Cell: (954) 299-2132