ABSTRACT
A cloud is essentially a cluster of dedicated servers connected within a network. Cloud computing is a network-based environment that focuses on sharing computation and resources. In the cloud, customers pay only for what they use and do not have to pay for the local resources they would otherwise need, such as storage or infrastructure; this is the main advantage of cloud computing and the main reason for its popularity today. The main problem in the cloud, however, is security, and both security and privacy are now primary concerns. To address the security problem, we introduce a technique called Fog Computing. Fog computing is not a replacement for the cloud; it extends cloud computing by providing security in the cloud environment. With fog services we can enhance the cloud experience by isolating users' data that needs to live on the edge. The main aim of fog computing is to place data close to the end user.
1. INTRODUCTION
In today's world, organizations small and large use cloud computing technology to protect their data and to use cloud resources as and when they need them. The cloud is a subscription-based service; cloud computing is a shared pool of resources. The way we use computers and store our personal and business information raises new data-security challenges. Encryption mechanisms alone do not protect data in the cloud from unauthorized access. Traditional database systems are usually deployed in closed environments, where users can access the system only through a restricted network. With the rapid growth of the World Wide Web, users can access virtually any database for which they have proper access rights from anywhere in the world. By registering with a cloud, users are ready to obtain resources from cloud providers, and an organization can access its data from anywhere, at any time. But this convenience comes with risks to security and privacy. To address this problem, we use a new technique called fog computing, which provides security in the cloud environment to a much greater extent. To get the benefit of this technique, a user must first register with the fog; once the user has completed the sign-up form, he or she receives a message or email confirming that the fog computing services are ready to use.
1.1 Existing System
Existing data-protection mechanisms such as encryption have failed to secure data from attackers. Encryption does not verify whether a user is authorized, and cloud computing security has not focused on ways to secure data from unauthorized access, so encryption alone does not provide much protection for our data. We keep our own confidential documents in the cloud, and these files do not have much security, so a hacker can gain access to them. The 2009 Twitter incident is one example of a data-theft attack in the cloud, and in such cases it is difficult to find the attacker. In 2010 and 2011, cloud computing security was developed against such attackers, including ways of detecting hackers in the cloud. Recent research results along these lines may be useful for protecting data in the cloud.
1.2 Proposed System
We propose a new technique, called fog computing, that secures users' data in the cloud using user-behavior profiling and decoy information technology. This is a different approach to securing data in the cloud, based on offensive decoy technology: we monitor data access in the cloud and detect abnormal data-access patterns. When an unauthorized person tries to access a real user's data, the system generates fake documents in such a way that the unauthorized person cannot tell whether the data is fake or real. Authenticity is verified through a security question that the real user entered when filling out the sign-up form. If the answer to the question is wrong, the user is not the real user and the system serves the fake documents; otherwise, the system serves the original documents to the real user.
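The challenge-and-decoy flow described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the user store, the SHA-256 answer check, and the decoy document are all assumed for the example.

```python
import hashlib

# Hypothetical user store: the security-answer hash is set at sign-up,
# and each document has a real version and a believable decoy version.
USERS = {
    "alice": {
        "answer_hash": hashlib.sha256(b"blue").hexdigest(),
        "real_doc": "Q3 report: revenue $1.2M",
        "decoy_doc": "Q3 report: revenue $48M",  # fake, but plausible
    }
}

def verify_answer(username: str, answer: str) -> bool:
    """Compare the supplied answer against the hash stored at sign-up."""
    user = USERS[username]
    return hashlib.sha256(answer.encode()).hexdigest() == user["answer_hash"]

def fetch_document(username: str, answer: str) -> str:
    """Serve the real document only if the challenge is answered correctly;
    otherwise serve an indistinguishable decoy."""
    user = USERS[username]
    return user["real_doc"] if verify_answer(username, answer) else user["decoy_doc"]
```

A wrong answer never produces an error message, only the decoy, so the attacker cannot tell the challenge has failed.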
2. SYSTEM OVERVIEW
2.1 Cloud Architecture
In cloud architecture, the systems architecture (a systems architecture is the conceptual model that defines the structure, behavior, and views of a system; an architecture description is a formal description and representation of a system) of the software systems (the term software system is often used as a synonym of computer program or software) involved in delivering cloud computing typically involves multiple cloud components communicating with each other over application programming interfaces, usually web services. This resembles the Unix philosophy of having multiple programs each doing one thing well and working together over universal interfaces. Complexity is controlled, and the resulting systems are more manageable than their monolithic counterparts.
Fig 2.1: Cloud Computing Sample Architecture
2.2 Cloud computing Services:
Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (for example, networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service-provider interaction. It is divided into three types of service:
1. Software as a Service (SaaS).
2. Infrastructure as a Service (IaaS).
3. Platform as a Service (PaaS).
Fig 2.2: Cloud computing Services
Cloud computing exhibits the following key characteristics:
1. Agility:
improves with users' ability to re-provision technological infrastructure resources.
2. Cost:
Cost is claimed to be reduced and in a public cloud delivery model capital expenditure is converted
to operational expenditure. This is purported to lower barriers to entry, as infrastructure is typically
provided by a third-party and does not need to be purchased for one-time or infrequent intensive
computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer IT skills are required for implementation. The e-FISCAL project's state-of-the-art repository contains several articles looking into cost aspects in more detail, most of them concluding that cost savings depend on the type of activities supported and the type of infrastructure available in-house.
3. Virtualization:
Virtualization technology allows servers and storage devices to be shared and utilization to be increased. Applications can be easily migrated from one physical server to another.
4. Multi-tenancy:
Enables sharing of resources and costs across a large pool of users.
5. Centralization:
Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.).
6. Utilization and efficiency:
Improvements for systems that are often only 10–20% utilized.
7. Reliability:
Reliability is improved if multiple redundant sites are used, which makes well-designed cloud
computing suitable for business continuity and disaster recovery.
8. Performance:
Performance is monitored and consistent, and loosely coupled architectures are constructed using web services as the system interface.
9. Security:
Could improve due to centralization of data, increased security-focused resources, etc., but concerns
can persist about loss of control over certain sensitive data, and the lack of security for stored kernels.
Security is often as good as or better than other traditional systems, in part because providers are able
to devote resources to solving security issues that many customers cannot afford. However, the
complexity of security is greatly increased when data is distributed over a wider area or greater number
of devices and in multi-tenant systems that are being shared by unrelated users. In addition, user access
to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by
users' desire to retain control over the infrastructure and avoid losing control of information security.
10. Maintenance:
Maintenance of cloud computing applications is easier, because they do not need to be installed on
each user's computer and can be accessed from different places.
Fig 2.3: Represents The Benefits
2.3 Security Issues in Service Model
Cloud computing has three delivery models through which services are delivered to end users. These models are SaaS, IaaS, and PaaS, which provide software, infrastructure, and platform assets to users, respectively. Each has a different level of security requirements.
Fig 2.4: Security Issues in Service Model
Security issues in SaaS:
Software as a Service is a model in which software applications are hosted remotely by the service provider and made available to users on request over the internet. In SaaS, client data resides on the internet and may be visible to other users, so it is the provider's responsibility to put proper security checks in place for data protection. This is a major security risk, and it creates problems for secure data migration and storage. The following security measures should be accounted for in the SaaS application development process: data security, data locality, data integrity, data separation, data access, data confidentiality, data breaches, network security, authentication and authorization, web application security, and the identity management process. These are the basic areas through which a malicious user can gain access and violate the data. Suggested remedies by the CSA to lessen this threat:
Implement strong API access control.
Encrypt and protect integrity of data in transit.
Analyze data protection at both design and run time.
Implement strong key generation, storage and management, and destruction practices.
Contractually demand providers to wipe persistent media before it is released into the pool.
Contractually specify provider backup and retention strategies.
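One of the remedies above, protecting the integrity of data in transit, can be sketched with a keyed MAC. The shared key and message shown here are illustrative assumptions; in practice the key would be provisioned out of band and transport encryption (TLS) would be layered on top.

```python
import hashlib
import hmac

SECRET_KEY = b"shared-secret"  # illustrative; provision out of band in practice

def sign(message: bytes) -> str:
    """Attach an HMAC-SHA256 tag so tampering in transit is detectable."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Constant-time comparison guards against timing attacks on the tag."""
    return hmac.compare_digest(sign(message), tag)
```

The receiver recomputes the tag over the received bytes; any modification in transit makes verification fail.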
vi. Account, Service & Traffic Hijacking:
Account, service, and traffic hijacking is another issue that cloud users need to be aware of. These threats range from man-in-the-middle attacks, to phishing and spam campaigns, to denial-of-service attacks.
Suggested remedies by CSA to lessen this threat:
Prohibit the sharing of account credentials between users and services.
Leverage strong two-factor authentication techniques where possible.
Employ proactive monitoring to detect unauthorized activity.
Understand cloud provider security policies and SLAs.
vii. Unknown Risk Profile:
Security should always be near the top of the priority list. Code updates, security practices, vulnerability profiles, and intrusion attempts are all things that should always be kept in mind. Suggested remedies by the CSA to lessen this threat:
Disclosure of applicable logs and data.
Partial/full disclosure of infrastructure details (e.g., patch levels, firewalls, etc.).
Monitoring and alerting on necessary information.
3. SECURING CLOUDS USING FOG
3.1 Fog Computing:
Below is the reference architecture of a fog computing environment in an enterprise. The fog network sits close to the smart devices: data processing happens nearer to the devices, and the processed information is then passed on to the cloud computing environment.
Fig 3.1: Reference Architecture
Just when we got comfortable with the concept of cloud computing, a new concept called fog computing emerged, extending the cloud toward the network edge.
Fog computing is quite similar to the cloud; just like cloud computing, it provides its users with data, storage, compute, and application services. What distinguishes fog from cloud is its support for mobility, its proximity to end users, and its dense geographical distribution. Its services are hosted at the network edge or even on devices such as set-top boxes or access points. By doing this, fog computing helps reduce service latency and improves QoS, which results in a superior user experience.
Fog computing even supports emerging Internet of Things (IoT) applications that require real time or
predictable latency. A thing in Internet of Things is referred to as any natural or manmade object that can
be assigned an Internet Protocol (IP) address and provided with an ability to transfer data over a network.
Some of these things can end up creating a lot of data. Cisco provides the example of a jet engine,
which can create 10 terabytes of data about its condition and performance in just half an hour.
Transmitting all this data to the cloud and then transmitting response data back ends up creating a huge
demand on bandwidth. This process further requires a considerable amount of time to take place and can
suffer from latency.
In fog computing, much of the processing takes place in a router. This type of computing creates a
virtual platform that provides networking, compute and storage services between traditional cloud
computing data centers and end devices. These services are central to both fog and cloud computing. They
are also important for supporting the emerging Internet deployments. Fog computing also has the
capability of enabling a new breed of aggregated services and applications, such as the smart energy
distribution. In smart energy distribution, all the energy load balancing apps will run on network edge
devices that will automatically switch to alternative energies like wind and solar etc., based on availability,
demand and lowest price.
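The smart energy distribution scenario above can be sketched as a simple policy running on a network-edge device. The source names, capacities, prices, and the cheapest-source rule are illustrative assumptions, not a real grid-control algorithm.

```python
# Hypothetical edge-node policy for smart energy distribution: pick the
# cheapest available source that can cover current demand on its own.

def pick_source(sources, demand_kw):
    """sources: list of dicts with 'name', 'capacity_kw', and 'price' ($/kWh)."""
    candidates = [s for s in sources if s["capacity_kw"] >= demand_kw]
    if not candidates:
        # Fall back to the grid when no single source can cover demand.
        return "grid"
    return min(candidates, key=lambda s: s["price"])["name"]

sources = [
    {"name": "solar", "capacity_kw": 40,  "price": 0.05},
    {"name": "wind",  "capacity_kw": 80,  "price": 0.04},
    {"name": "grid",  "capacity_kw": 500, "price": 0.12},
]
```

Because the decision runs on the edge device itself, the switch between wind, solar, and grid happens locally, without a round trip to the cloud.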
The usage of fog computing can accelerate the innovation process in ways that have never been seen before. This includes self-healing, self-organizing, and self-learning apps for industrial networks and products.
Fig 3.2: Without Fog Computing and With Fog Computing in Grid
3.2 Real-Time Large Scale Distributed Fog Computing
"Fog computing" is a highly distributed, broadly decentralized "cloud" that operates close to the operational level, where data is created and most often used. Fog computing at ground level is an excellent choice for applications that need fit-for-purpose computing near the point of use: where there is high-volume real-time and/or time-critical local data, where data has the greatest meaning within its context, where fast localized turnaround of results is important, and where sending an overabundance of raw data to an enterprise "cloud" is unnecessary or undesirable, or where bandwidth is expensive or limited.
Example applications of fog computing in an industrial context are analytics, optimization, and advanced control at a manufacturing work center or unit operation, and across and between unit operations, where sensors, controllers, historians, and analytical engines all share data interactively in real time. At the upper edges of the "fog" is local site-wide computing, such as manufacturing plant systems that span work centers and unit operations; higher yet would be regional clouds, and finally the cloud at the enterprise level. Fog computing is not independent of enterprise cloud computing but is connected to it, sending cleansed, summarized information upward and in return receiving the enterprise information needed locally.
Fog computing places data management, compute power, performance, reliability, and recovery in the hands of the people who understand the needs: the operators, engineers, and IT staff for a unit operation, an oil and gas platform, or another localized operation, so that it can be tailored to be "fit for purpose" in a high-speed real-time environment.
Fog computing reduces bandwidth needs, as roughly 80% of all data is needed only within the local context: pressures, temperatures, material charges, flow rates. Sending such real-time information into the enterprise cloud would be burdensome in bandwidth and centralized storage, and enterprise database bloat would occur for information rarely used at that level. Instead, a limited amount of summarized information can be transmitted up to the cloud, and likewise down from the cloud to the local operation, such as customer product-performance feedback sent back to the source of those products.
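The pattern of keeping raw readings local and sending only summaries upstream can be sketched as follows. The sensor readings and the particular summary fields are illustrative assumptions; real fog nodes would summarize over rolling time windows.

```python
from statistics import mean

def summarize(readings):
    """Reduce a window of raw sensor readings (kept locally at the fog node)
    to a compact summary suitable for sending to the enterprise cloud."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }

# Raw per-second temperature readings stay at the edge...
raw = [71.2, 71.4, 70.9, 72.0, 71.6]
# ...and only the four-field summary goes upstream.
summary = summarize(raw)
```

Five readings become one small record; over thousands of sensors and long time windows, this is where the bandwidth saving comes from.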
Fig 3.3: Real-Time Large Scale Distributed Fog Computing
We place computing where it is needed and performant, suited for the purpose, sitting where it needs to be: at a work center, inside a control panel, at a desk, in a lab, in a rack in a data center, anywhere and everywhere, all sharing related data to understand and improve your performance. While located throughout your organization, a fog computing system operates as a single unified resource, a distributed low-level cloud that integrates with centralized clouds to obtain market and customer feedback, desires, and behaviors that reflect product performance in the eyes of the customer.
The characteristics of a fog computing system are:
A Highly Distributed Concurrent Computing (HDCC) System.
A peer-to-peer mesh of computational nodes in a virtual hierarchical structure that matches your
organization
Communicates with smart sensors, controllers, historians, quality and materials control systems and
others as peers
Runs on affordable, off the shelf computing technologies
Supports multiple operating platforms: Unix, Windows, Mac
Employs simple, fast, and standardized IoT internet protocols (TCP/IP, sockets, etc.)
A browser-based user experience, which, after all, is a key aspect of an "Industrial Internet of Things"
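A minimal sketch of two peer fog nodes exchanging one sensor reading over the plain TCP sockets listed above. The JSON message format, the single-message exchange, and the threaded loopback setup are assumptions made for the demonstration, not a protocol the text prescribes.

```python
import json
import socket
import threading

def accept_one(srv, inbox):
    """One fog node: accept a single JSON message from a peer and record it."""
    conn, _ = srv.accept()
    with conn:
        inbox.append(json.loads(conn.recv(4096).decode()))

def send_reading(port, reading):
    """A peer fog node pushes one reading over plain TCP."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", port))
        cli.sendall(json.dumps(reading).encode())

# Bind to an ephemeral port so the demo does not clash with other services.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]

received = []
t = threading.Thread(target=accept_one, args=(srv, received))
t.start()
send_reading(port, {"sensor": "temp-01", "value": 71.4})
t.join()
srv.close()
```

Listening before starting the sender avoids the race where the client connects before the server is ready; a real mesh would keep such connections open rather than opening one per message.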