CLOUD
COMPUTING
Dr. Akram Alhammadi
Examining Infrastructure Capable of Supporting
a Migration
• Available Network Capacity
• Downtime During the Migration
• Local time zones and follow-the-sun migration constraints
Managing User Identities
and Roles
RBAC: Identifying Users and What Their Roles Are
• Role-based access control (RBAC) is a method in which access rights are
granted to, or restricted from, users based on which roles they perform
in an organization.
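The RBAC idea above can be sketched in a few lines: roles carry permissions, users carry roles, and an access check is granted only if one of the user's roles includes the requested permission. The role, user, and permission names below are illustrative, not from any real product.

```python
# Minimal RBAC sketch: roles map to permissions, users map to roles.
ROLE_PERMISSIONS = {
    "admin":  {"read", "write", "delete"},
    "editor": {"read", "write"},
    "viewer": {"read"},
}

USER_ROLES = {
    "alice": {"admin"},
    "bob":   {"viewer"},
}

def is_allowed(user: str, permission: str) -> bool:
    """Grant access only if one of the user's roles carries the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))
```

Note that access is granted or restricted purely through role membership; adding a user to a role is the only way to change what that user can do.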
Single Sign-on Systems
• Single sign-on (SSO) is an approach that reduces the need to sign into
multiple systems for access. SSO allows a user to log in just one time
and be granted access rights to multiple systems. SSO is also effective
when terminating a session. When you log off, the directory services
will log out, or disconnect you from the multiple systems you had
accessed.
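The SSO behavior described above, one login granting access to many systems and one logout disconnecting all of them, can be sketched with a toy session object. System names are illustrative; a real SSO deployment delegates this to directory services.

```python
# Toy SSO session sketch: one login grants access to several systems,
# and one logout disconnects all of them at once.
class SSOSession:
    def __init__(self, user: str):
        self.user = user
        self.connected = set()      # systems this session has accessed

    def access(self, system: str) -> bool:
        """First access to a system joins it to the single session."""
        self.connected.add(system)
        return True

    def logout(self) -> list:
        """Directory-services-style logout: disconnect every system at once."""
        dropped = sorted(self.connected)
        self.connected.clear()
        return dropped
```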
Understanding Infrastructure Services
• Domain Name Service
• Dynamic Host Configuration Protocol
• Certificate Services
Most cloud providers offer their customers the ability to outsource the
creation, management, and deployment of digital security certificates.
• Load Balancing
Load balancing addresses the issues found when cloud workloads and
connections increase to the point where a single server can no longer handle
the workload or performance requirements of web, DNS, and FTP servers;
firewalls; and other network services. A load balancer is commonly found in
front of web servers.
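A minimal sketch of the load-balancing idea, using simple round-robin scheduling (one of several possible algorithms) to spread requests across a pool of backends so no single server absorbs the whole workload. Server names are illustrative.

```python
import itertools

# Round-robin load balancer sketch: each incoming request is handed to
# the next backend server in rotation.
class RoundRobinBalancer:
    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)

    def pick(self) -> str:
        """Return the backend that should handle the next request."""
        return next(self._cycle)
```

Real load balancers also track backend health and remove failed servers from the rotation; this sketch shows only the distribution step.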
• Multilayer User Authentication Services
Multifactor or multilayer authentication adds an additional layer of
authentication by adding token-based systems in addition to the
traditional username and password authentication model.
• Virtual MFA device: supports multiple tokens on a single device.
• Universal 2nd Factor (U2F) security key: supports multiple root and IAM users using a single security key.
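A virtual MFA device generates one-time codes using the standard HOTP/TOTP algorithms (RFC 4226 and RFC 6238). The sketch below implements HOTP with only the standard library; the time-based variant simply uses the current 30-second window as the counter.

```python
import hmac, hashlib, struct, time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-SHA1 over the counter, dynamically truncated to N digits (RFC 4226)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30) -> str:
    """Time-based variant (RFC 6238): counter = steps elapsed since the epoch."""
    return hotp(secret, int(time.time()) // step)
```

Because the server and the token device share the secret and the counter (or clock), both can compute the same code independently, which is what makes the second factor verifiable.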
Firewall Security
• Firewalls are generally deployed between the cloud network and the
cloud consumer to protect the networks from unauthorized access. A
firewall is either a hardware-based or a virtualized device that inspects
network traffic and compares it to a defined rule list to determine
whether that traffic is allowed. If the traffic is not permitted, the
firewall blocks it from entering the network.
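The rule-list behavior just described can be sketched as first-match evaluation with a default-deny fallback. The CIDR ranges and ports below are illustrative only.

```python
import ipaddress

# Firewall rule-list sketch: rules are checked in order, the first match
# wins, and traffic matching no rule is blocked (default deny).
RULES = [
    ("allow", ipaddress.ip_network("10.0.0.0/8"), 443),
    # Note: a "deny 10.9.0.0/16" placed here would never fire for port 443,
    # because the broader allow above matches first (rule shadowing).
    ("allow", ipaddress.ip_network("0.0.0.0/0"), 80),
]

def permitted(src_ip: str, dst_port: int) -> bool:
    for action, network, port in RULES:
        if ipaddress.ip_address(src_ip) in network and dst_port == port:
            return action == "allow"
    return False  # default deny: block anything no rule explicitly permits
```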
Cloud Data Security
Asset management is broken into two parts: data asset management and cloud asset management.
• Data assets are the important information you have, such as customer names and addresses, credit card
information, bank account information, or credentials to access such data.
• Cloud assets are the things you have that store and process your data—compute resources such as servers or
containers, storage such as object stores or block storage, and platform instances such as databases or queues.
Understanding the Cloud Data Lifecycle
Phases in the lifecycle may be repeated or taken out of order.
Create: Although this initial phase is called "Create," it can
also be thought of as modification.
Store: Immediately after the data is created, it must be
stored in a way that is usable to the system or application.
Data can be stored in numerous ways. Storage methods
include files on a file system, remote object storage in
a cloud, and data written to a database.
The Cloud Security Professional must ensure that all storage methods employ whatever technologies are necessary for
the data's classification level, including the use of access controls, encryption, and auditing. Appropriate redundancy
and backup methods also come into play immediately at the Store phase to protect the data on top of the security
controls.
Cloud Data Lifecycle
• Use: The Use phase is where the data is actually consumed and processed by an
application or user. Because the data is being used, viewed, or processed, it is
more exposed and faces an increased chance of compromise or leakage; it must
also typically be in an unencrypted state while in use. Auditing and logging
mechanisms must be in place as the data is accessed. The Use phase is
considered purely read-only, because this phase does not cover modification;
modification is covered in the Create phase of the lifecycle.
• Share: In the Share phase, data is made available for use outside the system.
This presents a significant challenge: the system must ensure proper protections
are in place once the data leaves it and is shared externally. Unlike in the Use
phase, in the Share phase the data is enabled for use by customers, partners,
contractors, and other associated groups, and it is no longer under the
security control mechanisms employed within the system.
• Archive: Archiving simply involves moving data to long-term storage, thus removing
it from being active or "hot" within a system. One of the more overlooked aspects
of archiving data is the ability to retrieve and recover it later.
• Destroy: The Destroy phase of the lifecycle is where the data is either made
inaccessible or permanently erased, with the method and approach based on
the classification and sensitivity of the data.
• Design and Implement Cloud Data Storage Architectures
• Each of the three hosting models within a cloud environment (IaaS, PaaS, and SaaS)
uses its own unique storage methods.
• IaaS: Storage falls into two basic categories: volume and object.
• Volume storage is a virtual hard drive that is allocated by the cloud provider and attached
to the virtual host.
• Object storage: Files are stored as objects in an independent system and given a key value
for reference and retrieval. Many cloud systems use object storage for virtual host
images and large files.
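The key-value access pattern of object storage can be sketched with an in-memory store: each object lives in a flat key space and is put and retrieved by its key, alongside simple metadata. Bucket and key names are illustrative; real services such as Amazon S3 expose this model over HTTP APIs.

```python
import hashlib

# Object-store sketch: flat namespace of key -> (data, metadata).
class ObjectStore:
    def __init__(self):
        self._objects = {}

    def put(self, key: str, data: bytes, **metadata) -> str:
        """Store an object under a key; return a content fingerprint (etag)."""
        etag = hashlib.md5(data).hexdigest()
        self._objects[key] = (data, {"etag": etag, **metadata})
        return etag

    def get(self, key: str) -> bytes:
        """Retrieve an object's data by its key."""
        return self._objects[key][0]
```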
• PaaS: The storage design for PaaS is quite different from IaaS. With PaaS, storage falls
into the categories of structured and unstructured.
• 1. Structured Data
• Definition: Data that is organized in a predefined format or model, making it easy to
import into a database from other data sources; it is organized and optimized for
search technologies.
• Characteristics:
• Format: Typically stored in tables with rows and columns (e.g., relational databases).
• Schema: Follows a strict schema that defines the data types and relationships.
• Ease of Use: Easily queried using Structured Query Language (SQL).
• Examples:
• Customer databases (name, address, phone number)
• Transaction records (date, amount, transaction ID)
• Inventory lists (product ID, description, price)
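The customer-database example above can be shown concretely with SQLite: a strict schema of typed columns, queried with SQL. Table and column names are illustrative.

```python
import sqlite3

# Structured data sketch: typed columns under a fixed schema, queried via SQL.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, phone TEXT)"
)
conn.executemany(
    "INSERT INTO customers (name, phone) VALUES (?, ?)",
    [("Alice", "555-0100"), ("Bob", "555-0199")],
)
rows = conn.execute("SELECT name FROM customers ORDER BY name").fetchall()
```

Because every row conforms to the same schema, queries like the `SELECT` above can rely on the column names and types being present.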
• 2. Unstructured Data
• Definition: Data that does not have a predefined format or structure,
making it more complex to organize and analyze. This can be because of
the size of the files or the types of files.
• Characteristics:
• Format: Can include text, images, audio, video, and more.
• Schema: Lacks a fixed schema, making the data more difficult to categorize and analyze.
• Searchability: Requires advanced technologies (like natural language processing or
machine learning) for effective analysis.
• Examples:
• Emails and text documents
• Social media posts and comments
• Multimedia files (photos, videos, audio recordings)
• Common storage technologies:
• Object Storage: Amazon S3, Google Cloud Storage, Microsoft Azure Blob Storage.
• NoSQL Databases: MongoDB, Cassandra, Couchbase.
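The schema flexibility that NoSQL databases offer can be sketched with a toy document store: unlike rows in a relational table, each document can carry a different set of fields. Field names are illustrative; real systems include MongoDB, Cassandra, and Couchbase.

```python
# NoSQL document-store sketch: documents with no enforced schema,
# matched by whatever fields the query supplies.
class DocumentStore:
    def __init__(self):
        self._docs = []

    def insert(self, doc: dict) -> None:
        self._docs.append(doc)       # no schema is enforced on insert

    def find(self, **criteria) -> list:
        """Return documents whose fields match all given criteria."""
        return [d for d in self._docs
                if all(d.get(k) == v for k, v in criteria.items())]
```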
• The two most common storage types for SaaS are information storage and management as well as content and file
storage.
Content and File Storage
• Content and file storage are essential components of data management, enabling users to store, organize, and retrieve
digital assets efficiently. Here’s a detailed overview of both concepts:
• 1. Content Storage
• Definition: Refers to the storage of digital content, which can include text, images, audio, video, and other multimedia
types.
• Characteristics:
• Metadata: Content storage often involves rich metadata to describe and categorize content for easier retrieval and management.
• Access Methods: Typically accessed through web interfaces or APIs, allowing for integration with content management systems
(CMS) and applications.
• Examples:
• Digital Asset Management (DAM): Systems designed specifically for managing digital media files (e.g., Adobe Experience Manager,
Bynder).
• 2. File Storage
• Definition: A method of storing data as individual files within a hierarchical file system, often organized into directories
or folders.
• Characteristics:
• File Structure: Uses a traditional file and folder organization, making it easy to navigate and manage files.
• Access Methods: Accessed via file paths, with support for various protocols (e.g., SMB, NFS).
• Examples:
• File Systems: Local storage systems on hard drives, network-attached storage (NAS), and cloud file storage solutions (e.g., Google
Drive, Dropbox).
Information Storage and Management
• Information storage and management encompass the strategies, technologies, and practices used to
store, organize, and retrieve data efficiently and securely. Here’s an overview of the key aspects of
this field:
• 1. Types of Information Storage
• Structured Storage:
• Data is organized in a predefined format (e.g., relational databases).
• Examples: MySQL, Oracle Database.
• Unstructured Storage:
• Data lacks a fixed structure (e.g., documents, multimedia).
• Examples: Object storage (Amazon S3, Google Cloud Storage).
• 2. Storage Technologies
• Databases:
• Relational Databases: Use tables and SQL for data management.
• NoSQL Databases: Designed for unstructured data, allowing for flexible data models.
• File Systems:
• Traditional file systems for storing documents and files on physical or network drives.
• Cloud Storage:
• Provides scalable storage solutions over the internet, allowing for flexible access and management.
• Data Warehousing:
• Central repositories for storing and analyzing large volumes of structured data.
• 3. Data Management Practices
• Data Governance:
• Policies and procedures to ensure data quality, security, and compliance with regulations.
• Data Backup and Recovery:
• Strategies for regularly backing up data to prevent loss and ensure business continuity.
• Data Lifecycle Management:
• Managing data from creation and storage to archiving and deletion.
• Access Controls:
• Implementing permissions and authentication measures to secure data.
• 4. Challenges in Information Storage and Management
• Data Volume: The exponential growth of data makes storage management increasingly complex.
• Data Security: Protecting sensitive information from breaches and unauthorized access.
• Integration: Ensuring seamless integration between different storage systems and applications.
Object Storage in the Cloud
• Object storage is a data storage architecture that manages data as objects, which are stored in a flat
namespace. It is widely used in cloud computing for its scalability, stability, and accessibility. Here are
some key aspects of cloud object storage:
• 1. What is Object Storage?
• Data Representation: Data is stored as objects, each containing the data itself, metadata, and a unique
identifier.
• Flat Structure: Unlike traditional file systems that use hierarchical structures, object storage uses a flat
namespace, making it easier to manage large amounts of data.
• 2. Key Features
• Scalability: Can handle vast amounts of unstructured data, making it ideal for big data applications, backups,
and media storage.
• Stability: Data is often replicated across multiple locations, ensuring redundancy and protection against data
loss.
• Accessibility: Objects can be accessed via RESTful APIs, allowing for easy integration with applications and
services.
• 3. Advantages
• Cost-Effective: Pay-as-you-go pricing models that help reduce costs, especially for large volumes of data.
• Global Accessibility: Data can be accessed from anywhere, making it suitable for distributed applications.
• Metadata Management: Rich metadata capabilities enhance data organization and retrieval.
4. Popular Cloud Object Storage Services
• Amazon S3 (Simple Storage Service): A widely used service that
provides high availability and scalability.
Security Considerations
• Access Control: Implementing policies to control who can access
data.
• Data Encryption: Encrypting data at rest and in transit to protect
against unauthorized access.
Threats to Storage Types
• The most common and best-understood threat to storage is the
unauthorized access or use of the data itself.
• This can come from an external attacker who compromises a system, or it can be in
the form of a malicious insider who possesses the credentials to access the
data but uses them for unauthorized purposes.
• Storage systems within a cloud also face threats from the network and
physical perspectives. From a network perspective, storage systems are
also susceptible to DoS attacks.
• Technologies Available to Address Threats
• A major concept and approach employed in a cloud environment to protect
data is data loss prevention (DLP), sometimes called data leakage
prevention. DLP is a set of controls and practices put in place to ensure
that data is only accessible and exposed to those users and systems
authorized to have it. The goals of a DLP strategy for an organization are
to manage and minimize risk. Any DLP implementation is composed of three
common components: discovery and classification, monitoring, and
enforcement.
• The discovery and classification stage is the first stage of a DLP
implementation; it is focused on finding the data that is pertinent
to the DLP strategy and determining the security classification and
requirements of that data once it has been found.
• Once data has been discovered and classified, it can then be monitored
with DLP implementations. Monitoring is the process of watching data
as it moves through the various states of usage to ensure it is being used in
appropriate and controlled ways. It also ensures that those who access and
use the data are authorized to do so and are using it in an appropriate
manner.
• The final stage of a DLP implementation is the actual enforcement of
policies against any violations caught during the monitoring stage.
• DLP Data States:
• With data at rest (DAR), the DLP solution is installed on the systems
holding the data, which can be servers, desktops, workstations, or mobile
devices.
• With data in transit (DIT), the DLP solution is deployed near the network
perimeter to capture traffic as it leaves the network through various
protocols, such as HTTP/HTTPS and SMTP. One thing to note: if the traffic
leaving the environment is encrypted, the DLP solution will need to be able
to read and process the encrypted traffic in order to function, which may
bring key management and encryption aspects into play.
• With data in use (DIU), the DLP solution is deployed on the users'
workstations or devices in order to monitor data access and use from
the endpoint.
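The discovery-and-classification stage can be sketched as pattern scanning: look for data that matches known sensitive formats and tag it with a classification. The patterns below are deliberately simplified illustrations, not production-grade detectors.

```python
import re

# DLP discovery-and-classification sketch: scan text for patterns that
# look like sensitive data and report which classes were found.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b"),
    "email":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> set:
    """Return the set of sensitive-data classes discovered in the text."""
    return {label for label, pat in PATTERNS.items() if pat.search(text)}
```

In a full DLP deployment, the classes found here would feed the monitoring and enforcement stages, which decide whether the data's movement is permitted.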
Design and Apply Data Security Strategies
Several toolsets and technologies are commonly used as data security
strategies:
• Encryption
• Key management
• Masking
• Obfuscation
• Anonymization
• Tokenization
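Two of the strategies listed above can be sketched briefly: masking (hide all but the last four digits of a value) and tokenization (replace the value with a random token while the real value lives in a separate secure mapping). The formats and the in-memory vault are illustrative; a real deployment uses a hardened tokenization service.

```python
import secrets

TOKEN_VAULT = {}   # token -> original value; stands in for a secure vault

def mask(card_number: str) -> str:
    """Masking: reveal only the last four digits."""
    digits = card_number.replace("-", "")
    return "*" * (len(digits) - 4) + digits[-4:]

def tokenize(value: str) -> str:
    """Tokenization: hand out a random surrogate, keep the real value aside."""
    token = "tok_" + secrets.token_hex(8)
    TOKEN_VAULT[token] = value
    return token
```

Masking is one-way and suited to display contexts; tokenization is reversible by whoever controls the vault, which is why the vault itself becomes the asset to protect.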
Design and Apply Data Security Strategies
• Encryption
• The use of encryption to protect data is essential and required, as the typical
protections of physical separation and segregation found in a traditional
data center model are not available or applicable in a cloud environment.
The architecture of an encryption system has three basic components: the
data itself, the encryption engine that handles all the encryption activities,
and the encryption keys used in the actual encryption and use of the data.
• Encryption is used in various manners and through different technology
approaches, depending on the state of the data at the time: in use, at rest,
or in motion.
• There are many challenges with implementing encryption. A central
challenge is the dependence on key sets to handle the actual encryption
and decryption processes. Without proper security of the encryption keys,
or with exposure to external parties such as the cloud provider itself, the
entire encryption scheme could be rendered vulnerable and insecure. Note
also that encryption does not ensure data integrity, only confidentiality
within an environment.
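The point that encryption provides confidentiality but not integrity is usually addressed by pairing it with a message authentication code. A minimal stdlib sketch: an HMAC tag computed with a shared key lets the receiver detect any tampering with the data.

```python
import hmac, hashlib

def tag(key: bytes, data: bytes) -> str:
    """Compute an HMAC-SHA256 integrity tag over the data."""
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def verify(key: bytes, data: bytes, expected: str) -> bool:
    """Constant-time check that the data still matches its tag."""
    return hmac.compare_digest(tag(key, data), expected)
```

Modern authenticated-encryption modes (e.g., AES-GCM) combine both properties in one operation; this sketch only isolates the integrity half of that pairing.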
Key Management
• Key management in the cloud is a crucial aspect of ensuring the security and integrity of sensitive data. It involves
the processes and technologies used to manage cryptographic keys throughout their lifecycle, which includes
generation, storage, distribution, use, and deletion. Here are some key concepts and practices related to cloud key
management:
• 1. Key Management Systems (KMS)
• Definition: A KMS is a service that allows users to create, manage, and control cryptographic keys.
• Examples: AWS Key Management Service, Azure Key Vault, Google Cloud Key Management.
• 2. Key Lifecycle Management
• Key Generation: Creating secure keys using strong algorithms.
• Key Storage: Storing keys securely to prevent unauthorized access.
• Key Rotation: Regularly updating keys to minimize the risk of compromise.
• Key Usage: Applying keys in encryption and decryption processes.
• Key Archiving and Deletion: Safely archiving old keys or securely deleting them when no longer needed.
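The lifecycle steps above can be sketched as an in-memory key ring: generation, rotation (a new version becomes current while old versions are kept so existing ciphertext can still be decrypted), and deletion. A real KMS (AWS Key Management Service, Azure Key Vault, Google Cloud Key Management) does this behind an API; this sketch only illustrates the states.

```python
import secrets

# Key-lifecycle sketch: versioned keys with rotate/get/destroy operations.
class KeyRing:
    def __init__(self):
        self._versions = {}         # version number -> key bytes
        self.current = 0

    def rotate(self) -> int:
        """Generate a fresh 256-bit key and make it the current version."""
        self.current += 1
        self._versions[self.current] = secrets.token_bytes(32)
        return self.current

    def get(self, version: int) -> bytes:
        """Fetch a key version, e.g., to decrypt older ciphertext."""
        return self._versions[version]

    def destroy(self, version: int) -> None:
        """Retire a version once nothing depends on it any longer."""
        del self._versions[version]
```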
• 3. Encryption and Decryption
• Data-at-Rest Encryption: Protecting stored data by encrypting it with keys managed in the KMS.
• Data-in-Transit Encryption: Encrypting data as it travels over networks to prevent interception.
• Key management should always be performed only on trusted
systems and by trusted processes, whether in a traditional data
center or a cloud environment.
• The Cloud Security Professional will always need to consult applicable
regulatory requirements for any key management, access, and storage
concerns, and determine whether a cloud provider can meet those
requirements.
