1. Risk Management
1.1. Concept
1.1.1. RM is the process of identifying and assessing risk, reducing it to an acceptable level, and ensuring it remains at that level
1.2. the major categories
1.2.1. • Physical damage: Fire, water, vandalism, power loss, and natural disasters
1.2.2. • Human interaction: Accidental or intentional action or inaction that can disrupt productivity
1.2.3. • Equipment malfunction: Failure of systems and peripheral devices
1.2.4. • Inside and outside attacks: Hacking, cracking, and attacking
1.2.5. • Misuse of data: Sharing trade secrets, fraud, espionage, and theft
1.2.6. • Loss of data: Intentional or unintentional loss of information to unauthorized receivers
1.2.7. • Application error: Computation errors, input errors, and buffer overflows
1.3. Holistic Risk Management
1.3.1. NIST SP 800-39 defines three tiers of risk management
1.3.1.1. • Organizational tier
1.3.1.2. • Business process tier
1.3.1.3. • Information systems tier
1.4. Information Systems Risk Management Policy
1.4.1. ISRM policy should address
1.4.1.1. • The objectives of the ISRM team
1.4.1.2. • The level of risk the organization will accept and what is considered an acceptable level of risk
1.4.1.3. • Formal processes of risk identification
1.4.1.4. • The connection between the ISRM policy and the organization’s strategic planning processes
1.4.1.5. • Responsibilities that fall under ISRM and the roles to fulfill them
1.4.1.6. • The mapping of risk to internal controls
1.4.1.7. • The approach toward changing staff behaviors and resource allocation in response to risk analysis
1.4.1.8. • The mapping of risks to performance targets and budgets
1.4.1.9. • Key indicators to monitor the effectiveness of controls
1.4.2. The Risk Management Team
1.4.2.1. • An established risk acceptance level provided by senior management
1.4.2.2. • Documented risk assessment processes and procedures
1.4.2.3. • Procedures for identifying and mitigating risks
1.4.2.4. • Appropriate resource and fund allocation from senior management
1.4.2.5. • Security awareness training for all staff members associated with information assets
1.4.2.6. • The ability to establish improvement (or risk mitigation) teams in specific areas when necessary
1.4.2.7. • The mapping of legal and regulatory compliance requirements to controls and implementation requirements
1.4.2.8. • The development of metrics and performance indicators so as to measure and manage various types of risks
1.4.2.9. • The ability to identify and assess new risks as the environment and company change
1.4.2.10. • The integration of ISRM and the organization’s change control process to ensure that changes do not introduce new vulnerabilities
1.4.3. The Risk Management Process
1.4.3.1. • Frame risk
1.4.3.2. • Assess risk
1.4.3.3. • Respond to risk
1.4.3.4. • Monitor risk
2. Threat Modeling
2.1. Threat Modeling Concepts
2.1.1. Vulnerabilities
2.1.1.1. Information
2.1.1.1.1. • Data at rest
2.1.1.1.2. • Data in motion
2.1.1.1.3. • Data in use
2.1.1.2. Processes
2.1.1.3. People
2.1.1.3.1. • Social engineering
2.1.1.3.2. • Social networks
2.1.1.3.3. • Passwords
2.1.2. Threats
2.1.2.1. potential cause of an unwanted incident, which may result in harm to a system or organization
2.2. Threat Modeling Methodologies
2.2.1. Attack Trees
2.2.1.1. “attack chain”
2.2.1.2. “kill chain”
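Purely as an illustration (not something given in these notes), an attack tree can be represented as nested AND/OR nodes whose leaves are the attacker's individual steps; the goal and leaf names below are hypothetical.

```python
# Minimal attack-tree sketch (illustrative only; goal and leaf names are hypothetical).
# A node is satisfied if ANY child succeeds (OR gate) or ALL children succeed (AND gate).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    name: str
    gate: str = "OR"                 # "OR" or "AND"
    children: List["Node"] = field(default_factory=list)
    feasible: bool = False           # set True on leaves the attacker can achieve

    def achievable(self) -> bool:
        if not self.children:
            return self.feasible
        results = [c.achievable() for c in self.children]
        return all(results) if self.gate == "AND" else any(results)

root = Node("Read CEO e-mail", "OR", [
    Node("Phish credentials", feasible=True),
    Node("Compromise mail server", "AND", [
        Node("Find exploitable vulnerability"),
        Node("Bypass network controls"),
    ]),
])

print(root.achievable())  # True, because the phishing leaf is marked feasible
```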
2.2.2. Reduction Analysis
2.2.2.1. controls or countermeasures
3. Risk Assessment and Analysis
3.1. four main goals
3.1.1. • Identify assets and their value to the organization.
3.1.2. • Determine the likelihood that a threat exploits a vulnerability.
3.1.3. • Determine the business impact of these potential threats.
3.1.4. • Provide an economic balance between the impact of the threat and the cost of the countermeasure.
3.2. Risk Assessment Team
3.3. The Value of Information and Assets
3.4. Costs That Make Up the Value
3.4.1. assigning values to assets
3.4.1.1. • Cost to acquire or develop the asset
3.4.1.2. • Cost to maintain and protect the asset
3.4.1.3. • Value of the asset to owners and users
3.4.1.4. • Value of the asset to adversaries
3.4.1.5. • Price others are willing to pay for the asset
3.4.1.6. • Cost to replace the asset if lost
3.4.1.7. • Operational and production activities affected if the asset is unavailable
3.4.1.8. • Liability issues if the asset is compromised
3.4.1.9. • Usefulness and role of the asset in the organization
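A minimal sketch of how the cost factors above might be rolled up into an asset value, assuming a simple sum; the factor names chosen and all dollar figures are hypothetical.

```python
# Illustrative only: approximate an asset's value by summing selected cost factors.
asset_costs = {
    "acquisition_or_development": 50_000,
    "maintenance_and_protection": 10_000,
    "replacement_if_lost": 60_000,
    "liability_if_compromised": 25_000,
}

asset_value = sum(asset_costs.values())
print(f"Estimated asset value: ${asset_value:,}")  # Estimated asset value: $145,000
```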
3.4.2. reasons
3.4.2.1. • To perform effective cost/benefit analyses
3.4.2.2. • To select specific countermeasures and safeguards
3.4.2.3. • To determine the level of insurance coverage to purchase
3.4.2.4. • To understand what exactly is at risk
3.4.2.5. • To comply with legal and regulatory requirements
3.5. Identifying Vulnerabilities and Threats
3.6. Methodologies for Risk Assessment
3.6.1. NIST SP 800-30, Revision 1.
3.6.1.1. 1. Prepare for the assessment.
3.6.1.2. 2. Conduct the assessment:
3.6.1.2.1. a. Identify threat sources and events.
3.6.1.2.2. b. Identify vulnerabilities and predisposing conditions.
3.6.1.2.3. c. Determine likelihood of occurrence.
3.6.1.2.4. d. Determine magnitude of impact.
3.6.1.2.5. e. Determine risk.
3.6.1.3. 3. Communicate results.
3.6.1.4. 4. Maintain assessment.
3.6.2. Facilitated Risk Analysis Process (FRAP)
3.6.2.1. qualitative methodology
3.6.2.2. FRAP is intended to be used to analyze one system, application, or business process at a time
3.6.3. OCTAVE (Operationally Critical Threat, Asset, and Vulnerability Evaluation)
3.6.3.1. OCTAVE would be used to assess all systems, applications, and business processes within the organization.
3.6.4. AS/NZS ISO 31000
3.6.4.1. takes a much broader approach to risk management
3.6.4.2. Australian and New Zealand methodology
3.6.5. ISO/IEC 27000 Series
3.6.5.1. ISO/IEC 27005
3.6.6. Failure Modes and Effect Analysis (FMEA)
3.6.6.1. FMEA is commonly used in product development and operational environments.
3.6.6.2. The goal is to identify where something is most likely to break and either fix the flaws that could cause the failure or implement controls to reduce its impact
3.6.6.3. steps
3.6.6.3.1. 1. Start with a block diagram of a system or control.
3.6.6.3.2. 2. Consider what happens if each block of the diagram fails.
3.6.6.3.3. 3. Draw up a table in which failures are paired with their effects and an evaluation of the effects.
3.6.6.3.4. 4. Correct the design of the system, and adjust the table until the system is not known to have unacceptable problems.
3.6.6.3.5. 5. Have several engineers review the Failure Modes and Effect Analysis.
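A sketch of the resulting FMEA worksheet. Ranking failure modes by a Risk Priority Number (severity × occurrence × detection) is a common FMEA convention assumed here for illustration; it is not spelled out in the steps above, and the components and scores are hypothetical.

```python
# Illustrative FMEA worksheet: score each failure mode 1-10 on severity,
# occurrence, and detectability, then rank by RPN = S x O x D (assumed convention).
failure_modes = [
    # (component, failure mode, effect, severity, occurrence, detection)
    ("Firewall",   "Rule misconfiguration", "Unfiltered inbound traffic", 8, 4, 5),
    ("Backup job", "Silent write failure",  "Unrecoverable data loss",    9, 3, 7),
    ("UPS",        "Battery degradation",   "Unplanned outage",           6, 5, 4),
]

rows = [
    {"component": c, "mode": m, "effect": e, "rpn": s * o * d}
    for (c, m, e, s, o, d) in failure_modes
]

# Highest-RPN failure modes are addressed first.
for row in sorted(rows, key=lambda r: r["rpn"], reverse=True):
    print(f'{row["rpn"]:>4}  {row["component"]:<10} {row["mode"]} -> {row["effect"]}')
```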
3.6.7. fault tree analysis
3.6.7.1. • False alarms
3.6.7.2. • Insufficient error handling
3.6.7.3. • Sequencing or order
3.6.7.4. • Incorrect timing outputs
3.6.7.5. • Valid but not expected outputs
3.6.8. CRAMM (Central Computer and Telecommunications Agency Risk Analysis and Management Method)
3.6.8.1. was created by the United Kingdom's Central Computer and Telecommunications Agency (CCTA)
3.6.8.2. three distinct stages
3.6.8.2.1. define objectives,
3.6.8.2.2. assess risks
3.6.8.2.3. identify countermeasures
3.6.9. Choose methodology
3.6.9.1. deploy an organization-wide risk management program
3.6.9.1.1. ISO/IEC 27005 or OCTAVE
3.6.9.2. focus just on IT security risks during your assessment
3.6.9.2.1. NIST SP 800-30
3.6.9.3. have a limited budget and need to carry out a focused assessment on an individual system or process
3.6.9.3.1. Facilitated Risk Analysis Process
3.6.9.4. dig into the details of how a security flaw within a specific system could cause negative ramifications
3.6.9.4.1. Failure Modes and Effect Analysis or fault tree analysis
3.6.9.5. to understand your company’s business risks
3.6.9.5.1. AS/NZS ISO 31000
3.7. Risk Analysis Approaches
3.7.1. Automated Risk Analysis Methods
3.7.2. Steps of a Quantitative Risk Analysis
3.7.2.1. Asset Value × Exposure Factor (EF) = SLE
3.7.2.2. SLE × Annualized Rate of Occurrence (ARO) = ALE
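A worked example of the two formulas above; all figures are hypothetical.

```python
# Worked example of SLE and ALE (all figures hypothetical).
asset_value = 150_000          # value of a data warehouse, in dollars
exposure_factor = 0.25         # 25% of the asset expected to be lost per incident
annualized_rate = 0.1          # incident expected roughly once every 10 years

sle = asset_value * exposure_factor   # Single Loss Expectancy
ale = sle * annualized_rate           # Annualized Loss Expectancy

print(f"SLE = ${sle:,.0f}")   # SLE = $37,500
print(f"ALE = ${ale:,.0f}")   # ALE = $3,750
```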
3.7.3. Results of a Quantitative Risk Analysis
3.7.3.1. • Monetary values assigned to assets
3.7.3.2. • Comprehensive list of all significant threats
3.7.3.3. • Probability of the occurrence rate of each threat
3.7.3.4. • Loss potential the company can endure per threat in a 12-month time span
3.7.3.5. • Recommended controls
3.8. Qualitative Risk Analysis
3.8.1. qualitative methods walk through different scenarios of risk possibilities and rank the seriousness of the threats and the validity of the different possible countermeasures based on opinions
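One common way to capture such rankings is a likelihood-by-impact matrix; the three-level scale and cell values below are an assumed convention for illustration, not something mandated by any particular methodology.

```python
# Minimal qualitative risk-ranking sketch (scale and matrix cells are assumptions).
MATRIX = {  # (likelihood, impact) -> qualitative risk rating
    ("Low", "Low"): "Low",      ("Low", "Medium"): "Low",        ("Low", "High"): "Medium",
    ("Medium", "Low"): "Low",   ("Medium", "Medium"): "Medium",  ("Medium", "High"): "High",
    ("High", "Low"): "Medium",  ("High", "Medium"): "High",      ("High", "High"): "High",
}

def rate(likelihood: str, impact: str) -> str:
    """Return the qualitative risk rating for a scenario."""
    return MATRIX[(likelihood, impact)]

print(rate("Medium", "High"))  # High
print(rate("Low", "Medium"))   # Low
```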
3.8.2. The Delphi Technique
3.8.2.1. a group decision method
3.9. Protection Mechanisms
3.9.1. Control Selection
3.9.1.1. a cost/benefit analysis.
3.9.1.1.1. (ALE before implementing safeguard) – (ALE after implementing safeguard) – (annual cost of safeguard) = value of safeguard to the company
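A worked example of the safeguard-value formula above; the figures are hypothetical.

```python
# Worked example of the cost/benefit formula (all figures hypothetical).
ale_before = 40_000     # ALE with no safeguard in place
ale_after = 8_000       # ALE once the safeguard is implemented
annual_cost = 15_000    # yearly cost of the safeguard

value_of_safeguard = ale_before - ale_after - annual_cost
print(f"Value of safeguard to the company: ${value_of_safeguard:,}")  # $17,000
```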
3.9.1.2. Security Control Assessment
3.10. Total Risk vs. Residual Risk
3.10.1. threats × vulnerability × asset value = total risk
3.10.2. total risk (threats × vulnerability × asset value) × controls gap = residual risk
3.10.3. total risk – countermeasures = residual risk
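These formulas are conceptual, so the sketch below treats threat likelihood, vulnerability, and the controls gap as 0-1 scores purely to illustrate how residual risk shrinks as controls improve; the numbers are made up.

```python
# Illustrative only: conceptual total-risk and residual-risk formulas with
# assumed 0-1 scores for threat, vulnerability, and the controls gap.
threat_likelihood = 0.6
vulnerability = 0.5
asset_value = 200_000
controls_gap = 0.3   # portion of the risk that current controls do NOT address

total_risk = threat_likelihood * vulnerability * asset_value
residual_risk = total_risk * controls_gap

print(f"Total risk:    {total_risk:,.0f}")     # 60,000
print(f"Residual risk: {residual_risk:,.0f}")  # 18,000
```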
3.11. Handling Risk
3.11.1. Risk can be transferred, avoided, reduced (mitigated), or accepted
4. Supply Chain Risk Management
4.1. NIST SP 800-161
4.1.1. “Supply Chain Risk Management Practices for Federal Information Systems and Organizations.”
4.2. Upstream and Downstream Suppliers
4.2.1. Hardware
4.2.2. Software
4.2.3. Services
4.2.3.1. reduce its risk when it comes to outsourcing
4.2.3.1.1. • Review the service provider’s security program
4.2.3.1.2. • Conduct onsite inspection and interviews
4.2.3.1.3. • Review contracts to ensure security and protection levels are agreed upon
4.2.3.1.4. • Ensure service level agreements are in place
4.2.3.1.5. • Review internal and external audit reports and third-party reviews
4.2.3.1.6. • Review references and communicate with former and existing customers
4.2.3.1.7. • Review Better Business Bureau reports
4.2.3.1.8. • Ensure the service provider has a business continuity plan (BCP) in place
4.2.3.1.9. • Implement a nondisclosure agreement (NDA)
4.2.3.1.10. • Understand the provider’s legal and regulatory requirements
4.2.3.2. Service Level Agreements
4.2.3.2.1. (SLA) is a contractual agreement that states that a service provider guarantees a certain level of service.
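As a rough illustration of how an SLA target might be monitored (not something prescribed in these notes), the sketch below compares measured monthly uptime against a hypothetical 99.9 percent availability guarantee.

```python
# Hypothetical SLA check: compare measured uptime against a contractual
# availability target. The 99.9% target and downtime figure are made up.
sla_target = 0.999                # provider guarantees 99.9% availability
minutes_in_month = 30 * 24 * 60   # 43,200 minutes
downtime_minutes = 50             # measured outage time this month

measured_availability = 1 - downtime_minutes / minutes_in_month
print(f"Measured availability: {measured_availability:.4%}")  # 99.8843%
print("SLA met" if measured_availability >= sla_target else "SLA breached")
```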
5. Risk Management Frameworks
5.1. Commonly Accepted Risk Management Frameworks
5.1.1. • NIST RMF (SP 800-37r1)
5.1.1.1. It takes a systems life-cycle approach to risk management and focuses on certification and accreditation of information systems
5.1.1.2. six-step process of applying the RMF
5.1.1.2.1. 1. Categorize information system.
5.1.1.2.2. 2. Select security controls.
5.1.1.2.3. 3. Implement security controls.
5.1.1.2.4. 4. Assess security controls.
5.1.1.2.5. 5. Authorize information system.
5.1.1.2.6. 6. Monitor security controls.
5.1.2. • ISO 31000:2018
5.1.2.1. this framework is not focused on information systems, but can be applied more broadly to an organization.
5.1.3. • ISACA Risk IT
5.1.3.1. it is very well integrated with COBIT
6. Business Continuity and Disaster Recovery
6.1. Concepts
6.1.1. disaster recovery plan (DRP)
6.1.2. business continuity plan (BCP)
6.1.2.1. • Provide an immediate and appropriate response to emergency situations
6.1.2.2. • Protect lives and ensure safety
6.1.2.3. • Reduce business impact
6.1.2.4. • Resume critical business functions
6.1.2.5. • Work with outside vendors and partners during the recovery period
6.1.2.6. • Reduce confusion during a crisis
6.1.2.7. • Ensure survivability of the business
6.1.3. business continuity management (BCM)
6.1.3.1. is the holistic management process that should cover both of them.
6.2. Standards and Best Practices
6.2.1. NIST SP 800-34, Revision 1, “Contingency Planning Guide for Federal Information Systems”
6.2.1.1. 1. Develop the continuity planning policy statement.
6.2.1.2. 2. Conduct the business impact analysis (BIA)
6.2.1.3. 3. Identify preventive controls.
6.2.1.4. 4. Create contingency strategies.
6.2.1.5. 5. Develop an information system contingency plan.
6.2.1.6. 6. Ensure plan testing, training, and exercises.
6.2.1.7. 7. Ensure plan maintenance.
6.2.3. standards-based
6.2.3.1. ISO/IEC 27031:2011
6.2.3.2. ISO 22301:2012
6.2.3.2.1. This standard replaced BS 25999-2.
6.2.3.3. Business Continuity Institute’s Good Practice Guidelines (GPG)
6.2.3.3.1. Management Practices:
6.2.3.3.2. Technical Practices:
6.2.3.4. DRI International Institute’s Professional Practices for Business Continuity Planners
6.2.3.4.1. • Program Initiation and Management
6.2.3.4.2. • Risk Evaluation and Control
6.2.3.4.3. • Business Impact Analysis
6.2.3.4.4. • Business Continuity Strategies
6.2.3.4.5. • Emergency Response and Operations
6.2.3.4.6. • Plan Implementation and Documentation
6.2.3.4.7. • Awareness and Training Programs
6.2.3.4.8. • Business Continuity Plan Exercise, Audit, and Maintenance
6.2.3.4.9. • Crisis Communications
6.2.3.4.10. • Coordination with External Agencies
6.3. Making BCM Part of the Enterprise Security Program
6.4. BCP Project Components
6.4.1. BCP committee
6.4.1.1. • Business units
6.4.1.2. • Senior management
6.4.1.3. • IT department
6.4.1.4. • Security department
6.4.1.5. • Communications department
6.4.1.6. • Legal department
6.4.2. The initiation process for the BCP program
6.4.2.1. • Setting up a budget and staff for the program before the BCP process begins.
6.4.2.2. • Assigning duties and responsibilities to the BCP coordinator and to representatives from all of the functional units of the organization.
6.4.2.3. • Senior management kick-off of the BCP program with a formal announcement or, better still, an organization-wide meeting to demonstrate high-level support.
6.4.2.4. • Awareness-raising activities to let employees know about the BCP program and to build internal support for it.
6.4.2.5. • Establishment of skills training for the support of the BCP effort.
6.4.2.6. • The start of data collection from throughout the organization to aid in crafting various continuity options.
6.4.2.7. • Putting into effect “quick wins” and gathering of “low-hanging fruit” to show tangible evidence of improvement in the organization’s readiness, as well as improving readiness.
6.4.3. Scope of the Project
6.4.3.1. Enterprise-wide BCP
6.4.4. BCP Policy
6.4.4.1. The process of drawing up a policy
6.4.4.1.1. 1. Identify and document the components of the policy.
6.4.4.1.2. 2. Identify and define policies of the organization that the BCP might affect.
6.4.4.1.3. 3. Identify pertinent legislation, laws, regulations, and standards.
6.4.4.1.4. 4. Identify “good industry practice” guidelines by consulting with industry experts.
6.4.4.1.5. 5. Perform a gap analysis. Find out where the organization currently is in terms of continuity planning, and spell out where it wants to be at the end of the BCP process.
6.4.4.1.6. 6. Compose a draft of the new policy.
6.4.4.1.7. 7. Have different departments within the organization review the draft.
6.4.4.1.8. 8. Incorporate the feedback from the departments into a revised draft.
6.4.4.1.9. 9. Get the approval of top management on the new policy.
6.4.4.1.10. 10. Publish a final draft, and distribute and publicize it throughout the organization.
6.4.5. Project Management
6.4.5.1. SWOT analysis
6.4.5.1.1. • Strengths Characteristics of the project team that give it an advantage over others
6.4.5.1.2. • Weaknesses Characteristics that place the team at a disadvantage relative to others
6.4.5.1.3. • Opportunities Elements that could contribute to the project’s success
6.4.5.1.4. • Threats Elements that could contribute to the project’s failure
6.4.5.2. components
6.4.5.2.1. • Objective-to-task mapping
6.4.5.2.2. • Resource-to-task mapping
6.4.5.2.3. • Workflows
6.4.5.2.4. • Milestones
6.4.5.2.5. • Deliverables
6.4.5.2.6. • Budget estimates
6.4.5.2.7. • Success factors
6.4.5.2.8. • Deadlines
6.4.6. Business Continuity Planning Requirements
6.4.6.1. Tips
6.4.6.1.1. Due diligence is normally associated with leaders, laws, and regulations
6.4.6.1.2. Due care is normally applicable to everyone and could be used to show negligence.
6.4.6.2. Business Impact Analysis (BIA)
6.4.6.2.1. • Maximum tolerable downtime and disruption for activities
6.4.6.2.2. • Operational disruption and productivity
6.4.6.2.3. • Financial considerations
6.4.6.2.4. • Regulatory responsibilities
6.4.6.2.5. • Reputation
6.4.6.3. Risk Assessment
6.4.6.3.1. Risk assessment process
6.4.6.4. Risk Assessment Evaluation and Process
6.4.6.4.1. The end goals of a risk assessment
6.4.6.4.2. Risk = Threat × Impact × Probability.
6.4.6.4.3. The main parts of a risk assessment
6.4.6.4.4. BIA Steps
6.4.6.4.5. Assigning Values to Assets
6.4.6.4.6. EXAM TIP
6.4.6.4.7. Interdependencies
7. Personnel Security
7.1. minimize the risks by implementing preventive measures
7.1.1. Separation of duties
7.1.1.1. makes sure that one individual cannot complete a critical task by herself.
7.1.1.2. split knowledge and dual control.
7.1.2. Rotation of duties (rotation of assignments)
7.1.2.1. an administrative detective control that can be put into place to uncover fraudulent activities
7.1.3. mandatory vacation
7.1.3.1. helps the organization detect any fraudulent errors or activities while the employee is away
7.2. Hiring Practices
7.2.1. Possible background check criteria
7.2.1.1. • A Social Security number trace
7.2.1.2. • A county/state criminal check
7.2.1.3. • A federal criminal check
7.2.1.4. • A sexual offender registry check
7.2.1.5. • Employment verification
7.2.1.6. • Education verification
7.2.1.7. • Professional reference verification
7.2.1.8. • An immigration check
7.2.1.9. • Professional license/certification verification
7.2.1.10. • Credit report
7.2.1.11. • Drug screening
7.3. Onboarding
7.3.1. Steps
7.3.1.1. • The new employee attends all required security awareness training.
7.3.1.2. • The new employee must read all security policies, be given an opportunity to have any questions about the policies answered, and sign a statement indicating they understand and will comply with the policies.
7.3.1.3. • The new employee is issued all appropriate identification badges, keys, and access tokens pursuant to their assigned roles.
7.3.1.4. • The IT department creates all necessary accounts for the new employee, who signs into the systems and sets their passwords (or changes any temporary passwords).
7.3.2. Nondisclosure agreements (NDAs) must be developed and signed by new employees
7.4. Termination
7.4.1. • The employee must leave the facility immediately under the supervision of a manager or security guard.
7.4.2. • The employee must surrender any identification badges or keys, be asked to complete an exit interview, and return company supplies.
7.4.3. • That user’s accounts and passwords should be disabled or changed immediately.
7.5. Security Awareness Training
7.5.1. Presenting the Training
7.5.2. Periodic Content Reviews
7.5.3. Training Assessments
8. Security Governance
8.1. Metrics
8.1.1. industry best practices
8.1.1.1. ISO/IEC 27004:2016
8.1.1.2. ISO/IEC 27001
8.1.1.3. ISO/IEC 27004
8.1.1.4. NIST SP 800-55, Revision 1
8.1.1.5. Six Sigma
8.1.1.6. the measurements of service-level targets for ITIL
9. Ethics
9.1. the (ISC)² Code of Ethics
9.1.1. • Protect society, the common good, necessary public trust and confidence, and the infrastructure
9.1.2. • Act honorably, honestly, justly, responsibly, and legally
9.1.3. • Provide diligent and competent service to principals
9.1.4. • Advance and protect the profession
9.2. The Computer Ethics Institute
9.2.1. Ten Commandments of Computer Ethics
9.2.1.1. 1. Thou shalt not use a computer to harm other people.
9.2.1.2. 2. Thou shalt not interfere with other people’s computer work.
9.2.1.3. 3. Thou shalt not snoop around in other people’s computer files.
9.2.1.4. 4. Thou shalt not use a computer to steal.
9.2.1.5. 5. Thou shalt not use a computer to bear false witness.
9.2.1.6. 6. Thou shalt not copy or use proprietary software for which you have not paid.
9.2.1.7. 7. Thou shalt not use other people’s computer resources without authorization or proper compensation.
9.2.1.8. 8. Thou shalt not appropriate other people’s intellectual output.
9.2.1.9. 9. Thou shalt think about the social consequences of the program you are writing or the system you are designing.
9.2.1.10. 10. Thou shalt always use a computer in ways that ensure consideration and respect for your fellow humans.
9.3. The Internet Architecture Board
9.3.1. It is responsible for the architectural oversight of Internet Engineering Task Force (IETF) activities, oversight and appeal of the Internet Standards Process, and the editing of Requests for Comments (RFCs).
9.3.2. unethical and unacceptable behavior
9.3.2.1. • Purposely seeking to gain unauthorized access to Internet resources
9.3.2.2. • Disrupting the intended use of the Internet
9.3.2.3. • Wasting resources (people, capacity, and computers) through purposeful actions
9.3.2.4. • Destroying the integrity of computer-based information
9.3.2.5. • Compromising the privacy of others
9.3.2.6. • Conducting Internet-wide experiments in a negligent manner
9.3.3. RFC 1087 is called “Ethics and the Internet.”
10. Fundamental Principles of Security
10.1. AIC triad
10.1.1. Availability
10.1.1.1. • Redundant array of independent disks (RAID)
10.1.1.2. • Clustering
10.1.1.3. • Load balancing
10.1.1.4. • Redundant data and power lines
10.1.1.5. • Software and data backups
10.1.1.6. • Disk shadowing
10.1.1.7. • Co-location and offsite facilities
10.1.1.8. • Rollback functions
10.1.1.9. • Failover configurations
10.1.2. Integrity
10.1.2.1. • Hashing (data integrity)
10.1.2.2. • Configuration management (system integrity)
10.1.2.3. • Change control (process integrity)
10.1.2.4. • Access control (physical and technical)
10.1.2.5. • Software digital signing
10.1.2.6. • Transmission cyclic redundancy check (CRC) functions
10.1.3. Confidentiality
10.1.3.1. • Encryption for data at rest (whole disk, database encryption)
10.1.3.2. • Encryption for data in transit (IPSec, TLS, PPTP, SSH, described in Chapter 4)
10.1.3.3. • Access control (physical and technical)
11. Security Definitions
11.1. vulnerability
11.1.1. is a weakness in a system that allows a threat source to compromise its security
11.2. threat
11.2.1. is any potential danger that is associated with the exploitation of a vulnerability
11.3. risk
11.3.1. is the likelihood of a threat source exploiting a vulnerability and the corresponding business impact
11.4. exposure
11.4.1. is an instance of being exposed to losses
11.5. “control,” “countermeasure,” and “safeguard”
11.6. threat agent
11.7. asset
12. Control Types
12.1. Controls by functionality
12.1.1. • Preventive
12.1.1.1. Locks
12.1.1.2. Badge system
12.1.1.3. Security guard
12.1.1.4. Biometric system
12.1.1.5. Mantrap doors
12.1.1.6. Security policies
12.1.1.7. Separation of duties
12.1.1.8. Information classification
12.1.1.9. Personnel procedures
12.1.1.10. Testing
12.1.1.11. Security awareness
12.1.1.12. ACLs
12.1.1.13. Encryption
12.1.1.14. Antivirus software
12.1.1.15. Smart cards
12.1.1.16. Dial-up call-back systems
12.1.2. • Detective
12.1.2.1. Motion detectors
12.1.2.2. Closed-circuit TVs
12.1.2.3. Monitoring and supervising
12.1.2.4. Job rotation
12.1.2.5. Investigations
12.1.2.6. Audit logs
12.1.2.7. IDS
12.1.3. • Corrective
12.1.3.1. Server Images
12.1.4. • Deterrent
12.1.4.1. Fences
12.1.4.2. Lighting
12.1.5. • Recovery
12.1.5.1. Offsite facility
12.1.6. • Compensating
12.2. Controls by category
12.2.1. • Administrative
12.2.1.1. • Policies and procedures
12.2.1.2. • Effective hiring practices
12.2.1.3. • Pre-employment background checks
12.2.1.4. • Controlled termination processes
12.2.1.5. • Data classification and labeling
12.2.1.6. • Security awareness
12.2.2. • Physical
12.2.2.1. • Badges, swipe cards
12.2.2.2. • Guards, dogs
12.2.2.3. • Fences, locks, mantraps
12.2.3. • Technical
12.2.3.1. • Passwords, biometrics, smart cards
12.2.3.2. • Encryption, secure protocols, call-back systems, database views, constrained user interfaces
12.2.3.3. • Antimalware software, access control lists, firewalls, intrusion prevention system
13. Security Frameworks
13.1. Security Program Development
13.1.1. • ISO/IEC 27000 series
13.1.1.1. International standards, developed jointly by ISO and IEC, on how to develop and maintain an information security management system (ISMS)
13.1.1.2. list
13.1.1.2.1. • ISO/IEC 27000 Overview and vocabulary
13.1.1.2.2. • ISO/IEC 27001 ISMS requirements
13.1.1.2.3. • ISO/IEC 27002 Code of practice for information security controls
13.1.1.2.4. • ISO/IEC 27003 ISMS implementation
13.1.1.2.5. • ISO/IEC 27004 ISMS measurement
13.1.1.2.6. • ISO/IEC 27005 Risk management
13.1.1.2.7. • ISO/IEC 27006 Certification body requirements
13.1.1.2.8. • ISO/IEC 27007 ISMS auditing
13.1.1.2.9. • ISO/IEC 27008 Guidance for auditors
13.1.1.2.10. • ISO/IEC 27011 Telecommunications organizations
13.1.1.2.11. • ISO/IEC 27014 Information security governance
13.1.1.2.12. • ISO/IEC 27015 Financial sector
13.1.1.2.13. • ISO/IEC 27031 Business continuity
13.1.1.2.14. • ISO/IEC 27032 Cybersecurity
13.1.1.2.15. • ISO/IEC 27033 Network security
13.1.1.2.16. • ISO/IEC 27034 Application security
13.1.1.2.17. • ISO/IEC 27035 Incident management
13.1.1.2.18. • ISO/IEC 27037 Digital evidence collection and preservation
13.1.1.2.19. • ISO/IEC 27799 Health organizations
13.2. Enterprise Architecture Development
13.2.1. • Zachman Framework
13.2.1.1. Model for the development of enterprise architectures developed by John Zachman
13.2.1.2. two-dimensional model
13.2.1.2.1. six basic communication interrogatives (What, How, Where, Who, When, and Why)
13.2.1.2.2. perspectives (Executives, Business Managers, System Architects, Engineers, Technicians, and Enterprise-wide)
13.2.2. • TOGAF
13.2.2.1. Model and methodology for the development of enterprise architectures developed by The Open Group
13.2.2.2. Architecture Development Method (ADM)
13.2.3. Military-Oriented Architecture Frameworks
13.2.3.1. • MODAF
13.2.3.1.1. Architecture framework used mainly in military support missions developed by the British Ministry of Defence
13.2.3.2. • DoDAF
13.2.3.2.1. U.S. Department of Defense architecture framework that ensures interoperability of systems to meet military mission goals
13.2.4. Enterprise Security Architecture
13.2.4.1. • SABSA model
13.2.4.1.1. Model and methodology for the development of information security enterprise architectures
13.2.4.1.2. provides a life-cycle model so that the architecture can be constantly monitored and improved upon over time.
13.2.4.2. Key characteristics of an enterprise security architecture
13.2.4.2.1. Strategic Alignment
13.2.4.2.2. Business Enablement
13.2.4.2.3. Process Enhancement
13.2.4.2.4. Security Effectiveness
13.3. Security Controls Development
13.3.1. • COBIT 5
13.3.1.1. by ISACA and the IT Governance Institute (ITGI)
13.3.1.2. derived from the COSO framework
13.3.1.3. Control Objectives for Information and related Technology (COBIT)
13.3.1.4. five key principles
13.3.1.4.1. 1. Meeting stakeholder needs
13.3.1.4.2. 2. Covering the enterprise end to end
13.3.1.4.3. 3. Applying a single integrated framework
13.3.1.4.4. 4. Enabling a holistic approach
13.3.1.4.5. 5. Separating governance from management
13.3.2. • NIST SP 800-53
13.3.2.1. by National Institute of Standards and Technology
13.3.3. • COSO Internal Control—Integrated Framework
13.3.3.1. by the Committee of Sponsoring Organizations (COSO) of the Treadway Commission
13.3.3.2. deal with fraudulent financial activities and reporting.
13.3.3.3. COSO IC deals more with the strategic level, while COBIT focuses more on the operational level
13.3.3.4. 17 internal control principles grouped into five components
13.3.3.4.1. Control Environment
13.3.3.4.2. Risk Assessment
13.3.3.4.3. Control Activities
13.3.3.4.4. Information and Communication
13.3.3.4.5. Monitoring Activities
13.4. Process Management Development
13.4.1. • ITIL
13.4.1.1. the de facto standard of best practices for IT service management
13.4.1.2. Information Technology Infrastructure Library
13.4.1.3. developed in the 1980s by the UK’s Central Computer and Telecommunications Agency
13.4.1.4. ITIL was created because of the increased dependence on information technology to meet business needs.
13.4.2. • Six Sigma
13.4.2.1. is a process improvement methodology
13.4.2.2. Its goal is to improve process quality by using statistical methods of measuring operation efficiency and reducing variation, defects, and waste
13.4.2.3. developed by Motorola with the goal of identifying and removing defects in its manufacturing processes.
13.4.3. • Capability Maturity Model Integration (CMMI)
13.4.3.1. by Carnegie Mellon University
13.4.3.2. Levels
13.4.3.2.1. Level 0
13.4.3.2.2. Level 1
13.4.3.2.3. Level 2
13.4.3.2.4. Level 3
13.4.3.2.5. Level 4
13.4.3.2.6. Level 5
13.4.3.3. Top-Down Approach
13.4.3.3.1. Management’s support is one of the most important pieces of a security program
13.5. Life cycle
13.5.1. 1. Plan and organize
13.5.1.1. • Establish management commitment.
13.5.1.2. • Establish oversight steering committee.
13.5.1.3. • Assess business drivers.
13.5.1.4. • Develop a threat profile on the organization.
13.5.1.5. • Carry out a risk assessment.
13.5.1.6. • Develop security architectures at business, data, application, and infrastructure levels.
13.5.1.7. • Identify solutions per architecture level.
13.5.1.8. • Obtain management approval to move forward.
13.5.2. 2. Implement
13.5.2.1. • Assign roles and responsibilities.
13.5.2.2. • Develop and implement security policies, procedures, standards, baselines, and guidelines.
13.5.2.3. • Identify sensitive data at rest and in transit.
13.5.2.4. • Implement the following blueprints:
13.5.2.4.1. • Asset identification and management
13.5.2.4.2. • Risk management
13.5.2.4.3. • Vulnerability management
13.5.2.4.4. • Compliance
13.5.2.4.5. • Identity management and access control
13.5.2.4.6. • Change control
13.5.2.4.7. • Software development life cycle
13.5.2.4.8. • Business continuity planning
13.5.2.4.9. • Awareness and training
13.5.2.4.10. • Physical security
13.5.2.4.11. • Incident response
13.5.2.5. • Implement solutions (administrative, technical, physical) per blueprint.
13.5.2.6. • Develop auditing and monitoring solutions per blueprint.
13.5.2.7. • Establish goals, SLAs, and metrics per blueprint.
13.5.3. 3. Operate and maintain
13.5.3.1. • Follow procedures to ensure all baselines are met in each implemented blueprint.
13.5.3.2. • Carry out internal and external audits.
13.5.3.3. • Carry out tasks outlined per blueprint.
13.5.3.4. • Manage SLAs per blueprint.
13.5.4. 4. Monitor and evaluate
13.5.4.1. • Review logs, audit results, collected metric values, and SLAs per blueprint.
13.5.4.2. • Assess goal accomplishments per blueprint.
13.5.4.3. • Carry out quarterly meetings with steering committees.
13.5.4.4. • Develop improvement steps and integrate into the Plan and Organize phase.
14. The Crux of Computer Crime Laws
14.1. • 18 USC 1029
14.1.1. Fraud and Related Activity in Connection with Access Devices
14.2. • 18 USC 1030
14.2.1. Fraud and Related Activity in Connection with Computers
14.3. • 18 USC 2510 et seq.
14.3.1. Wire and Electronic Communications Interception and Interception of Oral Communications
14.4. • 18 USC 2701 et seq.
14.4.1. Stored Wire and Electronic Communications and Transactional Records Access
14.5. • Digital Millennium Copyright Act
14.6. • Cyber Security Enhancement Act of 2002
15. Complexities in Cybercrime
15.1. Electronic Assets
15.2. The Evolution of Attacks
15.2.1. advanced persistent threat (APT)
15.2.2. Common Internet Crime Schemes
15.2.2.1. • Auction fraud
15.2.2.2. • Counterfeit cashier’s check
15.2.2.3. • Debt elimination
15.2.2.4. • Parcel courier e-mail scheme
15.2.2.5. • Employment/business opportunities
15.2.2.6. • Escrow services fraud
15.2.2.7. • Investment fraud
15.2.2.8. • Lotteries
15.2.2.9. • Nigerian letter, or “419”
15.2.2.10. • Ponzi/pyramid
15.2.2.11. • Reshipping
15.2.2.12. • Third-party receiver of funds
15.3. International Issues
15.3.1. Organisation for Economic Co-operation and Development (OECD) Guidelines
15.3.1.1. core principles
15.3.1.1.1. • Collection Limitation
15.3.1.1.2. • Data Quality Principle
15.3.1.1.3. • Purpose Specification Principle
15.3.1.1.4. • Use Limitation Principle
15.3.1.1.5. • Security Safeguards Principle
15.3.1.1.6. • Openness Principle
15.3.1.1.7. • Individual Participation Principle
15.3.1.1.8. • Accountability Principle
15.3.2. General Data Protection Regulation (GDPR) 2016
15.3.2.1. three relevant entities
15.3.2.1.1. • Data subject
15.3.2.1.2. • Data controller
15.3.2.1.3. • Data processor
15.3.2.2. privacy data
15.3.2.2.1. • Name
15.3.2.2.2. • Address
15.3.2.2.3. • ID numbers
15.3.2.2.4. • Web data (location, IP address, cookies)
15.3.2.2.5. • Health and genetic data
15.3.2.2.6. • Biometric data
15.3.2.2.7. • Racial or ethnic data
15.3.2.2.8. • Political opinions
15.3.2.2.9. • Sexual orientation
15.3.2.3. Role
15.3.2.3.1. Data Protection Officer (DPO)
15.3.2.4. Key provisions of the GDPR
15.3.2.4.1. • Consent
15.3.2.4.2. • Right to be informed
15.3.2.4.3. • Right to restrict processing
15.3.2.4.4. • Right to be forgotten
15.3.2.4.5. • Data breaches
15.3.3. Import/Export Legal Requirements
15.3.3.1. • Category 1 Special Materials and Related Equipment
15.3.3.2. • Category 2 Materials Processing
15.3.3.3. • Category 3 Electronics
15.3.3.4. • Category 4 Computers
15.3.3.5. • Category 5 Part 1: Telecommunications
15.3.3.6. • Category 5 Part 2: Information Security
15.3.3.7. • Category 6 Sensors and Lasers
15.3.3.8. • Category 7 Navigation and Avionics
15.3.3.9. • Category 8 Marine
15.3.3.10. • Category 9 Aerospace and Propulsion
15.3.4. Types of Legal Systems
15.3.4.1. Civil (Code) Law System
15.3.4.1.1. Rule-based (codified) law rather than precedent-based (case law); distinct from the common law system
15.3.4.2. Common Law System
15.3.4.3. Criminal
15.3.4.3.1. • Based on common law, statutory law, or a combination of both
15.3.4.3.2. • Addresses behavior that is considered harmful to society.
15.3.4.4. Civil/Tort
15.3.4.4.1. • Offshoot of criminal law.
15.3.4.4.2. • Damages are usually physical or financial.
15.3.4.5. Administrative (regulatory):
15.3.4.6. Customary Law System
15.3.4.7. Religious Law System
15.3.4.8. Mixed Law System
16. Intellectual Property Laws
16.1. Trade Secret
16.1.1. is something that is proprietary to a company and important for its survival and profitability.
16.2. Copyright
16.3. Trademark
16.3.1. is used to protect a word, name, symbol, sound, shape, color, or combination of these
16.3.2. World Intellectual Property Organization (WIPO)
16.4. Patent
16.4.1. is the strongest form of intellectual property protection.
16.5. Internal Protection of Intellectual Property
16.6. Software Piracy
16.6.1. End User License Agreement (EULA)
16.6.2. The Federation Against Software Theft (FAST)
16.6.3. Business Software Alliance
16.6.4. Digital Millennium Copyright Act (DMCA)
16.6.4.1. a U.S. copyright law that criminalizes the production and dissemination of technology, devices, or services designed to circumvent measures that control access to copyrighted works
16.6.5. Copyright Directive
16.6.5.1. The European Union passed a similar law
17. Privacy
17.1. Personally identifiable information (PII)
17.1.1. Typical components
17.1.1.1. • Full name (if not common)
17.1.1.2. • National identification number
17.1.1.3. • IP address (in some cases)
17.1.1.4. • Vehicle registration plate number
17.1.1.5. • Driver’s license number
17.1.1.6. • Face, fingerprints, or handwriting
17.1.1.7. • Credit card numbers
17.1.1.8. • Digital identity
17.1.1.9. • Birthday
17.1.1.10. • Birthplace
17.1.1.11. • Genetic information
17.1.2. can also fall under PII when combined with other information
17.1.2.1. • First or last name, if common
17.1.2.2. • Country, state, or city of residence
17.1.2.3. • Age, especially if nonspecific
17.1.2.4. • Gender or race
17.1.2.5. • Name of the school they attend or workplace
17.1.2.6. • Grades, salary, or job position
17.1.2.7. • Criminal record
17.2. Law
17.2.1. Federal Privacy Act of 1974
17.2.2. Gramm-Leach-Bliley Act of 1999 (GLBA)
17.2.2.1. also known as the Financial Services Modernization Act of 1999
17.2.2.2. Financial Privacy Rule
17.2.2.3. Safeguards Rule
17.2.2.4. Pretexting Protection
17.2.2.5. applies to any organization that provides financial products or services to individuals
17.2.3. Health Insurance Portability and Accountability Act (HIPAA)
17.2.3.1. HIPAA mandates steep federal penalties for noncompliance
17.2.4. HITECH
17.2.4.1. Health Information Technology for Economic and Clinical Health (HITECH) Act
17.2.4.2. addresses the privacy and security concerns associated with the electronic transmission of health information
17.2.5. USA PATRIOT Act
17.2.5.1. expanded law enforcement powers
17.2.6. Canada’s Personal Information Protection and Electronic Documents Act
17.2.6.1. Personal Information Protection and Electronic Documents Act (PIPEDA)
17.2.8. New Zealand’s Privacy Act of 1993
17.2.9. Payment Card Industry Data Security Standard (PCI DSS)
17.2.9.1. Secure Sockets Layer (SSL) and early Transport Layer Security (TLS) are not considered secure.
17.2.9.2. control objectives
17.2.9.2.1. 1. Install and maintain a firewall configuration to protect cardholder data.
17.2.9.2.2. 2. Do not use vendor-supplied defaults for system passwords and other security parameters.
17.2.9.2.3. 3. Protect stored cardholder data.
17.2.9.2.4. 4. Encrypt transmission of cardholder data across open, public networks.
17.2.9.2.5. 5. Use and regularly update anti-virus software or programs.
17.2.9.2.6. 6. Develop and maintain secure systems and applications.
17.2.9.2.7. 7. Restrict access to cardholder data by business need to know.
17.2.9.2.8. 8. Assign a unique ID to each person with computer access.
17.2.9.2.9. 9. Restrict physical access to cardholder data.
17.2.9.2.10. 10. Track and monitor all access to network resources and cardholder data.
17.2.9.2.11. 11. Regularly test security systems and processes.
17.2.9.2.12. 12. Maintain a policy that addresses information security for employees and contractors.
17.2.10. Federal Information Security Management Act (FISMA) of 2002
17.2.10.1. • Inventory of information systems
17.2.10.2. • Categorize information and information systems according to risk level
17.2.10.3. • Security controls
17.2.10.4. • Risk assessment
17.2.10.5. • System security plan
17.2.10.6. • Certification and accreditation
17.2.10.7. • Continuous monitoring
17.3. Ways to Deal with Privacy
17.3.1. • Laws on government: FPA, VA ISA, USA PATRIOT
17.3.2. • Laws on corporations: HIPAA, HITECH, GLBA, PIPEDA
17.3.3. • Self-regulation: PCI DSS
17.3.4. • Individual user: passwords, encryption, awareness
18. Data Breaches
18.1. U.S. Laws Pertaining to Data Breaches
18.1.1. Health Insurance Portability and Accountability Act
18.1.2. Health Information Technology for Economic and Clinical Health Act
18.1.3. Gramm-Leach-Bliley Act of 1999
18.1.4. Economic Espionage Act of 1996
18.1.5. State Laws
18.2. Other Nations’ Laws Pertaining to Data Breaches
19. Policies, Standards, Baselines, Guidelines, and Procedures
19.1. Level
19.1.1. • Strategic
19.1.1.1. • Security policy
19.1.2. • Tactical
19.1.2.1. • Mandatory standards
19.1.2.2. • Recommended guidelines
19.1.2.3. • Detailed procedures
19.2. Security Policy
19.2.1. is an overall general statement produced by senior management
19.2.2. The policy provides the foundation
19.2.3. Types of Policies
19.2.3.1. • Regulatory
19.2.3.2. • Advisory
19.2.3.3. • Informative
19.2.4. outlined
19.2.4.1. issue-specific policies
19.2.4.2. system-specific policy
19.2.5. a common hierarchy of security policies
19.2.5.1. • Organizational policy
19.2.5.1.1. • Acceptable use policy
19.2.5.1.2. • Risk management policy
19.2.5.1.3. • Vulnerability management policy
19.2.5.1.4. • Data protection policy
19.2.5.1.5. • Access control policy
19.2.5.1.6. • Business continuity policy
19.2.5.1.7. • Log aggregation and auditing policy
19.2.5.1.8. • Personnel security policy
19.2.5.1.9. • Physical security policy
19.2.5.1.10. • Secure application development policy
19.2.5.1.11. • Change control policy
19.2.5.1.12. • E-mail policy
19.2.5.1.13. • Incident response policy
19.3. Standards
19.3.1. Standards refer to mandatory activities, actions, or rules.
19.3.2. e.g., the ISO/IEC 27000 series
19.4. Baselines
19.4.1. refers to a point in time that is used as a comparison for future changes
19.4.2. e.g., an Evaluation Assurance Level (EAL) 4 baseline
19.5. Guidelines
19.5.1. are recommended actions and operational guides to users, IT staff, operations staff, and others
19.6. Procedures
19.6.1. are detailed step-by-step tasks that should be performed to achieve a certain goal
19.7. Implementation
19.7.1. Implementing and supporting these documents shows due care