Wednesday, October 30, 2019

Site Reliability Engineering (SRE)

Resources:

  • SRE Weekly. SRE Weekly is a newsletter devoted to everything related to keeping a site or service available as consistently as possible.  SRE (Site/Service Reliability Engineering) isn’t just about automated failover or fault-tolerant architectures — although of course those are important.  It’s about a holistic view of reliability that takes into account everything from servers to human factors to processes to automation and more. Blog, RSS feed, and Twitter.
  • USENIX SREcon. SREcon is a gathering of engineers who care deeply about site reliability, systems engineering, and working with complex distributed systems at scale. Our purpose is to be inclusive as we bring together ideas representative of our diverse community, whether its members are focusing on a global scale, launching new products and ideas for a small business, or pivoting their approach to unite software and systems engineering. SREcon challenges both those new to the profession and those who have been involved in it for decades. The conference has a culture of critical thought, deep technical insights, continuous improvement, and innovation.

Tuesday, August 13, 2019

PII and PHI in the DoD

Personally Identifiable Information (PII) and Protected Health Information (PHI) are private information that can be used to distinguish or identify an individual. The PII the government collects must be relevant, accurate, timely, and complete (per the Privacy Act of 1974). PII is normally stored in records that only individuals with a need-to-know may access.
PHI concerns records of an individual's physical or mental health. It is a subset of PII that requires additional safeguards: individually identifiable health information, such as medical history or medical billing information, created or received by a covered entity and relating to 1) the past, present, or future physical or mental health of an individual, 2) the provision of health care to an individual, or 3) past, present, or future payment for the provision of health care to an individual. Covered entities, which are authorized to handle PHI, include health plans and almost all health care providers engaged in electronic billing and eligibility verification transactions. In the case of DoD, the TRICARE program is a covered entity health plan, and Military Treatment Facilities are health care provider covered entities. DoD 6025.18-R, DoD Health Information Privacy Regulation, contains detailed information on which DoD Components are and are not considered covered entities and on the obligations of non-covered entity DoD Components when they act as business associates of DoD covered entity Components. When PHI is shared with a non-covered entity, it is no longer PHI and is treated as PII. DoDI 8580.02, Security of Individually Identifiable Health Information in DoD Health Care Programs, defines exceptions and situations that are not considered covered entities.
http://www.dtic.mil/whs/directives/corres/pdf/540011r.pdf
DoD 5400.11-R, May 14, 2007
DL1. DEFINITIONS
DL1.14. Personal Information. Information about an individual that identifies, links, relates, or is unique to, or describes him or her, e.g., a social security number; age; military rank; civilian grade; marital status; race; salary; home/office phone numbers; other demographic, biometric, personnel, medical, and financial information, etc. Such information is also known as personally identifiable information (i.e., information which can be used to distinguish or trace an individual’s identity, such as their name, social security number, date and place of birth, mother’s maiden name, biometric records, including any other personal information which is linked or linkable to a specified individual).
http://www.dtic.mil/whs/directives/corres/pdf/540011p.pdf
DoDD 5400.11, May 8, 2007
E2. ENCLOSURE 2
E2.2. Personal Information. Information about an individual that identifies, links, relates, or is unique to, or describes him or her (e.g., a social security number; age; military rank; civilian grade; marital status; race; salary; home or office phone numbers; other demographic, biometric, personnel, medical, and financial information, etc.). Such information also is known as personally identifiable information (e.g., information which can be used to distinguish or trace an individual's identity, such as his or her name; social security number; date and place of birth; mother's maiden name; and biometric records, including any other personal information which is linked or linkable to a specified individual).
Regulations which apply to the administrative, physical, and technological controls pertaining to PII include:  the Privacy Act of 1974, the Freedom of Information Act (FOIA), the Health Insurance Portability and Accountability Act (HIPAA) Privacy and Security Rules (45 CFR Parts 160 and 164), and the Health Information Technology for Economic and Clinical Health (HITECH) Act.  The Privacy Act of 1974 requires establishment of rules of conduct and safeguards for PII and requires the US Government to maintain accurate, relevant, timely, and complete information. The FOIA keeps the public informed while protecting government interests and protects access to records exempt under the Privacy Act. The HIPAA Privacy and Security Rules 1) establish national standards for PHI use and disclosure and individual rights and 2) establish national standards for administrative, physical, and technical PHI safeguards. HITECH establishes breach notification standards for PHI, expands HIPAA security requirements to business associates of covered entities, and expands penalties for HIPAA violations.

Risks associated with the misuse or improper disclosure of PII include:
  • Legal liability of the organization
  • Theft of the identity of the subject of the PII
  • Expense to the organization
  • Damage to the subject of the PII's reputation
  • Inconvenience to the subject of the PII
  • Loss of trust in the organization
Administrative, physical, and technical safeguards at the organizational level include:
Administrative safeguards include:
  • Restricting access to PII to people with need-to-know.
  • Reduce the volume and use of SSNs. Find alternative identification methods such as a CAC EDIPI.
  • Train individuals (orientation, specialized, management, training on systems of records containing PII). Ensure individuals can determine authorized access and need-to-know and limit access accordingly. Make sure use of PII matches the SORN.
  • Establish and follow policies and procedures for handling PII, specifically - defining the impact to affected individuals, actions, assigned agency roles and responsibilities, and consequences associated with losing or misusing PII.
  • Conduct risk assessments (Privacy Impact Assessment, or PIA) before collecting any PII to assess the level of risk to the individual or organization in collecting or maintaining PII. A PIA is conducted before an organization processes PII; it ensures the organization meets legal, regulatory, and policy requirements and determines the risks of collecting, using, maintaining, and disseminating PII on electronic information systems. A PIA is required for an existing information system or electronic collection for which no PIA was previously completed, and for new information systems or electronic collections 1) before development or purchase and 2) when converting paper records to electronic systems. A PIA is not required when the information system or electronic collection does not collect, maintain, or disseminate PII; is a National Security System (NSS), including one that processes classified information; or is solely paper-based.
  • Review and report on PII holdings (System of Records Notice, or SORN) annually in the Federal Register and report status to Congress. SORNs include a list of PII collected, system safeguards, purpose of collection, and access/correction processes.
  • Individuals should:  monitor and minimize the use of SSNs, determine authorized access and need-to-know, determine if PII is necessary, ensure PII matches what is published in the SORN, be aware of the surrounding environment when engaging in a conversation involving PII, and determine and ensure PII is correct (obtain it directly from the subject, verify the information is accurate, relevant, timely, and necessary, and ensure any other information comes from an authorized, accurate source such as Government sources).
Physical safeguards include:
  • Properly store records in accordance with agency policy and procedures.
  • Employ access controls.
  • Secure hardware by locking it in a secure room.
  • Establish policies and procedures for handling, transmitting, and disposing of paper and electronics (must comply with National Archives and Records Administration, or NARA, requirements for records retention and disposal).
  • Test safeguards to ensure they are operating as intended.
  • Mark as CUI.
  • When transporting or transmitting PII, use a cover sheet, the appropriate postal class, and wrapping.
  • Protect PII during working and non-working hours.
  • Follow NARA's Records Management guidelines for record retention and disposal (render destroyed PII unrecognizable and beyond reconstruction via shredding or incineration).
Technical controls include:
  • Encrypting records in accordance with agency policies and procedures.
  • Provide secure systems for storing and transmitting electronic records which include:  encryption, remote access controls, time-out functionality, and logging and verifying access.
  • Ensure implementation of role-based access controls for the workforce.
  • Ensure workforce members understand their responsibilities for safeguarding electronic records.
  • Use only government-approved devices and software.
  • Use appropriate encryption (a minimal file-encryption sketch follows this list).
  • Use access controls to limit access on shared drives to individuals with need-to-know.
  • Following agency policies and procedures for transmission of PII (i.e. confirming email or fax receipt, encrypting email, verification of need-to-know on mailing lists).
  • Follow rules for telework:  use of appropriate/approved systems to process PII, obtain approval from manager before extracting PII to another computer, never transmit PII via personal email.
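To make the encryption items above concrete, here is a minimal sketch (my own illustration, not taken from any of the cited regulations) of encrypting a file containing PII before it is transmitted or staged for transfer, using the Python cryptography library's Fernet recipe. Key management is out of scope, and actual DoD use must employ FIPS-validated, agency-approved cryptography.

```python
# Minimal sketch: symmetric encryption of a file containing PII prior to
# transmission or storage. Key handling is deliberately simplistic; follow
# agency policy (FIPS-validated modules, approved key management) in practice.
from cryptography.fernet import Fernet

def encrypt_file(path: str, key: bytes) -> str:
    """Encrypt `path` and write the ciphertext to `path + '.enc'`."""
    with open(path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    out_path = path + ".enc"
    with open(out_path, "wb") as f:
        f.write(ciphertext)
    return out_path

def decrypt_file(enc_path: str, key: bytes) -> bytes:
    """Return the decrypted contents of an encrypted file."""
    with open(enc_path, "rb") as f:
        return Fernet(key).decrypt(f.read())

if __name__ == "__main__":
    key = Fernet.generate_key()            # store the key separately from the data
    enc = encrypt_file("roster_with_pii.csv", key)
    print(decrypt_file(enc, key)[:80])
```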
Administrative, physical, and technical safeguards at the individual level include:

Administrative safeguards at the individual level include:
  • Monitor and minimize the use of SSNs. Use DoD ID numbers wherever possible for internal DoD business processes.
  • Determine authorized access and need-to-know (e.g. verify user credentials/authority) and limit access accordingly.
  • Ask yourself:
    • Is using PII necessary? Consider whether the task can be completed without PII. If it cannot be completed without using or disclosing PII, do not use or disclose more PII than the minimum amount necessary to support the use or disclosure.
    • Make sure the use of PII matches the purpose of collection in the SORN. Do not use information that was previously collected in a system of records for a new use before altering the existing SORN or creating a new one and publishing it in the Federal Register. Do not even use a subset of existing PII for a new purpose, and do not maintain data collections in secret.
    • Is it safe to talk about this PII? Be aware of the surrounding environment when engaging in conversation involving PII. Ensure that telephone conversations are private. 
    • Do you have the right information, and is it correct? If possible, collect information directly from the subject of the PII; this will guarantee you have the most up-to-date information. Verify the data is accurate, relevant, timely, and necessary. Ensure that other information comes from the authorized official source, such as government sources. 
Physical safeguards at the individual level include:
  • Mark PII as CUI.
  • Use cover sheets, postal class, or wrapping when transporting.
  • Reduce the risk of access to PII during business hours by covering it or placing it out of sight when not directly working on it. Lock your computer when leaving it unattended. Store PII appropriately after working hours (e.g. in locked or unlocked containers, desks, or cabinets). Dispose of all paper or electronic records according to the standards defined in the SORN or by NARA. You must render discarded PII unrecognizable and beyond reconstruction.
Technical safeguards at the individual level include:
  • Use government-approved devices and software. This safeguards PII because the government has determined these devices and software provide adequate protection. 
  • Make sure you are encrypting PII appropriately.
  • Use access controls to limit access to PII on shared drives to individuals with a need-to-know.
  • Follow agency policies and procedures for transmitting PII, such as confirming fax receipt, encrypting emails, verifying email distribution lists contain only authorized individuals.
  • Follow rules for telework, including using only a government-furnished computer, getting approval from a manager before extracting PII onto that computer, and never transmitting PII via personal email.
A Privacy Impact Assessment (PIA) is a risk assessment that must be completed before an organization begins gathering PII. A PIA analyzes how an organization handles information to ensure it satisfies requirements and determines the risks of collecting, using, maintaining, and disseminating PII on electronic information systems. DD Form 2930, Privacy Impact Assessment, June 2017, is used. A PIA is required when an organization collects PII from:
  • Existing information systems and electronic collections for which no PIA was previously completed.
  • New information systems or electronic collections:
    • Before development or purchase
    • When converting paper records to electronic systems.
A PIA is not required when the information system or electronic collection:
  • Does not collect, maintain, or disseminate PII.
  • Is a National Security System, including ones that process classified information.
  • Is solely paper-based.
A PIA:
  • Analyzes how an organization handles information to ensure it satisfies legal, regulatory, and policy requirements.
  • Determines the risks of collecting, using, maintaining, and disseminating PII on electronic information systems.
  • Aims to mitigate unauthorized use or disclosure risks.
If an organization determines that it must collect and maintain PII that will be retrieved by a personal identifier within that system, a System of Records Notice (SORN) must be published in the Federal Register. A System of Records is a group of records under an organization or agency's control from which personal information about an individual is retrieved using the individual's name or some other unique identifier. A SORN notifies the public that an agency will collect and retrieve PII in a system of records. It must be published before PII is collected. SORNs include:  the legal authority to collect the PII, the type of PII that will be collected, the safeguards in place to protect the PII, how individuals can determine if they are part of that system, and how they can obtain a copy of their record if they are part of that system.

All of the safeguards and best practices that apply to PII also apply to PHI, but PHI receives greater protection. The HIPAA Privacy and Security Rules require covered entities to have in place appropriate administrative (e.g. access restrictions), technical (e.g. encryption), and physical (e.g. proper physical records storage) safeguards to protect PHI. Based on its risk analysis and risk management plan, each covered entity can establish its own specific administrative, physical, and technical safeguards. They do this by evaluating their needs, the types of PHI involved, and specific business risks. For DoD covered entity Components, individual implementation is subject to DoD's implementation of the HIPAA Privacy and Security Rules. Covered entities must train workforce members on the policies and procedures that apply to PHI. Other than required disclosures of an individual's PHI to that individual or use and disclosure of PHI for treatment of an individual, covered entities must limit permitted uses and disclosures of PHI to the minimum necessary to accomplish the purpose of that use or disclosure. Non-covered entities that are business associates of covered entities also have responsibilities to safeguard PHI. They must make sure their workforce members are able to recognize PHI and understand that the HIPAA Privacy and Security Rules provide additional protection and controls for PHI beyond what is required for PII.
DoD 6025.18-R, "DoD Health Information Privacy Regulation." JAN 2003, implements the HIPAA Privacy Rule within DoD and its Components and defines permitted uses and disclosures of PHI. DoD 8580.02-R, "DoD Health Information Security Regulation," 12 JUL 2007 implements the HIPAA Security Rule within DoD and its Components and defines administrative, physical, and technical safeguards for electronic PHI.

DoD 8580.02-R, DoD Health Information Security Regulation, contains requirements for electronic PHI risk analyses and requires organizations to:
  • Assess potential risks and vulnerabilities to confidentiality, integrity, and availability of all electronic PHI they create, receive, store, or transmit.
  • Conduct a risk analysis that includes:  threat assessment, exploitable vulnerabilities, and residual risk determination.
  • Consider both organizational and technical assessments that address all security areas.
  • Consider all losses to be expected if security measures were not in place, including losses caused by unauthorized use and disclosure, as well as losses of data integrity or accuracy.
Authorized use and disclosure of PII: 
The Privacy Act limits an agency's right to disclose PII to any person or other agency. The Disclosure Rule:  no record in a system of records may be disclosed unless:
  • the individual to whom the record pertains makes a written request or gives prior consent, or
  • the disclosure is made under one of the Privacy Act's 12 permitted disclosures, including routine use as defined in the SORN (routine use is a purpose compatible with the purpose for which the government collected the information).
The 12 exceptions to the Privacy Act that permit disclosure of PII to another person or agency:
  1. To officers and employees of the agency maintaining the record who have a need for the PII in their performance of duties
  2. When the FOIA requires release
  3. For a "routine use" identified in the SORN that has been published in the Federal Register. The Privacy Act defines "routine use" as disclosure of a record for a purpose compatible with the purpose for which the government collected it. The SORN identifies routine uses and disclosures of the system's records.
  4. To the Census Bureau for the purposes of conducting a census or survey
  5. For statistical research or reporting without individually identifying data
  6. To the National Archives and Records Administration (NARA)
  7. To a law enforcement agency for a civil or criminal investigation
  8. When there are compelling or emergency circumstances affecting someone's health or safety
  9. To Congress, including its committees or subcommittees
  10. To the Government Accountability Office and Comptroller General
  11. Pursuant to a court order
  12. To a consumer reporting agency
A breach occurs when an individual or organization improperly uses or discloses PII, or when PII is lost, stolen, compromised, or shared with parties without a need-to-know or outside of routine use. OMB M-17-12 requires agencies to establish a breach notification policy and plan. If a breach occurs, the head of the organization must notify the proper individuals and provide this information:
  • What happened, the date of the breach, and how it was discovered.
  • Types of personal information involved.
  • Whether the information was encrypted or otherwise protected.
  • Steps taken to protect the affected individuals from harm.
  • Agency investigation and remediation actions.
  • Point of contact for affected individuals.
Within the DoD, per DoD 5400.11-R, the organization must report the discovery of a breach within:
  • one hour to US-CERT
  • 24 hours to Component Privacy Office
  • 48 hours to the Defense Privacy, Civil Liberties, and Transparency Division
After discovering a potential breach of PII, an individual should:
  • Immediately notify the appropriate authority:  Supervisor, Privacy Officer, or System Manager. In some cases, your manager may have to notify DoD and national authorities, as well as those individuals whose personal data was compromised.
  • Document when and where the potential breach was found:  record the URL for PII on the Internet.
  • If the breach includes PHI, a simultaneous, independent report is required.
Authorized use and disclosure of PHI. A covered entity may use or disclose PHI:
  • To the subject of the PHI
  • Pursuant to the PHI subject's written authorization
  • For treatment, payment, or health care operations
  • As otherwise permitted or required by the HIPAA Privacy Rule
A breach of PHI is defined as the acquisition, access, use, disclosure of PHI in a manner not permitted under the HIPAA Privacy Rule that compromises the security or privacy of PHI.
For a breach of PHI affecting fewer than 500 individuals, covered entities must:
  • Notify individuals whose PHI may have been breached within 60 days of the incident.
  • Report all such breaches to the Secretary of Health and Human Services (HHS) annually.
For PHI breaches affecting 500 or more individuals, covered entities must, within 60 days of the incident:
  • Notify individuals whose PHI may have been breached.
  • Notify the Secretary of HHS.
  • Notify prominent media outlets serving the State or jurisdiction where affected individuals reside, usually by issuing a press release.
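The two notification paths above reduce to a simple rule keyed on the number of affected individuals. The sketch below just encodes the bullets for illustration; it is not a substitute for the HIPAA Breach Notification Rule itself.

```python
# Sketch: required PHI breach notifications, keyed on the number of affected
# individuals (summarizing the two cases listed above, not the regulation text).
def phi_breach_notifications(affected_individuals: int) -> list[str]:
    notices = ["Affected individuals, within 60 days of the incident"]
    if affected_individuals >= 500:
        notices += [
            "Secretary of HHS, within 60 days of the incident",
            "Prominent media outlets serving the affected State or jurisdiction",
        ]
    else:
        notices.append("Secretary of HHS, in the annual report")
    return notices

print(phi_breach_notifications(120))
print(phi_breach_notifications(5000))
```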
In the event of a breach of PHI, individuals working at a covered entity must:
  • Report lost, stolen, or compromised PHI to the organization's designated official, describing the information compromised and when and where the discovery occurred.
If organizations and individuals do not comply with the laws for safeguarding PII and PHI, they are subject to civil penalties for not protecting PII and PHI. Individuals are subject to criminal penalties for PII and potentially civil penalties for PHI.
Organizations can incur civil penalties for:
  • unlawfully refusing to remove a record.
  • unlawfully refusing access to a record.
  • Failing to maintain accurate, relevant, timely, and complete information.
  • Failing to comply with any Privacy Act provision or agency rule that adversely affects the subject of the record.
  • Failing to comply with the HIPAA Privacy and Security Rules.
Penalties for failing to protect PII include damages and reasonable attorney fees. Penalties for failing to protect PHI include fines dependent on the nature and extent of harm caused.
Criminal penalties can be incurred by any official or employee who:
  • Knowingly discloses PII from a system of records to an unauthorized person.
  2. Fails to publish a SORN in the Federal Register for a system of records.
Criminal penalties for individuals failing to comply with regulations to protect PII include a misdemeanor conviction and a fine of up to $5,000. Individuals who knowingly or wrongfully obtain or disclose PHI are subject to criminal penalties of up to one year in prison and a fine of up to $50,000. For offenses committed under false pretenses or for commercial purposes (such as selling PHI or using PHI for personal gain or malicious harm), the criminal penalties are up to 10 years in prison and fines of up to $250,000.
Individuals can incur civil penalties for failing to comply with the requirements and standards of the HIPAA rules. The HITECH Act increased the fines in 2009. Four tiers of penalties are defined for failure to protect PHI, depending on the severity:  Tier A, Tier B, Tier C, and Tier D:
  • Tier A. Violations in which the offender did not realize he or she violated the act (and by exercising reasonable diligence would not have known):  $100 to $50,000 for each occurrence.
  • Tier B. Violations due to reasonable cause and not to willful neglect:  $1,000 to $50,000 for each occurrence.
  • Tier C. Violations due to willful neglect that the organization did not correct within 30 days of when the violation was discovered (or should have been discovered):  $10,000 to $50,000 per violation.
  • Tier D. Violations due to willful neglect that the organization did not correct:  $50,000 to $1.5M for identical violations occurring within a calendar year.
Consequences of PII Loss to Individuals:
  • Identity theft
  • Fraud (e.g. credit card)
  • Other criminal acts
  • Damage to reputation
  • Embarrassment
  • Inconvenience
Consequences of PII Loss to Organizations
  • Loss of trust
  • Legal liability
  • Remediation costs (e.g. providing credit monitoring services to individuals affected by the lost or misused PII). The DoD cost due to the 2015 OPM breach was $132M.
Examples of PII:
  • Social Security Number (SSN)
  • Driver's License Number
  • Fingerprint
REFERENCES:
  • Identifying and Safeguarding PII Course
  • DoD 6025.18-R​, "DoD Health Information Privacy Regulation," JAN 2003. Implements the HIPAA Privacy Rule within the DoD and its Components. Sets out permitted uses and disclosures of PHI.
  • DoDI 8580.02, "Security of Individually Identifiable Health Information in DoD Health Care Programs," August 12, 2015. Implements the HIPAA Security Rule within DoD and its Components. Defines administrative, physical, and technical safeguards for electronic PHI.
  • DoD 8580.02-R, "DoD Health Information Security Regulation," 12 JUL 2007.
  • DoD 5400.11-R, "Department of Defense Privacy Program," 14 MAY 2007. Regulation governing DoD Privacy Program.
  • DoDD 5400.11, "DoD Privacy and Civil Liberties Program," 29 January 2019, Change 1 December 8, 2020. Defines DoD Privacy Program.
  • http://www.doncio.navy.mil/contentView.aspx?id=2428
  • Army Course:  WNSF - Personally Identifiable Information (PII) v2.0
  • E-Government Act of 2002. Improves interaction and communication between public and private sectors. Amended by FISMA.
  • Federal Information Security Modernization Act (FISMA) of 2014. Identifies Federal information security controls.
  • OMB Guidance for Implementing the Privacy Provisions of the E-Government Act of 2002. Guides implementation for protecting information.
  • OMB Memorandum M-17-12, Preparing for and Responding to a Breach of Personally Identifiable Information. Establishes policy for Federal agencies to prepare for and respond to breaches of PII.
  • OMB Memorandum 07-16, Safeguarding Against and Responding to the Breach of Personally Identifiable Information (check if this has been superseded by M-17-12)
  • ISO 29134 (Guidelines for Privacy Impact Assessment, June 2017)

Friday, July 12, 2019

Where To Start with Selecting a Cloud Service Provider (CSP) for DoD

The fundamental resource for cloud deployments in DoD is the DISA IASE. Until 15 Dec 2014 (the date of the DoD CIO memo that permits DoD components to acquire cloud services directly), the DISA ECSB (Enterprise Cloud Service Broker) was the designated broker for cloud services in the DoD (see DoD Enterprise Cloud Service Broker Cloud Security Model, Version 2.1, March 13, 2014).
DoD uses the NIST SP 800-145 definition of cloud services to determine if a Cloud Service Offering (CSO) should be considered "cloud." Typical cloud services include any "as-a-service" offering:
  • Software As A Service (SaaS)
  • Platform As A Service (PaaS)
  • Infrastructure As A Service (IaaS)
  • Storage As A Service
Note the information below was written prior to the publishing of the SRG (the SRG replaced the cloud security model documents) and needs to be updated! The most current DISA cloud computing requirements are here:  DoD Cloud Computing Security. The current Security Requirements Guide is v1r2, published 25 MAR 2016.
The Cloud Security Model (http://iase.disa.mil/cloud_security/Documents/cloudsecuritymodel_v2-1_20140320.pdf) is probably the first reference to go through. One of the first things that you need to do before selecting a Cloud Service Provider (CSP) is figure out what Impact Level (1-6) your service falls in. That will govern which CSPs can host your application or data, because they will need a Provisional Authorization (PA) for their service at that Impact Level. For example, AWS East/West (E/W) is approved at Impact Levels 1 and 2, GovCloud is approved at Level 4, and IBM CMSG is approved at Level 5 (there are currently no CSPs approved at Impact Level 6, which is for classified systems or data). The DoD approved CSOs are listed in the DoD Cloud Services Catalog.
To categorize your data and thus come up with an Impact Level, you use NIST SP 800-60 (two volumes; Volume II is what you need) to categorize the information type(s) stored, processed, disseminated, or otherwise used by the system. I have completed this activity for a few AGC services:  SDSFIE, IENC, and DDASS. The link to NIST SP 800-60 is here:  http://csrc.nist.gov/publications/nistpubs/800-60-rev1/SP800-60_Vol2-Rev1.pdf. You'll end up with an assignment for Confidentiality, Integrity, and Availability (CIA) for the system, each of which can be Low, Moderate, or High. That triad corresponds to an Impact Level (a small categorization sketch appears below, after the Impact Level list). For example, {L,M,x} is an Impact Level 2 system which has low impact for confidentiality (this is public data, no big problem if it is released), moderate for integrity (it's fairly important that the data is accurate and not changed), and an availability that is up to the system owner to select (i.e. it can be low, moderate, or high). Here are the Impact Levels that were originally published under the DISA Cloud Security Model; they have since been reduced to 4 levels (2, 4, 5 & 6) with the release of the Cloud SRG:
  • Impact Level 1:  Unclassified, Public (NA-L-x)
  • Impact Level 2:  Unclassified, Private (L-M-x)
  • Impact Level 3:  Controlled Unclassified Information (L-M-x)
  • Impact Level 4:  Controlled Unclassified Information (M-M-x)
  • Impact Level 5:  Controlled Unclassified Information (H-H-x). This level may now be for National Security Systems (NSS). NSS are defined in CNSSI 4009 and NIST SP 800-59.
  • Impact Level 6:  For the Enterprise Cloud Service Broker (ECSB) Initial Operating Capability (IOC), only DoD Cloud Service Providers (CSP) are eligible to provide services for Impact Level 6. Any DoD CSPs providing service at Impact Level 6 must have a DoD ATO.
Note the "Private" tag on Impact Level 2. Per discussion with the DISA ECSB, they agree that public systems that use some kind of registration or access control (logon, but not necessarily DoD PKI), are considered "private" (which is different than "private" as defined by JTF-GNO CTO 07-015, which is what led us to using CAC or ECA for access to sites hosting CUI data).
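As a rough illustration of the categorization step, here is a small Python sketch (my own, not from the SRG or the Cloud Security Model) that maps a NIST SP 800-60 style {C, I, A} assignment to a candidate impact level using the simplified list above. Treat the mapping as an assumption and confirm the result against the current Cloud SRG.

```python
# Hedged sketch: map a {Confidentiality, Integrity, Availability} categorization
# to a candidate impact level. The rules are a simplification of the list above
# (pre-SRG levels); always verify against the current Cloud Computing SRG.

RANK = {"NA": 0, "L": 1, "M": 2, "H": 3}

def suggest_impact_level(conf: str, integ: str, avail: str = "x") -> int:
    """Return a candidate impact level; 'x' means availability is the owner's choice."""
    c, i = RANK[conf], RANK[integ]
    if c == RANK["H"] or i == RANK["H"]:
        return 5                      # H-H-x territory (potentially NSS)
    if c <= RANK["L"] and i <= RANK["M"]:
        return 2                      # public / low-confidentiality data
    return 4                          # CUI at M-M-x and similar

# Example from the text: inland waterway maps categorized {L, M, x}
print(suggest_impact_level("L", "M"))   # 2
print(suggest_impact_level("M", "M"))   # 4
```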
The SRG introduces the requirement for DoD Provisional Authorizations (PA) and use of a Cloud Access Point (CAP) for Impact Levels 4-5 to mitigate risk to DoD by allowing CSPs to interconnect with DoD networks. The application of additional security controls beyond FedRAMP is termed FedRAMP Plus (+). FedRAMP is the minimum security baseline for all DoD cloud services. There are three paths to a Provisional ATO (PATO):
  • FedRAMP Joint Authorization Board (JAB) to DoD PA
  • FedRAMP Agency to DoD PA
  • DoD Sponsored (CSP needs 3PAO or DoD assessor)
"Shared Controls" require that the Cloud Service Offering (CSO) and Mission Owner address security.
How many controls need to be considered?
  Baseline             Required   Total   Mission Owner Can Add   Privacy Overlay
  NIST Moderate        260        260
  FedRAMP Moderate     +65        325
  DoD Impact Level 2   +0         325
  DoD Impact Level 4   +35        360     9 (NIST)                TBD (Privacy Overlay if required)
  DoD Impact Level 5   +9         369     12 (NIST)               TBD (Privacy Overlay if required)
  DoD Impact Level 6   +0         369     11 (NIST)               98 (from the Classified Privacy Overlay if required)

Computer Network Defense (CND) responsibilities must be clearly defined.
The Mission Owner defines the cloud availability and resiliency (disaster recovery, business continuity) under a Service Level Agreement (SLA) with the CSP.
When approaching cloud migration, organizations should identify their IT services and the information type(s) associated with them, categorize them, address "special considerations" identified in NIST SP 800-60 Vol. 2, and assign a CSM Impact Level. Note that generic information types, such as "maps," are not going to be defined in NIST SP 800-60 Vol. 2. You have to know how the information is being used in order to categorize it. For example, a web application might provide maps of inland waterways. The maps are for waterway navigation, therefore the information type is "Water Transportation."
As far as approvals go, there are two parts:
1) Registration through the DISA ECSB
2) Approval for your mission service from your approving authority. This concerns getting an ATO for the application.
Here's the main DISA ECSB page. From there, you can find the policy documents as well as the Cloud Service Request (CSR) form:  http://disa.mil/Services/DoD-Cloud-Broker.
As far as the approval process goes, currently this is done under DIACAP and will be transitioning to the Risk Management Framework (RMF). Be leery of references to DoDI 8510.01 regarding the C&A process, because that document was revised in March of 2014 and kept the same numbering but changed title to "Risk Management Framework (RMF) for DoD Information Technology (IT)" and now we use the term "Assessment and Authorization" or A&A instead of C&A. DIACAP uses the DoDI 8500.2 IA control catalog and RMF uses the NIST SP 800-53 control catalog. DIACAP is a 5-step process (Initiate and Plan IA C&A, Implement and Validate Assigned IA Controls, Make Certification Determination & Accreditation Decision, Maintain Authorization to Operate and Conduct Reviews, Decommission). RMF is a 6-step process (Categorize, Select Controls, Implement Controls, Assess Controls, Authorize, Monitor). Per the RMF, we should begin working on System Security Plans (SSP), Information System Contingency Plans (ISCP), and other governance documentation for our cloud services. NIST provides templates:  http://csrc.nist.gov/publications/nistpubs/800-18-Rev1/sp800-18-Rev1-final.pdf, http://csrc.nist.gov/publications/nistpubs/800-34-rev1/sp800-34-rev1_errata-Nov11-2010.pdf.
Physical versus Logical, Shared or Dedicated?
  • IL 2:  Shared or dedicated infrastructure (on or off premise OK).
  • IL 4:  Shared or dedicated infrastructure with strong evidence of virtual separation controls and monitoring, and the ability to meet search and seizure requests for DoD data. On or off premise OK.
  • IL 5:  Dedicated infrastructure only (on or off premise OK). Only DoD private, DoD community, or Federal Government community clouds can be used. Each deployment can support multiple missions/tenants from each customer organization. Virtual or physical separation between DoD and Federal tenants/missions is permitted, as is virtual or physical separation between DoD tenants/missions. Physical separation from non-DoD, non-Federal tenants is required.
  • IL 6:  Dedicated infrastructure approved for classified information. On or off premise OK provided NISPOM or DoD 8500.1 requirements are met. Requires cleared personnel (the CSP must have an FCL at the appropriate level). Each deployment may support multiple SECRET missions. Virtual or physical separation is required between DoD and Federal tenants; virtual or physical separation between DoD tenants/missions is permitted. Physical separation from non-DoD/non-Federal tenants is required.
Other considerations:
  • The cloud management/orchestration features need to be considered (e.g. two-factor authentication for administration of the virtual infrastructure, not just the VMs themselves)
  • e-discovery and law enforcement seizure issues need to be considered.
  • ITAR clouds do not necessarily meet the standards for dedicated clouds (e.g. AWS GovCloud). 
  • PKI requirements:
    • CAC or ECA must be used at IL 4/5.
    • NSS Token must be used at IL 6.
    • Cloud provisioning portal or multi-factor authentication must be PK enabled for IaaS/PaaS/SaaS at IL 4, 5, and NSS Token at IL 6.
All the above pertains to security. As far as deployment in the cloud goes, and in particular AWS, there's a lot of technical information to read concerning AMI instances (templates or base operating system sources), EC2 (Elastic Compute Cloud, which are the VMs), EBS (Elastic Block Store, persistent block storage volumes attached to the VMs), S3 (Simple Storage Service), Glacier (archival storage), VPC (Virtual Private Cloud), and other Amazon-specific, virtualization, networking, etc. concepts. Once deployed, and authorized, there's the requirement for continuous monitoring (step 6 of the RMF) and compliance with ongoing OMB, USCYBERCOM, ARCYBER, and other Federal or DoD policies.
The DISA Cloud Connection Process Guide (CCPG) is intended to assist DoD components with navigating the security requirements and onboarding processes for implementing commercial cloud services.

Other government cloud computing guidance:
Resources:
eMASS
  • AWS GovCloud (US) Impact Level 4 DoD PA was issued 26 October 2018 and expires 26 October 2019 and was signed by Mr. Roger Greenwell. POC for questions concerning the eMASS package is Mr. Charlie Griggs at DISA (charles.e.griggs2.civ@mail.mil). The eMASS package ID is 657.
  • ARL as a CSSP provides a spreadsheet which lists NIST SP 800-53 controls and CCIs they take responsibility for. Until/unless this information gets transferred into eMASS and the record can be inherited, recommend establishing manual inheritance. This will provide compliance with approximately 120 assessment procedures. 
CSP/CSO Resources:
News:
Vendors:
  • Autonomic Resources
  • AWS Services in Scope by Compliance Program. Lists the status (approved, not approved, undergoing assessment) of AWS services for the following compliance programs:  SOC, PCI, ISO, FedRAMP, DoD CC SRG, HIPAA BAA, IRAP, MTCS, C5, K-ISMS, ENS High, OSPAR, HITRUST CSF. For DoD CC SRG, environments are broken out by IL 2 (East/West), IL 2 (GovCloud), IL 4 (GovCloud), and IL 5 (GovCloud).
  • AWS Cloud Adoption Framework (CAF). Identity and Access Management, Detective Controls, Infrastructure Protection, Data Protection, and Incident Response. Assigns stakeholders to one of six perspectives:  Business, People, Governance, Platform, Security, Operations. Need to understand cloud adoption from the view of those stakeholders.
  • ​Amazon Web Services (AWS) East/West. Impact Level 2.
  • AWS GovCloud US. Impact Level 4 and 5
  • Carpathia
Press:

Wednesday, July 10, 2019

Interim Authority to Test (IATT) Process for the Army

The US Army Network Enterprise Technology Command (NETCOM) IATT process is described here:  https://army.deps.mil/NETCOM/sites/RMF/csdwiki/iattfaq.aspx

Highlights:

  • An IATT is used to permit testing of a system in an operational environment with live data.
  • All applicable security controls should be tested and satisfied before testing in an operational environment or with live data except for those that can only be tested in an operational environment.
  • The Army has implemented a security controls overlay to streamline the IATT assessment process. The IATT overlay is designed to reduce the amount of time and resources necessary to assess the security state of the system under test. The AO will determine if use of the IATT Overlay is acceptable or unacceptable.
  • RMF Steps 1 and 2 (categorization and selection) must be completed prior to initiating the IATT process.
  • Documentation must be uploaded to eMASS to reflect the initial/test design. The final design may be different (and thus the revised design will be assessed if an ATO is pursued).
    • System details section of eMASS must be accurately completed.
    • Hardware list
    • Software list
    • Authorization boundary diagram
    • Data flow diagram (depict how specific data types are transmitted in/out/within the accreditation boundary). This should probably include CAL boundaries as well as ports and protocols.
    • System Enterprise and Information Security Architecture that describes how the system is or will be integrated into the enterprise architecture and information security architecture. This should include enterprise services such as Defense Enterprise Email (DEE) as well as services to be provided by the Cybersecurity Security Service Provider (CSSP).
    • Other documentation:  rack diagrams, connection diagrams, physical blueprints, process flow diagrams, build guides, or operational guides which fully describe the system and its configuration to an unfamiliar reviewer. These documents may be bundled together in a Functional Architecture.
  • A documented test plan must be provided, which includes a test timeline, test purpose, and test scope (locations, networks, connectivity). References include Test and Evaluation Master Plan at DAU and Operational Test and Evaluation (TEMP) Guidebook. Our best business practice is to include a Word document which describes the test purpose, scope, and tests to be performed along with a Microsoft Project file which more clearly defines the test timeline and phases if applicable.
  • Third party assessment performed by a SCA-V is not required (but can be done if desired).
  • All STIGs (and SRGs) must be applied, assessed, and reviewed. The ISO may follow the sample rate guidance documented in Appendix C of the Security Control Assessor - Validator (SCA-V) Tactics, Techniques, and Procedures (TTP) to keep the STIG review files to a manageable number. Full STIG checklists must be utilized, not just the benchmarks. STIG and SRG assessment results (.ckl) must be provided in the eMASS record (a small checklist-tally sketch follows this list).
  • The hardware and software vulnerabilities must be assessed and remediated or mitigated after performing a vulnerability scan (ACAS). A copy of the vulnerability scan must be provided in the eMASS record.
  • The STIG and vulnerability analysis must identify and describe:
    • False positives, false negatives
    • Vulnerabilities that cannot be remediated due to operational requirements/technical limitations.
    • Vulnerabilities that cannot be remediated due to limitations of the testing environment, test configuration of the system, or testing schedule.
    • Specific mitigations in place for vulnerabilities.
    • Security controls that have ongoing vulnerabilities mapped to them that have been tailored by the IATT overlay.
  • If STIG or ACAS scan results import vulnerabilities that do not map to controls (e.g. the controls were removed by the IATT overlay), the ISO must manually add these controls back (i.e. tailor them in).
  • The ISO performs a self-assessment and provides appropriate test results for each CCI within each control.
  • The ISO must complete the Risk Assessment within the eMASS record by completing the Risk Assessment tab for all non-compliant security controls. eMASS will automatically filter the Risk Assessment tab for controls marked non-compliant. Information on completing the Risk Assessment can be found on the RMF Knowledge Service (RMFKS): https://rmfks.osd.mil/rmf/RMFImplementation/AssessControls/Pages/ResidualRisk.aspx
  • The ISO must create a POA&M item in eMASS for all ongoing vulnerabilities (including any that cannot be remediated) as well as vulnerabilities present in the STIG/SRG and ACAS scans that cannot be remediated.
    • Vulnerabilities that will be remediated at a later time must include at least one milestone documenting the steps needed for remediation and will be marked as "Ongoing"
    • Vulnerabilities that cannot be remediated must have a valid justification and must be marked as "Risk Accepted". When the AO completes the authorization decision, all Risk Accepted POA&M items will be marked as "Approved"
    • Vulnerabilities present in the scans that have since been remediated must include a reference to evidence that the vulnerability is addressed (the evidence must be uploaded as an artifact) and must be marked as "Completed."
  • When the self-assessment is completed, the ISO initiates an Assess and Authorize (A&A) Package Approval Chain (PAC). Scans must be no older than 30 calendar days. The statements below must be included as a comment in the submission:  
    • "Seeking an IATT for the dates [dates in the test plan]."  The ISO may include additional detail about the dates as necessary (for example if the IATT will cover multiple connection periods).
    • (If the IATT overlay is used) "The Authorizing Official [or Authorizing Official Designated Representative, whichever is accurate] has approved the use of the IATT overlay [describe approval method]."  The intent of this statement is to guide reviewers to the document, package history, or other evidence recording the AO / AODR's approval of the IATT overlay.
  • Once the IATT is approved, the ISO must remove the IATT overlay if it has been applied. This will re-add the security controls removed by the overlay so that they can be assessed; these controls will return to the baseline marked as Not Applicable. The ISO must add new test results for these controls to mark them Compliant or Non-Compliant.
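Because the .ckl results have to be summarized for the eMASS record, a small script can help tally statuses before import. The sketch below is a hypothetical helper that assumes the usual STIG Viewer checklist XML layout (VULN elements containing STIG_DATA attribute pairs and a STATUS value); verify the element names against your own exports before relying on it.

```python
# Hedged sketch: tally finding statuses in a STIG checklist (.ckl) export.
# Assumes the common STIG Viewer XML layout (VULN elements with a STATUS child
# and Vuln_Num/Severity pairs under STIG_DATA); adjust to match your files.
import sys
import xml.etree.ElementTree as ET
from collections import Counter

def summarize_ckl(path):
    root = ET.parse(path).getroot()
    counts = Counter()
    open_findings = []
    for vuln in root.iter("VULN"):
        status = (vuln.findtext("STATUS") or "Unknown").strip()
        counts[status] += 1
        if status == "Open":
            # STIG_DATA holds VULN_ATTRIBUTE / ATTRIBUTE_DATA pairs.
            attrs = {sd.findtext("VULN_ATTRIBUTE"): sd.findtext("ATTRIBUTE_DATA")
                     for sd in vuln.findall("STIG_DATA")}
            open_findings.append((attrs.get("Vuln_Num"), attrs.get("Severity")))
    return counts, open_findings

if __name__ == "__main__":
    counts, open_findings = summarize_ckl(sys.argv[1])
    print(dict(counts))
    for vuln_id, severity in open_findings:
        print(f"Open: {vuln_id} ({severity})")
```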

Wednesday, June 19, 2019

Data Protection Solutions and Prevention

How to prevent breach?

  • Firewall (filtering, deep inspection, intrusion detection and prevention)
  • Antivirus protection
  • Data encryption
  • Email protection and filtering
  • A program of ongoing patching and updating of systems (vulnerability management plan, maintenance plan, change and release management plan, configuration control board)
  • Continuing education of users and employees
Vendors:
  • Varonis. Varonis Data Security Platform. Data protection. Analyzes users and their role(s) and establishes behavioral patterns. Detects unusual activity and automatically responds by disabling the account and revoking access to resources. Provides a quick report of what data a user has access to, so in the event of a breach that access can be removed (e.g. folders, mailboxes, SharePoint sites). Supported products and services:  Active Directory, Windows, SharePoint, Exchange, UNIX/Linux, Office 365, Dell EMC, NetApp, Nasuni, HPE, Box.
  • StealthBITS. Data protection.
  • Quest Software. Data protection.
  • Druva. On-premise and in-cloud backup.
  • Cohesity. On-premise and in-cloud backup.
  • Rubrik. On-premise backup to cloud.
  • N2WS (a Veeam company). Cloud Protection Manager (CPM).  Cloud backup.
Consider creating a "No Access" group and adding that as a "deny" permission or right everywhere. In the event the account is compromised, assign membership to this group.
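In AWS, one way to approximate this "No Access" idea is a quarantine IAM group carrying an explicit Deny, since an explicit Deny overrides any Allow during policy evaluation. The boto3 sketch below is an illustration with made-up names (NoAccess, DenyAll); the on-premise equivalent would be an AD group with deny ACEs on shares and resources.

```python
# Hedged sketch: an IAM "quarantine" group whose explicit Deny overrides all
# Allows. Adding a compromised user to the group effectively cuts off access
# without deleting the account. Group and policy names are illustrative only.
import json
import boto3

iam = boto3.client("iam")

deny_all = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Deny", "Action": "*", "Resource": "*"}],
}

iam.create_group(GroupName="NoAccess")
iam.put_group_policy(
    GroupName="NoAccess",
    PolicyName="DenyAll",
    PolicyDocument=json.dumps(deny_all),
)

# In the event of a suspected compromise:
iam.add_user_to_group(GroupName="NoAccess", UserName="suspected.user")
```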

As with any product to be deployed in the federal government, be aware of Trade Agreements Act (TAA) compliance (FAR 52.225-5). For a list of TAA Designated Countries, see:  https://gsa.federalschedules.com/resources/taa-designated-countries/.

A data protection solution needs to cover not only the data but systems which provide access to the data. Documentation:  Business Impact Analysis (BIA), Business Continuity Plan (BCP), Disaster Recovery Plan (DRP), Data Catalog.

What if you suffer a breach? Disaster Recovery Plan (DRP). 
Reactive:
- Logging. AWS CloudWatch, Splunk, Loggly
- Monitoring. Datadog
- Alerting
- Incident Management. PagerDuty.
Proactive
Site Reliability Engineering (SRE) (3 books, O'Reilly - Site Reliability Engineering, The Site Reliability Workbook, Seeking SRE)

Pyramid (base to top):
  • Monitoring and Observability
  • Incident Response
  • Post-incident Analysis
  • Testing & Release Procedures
  • Capacity Planning
  • Development
  • Product
sre.google.com/books
  • Gremlin. Inject incidents for testing. Chaos engineering?

Considerations:
  • General.
    • Full versus incremental backups. Incremental backups are fast but a restore could take a while (multiple incrementals must be applied).
    • 3-2-1 backup strategy:  3 copies of data, 2 different media types, 1 offsite. How does "offsite" change in the cloud? Another AZ, Region, VPC? Need replication to a COOP site (or another cloud AZ, Region, VPC, or cloud provider). Consider dumping this read-only into another account.
    • How quickly can you recover (RTO)? How much data can you stand to lose (RPO)?
    • What to recover first? What is critical, and what depends on what? The BIA and BCP will help with this.
    • Policy(ies). Do you need different policies for different reasons? For example:
      • Production versus development, testing, QA? Maybe only production gets backed up. Implement this via tag on VM/instance (e.g. "Environment" = "Production"). 
      • Roles. Who needs access to backup/recovery console? Keep roles separated (separation of duties).
    • Make DR separate from backups/snapshots. Only need 1 copy of data for DR? Delete/purge old copies? Perform DR every "n" backups. Although your backups may be onsite or in the same Region as your production data (to avoid cross-Region data transfer charges), keep your DR copy offsite or in another Region and account (a minimal snapshot-and-copy sketch follows this list).
  • Image backups. Backing up a snapshot of an instance (e.g. virtual machine) so that it can be "snapped back" to a prior state. Define:
    • What's my recovery point objective (RPO)? This determines how often snapshots need to be made.
    • How long does it take to make a snapshot? Does this operation impact VM/instance performance?
    • How often are snapshots taken? How often does the state of the machine change? Should there be a combination of manual and automatic backups (e.g. nightly automatic backups, plus a manual or triggered backup when there's a configuration change)?
    • How many snapshots should be retained?
    • How to purge old snapshots?
  • Data backups. Backing up user data (e.g. a single file or volume). Do I need to restore an entire volume in order to recover a single file (e.g. a spreadsheet)? Does my backup allow me to do this? Can users do this themselves?
  • Database backups.
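Tying together the tag-driven policy and "DR in another Region" points above, here is a minimal boto3 sketch that snapshots EBS volumes attached to instances tagged Environment=Production and copies each snapshot to a second Region. The tag key, Regions, and lack of retention/purging logic are assumptions for illustration only.

```python
# Hedged sketch: snapshot EBS volumes on instances tagged Environment=Production,
# then copy each snapshot to a DR Region. Tag names and Regions are assumptions;
# retention, purging, and cross-account copies are omitted for brevity.
import boto3

SOURCE_REGION = "us-east-1"   # production Region
DR_REGION = "us-west-2"       # offsite copy (another Region, ideally another account)

ec2 = boto3.client("ec2", region_name=SOURCE_REGION)
ec2_dr = boto3.client("ec2", region_name=DR_REGION)

reservations = ec2.describe_instances(
    Filters=[{"Name": "tag:Environment", "Values": ["Production"]}]
)["Reservations"]

for reservation in reservations:
    for instance in reservation["Instances"]:
        for mapping in instance.get("BlockDeviceMappings", []):
            if "Ebs" not in mapping:
                continue
            vol_id = mapping["Ebs"]["VolumeId"]
            snap = ec2.create_snapshot(
                VolumeId=vol_id,
                Description=f"Backup of {vol_id} ({instance['InstanceId']})",
            )
            # Wait for completion, then copy the snapshot to the DR Region.
            ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[snap["SnapshotId"]])
            ec2_dr.copy_snapshot(
                SourceRegion=SOURCE_REGION,
                SourceSnapshotId=snap["SnapshotId"],
                Description=f"DR copy of {snap['SnapshotId']}",
            )
```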
Other
Data Access Vendors:
  • Sonrai Security.
Scenarios (what could go wrong, and how does it happen?). Assign a likelihood and impact to each of these in order to assign risk and prioritize preparation (protection, response, testing); a small scoring sketch follows this list.
  • Misconfiguration, hack. Human error allows attacker access or to exploit a vulnerability. Result is most likely administrator access, and then denial of service, exfiltration of data, malware insertion/ransomware.
  • Hack. No human error, possibly hacker exploits a zero-day vulnerability, but same result as above.
  • User accidentally deletes data
  • Insider Threat.
  • Service provider outage.
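As noted at the top of this list, each scenario needs a likelihood and an impact so it can be ranked. A trivial way to do that is shown below; the 1-5 scores are placeholders, not an actual assessment.

```python
# Sketch: rank breach/loss scenarios by likelihood x impact to prioritize
# preparation. The (likelihood, impact) scores are placeholders only.
scenarios = {
    "Misconfiguration or hack enabled by human error": (4, 5),
    "Hack via zero-day, no human error":               (2, 5),
    "User accidentally deletes data":                  (4, 3),
    "Insider threat":                                  (2, 4),
    "Service provider outage":                         (3, 3),
}

ranked = sorted(scenarios.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (likelihood, impact) in ranked:
    print(f"risk={likelihood * impact:>2}  {name}  (L={likelihood}, I={impact})")
```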
Case Studies:

Tuesday, May 7, 2019

Digital/Electronic Signature Regulations & Laws in the World

courtesy Prakash [prakash2757@yahoo.com] from cisspstudy@cccure.org mailing list

Australia
November 25, 2000: Approved electronic signature bill law with action to take place from July 2001. Gatekeeper model set up and designed to guide development of PKI infrastructure. Australian Customs has developed a system called the Cargo Management Re-Engineering Project (CMR). CMR was designed to leverage PKI for improving Customs paperwork for import and export of goods.

Austria
Digital signature legislation provides full recognition of secure digital signatures with less (but some) support for insecure digital signature methods.

Bermuda
July 1999: Enacted Electronic Transactions Act of 1999, which provides legal recognition of electronic signatures.

Brazil
Superior Court of Justice will now publish its decisions online with a digital signature affixed to the decision to vouch for its authenticity.

Canada
The city of Toronto provides for the land (in other words, deed) registration documents that can be submitted and maintained electronically with digital signatures. More than 50 percent of the city’s land documents have been implemented through this system.

Finland
January 2000: Act on Electronic Service in the Administration. This act defines the scope and structure of the elements of a PKI for digital signatures. Specific exclusions of the act are listed; unlike in other countries, this law excludes the use of digital certificates for application to administrative judicial procedures.

France
March 13, 2000: Electronic signature law implemented.
July 16, 2002: A decree was issued that required cryptographic service providers to provide authorized (government-recognized) agents the ability to decrypt data on demand. Effectively, this required either an escrow ability for private keys in PKI or a cryptographic system that allowed for a “back-door.”

Hong Kong
Hong Kong’s government is offering free digital certificates for the first year of its Smart ID card program. The program was designed to build a system that can provide smart cards for all electronic activities requiring authentication for digital transactions.

Italy
The government has passed a regulation that provides for digital signatures in support of the relevant EU directive (1999/93/EC). The regulation identifies two types of signatures: a light signature for person identification and access to general public administration services and a more secure signature for digital signatures of electronic documents.

Korea
The Ministry of Information and Communication requires Internet-based banking organizations to use government-issued banking digital certificates. Organizations already using nongovernment-issued certificates will be required to go to government-issued certificates by May 2003.

Malaysia
One of the earlier legislations on digital signatures, Malaysia passed its Digital Signature Act in 1997.

New Zealand
October 2002: Electronic Transactions Bill passed that provides electronic documents and digital signatures on par with physical contracts and signatures.

Singapore
June 1998: Singapore passed its digital signature act called the Electronic Transactions Act 1998 that provides for the legal recognition of digital signatures.

United States
https://www.csoonline.com/article/3391587/finra-rule-4512-u-s-sec-approves-electronic-signatures.html

Monday, May 6, 2019

Cybersecurity Security Service Provider (CSSP)

A Cybersecurity Security Service Provider (CSSP) is required per [] on DoD networks. A CSSP provides Protect, Detect, Respond, and Sustain services.
What is a Tier 1 versus a Tier 2 CSSP?
Who are the approved CSSPs for DoD and subordinate military Departments?
DoD:
Army:
  • Army Research Lab (ARL). Tier [].

Friday, April 26, 2019

Vulnerability Analyzers

Vulnerability assessment products and services:
  • Assured Compliance Assessment Solution (ACAS).
  • Evident.io.
  • Nessus.
  • Qmulos.
  • Qualys. https://vimeo.com/162448482

Thursday, April 25, 2019

Cloud Migration Strategies

The migration process consists of choosing a strategy ("6 R's") and executing a 5 phase plan. The 5 phases are:  Opportunity Evaluation, Portfolio Discovery and Planning, Application Design, Migration & Validation, and Operate.

Phase 1:  Opportunity Evaluation. What is the business case or compelling event that will drive your migration to the cloud? Examples:  data center lease expiration, policy (e.g. FDCCI, DCOI, ADDCP), access to services or features, reduction of expense (increased opex is more beneficial than initial and recurring capex), developer productivity (e.g. reduced wait time for infrastructure provisioning and access to services that don't need to be built), availability, content caching, backups, disaster recovery.

Phase 2:  Portfolio Discovery and Planning. What's in your environment, what are the interdependencies, what will you migrate first, and how will you migrate it? Inspect your configuration management database (CMDB), institutional knowledge, and/or deploy tools (e.g. AWS Discovery Service or RISC Networks). Understand licensing arrangements. For security and costing purposes, it's useful to know:

  • What information type(s) are going to be received, processed, stored, transmitted, and/or displayed in the cloud? Refer to NIST SP 800-60 Volume II for guidance.
  • What is the footprint for applications that will be re-hosted or re-platformed? This information can be used for cost estimation.
  • Are performance metrics available? This information can be used to right-size the environment. Because the cloud has a pay-as-you-go model, there's no need to over provision resources. The architecture can be designed to support the minimal performance requirements and expanded as additional resources are required.
  • Consider data gravity. What data belongs with what application? Where the data lives will drive where the application lives and/or how that application accesses the data. If a large data set is going to be used to perform sporadic analysis work in the cloud (by spinning up 100s of instances), then analyze whether it makes sense to store this data permanently in the cloud, or move it in/out over the wire or via Snowball. If you don't have the bandwidth and the volume is large, the copy/mail (Snowball) approach will be more efficient than a wire transfer.
  • What volume of data needs to be transferred into or out of the cloud? This information is needed to design data transfer solutions (e.g. over the wire or out-of-band using Snowball) and also to estimate costs for data egress. (A back-of-the-envelope comparison is sketched after this list.)
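
As a back-of-the-envelope aid for the data-volume bullet above, here is a minimal Python sketch comparing a wire transfer against shipping devices. Every constant (bandwidth, utilization, device capacity, turnaround time) is an assumption to be replaced with your own numbers:

    # Rough estimate: push the data over the wire or ship it?
    # All constants are illustrative assumptions.
    data_tb = 200                  # data set size in terabytes
    link_mbps = 500                # usable WAN bandwidth in megabits/sec
    utilization = 0.6              # fraction of the link you can sustain
    device_capacity_tb = 80        # usable capacity per shipped device (assumed)
    device_turnaround_days = 10    # ship + import time per round trip (assumed)

    data_bits = data_tb * 1e12 * 8
    wire_days = data_bits / (link_mbps * 1e6 * utilization) / 86400

    devices = -(-data_tb // device_capacity_tb)   # ceiling division
    ship_days = device_turnaround_days            # devices can ship in parallel

    print(f"Wire transfer:   ~{wire_days:.1f} days")
    print(f"Shipped devices: ~{ship_days} days using {devices} device(s)")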

Phase 3 and 4:  Designing, Migrating, and Validating Applications. Each application is designed, migrated, and validated using one of the 6 migration strategies. Common infrastructure and security solutions should be designed and used, for example identity management (e.g. Active Directory), log collection, configuration management, and backups. Low-complexity, low-interdependency applications should migrate first. Have the application owner monitor the migration and validate the results while monitoring costs.
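
One way to act on "low-complexity, low-interdependency applications should migrate first" is to score each application and sort the portfolio into waves. A minimal Python sketch (the application names, scores, and weighting are made up for illustration; real inputs would come from the CMDB or discovery tooling):

    # Order a portfolio so the easiest applications migrate first.
    # Complexity is a subjective 1-5 score; dependencies is a count of
    # upstream/downstream systems. The weight is a tunable assumption.
    portfolio = [
        {"app": "intranet-wiki", "complexity": 1, "dependencies": 0},
        {"app": "hr-portal",     "complexity": 3, "dependencies": 4},
        {"app": "erp",           "complexity": 5, "dependencies": 12},
    ]

    def migration_score(app, dep_weight=2):
        """Lower score = earlier migration wave."""
        return app["complexity"] + dep_weight * app["dependencies"]

    for app in sorted(portfolio, key=migration_score):
        print(f'{app["app"]:15} score={migration_score(app)}')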

Phase 5:  Modern Operating Model. As applications are migrated, iterate on your process(es) and foundation, decommission old systems, and continue to look for opportunities to achieve efficiencies and/or improvements in cost, labor, and security.

This article has a nice graphic showing 6 migration strategies (although two of them, retire and retain, don't involve an actual migration):
  • Re-host (lift-and-shift)
  • Re-platform (lift-and-reshape)
  • Re-purchase (replace / drop-and-shop)
  • Re-architect (re-writing/decoupling applications)
  • Retire/decommission
  • Revisit/retain
References:

Identity and Access Management (IdAM) (draft)

Identity and Access Management (IdAM or IAM) is the mechanism used to authenticate an entity and authorize it to access resources.

Several well-known services exist for authentication, including Active Directory, Lightweight Directory Access Protocol (LDAP), Remote Authentication Dial-In User Service (RADIUS), Internet Authentication Service (IAS) (Microsoft's implementation of RADIUS), Terminal Access Controller Access-Control System Plus (TACACS+) (a Cisco-developed authentication, authorization, and accounting solution for networking devices), and Network Information Service (NIS). With the proliferation of cloud-based services, cloud service providers offer identity and access services that are highly scalable, robust, and secure. Examples include Amazon Web Services (AWS) Identity and Access Management (IAM) and several choices for implementing Active Directory:  AD Connector, Simple AD, and Microsoft AD. Rackspace has a good article that compares the different directory service offerings. Some things to consider in implementing authentication services:

  • Managing multiple services is an administrative burden and is not an efficient use of resources. Building a robust service (e.g. Active Directory) and configuring applications to use that service is easier to manage than setting up separate authentication realms. This approach is known as single sign on (SSO) - using one credential to access many services.
  • There are several methods to authenticate an entity, such as username/password, smartcard, challenge questions, one-time token, biometric, etc. Use of multifactor authentication (MFA) is highly recommended, which requires something a user knows (e.g. password) and something the user has (e.g. smartcard) or is (e.g. fingerprint).
  • Implement a lockout policy and a recovery mechanism for lost or forgotten passwords. An example solution is SysOp Tools Password Reset Pro. For hosted services (e.g. Office 365), use the administrative tools to configure a policy for this.
  • Implement a logging and audit policy to review authentication events, for example use of privileged accounts, failed access, and anomalies in access times and/or geographic location (e.g. a user attempting to log in from the US and then 10 seconds later from the UK). A minimal detection sketch follows this list.
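
As referenced in the audit bullet above, here is a minimal Python sketch of one such anomaly check ("impossible travel"): flag consecutive logins for the same user whose implied speed is not physically plausible. The events, coordinates, and speed threshold are fabricated for illustration; in practice the coordinates come from a GeoIP lookup against the authentication logs:

    import math
    from datetime import datetime

    # Fabricated sample events (time in ISO 8601, location as lat/lon).
    events = [
        {"user": "alice", "time": "2019-04-25T10:00:00", "lat": 38.9, "lon": -77.0},  # Washington DC
        {"user": "alice", "time": "2019-04-25T10:00:10", "lat": 51.5, "lon": -0.1},   # London
    ]

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points in kilometers."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
        return 6371 * 2 * math.asin(math.sqrt(a))

    MAX_KMH = 1000  # anything faster implies credential sharing or compromise

    events.sort(key=lambda e: (e["user"], e["time"]))
    for prev, curr in zip(events, events[1:]):
        if prev["user"] != curr["user"]:
            continue
        hours = (datetime.fromisoformat(curr["time"]) -
                 datetime.fromisoformat(prev["time"])).total_seconds() / 3600
        km = haversine_km(prev["lat"], prev["lon"], curr["lat"], curr["lon"])
        if hours == 0 or km / hours > MAX_KMH:
            print(f'ALERT: impossible travel for {curr["user"]} ({km:.0f} km in {hours:.4f} h)')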

Access management concerns governing permissions and rights to resources once the identity of an entity is verified via the authentication service described above. This is implemented by file system permissions and role definitions and assignment.
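
As a minimal illustration of "role definitions and assignment," the Python sketch below maps roles to permissions and users to roles, and answers a yes/no access question. All names and permissions are made up; in a real environment this lives in the directory or IAM service, not in application code:

    # Minimal role-based access control (RBAC) sketch.
    ROLE_PERMISSIONS = {
        "reader":  {"report:read"},
        "analyst": {"report:read", "report:write"},
        "admin":   {"report:read", "report:write", "user:manage"},
    }

    USER_ROLES = {
        "alice": {"analyst"},
        "bob":   {"reader"},
    }

    def is_authorized(user, permission):
        """Authentication happens elsewhere; this only answers 'may they?'."""
        return any(permission in ROLE_PERMISSIONS.get(role, set())
                   for role in USER_ROLES.get(user, set()))

    print(is_authorized("alice", "report:write"))  # True
    print(is_authorized("bob", "user:manage"))     # False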

Per Department of the Army Pamphlet (DA PAM) 25-2-7, Information Management:  Army Cybersecurity, Army Information System Privileged Access, DA Form 7789 (Privileged Access Agreement and Acknowledgment of Responsibilities) is required after 4 May 2019. Individuals requiring elevated access to system control, monitoring, administration, criminal investigation, and/or compliance functions must complete and sign a Privileged Access Agreement (PAA). Categories and specialties within the cybersecurity workforce that require a PAA include:
  • those requiring modification access to the configuration control functions of the IS/network and administration (e.g. user account management),
  • those with access to change control parameters (e.g. routing tables, path priorities, addresses of routers, multiplexers, and other key IS/network equipment or software),
  • those with the ability and authority to control and change program files and other users' access to data,
  • those with direct access to operating-system-level functions that could permit system controls to be bypassed or changed, and
  • those with access and authority to install, configure, monitor, and/or troubleshoot the security monitoring functions of IS/networks, or in the performance of cyber/network defense operations.

PAM, Privileged Access Management (sometimes Privileged Account Management), refers to the oversight of privileged user access to critical data and systems. PAM solutions implement features such as:
  • Access manager: the point of entry that facilitates user requests and the administrators' ability to grant and revoke access.
  • Password manager: protects, stores, and rotates passwords, eliminating the need for users to have direct access to sensitive resources.
  • Session manager: traces and monitors privileged activity, and can automatically or manually terminate and report suspicious activity.
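
A toy Python sketch of the password-manager and session-manager ideas above: a credential is checked out for a logged session and rotated on check-in, so users never hold a long-lived password. This is illustrative only; the commercial PAM products listed later do this with hardened vaults, brokers, and session-recording proxies:

    import secrets
    from datetime import datetime

    class ToyVault:
        """Toy privileged-credential vault: logged check-out, rotate on check-in."""

        def __init__(self):
            self._passwords = {}   # account -> current secret
            self.audit_log = []    # (timestamp, user, account, action, detail)

        def enroll(self, account):
            self._passwords[account] = secrets.token_urlsafe(16)

        def check_out(self, account, user, reason):
            self.audit_log.append((datetime.utcnow(), user, account, "check_out", reason))
            return self._passwords[account]

        def check_in(self, account, user):
            self._passwords[account] = secrets.token_urlsafe(16)  # rotate the secret
            self.audit_log.append((datetime.utcnow(), user, account, "check_in", "rotated"))

    vault = ToyVault()
    vault.enroll("root@db01")
    pw = vault.check_out("root@db01", user="alice", reason="patch database")
    # ... privileged session happens here, ideally recorded by a session proxy ...
    vault.check_in("root@db01", user="alice")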

Authentication

  • Password policy
  • Multi-factor Authentication (MFA)
  • Single Sign On (SSO). Minimizes the number of credentials users need and the number of authentication realms to manage, and concentrates auditing in one place. SSO can also provide MFA in front of applications that don't support it natively.
Authorization

Authorization is the concept of controlling what a user is allowed (or not allowed) to do once they are authenticated. In IAM terms, authentication + authorization = access.

Governance is the concept of periodically attesting to the appropriateness (or correctness) of user authorizations.

Identity analytics is an emerging discipline of IAM that evaluates the permissions, rights, and entitlements of individuals and detects anomalies. It is an effective way to uncover errors in provisioning (we'll call it over-provisioning) and to discover purposeful rights-escalation activities. It can prevent the activities of external bad actors and detect internal risks before they become a threat. (A minimal over-provisioning check is sketched after the bullets below.)
  • Provisioning. Defining roles and assigning the user to the correct role(s) that provide the rights necessary to do their job.
  • Deprovisioning. Removing those rights when they are no longer needed.
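
As referenced above, a minimal Python sketch of an over-provisioning check: compare each user's entitlements against the entitlements common to their peer group and flag the outliers for review. Users, groups, and entitlement names are fabricated sample data; a real identity analytics tool would pull these from the IAM system and use far richer models:

    from collections import Counter

    # Entitlements actually held by members of one peer group (fabricated).
    peer_group = {
        "alice": {"gl:read", "gl:post", "report:read"},
        "bob":   {"gl:read", "report:read"},
        "carol": {"gl:read", "report:read", "prod-db:admin"},  # suspicious
    }

    # An entitlement is part of the baseline if at least half the peers hold it.
    counts = Counter(e for ents in peer_group.values() for e in ents)
    threshold = 0.5 * len(peer_group)
    baseline = {e for e, c in counts.items() if c >= threshold}

    for user, ents in peer_group.items():
        outliers = ents - baseline
        if outliers:
            print(f"Review {user}: entitlements outside the peer baseline -> {sorted(outliers)}")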

Identity Solution Providers.
  • Auth0. Role-based access control. "Organizations" - allows clients to manage roles. Developer friendly. Note Auth0 was acquired by Okta.
PAM:
  • CyberArk.
  • Thycotic.
  • BeyondTrust. Call 7/26/2021 with Nick Shaw, Anthony White (DoD and DHA), and Steven Schullo (Solutions Engineer). Endpoint Privilege Management: agent-based (Windows, Mac, Unix); licensed by asset, so the number of users doesn't matter; grants access to privileged functions by elevating the process rather than the user. BeyondInsight: command and control/management auditing; can also control through ePO as well as GPO; can run on-premise or as SaaS in Azure, AWS, or GCP; working on a FedRAMP ATO; look for BeyondTrust in the CSP marketplace (customer only responsible for compute cost). Policy modes: whitelist, blacklist, Assistance Required (needs a code from the Service Desk), and Credentials Required (provide a credential as well as a reason). Also offers Password Safe and Cloud Privilege Broker (coming this summer).
  • Wallix
Identity Governance
​Resources:
Additional References:
  • NIST SP 800-53 AC Control Family
Relevant Articles: