UIS.202.2 Application Developer Security Testing and Evaluation Guidelines

In support of UIS.202 Software Applications Management Policy

Georgetown University has adopted the security audit and accountability principles established in the NIST SP 1800-5 “IT Asset Management” control guidelines as the official policy for this security domain. Each application administrator and system owner must adhere to the guidelines and procedures associated with this policy to support and comply with the University information security framework.

Application Developer Security Testing and Evaluation Requirements 

Any individual tasked with creating, developing, or supporting custom applications used in the operation of University business must coordinate with the University Information Security Office (UISO) to ensure that these requirements are met:

  1. Create and implement a security assessment plan

    • Ensure that the application's operation and features are supported in the most current version of the development language(s) and align with OWASP Secure Coding Practices.
    • Testing requirements must be defined and documented for both information system development and system integration activities. The plan must include requirements for retesting after significant changes occur. 

    • Perform security testing/evaluation

      • Restricted or private data shall not be used for testing purposes.

      • UIS may permit the use of production data during the testing of new systems or system changes only when no other alternative allows for the validation of the functions and when permitted by other regulations and policies. Departments shall use data anonymization or data masking tools where available.

      • If production data is used for testing, the same level of security controls required for a production system shall be used.

    • Produce evidence of the execution of the security assessment plan and the results of the security testing/evaluation

    • Implement a verifiable flaw remediation process 

    • Correct flaws identified during security testing/evaluation  
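
Where production data must be masked before it is used in testing, a minimal sketch of one-way masking follows. The field names and hash-based approach are illustrative assumptions, not mandated tooling or University standards.

```python
import hashlib

# Hypothetical sensitive field names; actual schemas will differ.
SENSITIVE_FIELDS = {"net_id", "ssn", "email"}

def mask_record(record: dict) -> dict:
    """Replace sensitive values with a truncated one-way hash so that
    test data cannot be traced back to a real individual."""
    masked = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS:
            digest = hashlib.sha256(str(value).encode()).hexdigest()
            masked[field] = digest[:12]  # irreversible placeholder token
        else:
            masked[field] = value
    return masked

sample = {"net_id": "jd123", "email": "jd123@example.edu", "dept": "UIS"}
masked = mask_record(sample)
print(masked["dept"])  # non-sensitive fields pass through unchanged
```

Hashing rather than substitution keeps referential integrity across records (the same input always masks to the same token) while still satisfying the requirement that restricted data not appear in a test environment.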

  2. Use a formal recording system that:  

    • Tracks faults from initial reporting through to resolution. 

    • Monitors the status of reported faults and confirms that satisfactory resolutions have been achieved. 

    • Provides reports and metrics for system development and software support management. 

    • Prioritizes software faults so that they are addressed promptly, minimizing the exposure resulting from security vulnerabilities.
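
As an illustration only, the recording requirements above might be modeled as follows; the class names, statuses, and priorities are hypothetical, not a prescribed system.

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    REPORTED = "reported"
    IN_PROGRESS = "in progress"
    RESOLVED = "resolved"

@dataclass
class Fault:
    fault_id: int
    summary: str
    priority: str              # e.g. "high" for security vulnerabilities
    status: Status = Status.REPORTED

class FaultLog:
    """Tracks faults from initial report through to resolution."""
    def __init__(self):
        self.faults = {}

    def report(self, fault_id, summary, priority):
        self.faults[fault_id] = Fault(fault_id, summary, priority)

    def resolve(self, fault_id):
        self.faults[fault_id].status = Status.RESOLVED

    def open_faults(self):
        """Metric for support management: faults not yet resolved."""
        return [f for f in self.faults.values()
                if f.status is not Status.RESOLVED]

log = FaultLog()
log.report(1, "SQL injection in search form", "high")
log.report(2, "Typo on login page", "low")
log.resolve(2)
print(len(log.open_faults()))  # → 1
```

In practice a commercial or open-source issue tracker fills this role; the sketch simply shows the minimum state (identity, priority, status) the policy requires the system to capture.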

  3. Perform unit, integration, and system regression testing/evaluation:

    • Require that information system developers/integrators perform a vulnerability assessment to document vulnerabilities, exploitation potential, and risk mitigations.

    • Appropriate testing and assessment activities shall be performed after vulnerability mitigation plans have been executed to verify and validate that the vulnerabilities have been successfully addressed. 

    • To maintain the integrity of university information technology systems, software must be evaluated and certified for functionality in a test environment before it is used in an operational/production environment.

    • If an application or system does not have a dedicated testing environment, test data and test accounts must be removed from it before it is deployed into a production environment.  

    • The UISO must certify that the upgrade or change has passed acceptance testing.  

    • A rollback plan must be established in the event the upgrade or change has unacceptable ramifications.
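
A pre-deployment check for leftover test accounts, per the requirement above, could be sketched as follows. The naming prefixes are assumed conventions and must be adapted to local standards.

```python
# Assumed test-account naming conventions; adapt to local standards.
TEST_PREFIXES = ("test_", "qa_", "demo_")

def find_test_accounts(accounts):
    """Return accounts that must be removed before production deployment."""
    return [a for a in accounts if a.lower().startswith(TEST_PREFIXES)]

accounts = ["jdoe", "test_admin", "QA_loadgen", "registrar_svc"]
leftovers = find_test_accounts(accounts)
if leftovers:
    print("Deployment blocked; remove test accounts:", leftovers)
```

A check of this kind is most useful as a gate in the deployment pipeline, so that certification (and the rollback plan) is only exercised on a cleaned system.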

  4. The following issues and controls must be addressed when developing acceptance criteria and acceptance test plans:    

    • Capacity requirements – both for performance and for the computer hardware needed 

    • Error response – recovery and restart procedures and contingency plans 

    • Routine operating procedures – prepared and tested according to defined university policies

    • Security controls – agreed to and put in place

    • Manual procedures – effective and available where feasible and appropriate 

    • Business continuity – meets the requirements defined in the university and/or department’s business continuity plan

    • Impact on production environment – able to demonstrate that installation of new system will not adversely affect university and/or department’s current production systems (particularly at peak processing times) 

    • Training – of operators, administrators and users of the new or updated system 

    • Logs – logs of results shall be kept in accordance with the Information Security Audit Logging Policy once testing is completed 
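
The acceptance criteria above can be treated as a sign-off checklist. The sketch below is illustrative and assumes each criterion is recorded by name; it is not a prescribed acceptance process.

```python
# The acceptance criteria above, recorded as a sign-off checklist.
CRITERIA = [
    "capacity requirements",
    "error response",
    "routine operating procedures",
    "security controls",
    "manual procedures",
    "business continuity",
    "impact on production environment",
    "training",
    "logs",
]

def outstanding_criteria(signed_off: set) -> list:
    """Return criteria still lacking sign-off; an empty list means the
    system is ready for acceptance testing."""
    return [c for c in CRITERIA if c not in signed_off]

remaining = outstanding_criteria({"security controls", "training", "logs"})
print(len(remaining))  # → 6
```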

  5. Implement a verifiable flaw remediation process to correct security weaknesses and deficiencies identified during the security testing and evaluation process.

  6. Controls that have been determined to be either absent or not operating as intended during security testing/evaluation must be remediated.