Contents
1 Introduction
2 Technical Considerations for Computer Based Systems
3 Application of Requirements for Management of Computer Based System Safety
4 Project Planning
5 Computer System Requirements
6 Computer System Design
7 Software Requirements
8 Software Design
9 Software Implementation
10 Verification and Analysis
11 Computer System Integration
12 Validation of Computer System
13 Installation and Commissioning
14 Operation
15 Post-delivery Modifications
Annex A Use and Confirmation of Pre-existing and Existing Software
Annex B Term Explanation
1 Introduction
1.1 General
1.1.1 Computer based systems are of increasing importance to safety in nuclear power plants as their use in plants is rapidly increasing. They are used both in safety related systems, such as some functions of the process control and monitoring systems, and in safety systems, such as reactor protection or actuation of safety facilities. The dependability of computer based systems important to safety is therefore of prime interest and shall be ensured.
1.1.2 With current technology, it is possible in principle to develop, for systems important to safety, computer based instrumentation and control systems that have the potential for improving the level of safety and reliability and that possess sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. The development of software for computer based systems important to safety in nuclear power plants shall comply with the issued regulations, guides and standards dealing with software engineering and quality assurance.
1.2 Objective
The objective of the Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based systems important to safety in nuclear power plants, for all phases of the system life cycle.
1.3 Scope
1.3.1 The Safety Guide is applicable to systems important to safety as defined in relevant nuclear safety regulations. Since at present the reliability of a computer based system cannot be predicted solely on the basis of, or built in by, the design process, no systematic relaxation of the recommendations of the Safety Guide is permitted for software for safety related systems. Whenever possible, software which applies only to safety systems and not to safety related systems is explicitly identified.
1.3.2 The Safety Guide relates primarily to the software used in computer based systems important to safety. Other aspects of computer based systems, such as the design and operation of the computer system itself and of its hardware, are beyond the scope of the Safety Guide except insofar as they raise issues for the development, verification and validation of software.
1.3.3 The main focus of the Safety Guide is on the preparation of documentation that is used for an adequate demonstration of the safety and reliability of computer based systems important to safety.
1.3.4 The Safety Guide applies to all types of software: pre-existing software or firmware (such as an operating system), software to be specifically developed for the project, or software to be developed from a pre-existing equipment family of hardware or software modules. The use for safety functions of pre-existing or existing commercial software, on the development of which little information is available, is addressed in Annex A.
1.3.5 The Safety Guide is intended for use by those involved in the production, assessment and licensing of computer based systems, including plant system designers, software designers and programmers, verifiers, validators, certifiers and regulators, as well as plant operators.
2 Technical Considerations for Computer Based Systems
2.1 Characteristics of computer based systems
2.1.1 In relation to the assessment of safety and reliability, computer based systems have two basic properties.
They are programmable and their hardware is based on discrete digital logic. As with other systems, hardware faults may be due to errors in design or manufacturing, but typically they result from wearing out, degradation or environmental processes, and are of a random nature. Software, the programmable part of the computer, does not wear out, but it may be affected by changes in the operating environment. Software faults may result from bad or unclear specification of requirements (which gives rise to errors in the logical design or implementation) or from errors introduced during the implementation or maintenance phases.
2.1.2 The programmable nature of computer based systems coupled with the discrete logic means that they have a number of advantages over non-digital and non-programmable systems. They facilitate the achievement of complex functions; in particular, they can provide improved monitoring of plant variables, improved operator interfaces, and improved testing, calibration, self-checking and fault diagnosis facilities. They can also provide greater accuracy and stability. The use of multiplexed ‘bus’ structures may lead to a reduced need for cabling. Software modifications require less physical disruption of equipment, which facilitates maintenance.
2.1.3 These advantages are counterbalanced by a number of disadvantages. Software implementation tends to be more complex, and therefore more prone to design errors, than implementation of purely hardware systems. Moreover, software implementations are discrete logic models of the real world. This has two types of consequence: software is more sensitive (i.e. less tolerant) to ‘small’ errors, and it is more difficult to test, because interpolation and extrapolation are much more difficult to apply to computer based systems than to traditional analog systems, and ultimately are not entirely valid.
2.2 Development process
2.2.1 The development of systems important to safety shall be a process controlled step by step. It is by nature an iterative process. In this approach, the development process is organized as an ordered collection of distinct phases. Each phase uses information developed in earlier phases and provides input information for later phases. As the design progresses, faults and omissions made in the earlier stages become apparent and necessitate returning to those earlier stages. An essential feature of this approach is that the products of each development phase are verified against the requirements of the previous phase. At certain phases of the development a validation process is carried out to confirm compliance of the product with all functional and non-functional requirements and the absence of unintended behavior.
2.2.2 Typical phases of the development process and an outline of the process that may be applied are shown in Figure 1. The boxes in the figure show the development activities to be performed and the arrows show the intended order and the primary information flow. The numbers in parentheses in Figure 1 indicate the sections of the Safety Guide in which the development activities and products are described. Activities without numbers are not within the scope of the Safety Guide. Figure 2 illustrates the relationship of verification and validation to the requirements, design and implementation.
The choice of the particular development activities and their order in these figures is not intended to require a particular method of development; variations of the method may also have the necessary attributes and be capable of meeting the requirements.
Figure 1 Development of Software for Computer Based Systems Important to Safety (numbers in parentheses refer to sections of the Safety Guide)
2.2.3 The development of the computer based system shall start with an analysis of the plant and system requirements carried out by engineering specialists, including safety engineers and software engineers. On the basis of the results of this analysis, the requirements for the computer based system are derived. Iterations are normally needed in this analysis before a final set of requirements is identified. There shall be a clear demonstration of a systematic derivation, since omissions at this stage may result in an incorrect specification of requirements for the safety provisions and hence may result in an unsafe system.
2.2.4 The first design decision shall be to apportion the requirements for the computer based system between the computer system and the conventional electrical and electronic equipment for measuring plant variables and actuating the control devices (other components). The designer may have reasons to opt for analog equipment to implement certain functional requirements. (The evaluation of this other equipment is not discussed in the Safety Guide, which concentrates on the computer system and in particular on the software that runs on it.)
2.2.5 The computer system requirements are apportioned between the software requirements and the computer hardware in computer system design. The software is then designed as a set of interacting modules (software design), which are coded and tested as programs that will run on the computer hardware (software implementation). Next, the software is integrated with the computer hardware to produce the computer system (computer system integration). Finally, the computer system is installed in the plant for commissioning and operation. One of the steps of the commissioning phase is the integration of the computer system with the other components (which is part of the system integration phase).
Figure 2 Verification, Validation and Commissioning (numbers in parentheses refer to sections of the Safety Guide)
2.3 Safety and reliability issues
2.3.1 Because of the disadvantages mentioned earlier, the quantitative evaluation of the reliability of software based systems is more difficult than that for non-programmable systems, and this may raise specific difficulties in demonstrating the expected safety of a computer based system. Quantitative indicators of high software reliability are not demonstrable at the present time. Hence, designs requiring a single computer based system to achieve probabilities of failure on demand lower than 10⁻⁴ for the software shall be treated with caution.
2.3.2 Since software faults are systematic rather than random in nature, common cause failure of computer based safety systems employing redundant subsystems using identical copies of the software is a critical issue. Countermeasures are not easy to implement. Designers shall adopt independence and diversity and a comprehensive qualification strategy to protect against common cause failures. However, it may be difficult to estimate the degree of success and the benefits of these strategies when software is involved.
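The common cause concern of 2.3.2 can be pictured with a minimal sketch (Python, with hypothetical setpoint, signal names and functions; not part of the Safety Guide's requirements): a two-out-of-three vote masks a random failure of one redundant channel, but a systematic fault in a trip algorithm that is replicated identically in all three channels appears in every vote, which is why a diverse, independently developed implementation is worth comparing against.

```python
# Minimal sketch (hypothetical values): 2-out-of-3 voting over redundant channels.
# Voting masks a random failure of one channel, but not a systematic software
# fault that is present in every identical copy of the trip algorithm.

TRIP_SETPOINT_C = 320.0  # hypothetical coolant temperature setpoint

def trip_demand_primary(temperature_c: float) -> bool:
    """Primary trip algorithm, replicated identically in all three channels."""
    return temperature_c > TRIP_SETPOINT_C

def trip_demand_diverse(temperature_c: float) -> bool:
    """Diverse implementation of the same functional requirement, assumed to be
    developed independently (different team and method), used for comparison."""
    return (TRIP_SETPOINT_C - temperature_c) < 0.0

def two_out_of_three(votes: list[bool]) -> bool:
    """Actuate if at least two of the three redundant channels demand a trip."""
    return sum(votes) >= 2

if __name__ == "__main__":
    measurement = 325.0  # all channels see essentially the same plant variable
    channel_votes = [trip_demand_primary(measurement) for _ in range(3)]
    print("2oo3 actuation:", two_out_of_three(channel_votes))
    # A systematic fault in trip_demand_primary would appear in every vote;
    # comparing against the diverse implementation can expose such a discrepancy.
    assert trip_demand_diverse(measurement) == trip_demand_primary(measurement)
```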
2.3.3 With current technology, it is possible in principle to develop computer based instrumentation and control systems with the necessary dependability for systems important to safety and to demonstrate their safety sufficiently. However, dependability can only be demonstrated if a careful and fully documented process is followed. This process may include the evaluation of operating experience with pre-existing software, following specific requirements (see also Annex A). Recommendations for achieving an adequate safety demonstration are given in the Safety Guide.
2.4 Organizational and legal issues
2.4.1 There are various organizational and legal aspects of the development project for a computer based system that shall be carefully taken into consideration at a very early stage of the project in order to ensure its success. They include such factors as the availability of a suitable legal and administrative framework for licensing computer based systems important to safety and sufficient competence and resources within the organizations involved in the system development process. If these factors are not carefully considered at the conceptual design stage of the project, its schedule and costs may be considerably affected. Some of these factors may influence the decision to use programmable digital technology, making this option impractical or even prohibitively expensive.
2.4.2 Quantification of software reliability is an unresolved issue, and software testing will necessarily be limited; the reliability of the software of a computer based system may therefore be difficult or impossible to demonstrate quantitatively. The regulatory position concerning the safety demonstration and reliability required of software shall be clarified early in the project planning stage. The approach to be followed to deal with the safety and reliability issues shall be determined, documented, made available to and, if necessary, agreed upon by the regulator. This may include specific regulatory hold points.
2.4.3 It shall be ensured that sufficient competence and resources are available in the regulatory organization, the licensee’s design team, the nuclear safety regulator’s technical support staff and the supplier to fulfill the recommendations for the software development process and the assessment of its safety demonstration. The licensee shall also ensure that the supplier of the computer and/or software will be prepared to release all proprietary information that would be needed for the licensing process.
2.4.4 The licensee (i.e. the user of the computer system) shall establish an appropriate organization to deal with operational and maintenance issues. Occasionally, the licensee may need its own team to perform post-delivery modifications of the software. This team shall have the same level of competence and equipment as would have been provided by the original producer.
3 Application of Requirements for Management of Computer Based System Safety
3.1 General
The majority of the requirements for management of safety developed for traditional instrumentation and control systems are applicable to computer based systems. However, the application of these requirements for management of safety to computer based systems, in view of the differences between software and hardware, is not always straightforward and has, in many cases, new implications.
Chapter 3 provides a brief overview of the relevant requirements for management of safety, on issues related to requirements for design, design and development, management and quality assurance, and documentation. These requirements for management of safety form the basis for deriving the basic requirements of computer based systems, especially the non-functional requirements. Documentation shall be of high quality, owing to its importance in the design of software and in the safety demonstration.
3.2 Requirements for management of safety
3.2.1 Simplicity in design
3.2.1.1 It shall be demonstrated that all unnecessary complexity has been avoided both in the functionality of the system and in its implementation. This demonstration is important to safety and is not straightforward, as digital programmable technology is adopted for the achievement of more complex functionality. Evidence of adherence to a structured design, to a programming discipline and to coding rules shall be part of this demonstration.
3.2.1.2 For safety systems, the functional requirements that are to be fulfilled by a computer system shall all be essential to the achievement of safety functions; safety related functions shall be separated from and shown not to affect the safety functions.
3.2.1.3 For computer based system development, top-down decomposition, levels of abstraction and modular structure are important concepts for coping with the problems of unavoidable complexity. They not only allow the system developer to tackle several smaller, more manageable problems, but also allow a more effective review by the verifier. The logic structure behind the system modularization and the definition of interfaces shall be made as simple as possible, for example by applying “information hiding”.
3.2.2 Safety culture
The personnel working on a software development project for a system with a high safety importance shall include application specialists as well as computer software and hardware specialists. This combination of expertise helps to ensure that safety requirements, which are generally well known owing to the maturity of the industry, are effectively communicated to the computer specialists. It shall be ensured that all personnel understand how their jobs are related to the fulfillment of the safety requirements, and they shall be encouraged to question activities or decisions that may compromise the safety of the system. This means that the software specialists shall also have a good understanding of the application. An appropriate specification language (e.g. a graphical language) can be used for the description of safety functions.
3.2.3 Safety classification scheme
A safety classification scheme to define the safety importance of the instrumentation and control system functions may be used for the system development process. It directs the appropriate degree of attention by the plant designers, operators and nuclear safety regulator to the specifications, design, qualification, quality assurance, manufacturing, installation, maintenance and testing (see 3.2.4) of the systems and equipment that ensure safety.
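The “information hiding” recommended in 3.2.1.3 can be illustrated with a minimal sketch (hypothetical module and names, assuming a simple averaging filter): other modules depend only on a small, stable interface, so a change to the internal filtering algorithm stays confined to this one module and the review effort stays correspondingly small.

```python
# Sketch of information hiding (hypothetical module): callers see only a small,
# stable interface; the smoothing detail is private and can change without
# affecting any other module.

class TemperatureChannel:
    """Public interface: update with a raw sample, read the validated value."""

    def __init__(self) -> None:
        self._last_samples: list[float] = []   # hidden internal state

    def update(self, raw_sample_c: float) -> None:
        self._last_samples.append(raw_sample_c)
        if len(self._last_samples) > 3:
            self._last_samples.pop(0)

    def value(self) -> float:
        # Hidden detail: simple averaging filter; replacing it with, say, a
        # median filter would not change the interface seen by other modules.
        return sum(self._last_samples) / len(self._last_samples)

channel = TemperatureChannel()
for sample in (310.2, 310.4, 310.1):
    channel.update(sample)
print(round(channel.value(), 2))
```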
3.2.4 Balance between risk reduction and development effort
3.2.4.1 Trade-offs in the achievement of various conflicting design objectives shall be carefully assessed at each design stage for the system and its associated software. A top-down design and development process shall be used to facilitate this assessment. Graded design and qualification requirements, when applied to computer system functions, may be derived from a safety classification scheme (see 3.2.3). This gradation may be used in order to balance the design and qualification effort. The computer system shall meet the criteria for the highest safety class of the functions it is implementing.
3.2.4.2 Appropriate measures to provide the necessary level of confidence shall be associated with each safety class. It shall be noted that for the hardware part of the system the confidence level can be assessed by using quantitative techniques; however, for software only qualitative assessment is possible.
3.2.5 Defense in depth
Defense in depth as applied in the design of nuclear power plants shall be used in the development of the computer based system and its associated software. If a software based system implements the main safety function, defense in depth shall be provided by means of a diverse secondary protection system.
3.2.6 Redundancy
Traditional design based on multiple redundant instrumentation channels with a voting arrangement, as commonly used in analog applications, has benefits for the reliability of the hardware of computer systems. However, this type of redundancy does not prevent a failure of the system due to common cause faults in hardware and software design that could lead to impairment of all redundant channels.
3.2.7 Single failure criterion
The application of the single failure criterion to random hardware failures is straightforward: no single failure shall result in loss of the safety functions. However, for software this criterion is difficult to meet, since a fault that causes a software failure is necessarily present in all the replicas of this software (see 3.2.8).
3.2.8 Diversity
The reliability of computer based systems can be enhanced by using diversity to reduce the potential for software common cause failures. The use of diverse functions and system components at different levels of the design shall be considered. Diversity of methods, languages, tools and personnel shall also be taken into consideration. However, it shall be noted that although diverse software may provide improved protection against common cause failures, it does not guarantee the absence of coincident errors. The decision by the licensee whether to use diversity and the choice of the type of diversity shall be justified.
3.2.9 Fail-safe design, supervision and fault tolerance
System fail-safe features, supervision and fault tolerant mechanisms shall be incorporated into the software, but only to the extent that the additional complexity is justified by a demonstrable global increase in safety. When software is used to check the hardware it is running on, its ability to respond correctly shall also be demonstrated. The use of external devices, such as watchdog timers, makes the system response to fault detection more dependable. Defensive design and the use of appropriate languages, including safe language subsets and coding techniques, shall be used to ensure a safe response under all circumstances, as far as this is achievable. One goal of the computer system requirements phase shall be to completely specify the desired and safe response to all combinations of inputs.
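A minimal sketch of the defensive, fail-safe style of 3.2.9 (all names, ranges and the watchdog stub are hypothetical): the safe response is the default, it is withdrawn only when every validity check passes, and the external watchdog timer is refreshed only by a normally completed cycle, so a hung or faulted program still leads to a dependable fail-safe outcome.

```python
# Sketch of defensive, fail-safe processing (hypothetical ranges and names).
# The safe response (trip demanded) is the default; it is withdrawn only when
# every check passes, and the external watchdog is refreshed only on a
# normally completed cycle.

SENSOR_RANGE_C = (0.0, 400.0)   # hypothetical valid sensor range
TRIP_SETPOINT_C = 320.0

def refresh_watchdog() -> None:
    """Placeholder for refreshing an external watchdog timer device."""
    pass

def process_cycle(raw_value: float) -> bool:
    """Return True if a trip is demanded; the fail-safe answer is True."""
    trip_demand = True  # safe state assumed until proven otherwise
    low, high = SENSOR_RANGE_C
    if isinstance(raw_value, float) and low <= raw_value <= high:
        trip_demand = raw_value > TRIP_SETPOINT_C
        refresh_watchdog()  # only a normally completed cycle refreshes it
    return trip_demand

print(process_cycle(250.0))         # in range, below setpoint: no trip (False)
print(process_cycle(float("nan")))  # fails the validity check: fail-safe trip (True)
```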
3.2.10 Security
It shall be demonstrated that measures have been taken to protect the computer based system throughout its entire lifetime against physical attack, intentional and non-intentional intrusion, fraud, viruses and so on. Safety systems shall not be connected to external networks unless it can be justified that it is safe to do so.
3.2.11 Maintainability
The computer based system shall be designed to facilitate the detection, location and diagnosis of failures so that the system can be repaired or replaced efficiently. Software that has a modular structure will be easier to repair and will also be easier to review and analyze, since the design can be easier to understand and easier to modify without introducing new errors. Software maintainability also includes the concept of making changes to the functionality. The design of a computer based system shall ensure that such changes are confined to a small part of the software.
3.2.12 Full representation of operating modes
The requirements for and design of the software for systems important to safety shall explicitly define all relations between input and output for each of the operating modes. The software design shall be simple enough to permit consideration of all input combinations that represent all operating modes.
3.2.13 Human-machine interfaces and anticipation of human limitations
The design of human-machine interfaces may have a significant impact on safety. The human-machine interfaces shall be designed to provide the operator with a sufficient and structured but not overwhelming amount of information, and also to provide sufficient time for reacting (see, for example, the thirty minute rule in the relevant documentation). When functions are allocated to the operator, the computer system shall allow for the time needed to derive a manual action from the information provided. All operator inputs shall be checked for validity as a defense against impairment by operator error. The possibility of the operator overriding these validity checks shall be investigated carefully.
3.2.14 Demonstrable dependability
Not only shall the system be dependable, it shall also be possible to demonstrate to the regulator that it is dependable. The Safety Guide provides recommendations to licensees on how to achieve demonstrable dependability through design and qualification methods that improve traceability and through the production of adequate documents.
3.2.15 Testability
Each requirement and each design feature shall be expressed in such a manner that a test can be performed to determine whether that feature has been implemented correctly. Both functional and non-functional requirements shall be testable. Test results shall be traceable back to the associated requirements.
3.3 Design and development activities
3.3.1 General
In determining the necessary design and development activities, particular attention shall be given to the contents of 3.3.2~3.3.7.
3.3.2 Process controlled step by step
The design and development process shall be controlled step by step. This development process can give evidence of correctness by its very construction. This can also ease the verification process and ensure that errors are detected early in the design process.
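The testability and reverse traceability asked for in 3.2.15 can be illustrated with a minimal sketch (hypothetical requirement identifier REQ-SW-017 and trip function): the test case carries the unique identifier of the requirement it exercises, so the recorded test result can be traced back to that requirement.

```python
# Sketch of a requirement-traceable test case (hypothetical identifier REQ-SW-017):
# the test name and docstring carry the requirement ID so that results can be
# traced back to the requirement they demonstrate.

import unittest

TRIP_SETPOINT_C = 320.0

def trip_demand(temperature_c: float) -> bool:
    return temperature_c > TRIP_SETPOINT_C

class TestReqSW017TripOnHighTemperature(unittest.TestCase):
    """REQ-SW-017 (hypothetical): a trip shall be demanded when the measured
    temperature exceeds the setpoint."""

    def test_above_setpoint_trips(self):
        self.assertTrue(trip_demand(TRIP_SETPOINT_C + 0.1))

    def test_at_or_below_setpoint_does_not_trip(self):
        self.assertFalse(trip_demand(TRIP_SETPOINT_C))

if __name__ == "__main__":
    unittest.main()
```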
3.3.3 Reviewability
Accurate and easily reviewable documentation shall be produced for all stages of the development process. The documentation used to demonstrate adequacy to the regulator shall be the same as the documentation actually used in the design.
3.3.4 Comprehensive testing
Comprehensive testing shall be applied. Testing is an important part of development, verification and validation. A demonstration of test coverage, including tracing of test cases to source documents (design specification and requirements specifications, etc.), shall be provided. The test results, the demonstration of test coverage and other test records shall be available for third party audit. A comprehensive test plan shall be established and made available to the regulator at an early stage of the project.
3.3.5 Use of automated tools
All tools used shall be mutually compatible. The tools shall be qualified to a level commensurate with their function in the development of the software and in the safety demonstration. The techniques used to gain confidence in the tools shall be specified and documented.
3.3.6 Traceability
Requirements shall be traceable to design; design shall be traceable to code; requirements, design and code shall be traceable to tests. Traceability shall be maintained when changes are made. There shall also be traceability in the reverse direction, to ensure that no unintended functions have been created.
3.3.7 Compliance with standards
The requirements for management of safety, the safety requirements and the technical standards to be used in the development shall be identified. A compliance analysis shall be prepared for the key standards used in the specification and implementation of the computer system design.
3.4 Management and quality assurance
3.4.1 Clearly defined functions and qualifications of personnel
The plant management shall demonstrate to the nuclear safety regulator that the level of staffing is adequate. Management shall establish the functions and qualifications required of personnel involved in software design, production and maintenance, and shall ensure that only qualified and experienced personnel perform these functions. Personnel qualifications shall be established for each task within the software development and maintenance process, including the execution of the quality assurance program.
3.4.2 Acceptable practices
Well established methods, languages and tools for software development shall be used. Methods, languages and tools that are still at the research stage shall not be used.
3.4.3 Quality assurance program
The organization’s quality assurance program shall extend into the software development process and shall also cover configuration management and change control after the software is delivered. At least for safety systems, independent verification, validation and testing, with independent supporting audits, shall also be covered. The quality requirements for the software shall be described in a software quality assurance plan, which shall be required by the quality assurance program.
3.4.4 Assignment of responsibilities
Interfaces shall be established between organizational entities within the design organization and between the design organization, its client (user) and other organizations involved in the system development process. Controls pertaining to design interfacing shall be established and shall include the assignment of responsibilities and the issue and transmission of documents to and from the interfacing organizations.
3.4.5 Third party assessment
3.4.5.1 Third party assessment shall be used for safety systems.
Its scope and extent shall be defined in the quality assurance program. The team performing the quality assurance and the verification and validation tasks shall be independent of the development team. These issues are covered later in 3.4.5.2 and 4.4.6 of the Safety Guide.
3.4.5.2 The objective of the third party assessment is to provide a view on the adequacy of the system and its software which is independent of both the supplier and the user (the licensee). Such an assessment may be undertaken by the regulator or by a body acceptable to the regulator. It shall provide confidence in the production process of the licensee and the suppliers. The assessment strategy, and the competence and knowledge of the project needed for the third party assessment to provide the necessary level of confidence, shall be carefully considered. In addition, the third party assessment shall be agreed to by all parties (nuclear safety regulator, licensee, suppliers) so that the appropriate resources can be made available at the desired time. Some of the third party assessments shall involve examination of the process (e.g. through quality assurance audits and technical inspections). Other third party assessments shall include examination of the product (e.g. through static analysis, dynamic analysis, code and/or data inspection and test coverage analysis). The assessment of the final product shall (as far as possible, in view of time constraints) be undertaken on the final version of the software. This can include third party assessment of intermediate products of the development process, such as software specifications.
3.5 Documentation
3.5.1 General
3.5.1.1 For confidence in the reliability of a software product, evidence shall be provided of the soundness of the development process. The documentation is crucial in providing the ‘transparency’ and ‘traceability’ necessitated by this approach. Documentation for the design and implementation of a trustworthy software product shall be clear and precise.
3.5.1.2 The set of documents shall ensure the traceability of design decisions. Appropriate documents shall be produced at each step of the development process. Documentation shall be updated throughout the development, including the commissioning and ongoing maintenance processes. The documents available to the regulator shall be identical to those used by the designers. The designer shall be informed of this early in the project.
3.5.1.3 Documentation of the requirements, the design and the code shall be clear and precise so that the designers, the programmers and the independent reviewers can comprehend fully every stage of the development and verify its completeness and correctness.
3.5.1.4 Good documentation is also essential to maintenance. A proper documentation format shall be used to reduce the likelihood of inconsistencies and errors in future maintenance related changes. Documents shall have the attributes of comprehensibility, preciseness, traceability and completeness, consistency, verifiability and modifiability, as described in 3.5.2~3.5.7.
3.5.2 Comprehensibility
Documentation shall be understandable by people with a variety of backgrounds and expertise. The language used shall be clear, and when it is formal (e.g. graphical forms), it shall have well defined syntax and semantics.
3.5.3 Preciseness
Requirements and designs shall be described formally, with explanations given in natural language. There shall be only one possible interpretation of each description.
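A minimal sketch of the preciseness described in 3.5.3 (hypothetical requirement and setpoint): the requirement is stated once in natural language and once as an executable, formal predicate, and a boundary check confirms that only one interpretation is possible.

```python
# Sketch (hypothetical requirement): one natural language statement, one formal,
# executable statement, and a check that the two descriptions agree at the boundary.

# Natural language: "The channel shall demand a trip if and only if the measured
# pressure is greater than or equal to 15.5 MPa."  (hypothetical setpoint)

TRIP_PRESSURE_MPA = 15.5

def trip_demand(pressure_mpa: float) -> bool:
    """Formal description of the same requirement."""
    return pressure_mpa >= TRIP_PRESSURE_MPA

# The single admissible interpretation can be spot checked at the boundary:
assert trip_demand(15.5) is True      # "greater than or equal to" includes the setpoint
assert trip_demand(15.499) is False
print("boundary interpretation confirmed")
```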
3.5.4 Traceability and completeness
3.5.4.1 The purpose of tracing is to demonstrate that the implementation is complete with respect to the computer system requirements and design, and to facilitate the detection of unsafe features in the implementation. Tracing from higher level documents to software documents checks completeness, and tracing back from software documents to higher level documents checks for unspecified items which might be unsafe. Every requirement shall be uniquely identified.
3.5.4.2 A traceability matrix (or matrices) shall be implemented that clearly shows the linkage between the computer based system requirements and the items in the computer system requirements analysis, the computer system design, the software requirements analysis, the software design and the software implementation which implement each requirement. This matrix shall demonstrate complete test coverage of the requirements for the computer based system during implementation, integration, installation and commissioning.
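A minimal sketch of the traceability matrix of 3.5.4.2 (all identifiers hypothetical): each computer based system requirement is linked to the items that implement it and to its test cases, and the forward and backward checks of 3.5.4.1 then become simple mechanical queries, flagging requirements without test coverage and items that implement no stated requirement.

```python
# Sketch of a traceability matrix (hypothetical identifiers): forward tracing
# checks completeness of the implementation, backward tracing flags items that
# implement no stated requirement and might therefore be unsafe.

TRACE = {
    "SYS-REQ-001": {"design": ["SYS-DES-010"], "sw_req": ["SW-REQ-017"],
                    "sw_design": ["SW-MOD-trip_logic"], "tests": ["TC-0042"]},
    "SYS-REQ-002": {"design": ["SYS-DES-011"], "sw_req": [],
                    "sw_design": [], "tests": []},
}
ALL_SW_MODULES = ["SW-MOD-trip_logic", "SW-MOD-maintenance_mode"]

# Forward tracing: every requirement must reach tests that demonstrate it.
untested = [req for req, links in TRACE.items() if not links["tests"]]
print("requirements without test coverage:", untested)

# Backward tracing: every software module must trace back to some requirement.
traced_modules = {m for links in TRACE.values() for m in links["sw_design"]}
unspecified = [m for m in ALL_SW_MODULES if m not in traced_modules]
print("modules with no specified requirement:", unspecified)
```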
3.5.5 Consistency
Documents shall not contain contradictory or inconsistent statements. Each item of information in a document shall be confined to a single, identifiable passage and shall not be repeated in, or split between, two or more passages. Each requirement, design unit or program module shall have a unique identifier (this also supports traceability). Notation (symbols), terminology, comments and technical methods shall be used consistently throughout the documentation.
3.5.6 Verifiability
Documentation shall be comprehensible, unambiguous and traceable in order to improve its verifiability. The use of the same models or languages for the computer system requirements throughout the software requirements and the software design will also assist verification, although this need not be required for all issues. When a formal language is used to specify the requirements or the design, theorem provers and model checkers may also assist verification.
3.5.7 Modifiability
Documentation shall be modifiable; that is, its structure and style shall be such that any necessary change can be made easily, completely and consistently, and can be readily identified.
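The mechanical support for verification mentioned in 3.5.6 can be illustrated with a minimal sketch (hypothetical mode table; exhaustive enumeration here stands in for a model checker): when the requirement is stated formally and the discrete input space is small, the implemented relation can be checked against the specified relation for every input combination, which also supports the full representation of operating modes asked for in 3.2.12.

```python
# Sketch (hypothetical mode table): when the input space is small and discrete,
# the specified input/output relation can be checked exhaustively, a simple
# stand-in for the model checking mentioned in 3.5.6.

from itertools import product

def specified_output(mode_auto: bool, sensor_high: bool, manual_trip: bool) -> bool:
    """Specification: trip whenever a manual trip is demanded, or whenever the
    sensor reads high while in automatic mode (hypothetical relation)."""
    return manual_trip or (mode_auto and sensor_high)

def implemented_output(mode_auto: bool, sensor_high: bool, manual_trip: bool) -> bool:
    """Implementation under verification (written independently of the spec)."""
    if manual_trip:
        return True
    return sensor_high if mode_auto else False

for inputs in product((False, True), repeat=3):
    assert implemented_output(*inputs) == specified_output(*inputs), inputs
print("implementation agrees with the specification on all 8 input combinations")
```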
4 Project Planning
4.1 General
4.1.1 The development process shall be carefully planned, and clear evidence that the requirements of the development process have been followed shall be provided in order to facilitate the licensing assessment of the system important to safety. Project planning shall provide for comprehensive and specific safety documentation (a safety analysis report) for the computer based system, or for a set of plans covering all aspects of the project. Chapter 4 describes each aspect as a separate plan, but it is equally acceptable to combine these plans into a single document. Modifications to the system shall not be precluded, and provision shall be made for the safety analysis to be repeated as necessary.
4.1.2 The development plan shall define the sequence of development activities and the essential characteristics of each activity. In addition, plans shall be prepared for quality assurance, verification and validation, configuration management, and commissioning and installation. Figure 1 shows the flow of development of a computer based system that is assumed to apply in the Safety Guide (information relevant to the choice of a particular development model is given in Chapter 2). Figure 2 illustrates, as a flow, the relationship of verification and validation to the requirements, design and implementation.
4.2 Development plan
4.2.1 General
The development plan shall identify and define the development process to be used for the particular project. The items to be covered by the development plan for the computer system are described in 4.2.2~4.2.7.
4.2.2 Phases
All phases of the development process (Figure 1) shall be identified. Each phase, such as computer system design, consists of specification, design and implementation. The design activity of one phase forms the requirements for the next phase, and the requirements defined for a phase are part of the design activity of the preceding phase. For example, implementation is the process of selecting existing code (including library routines) and producing any supplementary code that is needed. Verification shall be carried out after the completion of each development phase and before the start of the next phase (see Figure 2 and 4.4 on the verification and validation plan).
4.2.3 Methods
The methods to be used shall be identified in the development plan. The choice of methods shall be linked to the quality assurance program, in which the standards and procedures are established.
4.2.4 Tools
4.2.4.1 The tools to be used shall be identified in the development plan. The choice of tools shall facilitate the correct use of the chosen methods, standards and procedures. Planning in advance for the whole project will assist in the selection of a consistent set of tools. Tools used for system development, management or verification shall be suitably qualified for their function. The tool qualification process shall ensure the correctness of the tool output, by means such as cross-verification or reverse analysis.
4.2.4.2 Tools shall be used to relieve personnel of tasks that are prone to human error (such as programming and verification). Experience has shown that tools also help to achieve a reproducible level of quality in system performance.
4.2.5 Documentation
The documents to be produced in each phase shall be identified and their contents defined. The location in the documentation of all requirements, quality attributes and performance characteristics shall be indicated, as well as the acceptance criteria to be used throughout the project. The documentation shall provide evidence that the project has adhered to the development plan.
4.2.6 Schedule and milestones
The schedule for the documentation shall be determined and the points at which project reviews are to be performed shall be identified. The management tasks include the following:
- assessing the availability of resources;
- estimating the time (duration) of each phase, allowing for iteration of the analysis;
- assessing the training needed;
- assessing the adequacy of the available facilities and tools;
- estimating the time necessary for management review and approval;
- estimating the time necessary for review of the key points of the project.
4.2.7 Personnel
Plans shall be made to ensure that the personnel involved in the development activities are competent in the use of the relevant standards, procedures and methods, in the use of the design, programming and analysis tools and methods, and in the performance of configuration management and change control. Records of the competence of personnel shall be kept. If new members join the team, they shall be trained and supervised until their competence reaches a professional level.
4.3 Quality assurance
Before the start of the project, the licensee shall prepare and implement a quality assurance program, which shall be available for review (and possible approval) by the regulator. A software quality assurance plan shall be prepared at the very start of the project. The complete plan shall cover external suppliers and shall include at least the following:
(1) identification of the hardware and software to which the quality assurance program applies, and identification of the management standards, procedures and tools to be used for the project;
(2) identification, for each document produced under the quality assurance program, of who is to review it and who is to approve its formal release;
(3) a description of the project organization, including assurance of the independence of the quality assurance supervisors;
(4) a statement of the qualifications and training required of the personnel involved in the project;
(5) the mechanisms for identifying, reporting and correcting non-conformances with standards and procedures;
(6) identification of all the necessary plans (such as the development plan, the verification plan, the configuration management plan, and the commissioning and installation plan);
(7) a statement of the number and scope of the quality assurance audits, based on the safety classification of the system;
(8) the procedures for the qualification of tools (see 4.2.4.1);
(9) the mechanisms for checking the quality of components from external suppliers; if these mechanisms rely on external certification procedures, such as type testing, a description of those procedures shall also be included.
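As a closing illustration (a purely hypothetical plan structure, not a required format), the nine topics of 4.3 can be carried as a simple checklist so that an incomplete draft of the software quality assurance plan is detected mechanically before review.

```python
# Sketch (hypothetical plan structure): check that a draft software quality
# assurance plan addresses each of the nine topics listed in 4.3.

REQUIRED_TOPICS = [
    "scope of hardware, software, standards and tools",
    "review and approval responsibilities for each document",
    "project organization and independence of QA supervisors",
    "qualifications and training of personnel",
    "handling of non-conformances",
    "identification of all necessary plans",
    "number and scope of QA audits by safety class",
    "tool qualification procedures",
    "checks on components from external suppliers",
]

draft_plan_sections = {
    "scope of hardware, software, standards and tools",
    "qualifications and training of personnel",
    "tool qualification procedures",
}

missing = [topic for topic in REQUIRED_TOPICS if topic not in draft_plan_sections]
print("topics still to be addressed:", missing)
```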