IEEE Std 845-1999 (R2005)
IEEE Guide for the Evaluation of Human-System Performance in Nuclear Power Generating Stations


The Institute of Electrical and Electronics Engineers, Inc.
3 Park Avenue, New York, NY 10016-5997, USA

Copyright 2005 by the Institute of Electrical and Electronics Engineers, Inc. All rights reserved. Published 28 September 1999. Printed in the United States of America. IEEE is a registered trademark in the U.S. Patent … (978) 750-8400. Permission to photocopy portions of any individual standard for educational classroom use can also be obtained through the Copyright Clearance Center.

Note: Attention is called to the possibility that implementation of this standard may require use of subject matter covered by patent rights. By publication of this standard, no position is taken with respect to the existence or validity of any patent rights in connection therewith. The IEEE shall not be responsible for identifying patents for which a license may be required by an IEEE standard or for conducting inquiries into the legal validity or scope of those patents that are brought to its attention.

Copyright 1999 IEEE. All rights reserved.

Introduction

(This introduction is not part of IEEE Std 845-1999, IEEE Guide for the Evaluation of Human-System Performance in Nuclear Power Generating Stations.)

This introduction provides background on the rationale used to develop this guide. This information is meant to assist in the understanding and usage of this guide. Human factors engineering has been a part of nuclear power plant design, construction, and operation from the industry's beginning, although not under that name. (For example, see H. L. Parris, "A Review of Human Factors R…

…

b) Testing design or operating approaches for adequacy;
c) Comparing alternative designs or configurations; or
d) Evaluating the maintainability of the system.

This guide is for use by personnel who are familiar with the concepts of formal human factors analysis, but not necessarily familiar with the details of specific techniques.

1 The numbers in brackets correspond to those of the bibliography in Annex A.

2. Definitions

For the purposes of this guide, the following terms and definitions apply. IEEE Std 100-1996 [B2] should be referenced for terms not defined in this clause.

2.1 human-system interface (HSI): The interaction between workers and their equipment. This interaction requires information to flow in two directions. The system provides status information to the user, and the user provides control information to the system. Used in other texts as man-machine interface (MMI), human-machine interface (HMI), human-machine system (HMS), and human-computer interface (HCI). (For further information see IEEE Std 1289-1998 [B5].)

2.2 system development cycle: The life cycle through which a system is developed, which consists of the following:

a) Concept development;
b) Design;
c) Test and construction;
d) Operation; and
e) Maintenance (see IEEE Std 1023-1988 [B3]).

3. Evaluating human-system performance

3.1 General

To evaluate human-system performance, the evaluator needs to recognize that human performance is integral to system performance throughout design, development, testing, operation, and maintenance activities. Therefore, human performance is an integral part of system performance evaluation. Human performance is influenced by many factors. For example, environmental conditions; organizational design; training; and physiological, perceptual, and cognitive processes all influence human performance. The evaluator can apply various measures and evaluation techniques to formally evaluate the performance of people on tasks of interest.

There are several considerations that are associated with evaluating human-system performance. These include the following:

a) Selection and implementation of the measure and technique;
b) Analysis and interpretation of human performance data;
c) Measuring cognitive processes;
d) Generalizing from experimental studies; and
e) Establishing meaningful performance criteria for some tasks.

A comprehensive approach to human-system performance evaluation will require attention to these considerations. This guide includes brief discussions of selected considerations as they relate to the performance evaluation techniques recommended here. For a more detailed discussion of potential considerations, see ANSI/AIAA G-035-1992 [B1].

This guide describes human-system performance evaluation techniques that may be used to support the systems design approach described in IEEE Std 1023-1988 [B3]. These evaluation techniques include paper and pencil, observational, expert judgment, and experimental techniques. Human factors design analysis techniques (e.g., mission, function, task, and link analyses) are not included in this guide, but are described in IEEE Std 1023-1988 [B3]. Human factors expertise is desirable when selecting and applying appropriate human-system performance evaluation techniques to avoid the use of inefficient or inappropriate techniques. Descriptions of the type of data obtained from each technique, cost considerations, and other useful decision criteria are included to guide the user in incorporating human performance evaluation in system design.

3.2 Evaluation concepts

Human-system performance evaluation requires the evaluator to select appropriate measurement techniques, collect the data, and analyze and interpret the results. The selection of appropriate measurement techniques depends on the purpose of the overall evaluation and other practical constraints. Within these limits, different techniques exist that will be more or less suited to particular situations. This guide contains information for the selection and application of human-system performance evaluation techniques.

To interpret results, the evaluator should specify criteria for judging the acceptability of human-system performance. Without some form of acceptance criteria, the evaluator has performed only measurement, not evaluation. These criteria may be informal (the evaluator's opinion regarding the acceptability of the performance) or formal (establishing specific criteria related to the measurement; for example, operator diagnosis within a specific time limit). Other performance criteria may also be appropriate in the overall evaluation (e.g., maintaining an adequate margin to a safety function in the operation of the plant). Ultimately, the selection of appropriate criteria should be established in the context of the overall system development process, including project goals and constraints (see IEEE Std 1023-1988 [B3]).

3.3 Characteristics of human-system performance measures

The characteristics described in 3.3.1 should be considered when selecting evaluation measures that best reflect meaningful and measurable aspects of human performance. This is not a complete list, but represents the major characteristics that should be considered when selecting techniques. For a detailed discussion of characteristics, see ANSI/AIAA G-035-1992 [B1]. The application of specific evaluation techniques is described in 3.4.

3.3.1 Characteristics associated with selecting human-system performance measures

Acceptability: The degree to which evaluators at all levels agree on the use of the measure.

Accuracy: Minimization of measurement error.

Applicability: Not all techniques are applicable to all phases of system design and operation. Therefore, it is important in any evaluation of human-system performance to define the applicability of the specific measures. Consider the applicability of each measure with respect to its use during design and evaluation of the system.

Bias: The degree to which measured results are free from systematic sources of prejudice or error.

Intrusiveness: The extent to which the measure alters or interferes with the performance being measured. Avoid measures that influence the worker's performance or disrupt the activity.

Precision: The level of detail of the instrument, sensor, or instrumentation.

Reliability: The degree to which the measure yields consistent and reproducible findings when used in comparable circumstances.

Resources: The items needed to implement the measure, such as time, budget, personnel, equipment, logistics, and the need for specialized expertise.

Sensitivity: The degree to which the measure is able to discriminate meaningful variations on the dimension of interest.

Validity: The degree to which an instrument or technique can be demonstrated to measure what it is intended to measure.
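As an editorial illustration only (not part of the standard), the selection characteristics above can be applied as a simple weighted screening matrix that ranks candidate measures. The measure names, ratings, and weights below are hypothetical; in practice the ratings would come from the evaluator's judgment against each characteristic.

```python
# Illustrative sketch: ranking candidate measures by a weighted sum of
# ratings (1-5) against the selection characteristics listed in 3.3.1.
# The measures, ratings, and weights are invented for demonstration.

CHARACTERISTICS = [
    "acceptability", "accuracy", "applicability", "bias",
    "intrusiveness", "precision", "reliability", "resources",
    "sensitivity", "validity",
]

def screen_measures(ratings, weights=None):
    """Return (measure, score) pairs sorted best-first."""
    weights = weights or {c: 1.0 for c in CHARACTERISTICS}
    scores = {m: sum(weights[c] * r[c] for c in CHARACTERISTICS)
              for m, r in ratings.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical candidates rated uniformly for simplicity.
ratings = {
    "task completion time": {c: 4 for c in CHARACTERISTICS},
    "expert rating scale":  {c: 3 for c in CHARACTERISTICS},
}
for measure, score in screen_measures(ratings):
    print(f"{measure}: {score:.1f}")
```

Unequal weights can emphasize characteristics that matter most for a given evaluation (e.g., intrusiveness in an operating plant).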

3.3.2 Characteristics of subjective vs. objective measures

Often, objective measures are perceived as being more meaningful than subjective measures. This is not necessarily the case. Subjective measures yield data that are obtained from the judgments and opinions of users or experts (e.g., judgment of task difficulty). Such measures, while vulnerable to individual bias and perspective, are typically the most practical measures available for complex or inferred behaviors (e.g., problem solving). The sources of bias in subjective measures can be controlled in most situations by attention to the characteristics described in 3.3.1. Subjective measures, such as asking opinions on design features, may produce qualitative data. Subjective measures, such as the use of rating scales where numbers indicate the degree of response, may also produce quantitative data. Statistical techniques are available for analyzing data derived from both subjective and objective measures. Use of statistical techniques increases the formality and objectivity of performance evaluation.

Objective measures yield data that are obtained from observable human behavior (e.g., time to complete a task). Objective measures are not as vulnerable to bias introduced by individual opinions as subjective measures. Objective measures yield data such as reaction times, error rates, display types accessed, and the number of specific control actions. Though data from objective measures are difficult to collect from unobservable human behavior such as cognitive processes, they can help evaluators interpret complex human behaviors. When applied to observable behavior, data from objective measures can be used to draw inferences about differences between experimental groups, and between samples and the populations from which they are drawn.
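As a minimal illustration of the statistical treatment mentioned above (not part of the standard), the sketch below compares an objective measure, task completion time, for two hypothetical display designs using sample means and Welch's t statistic. All data values are invented.

```python
# Illustrative sketch: comparing task completion times (seconds) for
# two hypothetical designs. The samples are invented; a real evaluation
# would also consider sample size, criteria, and significance testing.
from statistics import mean, variance

design_a = [41.2, 38.5, 44.1, 39.8, 42.0, 40.3]
design_b = [46.9, 44.2, 48.1, 45.5, 47.3, 43.8]

def welch_t(x, y):
    """Welch's t statistic for two independent samples."""
    se = (variance(x) / len(x) + variance(y) / len(y)) ** 0.5
    return (mean(x) - mean(y)) / se

t = welch_t(design_a, design_b)
print(f"mean A = {mean(design_a):.1f} s, mean B = {mean(design_b):.1f} s")
print(f"Welch t = {t:.2f}")
```

A large-magnitude t here would suggest the difference in means is unlikely to be chance, supporting an inference about which design performs better.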

3.3.3 Characteristics of using diverse measures

Given the nature of an evaluation, a single measure may not yield data to assure sufficiently valid results. In such cases, multiple (i.e., diverse) measures of the same performance should be considered. Agreement among the data obtained from multiple measures may strengthen the findings. Note that the expected benefit of adding measure(s) to an evaluation must justify the costs of their data collection and processing. However, the incremental costs of added measures may be relatively small if simple methods are used.

3.4 Evaluation techniques

Many human-system performance evaluation techniques apply to the nuclear power plant setting. This subclause describes some of the most commonly used techniques. An assessment of each technique is made relative to the attributes described in 3.3. For a detailed technical discussion of each technique, refer to the documents in Annex A. The human-system performance evaluation techniques are divided into the following four descriptive categories, each based on the method of application:

a) Paper and pencil;
b) Observational;
c) Expert judgment; and
d) Experimental.

3.4.1 Paper and pencil techniques

The evaluation techniques in this category are different in that the observation of actual system/personnel performance is not required. The output from these techniques can be a simple accept-or-reject decision, a hierarchical ranking, or a numerical representation of merit.

3.4.1.1 Checklists

Description: A checklist is a numbered set of statements presenting attributes that a piece of equipment or a system should possess to meet acceptable human factors criteria. A checklist is used to partition a complex entity into more readily understood elements. Such human engineering elements typically concern whether expected information or control functions are available, whether their design characteristics are suitable, or whether a specific performance result is obtained. If checklist items are too general, then the evaluator may make discretionary decisions, and thus the result may be less reliable. If checklist items are too specific, their application requires more effort. Evaluator discretion will determine which checklist items apply to different components of the system or equipment. Checklist items should be simple and clearly specified. Bases for the criteria should be established and retained.

Requirements: A checklist is useful in at least four different ways, as follows:

1) As a memory aid, reminding the evaluator to assess a particular characteristic;
2) As a reference standard, providing criteria against which the design…
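As an illustration only (not part of the standard), a checklist of the kind described above can be represented so that it yields both a simple accept-or-reject decision and a numerical representation of merit. The item statements below are hypothetical examples of human engineering elements.

```python
# Illustrative sketch: a checklist as a numbered set of statements,
# producing an accept/reject decision and a numerical merit score.
# The item statements are hypothetical.

checklist = [
    (1, "Expected status information is available to the operator"),
    (2, "Control functions are reachable from the normal work position"),
    (3, "Display labels are legible at the design viewing distance"),
    (4, "Alarm response can be completed within the required time"),
]

def evaluate(results):
    """results maps item number -> True (pass) / False (fail)."""
    passed = sum(1 for n, _ in checklist if results[n])
    merit = passed / len(checklist)          # merit on a 0.0-1.0 scale
    accept = passed == len(checklist)        # accept only if all pass
    return accept, merit

accept, merit = evaluate({1: True, 2: True, 3: False, 4: True})
print(f"accept={accept}, merit={merit:.2f}")   # accept=False, merit=0.75
```

Recording which specific items failed, not just the merit score, preserves the bases for the criteria as the guide recommends.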
