
Augmented Reality

Augmented reality standards are published by ISO, IEC, IEEE, CSA, BS, and DS. They cover topics including computer graphics; the multimedia application format (MPEG-A); information technology for learning, education and training; human-system interaction; terminology; mixed and augmented reality; benchmarking of vision-based spatial registration and tracking methods; human factor guidelines; entity representation; a catalogue model for content; tactile and haptic interactions; immersive visual content coding; eyewear displays; and requirements for managers of information for users.


ISO/IEC 18038:2020

Information technology - Computer graphics, image processing and environmental representation - Sensor representation in mixed and augmented reality

This document defines the framework and information reference model for representing sensor-based 3D mixed-reality worlds. It defines concepts, an information model, architecture, system functions, and how to integrate 3D virtual worlds and physical sensors in order to provide mixed-reality applications with physical sensor interfaces. It defines an exchange format necessary for transferring and storing data between physical sensor-based mixed-reality applications.

This document specifies the following functionalities:

a) representation of physical sensors in a 3D scene;

b) definition of physical sensors in a 3D scene;

c) representation of functionalities of each physical sensor in a 3D scene;

d) representation of physical properties of each physical sensor in a 3D scene;

e) management of physical sensors in a 3D scene;

f) interface with physical sensor information in a 3D scene.

This document defines a reference model for physical sensor-based mixed-reality applications to represent and to exchange functions of physical sensors in 3D scenes. It does not define specific physical interfaces necessary for manipulating physical devices, but rather defines common functional interfaces that can be used interchangeably between applications.

This document does not define how specific applications are implemented with specific physical sensor devices. It does not include computer generated sensor information using computer input/output devices such as a mouse or a keyboard. The sensors in this document represent physical sensor devices in the real world.
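To make this concrete, the sketch below shows one way a physical sensor placed in a 3D scene could be represented and exported as an exchange payload. Every field name here is invented for illustration; ISO/IEC 18038 itself defines the normative information model and exchange format.

```python
import json
from dataclasses import asdict, dataclass, field

@dataclass
class PhysicalSensor:
    """Illustrative record for a physical sensor in a 3D scene.

    Field names are hypothetical, not ISO/IEC 18038's normative schema.
    """
    sensor_id: str
    sensor_type: str                 # e.g. "temperature", "gps", "camera"
    position: tuple                  # sensor pose in scene coordinates (x, y, z)
    properties: dict = field(default_factory=dict)

def export_scene_sensors(sensors):
    """Serialize the scene's sensors to a JSON exchange payload."""
    return json.dumps({"sensors": [asdict(s) for s in sensors]}, sort_keys=True)

payload = export_scene_sensors([
    PhysicalSensor("t-01", "temperature", (1.0, 0.5, 2.0), {"unit": "celsius"}),
])
print(payload)
```

A real implementation would also cover the standard's management and interface functions (items e and f above); this sketch covers only representation.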


ISO/IEC 23000-13:2017

Information technology - Multimedia application format (MPEG-A) - Part 13: Augmented reality application format

ISO/IEC 23000-13:2017 specifies the following:

- scene description elements for representing AR content;

- mechanisms to connect to local and remote sensors and actuators;

- mechanisms to integrate compressed media (image, audio, video, graphics);

- mechanisms to connect to remote resources such as maps and compressed media.


INCITS/ISO/IEC 23000-13:2017 (2021)

Information technology - Multimedia application format (MPEG-A) - Part 13: Augmented reality application format

Specifies the following: scene description elements for representing AR content; mechanisms to connect to local and remote sensors and actuators; mechanisms to integrate compressed media (image, audio, video, graphics); mechanisms to connect to remote resources such as maps and compressed media.


ISO/IEC TR 23843:2020

Information technology for learning, education and training - Catalogue model for virtual, augmented and mixed reality content

This document describes how to search for virtual reality (VR), augmented reality (AR) and mixed reality (MR) content through a curriculum catalogue based on curriculum and achievement standards information. The curriculum catalogue metadata is defined in order to search for educational VR and MR content information.


ISO/TS 9241-430:2021

Ergonomics of human-system interaction - Part 430: Recommendations for the design of non-touch gestural input for the reduction of biomechanical stress

This document provides guidance on the design, selection and optimization of non-contacting hand and arm gestures for human-computer interaction. It addresses the assessment of usability and fatigue associated with different gesture set designs and provides recommendations for approaches to evaluating the design and selection of gestures. This document also provides guidance on the documentation of the process for selecting gesture sets.

 

This document applies to gestures expressed by humans. It does not consider the technology for detecting gestures or the system response when interpreting a gesture. Non-contacting hand gestures can be used for input in a variety of settings, including the workplace or in public settings and when using fixed screens, mobile, virtual reality, augmented reality or mixed-mode reality devices.

 

Some limitations of this document are:

 

—    The scope is limited to non-contacting gestures and does not include other forms of inputs. For example, combining gesture with speech, gaze or head position can reduce input error, but these combinations are not considered here.

 

—    The scope is limited to non-contacting arm, hand and finger gestures, either unilateral (one-handed) or bilateral (two-handed).

 

—    The scope assumes that all technological constraints are surmountable. Therefore, there is no consideration of technological limitations in interpreting ultra-rapid gestures, or gestures performed by people of different skin tones or wearing different colours or patterns of clothing.

 

—    The scope is limited to UI-based command-and-control human computer interaction (HCI) tasks and does not include gaming scenarios, although the traversal of in-game menus and navigation of UI elements is within scope.

 

—    The scope does not include HCI tasks for which a clearly better input method exists. For example, speech input is superior to gesture input for entering text.

 

—    The scope includes virtual reality (VR), augmented reality (AR) and mixed reality (MR) and the use of head-mounted displays (HMDs).

 

—    The scope does not include the discoverability of gestures but does include the learnability and memorability of gestures. It is assumed that product documentation and tutorials will adequately educate end users about which gestures are possible. Therefore, assessing gesture discoverability is not a primary goal of the recommendations in this document.


IEEE 1589-2020

IEEE Standard for Augmented Reality Learning Experience Model

Augmented Reality (AR) promises significant gains in operational efficiency by making information available, in context and in real time, to employees who need task support. To support such implementations of AR training systems, this document proposes an overarching integrated conceptual model that describes the interactions between the physical world, the user and digital information, the context for AR-assisted learning, and other parameters of the environment. It defines two data models and their bindings to XML and JSON for representing learning activities (also known as employee tasks and procedures) and the learning environment in which these tasks are performed (also known as the workplace). The interoperability specification is presented in support of an open market in which interchangeable component products provide alternatives to monolithic AR-assisted learning systems. It also facilitates the creation of experience repositories and online marketplaces for AR-enabled learning content. Specific attention was given to reusing and repurposing existing learning content and to catering for 'mixed' experiences that combine real-world learner guidance with the consumption (or production) of traditional content such as instructional video material or learning apps and widgets.
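For a rough feel of what a JSON binding of a learning activity might look like, here is a toy record. Every key is invented for this sketch; the actual element names and structure are defined by the IEEE 1589 data models.

```python
import json

# Hypothetical learning-activity record: a task broken into ordered steps,
# plus a pointer into the workplace model. Keys are illustrative only.
activity = {
    "id": "act-042",
    "title": "Replace filter cartridge",
    "steps": [
        {"order": 1, "instruction": "Shut off the inlet valve"},
        {"order": 2, "instruction": "Unscrew the housing",
         "media": "video:howto.mp4"},   # reuse of traditional content
    ],
    "workplace": {"station": "pump-room-3", "trackable": "marker:qr-117"},
}

doc = json.dumps(activity, indent=2)
print(doc)
```

The point of the standard's two-model split is visible even in this toy: the activity (steps) is authored once, while the workplace reference binds it to a concrete physical environment.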



ISO/IEC 18039:2019

Information technology - Computer graphics, image processing and environmental data representation - Mixed and augmented reality (MAR) reference model

This document defines the scope and key concepts of mixed and augmented reality, the relevant terms and their definitions and a generalized system architecture that together serve as a reference model for mixed and augmented reality (MAR) applications, components, systems, services and specifications. This architectural reference model establishes the set of required sub-modules and their minimum functions, the associated information content and the information models to be provided and/or supported by a compliant MAR system.

The reference model is intended for use by current and future developers of MAR applications, components, systems, services or specifications to describe, compare, contrast and communicate their architectural design and implementation. The MAR reference model is designed to apply to MAR systems independent of specific algorithms, implementation methods, computational platforms, display systems and sensors or devices used.

This document does not specify how a particular MAR application, component, system, service or specification is designed, developed or implemented. It does not specify the bindings of those designs and concepts to programming languages or the encoding of MAR information through any coding technique or interchange format. This document contains a list of representative system classes and use cases with respect to the reference model.


ISO/IEC 18040:2019

Information technology - Computer graphics, image processing and environmental data representation - Live actor and entity representation in mixed and augmented reality (MAR)

This document defines a reference model and base components for representing and controlling a single LAE or multiple LAEs in an MAR scene. It defines concepts, a reference model, system framework, functions and how to integrate a 2D/3D virtual world and LAEs, and their interfaces, in order to provide MAR applications with interfaces of LAEs. It also defines an exchange format necessary for transferring and storing LAE-related data between LAE-based MAR applications.

This document specifies the following functionalities:

a) definitions for an LAE in MAR;

b) representation of an LAE;

c) representation of properties of an LAE;

d) sensing of an LAE in a physical world;

e) integration of an LAE into a 2D/3D virtual scene;

f) interaction between an LAE and objects in a 2D/3D virtual scene;

g) transmission of information related to an LAE in an MAR scene.

This document defines a reference model for LAE representation-based MAR applications to represent and to exchange data related to LAEs in a 2D/3D virtual scene in an MAR scene. It does not define specific physical interfaces necessary for manipulating LAEs, that is, it does not define how specific applications need to implement a specific LAE in an MAR scene, but rather defines common functional interfaces for representing LAEs that can be used interchangeably between MAR applications.


ISO/IEC 18520:2019

Information technology - Computer graphics, image processing and environmental data representation - Benchmarking of vision-based spatial registration and tracking methods for mixed and augmented reality (MAR)

This document identifies the reference framework for the benchmarking of vision-based spatial registration and tracking (vSRT) methods for mixed and augmented reality (MAR).

The framework provides typical benchmarking processes, benchmark indicators and trial set elements that are necessary to successfully identify, define, design, select and apply benchmarking of vSRT methods for MAR. It also provides definitions for terms on benchmarking of vSRT methods for MAR.

In addition, this document provides a conformance checklist as a tool to clarify how each benchmarking activity conforms to this document in a compact form by declaring which benchmarking processes and benchmark indicators are included and what types of trial sets are used in each benchmarking activity.
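A typical benchmark indicator in this setting is pose error measured against a ground-truth trial set. The sketch below computes a root-mean-square translation error over a tracked sequence; it is a generic illustration, not one of the indicators defined by ISO/IEC 18520.

```python
import math

def translation_rmse(estimated, ground_truth):
    """RMS translation error between estimated and ground-truth camera
    positions over a trial sequence (illustrative indicator only)."""
    assert len(estimated) == len(ground_truth)
    per_frame = [
        sum((e - g) ** 2 for e, g in zip(est, gt))
        for est, gt in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(per_frame) / len(per_frame))

# Tiny trial set: three frames of (x, y, z) positions; the estimate is
# offset from ground truth by a constant 0.1 m in z.
gt = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est = [(0.0, 0.0, 0.1), (1.0, 0.0, 0.1), (2.0, 0.0, 0.1)]
print(translation_rmse(est, gt))  # constant 0.1 m offset, so RMSE ≈ 0.1
```

A full benchmarking activity per the framework would pair indicators like this with defined processes and published trial sets, so results are comparable across methods.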


ISO/IEC 23488:2022

Information technology - Computer graphics, image processing and environment data representation - Object/environmental representation for image-based rendering in virtual/mixed and augmented reality (VR/MAR)

This document specifies an image-based representation model that represents target objects/environments using a set of images and optionally the underlying 3D model for accurate and efficient objects/environments representation at an arbitrary viewpoint. It is applicable to a wide range of graphic, virtual reality and mixed reality applications which require the method of representing a scene with various objects and environments.

 

This document:

 

—    defines terms for image-based representation and 3D reconstruction techniques;

 

—    specifies the required elements for image-based representation;

 

—    specifies a method of representing the real world in the virtual space based on image-based representation;

 

—    specifies how visible image patches can be integrated with the underlying 3D model for more accurate and rich objects/environments representation from arbitrary viewpoints;

 

—    specifies how the proposed model allows multi-object representation;

 

—    provides an XML based specification of the proposed representation model and an actual implementation example (see Annex A).
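The normative XML specification lives in the document's Annex A; the fragment below is only a schematic stand-in with invented element names, showing the general shape of an object represented by image patches plus an optional underlying 3D model.

```python
import xml.etree.ElementTree as ET

# Hypothetical element names; ISO/IEC 23488 Annex A defines the real XML.
obj = ET.Element("RepresentedObject", id="chair-01")
ET.SubElement(obj, "UnderlyingModel", href="chair.obj")      # optional 3D model
for view, image in [("front", "chair_f.png"), ("left", "chair_l.png")]:
    ET.SubElement(obj, "ImagePatch", viewpoint=view, href=image)

xml_text = ET.tostring(obj, encoding="unicode")
print(xml_text)
```

A renderer following this idea would pick the patches nearest the current viewpoint and fall back to (or blend with) the underlying model for uncovered views.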


ISO/IEC TS 23884:2021

Information technology - Computer graphics, image processing and environmental data representation - Material property and parameter representation for model-based haptic simulation of objects in virtual, mixed and augmented reality (VR/MAR)

This document specifies:

 

—    physical and material parameters of virtual or real objects expressed to support comprehensive haptic rendering methods, such as stiffness, friction and micro-textures;

 

—    a flexible specification of the haptic rendering algorithm itself.

 

It supplements other standards that describe scene or content description and information models for virtual and mixed reality, such as ISO/IEC 19775 and ISO/IEC 3721-1.
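Stiffness is the classic example of how such a parameter feeds a rendering algorithm: a penalty (spring) model turns penetration depth into a reaction force. The sketch below uses invented parameter names and a deliberately simple model; the TS defines the normative representation.

```python
# Hypothetical material record; ISO/IEC TS 23884 defines the real parameters.
material = {"stiffness_n_per_m": 800.0, "static_friction": 0.6}

def penalty_force(material, penetration_m):
    """Spring-model normal force F = k * d against a rigid virtual surface."""
    if penetration_m <= 0.0:          # haptic tool not in contact
        return 0.0
    return material["stiffness_n_per_m"] * penetration_m

print(penalty_force(material, 0.002))  # 2 mm penetration, roughly 1.6 N
```

Friction and micro-texture parameters would extend this with tangential force terms; the TS's point is that such parameters travel with the object so different renderers produce comparable feel.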


CAN/CSA ISO/IEC 23000-13-2018

Information technology - Multimedia application format (MPEG-A) - Part 13: Augmented reality application format (Adopted ISO/IEC 23000-13:2017, second edition, 2017-11)

CSA Preface: Standards development within the Information Technology sector is harmonized with international standards development. Through the CSA Technical Committee on Information Technology (TCIT), Canadians serve as the SCC Mirror Committee (SMC) on ISO/IEC Joint Technical Committee 1 on Information Technology (ISO/IEC JTC1) for the Standards Council of Canada (SCC), the ISO member body for Canada and sponsor of the Canadian National Committee of the IEC. Also, as a member of the International Telecommunication Union (ITU), Canada participates in the International Telegraph and Telephone Consultative Committee (ITU-T).

Scope: This document specifies the following: scene description elements for representing AR content; mechanisms to connect to local and remote sensors and actuators; mechanisms to integrate compressed media (image, audio, video, graphics); mechanisms to connect to remote resources such as maps and compressed media.


BS ISO/IEC 18039:2019

Information technology. Computer graphics, image processing and environmental data representation. Mixed and augmented reality (MAR) reference model (British Standard)

This document defines the scope and key concepts of mixed and augmented reality, the relevant terms and their definitions and a generalized system architecture that together serve as a reference model for mixed and augmented reality (MAR) applications, components, systems, services and specifications. This architectural reference model establishes the set of required sub-modules and their minimum functions, the associated information content and the information models to be provided and/or supported by a compliant MAR system.

The reference model is intended for use by current and future developers of MAR applications, components, systems, services or specifications to describe, compare, contrast and communicate their architectural design and implementation. The MAR reference model is designed to apply to MAR systems independent of specific algorithms, implementation methods, computational platforms, display systems and sensors or devices used.

This document does not specify how a particular MAR application, component, system, service or specification is designed, developed or implemented. It does not specify the bindings of those designs and concepts to programming languages or the encoding of MAR information through any coding technique or interchange format. This document contains a list of representative system classes and use cases with respect to the reference model.


ISO/IEC TR 23842-1:2020

Information technology for learning, education and training - Human factor guidelines for virtual reality content - Part 1: Considerations when using VR content

This document presents considerations for using VR content in the learning, education and training (LET) domain for reducing reality and virtual reality crossover confusion among users and assisting users to effectively use these emerging technologies.

This document addresses VR content that uses a head-mounted display (HMD) in the LET domain. It does not address VR content using non-immersive technology and does not address augmented reality, mixed or merged reality content.


DS/ISO/IEC TR 23842-1:2020

Information technology for learning, education and training - Human factor guidelines for virtual reality content - Part 1: Considerations when using VR content

This document presents considerations for using VR content in the learning, education and training (LET) domain for reducing reality and virtual reality crossover confusion among users and assisting users to effectively use these emerging technologies.

This document addresses VR content that uses a head-mounted display (HMD) in the LET domain. It does not address VR content using non-immersive technology and does not address augmented reality, mixed or merged reality content.


DS/ISO/IEC TR 23842-2:2020

Information technology for learning, education, and training - Human factor guidelines for virtual reality content - Part 2: Considerations when making VR content

This document presents considerations for making VR content for the learning, education and training (LET) domain.

This document addresses VR content that uses a head-mounted display (HMD) in the LET domain. It does not address VR content using non-immersive technology and does not address augmented reality, mixed or merged reality content.


BS ISO/IEC 18040:2019

Information technology. Computer graphics, image processing and environmental data representation. Live actor and entity representation in mixed and augmented reality (MAR) (British Standard)

This document defines a reference model and base components for representing and controlling a single LAE or multiple LAEs in an MAR scene. It defines concepts, a reference model, system framework, functions and how to integrate a 2D/3D virtual world and LAEs, and their interfaces, in order to provide MAR applications with interfaces of LAEs. It also defines an exchange format necessary for transferring and storing LAE-related data between LAE-based MAR applications.

This document specifies the following functionalities:

a) definitions for an LAE in MAR;

b) representation of an LAE;

c) representation of properties of an LAE;

d) sensing of an LAE in a physical world;

e) integration of an LAE into a 2D/3D virtual scene;

f) interaction between an LAE and objects in a 2D/3D virtual scene;

g) transmission of information related to an LAE in an MAR scene.

This document defines a reference model for LAE representation-based MAR applications to represent and to exchange data related to LAEs in a 2D/3D virtual scene in an MAR scene. It does not define specific physical interfaces necessary for manipulating LAEs, that is, it does not define how specific applications need to implement a specific LAE in an MAR scene, but rather defines common functional interfaces for representing LAEs that can be used interchangeably between MAR applications.


BS ISO/IEC 18520:2019

Information technology. Computer graphics, image processing and environmental data representation. Benchmarking of vision-based spatial registration and tracking methods for mixed and augmented reality (MAR) (British Standard)

This document identifies the reference framework for the benchmarking of vision-based spatial registration and tracking (vSRT) methods for mixed and augmented reality (MAR).

The framework provides typical benchmarking processes, benchmark indicators and trial set elements that are necessary to successfully identify, define, design, select and apply benchmarking of vSRT methods for MAR. It also provides definitions for terms on benchmarking of vSRT methods for MAR.

In addition, this document provides a conformance checklist as a tool to clarify how each benchmarking activity conforms to this document in a compact form by declaring which benchmarking processes and benchmark indicators are included and what types of trial sets are used in each benchmarking activity.


BS ISO/IEC 23488:2022

Information technology. Computer graphics, image processing and environment data representation. Object/environmental representation for image-based rendering in virtual/mixed and augmented reality (VR/MAR) (British Standard)

This document specifies an image-based representation model that represents target objects/environments using a set of images and optionally the underlying 3D model for accurate and efficient objects/environments representation at an arbitrary viewpoint. It is applicable to a wide range of graphic, virtual reality and mixed reality applications which require the method of representing a scene with various objects and environments.

 

This document:

 

—    defines terms for image-based representation and 3D reconstruction techniques;

 

—    specifies the required elements for image-based representation;

 

—    specifies a method of representing the real world in the virtual space based on image-based representation;

 

—    specifies how visible image patches can be integrated with the underlying 3D model for more accurate and rich objects/environments representation from arbitrary viewpoints;

 

—    specifies how the proposed model allows multi-object representation;

 

—    provides an XML based specification of the proposed representation model and an actual implementation example (see Annex A).


DS/IEC 63145-20-20:2019

Eyewear display - Part 20-20: Fundamental measurement methods - Image quality

IEC 63145-20-20:2019(E) specifies the standard measurement conditions and measurement methods for determining the image quality of eyewear displays. This document is applicable to non-see-through type (virtual reality "VR" goggles) and see-through type (augmented reality "AR" glasses) eyewear displays using virtual image optics. Contact-lens type displays and retina direct projection displays are out of the scope of this document.


IEC 63145-20-10 Ed. 1.0 en:2019

Eyewear display - Part 20-10: Fundamental measurement methods - Optical properties

IEC 63145-20-10:2019(E) specifies the standard measurement conditions and measurement methods for determining the optical properties of eyewear displays. This document applies to non-see-through type (virtual reality “VR” goggles) and see-through type (augmented reality “AR” glasses) eyewear displays using virtual image optics.
 Contact lens-type displays and retina direct projection displays are out of the scope of this document.


IEC 63145-20-20 Ed. 1.0 en:2019

Eyewear display - Part 20-20: Fundamental measurement methods - Image quality

IEC 63145-20-20:2019(E) specifies the standard measurement conditions and measurement methods for determining the image quality of eyewear displays. This document is applicable to non-see-through type (virtual reality "VR" goggles) and see-through type (augmented reality "AR" glasses) eyewear displays using virtual image optics.
 Contact-lens type displays and retina direct projection displays are out of the scope of this document.


IEC 63145-22-10 Ed. 1.0 en:2020

Eyewear display - Part 22-10: Specific measurement methods for AR type - Optical properties

IEC 63145-22-10:2020(E) specifies the standard measurement conditions and measuring methods for determining the see-through optical properties and imaging quality of augmented reality (AR) eyewear displays. This includes the transmission characteristics and ambient optical performance of the eyewear displays.
 Contact lens type displays are out of the scope of this document.
 NOTE The relationship between the scope and other documents (IEC 63145-20-10, IEC 63145-22-20) is shown in Annex A.


CSA ISO/IEC TR 23843-2021

Information technology for learning, education and training - Catalogue model for virtual, augmented and mixed reality content (Adopted ISO/IEC TR 23843:2020, first edition, 2020-10)

CSA Preface: Standards development within the Information Technology sector is harmonized with international standards development. Through the CSA Technical Committee on Information Technology (TCIT), Canadians serve as the SCC Mirror Committee (SMC) on ISO/IEC Joint Technical Committee 1 on Information Technology (ISO/IEC JTC1) for the Standards Council of Canada (SCC), the ISO member body for Canada and sponsor of the Canadian National Committee of the IEC. Also, as a member of the International Telecommunication Union (ITU), Canada participates in the International Telegraph and Telephone Consultative Committee (ITU-T).

For brevity, this Standard will be referred to as "CSA ISO/IEC TR 23843" throughout. At the time of publication, ISO/IEC TR 23843:2020 is available from ISO and IEC in English only. CSA Group will publish the French version when it becomes available from ISO and IEC. This Standard has been formally approved, without modification, by the Technical Committee and has been developed in compliance with Standards Council of Canada requirements for National Standards of Canada. It has been published as a National Standard of Canada by CSA Group.

Scope: This document describes how to search for virtual reality (VR), augmented reality (AR) and mixed reality (MR) content through a curriculum catalogue based on curriculum and achievement standards information. The curriculum catalogue metadata is defined in order to search for educational VR and MR content information.


PD ISO/TS 9241-430:2021

Ergonomics of human-system interaction. Recommendations for the design of non-touch gestural input for the reduction of biomechanical stress (British Standard)

This document provides guidance on the design, selection and optimization of non-contacting hand and arm gestures for human-computer interaction. It addresses the assessment of usability and fatigue associated with different gesture set designs and provides recommendations for approaches to evaluating the design and selection of gestures. This document also provides guidance on the documentation of the process for selecting gesture sets.

 

This document applies to gestures expressed by humans. It does not consider the technology for detecting gestures or the system response when interpreting a gesture. Non-contacting hand gestures can be used for input in a variety of settings, including the workplace or in public settings and when using fixed screens, mobile, virtual reality, augmented reality or mixed-mode reality devices.

 

Some limitations of this document are:

 

—    The scope is limited to non-contacting gestures and does not include other forms of inputs. For example, combining gesture with speech, gaze or head position can reduce input error, but these combinations are not considered here.

 

—    The scope is limited to non-contacting arm, hand and finger gestures, either unilateral (one-handed) or bilateral (two-handed).

 

—    The scope assumes that all technological constraints are surmountable. Therefore, there is no consideration of technological limitations in interpreting ultra-rapid gestures, or gestures performed by people of different skin tones or wearing different colours or patterns of clothing.

 

—    The scope is limited to UI-based command-and-control human computer interaction (HCI) tasks and does not include gaming scenarios, although the traversal of in-game menus and navigation of UI elements is within scope.

 

—    The scope does not include HCI tasks for which a clearly better input method exists. For example, speech input is superior to gesture input for entering text.

 

—    The scope includes virtual reality (VR), augmented reality (AR) and mixed reality (MR) and the use of head-mounted displays (HMDs).

 

—    The scope does not include the discoverability of gestures but does include the learnability and memorability of gestures. It is assumed that product documentation and tutorials will adequately educate end users about which gestures are possible. Therefore, assessing gesture discoverability is not a primary goal of the recommendations in this document.


PD ISO/IEC TR 23843:2020

Information technology for learning, education and training. Catalogue model for virtual, augmented and mixed reality content (British Standard)

This document describes how to search for virtual reality (VR), augmented reality (AR) and mixed reality (MR) content through a curriculum catalogue based on curriculum and achievement standards information. The curriculum catalogue metadata is defined in order to search for educational VR and MR content information.


ISO 9241-940:2017

Ergonomics of human-system interaction - Part 940: Evaluation of tactile and haptic interactions

ISO 9241-940:2017

- describes the types of methods that can be used for the evaluation of haptic devices and of systems that include haptic devices,

- specifies a procedure for the evaluation of haptic interactions by a usability walkthrough or usability test (see Annex J), and

- provides guidance on the types of methods that are appropriate for the evaluation of specific attributes of haptic systems, cross-referenced to the guidance in the relevant clauses of other International Standards (see Annexes A, B, C, D, E, F and G).

It applies to the following types of interaction:

- augmented reality - information overlaid on a real scene, e.g. vibrating belt indicating distance;

- gesture control of a device or a virtual scenario;

- unidirectional interaction such as a vibrating phone or a vibrating belt;

- virtual environment - virtual space with which a user can interact with the aid of a haptic device.

ISO 9241-940:2017 applies to the following types of devices:

- gesture sensor, e.g. video that discerns 3D hand movements, touch screens that sense 2D touches;

- kinaesthetic haptic device, e.g. desktop haptic interface;

- tactile display, e.g. vibrating phone.

ISO 9241-940:2017 is not applicable to standard input devices such as keyboards, mice or track balls.

NOTE ISO 9241‑400 covers standard input devices, and ISO 9241‑411 applies to the evaluation of input devices such as keyboards and mice.

ISO 9241-940:2017 can be used to identify the types of methods and measures for

- establishing benchmarks,

- establishing requirements for haptic interaction,

- identifying problems with haptic interaction (formative evaluation), and

- use of the criteria to establish whether a haptic system meets requirements (summative evaluation).


DS/ISO 9241-940:2017

Ergonomics of human-system interaction - Part 940: Evaluation of tactile and haptic interactions

ISO 9241-940:2017

- describes the types of methods that can be used for the evaluation of haptic devices and of systems that include haptic devices,

- specifies a procedure for the evaluation of haptic interactions by a usability walkthrough or usability test (see Annex J), and

- provides guidance on the types of methods that are appropriate for the evaluation of specific attributes of haptic systems, cross-referenced to the guidance in the relevant clauses of other International Standards (see Annexes A, B, C, D, E, F and G).

It applies to the following types of interaction:

- augmented reality - information overlaid on a real scene, e.g. vibrating belt indicating distance;

- gesture control of a device or a virtual scenario;

- unidirectional interaction such as a vibrating phone or a vibrating belt;

- virtual environment - virtual space with which a user can interact with the aid of a haptic device.

ISO 9241-940:2017 applies to the following types of devices:

- gesture sensor, e.g. video that discerns 3D hand movements, touch screens that sense 2D touches;

- kinaesthetic haptic device, e.g. desktop haptic interface;

- tactile display, e.g. vibrating phone.

ISO 9241-940:2017 is not applicable to standard input devices such as keyboards, mice or track balls.

NOTE ISO 9241-400 covers standard input devices, and ISO 9241-411 applies to the evaluation of input devices such as keyboards and mice.

ISO 9241-940:2017 can be used to identify the types of methods and measures for

- establishing benchmarks,

- establishing requirements for haptic interaction,

- identifying problems with haptic interaction (formative evaluation), and

- use of the criteria to establish whether a haptic system meets requirements (summative evaluation).


BS EN ISO 9241-940:2022

Ergonomics of human-system interaction - Evaluation of tactile and haptic interactions (British Standard)

ISO 9241-940:2017

- describes the types of methods that can be used for the evaluation of haptic devices and of systems that include haptic devices,

- specifies a procedure for the evaluation of haptic interactions by a usability walkthrough or usability test (see Annex J), and

- provides guidance on the types of methods that are appropriate for the evaluation of specific attributes of haptic systems, cross-referenced to the guidance in the relevant clauses of other International Standards (see Annexes A, B, C, D, E, F and G).

It applies to the following types of interaction:

- augmented reality - information overlaid on a real scene, e.g. vibrating belt indicating distance;

- gesture control of a device or a virtual scenario;

- unidirectional interaction such as a vibrating phone or a vibrating belt;

- virtual environment - virtual space with which a user can interact with the aid of a haptic device.

ISO 9241-940:2017 applies to the following types of devices:

- gesture sensor, e.g. video that discerns 3D hand movements, touch screens that sense 2D touches;

- kinaesthetic haptic device, e.g. desktop haptic interface;

- tactile display, e.g. vibrating phone.

ISO 9241-940:2017 is not applicable to standard input devices such as keyboards, mice or track balls.

NOTE ISO 9241‑400 covers standard input devices, and ISO 9241‑411 applies to the evaluation of input devices such as keyboards and mice.

ISO 9241-940:2017 can be used to identify the types of methods and measures for

- establishing benchmarks,

- establishing requirements for haptic interaction,

- identifying problems with haptic interaction (formative evaluation), and

- use of the criteria to establish whether a haptic system meets requirements (summative evaluation).


DS/EN ISO 9241-940:2022

Ergonomics of human-system interaction - Part 940: Evaluation of tactile and haptic interactions (ISO 9241-940:2017)

ISO 9241-940:2017

- describes the types of methods that can be used for the evaluation of haptic devices and of systems that include haptic devices,

- specifies a procedure for the evaluation of haptic interactions by a usability walkthrough or usability test (see Annex J), and

- provides guidance on the types of methods that are appropriate for the evaluation of specific attributes of haptic systems, cross-referenced to the guidance in the relevant clauses of other International Standards (see Annexes A, B, C, D, E, F and G).

It applies to the following types of interaction:

- augmented reality – information overlaid on a real scene, e.g. vibrating belt indicating distance;

- gesture control of a device or a virtual scenario;

- unidirectional interaction such as a vibrating phone or a vibrating belt;

- virtual environment – virtual space with which a user can interact with the aid of a haptic device.

ISO 9241-940:2017 applies to the following types of devices:

- gesture sensor, e.g. video that discerns 3D hand movements, touch screens that sense 2D touches;

- kinaesthetic haptic device, e.g. desktop haptic interface;

- tactile display, e.g. vibrating phone.

ISO 9241-940:2017 is not applicable to standard input devices such as keyboards, mice or track balls.

NOTE – ISO 9241-400 covers standard input devices, and ISO 9241-411 applies to the evaluation of input devices such as keyboards and mice.

ISO 9241-940:2017 can be used to identify the types of methods and measures for

- establishing benchmarks,

- establishing requirements for haptic interaction,

- identifying problems with haptic interaction (formative evaluation), and

- use of the criteria to establish whether a haptic system meets requirements (summative evaluation).


SS-EN ISO 9241-940:2022

Ergonomics of human-system interaction - Part 940: Evaluation of tactile and haptic interactions (ISO 9241-940:2017) (Swedish Standard)

ISO 9241-940:2017

- describes the types of methods that can be used for the evaluation of haptic devices and of systems that include haptic devices,

- specifies a procedure for the evaluation of haptic interactions by a usability walkthrough or usability test (see Annex J), and

- provides guidance on the types of methods that are appropriate for the evaluation of specific attributes of haptic systems, cross-referenced to the guidance in the relevant clauses of other International Standards (see Annexes A, B, C, D, E, F and G).

It applies to the following types of interaction:

- augmented reality - information overlaid on a real scene, e.g. vibrating belt indicating distance;

- gesture control of a device or a virtual scenario;

- unidirectional interaction such as a vibrating phone or a vibrating belt;

- virtual environment - virtual space with which a user can interact with the aid of a haptic device.

ISO 9241-940:2017 applies to the following types of devices:

- gesture sensor, e.g. video that discerns 3D hand movements, touch screens that sense 2D touches;

- kinaesthetic haptic device, e.g. desktop haptic interface;

- tactile display, e.g. vibrating phone.

ISO 9241-940:2017 is not applicable to standard input devices such as keyboards, mice or track balls.

NOTE ISO 9241-400 covers standard input devices, and ISO 9241-411 applies to the evaluation of input devices such as keyboards and mice.

ISO 9241-940:2017 can be used to identify the types of methods and measures for

- establishing benchmarks,

- establishing requirements for haptic interaction,

- identifying problems with haptic interaction (formative evaluation), and

- use of the criteria to establish whether a haptic system meets requirements (summative evaluation).


DS/ISO/IEC/IEEE 26511:2018

Systems and software engineering - Requirements for managers of information for users of systems, software, and services

This document supports the needs of users for consistent, complete, accurate, and usable information. It provides requirements for strategy, planning, managing, staffing, translation, production, and quality and process-maturity assessment for managers of information for users. It specifies processes and procedures for managing information for users throughout the product- or systems-development life cycle. It also includes requirements for key documents produced for managing information for users, including strategic and project plans.

This document provides an overview of the information-management processes that are specific for the management of information for users. It addresses the following activities:

— developing a comprehensive strategy for information development;

— assessing user information needs;

— planning and managing an information-development project;

— staffing and forming information-development teams;

— reviewing and testing information for users;

— managing the translation process;

— publishing and delivering information for users;

— evaluating customer satisfaction and information quality;

— measuring productivity, efficiency, and costs; and

— evaluating organizational maturity.

The guidance in this document applies to multiple project management approaches, including both agile and traditional practices. Traditional practices can encompass predictive, waterfall, or other top-down management methods. Where certain practices are common in agile project management, they are noted.

This document is applicable for use by managers of information for users or organizations with information developers. This document can also be consulted by those with other roles and interests in the process of developing information for users:

— managers of the product and system development process;

— acquirers of information for users prepared by suppliers;

— experienced information developers who prepare information for users;

— human-factors experts who identify principles for making information for users more accessible and easily used; and

— user interface designers and ergonomics experts working together to design the presentation of information.

This document can be applied to manage the following types of information for users, although it does not cover all aspects of them:

— information for user assistance, training, marketing, and systems documentation for product design and development, based on reuse of user information topics;

— multimedia marketing presentations using animation, video, and sound;

— information developed for virtual and augmented reality presentations;

— computer-based training (CBT) packages and course materials intended primarily for use in formal training programs; and

— information describing the internal operation of products.


PD ISO/IEC TS 23884:2021

Information technology. Computer graphics, image processing and environmental data representation. Material property and parameter representation for model-based haptic simulation of objects in virtual, mixed and augmented reality (VR/MAR) (British Standard)

This document specifies:

— physical and material parameters of virtual or real objects expressed to support comprehensive haptic rendering methods, such as stiffness, friction and micro-textures;

— a flexible specification of the haptic rendering algorithm itself.

It supplements other standards that describe scene or content description and information models for virtual and mixed reality, such as ISO/IEC 19775 and ISO/IEC 3721-1.
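Parameters such as stiffness and friction map naturally onto a small data record that a haptic renderer can consume. As a minimal sketch (all names are illustrative and not taken from ISO/IEC TS 23884), a material record driving a basic penalty-based force computation might look like:

```python
from dataclasses import dataclass

@dataclass
class HapticMaterial:
    """Hypothetical haptic material record; field names are illustrative,
    not drawn from the standard's information model."""
    stiffness: float        # N/m, spring constant for penalty rendering
    static_friction: float  # dimensionless Coulomb coefficient
    dynamic_friction: float

def penalty_force(material: HapticMaterial, penetration_m: float) -> float:
    """Classic penalty-based haptic rendering: the normal force grows
    linearly with how far the haptic proxy penetrates the surface."""
    if penetration_m <= 0.0:
        return 0.0  # no contact, no force
    return material.stiffness * penetration_m

wood = HapticMaterial(stiffness=800.0, static_friction=0.5, dynamic_friction=0.3)
print(penalty_force(wood, 0.002))  # 800 N/m * 0.002 m = 1.6 N
```

A standardized parameter set like this is what lets two applications exchange a material description and render a comparably "stiff" or "slippery" surface on different haptic hardware.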


IEEE 1857.9-2021

IEEE Standard for Immersive Visual Content Coding

New IEEE Standard - Active. Efficient coding tool sets for the compression, decompression, and reconstruction of immersive visual content data are provided. The target applications and services include, but are not limited to, virtual reality (VR), such as unmanned aerial vehicle-based VR, augmented reality, panorama video, free-view TV, panoramic stereo video, and other video-/audio-enabled services and applications, such as immersive video streaming, broadcasting, storage, and communication.


SAE J 1757-2-2018

Standard - Optical System HUD for Automotive

This SAE Standard provides measurement methods to determine HUD optical performance in typical automotive ambient lighting conditions. It covers indoor measurements with simulated outdoor lighting for the measurement of HUD virtual images. HUD types addressed by this standard include w-HUD (windshield HUD) and c-HUD (combiner HUD), with references to Augmented Reality (AR) HUD as needed. It is not within the scope of this document to set threshold values for automotive compliance; however, some recommended values are presented for reference.
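One basic quantity such optical measurements yield is the contrast ratio of the HUD virtual image against the outside scene. As a hedged sketch (a commonly used definition in display metrology, not the standard's normative procedure — consult SAE J1757-2 for that), the symbol is seen superimposed on the background, so its total luminance is image plus background:

```python
def hud_contrast_ratio(image_luminance_cd_m2: float,
                       background_luminance_cd_m2: float) -> float:
    """Contrast ratio of a HUD virtual image viewed against the outside
    scene. Illustrative definition only: the displayed symbol adds its
    luminance on top of the background it is superimposed on."""
    if background_luminance_cd_m2 <= 0.0:
        raise ValueError("background luminance must be positive")
    return (image_luminance_cd_m2 + background_luminance_cd_m2) \
        / background_luminance_cd_m2

# Example: a 12,000 cd/m^2 virtual image over a 10,000 cd/m^2 sunlit road
print(hud_contrast_ratio(12000.0, 10000.0))  # (12000 + 10000) / 10000 = 2.2
```

The high-ambient case is why simulated outdoor lighting matters in the indoor test setup: the same image luminance that looks crisp at night can drop to a barely legible ratio against a sunlit road.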


ISO/IEC/IEEE 26511:2018

Systems and software engineering - Requirements for managers of information for users of systems, software, and services

This document supports the needs of users for consistent, complete, accurate, and usable information. It provides requirements for strategy, planning, managing, staffing, translation, production, and quality and process-maturity assessment for managers of information for users. It specifies processes and procedures for managing information for users throughout the product- or systems-development life cycle. It also includes requirements for key documents produced for managing information for users, including strategic and project plans.

This document provides an overview of the information-management processes that are specific for the management of information for users. It addresses the following activities:

— developing a comprehensive strategy for information development;

— assessing user information needs;

— planning and managing an information-development project;

— staffing and forming information-development teams;

— reviewing and testing information for users;

— managing the translation process;

— publishing and delivering information for users;

— evaluating customer satisfaction and information quality;

— measuring productivity, efficiency, and costs; and

— evaluating organizational maturity.

The guidance in this document applies to multiple project management approaches, including both agile and traditional practices. Traditional practices can encompass predictive, waterfall, or other top-down management methods. Where certain practices are common in agile project management, they are noted.

This document is applicable for use by managers of information for users or organizations with information developers. This document can also be consulted by those with other roles and interests in the process of developing information for users:

— managers of the product and system development process;

— acquirers of information for users prepared by suppliers;

— experienced information developers who prepare information for users;

— human-factors experts who identify principles for making information for users more accessible and easily used; and

— user interface designers and ergonomics experts working together to design the presentation of information.

This document can be applied to manage the following types of information for users, although it does not cover all aspects of them:

— information for user assistance, training, marketing, and systems documentation for product design and development, based on reuse of user information topics;

— multimedia marketing presentations using animation, video, and sound;

— information developed for virtual and augmented reality presentations;

— computer-based training (CBT) packages and course materials intended primarily for use in formal training programs; and

— information describing the internal operation of products.


CSA ISO/IEC/IEEE 26511-2020

Systems and software engineering - Requirements for managers of information for users of systems, software, and services (Adopted ISO/IEC/IEEE 26511:2018, second edition, 2018-12)

CSA Preface

Standards development within the Information Technology sector is harmonized with international standards development. Through the CSA Technical Committee on Information Technology (TCIT), Canadians serve as the SCC Mirror Committee (SMC) on ISO/IEC Joint Technical Committee 1 on Information Technology (ISO/IEC JTC1) for the Standards Council of Canada (SCC), the ISO member body for Canada and sponsor of the Canadian National Committee of the IEC. Also, as a member of the International Telecommunication Union (ITU), Canada participates in the International Telegraph and Telephone Consultative Committee (ITU-T). This Standard has been formally approved, without modification, by the Technical Committee and has been developed in compliance with Standards Council of Canada requirements for National Standards of Canada. It has been published as a National Standard of Canada by CSA Group.

Scope

This document supports the needs of users for consistent, complete, accurate, and usable information. It provides requirements for strategy, planning, managing, staffing, translation, production, and quality and process-maturity assessment for managers of information for users. It specifies processes and procedures for managing information for users throughout the product- or systems-development life cycle. It also includes requirements for key documents produced for managing information for users, including strategic and project plans.

This document provides an overview of the information-management processes that are specific for the management of information for users. It addresses the following activities:

— developing a comprehensive strategy for information development;

— assessing user information needs;

— planning and managing an information-development project;

— staffing and forming information-development teams;

— reviewing and testing information for users;

— managing the translation process;

— publishing and delivering information for users;

— evaluating customer satisfaction and information quality;

— measuring productivity, efficiency, and costs; and

— evaluating organizational maturity.

The guidance in this document applies to multiple project management approaches, including both agile and traditional practices. Traditional practices can encompass predictive, waterfall, or other top-down management methods. Where certain practices are common in agile project management, they are noted.

This document is applicable for use by managers of information for users or organizations with information developers. This document can also be consulted by those with other roles and interests in the process of developing information for users:

— managers of the product and system development process;

— acquirers of information for users prepared by suppliers;

— experienced information developers who prepare information for users;

— human-factors experts who identify principles for making information for users more accessible and easily used; and

— user interface designers and ergonomics experts working together to design the presentation of information.

This document can be applied to manage the following types of information for users, although it does not cover all aspects of them:

— information for user assistance, training, marketing, and systems documentation for product design and development, based on reuse of user information topics;

— multimedia marketing presentations using animation, video, and sound;

— information developed for virtual and augmented reality presentations;

— computer-based training (CBT) packages and course materials intended primarily for use in formal training programs; and

— information describing the internal operation of products.


SAE J 2892-2021

Graphics-Based Service Information

This document establishes standard graphical symbols and color conventions for use in either still (static) or animated graphics used for communicating service information. This document’s purpose is to communicate conventions for using those symbols and colors to accurately and consistently communicate intended information via graphics-based documentation. These practices are intended for use in service procedures, assembly instructions, training materials, and similar applications when trying to minimize the amount of human natural language text used within the document. The still and animated graphical conventions referenced should support effective communication via paper and “traditional” electronic media. The conventions can also extend to documenting via additional electronic delivery paradigms such as augmented reality (AR).

This document is intended for organizations interested in using graphics-based documentation to record and communicate assembly, adjustment, maintenance, and other service procedures.

Adoption of this document’s recommendations involves a series of business decisions. An organization choosing to follow this recommended practice is able to decide to implement the entire set of recommendations or to selectively adopt only those recommendations it determines are appropriate for their unique needs and situation. Short- and long-term retention of an organization’s legacy symbols and conventions are options to consider. Implementation may be partial or progress through multiple stages towards the full set of recommendations. In all situations, realizing this document’s maximum, long-term benefits for all companies, organizations, and people requires that the symbols, colors, and conventions recommended be widely taught and applied.

