All Baseline Tests for Documents

1. Keyboard Accessible

Accessibility Requirements

  • WCAG SC 2.1.1 Keyboard – All functionality of the content is operable through a keyboard interface without requiring specific timings for individual keystrokes, except where the underlying function requires input that depends on the path of the user’s movement and not just the endpoints.
  • WCAG SC 2.1.2 No Keyboard Trap – If keyboard focus can be moved to a component of the page using a keyboard interface, then focus can be moved away from that component using only a keyboard interface, and, if it requires more than unmodified arrow or tab keys or other standard exit methods, the user is advised of the method for moving focus away.
  • WCAG Conformance Requirement 5: Non-Interference – The following success criteria apply to all content in the document, including content that is not otherwise relied upon to meet conformance, because failure to meet them could interfere with any use of the content:
    • 1.4.2 - Audio Control
    • 2.1.2 - No Keyboard Trap
    • 2.3.1 - Three Flashes or Below Threshold, and
    • 2.2.2 - Pause, Stop, Hide.

Test Method Rationale

This test relies on using a keyboard to validate access to and control of all functionality of the content, first by checking standard keyboard commands (Tab, Space Bar, Enter, Escape, etc.). If a document uses non-standard keyboard commands, it must clearly document those commands and make users aware that they exist.

Keyboard access and control includes the ability to navigate to AND away from interactive content using only a keyboard.

Limitations, Assumptions, or Exceptions

  • This test was written to be performed on a standard physical keyboard for a Windows PC. While keyboard emulators (such as on-screen keyboards, alternate keyboards, speech input, etc.) may be utilized, testing instructions may differ. Mouse Keys (a Windows and Mac OS feature that enables control of the mouse pointer by keyboard) is not a keyboard emulator.

  • Notes from SC 2.1.1:

    • Note 1: This exception relates to the underlying function, not the input technique. For example, if using handwriting to enter text, the input technique (handwriting) requires path-dependent input, but the underlying function (text input) does not.
    • Note 2: This does not forbid and should not discourage providing mouse input or other input methods in addition to keyboard operation.
  • Note from SC 2.1.2:

    • Note 1: Since any content that does not meet this success criterion can interfere with a user’s ability to use the whole document, all content in a document (whether it is used to meet other success criteria or not) must meet this success criterion. See Conformance Requirement 5: Non-Interference.

1.A Test Procedure for Keyboard Access

Baseline Test ID: 1.A-KeyboardAccess

Identify Content

All functionality of the content that is available by mouse control must be keyboard accessible. Determine the functionality of visible and hidden interactive document components (links, form fields, drop down menus, show/hide content, tree views, pop ups, etc.) available using a mouse (hover and/or click).

Test Instructions

  1. Check that all functionality can be accessed and executed using only the keyboard. [SC 2.1.1]
    1. Use the keyboard to perform functions available by mouse (including drop-down menus, form fields, revealing/hiding content, tooltips, AND all interactive interface components).
    2. If an interactive component is not available by keyboard, check if another keyboard control with the same functionality is provided. (All functionalities must meet this requirement.)
  2. Check that individual keystrokes do not require specific timings for activation. [SC 2.1.1]
    1. If operation requires specific timings of individual keystrokes, check if another control is provided on the page with the same functionality which does not require specific timings for operation. (All functionality must be available without requiring specific timings for individual keystrokes to operate.)

Test Results

If any of the above checks fail, then Baseline Test 1.A-KeyboardAccess fails.

1.B Test Procedure for No Keyboard Trap

Baseline Test ID: 1.B-NoKeyboardTrap

Identify Content

Components that receive keyboard focus.

Test Instructions

  1. Check that focus can be moved away from the component. There must be NO "TRAP" that disrupts keyboard navigation. [SC 2.1.2, Conformance Requirement 5]
    1. If a keyboard trap is found, inspect any help (contextual help, or application help) and documentation for notification of available alternate keyboard commands (e.g., non-standard keyboard controls, access keys, hotkeys).
    2. If nonstandard keyboard commands are required to navigate away from a component or set of components, check that the commands work.

Test Results

If the above check fails, then Baseline Test 1.B-NoKeyboardTrap fails.

Advisory: Tips for Streamlined Test Processes

  • Keyboard focusable components include links, form fields, drop-down menus, show/hide content, tree views, embedded objects, and pop ups. Focusable components may also be “hidden”, positioned off-screen, and/or have no visible indication of focus.
  • Keyboard commands include standard and any nonstandard keyboard commands.
  • This test may be combined with tests for keyboard focus.
  • Tips and techniques for finding hidden content may be useful for testers.
  • Based on the document format being tested, it may be useful for testers to reference the relevant application’s guide for keyboard commands.
  • Content that is found non-conformant with SC 2.1.1 may be marked for further review for a Section 508 exception if “the underlying function requires input that depends on the path of the user’s movement and not just the endpoints”.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

2. Focus

Accessibility Requirements

  • WCAG SC 2.4.3 Focus Order – If a document can be navigated sequentially and the navigation sequences affect meaning or operation, focusable components receive focus in an order that preserves meaning and operability.
  • WCAG SC 2.4.7 Focus Visible – Any keyboard operable user interface has a mode of operation where the keyboard focus indicator is visible.
  • WCAG SC 3.2.1 On Focus – When any user interface component receives focus, it does not initiate a change of context.

Test Method Rationale

Manually navigating or controlling document components using only the keyboard enables a tester to identify when there is no visual differentiation between a focused item and the rest of the document or content. Using the keyboard to navigate also facilitates inspection of focus order.

Limitations, Assumptions, or Exceptions

  • Some interface components (e.g., screen text for form filling instructions), which are not normally considered interactive, may be in the tab order. Such interface components should receive a visible indication of focus when the user navigates to them using a keyboard.
  • Loss of visible focus should not occur while manually shifting focus through the page (using the TAB or arrow keys). However, when a function that moves the focus is executed (such as following an internal page link or revealing hidden content), it may be necessary to manually shift focus once with the keyboard before focus becomes visible again. This is not considered a failure.
  • Focus may be moved to a control either via the keyboard (e.g., tabbing to a control) or the mouse (e.g., clicking on a text field). Moving the mouse over a control does not move the focus unless scripting implements this behavior.
  • While it may be a common best practice, Focus Order is not required to move left to right, top to bottom.
  • Focus order includes forward and backward navigation.
  • Without exception, focus must shift to modal dialog boxes and remain within the dialog box until the box is closed by the user.
  • For some types of controls, clicking a control may also activate the control (e.g., button), which may, in turn, initiate a change in context. Controls that are clearly labeled and intended to initiate a change in context do not fail under this test.
  • This test evaluates 3.2.1 On Focus using only the keyboard to avoid unintentional activation of controls with a mouse.
  • Changes of context are major changes in content that, if made without user awareness, can disorient users who are not able to view the entire page simultaneously. Changes in context include changes of:
    1. User agent
    2. Viewport
    3. Focus
    4. Content that changes the meaning of the page
      • Note: A change of content is not always a change of context. Changes in content, such as an expanding outline, dynamic menu, or a tab control do not necessarily change the context, unless they also change one of the above (e.g., focus).
      • Examples: Opening a new window, moving focus to a different component, going to a new document or window (including anything that would look to a user as if they had moved to a new document) or significantly re-arranging the content of a page/screen are examples of changes of context.

2.A Test Procedure for Focus Visible

Baseline Test ID: 2.A-FocusVisible

Identify Content

Keyboard accessible interface components (e.g., links, form fields, drop-down menus, show/hide content, tree views, and pop ups).

Test Instructions

  1. Use the keyboard to navigate through each interface component.
  2. Check that a visible indication of focus is provided when focus is on the interface component. The focus indicator must not appear only momentarily; once shown, it must remain visible for as long as the component has keyboard focus. [SC 2.4.7]

Test Results

If any of the above checks fail, then Baseline Test 2.A-FocusVisible fails.

2.B Test Procedure for Focus Order

Baseline Test ID: 2.B-FocusOrder

Identify Content

Keyboard accessible document components (links, form fields, drop-down menus, show/hide content, tree views, and pop ups, etc.) that have a meaningful sequence of navigation.

Test Instructions

  1. Use the keyboard to navigate through document components.
    1. Use the keyboard to activate trigger controls that reveal hidden content (menus, dialogs, expandable tree list, etc.).
      1. Check that the revealed focusable content is included in the focus order. [SC 2.4.3]
      2. Advance the focus through the revealed content.
    2. Use the keyboard to close/hide the revealed content.
      1. Check that focus is returned to the trigger control. It is acceptable to press Shift+Tab once or use an arrow key to move the focus backward to the trigger control. [SC 2.4.3]
  2. Check that the focus order preserves the meaning and usability of the page. [SC 2.4.3]

Test Results

If any of the above checks fail, then Baseline Test 2.B-FocusOrder fails.

2.C Test Procedure for On Focus

Baseline Test ID: 2.C-OnFocus

Identify Content

Keyboard accessible document components (links, form fields, drop-down menus, show/hide content, tree views, and pop ups, etc.).

Test Instructions

  1. Use the keyboard to move focus to and navigate through each interactive document component (including form drop-down lists and form fields).
  2. Check that when a document component receives focus, it does not initiate an unexpected change of context. [SC 3.2.1] Examples of a change of context include:
    • Forms submitted automatically when a component receives focus
    • New document window or browser launched when a component receives focus
    • Focus is moved to another component

Test Results

If any of the above checks fail, then Baseline Test 2.C-OnFocus fails.

Advisory: Tips for Streamlined Test Processes

  • The clarity of visible focus is subjective, and the minimum level is the application’s (or OS platform’s) default display setting for indicating focus. Applications may also represent visual focus differently in specific situations.
  • This test may be performed simultaneously with Baseline 1: Keyboard Access.
  • No focus modifications should be enabled in the test environment during testing. Some testing tools will add a visible outline around elements that receive focus. While testing tools may help testers to track focus, any markup provided by a testing tool should not be used as an indicator of visible focus for meeting this requirement.
  • Given the variability in how applications may present visual focus in specific situations, test reports should include details about testing environment, including application and version.
  • Tab order that initially appears illogical may still meet this requirement due to application-specific business logic.
  • It may be useful to combine these tests with tests for keyboard navigation and visible focus.
  • It may be useful to provide instructions about what “modal dialog boxes” are and how they should behave.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

3. Non-Interference

Accessibility Requirements

  1. WCAG Conformance Requirement 5: Non-Interference – The following success criteria apply to all content in the document, including content that is not otherwise relied upon to meet conformance, because failure to meet them could interfere with any use of the content:
    • 1.4.2 - Audio Control,
    • 2.1.2 - No Keyboard Trap,
    • 2.3.1 - Three Flashes or Below Threshold, and
    • 2.2.2 - Pause, Stop, Hide.

Test Method Rationale

The test results for SCs 1.4.2 (Baseline Test 21.D-AudioControl), 2.1.2 (1.B-NoKeyboardTrap), 2.3.1 (9.A-Flashes), and 2.2.2 (21.B-MovingInfo and 21.C-AutoUpdate) determine the result of this baseline test.

Limitations, Assumptions, or Exceptions

None

3.A Test Procedure for Non-Interference

Baseline Test ID: 3.A-NonInterference

Identify Content

Results of Baseline Tests 21.D-AudioControl, 1.B-NoKeyboardTrap, 9.A-Flashes, 21.B-MovingInfo, and 21.C-AutoUpdate.

Test Instructions

  1. Check that all of the test results are pass. [CR5]

Test Results

If the above check fails, then Baseline Test 3.A-NonInterference fails.

Advisory: Tips for Streamlined Test Processes

  • This test result is a logical AND of the identified SCs. All must pass for this test result to pass.
  • A reporting tool may be utilized to generate the result for Conformance Requirement 5.
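
A minimal sketch (Python, with hypothetical result values) of how such a reporting tool might aggregate the component test results into the Conformance Requirement 5 outcome:

```python
# Hypothetical results of the component baseline tests; in practice each value
# comes from the corresponding test report.
component_results = {
    "21.D-AudioControl": "pass",
    "1.B-NoKeyboardTrap": "pass",
    "9.A-Flashes": "fail",
    "21.B-MovingInfo": "pass",
    "21.C-AutoUpdate": "pass",
}

# Conformance Requirement 5 is a logical AND: every component test must pass.
non_interference = all(result == "pass" for result in component_results.values())
print("3.A-NonInterference:", "pass" if non_interference else "fail")
```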

WCAG 2.2 Techniques

Not Applicable.

4. Repetitive Content – Not Applicable to Documents

No Accessibility Requirements

This Baseline test for repetitive content covers WCAG SCs that Section 508 does not apply to non-web documents. It comes from the ICT Testing Baseline for Web and is retained here to maintain harmonization.

Per the Section 508 E205.4 Accessibility Standard exception for electronic content, non-web documents shall not be required to conform to the following success criteria:

  • WCAG SC 2.4.1 Bypass Blocks – A mechanism is available to bypass blocks of content that are repeated on multiple Web pages.
  • WCAG SC 3.2.3 Consistent Navigation – Navigational mechanisms that are repeated on multiple Web pages within a set of Web pages occur in the same relative order each time they are repeated, unless a change is initiated by the user.
  • WCAG SC 3.2.4 Consistent Identification – Components that have the same functionality within a set of Web pages are identified consistently.

5. User Controls

Accessibility Requirements

  • WCAG2 SC 4.1.2 Name, Role, Value: For all user interface components (including but not limited to: form elements, links and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

Test Method Rationale

The purpose of this Baseline test is to check the following accessibility properties for user controls:

  • Name
  • Role
  • State
  • Value

Limitations, Assumptions, or Exceptions

  • A user interface component is a part of the content that is perceived by users as a single control for a distinct function. User interface components include form elements and links as well as components generated by scripts. This test uses the term “user controls” for brevity.
  • The accessibility properties of the user control must be correct if the user control changes.
  • Per WCAG 2.2 Understanding SC 1.4.1 Use of Color, authors cannot set the visited state of links. The anchor element does not include a “visited” attribute; therefore, the author has no ability to alter the state through an attribute setting. Exclude the visited/unvisited state of links from this Baseline test.

5.A Test Procedure for Control Name

Baseline Test ID: 5.A-ControlName

Identify Content

Identify user controls for a distinct function. Exclude forms and links as these are covered by Baseline 10: Forms and Baseline 14: Links, respectively.

Test Instructions

  1. Check that the combination of the accessible name and accessible description is not empty. [SC 4.1.2]
  2. Check that the non-empty combination of the accessible name and accessible description describes the control's purpose. [SC 4.1.2]
  3. If the name of the user control changes on user interaction, repeat the previous test steps and check that the accessible name is correct after the change.
    • Depending on the control, a change of name may be triggered by various actions, such as changing values or states of other components, toggling a function, entering data in the component, mouseover, etc.
    • Examples: entering a response in a form field for country changes the name of a button to "Save Changes"; selecting a control toggles its function from sorting ascending to sorting descending.

Test Results

If any of the above checks fail, then Baseline Test 5.A-ControlName fails.

5.B Test Procedure for Control Role

Baseline Test ID: 5.B-ControlRole

Identify Content

Identify user controls for a distinct function that have an explicit role attribute. Examples include forms, links, and toggle controls.

Test Instructions

  1. Check that the role of the user control is valid and appropriate for its function. [SC 4.1.2]

Test Results

If any of the above checks fail, then Baseline Test 5.B-ControlRole fails.

5.C Test Procedure for Control State

Baseline Test ID: 5.C-ControlState

Identify Content

Identify user controls for a distinct function that can be set by the user. Examples of such user controls include those that can be checked, expanded, hidden, and pressed. Exclude the visited/unvisited state of links.

Test Instructions

  1. Check that the state of the user control is correct. [SC 4.1.2]
  2. If the state of the user control changes with use of the application, check that the state of the user control is correct after a change of state. [SC 4.1.2]
    • Depending on the control, a change of state may be triggered by various actions, such as changing values or states of other components, toggling a function, entering data in the component, mouseover, etc.
    • Examples include a disabled "Submit" button that becomes enabled when all required form fields are filled in, a link that becomes visible after a user-initiated calculation completes, a check box that changes from checked to unchecked, and a table column sort control that is toggled from ascending to descending.

Test Results

If any of the above checks fail, then Baseline Test 5.C-ControlState fails.

5.D Test Procedure for Control Value

Baseline Test ID: 5.D-ControlValue

Identify Content

Identify controls that have a value that can be changed by a user. Examples include form fields and sliders.

Test Instructions

  1. Check that the value of the user control is correct. [SC 4.1.2]
  2. Modify the value of the user control. Depending on the control, a change of value may be performed by entering a number, selecting from a list of options, etc.
  3. Check that the value of the user control is correct after the user-initiated change of value. [SC 4.1.2]

Test Results

If any of the above checks fail, then Baseline Test 5.D-ControlValue fails.

Advisory: Tips for Streamlined Test Processes

  • Changes to controls may also include changes in color to convey information. If so, this test should check that the information is programmatically determinable. If color is used as the only visual means of conveying information (or changes in information), then the content would fail to meet SC 1.4.1 Use of Color (addressed in Baseline 7. Sensory Characteristics).
  • The accessible name and accessible description of some user controls are tested in other Baseline tests, such as Baseline 10. Forms and Baseline 14. Links.
  • For user controls that have dedicated Baseline Tests, please map to those tests for accessible name instead of 5.A-ControlName.
  • This test may require interaction with controls to assess changes in name, role, state, value. Interaction instructions such as a test plan may be helpful.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

6. Images

Accessibility Requirements

  • WCAG2 SC: 1.1.1 Non-Text – All non-text content that is presented to the user has a text alternative that serves the equivalent purpose, except for [specific] situations.
  • WCAG2 SC: 1.4.5 Images of Text – If the technologies being used can achieve the visual presentation, text is used to convey information rather than images of text except for [specific situations].
  • WCAG2 SC: 4.1.2 Name, Role, Value – For all user interface components (including but not limited to: form elements, links, and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

Test Method Rationale

  • The image tests evaluate the images as coded to discern whether the author of the content has determined they are meaningful or decorative. However, there are certain scenarios, as described in the tests, where the author’s programmatic determination could be incorrect.
  • The tests include guidance from the W3C Web Accessibility Initiative Images Tutorial.
  • All images must be evaluated. Multiple tests may apply to an image.

Limitations, Assumptions, Exceptions

  • An image that has a non-empty text alternative has been determined to be meaningful by the content author. The author has decided that this image should not be ignored by assistive technology.
  • An image that has an empty text alternative has been determined to be decorative by the content author. The author has determined that this image should be ignored by assistive technology.
  • Commonly used image formats include .jpg, .png, .svg, .gif, .tiff, and .bmp. Other graphic formats are also in use and should be considered for this test.
  • Decoration, Formatting, Invisible: If the image is pure decoration, is only used for visual formatting, or is not presented to users, then it is implemented in a way that it can be ignored by assistive technology.
  • Images of text which are essential to the information being conveyed are exempt from SC 1.4.5. Logotypes (text that is part of a logo or brand name) are considered essential.
  • The definition of image of text contains the note: “Note: This does not include text that is part of a picture that contains significant other visual content.” Examples of such pictures include graphs, screenshots, and diagrams which visually convey important information through more than just text.

6.A Test Procedure for Images with a non-empty text alternative

Baseline Test ID: 6.A-MeaningfulImage

Identify Content

Identify any image that has a non-empty text alternative (combination of the accessible name and accessible description) per the HTML Accessibility API Mappings 1.0 for img.

Test Instructions

  1. Check that none of the following is true [SC 1.1.1]:
    1. The image is page design/formatting and could be ignored by assistive technology without any loss of meaning.
    2. The image is not visible on the page.
  2. Check that the non-empty text alternative (combination of accessible name and accessible description) provides an equivalent description of the image's purpose. [SC 1.1.1]

Test Results

If any of the above checks fail, then Baseline Test 6.A-MeaningfulImage fails.

6.B Test Procedure for Images with an empty text alternative

Baseline Test ID: 6.B-DecorativeImage

Identify Content

Identify any image (including a background image) that has an empty text alternative.

Test Instructions

  1. Check that the empty text alternative has been programmatically assigned using one of the following techniques [SC 1.1.1]:
    1. The image is marked as decorative.
    2. The image is marked as an artifact.
    3. The image is only part of the background, header, footer, or on a hidden layer.
  2. Check that none of the following is true [SC 1.1.1]:
    1. The image is the only way to convey meaningful information.
    2. The image is in the tab order.
    3. The image is a functional image that initiates action.

Test Results

If any of the above checks fail, then Baseline Test 6.B-DecorativeImage fails.

6.C Test Procedure for Captchas – Not applicable to documents

Baseline Test ID: 6.C-Captcha

Captchas are not implemented in non-web documents, so this test is not applicable. It was not removed to maintain harmonization with the ICT Testing Baseline for Web.

Test Results

Baseline Test 6.C-Captcha is not applicable to documents.

6.D Test Procedure for Images of Text

Baseline Test ID: 6.D-ImageText

Identify Content

Identify any images of text, except where a particular presentation of text is essential to the information being conveyed (e.g., logotypes or text that is part of a logo or brand name).

Test Instructions

  1. Check that using text cannot achieve the same visual presentation and effect as images of text. [SC 1.4.5]
  2. Check that the image of text can be visually customized to a user's requirements. [SC 1.4.5]
    For example, document content allows users to specify font, size, color, and background settings, and all images of text are then provided based on those settings (e.g., SmartArt, text art).

Test Results

If any of the above checks fail, then Baseline Test 6.D-ImageText fails.

Advisory: Tips for Streamlined Test Processes

None

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

7. Sensory Characteristics

Accessibility Requirements

  • WCAG2 SC 1.1.1 Non-Text – All non-text content that is presented to the user has a text alternative that serves the equivalent purpose, except for [specific] situations.
  • WCAG SC 1.3.3 Sensory Characteristics – Instructions provided for understanding and operating content do not rely solely on sensory characteristics of components such as shape, size, visual location, orientation, or sound.
  • WCAG SC 1.4.1 Use of Color – Color is not used as the only visual means of conveying information, indicating an action, prompting a response, or distinguishing a visual element.

Test Method Rationale

Users affected by this requirement may be sighted; they are not limited to users of assistive technology (AT) and include those with Color Vision Deficiency. Visual inspection is required to determine the adequacy of instructions or content to account for any limitations of sensory or color perceptions.

Limitations, Assumptions, or Exceptions

  • SC 1.4.1 does not prohibit the use of color; it can be met by providing another visual cue in addition to color (e.g., color and shape). Text alternative descriptions that are not available visually do not satisfy these tests.
  • Per WCAG 2.2 Understanding SC 1.4.1 Use of Color, where color alone distinguishes between visited and unvisited links, it does not result in a failure of this Success Criterion.
  • Per WCAG 2.2 Understanding SC 1.4.1 Use of Color, use of colors that differ in color (hue) and lightness with a contrast ratio of 3:1 or greater meet this requirement. However, if content relies on the user’s ability to accurately perceive or differentiate a particular color, an additional visual indicator will be required regardless of the contrast ratio between those colors.
  • SC 1.3.3 applies to instructions and cannot be met by providing multiple sensory characteristics (e.g., color and shape).
  • The test for audible cues covers short sounds used to notify the user, such as confirmation beeps and error notifications. Audio in time-based media is covered in Baseline 16. Audio-only and Video-only.

7.A Test Procedure for Use of Color

Baseline Test ID: 7.A-Color

Identify Content

Content that uses color to convey meaning, indicate an action, prompt a response, distinguish a visual element, or identify errors. Exclude colors of links that indicate visited or unvisited.

Test Instructions

  1. Check if one or more of the following is true:
    1. The element using color to convey meaning also provides on-screen alternate text describing the color and/or the meaning conveyed by the color when the user must be able to accurately perceive or differentiate a particular color. [SC 1.4.1]
    2. The content using color to convey meaning also provides other visual differentiation (e.g., shape, position, size, underline) with a clear indication of its meaning when the user must be able to accurately perceive or differentiate a particular color. [SC 1.4.1]
    3. The content using only a difference in colors to convey meaning uses colors (hues) that have a contrast ratio of 3:1 or greater. This content does not require the user to be able to accurately perceive or differentiate a particular color. [SC 1.4.1]

Test Results

If the above check fails, then Baseline Test 7.A-Color fails.

7.B Test Procedure for Sensory Characteristics

Baseline Test ID: 7.B-SensoryCharacteristics

Identify Content

Identify instructions for understanding and operating content that rely on sensory characteristics to convey information. This may include references to shape, size, visual location, orientation, or sound.

Test Instructions

  1. Check that the instructions contain additional information that allows the referenced content to be located, identified, and understood without any knowledge of its shape, size, or relative position. [SC 1.3.3]
    For example:
    • To see your changes, select the round button labeled "Go".
    • The links on the right, with the heading "Resources", provide further information.
    • Select the lower-right "Cancel" button to close this session.
  2. Check that any auditory cues also provide programmatically determinable or textual cues. [SC 1.3.3].
    For example:
    • At the sound of the beep and the appearance of the timer, begin the quiz.

Test Results

If any of the above checks fail, then Baseline Test 7.B-SensoryCharacteristics fails.

7.C Test Procedure for Audible Cues

Baseline Test ID: 7.C-AudibleCues

Identify Content

Identify any short sound/audible cue that serves as a notification to the user, such as a beep that signifies an error has occurred or a chime to indicate an incoming message.

Test Instructions

  1. Check that a text alternative that describes the purpose of the sound is provided with the audible cue. [SC 1.1.1]
    For example:
    • A short beep and an asterisk appear on a required field to notify the user that the field must be completed.
    • As a timer counts down, a bell rings and a "Two minutes left!" message appears on the screen.

Test Results

If any of the above checks fail, then Baseline Test 7.C-AudibleCues fails.

Advisory: Tips for Streamlined Test Processes

  • Content that uses color must have an additional visual cue. Instructions that rely on a sensory characteristic must have a non-sensory cue.
  • Displaying content in greyscale may help identify content that uses only color to convey information.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

8. Contrast

Accessibility Requirements

  • WCAG SC 1.4.3 Contrast (minimum) – The visual presentation of text and images of text has a contrast ratio of at least 4.5:1, except for the following:

    • Large Text: Large-scale text and images of large-scale text have a contrast ratio of at least 3:1.
    • Incidental: Text or images of text that are part of an inactive user interface component, that are pure decoration, that are not visible to anyone, or that are part of a picture that contains significant other visual content, have no contrast requirement.
    • Logotypes: Text that is part of a logo or brand name has no contrast requirement.

Test Method Rationale

This test is conducted to evaluate equal access to information for all users, including those who may experience difficulty in discerning between items with low contrast.

Limitations, Assumptions, or Exceptions

  • Exception: The following types of text and images of text are not included in this test:
    • Logotypes: logo or brand name
    • Inactive (disabled) user interface components
    • Text used purely for decoration that is not meaningful and has no functionality
    • Text contained within a picture that contains significant other visual content
  • Testing of text contrast changes includes changes due to mouse hover and selection status.

  • Disabled input elements do not receive keyboard focus, cannot be selected, and cannot be modified. These are not required to meet contrast ratio requirements.
    Note: Read-only and disabled interface components are not the same. Disabled interface components can be considered inactive interface components; read-only interface components are active interface components and must meet contrast ratio requirements.
  • Large-scale text is at least 18-point text or 14-point bold text.

8.A Test Procedure for Contrast (minimum)

Baseline Test ID: 8.A-ContrastMinimum

Identify Content

All visible text AND images of text (except those noted in Limitations, Assumptions, or Exceptions above)

Test Instructions

  1. Determine the contrast ratio of foreground text and background.
  2. Check that the contrast ratio is at least 4.5:1. [SC 1.4.3]
  3. If the contrast ratio is less than 4.5:1, check that the ratio is at least 3:1 AND the font meets one of the following criteria: [SC 1.4.3]
    • At least 18 point (24 pixels)
    • At least 14 point (18.66 pixels) AND bold (at least 700 font weight)

Test Results

If both of the above checks fail, then Baseline Test 8.A-ContrastMinimum fails.

Advisory: Tips for Streamlined Test Processes

  • There are a variety of color contrast tools that can perform the algorithms necessary to determine the contrast. See Sufficient Technique G18 for possible testing tools that use an appropriate algorithm.

  • Use contrast tools that do not round values. A ratio of 4.499:1 would not meet the 4.5:1 threshold.

  • WCAG 2.2 Understanding 1.4.3: Contrast (Minimum) suggests using foreground and background colors obtained from the user agent, or the underlying markup and stylesheets for the contrast ratio computation, rather than the text as presented on screen.

  • While text contained in logos rendered as images is exempt from this requirement, the image must still provide alternative text (e.g., via an alt attribute).
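
As a rough illustration of the computation contrast tools perform, the sketch below (Python, assuming sRGB colors given as 8-bit values) applies the WCAG relative luminance and contrast ratio formulas without rounding; it is a sketch, not a substitute for a vetted contrast tool.

```python
def linearize(channel_8bit):
    """Convert an 8-bit sRGB channel to its linear value per the WCAG definition."""
    c = channel_8bit / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """WCAG relative luminance of an (R, G, B) color given as 8-bit values."""
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color1, color2):
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    l1, l2 = relative_luminance(color1), relative_luminance(color2)
    return (max(l1, l2) + 0.05) / (min(l1, l2) + 0.05)

# Example: mid-gray text (#767676) on a white background is roughly 4.54:1,
# so it passes the 4.5:1 threshold; a ratio such as 4.499:1 must not be rounded up.
ratio = contrast_ratio((0x76, 0x76, 0x76), (0xFF, 0xFF, 0xFF))
print(f"{ratio:.3f}:1 ->", "meets 4.5:1" if ratio >= 4.5 else "below 4.5:1")
```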

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

9. Flashing

Accessibility Requirements

  • WCAG SC 2.3.1 Three Flashes or Below Threshold – Content does not contain anything that flashes more than three times in any one second period, or the flash is below the general flash and red flash thresholds.

Test Method Rationale

Flashing can be caused by factors beyond the control of authors (e.g., the user’s display, the computer rendering of the image, or connectivity issues). There is no reliable, free, or widely available solution for determining the resulting flash frequency for these types of factors.

This test addresses flashing caused by the content itself, including:

  • Determining the flash rate from programmatically available information
  • Determining the pixel dimensions of any flashing element
  • Determining if the relative luminance changes by more than 10% for a pair of opposing changes in a flash
  • Determining if a pair of opposing transitions in a flash involves a saturated red

Limitations, Assumptions, or Exceptions

  • It is possible that users could view content at a resolution or from a distance much different from the intended resolution and viewing distance.
  • For the purposes of this baseline, the terms flicker and blink may be used synonymously with the term flash.
  • Blinking elements that conform to this requirement are still required to conform to SC 2.2.2 Pause, Stop, Hide, if the blinking lasts longer than 5 seconds (Baseline 21. Timed Events).
  • Note from SC 2.3.1:
    • Note 1: Since any content that does not meet this success criterion can interfere with a user’s ability to use the whole page, all content (whether it is used to meet other success criteria or not) must meet this success criterion. See Conformance Requirement 5: Non-Interference.

9.A Test Procedure for Three Flashes or Below Threshold

Baseline Test ID: 9.A-Flashes

Identify Content

Visually identify content that flashes.

Test Instructions

  1. Set the user agent at standard zoom level, e.g. 100% in a browser.
  2. If there is an option to view a larger version of the flashing content, such as a full screen mode, test the larger version. If there is an option to loop or repeat the flashing content, test the looping version.
  3. Check that one of the following is true of the flashing content: [SC 2.3.1]
    1. The flashing frequency is less than or equal to 3 flashes in any one second (3 Hertz).
    2. The flashing frequency is above 3 Hertz or cannot be determined, and at least one of the following is true:
      1. The combined area of flashes occurring concurrently occupies no more than a 341 x 256 pixel rectangle anywhere on the displayed screen area when the content is viewed at 1024 x 768 pixels.
      2. The flash does not include "general flashes" (a pair of opposing changes in relative luminance of 10% or more of the maximum relative luminance (1.0) where the relative luminance of the darker image is below 0.80; and where "a pair of opposing changes" is an increase followed by a decrease, or a decrease followed by an increase).
      3. The flash does not include any "pair of opposing transitions involving a saturated red" (a pair of opposing transitions where one transition is either to or from a state with a value R/(R + G + B) that is greater than or equal to 0.8, and the difference between states is more than 0.2 (unitless) in the CIE 1976 UCS chromaticity diagram. [ISO 9241-391])

Test Results

If all of the above checks fail, then Baseline Test 9.A-Flashes fails.

Advisory: Tips for Streamlined Test Processes

  • If content will be displayed or viewed at significantly different sizes or distances (e.g., responsive content intended for display across desktop, mobile, and/or other displays), the content should be evaluated for each scenario.
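
Where the two color states of a flash are programmatically available, the luminance and saturated-red conditions in the test instructions above can be approximated numerically. The sketch below (Python, assuming 8-bit sRGB values for the two states and hypothetical example colors) is a simplified aid rather than a flash analysis tool; in particular, it omits the area check and the CIE 1976 UCS chromaticity-difference portion of the red flash definition.

```python
def relative_luminance(rgb):
    """WCAG relative luminance of an (R, G, B) color given as 8-bit values."""
    def linearize(channel_8bit):
        c = channel_8bit / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def is_general_flash_pair(state_a, state_b):
    """Opposing change in relative luminance of 10% or more of the maximum (1.0),
    where the darker state's relative luminance is below 0.80."""
    la, lb = relative_luminance(state_a), relative_luminance(state_b)
    return abs(la - lb) >= 0.10 and min(la, lb) < 0.80

def involves_saturated_red(rgb):
    """Simplified saturated-red check: R / (R + G + B) >= 0.8."""
    r, g, b = rgb
    total = r + g + b
    return total > 0 and r / total >= 0.8

# Example: content alternating between bright red and white.
red, white = (255, 0, 0), (255, 255, 255)
print("general flash pair:", is_general_flash_pair(red, white))
print("saturated red involved:", involves_saturated_red(red) or involves_saturated_red(white))
```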

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

10. Forms

Accessibility Requirements

  • WCAG2 SC: 1.1.1 Non-Text: All non-text content that is presented to the user has a text alternative that serves the equivalent purpose, except for [specific] situations.
    • Controls, Input: If non-text content is a control or accepts user input, then it has a name that describes its purpose. (Refer to Success Criterion 4.1.2 for additional requirements for controls and content that accepts user input.)
  • WCAG2 1.3.1 Info and Relationships: Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text.
  • WCAG 2.4.6 Headings and Labels: Headings and labels describe topic or purpose.
  • WCAG 3.2.2 On Input: Changing the setting of any user interface component does not automatically cause a change of context unless the user has been advised of the behavior before using the component.
  • WCAG2 3.3.1 Error Identification: If an input error is automatically detected, the item that is in error is identified and the error is described to the user in text.
  • WCAG2 3.3.2 Labels or Instructions: Labels or instructions are provided when content requires user input.
  • WCAG2 3.3.3 Error Suggestion: If an input error is automatically detected and suggestions for correction are known, then the suggestions are provided to the user, unless it would jeopardize the security or purpose of the content.
  • WCAG2 3.3.4 Error Prevention (Legal, Financial, Data): For documents that cause legal commitments or financial transactions for the user to occur, that modify or delete user-controllable data in data storage systems, or that submit user test responses, at least one of the following is true:
    1. Reversible: Submissions are reversible.
    2. Checked: Data entered by the user is checked for input errors and the user is provided an opportunity to correct them.
    3. Confirmed: A mechanism is available for reviewing, confirming, and correcting information before finalizing the submission.
  • WCAG2 4.1.2 Name, Role, Value: For all user interface components (including but not limited to: form elements, links, and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

Test Method Rationale

Review form instructions for completeness and programmatic association to their inputs. Enter erroneous inputs and review error notifications provided to the user.

Limitations, Assumptions, or Exceptions

  • Read-only (e.g., pre-filled) form fields receive keyboard focus and are selectable but cannot be modified. These must be labeled and programmatically determinable, and are tested under SC 1.3.1.
  • Disabled input elements do not receive keyboard focus, cannot be selected, and cannot be modified. These are not included in this test.
  • The combination of an element’s accessible name and accessible description is its text alternative.
  • Clicking or selecting an option in a form should select the option but should not initiate a change in context.
  • Change of context is a major change that, if made without user awareness, can disorient users who are not able to view the entire page simultaneously. Changes in context include changes of:
    1. User agent
    2. Viewport
    3. Focus
    4. Content that changes the meaning of the document
      • Note: A change of content is not always a change of context. Changes in content, such as an expanding outline, dynamic menu, or a tab control do not necessarily change the context, unless they also change one of the above (e.g., focus).
      • Example: Opening a new window, moving focus to a different component, going to a new page (including anything that would look to a user as if they had moved to a new page), or significantly rearranging the content of a page are examples of changes of context.
  • Per WCAG 2.2 Understanding SC 3.3.2: Labels or Instructions, this Success Criterion does not apply to links or other controls (such as an expand/collapse widget, or similar interactive components) that are not associated with data entry.

10.A Test Procedure for Form Names

Baseline Test ID: 10.A-FormName

Identify Content

  1. Find all form components. Examples include buttons, text fields, radio buttons, checkboxes, read-only fields, and drop-down lists.
  2. Find all instructions and cues (textual and graphical) that are related to form components, including groupings, order of completion, special conditions or qualifiers, format instructions, etc.

Test Instructions

  1. Check that the combination of the accessible name and accessible description is not empty. [SC 4.1.2]
  2. Check that the non-empty combination of the accessible name and accessible description describes the form component's purpose. [SC 4.1.2] [Form components that include non-text content should also map to SC 1.1.1.]
  3. Check that all relevant instructions and cues (textual and graphical) have programmatic association (e.g., table column and/or row associations) to the form component. [SC 1.3.1]

Test Results

If any of the above checks fail, then Baseline Test 10.A-FormName fails.

10.B Test Procedure for Form Labels Descriptive

Baseline Test ID: 10.B-FormDescriptiveLabel

Identify Content

  1. Find all form components. Examples include buttons, text fields, radio buttons, checkboxes, drop-down lists.
  2. Find all instructions and cues (textual and graphical) that are related to form components, including groupings, order of completion, special conditions or qualifiers, format instructions, etc.

Test Instructions

  1. Check that provided labels (instructions and cues) for each form component describe purpose, inform users what input data is expected, and if applicable, what format is required. [SC 2.4.6]

Test Results

If any of the above checks fail, then Baseline Test 10.B-FormDescriptiveLabel fails.

10.C Test Procedure for On Input

Baseline Test ID: 10.C-OnInput

Identify Content

All active form components.

Test Instructions

  1. Enter data in all form fields, and exit (tab out of) the field.
  2. Change selections and/or values for form components, such as radio buttons, check boxes, drop-down lists, etc.
  3. Check that navigating away from a component and/or changing component values/selections (e.g., entering data in a text field, changing a radio button selection) does NOT initiate a change of context unless the user has been advised of the behavior before using the component. [SC 3.2.2] Examples of a change of context could include:
    • Forms submitted automatically when exiting the field.
    • Forms submitted automatically when exiting the last field in a form.
    • New windows launched when changing a radio button selection.
    • Focus is changed to another component when a drop-down item is selected.

Test Results

If any of the above checks fail, then Baseline Test 10.C-OnInput fails.

10.D Test Procedure for Error Identification

Baseline Test ID: 10.D-ErrorIdentification

Identify Content

Input form components with automatic error detection and notification.

Test Instructions

  1. Enter incorrect values in input form components in order to trigger automatic error detection that results in error notifications.
    Examples include but are not limited to:
    • required fields
    • date (format)
    • state (abbreviations in an address)
    • password
  2. If an input error is automatically detected, check that the error notification meets all of the following [SC 3.3.1]:
    • the user is made aware of the error (whether immediately upon shifting focus away from the item in error or when trying to submit the form), and
    • the error is described to the user in text, and
    • the item that is in error is identified in text.

Test Results

If any of the above checks fail, then Baseline Test 10.D-ErrorIdentification fails.

10.E Test Procedure for Form has a Visible Label

Baseline Test ID: 10.E-FormHasLabel

Identify Content

  1. Find all form components associated with data entry. Examples include buttons, text fields, radio buttons, checkboxes, multi-select lists, and drop-down lists.
  2. Find all instructions and cues (textual and graphical) that are related to the data entry form components, including groupings, order of completion, special conditions or qualifiers, format instructions, etc.

Test Instructions

  1. Check that each form component has visible label(s) or instructions while the form component has focus. [SC 3.3.2]

Test Results

If any of the above checks fail, then Baseline Test 10.E-FormHasLabel fails.

10.F Test Procedure for Error Suggestion

Baseline Test ID: 10.F-ErrorSuggestion

Identify Content

Input form components with automatic error detection and notification.

Test Instructions

  1. Enter incorrect values in input form components in order to trigger automatic error detection that results in error notifications. Examples include but are not limited to:
    • required fields
    • date (format)
    • state (abbreviations in an address)
    • password
  2. Review error notifications provided.
  3. Check that additional guidance (e.g., suggestion for corrected input, guidance on how to correct the user's input) is provided on how to correct errors for form fields that would not jeopardize the security or purpose of the content. [SC 3.3.3]

Test Results

If any of the above checks fail, then Baseline Test 10.F-ErrorSuggestion fails.

10.G Test Procedure for Error Prevention

Baseline Test ID: 10.G-ErrorPrevention

Identify Content

Documents that cause legal commitments or financial transactions for the user to occur, that modify or delete user-controllable data in data storage systems, or that submit user test responses.

Test Instructions

  1. Complete the form components necessary to submit. Include errors.
  2. Check that at least one of the following is true [SC 3.3.4]:
    1. Reversible: Submissions are reversible.
    2. Checked: Data entered by the user is checked for input errors and the user is provided an opportunity to correct them.
    3. Confirmed: A mechanism is available for reviewing, confirming, and correcting information before finalizing the submission.

Test Results

If any of the above checks fail, then Baseline Test 10.G-ErrorPrevention fails.

Advisory: Tips for Streamlined Test Processes

  • For SC 3.3.1, acceptable techniques include (a) shifting focus to an error message informing the user that the previous field needs to be corrected and describing the error, (b) refreshing the page upon form submission, then listing the error descriptions and locations at the top of the page. Re-displaying the form and indicating the fields in error within the form is insufficient to meet this requirement. The user should not need to search through the form to find where errors were made.
  • For SC 3.3.4, because the user can review a simple, 1-page form before pressing the submit button on the page, another review mechanism is not required.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

11. Document Titles

Accessibility Requirements

  1. WCAG2 SC 2.4.2 Document Titled – Documents have titles that describe topic or purpose.

Test Method Rationale

The Title property defines the title of the document and is required in all documents. This test evaluates the presence of a descriptive title for the document.

Limitations, Assumptions, Exceptions

  1. Every document must have a descriptive title. This test always applies.

  2. This test would apply to all documents in a collection of documents (e.g., PDF portfolios).

11.A Test Procedure for Document Titled

Baseline Test ID: 11.A-DocumentTitled

Identify Content

Title property for the document.

Test Instructions

  1. Check that the document's Title property is defined for the document. [SC 2.4.2]
  2. Check that the document title describes the contents or purpose of the document. [SC 2.4.2]

Test Results

If any of the above checks fail, then Baseline Test 11.A-DocumentTitled fails.
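
When many documents must be checked, step 1 (whether the Title property is defined at all) can be spot-checked programmatically before the manual judgment required by step 2. A rough sketch in Python, using the pypdf and python-docx libraries with hypothetical file names:

```python
from pypdf import PdfReader      # pip install pypdf
from docx import Document        # pip install python-docx

def pdf_title(path):
    """Return the Title property of a PDF, or None if it is not set."""
    metadata = PdfReader(path).metadata
    return metadata.title if metadata else None

def docx_title(path):
    """Return the Title core property of a Word document, or None if empty."""
    return Document(path).core_properties.title or None

# Hypothetical files; a missing or empty title fails step 1 outright.
for name, title in [("report.pdf", pdf_title("report.pdf")),
                    ("guide.docx", docx_title("guide.docx"))]:
    print(name, "->", title if title else "NO TITLE SET (fails step 1)")

# Whether a non-empty title actually describes the document (step 2)
# still requires human judgment.
```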

Advisory: Tips for Streamlined Test Processes

None

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

12. Tables

Accessibility Requirements

  • WCAG SC 1.3.1 Info and Relationships: Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text.
  • WCAG2 SC 4.1.2 Name, Role, Value: For all user interface components (including but not limited to: form elements, links, and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

Test Method Rationale

For assistive technology (AT) users, data tables must explicitly associate table data with table row and column headers via programmatic markup. Table markup also facilitates navigation for AT users by providing programmatic landmarks via column and row headers.

When <table> elements are used for layout purposes, data table structure elements such as row/column headers, captions, or table summaries are not permitted.

Limitations, Assumptions, Exceptions

  • Data tables are tables where information in a cell requires a row and/or column header to adequately describe the cell’s contents. If a table is used for placement of content for visual aesthetics, it is considered a layout table.
  • Some content may visually appear to require a data table structure, but linearizing the content reveals that the content is understandable without the table. These elements use styling methods to present content in columns or rows. The information conveyed does not rely on programmatic relationships between column or row headers to be understood. This content is not a data table and should not have associated programmatic table attributes. It should be tested using other baseline tests, such as Baseline 13. Content Structure and/or possibly Baseline 10. Forms (associated instructions).
  • Rows of data that are related must have a row header so assistive technology users can understand the relationship of the row’s data cells. Not every table requires a row header. For example, a calendar month is a data table, typically with the days of the week as column headers. The dates in a row are not related so typically, there is no row header present. However, if there was a cell in each row to indicate the week of the month, this cell would serve as a row header for the dates within that row.
  • Complex data tables are tables that have any one or more of these elements: multiple columns of row headers, multiple rows of column headers, and split or merged cells. These tables must incorporate formatting that establishes programmatically determinable relationships.

12.A Test Procedure for Data Table Roles

Baseline Test ID: 12.A-DataTableRole

Identify Content

All content/data visually presented in a table with column and/or row headers where the content is not in a meaningful sequence when linearized.

Note: Linearization of table content is the presentation of a table’s two-dimensional content in one-dimensional order of the content in the source, beginning with the first cell in the first row and ending with the last cell in the last row, from left to right, top to bottom.

Test Instructions

  1. Check that each data table has a programmatically assigned table role. [SC 4.1.2]
  2. Check that each data cell is programmatically assigned a data cell role. [SC 4.1.2]
  3. Identify all column and row headers for each data cell. Check that each header cell is programmatically assigned a role of header. [SC 4.1.2]

Test Results

If any of the above tests fail, Baseline Test 12.A-DataTableRole fails.

12.B Test Procedure for Data Table Header Association

Baseline Test ID: 12.B-DataTableHeaderAssociation

Identify Content

For any data table identified in 12.A, identify all column and row headers for each data cell.

Test Instructions

  1. Check that all data cells are programmatically associated with relevant header(s). [SC 1.3.1]

Test Results

If any of the above tests fail, Baseline Test 12.B-DataTableHeaderAssociation fails.

12.C Test Procedure for Layout Tables

Baseline Test ID: 12.C-LayoutTable

Identify Content

All content/data visually presented in a table that retains a meaningful sequence when linearized.

Note: Linearization of table content is the presentation of a table’s two-dimensional content in one-dimensional order of the content in the source, beginning with the first cell in the first row and ending with the last cell in the last row, from left to right, top to bottom.

Test Instructions

  1. Check that the table used purely for layout purposes does NOT include data table heading elements and/or associated attributes (e.g., row or column headers, summary, caption, scope). [SC 4.1.2]

Test Results

If any of the above tests fail, Baseline Test 12.C-LayoutTable fails.

Advisory: Tips for Streamlined Test Processes

  • Content that is presented with a table appearance, but does not rely on header association, can most easily be identified by linearization. Another helpful indicator is that the table has only row headers or only column headers, but not both.
  • Baseline Tests 12.A and 12.B should be performed for each data table.

WCAG 2.2 Techniques

The following sufficient techniques were considered when developing this test procedure for this baseline requirement:

13. Content Structure

Accessibility Requirements

  • WCAG SC 1.3.1 Info and Relationships – Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text.
  • WCAG SC 2.4.6 Headings and Labels – Headings and labels describe topic or purpose.

Test Method Rationale

  • Visual headings must be programmatically determinable, represent the content structure, and describe the content that follows the headings.
  • Visual lists must be programmatically determinable according to their types (bullet, numbered, multilevel).

Limitations, Assumptions, or Exceptions

  • A document with only one heading does not have a heading level structure and would not be tested for heading structure.
  • A document can have more than one heading level 1, or no heading level 1 at all.
  • The heading level 1 on a page is not required to match the document title.
  • The order of heading levels may not always be in sequence but may be valid as it relates to the visual structure/importance communicated by visible headings on the page. For example, a heading level 2 might be used for a navigation structure that precedes a heading level 1 title on a page. Similarly, a heading level 1 may be followed by a heading level 3 without a heading level 2 between them.
  • Not all lists need markup. For instance, sentences that contain comma-separated lists may not need list markup.
  • A test for Visually Apparent Lists should not include navigation menus. While programmatic lists are often used to create navigation menus, menus may also be created using other techniques.

13.A Test Procedure for Descriptive Headings

Baseline Test ID: 13.A-HeadingDescriptive

Identify Content

Visually apparent headings, which denote sections of content. Headings are often (though not always) presented in a larger, bolded font and separated from paragraphs by extra spacing. Note the hierarchy and structure of each heading with respect to other headings on the page or screen.

Test Instructions

  1. Check that each heading describes the topic or purpose of its content. [SC 2.4.6]

Test Results

If any of the above checks fail, then Baseline Test 13.A-HeadingDescriptive fails.

13.B Test Procedure for Visual Headings Programmatic

Baseline Test ID: 13.B-VisHeadingProg

Identify Content

Visually apparent headings, which denote sections of content. Headings are often (though not always) presented in a larger, bolded font and separated from paragraphs by extra spacing. Note the hierarchy and structure of each heading with respect to other headings on the page.

Test Instructions

  1. Check that all visual headings are programmatically determinable and that programmatic heading levels logically match the visual heading presentation within the heading structure [SC 1.3.1]:
    1. The most important heading(s) should have the highest priority level. For example, heading level 1 <H1> is a higher level than heading level 2 <H2>, which is higher than heading level 3 <H3>.
    2. Headings with an equal or higher level start a new section; headings with a lower level start new subsections that are part of the higher leveled section.
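
For illustration only, a minimal sketch assuming HTML-based content (word processing formats use built-in heading styles to the same effect): the programmatic levels below mirror a visual hierarchy of one title, two sections, and one subsection.

    <h1>Annual Report</h1>
    <h2>Financial Summary</h2>
    <h3>Quarterly Revenue</h3>
    <h2>Program Highlights</h2>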

Test Results

If the above check fails, then Baseline Test 13.B-VisHeadingProg fails.

13.C Test Procedure for Programmatic Headings Visual

Baseline Test ID: 13.C-ProgHeadingVisual

Identify Content

Programmatically determined headings: <h1> to <h6>

Test Instructions

  1. Check that each programmatically determinable heading is also serving as a visual heading on the page. Content that is not a visual heading cannot have a role of heading. For example, heading markup should not be used for emphasis on an element that is not a heading for content after it. [SC 1.3.1]
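
For illustration only, a minimal sketch assuming HTML-based content: the first element below is a genuine section heading, while the second misuses heading markup purely for visual emphasis and would fail this check.

    <h2>Eligibility Requirements</h2>            <!-- visual and programmatic heading: passes -->
    <h2>Applications are due this Friday!</h2>   <!-- emphasis only, not a section heading: fails -->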

Test Results

If the above check fails, then Baseline Test 13.C-ProgHeadingVisual fails.

13.D Test Procedure for Visually Apparent Lists

Baseline Test ID: 13.D-List

Identify Content

Visually apparent lists.

  • A bulleted list is not numbered and is used where sequence or the ability to reference specific items by number/letter is not important. List items have the same visual marking or may have no marking.
  • A numbered list is numbered sequentially, and a multilevel list includes hierarchy (e.g., 1, 2, 2a, 2ai, etc.). These types of lists are used where sequence or the ability to reference specific items by number/letter is important.
  • A description list is used to group term(s) with their description(s). These are common in a glossary.

Test Instructions

  1. For each visually apparent list:
    1. Check that content that has the visual appearance of a list (with or without bullets) that has no special order or sequence is marked as a bulleted list. [SC 1.3.1]
    2. Check that content that has the visual appearance of a numbered list is marked as a numbered, or multilevel list. [SC 1.3.1]
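
For illustration only, a minimal sketch assuming HTML-based content (word processing formats provide equivalent bulleted and numbered list styles), matching the two checks above:

    <ul>  <!-- bulleted list: order does not matter -->
      <li>Pens</li>
      <li>Paper</li>
    </ul>

    <ol>  <!-- numbered list: sequence matters -->
      <li>Open the document.</li>
      <li>Run the accessibility checker.</li>
    </ol>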

Test Results

If any of the above checks fail, Baseline Test 13.D-List fails.

Advisory: Tips for Streamlined Test Processes

There is not a test to check that programmatic lists are visually apparent lists.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

14. Links

Accessibility Requirements

  • WCAG SC 2.4.4 Link Purpose (In Context) – The purpose of each link can be determined from the link text alone or from the link text together with its programmatically determined link context, except where the purpose of the link would be ambiguous to users in general.
  • WCAG SC 4.1.2 Name, Role, Value – For all user interface components (including but not limited to: form elements, links and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

Test Method Rationale

Links and buttons, including scripted elements, must have meaningful text (either directly associated or available in context) that describes their purpose or function. In order for associated text to be available to assistive technologies, the information must be determinable programmatically.

Limitations, Assumptions, or Exceptions

  • From Understanding SC 2.4.4: There may be situations where the purpose of the link is supposed to be unknown or obscured. For instance, a game may have links identified only as door #1, door #2, and door #3. This link text would be sufficient because the purpose of the links is to create suspense for all users.
  • Programmatically determined link context is additional information that can be programmatically determined from relationships with a link, combined with the link text, and presented to users in different modalities.
    • Example: In HTML, information that is programmatically determinable from a link in English includes text that is in the same paragraph, list, or table cell as the link or in a table header cell that is associated with the table cell that contains the link.
  • The combination of an element’s accessible name and accessible description is its text alternative.

14.A Test Procedure for Link Purpose

Baseline Test ID: 14.A-LinkPurpose

Identify Content

All links including those that are scripted elements that function as a link.

Test Instructions

  1. Check that the combination of accessible name and accessible description is not empty. [SC 4.1.2]
  2. Check that the purpose of each link can be determined from any combination of the following [SC 2.4.4]:
    • the link text,
    • the link's accessible name and accessible description,
    • text that is in the same paragraph, list, or table cell as the link,
    • the table header cell that is associated with the table cell that contains the link.
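
For illustration only, a minimal sketch assuming HTML-based content with placeholder file names: the first link is clear from its text alone, while the purpose of the second link ("Download") can be determined from the associated row header cell.

    <a href="budget-2024.pdf">FY 2024 Budget Report (PDF)</a>

    <table>
      <tr><th scope="col">Form</th><th scope="col">File</th></tr>
      <tr>
        <th scope="row">Travel voucher</th>
        <td><a href="voucher.pdf">Download</a></td>  <!-- purpose comes from the row header -->
      </tr>
    </table>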

Test Results

If any of the above checks fail, then Baseline Test 14.A-LinkPurpose fails.

Advisory: Tips for Streamlined Test Processes

  • In cases where the link is to a document or a web application, the name of the document or web application would be sufficient to describe the purpose of the link (which is to take the user to the document or web application).

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

15. Language

Accessibility Requirements

  • WCAG SC 3.1.1 Language of Page – The default human language of the document can be programmatically determined.
  • WCAG SC 3.1.2 Language of Parts – The human language of each passage or phrase in the content can be programmatically determined except for proper names, technical terms, words of indeterminate language, and words or phrases that have become part of the vernacular of the immediately surrounding text.

Test Method Rationale

The default human language for the document must be programmatically identified. Passages of content that use a language other than the default must be programmatically identified.

Limitations, Assumptions, or Exceptions

  • The primary language for documents is based on the application’s language preference settings. Other language properties may be applied to the entire document, specific content, or sections of a document. Dialects specified after the primary language are not part of this test.
  • Exception: Proper names, technical terms, words of indeterminate language, and words or phrases that have become part of the vernacular of the immediately surrounding text are not covered by the Language of Parts.

15.A Test Procedure for Language of Document

Baseline Test ID: 15.A-LanguageDocument

Identify Content

Pages with text (including text alternatives).

Test Instructions

  1. Identify the default human language of the document by reviewing the document content. The default human language of the document is the language in which most of the content is presented.
  2. Check that the value of the language property matches the determined default human language for the document. [SC 3.1.1]
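
For illustration only, a minimal sketch assuming HTML-based content (word processing and PDF formats record the document language as a document property instead): a document written mostly in English declares English as its default human language.

    <!DOCTYPE html>
    <html lang="en">
      <head><title>Benefits Overview</title></head>
      <body>
        <p>Most of this document is written in English.</p>
      </body>
    </html>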

Test Results

If any of the above checks fail, then Baseline Test 15.A-LanguageDocument fails.

15.B Test Procedure for Language of Parts

Baseline Test ID: 15.B-LanguageParts

Identify Content

Text content that differs from the default human language of the document including alternative text for non-text content.

Test Instructions

  1. Identify the human language of the text content that differs from the default human language of the document.
  2. Check that the appropriate language is programmatically specified for any content segment that differs from the default human language of the document. [SC 3.1.2]
  3. Note: An element without a set language inherits its language property from the document’s default language settings.
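
For illustration only, a minimal sketch assuming HTML-based content: a French phrase inside an English document carries its own language property, while the surrounding text inherits the document default.

    <p>The seal bears the motto <span lang="fr">Liberté, égalité, fraternité</span>.</p>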

Test Results

If any of the above checks fail, then Baseline Test 15.B-LanguageParts fails.

Advisory: Tips for Streamlined Test Processes

None

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

16. Audio-Only and Video-Only

Accessibility Requirements

  • WCAG SC 1.2.1 Audio-only and Video-only (Prerecorded) – For prerecorded audio-only and prerecorded video-only media, the following are true, except when the audio or video is a media alternative for text and is clearly labeled as such:
    • Prerecorded Audio-only: An alternative for time-based media is provided that presents equivalent information for prerecorded audio-only content.
    • Prerecorded Video-only: Either an alternative for time-based media or an audio track is provided that presents equivalent information for prerecorded video-only content.

Test Method Rationale

Evaluation of alternative content to assess its equivalence to audio-only or video-only content generally involves a manual, cognitive comparison of the original content and its alternative(s).

Limitations, Assumptions, or Exceptions

  • Media alternative for text is media that presents no more information than is already presented in text (directly or via text alternatives). Note: A media alternative for text is provided for those who benefit from alternate representations of text. Media alternatives for text may be audio-only, video-only (including sign-language video), or audio-video.

Audio-Only

  • Audio-only is a time-based presentation that contains only audio (no video and no interaction).
  • If audio is synchronized with video, slides, animations, or other time-based visual media, then use the synchronization test instead: Baseline 17. Synchronized Media.
  • Audio labeled as a media alternative for text does not require additional description if it is indeed equivalent to the text.
  • A text equivalent is not required for audio that is provided as an equivalent for video with no audio information. For example, it is not required to caption video description that is provided as an alternative to a silent movie.
  • Short sounds used to notify the user, such as confirmation beeps and error notifications, are not included in this requirement.
  • Information and/or instructions provided in the form of audio-only content must provide equivalent programmatic and/or textual cues; the check for this requirement is performed under Baseline 7. Sensory Characteristics.

Video-Only

  • Video-only is a time-based presentation that contains only video (no audio and no interaction).
  • In a video-only presentation, information is presented in a variety of ways including animation, text or graphics, the setting and background, the actions and expressions of people, animals, etc.
  • Video labeled as a media alternative for text does not require additional description if it is indeed equivalent to the text.
  • If the video is accompanied by timed sounds or meaningful dialog, it is not video-only. Test for Baseline 17. Synchronized Media requirements.
  • Video-only content may present moving, blinking, scrolling, or auto-updating information; however, other methods may be used to present similar content. In either case, whether presented via video-only or some other method, the content must provide the ability to pause, stop, or hide the content. The check for this requirement is performed under Baseline 21. Timed Events.

16.A Test Procedure for Audio-only (Prerecorded)

Baseline Test ID: 16.A-AudioOnlyTranscript

Identify Content

Pre-recorded audio-only content. Do not include media that is clearly labeled as a media alternative for text.

Test Instructions

  1. Check that the content provides transcript(s) for audio-only content. [SC 1.2.1]
  2. Check that the transcript is text (e.g., an image-only PDF would not be sufficient to pass this test). [SC 1.2.1]
  3. Play the audio-only content entirely while referring to the alternative.
  4. Check that the information in the transcript is an accurate and complete representation of the audio-only content and includes relevant sounds in addition to dialogue, such as doors banging, sirens wailing, identification of speakers in dialogue, etc. [SC 1.2.1]
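
For illustration only, a minimal sketch assuming HTML-based content with placeholder file names: the audio-only content is accompanied by a link to a full text transcript, which is then evaluated against checks 2 through 4 above.

    <audio controls src="town-hall.mp3"></audio>
    <p><a href="town-hall-transcript.html">Read the transcript (includes speaker names and relevant sounds)</a></p>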

Test Results

If any of the above checks fail, then Baseline Test 16.A-AudioOnlyTranscript fails.

16.B Test Procedure for Video-only (Prerecorded)

Baseline Test ID: 16.B-VideoOnlyAlt

Identify Content

Pre-recorded video-only content. Do not include media that is clearly labeled as a media alternative for text.

Test Instructions

  1. Check that all video-only content information is also available through a text alternative (e.g., text that provides description of video content and actions) or an audio track that describes the video content. [SC 1.2.1]
  2. View the video-only content while referring to the alternative.
  3. Check that the information in the alternative includes the same information that the video-only presentation displays (e.g., if the video includes multiple characters, the alternative must identify which character is associated with each depicted action). [SC 1.2.1]

Test Results

If any of the above checks fail, then Baseline Test 16.B-VideoOnlyAlt fails.

16.C Test Procedure for Audio Media Alternative (Prerecorded)

Baseline Test ID: 16.C-AudioMediaAlternative

Identify Content

Pre-recorded audio-only that is clearly labeled as a media alternative for text.

Test Instructions

  1. Identify the text for which the media is an alternative.
  2. Play the media that is labeled as an equivalent alternative for the text.
  3. Check that the meaningful audible information of the media is available in the text.

Test Results

If any of the above checks fail, then the audio-only is not a media alternative for text. Perform Baseline Test 16.A Test Procedure for Audio-Only (Prerecorded).

16.D Test Procedure for Video Media Alternative (Prerecorded)

Baseline Test ID: 16.D-VideoMediaAlternative

Identify Content

Pre-recorded video-only that is clearly labeled as a media alternative for text.

Test Instructions

  1. Identify the text for which the media is an alternative.
  2. Play the media that is labeled as an equivalent alternative for the text.
  3. Check that the meaningful visual information of the media is available in the text.

Test Results

If any of the above checks fail, then the video-only is not a media alternative for text. Perform Baseline Test 16.B Test Procedure for Video-only (Prerecorded).

Advisory: Tips for Streamlined Test Processes

  • Baseline Tests 16.A and 16.C are tests for Audio-only files. It may make sense to perform Test 16.C before Test 16.A.
  • Baseline Tests 16.B and 16.D are tests for Video-only files. It may make sense to perform Test 16.D before Test 16.B.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

17. Synchronized Media

Accessibility Requirements

  • WCAG SC 1.2.2 Captions (Prerecorded) – Captions are provided for all prerecorded audio content in synchronized media, except when the media is a media alternative for text and is clearly labeled as such.
  • WCAG SC 1.2.4 Captions (Live) – Captions are provided for all live audio content in synchronized media.
  • WCAG SC 1.2.5 Audio Description (Prerecorded) – Audio description is provided for all prerecorded video content in synchronized media.
  • Section 508 503.4 User Controls for Captions and Audio Descriptions, including 503.4.1 Caption Controls and 503.4.2 Audio Description Controls.

Test Method Rationale

Evaluation of captions and audio descriptions to assess their equivalence to synchronized media content generally involves a manual, cognitive comparison of the original content with its alternative(s). Media that is clearly labeled as a media alternative for text is tested to assess equivalence to the text; if it is not equivalent, the tests for captions and audio descriptions are performed.

Limitations, Assumptions, or Exceptions

  • Synchronized media is audio or video synchronized with another format for presenting information and/or with time-based interactive components, unless the media is a media alternative for text that is clearly labeled as such. Synchronized media includes, but is not limited to, webcasts, press conferences, and online training presentations.
  • Media alternative for text is media that presents no more information than is already presented in text (directly or via text alternatives). Note: A media alternative for text is provided for those who benefit from alternate representations of text. Media alternatives for text may be audio-only, video-only (including sign-language video), or audio-video.
  • Captions are synchronized visual and/or text alternative for both speech and non-speech audio information needed to understand the media content. Additional notes from definition:
    • Note 1: Captions convey not only the content of spoken dialogue, but also equivalents for non-dialogue audio information needed to understand the program content, including sound effects, music, laughter, speaker identification and location.
    • Note 4: Captions should not obscure or obstruct relevant information in the video.
  • Audio descriptions are narration added to or combined with the soundtrack to describe important visual details that cannot be understood from the main soundtrack alone.
  • Captions and audio descriptions need to be available but do not need to be enabled by default.
  • Captions and audio descriptions can be provided in separate media files, i.e., audio described version and captioned version are different files.
  • Transcripts and non-synchronized alternatives alone will not meet this requirement.
  • Captions are not needed when the synchronized media is, itself, an alternate presentation of information that is also presented via text in the document.
  • Live captions Exception: Two-way multimedia calls between two or more individuals through web apps are not included in this test; it is only intended for broadcast of synchronized media.
  • From Understanding SC 1.2.5: if all of the information in the video track is already provided in the audio track, no audio description is necessary.
  • For this Section 508 baseline, synchronized media is tested against SC 1.2.5; SC 1.2.3 (Level A) is not tested. At the higher conformance level (AA), SC 1.2.5 requires audio descriptions and is stricter than SC 1.2.3.

17.A Test Procedure for Media Player Controls

Baseline Test ID: 17.A-MediaPlayerCCADControls

Identify Content

Media player that displays video with synchronized audio.

Test Instructions

  1. Check that user control for the selection of captions is provided. [Section 508 503.4]
  2. Check that user control for the selection of audio descriptions is provided. [Section 508 503.4]

Test Results

If any of the above checks fail, then Baseline Requirement 17.A-MediaPlayerCCADControls fails.

17.B Test Procedure for Media Player Caption Control Level

Baseline Test ID: 17.B-MediaPlayerCCLevel

Identify Content

Media player that displays video with synchronized audio and has volume adjustment controls.

Test Instructions

  1. Check that user controls for the selection of captions are at the same menu level as the user controls for volume adjustment or program selection. [Section 508 503.4.1]

Test Results

If any of the above checks fail, then Baseline Test 17.B-MediaPlayerCCLevel fails.

17.C Test Procedure for Media Player Audio Description Control Level

Baseline Test ID: 17.C-MediaPlayerADLevel

Identify Content

Media player that displays video with synchronized audio and has program selection controls.

Test Instructions

  1. Check that user controls for the selection of audio descriptions are at the same menu level as the user controls for volume or program selection. [Section 508 503.4.2]

Test Results

If any of the above checks fail, then Baseline Test 17.C-MediaPlayerADLevel fails.

17.D Test Procedure for Captions (Prerecorded)

Baseline Test ID: 17.D-CaptionsPrerecorded

Identify Content

Pre-recorded synchronized multimedia. Do not include media that is clearly labeled as a media alternative for text.

Test Instructions

  1. Enable captions through multimedia player functions and play the media. If a separate media file with captions is provided, test that file.
  2. Check that captions are provided.
  3. Check that captions are accurate and include all dialogue and equivalents for non-dialogue audio information needed to understand the program content, including sound effects, music, laughter, speaker identification and location. [SC 1.2.2]
    1. Listen to the audio of the entire synchronized media.
    2. Compare the audio to the captions for accuracy, time-synchronization, and equivalence.
  4. Check that the captions do not obscure or obstruct relevant information in the video. [SC 1.2.2]
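
For illustration only, a minimal sketch assuming HTML-based content with a native media player and placeholder file names: captions are supplied as a selectable timed text track, which the tester enables before comparing the captions to the audio. Dedicated player software exposes an equivalent captions control.

    <video controls>
      <source src="briefing.mp4" type="video/mp4">
      <track kind="captions" src="briefing-captions.vtt" srclang="en" label="English captions">
    </video>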

Test Results

If any of the above checks fail, then Baseline 17.D-CaptionsPrerecorded fails.

17.E Test Procedure for Audio Description (Prerecorded)

Baseline Test ID: 17.E-ADPrerecorded

Identify Content

Pre-recorded synchronized multimedia. Do not include media that is clearly labeled as a media alternative for text.

Test Instructions

  1. Enable audio descriptions through multimedia player functions and play the media. If a separate media file with audio descriptions is provided, test that file.
  2. Check that the audio (with audio descriptions enabled) adequately describes important visual content in the media, including information about actions, characters, scene changes, on-screen text, and other visual content. [SC 1.2.5]
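
For illustration only, a minimal sketch assuming HTML-based content with placeholder file names: as noted in the limitations above, audio descriptions may be provided in a separate, audio-described version of the media, which is the version to test.

    <p>
      <a href="briefing.mp4">Watch the briefing</a> |
      <a href="briefing-described.mp4">Watch the audio-described version</a>
    </p>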

Test Results

If any of the above checks fail, then Baseline 17.E-ADPrerecorded fails.

17.F Test Procedure for Captions (Live)

Baseline Test ID: 17.F-CaptionsLive

Identify Content

Live synchronized multimedia.

Test Instructions

  1. Enable captions through multimedia player functions and start the live session.
  2. Check that captions are provided. [SC 1.2.4]
  3. Check that provided captions include dialogue and important sounds. [SC 1.2.4]:
    1. Listen to the audio of the entire synchronized media.
    2. Compare the audio to the captions for accuracy, time-synchronization, and equivalence. Lower accuracy of captions for live broadcasts may be acceptable due to limitations of real-time caption capabilities.

Test Results

If any of the above checks fail, then Baseline Requirement 17.F-CaptionsLive fails.

17.G Test Procedure for Sync Media Alternative (Prerecorded)

Baseline Test ID: 17.G-SyncMediaAlternative

Identify Content

Pre-recorded synchronized multimedia that is clearly labeled as a media alternative for text.

Test Instructions

  1. Identify the text for which the media is an alternative.
  2. Play the media that is labeled as an equivalent alternative for the text.
  3. Check that the meaningful audible information of the media is available in the text.
  4. Check that the meaningful visual information of the media is available in the text.

Test Results

If any of the above checks fail, then the multimedia is not a media alternative for text. Perform Baseline Tests 17.D Test Procedure for Captions (Prerecorded) and 17.E Test Procedure for Audio Description (Prerecorded) on the pre-recorded synchronized multimedia.

Advisory: Tips for Streamlined Test Processes

  • Testing synchronized media is different from testing Baseline 16. Audio-Only and Video-Only content.
  • Synchronized media players may be software or HTML.
  • At Level AA, SC 1.2.5 applies to synchronized media. The related Level A requirement, SC 1.2.3, should be marked as ‘Not Applicable’ in the test report. It is permissible for test processes to add a test for SC 1.2.3 (evaluate a full text alternative for equivalence). Adding such a test would exceed baseline test requirements and would not affect Baseline 17’s outcome.
  • All synchronized multimedia should be tested. If the pre-recorded multimedia is labeled as a media alternative for text, confirm that it provides equivalent information as text. If it does not, then it is not a media alternative for text. Test the multimedia for captions and audio descriptions. It may make sense to perform Test 17.G before testing for captions and audio descriptions.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

18. Meaningful Content and Sequence

Accessibility Requirements

  • WCAG SC 1.3.1 Info and Relationships – Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text.
  • WCAG SC 1.3.2 Meaningful Sequence – When the sequence in which content is presented affects its meaning, a correct reading sequence can be programmatically determined.

Test Method Rationale

Meaningful content must be available to all users. The sequence of the content (in context) must be logical and preserve content meaning.

Limitations, Assumptions, or Exceptions

  • Meaningful content provides information or context and includes content in headers, footers, watermarks, master page items, artifacts, and in floating elements.
  • Inline styling is included in this test.
  • Invisible content (e.g., text that is the same color as its background) used for accessibility purposes is not covered in Test 18.A; it is covered in Test 18.B.
  • Programmatically identified content is exposed to assistive technology. Document content that is not exposed to assistive technology can vary depending on document type.

18.A Test Procedure for Meaningful Visible Content

Baseline Test ID: 18.A-MeaningfulContent

Identify Content

Meaningful visible content

Do not include meaningful background images, which are covered under Baseline 6. Images.

Test Instructions

  1. Check that all meaningful content is available in the body of the document or programmatically identified. [SC 1.3.1]

Test Results

If any of the above checks fail, then Baseline Test 18.A-MeaningfulContent fails.

18.B Test Procedure for Meaningful Sequence

Baseline Test ID: 18.B-MeaningfulSequence

Identify Content

Identify all meaningful content, including invisible meaningful content.

Test Instructions

  1. Check that the reading order of all meaningful content (in context) is logical. [SC 1.3.2]
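
For illustration only, a minimal sketch assuming HTML-based content: the reading order follows the order of the content in the source, so content repositioned purely by styling can produce an illogical programmatic sequence. Here the note is read before the heading it visually accompanies, which would fail this check if that order changes the meaning.

    <p style="float: right;">Note: submit by March 1.</p>  <!-- visually positioned beside later content -->
    <h2>Application Instructions</h2>
    <p>Complete all sections of the form.</p>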

Test Results

If the above check fails, then Baseline Test 18.B-MeaningfulSequence fails.

Advisory: Tips for Streamlined Test Processes

None

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

19. Frames and iFrames – Not Applicable to Documents

Frames and iframes are not implemented in non-web documents, so this test is not applicable. The test was retained, rather than removed, to maintain harmonization with the ICT Testing Baseline for Web.

Test Results

Baseline Tests 19.A-FrameTitle and 19.B-iFrameName are not applicable to documents.

20. Conforming Alternate Version

Accessibility Requirements

  • WCAG Conforming Alternate Version: Conformance requirement #1 allows non-conforming pages to be included within the scope of conformance as long as there is a “conforming alternate version”, which is defined as a version that:

    1. conforms at the designated level, and
    2. provides all of the same information and functionality in the same human language, and
    3. is as up to date as the non-conforming content, and
    4. for which at least one of the following is true:

      1. the conforming version can be reached from the non-conforming version via an accessibility-supported mechanism, or
      2. the non-conforming version can only be reached from the conforming version, or
      3. the non-conforming version can only be reached from a conforming page that also provides a mechanism to reach the conforming version.

Test Method Rationale

An alternate version must meet all parts of the definition in order to be considered a “conforming alternate version.”

Limitations, Assumptions, or Exceptions

  • Notes from the Conforming Alternate Version definition:
    • Note 1: In this definition, “can only be reached” means that there is some mechanism, such as a conditional redirect, that prevents a user from “reaching” (loading) the non-conforming page unless the user had just come from the conforming version.
    • Note 2: The alternate version does not need to be matched page for page with the original (e.g., the conforming alternate version may consist of multiple pages).
    • Note 3: If multiple language versions are available, then conforming alternate versions are required for each language offered.
    • Note 4: Alternate versions may be provided to accommodate different technology environments or user groups. Each version should be as conformant as possible. One version would need to be fully conformant in order to meet conformance requirement 1.
    • Note 5: The conforming alternative version does not need to reside within the scope of conformance, or even on the same Web site, as long as it is as freely available as the non-conforming version.
    • Note 6: Alternate versions should not be confused with supplementary content, which support the original page and enhance comprehension.
    • Note 7: Setting user preferences within the content to produce a conforming version is an acceptable mechanism for reaching another version as long as the method used to set the preferences is accessibility supported.
  • Per WCAG 2.2 Understanding Conforming Alternate Versions, authors relying on conforming alternate versions must make end users aware that a conforming alternate version is available. This may be accomplished by providing a link to a more accessible version, identified clearly by link text. Alternatively, a link to instructions may be provided which documents how to access a more accessible version as well as the specific ways the alternate version is more accessible (e.g. a “high contrast version”).
  • It is not a WCAG requirement to provide a conforming alternate version. This test only checks that a conforming alternate version is present. If there is not a conforming alternate version, the result for this baseline test is “Does Not Apply” (it would not be a failure).
  • To meet Conformance Requirement 1 for Level AA conformance, the document satisfies all the Level A and Level AA Success Criteria, or a Level AA conforming alternate version is provided.

20.A Test Procedure for Conforming Alternate Version

Baseline Test ID: 20.A-ConformingAltVersion

Identify Content

Multiple versions of the same content.

Test Instructions

  1. Check that the alternate version provides all of the same information and functionality in the same human language as the original. [CAV]
  2. Check that the alternate version is as up to date as the non-conforming content. [CAV]
  3. Check that the alternate version passes all other applicable baseline tests. [CAV]
  4. Check that at least one of the following is true: [CAV]
    1. The conforming alternate version can be reached from the non-conforming version via an accessibility-supported mechanism, or
    2. The non-conforming version can only be reached from the alternate version, or
    3. The non-conforming version can only be reached from a conforming version that also provides a mechanism to reach the alternate version.
  5. Check that the content indicates that a conforming alternate version is available. [CAV]
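
For illustration only, a minimal sketch assuming HTML-based content with a placeholder file name: checks 4 and 5 are commonly satisfied by a clearly labeled link from the non-conforming version to the conforming alternate version.

    <p><a href="annual-report-accessible.html">View the accessible version of this report</a></p>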

Test Results

If any of the above tests fail, a Conforming Alternate Version does not exist and Baseline Requirement 20.A-ConformingAltVersion DOES NOT APPLY.

Advisory: Tips for Streamlined Test Processes

  • When a conforming alternate version is provided, non-conforming versions of that content are tested only for Conformance Requirement 5. It is not necessary to test the non-conforming versions of that content for other baseline tests.
  • The presence of a conforming alternate version can determine whether other versions of the content need to be tested. To save on time and effort, it is advised that this be one of the first tests performed.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

21. Timed Events

Accessibility Requirements

  • WCAG SC 1.4.2 Audio Control – If any audio in a document plays automatically for more than 3 seconds, either a mechanism is available to pause or stop the audio, or a mechanism is available to control audio volume independently from the overall system volume level.

  • WCAG SC 2.2.1 Timing Adjustable – For each time limit that is set by the content, at least one of the following is true:

    • Turn off: The user is allowed to turn off the time limit before encountering it.
    • Adjust: The user is allowed to adjust the time limit before encountering it over a wide range that is at least ten times the length of the default setting.
    • Extend: The user is warned before time expires and given at least 20 seconds to extend the time limit with a simple action (for example, “press the space bar”), and the user is allowed to extend the time limit at least ten times.
  • WCAG SC 2.2.2 Pause, Stop, Hide – For moving, blinking, scrolling, or auto-updating information, all of the following are true:

    • Moving, blinking, scrolling: For any moving, blinking, or scrolling information that (1) starts automatically, (2) lasts more than five seconds, and (3) is presented in parallel with other content, there is a mechanism for the user to pause, stop, or hide it unless the movement, blinking, or scrolling is part of an activity where it is essential.
    • Auto-updating: For any auto-updating information that (1) starts automatically and (2) is presented in parallel with other content, there is a mechanism for the user to pause, stop, or hide it or to control the frequency of the update unless the auto-updating is part of an activity where it is essential.
  • Conformance Requirement 5: Non-Interference - The following success criteria apply to all content in a document, including content that is not otherwise relied upon to meet conformance, because failure to meet them could interfere with any use of the page: 1.4.2 - Audio Control, 2.1.2 - No Keyboard Trap, 2.3.1 - Three Flashes or Below Threshold, and 2.2.2 - Pause, Stop, Hide.

Test Method Rationale

Determine how time limits, auto-play, and auto-update can be modified by a user and execute the modifications.

Limitations, Assumptions, or Exceptions

  • From SC 2.2.1: Timing Adjustable, time limits set by the content that meet any of the following are not included in this test:
    • Real-time Exception: The time limit is a required part of a real-time event (for example, an auction), and no alternative to the time limit is possible; or
    • Essential Exception: The time limit is essential and extending it would invalidate the activity; or
    • 20 Hour Exception: The time limit is longer than 20 hours.
    • Content that repeats or is synchronized with other content, so long as the information and data is adjustable or otherwise under the control of the end user. Examples of time limits for which this success criterion is not applicable include scrolling text that repeats, captioning, and carousels. These are situations that do include time limits, but the content is still available to the user because there are controls for accessing it.
  • Changing content is considered to be “in parallel” when it appears alongside other content. For example, a news flash updating across the bottom of a page would be considered changing content in parallel with other content when the page also presents a news video and text news articles (both examples of static content). A button allowing users to pause the changing content would not be considered other static content.
  • Moving, blinking, scrolling, and/or auto-updating is considered “essential” to an activity when, if removed, it would fundamentally change the information or functionality of the content, and information and functionality cannot be achieved in another way that would conform.
  • Notes from SC 2.2.2 Pause, Stop, Hide:
    • Note 1: For requirements related to flickering or flashing content, refer to Guideline 2.3.
    • Note 2: Since any content that does not meet this success criterion can interfere with a user’s ability to use the whole document, all content in the document (whether it is used to meet other success criteria or not) must meet this success criterion. See Conformance Requirement 5: Non-Interference.
    • Note 3: Content that is updated periodically by software or that is streamed to the user agent is not required to preserve or present information that is generated or received between the initiation of the pause and resuming presentation, as this may not be technically possible, and in many situations could be misleading to do so.
    • Note 4: An animation that occurs as part of a preload phase or similar situation can be considered essential if interaction cannot occur during that phase for all users and if not indicating progress could confuse users or cause them to think that content was frozen or broken.
  • Note from SC 1.4.2 Audio Control:
    • Note 1: Since any content that does not meet this success criterion can interfere with a user’s ability to use the whole document, all content in the document (whether or not it is used to meet other success criteria) must meet this success criterion. See Conformance Requirement 5: Non-Interference.
  • Per WCAG 2.2 Understanding SC 1.4.2: Audio Control, having control of the volume includes being able to reduce its volume to zero. Muting the system volume is not “pausing or stopping” the autoplay audio. Both the “pause or stop” and control of audio volume need to be independent of the overall system volume.

21.A Test Procedure for Timing Adjustable

Baseline Test ID: 21.A-TimingAdjustable

Identify Content

Identify any instances of content time limits (excluding exceptions described above).

Test Instructions

  1. For each instance of an identified time limit for content, check that at least one of the following is true before time expires [SC 2.2.1]:
    1. The user has the ability to turn off the time limit.
    2. The user has the ability to adjust the time limit before encountering it over a wide range that is at least ten times the length of the default setting.
    3. The user is warned before time expires AND:
      1. Given at least 20 seconds to extend the time limit with a simple action (e.g., "press the space bar"), AND
      2. Allowed to extend the time limit at least ten times.

Test Results

If the above check fails, then Baseline Test 21.A-TimingAdjustable fails.

21.B Test Procedure for Moving Information

Baseline Test ID: 21.B-MovingInfo

Identify Content

Any moving, blinking, or scrolling information that meets ALL of the following:

  • Starts automatically, AND
  • Lasts more than 5 seconds, AND
  • Is presented in parallel with other content, AND
  • Moving, blinking, scrolling is not essential

Test Instructions

  1. Check that there is a mechanism for the user to pause, stop, or hide it [SC 2.2.2].

Test Results

If the above check fails, then Baseline Test 21.B-MovingInfo fails.

21.C Test Procedure for Auto-updating Information

Baseline Test ID: 21.C-AutoUpdate

Identify Content

Any auto-updating information that meets ALL of the following:

  • Starts automatically, AND
  • Is presented in parallel with other content, AND
  • Is not part of an activity where it is essential

Test Instructions

  1. Check that there is a mechanism for the user to pause, stop, or hide it or to control the frequency of the update [SC 2.2.2].

Test Results

If the above check fails, then Baseline Test 21.C-AutoUpdate fails.

21.D Test Procedure for Audio Control

Baseline Test ID: 21.D-AudioControl

Identify Content

Audio that automatically plays for more than 3 seconds.

Test Instructions

  1. Check that either [SC 1.4.2]:
    1. A mechanism is available at the beginning of the document content or in platform accessibility features to pause or stop the audio that is independent of the overall system volume, OR
    2. A mechanism is available at the beginning of the document content or in platform accessibility features to control audio volume independently from the overall system volume level.
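
For illustration only, a minimal sketch assuming HTML-based content with a placeholder file name: audio that does not start automatically avoids this check entirely, and if audio does autoplay, the player's own pause and volume controls (independent of the overall system volume) can satisfy it.

    <audio controls src="welcome-message.mp3"></audio>
    <!-- no autoplay attribute: the user starts playback and can pause it
         or adjust its volume with the player's own controls -->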

Test Results

If the above check fails, then Baseline Test 21.D-AudioControl fails.

Advisory: Tips for Streamlined Test Processes

  • Remind testers that when the time-out occurs, visible focus should shift to the time-out alert to comply with success criteria for keyboard accessibility and focus order.
  • In some cases, it may be necessary to contact the application authors to clarify the conditions under which time-outs occur.
  • A failure of SC 1.4.2 or 2.2.2 would also fail Conformance Requirement 5: Non-Interference and should be highlighted in test reports to indicate the severe impact on accessibility.
  • Media players must be configured to disable autoplay of audio prior to testing of content. Provide instructions for conformant player mechanisms only. Test results may vary depending on the media player used.
  • Content that is found non-conformant with SC 2.2.2 may be marked for further review for a Section 508 exception if the auto-update is essential. However, an exception for SC 2.2.2 should be considered carefully as Conformance Requirement 5: Non-Interference requires its conformance.

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

22. Resize Text

Accessibility Requirements

  • WCAG SC 1.4.4 Resize text – Except for captions and images of text, text can be resized without assistive technology up to 200 percent without loss of content or functionality.

Test Method Rationale

This baseline test requires an evaluation of visual content and functionality after text has been resized.

Limitations, Assumptions, or Exceptions

  • Exception: captions and images of text are not included in the test.

22.A Test Procedure for Resize Text

Baseline Test ID: 22.A-ResizeText

Identify Content

All text on a page.

Test Instructions

  1. Check that there is a mechanism to resize, scale, or zoom in on the content at least to 200% of original size. [SC 1.4.4]
    Known approaches include:
    • Browser zoom function or text-sizing feature
    • Accessibility features provided by the platform or Operating System
    • On-page controls to change text size.
  2. Modify the font size to 200% of its original size (twice the original width and height).
  3. Check for all of the following [SC 1.4.4]:
    • Text is not clipped, truncated, or obscured
    • Text entered in text-based form controls resizes fully
    • All functionality is available
    • All content is available
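
For illustration only, a minimal sketch assuming HTML-based content: text sized in relative units generally reflows and remains readable at 200%, while text inside a fixed-height container with hidden overflow is a common cause of clipping.

    <p style="font-size: 1em;">This paragraph scales with the user's zoom or text-size setting.</p>
    <div style="height: 20px; overflow: hidden;">Text in this fixed-height box may be clipped at 200%.</div>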

Test Results

If any of the above checks fail, then Baseline Test 22.A-ResizeText fails.

Advisory: Tips for Streamlined Test Processes

None

WCAG 2.2 Techniques

The following sufficient techniques and/or common failures were considered when developing this test procedure for this baseline requirement:

23. Multiple Ways — Not Applicable to Documents

No Accessibility Requirements

This baseline test covers a WCAG Success Criterion that Section 508 does not apply to non-web documents. It is from the ICT Testing Baseline for Web and was retained to maintain harmonization.

Per Section 508 E205.4, the accessibility standard for electronic content, non-web documents are not required to conform to the following success criteria:

  • WCAG SC 2.4.5 Multiple Ways — More than one way is available to locate a Web page within a set of Web pages, except where the Web page is the result of, or a step in, a process.

24. Parsing

Accessibility Requirements

  • WCAG SC 4.1.1 Parsing – In content implemented using markup languages, elements have complete start and end tags, elements are nested according to their specifications, elements do not contain duplicate attributes, and any IDs are unique, except where the specifications allow these features.

Test Method Rationale

  • WCAG 2.2 has deprecated SC 4.1.1 Parsing as it no longer has utility because accessibility errors due to assistive technology directly parsing HTML no longer exist or are addressed in other criteria.
  • Section 508 is not directly affected by WCAG 2.2 as it incorporates by reference WCAG 2.0 Level A and AA, W3C Recommendation, December 11, 2008. SC 4.1.1 Parsing is not deprecated in WCAG 2.0, and the criterion is a Section 508 requirement. However, this Baseline test will incorporate the WCAG 2.0 Errata which states “This criterion should be considered as always satisfied for any content using HTML or XML.”

Limitations, Assumptions, or Exceptions

  • From WCAG 2.0 Errata: Success Criterion 4.1.1 was originally adopted to address problems that assistive technology had directly parsing HTML. Since this criterion was written, the HTML Standard has adopted specific requirements governing how user agents must handle incomplete tags, incorrect element nesting, duplicate attributes, and non-unique IDs. Although the HTML Standard treats some of these cases as non-conforming for authors, it is considered to “allow these features” for the purposes of this Success Criterion because the specification requires that user agents support handling these cases consistently. In practice, this criterion no longer provides any benefit to people with disabilities in itself. Issues such as missing roles due to inappropriately nested elements or incorrect states or names due to a duplicate ID are covered by different Success Criteria and should be reported under those criteria rather than as issues with 4.1.1.

24.A Test Procedure for Parsing

Baseline Test ID: 24.A-Parsing

Identify Content

All document content

Test Instructions

No testing necessary

Test Results

Baseline Test 24.A-Parsing passes.

Advisory: Tips for Streamlined Test Processes

None

WCAG 2.2 Techniques

While SC 4.1.1 has been deprecated in WCAG 2.2, the following sufficient techniques are listed for reference: