81 notes in guidance statements:
Note: There are many types of visual cues that give a 3D perception, some of which may not be as problematic as others. Some example characteristics that contribute to depth perception:
- Binocular disparity (differences between views of left and right eyes)
- Binocular convergence (angle of two eyes inwards with closer objects)
- Motion parallax
- Overlapping objects, occlusion
- Patterns of lighting and shading
- Perceived relative sizes of objects (things that are smaller may appear farther away)
- Texture gradient (fine detail can be viewed on close rather than distant objects)
- Linear perspective (otherwise parallel lines that appear to converge)
Blind people and wheelchair users share this need, though "accessible to me" means something different for each.
For stationary or installed systems, people may need
Note: This could apply to all people using audio output where the listening environment is noisy.
Note: This could apply to the main audio channel or could apply to individual apps, media, and other audio channels.
This includes both the alternative versions from the author's perspective and the need to allow Assistive Technology to present alternatives.
Alternative to auditory alerts
Split apart from original. May still need to address "Form and language" in first item.
Unsure about functional ability
Split apart from original. May still need to address "Form and language" in first item.
Split apart from original. May still need to address "Form and language" in first item.
Note: Brightness may need to be increased (if insufficient) or decreased (if excessive and causing sensitivity, glare, etc.).
This might include changes to characteristics such as frequency, speed, speech rate, voice, stop/start, and type of content (such as sounds, speech, signals, alarms).
Partial for 1.4.2. The other part (independent control of Assistive Technology) is under audio-control.
Screen reader users may need to change the volume or turn it off.
Partial for 1.4.2. The other part (independent control of Assistive Technology) is under audio-characteristics.
Should this just be time-based content?
Should this be combined with avoid-interruptions?
Too broad to fit
Is this avoid or adjust?
Too broad to fit?
See avoid tactile distractions, avoid visual distractions, etc.
Some examples:
- Screen reader users may be derailed when something unexpected happens, especially if it is a new window.
- The appearance of a pop-up or dialog may be missed by somebody with low vision.
- Some people with limited attention may lose track of their primary task when interrupted.
Why is haptics the only overwhelm that includes "prevent them from completing a tactile task"?
Note: Some of these issues may be due to motion or due to bold static patterns.
Should this and other consistency include fine motor control to support voice interactions?
Should this be interactive content?
Should this tie in with other disabilities?
Content that can interrupt focus must be identified and allow users to control interaction with it (distractions, interruptions).
For some people, these distractions can include advertisements, side tasks, and pop-ups (including GDPR dialogs).
Is this too broad? See avoid-interruptions, avoid-olfactory-distractions, avoid-tactile-distractions, avoid-visual-distractions
Description of operable parts layout
For example:
- Location of keypads & keyboards
- Arrangement of number pad (ascending vs. descending)
- Location of scanner or scanning area
- Location of ticket, receipt, or other dispenser
Consider if this or another provision covers the physical location where the user must go next.
Distinguish auditory components
Description from ISO parent user need statement 5-4-3, "To have accessibility features not interfere with perception of standard information":
This need focuses on the perception of displayed content and required interactions not being hindered by accessibility features. Where accessibility features are not built into the way that a system displays its content, it is much harder for users to manipulate the interface between the displayed content and the accessibility features. This can result in interference between the two, seriously degrading the user experience.
Distinguish auditory components
Examples:
- Audio descriptions happening in gaps in dialog
- Audio descriptions made with a voice distinct from other content
- Background music can be turned off or reduced in volume so it does not interfere
Partial 4.2.5 Making Content Usable
Examples:
- Disabled buttons gray out with a contrast that is sufficiently different from the enabled state.
- A screen reader announces actionable elements, such as buttons and links (including if they are in a disabled state) because of good markup.
- Actionable buttons are backlit with an LED.
- An error tone is given immediately when a person tries to type text in a field that does not currently accept text input.
- Fields that are not available in the current context are hidden, so there is no need to distinguish between actionable and non-actionable components.
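The "sufficiently different contrast" idea in the first example can be checked with the WCAG contrast-ratio formula. Below is a minimal sketch in Python; the two button colors are hypothetical, chosen only to illustrate the calculation.

```python
# Sketch: checking whether a disabled (grayed-out) button state is
# visually distinguishable from the enabled state, using the WCAG
# relative-luminance and contrast-ratio formulas.
# The example colors are hypothetical.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color (0-255 channels), per WCAG."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    """WCAG contrast ratio between two colors, ranging from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(rgb1), relative_luminance(rgb2)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

enabled = (0, 102, 204)     # hypothetical enabled-button blue
disabled = (204, 204, 204)  # hypothetical disabled-button gray
print(round(contrast_ratio(enabled, disabled), 2))
```

What counts as "sufficiently different" for a state change is a drafting question; the formula only supplies the measurement.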
Distinguish tactile components
Examples:
- Vibration patterns used for different alerts are sufficiently different from each other
Examples:
- Important content on a video that is not covered up when captions are turned on
- Dialogs that do not overlap
- Tooltip-like interactions that can be dismissed so that content underneath can be perceived and read
Should we add interactive content?
Consider "chromaticity" instead of "color" if chromaticity is what we really mean (since "color" covers many aspects of what is perceived).
Is there an adjust?
We may need a characteristic for content that can trigger mental health concerns, e.g., the news
Examples:
- A person may be distracted by ASL that is available in a meeting and may need to hide it.
- A person may be distracted by captions on a video and need a way to hide them.
Guidelines:
- ISO/IEC 29138-1:2018 11-3-c: to be able to temporarily hide specific accessibility functions
3.2.6 Consistent Help (partially)
Too broad to fit?
For example:
- Blind people and people with low vision may have difficulty with pointing input and instead want to use a keyboard or simple gestures
- Deaf people, people who are hard of hearing, and people with speech limitations may have difficulty providing spoken commands and instead want to use a keyboard or other input mechanism.
- A person with limited dexterity may wish to use speech commands or a keyboard or other buttons instead of using a pointing device or touchscreen.
Specific boundaries need to be set based on research
Manage individualization features
For example:
- Turn on/off color shift mode for color vision deficiencies
- Turn on/off magnifier
- Turn on/off highlighting words as text is read
- Turn on/off onscreen keyboard
Manage individualization functions
For example:
- Get a text description of an image from AI
- Change speech rate of a screen reader or text-to-speech output
- Play, pause, and replay text-to-speech for content (?)
- Translate or simplify content into something easier to understand
Overlap with meaningful sequence
Overlap with focus order.
There may be many reasons people need to be able to mute the audio output:
- Because it is distracting or overwhelming
- Because it conflicts with audio from another task the user is working on
- Because a user wants custom music or audio
- Because it conflicts with audio from assistive technology
- Because the user needs to remain silent in the current environment or circumstances
No accessibility feature override
For example:
- Web content does not block browser zoom functionality (or an alternate is available?)
- Web content does not block screen orientation changes
- Applications & web content allow for high contrast themes to be applied
- Applications/content respect preferences for reduced motion(?)
Note that electromagnetic hypersensitivity (EHS) is not a recognized medical diagnosis, although there are some disability grants in some countries for shielding/abatement.
Hazardous patterns are specific to a person and may also be influenced by viewing environment and psychological state of the viewer (e.g., stress, fatigue, etc.).
The patterns may be...
- Patterns in time, such as flashing or flickering.
- Patterns in space, such as high contrast parallel lines, waves, or concentric circles.
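For patterns in time, WCAG's general flash threshold (no more than three flashes in any one-second period) gives one concrete test. The sketch below assumes video frames have already been reduced to a per-frame bright/dark boolean, which glosses over the luminance and viewport-area conditions of the real threshold.

```python
# Sketch: flagging flashing patterns in time. A "flash" is counted here
# as a pair of opposing changes (e.g., dark -> bright -> dark), and the
# WCAG general flash threshold allows at most three flashes in any
# one-second period. Input is a simplified bright/dark boolean per frame.

def count_transitions(states):
    """Number of state changes in a sequence of booleans."""
    return sum(1 for a, b in zip(states, states[1:]) if a != b)

def exceeds_flash_threshold(states, fps, max_flashes=3):
    """True if any one-second window contains more than max_flashes flashes."""
    window = fps  # frames in a one-second window
    for start in range(max(1, len(states) - window + 1)):
        flashes = count_transitions(states[start:start + window]) // 2
        if flashes > max_flashes:
            return True
    return False

# 30 fps strobe alternating every frame: far more than 3 flashes/second
strobe = [i % 2 == 0 for i in range(60)]
steady = [True] * 60
print(exceeds_flash_threshold(strobe, fps=30))  # True
print(exceeds_flash_threshold(steady, fps=30))  # False
```

The note's point stands that the hazard also depends on the person and viewing conditions; a fixed threshold like this is a floor, not a guarantee of safety.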
Note: Consider "chromaticity" instead of "color" if chromaticity is what we really mean (since "color" covers many aspects of what is perceived).
- 1.4.1 Use of color
- Underlines for hyperlinks
Delete? See no limited vision instruction references
Notifications in auditory alternatives
See auditory equivalents
Obrist, M., Tuch, A. N., & Hornbaek, K. (2014, April). Opportunities for odor: experiences with smell and implications for technology. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2843-2852).
Too broad to fit?
For example:
- Blind people and people with low vision may have difficulty with visual output and instead want to use audio output.
- Deaf people and people who are hard of hearing may have difficulty hearing sound effects and want the screen or a portion of the screen to flash when one occurs.
- Those who have difficulty hearing and understanding speech may wish to have captions or other visual indications of speech.
There are also provisions for clear path and approach for mobility devices in the ADAAG and EN 301 549.
Examples where people may need to have privacy protection:
- A blind person may wish to blank a screen on a kiosk to prevent "shoulder surfing"
- A person with limited mobility may not be able to effectively shield PIN input on a pinpad mounted at the edge of their reach range, as others might do in similar situations.
- A person using a speaker in public might not know that information on a screen has private information and that it will potentially be read aloud.
- A person might not know that some pieces of information can cause a serious privacy risk if others were to get it.
- A person might not know that the default option for a post is public instead of other, more private levels of access.
This is the aria-password tussle wrapped up into a guidance statement.
Programmatic structure and relationships
Some overlap with titles. Maybe break into titles, labels, links, and headings. Also overlap with interactive equivalent.
About identity
Same as adjustable-parts?
For example:
- This is particularly important to be done automatically on public devices (reset volume, zoom, etc. at end of session)
There are also provisions for approach and clearance for wheelchairs, knees, and toes in the ADAAG and EN 301 549.
Examples:
- A person might want to turn on the audio channel of audio descriptions for a movie
- Turning off some audio track may reduce background and other noise that interferes with use
- This functionality is also useful for various language tracks that might be available
Separate identification & activation
Note: This seems to be a need that everybody has, but systems are designed so controls are mostly identifiable to people without disabilities. Where labels are incorrect, or where controls are physical hardware, it might be harder for blind people to identify them.
Separate selection & activation
Some potential examples:
- Somebody with tremor or low accuracy may find it better to select and confirm in two separate steps rather than, say, just using a mouse or touchscreen, where selection and activation are essentially the same motion.
- This can be important for everybody at steps where mistakes are likely to occur and the consequences of a mistake are larger.
Note: This is in the "Perceive" section in the ISO guidelines, but seems to (also) be related to operation.
Note: This may need to be broken up into more granular needs.
Note: This may need to be broken up into more granular needs.
Speech with visual highlighting
ISO/IEC 29138-1:2018 2-4-f: to have speech support with synchronized highlighting so a user can follow with rapid feedback
Other provisions
- EN 301 549: 5.6.1 Tactile or auditory status
- Where ICT has a locking or toggle control and that control is visually presented to the user, the ICT shall provide at least one mode of operation where the status of the control can be determined either through touch or sound without operating the control.
- EN 301 549: 5.6.2: Visual status
- Where ICT has a locking or toggle control and the control is non-visually presented to the user, the ICT shall provide at least one mode of operation where the status of the control can be visually determined when the control is presented.
- Section 508: 409 Status Indicators
- 409.1 General: Where provided, status indicators shall be discernible visually and by touch or sound.
Clarity here is often a function of the edge radius or "sharpness" of a tactile feature (and height to a lesser extent).
Rewrite from the point of view of what it should be set to?
Adjustable timeout may need its own category
As examples...
- A person might want to turn off the vibration features on a game controller
- A person might not want to be distracted by vibration notifications on a smartphone or wearable
For example, a person might want to turn off the screen on an ATM so that others who might be nearby cannot see information on the screen that the user wants to keep private.
Unchanging button functionality
For example:
- Label changes to soft keys can be missed by users
- Common shortcut keystrokes vary by system mode or context
Maybe not limited or no hearing?
Maybe not limited or no hearing?
We have been including feedback in help and instructions but should it be here?
For systems with visual content, people may need that content presented programmatically or need equivalent auditory and tactile versions?
See audio equivalents
This could be broken up but since different standards combine them in different ways and they are interrelated it was easier to keep all of it together.
Specific boundaries need to be set based on research
Interrelated with magnification
There are also provisions for reach ranges in the ADAAG and EN 301 549.
Consider adding "short stature" as a functional need?
Related to adjustable parts