
In the realm of the senses

The meeting-room door may be marked in Braille, but how does a blind person find their way there from the foyer?

Lighthouse, one of the world's leading organisations providing help and rehabilitation for blind and partially sighted people, wanted its new headquarters in New York to be a model of accessibility. It asked Roger Whitehouse and Company, my practice, to develop a wayfinding system for the building.

The Americans with Disabilities Act (ADA) requires identification in raised letters and Braille outside every room - but how does a blind or low-vision user find these signs? One solution, the tactile map, has most commonly been used for exterior environments. Fortunately we were able to involve visually impaired users of the Lighthouse's day-to-day services in the design process by making working mock-ups and setting up realistic design-evaluation sessions.

How a blind person creates a cognitive map of the environment is interesting, and appears to differ greatly from one individual to another. It also depends on whether the person has been blind from birth or lost their sight through disease or accident. Many blind people experience the environment sequentially rather than in a more complex and interrelated spatial manner.

We started with what was basically an architectural plan, raised in three dimensions, showing all rooms and corridors. In tests users took a long time to read all the information on such a map, so we simplified it to show only the circulation spaces, with destinations indicated as doors marked in raised numbers and Braille.

To simplify the information even further, we aimed to provide sequential rather than spatial information in a 'route and event' diagram. This was a map indicating the route to a particular destination, shown as a dotted line, with 'events' along it, such as doors, lifts and restrooms, indicated by symbols.
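To make the idea concrete, a route-and-event diagram is essentially an ordered list of landmarks along a path. The sketch below models one in Python; the class names, fields and the sample route are our illustration, not part of the Lighthouse system.

    # Illustrative sketch only: one way to model a 'route and event'
    # diagram in software. Names and fields are hypothetical.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Event:
        kind: str          # e.g. 'door', 'lift', 'restroom'
        label: str         # raised-letter/Braille label, e.g. 'Room 204'
        distance_m: float  # distance along the route from its start

    @dataclass
    class Route:
        destination: str
        events: List[Event]  # in the order a walker encounters them

        def narrate(self) -> List[str]:
            # Sequential description, matching how many blind users
            # experience a building: one event after another.
            return [f"{e.distance_m:.0f} m: {e.kind} ({e.label})"
                    for e in self.events]

    # Hypothetical route from the foyer to a meeting room
    to_meeting = Route("Meeting Room 204", [
        Event("lift", "Lift bank", 12.0),
        Event("restroom", "Restrooms", 25.0),
        Event("door", "Room 204", 31.0),
    ])
    for line in to_meeting.narrate():
        print(line)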

'You are here' arrows on the maps took the typical user about 10 minutes to find, so a 'super-tactile pip' was developed. This sits higher than the rest of the tactile text, so people can scan the map with their hands and discover the pip quickly.

It was important for these maps to be both visual and tactile, so that a sighted user and a visually impaired one could study and discuss them together.

Many visually impaired users rely on canes to navigate, counting doors as they go, so we differentiated doors from closets and mechanical spaces by showing them as closed rather than open.

In an attempt to create a kind of universal wayfinding shorthand on the maps, and on the signs, we tested the use of colour coding and shape symbols for the three key destinations found in all buildings: lifts (a green open square), emergency exits (red bars - red is the colour code required in the US) and restrooms (blue circles for women and blue triangles for men). We found that the number of symbols that could be discriminated successfully was low: most users, for instance, found it difficult to distinguish between a solid triangle and a solid square.
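As a rough sketch, the shorthand we tested amounts to a small lookup table pairing each key destination with a shape and a colour. The encoding below follows the scheme described above; the structure and function names are hypothetical.

    # Hypothetical encoding of the three key destinations tested at
    # the Lighthouse; the values follow the article, the code is ours.
    SYMBOLS = {
        "lift":           {"shape": "open square", "colour": "green"},
        "emergency_exit": {"shape": "bar",         "colour": "red"},  # red is required in the US
        "restroom_women": {"shape": "circle",      "colour": "blue"},
        "restroom_men":   {"shape": "triangle",    "colour": "blue"},
    }

    def describe(destination: str) -> str:
        s = SYMBOLS[destination]
        return f"{s['colour']} {s['shape']}"

    print(describe("lift"))  # -> 'green open square'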

The maps are mounted at a 30-degree angle on the entrance desk and on smaller desks on each floor, always in the same location and orientation in relation to the lifts.

We also tested a speaking map: if any part of the map is touched, it tells the user what that element is. Users can ask the map for a guided tour of the building, or simply have it explain how it works. Results so far have been very successful.
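In outline, a speaking map pairs touch coordinates with named regions and speaks whichever one is hit. The sketch below is a minimal illustration only: the regions, coordinates and the speak() stub are invented, not the Lighthouse hardware.

    # Minimal sketch of a speaking map: touch coordinates map to
    # named regions, and each hit is spoken aloud. All names and
    # coordinates here are hypothetical.
    REGIONS = {
        # (x0, y0, x1, y1) in map millimetres -> spoken description
        (0, 0, 80, 40):    "Entrance lobby",
        (80, 0, 120, 40):  "Lift bank",
        (120, 0, 200, 40): "Corridor to meeting rooms",
    }

    def speak(text: str) -> None:
        print(f"[speech] {text}")  # stand-in for a text-to-speech call

    def on_touch(x: float, y: float) -> None:
        for (x0, y0, x1, y1), description in REGIONS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                speak(description)
                return
        speak("No map element here")

    on_touch(90, 20)  # -> '[speech] Lift bank'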

Another area of exploration was how room signs could best be designed to suit the widest spectrum of users. Early experiments suggested advantages in separating tactile information from visual: tests showed that all caps worked best for tactile letterforms, while upper and lower case was best for visual use.

To find the most legible typeface for tactile use, we started by examining Helvetica Medium and Times Roman, the most commonly used typefaces on tactile signs in the US. We found Helvetica too thick to distinguish easily by touch and the serifs on Times Roman too confusing. The experiment found that it took almost twice as long to read these typefaces as to read VAG Rounded: a typical sign message took three or four minutes to understand, as opposed to one or two minutes.

It became apparent that an important factor for reading text with the fingers is that the outline must be distinct, and people must be able to trace the outline of the character. We designed a special typeface called Haptic to be easier and more economical to read with the finger. One consideration was that several characters are often mistaken for each other: many blind users transpose shapes vertically, confusing an A with the numeral 4 or an M with a W, and zero is easily mistaken for the letter O.

Spacing between letters is of critical importance, and needs to be much wider than we are comfortable with for visual letterforms. It is important to understand that tactile letter spacing is absolute, not proportional. There should be a consistent 3.5-5mm between the two closest parts of adjacent letters, although this looks hideous. There should also be about 10mm between tactile lettering and any other raised elements such as borders.
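The distinction between absolute and proportional spacing is easy to make concrete: the physical gap stays fixed whatever the letter widths or point size. The sketch below lays out tactile letters this way; the 3.5-5mm gap and 10mm border clearance are the figures given above, while the layout function and sample widths are purely illustrative.

    # Sketch of absolute (not proportional) tactile letter spacing.
    # The 4mm gap and 10mm border clearance reflect the figures in
    # the article; the layout function itself is our illustration.
    GAP_MM = 4.0      # fixed gap between closest parts of adjacent letters
    BORDER_MM = 10.0  # clearance between lettering and raised borders

    def layout(widths_mm: list) -> list:
        """Return the left x-position of each letter, in millimetres."""
        positions, x = [], BORDER_MM
        for w in widths_mm:
            positions.append(x)
            x += w + GAP_MM  # absolute gap, regardless of point size
        return positions

    # Letter widths for a short tactile word (hypothetical values)
    print(layout([6.0, 5.0, 6.5]))  # -> [10.0, 20.0, 29.0]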

We tested a range of cross-sections of the letterforms and discovered that a thin raised line with a softened but still very distinctive edge was ideal. For example, photopolymer vinyl sheet, which is about 1mm thick, is a rather good tactile material that gives a very crisp image.

The separation of the visual part of the sign from the tactile part enabled us to use upper and lower case for visual components and capital letters for the tactile plaques. An additional advantage of separating the visual elements from the tactile is that it reduces the area, and therefore the cost, of the typically expensive tactile components.

Testing again showed that tactile reading of signage placed flat against a wall can be uncomfortable. This led us to the notion of placing tactile elements on a 45-degree ledge. People who tested this said it was useful, as they could use their thumbs as an 'index' to position their hands while reading the sign. The ledge also lets them reach out and read the sign in a much more natural and comfortable position than flat against the wall.

A system called Talking Signs, consisting of infrared transmitters mounted on the underside of the tactile ledge, transmits the function of the room to a small hand-held receiver that users carry with them. This gives visually impaired users the ability to scan their surroundings and find out what is around them.
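In outline, such a system pairs fixed transmitters, each repeating a short message naming its room, with a receiver that decodes whichever beam it is pointed at. The sketch below is a loose illustration: the message format and class names are invented, not the actual Talking Signs protocol.

    # Rough sketch of a Talking Signs-style exchange. Each fixed
    # infrared transmitter repeats a short message naming its room;
    # a hand-held receiver decodes whichever beam it is pointed at.
    # Message format and classes are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Transmitter:
        room: str
        def beam(self) -> str:
            return f"SIGN:{self.room}"  # repeated continuously over IR

    class Receiver:
        def scan(self, beam: str) -> str:
            # Decode and announce the room the user is pointing at
            if beam.startswith("SIGN:"):
                return beam[len("SIGN:"):]
            return "unreadable signal"

    tx = Transmitter("Conference Room A")
    rx = Receiver()
    print(rx.scan(tx.beam()))  # -> 'Conference Room A'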

Although it was not eventually installed in the Lighthouse, we extended the notion of tactile ledges into a continuous element, dubbed the 'Braille rail', that could run through the building and carry both directional and room-identification information. We also devised what we call a 'low-acuity' arrow, which gives directional information even if you see it only as a blur, because the colour intensity increases towards the head.
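The low-acuity arrow depends on a simple intensity ramp: even when the outline blurs, the gradient still reads as pointing at the darker end. A minimal sketch, assuming a linear ramp (the actual curve used is not specified here):

    # Minimal sketch of a 'low acuity' arrow's intensity ramp:
    # colour intensity rises from tail to head, so even a blurred
    # view reads as a gradient pointing at the darker end. The
    # linear ramp and its endpoints are our assumption.
    def intensity(t: float, tail: float = 0.2, head: float = 1.0) -> float:
        """Intensity at position t along the arrow, 0 = tail, 1 = head."""
        return tail + (head - tail) * t

    for t in (0.0, 0.5, 1.0):
        print(f"t={t:.1f}: intensity={intensity(t):.2f}")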

After the project had been completed, we became aware of other methods for universal wayfinding that we would like to explore, such as audio-visual landmarks. For instance, a tall-case clock or a miniature waterfall would be visually distinctive for those who could see, and audible for those who could not - although it would be important that the sound be restful and unobtrusive.

Roger Whitehouse is an architect specialising in designing signage and wayfinding systems. For further information contact the UK Sign Design Society on 01582 713556.
