Rebecca Riordan’s article in this month’s issue, Improving Data Entry Feedback with Sound, made me think of other places where sound is used effectively. Sound is effective in those situations where you can’t see what’s going on. Parents and spouses are the most familiar with these conditions. As a parent, I can’t have eyes everywhere (though my parents seemingly could) and, as a driver, I can only look in one direction at a time.
Parents of small children know, instinctively, to respond to the sound upstairs of a crash followed by a yell (especially if the yell is “I’m alright!”). When my wife says “Honey” in that voice that has a distinct edge to it, I know that I’m not seeing the point of the current discussion. As drivers, we know that the sound of a honking horn (especially from behind or beside us) is critical. Changes in sound are also important. If that sound of the honking horn is getting louder and closer, for instance, I’m more concerned than if the sound is at a constant distance. A repetition of “Honey” with a more distinct edge is also a good warning.
Rebecca’s article also emphasizes how important it is for developers to understand the real world of their users. While it might be natural to assume that any person working at a computer is looking at the screen, that isn’t necessarily the case. Rebecca describes the situation for a typical data entry worker doing what is often referred to as “heads down” data entry. Though these people are working at a computer, they aren’t looking at the screen. Instead, they’re concentrating on whatever document they’re pulling information from. Unfortunately, as developers, we’re so seldom in the data entry scenario that it’s easy for us to forget that not all computer users are looking at the screen. As with the driver in my previous example, the user in Rebecca’s scenario isn’t able to look at the place where a problem may be occurring.
An important tool for any technical writer on software topics is a screen capture utility, something for taking screenshots. You don’t actually need a separate tool for this: pressing the Print Screen button on any computer copies your current screen onto the Windows clipboard in bitmap format. From the clipboard you can paste the bitmap into any picture editing utility (Paint, for instance). However, one of the key problems with this method is feedback: once you press the Print Screen button, how do you know that it worked? Since a screen capture utility is designed to help you get a picture of the screen as you see it, the one thing that the screen capture utility can’t do is appear on the screen. The tool that I use is CaptureEze, though there are a number of other equally good tools. When I take a screenshot with CaptureEze, it emits a “click-click-swoosh” kind of sound that indicates that it has taken the picture, letting me know that all is well.
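The pattern CaptureEze uses, a sound that fires only after the capture succeeds, is easy to sketch in code. The sketch below is a minimal Python illustration of the idea, not CaptureEze’s actual implementation: the function names are my own, `capture_screen` is a stand-in for a real screen grab, and the Windows-only `winsound.MessageBeep` call stands in for the click-swoosh.

```python
import sys

def play_click():
    # Audible "shutter" cue. winsound exists only on Windows, so fall
    # back to the terminal bell elsewhere; purely illustrative.
    if sys.platform == "win32":
        import winsound
        winsound.MessageBeep()
    else:
        sys.stdout.write("\a")

def capture_screen():
    # Stand-in for a real screen grab (on Windows you might use
    # Pillow's ImageGrab.grab()); returns a placeholder here.
    return "screenshot-bitmap"

def capture_with_feedback(capture=capture_screen, cue=play_click):
    # Fire the cue only AFTER the capture succeeds, so silence tells
    # the user something went wrong -- the same signal CaptureEze's
    # sound gives without ever appearing on the screen it captures.
    image = capture()
    cue()
    return image
```

The design point is the ordering: because the cue runs after the capture returns, hearing it is confirmation of success, which is exactly the feedback a visual tool cannot provide about itself.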
The sound, for me at any rate, evokes the sound made by an old Brownie camera when I clicked its mechanical shutter. As my kids grow up with more sophisticated picture-taking devices, I wonder if they’ll make the same association. That association isn’t necessary, however: the sound that CaptureEze makes can be arbitrary. On occasion, user interface designers get more wrapped up in trying to make their user interfaces look like something else (what is called a “visual metaphor”) than in meeting the needs of their users. When my wife says “Honey” using that special edge, it’s the tone of her voice that matters, not the words that she uses (sometimes she says “Sweetie”; either way, I’m being dense). Rebecca also discusses this essential characteristic of sound feedback: distinctiveness.
In many ways, user interface design is all about feedback. The key factor is letting your users know what’s going on in a way that’s appropriate and distinctive to what they’re doing. Those of you using Palms, Visors, or other Personal Digital Assistants that let you write on the screen can see the impact of this right away. Many people who start working with PDAs get 70 to 80 percent accuracy in handwriting recognition almost immediately. Unfortunately, they find that they never seem to get much better.
The problem, I think, is that PDA users look in the wrong place for feedback. Most of these devices require you to write in one place (the input area) while your text appears in another place (the screen). This is different from working with pen and paper, where the end of your pen is where your words appear.
With pen and paper, when you write, you stare at the end of your pen. I had always assumed that I was looking to see what I’d written, so, when working with a PDA, I always looked at the screen to see how my letters appeared. About a year ago I discovered that, with my PDA, if I watched the end of my pen/stylus while writing, my accuracy improved tremendously. It turns out that we stare at the end of a pen in order to get the feedback necessary to write well, not to see what we’ve written.