Full Issue: AccessWorld January 2017

2016: The Year of Braille

In the January 2016 issue of AccessWorld, we took a look at the upcoming transition from English braille to Unified English Braille (UEB) in the United States. This momentous change in the braille code used by millions of blind people across the country coincided with the 207th birthday of Louis Braille, creator of the code. There was quite a bit of discussion about how the changes in the braille code would affect the blind community, and it was certain to take a few months for everything to fall into place. If these changes had been the only braille-related developments of 2016, there would still have been a lot to reflect on when the year drew to a close.

As it turned out, however, the topic of braille would remain in the news for most of the year. At the 31st annual CSUN Assistive Technology Conference, presented by California State University, Northridge and held each year in San Diego, California, an event where accessible technology products are often announced, braille was such a major area of discussion that many dubbed the 2016 conference "the year of braille."

Here at AccessWorld, we thought it would be a good idea to take a look at some of the products that were announced during the 2016 CSUN conference, and see which ones actually made it into the hands of the blind community. Some products have been quite successful, and some appear to be just over the horizon. Others have not yet made an appearance as of this writing.

Let's take a look at a few examples.

Braille Tablets Make an Appearance

French company Inside Vision introduced a touch-screen PC tablet running Windows 10 and sporting a 32-cell braille display. Rather than having a cursor-routing button above each of the display's 32 cells, the tablet would have a slider that allows individual cells to be acted upon. Instead of a physical Perkins-style 9-key keyboard, divots on the tablet's surface would let the typist place their fingers in the proper configuration to type in braille; a regular QWERTY keyboard would also be available. In addition to running regular Windows apps, the tablet would include specialized software for tasks such as taking notes. The tablet would run NVDA, boast 10 hours of battery life, and was expected to be available soon in the United States. The downside to this device was its $7,000 price tag. As of this writing, the tablet does not yet appear to be available in the US.

A product that definitely has made its presence known since its debut at CSUN 2016 is Humanware's BrailleNote Touch. This Google-certified Android tablet also sports a 32-cell braille display and a physical 9-key Perkins-style keyboard. Fold the keyboard back, and you can type in braille on the tablet's touch screen. Humanware's TouchBraille technology lets the typist calibrate finger position almost instantly when typing on the touch screen, and this calibration can be repeated as often as needed. A sighted person can view content on the tablet's screen, and an HDMI port allows the device to be plugged into a monitor for easier viewing. Since its release in early summer, Humanware has provided several updates to the BrailleNote Touch's suite of included applications. Today, it is possible for a blind person to type math equations in Nemeth or UEB code while a sighted person views the print equivalent of those expressions. In addition to completing standard word processing tasks using the BrailleNote Touch's KeyWord application, it is possible to both read and write in braille using the included KeyBRF application. BrailleNote Touch users are also able to take advantage of the tablet's built-in camera when using the included copy of the ever-popular KNFB Reader program. Finally, in addition to using the BrailleNote Touch as a stand-alone device, it is now possible to use the unit as a braille display with your smartphone or in conjunction with your favorite screen reader.

AccessWorld published a review of the BrailleNote Touch, and I invite you to read it.

Hybrids Are Hot

Over the past few years, the line between specialized notetakers for the blind and mainstream devices has begun to blur. This was in evidence at CSUN 2016 when the company then known as Freedom Scientific, now VFO, announced a forthcoming product known as ElBraille. In essence, ElBraille is a docking station designed for the company's 14-cell Focus braille display. It runs a 32-bit version of Windows 10 and the latest version of JAWS for Windows, and it has no visual display. The docking station does, however, contain a Perkins-style keyboard. You can read all the specs here, but a few points are notable. The unit promises 20 hours of battery life, and it offers cellular connectivity. In addition, quoting from this Blind Bargains blog post, "The ElBraille also features a rescue mode button which will launch a menu separate from JAWS that self-voices and has braille output." As of this writing, the ElBraille has not yet been released, and pricing information is not available.

Irie-AT announced a new braille notetaker as well. NeoBraille runs Android 5.1, contains 32 braille cells and uses a Perkins-style keyboard. In a departure from other products of its type, NeoBraille has its own app store, and does not support apps from the Google Play store. The unit is priced at $4,995 and can be ordered here.

Affordable Braille May Be Just Around the Corner

If there is one common denominator for almost all of the products discussed thus far, it is a high price point. Unless you have several thousand dollars to spend, or can make a case for having equipment purchased for you, you will most likely only be able to imagine what it would be like to use the new products mentioned here. It is hard to imagine that anyone would disagree with the statement that braille needs to become more affordable. At CSUN 2016, the Transforming Braille Group and Orbit Research announced the forthcoming release of the Orbit Reader 20, a braille display that should sell for somewhere around $500 when it is released. Although the Orbit Reader 20 isn't as high-powered as most notetakers, and its braille cell refresh time of about 1 second is much slower than that of typical displays, the lower price point should make it possible for many people who have never owned a braille display to finally have one in their hands. With increasing concerns about a decline in braille literacy, the Orbit Reader 20 is expected to be a welcome addition to the family of braille displays already in existence. AccessWorld contributing author Deborah Kendrick wrote an excellent review of the Orbit Reader 20 for the October issue of the magazine.

The Bottom Line

We have taken a look at just a few of the products mentioned at the 2016 CSUN conference. What we have seen, though, should give us hope. Companies are working hard to come up with new ways of getting braille into the hands of those who need it most. Rather than trotting out old ideas in a new package, developers are looking for ways to merge modern, mainstream technology with accessible options that increase productivity and efficiency for blind people who use technology every day of their lives.

When the 32nd CSUN Assistive Technology Conference convenes this February 27th, what new products will await us? Will some of the hardware promised in 2016 finally be available in 2017? Will there be new products for us to look at that will change the landscape entirely? Rest assured that AccessWorld will be at CSUN this year, and we will let you know what we find.

Blind Bargains' coverage of CSUN 2016, sponsored by the American Foundation for the Blind, includes several hours of podcast audio and blog posts from the conference. Anyone wishing to learn more about the products discussed here, as well as products not covered in this article, will find Blind Bargains to be a great resource.


AccessWorld Recognizes the Birthday of Louis Braille

Lee Huffman

Dear AccessWorld readers,

The holidays are behind us now, and it is time to start looking forward to a new year. For many, that means New Year's resolutions and commitments to exercising, losing weight, eating healthier foods, and taking better care of ourselves. Statistically, people with vision loss tend to be among the most sedentary and least healthy of all age groups, but it does not have to be that way. You may be surprised by how technology can help you become healthier in the new year. If getting into better shape is on your mind, check out Bill Holton's article on Bluetooth scales you can access with VoiceOver to track your weight loss progress through accessible apps.

On another topic, at this time each year, AccessWorld recognizes and celebrates the birthday, contributions, and legacy of Louis Braille. The fact is, 200 years ago, a child or adult who was blind had no effective way to read or write independently. Today, thanks to Louis Braille's invention and continuing advancements in technology, children and adults who are blind can read and write as well as their sighted peers. The invention of braille, a system of raised dots representing letters, numbers, and punctuation, truly revolutionized independent communication for people who are blind or visually impaired.

This month, AccessWorld celebrates the anniversary of Louis Braille's birthday, January 4, 1809. We also celebrate the braille code, named after its young inventor, and the expanded possibilities for literacy and independence this code created for people with vision loss.

The AccessWorld team invites you to visit The Louis Braille Museum on the AFB website, which illustrates the life and legacy of the creator of the braille code. Using photographs, engravings, and illustrations from books preserved in the AFB Archives and Rare Book Collection, the museum traces Braille's life from his childhood in Coupvray, France, through his student years in Paris, to his invention of the braille code and the recognition of its importance throughout the world.

We also invite you to read The Reading Fingers, the full text of Jean Roblin's classic 1952 biography of Louis Braille, and "Braille, the Magic Wand of the Blind," Helen Keller's essay on Louis Braille, written around 1924. In this essay, Keller describes how the braille system works and relates how she benefited from learning and using braille. She describes the reading systems that existed prior to braille and the debates of the late 19th and early 20th centuries over competing embossed systems.

I also encourage you to check out Cay Holbrook's blog post, "Falling in Love with Braille," on the AFB FamilyConnect site. If you happen to be a teacher of visually impaired students or a professional in the field of vision loss, you may want to take Reinforcing Braille Using the iPad, a webinar available for purchase from the AFB e-Learning Center. For kids, parents, and teachers interested in learning about braille in a fun and interactive way, please check out AFB's BrailleBug website.

Today, braille has made the leap into the increasingly fast-paced world of technology via braille notetakers and braille displays. The following braille-related articles from the AccessWorld archives will be interesting and useful to those who are interested in, or users of, braille and braille technology.

The entire AccessWorld team hopes you enjoy this issue and the additional braille resources linked above, and that you will make 2017 the year you become more tech savvy than ever. We encourage you to download and try the AccessWorld app for iOS, and we wish you the best in the New Year!

Sincerely,
Lee Huffman
AccessWorld Editor-in-Chief
American Foundation for the Blind

Letters to the Editor

Dear AccessWorld Editor,

An Evaluation of the Samsung UN50H6400 Television by Bill Holton was a fantastic, well-written, and thoughtful review, as is so much of your content, and it is much appreciated. It's two years later now, though, and this article is way overdue for an update. Information on smart TVs that run on Android and include voice guidance, or let you load TalkBack, is still nearly impossible to find on the Internet. On the Samsung site, the features aren't even listed on the specification sheets; you have to go into the individual manual for each model to find them. So there is no way I could find to easily search for models that have this feature.

The voice guidance feature has slipped down into models that can be obtained this holiday season for under $300, including the UN40J5200 ASXZA. Samsung Direct had this model for $249 with free shipping, and their technical support person verified the existence of the accessibility menu. Voice guidance can now be launched by a long press on the mute button. From what I understand, all of their 4K smart TVs are supposed to have this feature, but testing in the store would be recommended. After all, I could be wrong. Other, more expensive models I was told have this feature are the UN40JU6700, UN40KU700D, and UN40KU7000.

These are all 40-inch models since that is what I was interested in, but I understand all the 2015 and 2016 4K Smart TVs have this feature. I would love to [know] if some of the bugs have been worked out [and] if other manufacturers have stepped up to provide talking televisions for the US market, along with their models, problems, and prices. Thank you very much.

Respectfully,

Reginald George

Dear AccessWorld Editor,

My comments are in response to Janet Ingber's November article, Holiday Shopping at Best Buy and QVC.

I have used QVC for 15 or so years. I have done so with all of the methods mentioned in this great article. The only point I take issue with is saying their website is completely accessible. [In the example in the article] of the silver bracelet, there would not have been a color option, but say… I want to order a top that comes in many colors. The buttons for color are not labeled. Therefore, I would still have to call customer service or automated ordering to place my order. The buttons just say "button" … with no other identifying marker. Have you found a workaround for this problem? Because of this, I use the app more and more… at least up until I get to the checkout portion.

Great job! Keep up these great articles!

Shannon Cook

AccessWorld News

AFBLC Registration Is Now Open!

Registration for the 2017 Joint AFB Leadership and Virginia AER Conference is now open! Please register today for the best available rates, and don't forget to make your hotel reservation at the beautiful and centrally located Crystal Gateway Marriott—the special event rate of $190/night is available through our dedicated reservations page. If you prefer to speak with a reservations specialist, please call 888-236-2427 and reference our conference to get the group rate. Rooms available at this rate are limited, so reserve early for the best availability.

Many of you have asked and we are pleased to share the 2017 program! Sessions are still being added, but you can take a look now by day or by topic.

If you are interested in hosting an exhibit booth, advertising in the program, or sponsoring the conference, please take a look at our options and contact Amanda Kolling for more information.

For more information on the conference, please visit our meeting website. We look forward to seeing you in Crystal City!

The American Foundation for the Blind Now Accepting Applications for its 2017 Scholarship Program

The American Foundation for the Blind (AFB) administers three post-secondary education scholarship programs, awarding up to seven scholarships to deserving students who are legally blind. The available scholarships for 2017 are detailed below.

The Rudolph Dillman Memorial Scholarship: Four scholarships of $2,500

Requirements:

  • Full-time undergraduate or graduate student
  • Studying rehabilitation or education of persons who are blind and/or visually impaired

The Paul and Ellen Ruckes Scholarship: Two scholarships of $2,000

Requirements:

  • Full-time undergraduate or graduate student
  • Studying engineering or computer, physical, or life sciences

The R.L. Gillette, Gladys C. Anderson, and Karen D. Carsel Memorial Scholarship: One scholarship of $3,500

Requirements:

  • Female
  • Undergraduate student
  • Studying music

Visit the AFB scholarships website for further information and to fill out the application.

Please direct questions and comments to: American Foundation for the Blind Information Center, 800-232-5463, afbinfo@afb.net

Announcing the release of version 2.5 of Sendero's iPhone GPS apps

  • Seeing Eye GPS (subscription in North America)
  • RNIB Navigator (subscription in the UK, Ireland, France, and Germany)
  • Guide Dogs NSW/ACT (subscription in Australia)
  • Seeing Eye GPS XT (North America, no subscription; upcoming release)

Thanks to continued underwriting from Guide Dogs NSW/ACT Australia, there are over 10 improvements in the Sendero version 2.5 iPhone GPS apps, including collaboration with BlindSquare to generate routes.

Warning from the manufacturer: Before you install the update, share your user points of interest to your own email address; otherwise, they will be deleted from the app. You can then open the email attachment on your phone to import the points into the new version.

What's New in Version 2.5
  1. Ability to select a destination within BlindSquare and then choose a Sendero GPS app to create a route to that destination.
  2. Ability to create a route to an address in your Contacts database.
  3. Simplified the process of adding a User POI with the following changes:
    1. Added a Record User POI gesture: a one-finger double tap and hold on the Location screen goes directly to the Add User POI screen.
    2. Automatically default to the "User" category so that users do not have to take the extra step of selecting a category if they are not going to share user POIs.
    3. Created a free-text field for tags so that users can type their own tags and add further information about the user POI.
  4. Added a shortcut gesture to go directly to a list of nearest points of interest: perform a one-finger double tap, hold, and release on the POI tab (only available when VoiceOver is on).
  5. Revamped the nearest POI search to be more consistent and relative to the user's immediate location.
  6. Increased the number of POIs announced in the LookAround wand.
  7. Added a submenu on the Location screen to announce less essential information such as altitude, GPS accuracy, and speed.
  8. Added an accuracy filter for Side of Street indication in routes.
  9. If a POI has a questionable location, such as in the middle of the street, the app no longer announces the side of the street for the route destination.
  10. Fixed bugs introduced by iOS 10, including correcting the background functionality.
  11. User interface improvements:
    1. Streamlined the user interface to reduce duplicate announcements.
    2. Fixed the VoiceOver cursor issue.
    3. Fixed low vision contrast issues by adding a border color to all buttons.
    4. Removed the irrelevant POI categories "points_of_interest" and "street_address."
  12. Under the POI tab, the Beacon feature has been temporarily removed.

Countries and regions available as of 2016: USA, Puerto Rico, Virgin Islands, Canada, UK, France, Ireland, New Zealand, Australia, Austria, Belgium, Germany, Lithuania, Spain, Switzerland, Turkey and Israel.

The iPhone apps are available in the App Store and on the Sendero website.

A Review of the Mystic Access Apple Watch Tutorial

Like the iPhone before it, the Apple Watch has gained a following among users who are blind and visually impaired, due to the availability of the built-in VoiceOver screen reader. If you're an iPhone user who's into fitness or who just wants to have instant access to what's happening on your phone, the watch may be a tempting accessory. Even for an experienced VoiceOver user, though, a new piece of hardware with its own software means there's a lot to learn. The Audio Tutorial for the Apple Watch from Mystic Access aims to make getting to know your watch easier.

Like the other educational offerings from Mystic Access, the Apple Watch tutorial is a long-form audio program, organized by topic into sections and subsections, like a book's table of contents. This structure gives users the ability to choose whether to hear the entire tutorial, or navigate directly to a particular topic. Since it's available in DAISY or MP3 format, the tutorial can be played on an iOS device, a computer, or a DAISY-capable audio player, such as the Victor Reader Stream. At standard playback speed, the Apple Watch tutorial runs more than 4.5 hours.

Written and voiced by experienced accessibility teacher Lisa Salinger, the tutorial aims to be a comprehensive guide to getting to know, and learning to use, the Apple Watch. Salinger begins by addressing potential watch purchasers, suggesting who might want such a device. She then introduces the hardware (a supported iPhone and an Apple Watch) and software (the current version of the watchOS 3 operating system) you'll need to work through the tutorial. You'll then learn about watch accessories and bands, and get a hands-on orientation to the watch hardware. The organization of these introductory sections is a bit muddled: we learn about watch bands before inspecting the contents of the box, for example, but this is a minor concern.

Early on, Salinger suggests you should feel free to skip around in the tutorial, rather than approaching it in a linear manner. This is good advice, not only because different users have different needs and levels of knowledge, but because there are so many ways to customize the watch to your own use. In fact, a substantial portion of the tutorial describes settings and other configuration options. If you're impatient to begin a watch-based workout, or to feel a message notification on your wrist, you might find yourself jumping two-thirds of the way through the material, to a section that covers using your watch. Others might choose to work through the tutorial from the beginning; early sections introduce wearing options, VoiceOver gestures you'll need to master, and how to use the hardware buttons.

Next up are a number of long sections that take you through almost all settings on the watch and in the iPhone's Watch app that's used to configure a number of watch/phone integration options. Salinger's table of contents is detailed, and she does a good job of introducing each section, allowing you to quickly decide whether the section contains the material you want by listening to a few seconds of audio. She also incorporates short "time out" sections, in which she reviews what has come before, and encourages listeners to reflect on what they've learned and take a break if they're feeling overwhelmed.

Toward the end of the tutorial, Salinger demonstrates using the watch to track workouts, hear message notifications, check the weather, play music, and control the watch and phone with Siri. Because so much of the tutorial is devoted to setup, the sections focused on using the watch feel like a treat. But if you navigate through the material rather than listening from the beginning, you can have your dessert first!

The tutorial is thorough and well-written. No important watch feature or setting is left out, and Salinger sprinkles tips and hints throughout the material that show she is an experienced Apple Watch user. Salinger, who has produced audio programs and podcasts for other organizations, speaks in a clear, slow voice, pausing frequently. Her pace may feel too slow for some users, but that's easily remedied by playing the Apple Watch tutorial on a device or in an app that allows you to adjust playback speed. I used Voice Dream Reader on my iPhone to play the tutorial at 165%, a comfortable speed for my listening style.

The tutorial includes speech from Salinger's Apple Watch and her iPhone throughout. She has chosen different voices for the two devices, so it's easy to know which one is speaking. She doesn't fall into the habit of some narrators, who tend to repeat what a device has just said, which I appreciate. All speech is distinct, but you might need to turn your listening device's volume up a bit to catch the watch's audio. It's probably most effective to listen with headphones, especially if you experience any hearing loss.

This tutorial is produced entirely from the perspective of a VoiceOver user. Salinger says early on, and I agree with her, that many users with low vision will find it far easier to use an Apple Watch with speech than with zoom and the few contrast settings Apple offers. For that reason, the tutorial does not offer detailed guidance for those who want to use the watch without VoiceOver.

The Bottom Line

If you're serious about the Apple Watch and want to get the most from it as a VoiceOver user, this Mystic Access tutorial is a great way to dig into every nook and cranny to find features you might not know about. Though Apple does provide VoiceOver-aware documentation for the watch, it is neither as detailed nor as focused on the lived experience of watch ownership as the Mystic Access tutorial. At nearly $40, this audio program isn't cheap, but neither was your fancy Apple Watch. Watch owners who are not fully comfortable with new technology will feel at home in Salinger's patient hands. If you're a geek who enjoys teaching yourself about your gadgets, you will get less from the tutorial, though Salinger's inclusion of tips and real-world experience takes the material beyond beginner level in some areas.

Product: Audio Tutorial for the Apple Watch (available in DAISY or MP3 format) from Mystic Access, 716-543-3323
Price: $39.00


VizLens and HALOS: Making Touchscreen Appliances and Other Devices More Blind Friendly

Here at AccessWorld, we have published an ongoing series of articles focused on emerging research and breakthrough technologies to help prevent blindness and restore lost vision. This article describes an emerging solution to a growing accessibility issue: the increasing prevalence of terminals, kiosks, vending machines, and other interactive objects that use touchscreen interfaces inaccessible to people with visual impairments. The article also introduces a brand new "low-tech" solution available right now that can help you better label and navigate the touch controls on your microwave, oven, dishwasher, and other home appliances.

VizLens

If you have used an iPhone for any length of time, you likely have experience with an app called VizWiz. VizWiz was developed by the ROC HCI Group at the University of Rochester, with the support of Google and the National Science Foundation. With VizWiz, you can take a photo and send it along with a question to a contact, Facebook friend, Twitter user, or Amazon Mechanical Turk worker, who can return an answer. Unfortunately, if you've tried using VizWiz lately, you have doubtless noticed a decided lack of response.

"VizWiz was produced as a research project," says Jeffrey Bigham, one of VizWiz's lead developers. "Once other people took the ideas and produced other free and commercial options, it became time to let it lapse."

Bigham's dedication to accessibility has not flagged, however. He's now an Associate Professor at the Carnegie Mellon University Human-Computer Interaction Institute, where one of his PhD students, Anhong Guo, expressed an interest in using computer vision to assist people with visual impairments. Bigham became a project advisor to Guo as the student began looking for ways to make non-voicing touch interfaces more accessible. Today they have a working prototype, called VizLens.

The VizLens user begins by taking an iPhone photo of a touch interface and giving it a name (such as "Home Microwave" or "Office Vending Machine"). The user then uploads the photo to the service, where crowd workers label the text of each interface control along with its position. The labeled image is then stored on a server, ready to be accessed whenever the user wants to operate the device.

The user receives a text map of the touch panel, which can be explored using VoiceOver touch and swipe navigation. Say your dishwasher has a single row of touch controls. You would now know that the button at the extreme right is "Start," the one to its immediate left is "Rinse and Hold," and so on. You could also snap a pic of your new TV remote, and then use the resulting map to familiarize yourself with the various controls.

But wait—as they say on TV infomercials—there's more!

After a touch-controlled appliance or other device has been mapped, the user can open the VizLens app, activate the named item, hold the phone in one hand with the camera aimed at the controls, and hover a finger of the other hand over the touchpad. The app announces, button by button, which control the finger is pointing at. A second mode allows the user to tap the control they wish to use, at which point the app guides their finger toward the correct position with either beeps or spoken "left, right, up, down" instructions.
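To make the guidance mode more concrete, here is a minimal sketch of the kind of logic involved. It is not VizLens's actual code: the control map, coordinates, tolerance, and function names are all hypothetical, and a real system would first use computer vision to align the live camera frame with the stored, labeled reference photo.

    # A minimal sketch, not VizLens code, of spoken guidance toward a labeled control.
    # Assumes image coordinates in which x grows to the right and y grows downward.

    # Hypothetical crowdsourced label map: control name -> (x, y) center in the
    # stored reference photo of the touch panel.
    CONTROL_MAP = {
        "Start": (420, 310),
        "Rinse and Hold": (360, 310),
        "Pots and Pans": (300, 310),
    }

    def guidance_cue(fingertip, target_name, tolerance=15):
        """Return a short spoken cue that moves the fingertip toward the target.

        fingertip is an (x, y) position in the same frame as CONTROL_MAP; a real
        system would first align the live camera image to the reference photo.
        """
        tx, ty = CONTROL_MAP[target_name]
        fx, fy = fingertip
        dx, dy = tx - fx, ty - fy
        if abs(dx) <= tolerance and abs(dy) <= tolerance:
            return "On " + target_name
        # Announce the larger correction first, mimicking simple spoken guidance.
        if abs(dx) >= abs(dy):
            return "right" if dx > 0 else "left"
        return "down" if dy > 0 else "up"

    if __name__ == "__main__":
        print(guidance_cue((380, 312), "Start"))  # "right"
        print(guidance_cue((421, 308), "Start"))  # "On Start"

In practice, a cue like this would be spoken (or replaced by beeps) in a loop as the camera tracks the moving finger.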

"This works well for touch panels with a single layer of controls," says Guo. "But there are many touch interfaces, such as the ones on our office coffee pot and copying machine, that offer dynamic displays, using Mode buttons that change the entire layout with each new press."

The solution the VizLens team has developed involves a one-time series of photographs, one for each mode. "At this point, the app could identify which mode the device is in, and offer navigation for that particular screen," says Guo.
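As a rough illustration of the mode-identification idea Guo describes, here is a small sketch, again hypothetical rather than the team's code. It assumes one tiny, pre-aligned grayscale reference image has been stored for each display mode, and it simply picks the stored mode that differs least from the current camera frame.

    # A hypothetical sketch of picking which display mode a device is showing,
    # given one stored reference image per mode. Images are small grayscale
    # grids (lists of lists of 0-255 values) assumed to be aligned and the
    # same size; a real system would handle alignment, scale, and lighting.

    def difference(frame, reference):
        """Sum of absolute pixel differences between two same-sized grids."""
        return sum(
            abs(f - r)
            for frame_row, ref_row in zip(frame, reference)
            for f, r in zip(frame_row, ref_row)
        )

    def identify_mode(frame, references):
        """Return the name of the stored mode whose reference best matches."""
        return min(references, key=lambda name: difference(frame, references[name]))

    if __name__ == "__main__":
        # Hypothetical 2 x 3 "images" for a coffee maker with two display modes.
        references = {
            "Brew menu": [[200, 40, 40], [200, 200, 40]],
            "Clock setup": [[40, 200, 200], [40, 40, 200]],
        }
        current_frame = [[190, 50, 45], [205, 195, 50]]
        print(identify_mode(current_frame, references))  # "Brew menu"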

The VizLens team does not plan to use a shared library of appliance and other touch panel layouts, since the angle and lighting for each user will vary. However, once you have taken the first picture, it will be placed in an on-device library, so it won't be necessary to reshoot every time you wish to preheat the oven or start a load of wash.

Future enhancements include using OCR to verify that the information you enter is correct, and combining OCR with crowdsourcing to enable near-real-time use of dynamic displays that have not already been added to your device's library—such as the ticket kiosk at an airport where you have never traveled from before.

"It would also be a simple matter to use the scans to produce 3D templates with tactile controls," notes Bigham.

As to what these tactile controls might conceivably feel like, read on.

HALOS

Currently, the best solution for operating touch-interface home appliances is adhesive dots to mark often-used controls, or a braille overlay, either handmade or produced by a third-party provider or one of the very few appliance makers who offer them. Recently, a new alternative has appeared that offers tremendous potential. It's called the Home Appliance Label and Overlay System (HALOS), produced by Anne DeWitte, proprietor of Tangible Surface Research.

As an engineering grad student, DeWitte studied tactile displays. "Today we have LEDs that can change colors. Hopefully, soon we will also have materials that can shrink or expand using similar display commands to produce touchable buttons, sliders, and even graphics."

It occurred to DeWitte that these same developing screen technologies could also be used to help consumers with visual impairments to operate the touch-sensitive control panels that are becoming standard on more and more home appliances. Thinking ahead toward this possibility, DeWitte began to wonder, "Once we do have the ability, what will the universal tactile representation for, say, a timer clock feel like?"

DeWitte was familiar with braille appliance overlays, but she rejected these as programmable tactile elements. "Not everyone, especially newly blind seniors, reads braille, so they cannot be considered universal design elements."

DeWitte wanted to develop a tactile icon library that identifies tactile shapes that have meaning for the home appliance domain. She took her proposal to Experiment.com, a crowdfunding site for scientific research. Her project received funding, and with the help of design students at the Rochester Institute of Technology—who designed some of the tactile icon shapes—and guidance from members of the Association for the Blind and Visually Impaired (ABVI) in Rochester, who also supplied test volunteer subjects, she began compiling a library of proposed universal tactile icons that could be used over a variety of appliance models and manufacturers.

"The tactile Start button on a Microwave should be the same as the tactile Start button on a dishwasher," observes DeWitte.

Too much research winds up being published in a journal, then put on a back shelf waiting for someone else to turn the theoretical into the actual. But DeWitte is an engineer, and she wanted to see tangible results of her work. So she began creating transparent silicone appliance overlays, which use shapes instead of braille to denote various functions. If you've used an NLS player, you are already familiar with several of these tactile icons: a left-pointing arrow for Rewind, a raised letter X for Stop.

Today, DeWitte offers custom appliance templates on her Etsy store for just $30. Instead of raised dots, these overlays use different shapes to signify what each control does. A raised right-facing triangle signifies the Start button on an oven, microwave, or dishwasher, for example, while a popped kernel shape signifies a microwave's Popcorn setting.

"Once I get a product model name and number I can usually find the display layout in a parts list so I can get the layout and measurements," she says. "If it isn't available, I ask the customer to place a quarter against the display for context and then snap a photo for me to work from."

Since the overlays are transparent, a sight-impaired individual can use them without blocking the view for their sighted spouse, children, or other housemates. DeWitte also offers a second option if you don't wish to use a full template. Currently, she produces three different sets of foam-based, stick-on Tactile Icons, which include HALOS for the most popular controls. HALOS are priced at $5 per set, but DeWitte is also willing to consider creating custom HALOS packs, or even creating individual custom icons with the shapes of your choice.

"My most popular custom requests are for clothes washers and dryers," she says, which is why DeWitte is actively seeking feedback on what shapes would work best for various appliance functions such as "Delicate Wash" and "Air Dry."

Further Information

You can read more about VizLens and stay up to date on the VizLens website, which includes research materials and a YouTube video demonstration.

HALOS are available through DeWitte's Etsy shop.


The Quest for “The Holy Braille” Full-Page Refreshable Braille Display: An Interview with Alex Russomanno

If all goes as planned, The Holy Braille, a project from the University of Michigan School of Information, is set to revolutionize the pace of education for people with visual impairments by developing a full-page, refreshable braille display. Project designers believe that the device could drastically improve the rate at which people with visual impairments receive and interpret information. Currently, refreshable braille displays, and devices that attach to the bottom of a tablet, can render only one line of text at a time.

The team's two-pronged approach seeks to develop the technology needed to create a full-page refreshable braille display and tactile graphics on portable devices at a reasonable cost. It would do so through a new technology: a pneumatic system of air pressure and fluid below the screen, programmed to raise bubbles on the screen's surface to produce braille and tactile graphics.

The epicenter of The Holy Braille's development sits at the University of Michigan. Here, Brent Gillespie of the Department of Mechanical Engineering; Alex Russomanno, a PhD student in the Department of Mechanical Engineering; Mark Burns of the Department of Chemical Engineering; and Sile O'Modhrain, Associate Professor in the School of Information, are working to build their Holy Braille model. Their outlook is positive and their motivation strong.

The device's catchy name was coined by collaborator Noel Runyan, and from what the team is telling us, it's the holy grail for adaptive technology for the blind and visually impaired—the "answer" or wondrous revelation of truth after a long, laborious journey. The quest for The Holy Braille is the quest for visually impaired individuals to have the same digital experience as their sighted peers.

In a recent interview, Alex Russomanno talked about his team's momentum, the project's trajectory, and what we can expect to see in the next few years.

Francesca Crozier-Fitzgerald: What inspired you to join the team working to create The Holy Braille?

Alex Russomanno: I initially got involved in the project because of my prior experience with the technology that we're working on, dealing with microdevices. I did research on similar technology during my undergraduate program. While the braille and tactile graphics applications were added elements to this project, they're ones I've ended up being really excited about.

FCF: I understand that while refreshable braille displays do exist, the current technology only provides one line of text at a time. How did this factor into your team's motivation for creating a full-page refreshable braille display?

AR: Sile O'Modhrain, from the School of Information, is a key member of our team. She is blind herself and has been an avid user of refreshable braille displays for many years, as well as other types of adaptive equipment and technology. She has firsthand experience with what's not available in the industry, especially in the realm of refreshable braille displays. One of the big things I've learned from her, and from talking to other people who use these devices, is that they are very limiting. Reading a single line of rendered braille on the screen will not be like reading a full page of hard copy braille on a sheet of paper. It's not equivalent. In the same way, there is no adaptation for images, spreadsheets, or other graphics, so there's no way for visually impaired individuals to interact with these [types of] more interactive digital content. She is personally motivated to make these features available.

FCF: How has Sile O'Modhrain's first-hand experience been important for the development of the project?

AR: In collaboration with my advisor, Brent Gillespie, she's always been considering different ways they could make an impact for visually impaired individuals. Brent Gillespie works in the haptics domain (research involving the sense of touch), so there is a clear overlap of their interests. Between Brent's mechanical engineering expertise and Sile's background in working with these devices and knowing where they are lacking, they are attacking this challenge to improve full-page tactile text and graphics.

FCF: If you had to narrow it down, what are your main objectives with developing The Holy Braille?

AR: Our first objective is to create multiple lines of text, electronically, that you could access just as you would access hard copy braille, scanning and using both hands to have a more immersive interaction with the content.

Our second objective is to use the technology to render images, and go so far as to render a tactile interface you can feel and interact with, similar to how sighted people interact with a desktop computer or any type of visual display with unique icons. If we could build a display that has a full page of dots that move up and down, you could use those dots to render images, to render the equivalent of icons, and you could have interesting new ways of interacting with digital content that you don't currently have with text-to-speech software or other adaptive means of interacting with visual content.

FCF: Where are you in the trajectory of getting to the answers of these questions?

AR: It's a two-pronged approach right now. On one side, we're working on building the technology: how can we enable the creation of a low-cost, full-page display? We're not quite ready to put out a product, but in the next year we'll be looking at attaining seed funding to pursue the commercialization of our technology, enabling the creation of a potential prototype of our device.

The second part of our research is looking at how someone would interact with such a device. Currently, while full-page braille displays have been created, they are so expensive not many people can get their hands on them. We want to research how they are used and what they will be best used for. It's hard to have these answers at this point without a prototype device, but we're moving closer to these answers.

FCF: You talk about visual displays—graphs, charts, spreadsheets—being rendered in braille. Can you explain how your team would actually create those images with braille?

AR: The image or graphic would look like a grid of dots. By programming those dots to appear with appropriate spacing and heights, you can render shapes or images. This is very similar to how you program pixels to appear as an image on a computer screen. To make graphs and different images on the screen, it's just a matter of programming those physical features, in the form of dots, to rise and fall in a programmed manner.

Creating rendered images and graphics is obviously much more complicated than writing braille letters in a straight line. In times like this it has helped having contacts at SKERI (Smith-Kettlewell Eye Research Institute) as they're also very interested in studying the ways that blind and visually impaired people in particular are interacting with tactile images.
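To make the pixel analogy above concrete, here is a minimal sketch, not the team's software, of how a simple shape might be "rasterized" onto a coarse grid of raised and lowered dots; the grid size, shape, and function names are invented for illustration.

    # A hypothetical sketch of rendering a shape on a tactile dot grid, treating
    # each dot like a pixel: True means the dot is raised, False means lowered.

    def render_circle(rows, cols, center, radius):
        """Return a rows x cols grid of booleans tracing a circle's outline."""
        cy, cx = center
        grid = []
        for r in range(rows):
            row = []
            for c in range(cols):
                dist = ((r - cy) ** 2 + (c - cx) ** 2) ** 0.5
                row.append(abs(dist - radius) < 0.7)  # raise dots near the outline
            grid.append(row)
        return grid

    def show(grid):
        # Print the pattern: 'o' for a raised dot, '.' for a lowered one.
        for row in grid:
            print("".join("o" if raised else "." for raised in row))

    if __name__ == "__main__":
        show(render_circle(rows=15, cols=30, center=(7, 15), radius=6))

Driving real hardware would then amount to raising or lowering each physical dot according to its value in the grid.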

FCF: Are there alternative methods that you are currently using or can use to study how individuals will use the device?

AR: Right now our research is focused on mimicking the display, or a manifestation of our display, using methods like 3D printing. We can 3D print what our display might look like and then run small-group sample studies to see how people interact with the device. We can observe how well they can identify tactile images, and we'll then use that information to guide the creation of an eventual display. We're very interested in both sides, the technology and also how the technology might be used, because it's all brand new. No one has created such a device.

FCF: It seems like this technology could really change the way we are educating our blind and visually impaired students. What stands in your way of getting these devices into residential schools for the blind, or into the hands of visually impaired students around the world?

AR: We are well aware that one of the limits of refreshable braille displays, and multi-line displays in particular, has been their cost. A full-page refreshable braille display could cost up to $55,000 for one device, and that's obviously prohibitively expensive if you want to get it into the hands of a student or of someone who actually needs it. We think we have come up with a specific technology and a method to manufacture that device in a way that's going to be much cheaper. We think that building our technology around a pneumatic system will reduce the cost of production.

With our technology, all of the dots that move up and down, all the interconnections, air channels, and pumps, and everything needed to control them would come in one single piece. It'd be made in the same way that a computer chip is made. When you take away the need for assembling all the parts and condense the technology into one piece, you remove some of the main factors that tend to drive up costs. That's where I think the bread and butter, where we make our impact, will be. We want to lower the cost of these devices so we can get them into the hands of the people who will use them.

FCF: Are there still technical hurdles that your team is addressing regarding the production of The Holy Braille?

AR: There are many open questions regarding turning this into a portable device; that is something I imagine will come further down the road. I don't think creating portable devices is impossible, but it's an added technical hurdle to make that happen. As for creating the full-page braille display in a desktop version, the size of a CCTV for example, that you could put on a table or desk in a classroom, we're not far from that. That is well within our current reach.

FCF: What keeps you motivated to work on this project?

AR: It's fascinating. We went to the California School for the Blind with our collaborators from SKERI earlier this year to get more information about how tactile graphics can be used and what can be improved. It's just very obvious that a lot of older children who don't have access to these materials and devices early on have a lot of trouble reading the information in a graph or comprehending what it is trying to communicate, and that makes sense. If a child has not grown up reading or learning how to read graphs, how could they be expected to comprehend what a graph is and how it's read later in their lives? Even those who are getting access to and experience with tactile images are not getting enough to feel comfortable and fluent.

We want to get these things into the hands of these kids as early as possible, because teaching these things later makes it tougher to catch up. So, that motivates me.

If you have further questions, you can contact members of The Holy Braille team, Alex Russomanno or Sile O'Modhrain.
