Full Issue: AccessWorld December 2021

Editor's Page: Looking Back on 2021

Dear AccessWorld Readers,

It seems that I have been saying this a great deal lately, but time really does fly. Once again, we are closing out another volume of AccessWorld; at this point, we have published a full 22 volumes of the magazine. I thought I would use this space to review AccessWorld in 2021 to date. Remember, if you wish to review old issues of AccessWorld going back to our original publication in 2000, you can do so on the Back Issues page.

Each January, we celebrate the birthday of Louis Braille with an issue of AccessWorld focused on braille. This year, our braille-focused content included a braille retrospective from Jamie Pauls, a review of several 40-cell braille displays by Scott Davert, and an article by Steve Kelley detailing Braille by Sight from Hadley. In addition, J.J. Meddaugh brought us a review of the new Chromecast from Google and Janet Ingber reviewed Big Sur, the 2020 release of the Mac operating system.

When AccessWorld began, our February issue focused exclusively on seniors. Over the years, the focus first expanded to anyone with low vision, and then to anyone new to vision loss, no matter their age. In February this year we published our first piece of sponsored content, a look at the Lookout app from Google. With a focus on the new user of assistive technology, Bill Holton brought us a guide to his most important accessible apps for the iOS platform. J.J. Meddaugh covered the first Sight Tech Global accessibility conference, with coverage of this year's conference appearing in this (December 2021) issue of AccessWorld. Steve Kelley reviewed the Blind Shell Classic Light while Janet Ingber reviewed Netflix in the first of our new Streaming Video Service series. Finally, Deborah Kendrick informed us of the compact and accessible HP LaserJet M15W laser printer.

The AccessWorld issues from March through June do not have specific themes, but cover whatever is most pertinent at the time. In March this year, Deborah Kendrick brought us the Employment Matters article profiling Matthew Campbell, Chief Technology Officer of Pneuma Solutions, while Steve Kelley discussed YouTube TV in the second installment of our Streaming Video Services series. Jamie Pauls reviewed the Google Meet videoconferencing service for the second installment of his Conferencing Programs series and Janet Ingber finished the issue with a review of Apple Fitness Plus.

Bill Holton started the April issue with a look at the Supersense app from Mediate, followed by Janet Ingber's review of Disney+ for our Streaming Video series. In addition, Steve Kelley reviewed YouTube Music for the first entry in our Streaming Audio Service series while Deborah Kendrick finished out the issue with a discussion of Ski for Light International 2021.

The May issue of AccessWorld was a full one, with Deborah Kendrick starting things off with an interview with Jim Kracht, retired Miami-Dade County attorney. Judy Dixon published her first AccessWorld article, a guide to LiDAR on the iPhone. In Android coverage, J.J. Meddaugh discussed the updates to the TalkBack screen reader and Steve Kelley reviewed Google's Lookout app. To finish the issue, Janet Ingber reviewed Apple Music for the Streaming Audio series and Jamie Pauls evaluated the game titled Untold RPG for iOS.

In June, we covered more Streaming Video with a review of Apple TV+ by Judy Dixon, followed by an article on two popular audio item trackers, Tile and AirTag, by Janet Ingber. J.J. Meddaugh reviewed the Clubhouse app—a platform I visit frequently and which is now open to everyone. Bill Holton looked at two different podcasts that provide audio description. We finished out the issue with two book reviews of books published by AccessWorld authors: Audio Description: What It Is, Where to Find It, and How to Use It, by Judy Dixon (reviewed by Janet Ingber) and Basics for the Beginning User: Mac OS 11 Big Sur Update, by Janet Ingber (reviewed by Steve Kelley).

We returned to theme issues with "Back to School" in July. The focused content in this issue included "Reading Print with Low Vision," by Steve Kelley, "Supercharge Your Braille Reading for Pleasure and Productivity," by Judy Dixon, and "Take a Note: A History of Braille Note Taking," by Deborah Kendrick. We finished the issue with "Product Review: The Brailliant Bi 40X Braille Display from HumanWare," by Deborah Kendrick, "A Recap of the 2021 Apple Worldwide Developers Conference (WWDC) Keynote," by Janet Ingber, and "Vision Tech: New Research into Glaucoma," by Bill Holton.

Starting in 2019, the August AccessWorld focused on entrepreneurship and then broadened to include personal finance in order to avoid overlap with October's employment theme. This year, we focused on personal finance for our topical articles: "A Guide to Using Mobile Check Deposit and Some Helpful Accessories," by Judy Dixon and "Online Payment Platforms: An Evaluation of Accessibility, Part I: PayPal and Venmo," by Jamie Pauls. The issue also contains "When Old is New Again, Part I: What's Happening With Some 'Vintage' Access Technology Tools," by Deborah Kendrick, "Video Streaming Services, Part 5: The Accessibility of Hulu for Users with Visual Impairments," by Janet Ingber, and "RAZ Mobility MiniVision2 Mobile Phone for Users with Visual Impairments," by Steve Kelley.

For September, we published an interview with Dr. Hoby Wedler who was a featured guest in our centennial conversation Dinner and Music for a Historic Celebration. The issue also included the articles "Touching the News and Other Tactile Graphics Offerings from the San Francisco LightHouse," by Judy Dixon, "Video Streaming Services, Part 6: Amazon Prime Video: An Enormous Range of Content," also by Judy Dixon, and "Talking to Louie," by Steve Kelley.

Our October issue is dedicated to employment and for 2021 we featured sponsored content from the Lighthouse for the Blind in Seattle. In addition, the issue also included the seventh part in our video streaming series, a review of Peacock from Janet Ingber. Judy Dixon detailed the new mainstream features in iOS 15, while Janet Ingber discussed its accessibility features. Finally, J.J. Meddaugh reviewed Hearthstone, a mainstream video game that has recently become accessible. Hearthstone has quickly become one of my favorite games and is definitely worth a look.

November is the holiday issue. For 2021, Janet Ingber's now-traditional shopping guide looked at Chewy, Zappos, and Best Buy. We at AFB also brought you a gift guide with recommendations from me and other AFB staff. Deborah Kendrick detailed the achievements of recent MacArthur grant winner, Joshua A. Miele, while Jamie Pauls reviewed the BrailleSense 6. Finally, Steve Kelley described the Vision Buddy wearable.


Thank you for being readers of AccessWorld and I hope that you have a wonderful rest of 2021!

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Next Article

Back to Table of Contents

Video Streaming Services, Part 8: Is it Worth Exploring Discovery+?

Janet Ingber

The Discovery+ app and the Discovery+ website present a wealth of documentary and reality content from an extensive list of channels. Unfortunately, at this time they do not offer audio description. The chat representative I contacted a few months ago said, “This particular feature is not yet available in this version of Discovery+, but it's a great idea so thanks for the feedback. I am going to share this with the developers to review.” The process of adding audio description can be a long and involved one. See Described Video via Blindy TV: "Taking the Vision out of Television" for details on some of the challenges in bringing audio description to programming. (Note: the Blindy TV service described in that article is no longer available.)

However, if you are willing to watch programs without audio description, below is the list of channels available with Discovery+.

  • A&E
  • American Heroes Channel
  • Animal Planet
  • Cooking Channel
  • Destination America
  • Discovery
  • Discovery Life
  • DIY Channel
  • Food Network
  • HGTV
  • History Channel
  • ID (Investigation Discovery)
  • Lifetime
  • Magnolia Network
  • OWN (Oprah Winfrey Network)
  • Planet Earth (from BBC)
  • Science Channel
  • TLC
  • Travel Channel

For this article, I used an iPhone 13 Mini and an M1 MacBook Air to test the service.

Getting Discovery+, Help, and Supported Devices

The Discovery+ iOS app is here, the Android app is here, and you can also watch on the Discovery+ website.

There is a help link on the Discovery+ website. The Help Center has many categories including Watching Shows and Technical Assistance. They do not offer telephone support, but chat is available and accessible. In addition, there is a Help link in the Account tab of the Discovery+ iOS app.

Discovery+ is compatible with iOS and Android as well as with smart TVs and many other devices.

Getting Started

Once you have installed the app or have chosen to use the website, you will need to create an account. At this time, Discovery+ offers a 7-day free trial. After that, there are two plans: ad free for $6.99 per month or with ads for $4.99 per month. You can have up to five people, including yourself, on the account. Four devices can be used simultaneously.

At the time of this writing, a student discount is available. Verizon customers who subscribe to certain unlimited plans can get one year for free.

Using Discovery+

The Discovery+ home screen has five tabs at the bottom of the screen. They are Home, Browse, My List, Search, and Account. Near the top of the screen are category tabs such as Food, True Crime, Documentaries, Relationships, Science & Technology, and Lifestyle. There is also a For You tab.

In the middle of the screen is information about different shows and also links to different Discovery+ channels.

The Browse tab has the same categories as the home screen. In the top left corner are a Trending button and an A-Z button. By default, the Trending button is selected. If you select the A-Z button, your results will be sorted alphabetically. If you flick right, you will hear a list of categories such as Food, Relationships, True Crime, and Nature & Animals. Selecting a category brings up a list of programs in that category. Below the categories list are options to choose specific channels and descriptions of some popular programs.

Selecting a category brings up a list of shows that meet the criteria. For example, I chose Nature & Animals and then chose a show called “Dogs 101.” The new screen contained information about the show, a Play Now button, and a Favorite Add button. Selecting the Favorite Add button adds the item to your list. Below these buttons are three tabs: Episodes, About the Show, and You May Also Like. If the show has more than one episode, the episodes will be listed below the three tabs.

The My List tab is where your list of saved shows is located. As you explore the Browse section, find an item that interests you and select it. If you want to save the item to your list, activate the item’s Favorite Add button.

The Search tab has an Edit box at the top of the screen. Results are located below the Edit box.

If no text is in the Search box, there are several search categories including Recommended for You, Top Searches, and Just Added. Also in this area is an option to access your list.

The first item in the Account tab is Add Profile.

It is very easy to create a profile for Discovery+. Go to the Account tab and then flick right to the Create Profile button. An edit box will open. Enter the person’s name and select the Done button in the upper left corner. There is also a button to manage profiles.

Below the profile information is the Settings button. This is where you control notifications and access to your device. Below the Settings heading is the Help link.

Watching Content

Once an item is selected, it will begin playing within a few seconds. If you flick around the screen, VoiceOver reads whatever information was presented about what is playing. For example, if you play a show with many episodes, VoiceOver still reads the episodes list. With sighted assistance, I learned that this information is no longer visible and only the video is on the screen. Unfortunately, the playback controls are not very accessible. While content is playing, double tapping near the top third of the screen brings up playback controls in an unlabeled box; VoiceOver does not say anything, and the controls stay on the screen for only about three seconds. There are two buttons at the top of the screen, Watch from Start and Watch Live, which can also bring up the controls, but again, the controls disappear in about three seconds. If you swipe right from the Watch Live button after pressing it, you can find the controls briefly. The first two items are labeled "10" and seem to rewind and fast-forward by 10 seconds respectively. Next you will find the current time and, finally, the total time of the program.

The two-finger double tap does work for start and stop. If you have an Apple Watch, playback controls will be available there and easy to use.

Discovery+ on the Mac

Discovery+ works well on the Mac. The website has the same categories as the iOS app. However, your list does not sync with the iOS app, and if you are watching the same show on the app and the website, your viewing progress doesn’t sync either. The website has a lot of information but little clutter. If profiles have been set up, they too will be there. Playback controls are accessible. Once content is playing, VO-Right Arrow to the playback controls. If that does not work, VO-Right Arrow to Full Screen. Do not select it. Instead, VO-Left Arrow until you hear the playback controls. You can also try using the forms option in the VoiceOver rotor.

Conclusion

Although Discovery+ does not offer audio description at this time, the streaming service offers a lot of content. Adding audio description would be a considerable improvement. The relatively inaccessible playback controls in the iOS app are a negative; the website, however, does offer accessible playback controls.

To try Discovery+, check out their free trial. If you have an unlimited plan with Verizon, call to find out if you are eligible for a free one-year subscription.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Previous Article

Next Article

Back to Table of Contents

Sight Tech Global Returns for Year Two

J.J. Meddaugh

Sight Tech Global is a mainstream accessibility conference which focuses on big picture technology trends and aims to give a glimpse at what the future of access technology may look like in the next 5 or 10 years. Now in its second year, it is organized by online news site TechCrunch and was entirely virtual and free to attend. The conference featured speakers from Apple, Google, Microsoft, HumanWare, Vispero, and other leading mainstream and access technology companies and covered such topics as tactile images, autonomous vehicles, and indoor navigation. The main stage sessions are available to watch for free and also include full text transcripts. Some of the highlights of the 2021 event are covered below.

The Holy Braille in Action

There has been much talk in recent years about potential solutions for rendering multiline braille and tactile graphics electronically on the same screen. Greg Stilson, Head of Global Innovation for the American Printing House for the Blind, did more than just talk about the future: he demonstrated it. Stilson fired up an early prototype of a graphics device developed in partnership between APH and HumanWare, which he casually referred to as a tactile Kindle.

This is not APH's first foray into developing a tactile image device; in fact, their former partnership with Orbit Research yielded the Graphiti, a 40-by-60-pin tactile tablet capable of creating touchable imagery. But one thing missing from that product was the ability to display braille in the form familiar to readers, as opposed to jumbo-sized dots.

As Stilson points out, "We can’t change the way that the pin actually feels to somebody. We can’t change the way Braille feels. It’s just too much to change. It’s too much to accept. So, we’re like, OK, the way I look at it is like if you took the way a print font looks, if you took away the way of a specific print font looks to a sighted person, the adoption rate would be far lower. So, I wanted to make sure that we really refined the pin and the way the pin feels so that people are comfortable as soon as they lay their fingers on it."

In fact, much of the user research performed over the past year examined these concepts, including how many lines of braille to include in the device and how thick to make the graphic lines, major design decisions which will ultimately guide the development of the product.

Having both braille and graphics on the same screen would be hugely beneficial to many types of users, with a math textbook given as one possible use. Imagine a geometry book that includes a story problem, which then references an image. Ideally, a student would be able to read the math problem and then locate the image to come up with a solution. To take this idea to the next level, it may someday be possible for that student to touch the area of the image to zoom in and reveal more detail. Meanwhile, the entire screen could be relayed to a parent or teacher so they can also follow along with what the student is learning. In the live demo, it took about two seconds for an image or text to be displayed on the machine.

Stilson envisions creating a means for software developers to connect to the unit, potentially enabling many uses beyond APH's educational needs. Application designers could feel what a screen looks like in real time while designing an app. Travelers could read tactile maps or learn about their surroundings. With the right software, just about any image that is available online could be rendered on the machine. This technology will not be cheap, but Stilson also points out that it currently could cost $30,000 to produce one math textbook. Some of this cost comes from the embossers and the amount of paper used (think thousands of pages for a large textbook). APH is exploring possible sources for grants to help defray the costs, as the current Federal Quota System used by school districts would not easily support such a large purchase. With a bit of luck and lots of testing, we may see this as-yet-unnamed device by the end of 2023.

Apple Taking Images to the Next Level

This year's event opened with a Q&A session with Apple's Sarah Herrlinger, Senior Director of Global Accessibility Policy & Initiatives, and Jeffrey Bigham, Research Lead, AI/ML Accessibility. AI and ML were common themes of the conference.

Artificial Intelligence, or AI, is the process of solving a task that would normally require human intelligence. Machine Learning, or ML, is a major part of current AI research, and uses huge sets of data to try to detect patterns and make predictions about objects, scenes, and text. For instance, a machine learning model might be fed thousands of images of different trees in order to teach it what a tree looks like and what characteristics trees may have. But not all trees look the same. They come in many shapes and sizes. Some have leaves, while others don't. To compound this further, trees look different when viewed from different angles. Where older methods for identifying an image may seek to find an exact match for a picture, or something very close, machine learning takes the previous images of trees into account and tries to determine whether the next image has these same characteristics. It can also use clues about the surroundings; for example, most trees would be found in an outdoor setting.
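
To make this concrete, here is a minimal sketch of how a developer might experiment with the same idea using Apple's public, on-device image classifier in the Vision framework. This is purely illustrative, not the model Apple uses for its own features; it assumes iOS 13 or later and a photo supplied by the caller.

    import Vision
    import UIKit

    // Ask Vision's built-in, on-device classifier what it sees in a photo.
    func classify(_ image: UIImage) {
        guard let cgImage = image.cgImage else { return }
        let request = VNClassifyImageRequest()
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])

        // Each observation pairs a label (such as "tree" or "outdoor") with a confidence score.
        let labels = (request.results as? [VNClassificationObservation] ?? [])
            .filter { $0.confidence > 0.3 }
            .map { "\($0.identifier) (\(Int($0.confidence * 100))%)" }
        print(labels.joined(separator: ", "))
    }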

Apple has made huge strides in the depth of its image recognition features, which really started to become widely available with iOS version 14 in the fall of 2020. This fall, iOS 15 included a new feature called Live Text, which lets you interact with text that is found in a picture. This can be accessed by taking a photo using the Camera app. As an example, if you take a picture of a business card that includes a phone number, you can select the number and dial it directly from the app. Like most new Apple features, it was accessible with VoiceOver from day one.
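
Apple has not published the internals of Live Text, but the same style of on-device text recognition has been available to developers since iOS 13 through the Vision framework. The sketch below illustrates that general approach; it is not Apple's own implementation.

    import Vision
    import UIKit

    // Recognize printed text in a photo entirely on the device.
    func recognizeText(in image: UIImage) {
        guard let cgImage = image.cgImage else { return }
        let request = VNRecognizeTextRequest { request, _ in
            let observations = request.results as? [VNRecognizedTextObservation] ?? []
            // Keep the top candidate string for each detected line of text.
            let lines = observations.compactMap { $0.topCandidates(1).first?.string }
            print(lines.joined(separator: "\n"))
        }
        request.recognitionLevel = .accurate   // favor accuracy over speed
        // The request is handled locally; no image leaves the phone.
        try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
    }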

Machine Learning has also been used to create features which are mostly targeted at certain users, such as people who are blind or have low vision. From the photo viewer, VoiceOver users can select an option to explore the image, allowing a user to move their finger around the screen to hear the relationship of items in a photo. When taking a picture of my desk, it identified my laptop, a mixer to its left, the wooden desk, and even some of the text near controls on the mixer. Instead of having all of these items in one large description as is done with some image apps, each element is selectable using regular VoiceOver commands, and I could move my finger around the screen to understand the relationship between these objects.

Machine Learning has also been tapped to provide accessibility to apps that may not have been accessible in the past. It does this through the Screen Recognition feature, which attempts to recognize common controls like buttons or sliders and then convey this information to a VoiceOver user. And while this is exciting, Herrlinger notes the feature is not meant to give app developers a free pass when it comes to making their products accessible.

"We are really encouraging of developers and kind of work with them to make sure that they’re putting the time and energy into the accessibility of their own apps so that members of the blind community have that full experience in a way that is just a little bit better than what we’re able to do with our machine learning models."

Performing these tasks on device is another huge technology advancement. Formerly, phones were not powerful enough to manage all of the data required for recognizing images or text on the device itself; instead, they would upload images to the cloud for processing by high-powered computers. While many apps still use this approach, it often results in slower response times and could lead to security breaches. Allowing for image recognition and other features on the device itself largely negates these challenges, and also allows these features to work in areas where Internet access is not available.

Bigham points out that even just 10 years ago, the prevailing thought was that there was not enough computing power in the world to make these huge leaps in Machine Learning advancements. Seasoned iPhone users may recall VizWiz, an app which could give information about what's in a picture. The difference between then and now is that at the time, a human was the one supplying the answer, not a computer. Now, there is a growing amount of data that can be used to provide these descriptions automatically. With numerous ways to learn about your surroundings on an iPhone alone, researchers and engineers will be looking at not only how to improve the computer-generated responses to questions, but also how to make the experience as fluid and seamless as possible for users.

Apple, as they often do, did not give many specifics on features planned for the near future, but it is quite evident this is a major area of focus for the company. They have made numerous advancements in Machine Learning over the past few years, and it will be interesting to follow their developments going forward.

Looking Forward with Google Lookout

Across the virtual aisle, one of the main areas of focus for Google and the Android platform has been Google Lookout, an app available on many modern Android devices which can recognize text, food labels, currency, and objects among other things. One area of emphasis for the Lookout team is finding ways to bridge the gap between the images that are used for training the data models and the pictures they are receiving from users. Often, user photos are less clear or taken at a less than ideal angle, so the results that are presented are not what might be expected. Lookout Product Manager Scott Adams wants to tackle some of these challenges and find ways for users to get the exact pieces of information they are seeking.

"What if only part of that object is visible? Is there a way we can let the person know that and perhaps coach them so they can get better results?"

Another example Adams gives is the text of a newspaper, where there may be multiple headlines and stories on a single page, laid out with many columns and pieces of information. He would love to find a way for the app and users to be able to tell the difference between headlines and articles, and then focus on the article or piece of information they desire. Currently, text is often read as one large block, making it more difficult to discern specific parts of a complex page or document.

Among the new features added in 2021 is support for handwriting, which can be found inside the Documents mode of the app. The decision to include handwriting in the existing documents mode was intentional to create a smoother experience for the user. Support for more devices and languages is also in development, though Senior Software Engineer Andreína Reyna admits this can be challenging given the vast Android ecosystem.

"We think that there’s a group of features or baseline of features that are so important that we really wanted to make sure they worked on all devices. And so, we have been testing and doing this gradual rollout to make sure that the features that we have are supported in all of our devices."

Despite the iPhone being the more dominant choice for blind and low vision users, especially in the United States, Android is used by nearly three quarters of the world's mobile phone users, and Lookout remains a key component of Google's accessibility strategy. It's available for free from Google Play.

A Stellar Preview for HumanWare

Dr. Louis-Philippe Massé, Vice-president of Product Innovation and Technologies for HumanWare, gave a first glimpse at a new GPS-based product they are currently calling the Stellar Trek. Building on past innovations from the Victor and Trekker lines, the new product will include both modern GPS features and a built-in camera. It's the latter inclusion that Massé feels will vastly improve the navigation experience for blind travelers. The device is a bit thicker than a smartphone, with two high-resolution cameras on the back and a simple set of buttons on the front that are intended to be activated with your thumb.

A familiar challenge for GPS users is what is often referred to as the FFF, or final forty feet, problem: the gap between a GPS that guides you to the correct block or general area and actually locating the door or landmark you are seeking. The Stellar Trek will use a combination of the cameras, more accurate GPS technologies, and an on-board voice assistant to help solve these challenges.

"We want to give additional help such that it will be a little like having a friend helping you going to those final 40 feet. So, we will use the AI and the [AUDIO OUT] to eventually locate potential threats. And sometimes, when you’re on the sidewalk, the path to that door is not a straight line. So, we will say, OK, you will have to go at 10 hours for 40 feet, and then turn to 2 hours, and so on".

The device is envisioned as a stand-alone unit, with no Internet connection required, but cloud connectivity is something that is being explored for the future, perhaps as a subscription add-on or a one-time fee, with an offline option remaining available. Massé cites battery life as one major advantage of a stand-alone unit, with potential battery life of five or six times that of a smartphone. The Stellar Trek is expected to sell for north of $1,000 and be launched this spring.

The Future Is Autonomous, But It May Be A While

When participants on a panel of experts on autonomous vehicles were asked by moderator Brian Bashin how long it would take until a blind person would have a reasonable chance of hailing a driverless vehicle, the responses varied widely, ranging from a few years to decades. This reflects some of the complexities that various groups are working through while developing and honing next-generation vehicle technology.

There are a variety of considerations to think about when teaching a vehicle how to coexist with a person who is blind, starting with simply recognizing a cane as an object to avoid, since in some cases a white cane may just blend in with the surroundings. Beyond this, it's important to recognize the varying travel methods of people. For instance, a blind person may explore a curb cut with their cane or react differently to a light change or other nearby objects. The interaction between a vehicle and a sighted pedestrian may be different from that same interaction with a blind traveler.

Aside from this, consider the situation where you are at an airport and are trying to find your Uber or Lyft. Currently, you might call the driver and describe what you look like so they can locate you. But an autonomous vehicle has no human at the controls, so alternatives will need to be considered. In Arizona, vehicles can honk the horn by request of a passenger, a feature used by many users, not just those who are blind. But perhaps a horn honk sound is too abrasive, and a more polite sound should be employed. There is also the matter of directing the rider to the vehicle, or to the door at their destination, and doing so in a way that is both efficient and safe.

Given all of these and many other issues still to be worked out, it becomes clearer why opinions on when autonomous vehicles will be a normal part of life vary so tremendously. So, look for a driverless car in 5 years, or perhaps 50.

More to Explore

There were more great panels and sessions than those mentioned above, including a look at the future of Amazon Alexa, a panel on indoor navigation and mapping, and a look at the latest developments for Microsoft's Seeing AI app. Be sure to check the agenda page to find all of the main sessions from the conference. Will many of these predictions come to fruition in the near future? Time will tell, but we'll be able to track the progress made when the conference returns for 2022.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Previous Article

Next Article

Back to Table of Contents

Entering Text on an iPhone, Part 1: Using Software Methods from the Phone

Judy Dixon

Many people find it difficult to enter text using the iPhone's onscreen keyboard. The good news is that there are many ways to enter text, and the default method is possibly one of the most difficult. Some of these additional ways are variations of the onscreen keyboard itself, while others employ a completely different strategy.

This two-part article will present an overview of a wide variety of input methods. In Part One, we will look at the software options, those that can be found on the iPhone itself. In Part Two, we will look at several third-party hardware devices that can help with entering text.

Text Entry Methods Using the Onscreen Keyboard

Whenever you have the opportunity to enter text on your iPhone, a keyboard magically appears at the bottom of the screen. There are no actual keys on this keyboard, but there are places where you can tap and your selected character will be entered. Your phone is smart enough to know exactly where you are touching.

The layout of the onscreen keyboard varies from app to app, and even with your phone's orientation. So, if you are being asked to enter your email address, look around on the bottom row of the keyboard and you will very likely find an at sign. The Delete key is below the L and the Return key is below the Delete key. In general, the keyboard has three rows of letters but no numbers. To get numbers, activate the Numbers key near the lower left corner of the keyboard. The top row of the Numbers layout has the digits 1 through 0, with the most common punctuation on the other two rows. You can get more punctuation and other characters by activating the Symbols button on the left side of the keyboard.

If your region is set to the United States, two keyboards will be available by default, English (US) and Emoji. Emoji are those little pictures like "smiling face," "red heart," and "face blowing kiss" that are frequently used in text messages and elsewhere. To put emoji characters in your documents, select the Emoji key at the bottom left corner of the keyboard. The Emoji keyboard opens with the Frequently Used category selected. Across the bottom are nine radio buttons representing the nine available categories of emojis. The Next Keyboard key in the bottom left corner will take you back to the English keyboard. If you have more than two keyboards installed, this key will cycle among them.

Typing Mode

There are three Typing Modes that control how you select keys from the onscreen keyboard. With VoiceOver enabled, when you first set up a brand-new iPhone, the keyboard Typing Mode is set to Standard Typing. Standard Typing means that you slide a finger around the keyboard until you hear the character you are seeking. VoiceOver will speak the name of each character as your finger glides over it. Then, you can either lift your finger and double-tap anywhere on the screen, or you can split-tap, which is to keep a finger on the letter and tap anywhere else on the screen with a second finger. In either case, VoiceOver will repeat the letter as it is entered so you can be sure it is the one you want.

An alternative Typing Mode that is preferred by many blind people is Touch Typing. With Touch Typing enabled, you slide one finger around the keyboard until you find the character you want, then just lift your finger, and that character will be entered. The character that is entered is spoken again, this time in a higher pitch, so you can be assured that you actually got the right character. In Touch Typing, you can also split-tap to enter a character.

There is a third option on the Typing Mode menu called Direct Touch Typing. This option lets you enter a character with a single tap. There is no confirmation beyond the name of the character being spoken when you tap it. This form of input works best for low-vision users who can see well enough to aim for the key they want and actually have a good chance of hitting it.

As you are sliding your finger around the keyboard, if you are unsure about which character is active, just hold your finger still and VoiceOver will speak the phonetic word associated with that character: Alpha for A, Bravo for B, Charlie for C, and so forth. If it is not the character you want, just start moving your finger again. As long as you don't lift your finger or tap somewhere else on the screen, the unwanted character will not be entered.

There are a couple of important things to remember here. Be sure that only one finger is touching the screen and that you keep all other fingers tucked out of the way. Also, be sure that you don't take action until you are on the character that you want. This may take a bit of practice at first, but in no time at all, it can become second nature.

Typing Mode is a VoiceOver setting that can be quickly changed. Changing Typing Mode is the very first thing I do when I pick up a new iPhone or iPad. You can change Typing Mode with the VoiceOver rotor, or in Settings > Accessibility > VoiceOver > Typing, where it is called Typing Style. To change it with the rotor, you must be in a text field of some kind. Only when you have a keyboard on the screen will Typing Mode appear on the rotor. Simply rotor to Typing Mode and flick up or down with one finger until you reach the one you want. The mode you choose with the rotor will be retained, and all your future inputs will default to the Typing Mode you choose.

Slide to Type

Out of the box, there is a mainstream keyboard feature called Slide to Type that is on by default. This feature is fully accessible and works when you have Typing Mode set to Touch Typing or Direct Touch Typing. Once activated, you slide your finger from letter to letter. The phone looks at the pattern you drew to figure out the word you want.

To activate Slide to Type, slide your finger to the first letter of the word you want to type. Hold your finger on that key until you hear an ascending three-tone sequence, then start sliding to the next character you want. Stop briefly, then, without lifting your finger, continue sliding your finger to the remaining characters in the word. When you have finished the word, lift your finger and VoiceOver will speak the word it thinks you wanted. Slide to Type will attempt to predict the word as you move your finger. You may hear the word you want before you have moved to all the letters. In this case, just lift your finger and the word will be entered. Other possible words will be placed above the keyboard. You can easily get to these by sliding your finger up from the q key or flicking left from the q key. If you prefer one of them, just double tap it and it will replace the last word entered. If your word isn't among the suggestions, press Delete, and the entire word is deleted.

The default amount of time to hold your finger on a letter before Slide to Type is activated is one second. This can interfere with hearing the spoken phonetic for the letter. If you would like that time to be longer, you can change it in Settings > Accessibility > VoiceOver > Typing > Keyboard Interaction Time. The options are 0 to 4 seconds. Alternatively, you can put Slide to Type on the rotor so you can turn it on or off easily, or you can turn Slide to Type off completely in Settings > General > Keyboard.

Accented and Other Alternative Characters

You can enter accented letters and many other alternative characters. When in Standard Typing mode, touch a letter, then double-tap and hold until you hear a tone indicating that alternative characters have appeared. VoiceOver will say "alternative characters available."

Slide left or right to hear the choices, and release your finger to enter the character you want. When in Touch Typing mode, hold your finger on a letter until you hear the tone and the VoiceOver announcement, then proceed as above.

So, to type an e with an accent, as in résumé, type the r, then place your finger on the e until you hear VoiceOver say "alternative characters available." Still keeping your finger on the keyboard, slide to the right until you hear "e acute," then lift your finger. Do the same for the final e. Many of the punctuation keys have alternative character menus, too. The $ (dollar sign) offers other currency symbols, the . (period) offers an ellipsis, and the - (hyphen) gives you longer dashes and a bullet. The " (left quote), ? (question mark), ! (exclamation point), and ' (apostrophe) keys all offer alternates as well.

If you get Slide to Type when you want to enter alternative characters, just keep your finger on the letter and the alternative characters menu will appear.

Keyboard Settings

There are many settings that can make the process of inputting text even easier. The ones for software keyboards are located in three different areas within Settings. Here are a few of the major ones.

In Settings > General > Keyboard, you will find:

  • Keyboards: Here you will find the installed keyboards. At the bottom of this list is Add New Keyboard. Double tapping this button brings up a list of all native keyboards that can be installed and any third-party keyboards that are available because you installed their apps. The various language keyboards are listed under the heading "Other iPhone Keyboards," below the third-party keyboards.

  • Text Replacement: This is a way to create short strings that expand to much longer ones such as "myad" which expands to your full address. You can use whatever sequence of letters you wish.

  • Auto-Capitalization: This controls whether iOS automatically capitalizes the first word of a sentence, days of the week, months of the year, and other proper nouns.

  • Auto-Correction (on by default): This controls whether your phone will offer to autocorrect what you type.

  • Predictive (on by default): This feature will try to predict what you are about to type. Predicted words are placed above the keyboard.

  • Slide to Type (on by default): This is where you can turn Slide to Type off.

  • Period Shortcut: With this setting on, when you finish a sentence all you have to do is hit the Space key twice and the device inserts a period, keeps one space, and capitalizes the next letter.

In Settings > Accessibility > Keyboard, you will find settings for both hardware keyboards and software keyboards. The only item for software keyboards is Show Lowercase Keys, which affects how keys are displayed on keyboards that use the Shift key.

In Settings > Accessibility > VoiceOver, you will find:

  • Verbosity

    • Capital Letters: This setting controls how VoiceOver indicates capital letters. The options are Speak Cap (default), Play Sound, Change Pitch, and Do Nothing.

    • Deleting Text: This setting controls how VoiceOver indicates that text has been deleted. The options are: Speak, Play Sound, Change Pitch (default), and Do Nothing. Speak inserts the word “deleted” before the item that has been deleted.

  • Typing

    • Typing Style: This setting is the same as the Typing Mode setting I discussed earlier in this article. You can set Typing Mode from the rotor or here in Typing Style. I have no clue why it is called by two different names. As on the rotor, there are three possible options: Standard Typing, Touch Typing, and Direct Touch Typing.

    • Phonetic Feedback: This setting controls how VoiceOver speaks the names of keys on the keyboard. The options are: Off, Character and Phonetics (default), and Phonetics Only. Off still speaks the name of the key; it just doesn’t speak the phonetic after it. Phonetics Only speaks the phonetic as you move your finger around the keyboard. Some VoiceOver users who are also hearing impaired find Phonetics Only to be very useful because they can quickly tell the difference between November and Mike without having to wait for the phonetic to be spoken.

    • Typing Feedback: This setting controls what is spoken as you type. It has three sections: Software Keyboards, Hardware Keyboards, and Braille Screen Input. All three sections have the same options: Nothing, Characters, Words, and Characters and Words (default).

    • Keyboard Interaction Time: The amount of time to wait before VoiceOver activates Slide to Type or alternative keys. The default is one second.

Other Input Methods

VoiceOver has two additional input methods that do not use the onscreen QWERTY keyboard.

Braille Screen Input

Braille Screen Input is a way of writing uncontracted or contracted braille on an iPhone. It is not an app, but a feature that is part of the iOS operating system. In addition to entering text anywhere there is a keyboard on the screen, you can use Braille Screen Input to find apps from a Home screen, enter your passcode, or enter single letters to navigate a web page. Below is a quick overview to get you started with this very cool feature. For more information, go to Type braille directly on the iPhone screen using VoiceOver.

Braille Screen Input is not on by default. To enable it:

  1. Go to Settings > Accessibility > VoiceOver > Rotor.

  2. Double tap on Braille Screen Input and make sure it says, "Selected."

  3. It can be helpful to move Braille Screen Input to the top of the list of rotor items. If you do this, Braille Screen Input is usually the first rotor selection when you turn the rotor clockwise. Conversely, if you prefer rotoring to the left, you can move Braille Screen Input to the bottom of the list and it will be first when you rotor counterclockwise.

To enter braille, set the rotor to Braille Screen Input, then turn your iPhone to landscape orientation. With the usual six fingers, begin typing braille as you normally would. By default, VoiceOver will speak both characters and words as you type.

The iPhone can only accept five simultaneous touches. To type a for-sign, place any five fingers on the screen, then lift one finger leaving the other four on the screen, then add the sixth finger.

A three-finger swipe to the right toggles between contracted and six-dot braille. You can enter a space using a one-finger swipe to the right, a backspace with a one-finger swipe to the left, and a new line with a two-finger swipe to the right.

Each time you start using Braille Screen Input, you should calibrate the dots. Do this by typing a 4-5-6 character with your right hand and then an l (dots 1-2-3) with your left hand. VoiceOver says, “Dot positions calibrated.” This needs to be typed fairly quickly, at about the speed that you would normally double tap. You can exit Braille Screen Input by turning the rotor in either direction or by returning your phone to portrait orientation.

Handwriting

Handwriting lets you draw print letters on the screen instead of typing them from the keyboard. Handwriting is not enabled by default, but can be added to the rotor in the same way that Braille Screen Input is added. In addition to entering text, you can also enter your passcode, search for apps from the home screen, and perform single-letter navigation on web pages. Handwriting also has additional gestures. You can get more information about the handwriting feature at:

Write with your finger using VoiceOver on iPhone.

Dictation

Dictation on an iPhone has steadily improved over the past few years. I am frequently astounded at what a great job it does. Dictation allows you to use your voice to enter text anywhere a keyboard is visible. This setting is off by default, but the first time you try to dictate somewhere when Dictation is not enabled, you are informed of this and given an opportunity to Enable Dictation right then and there.

Once dictation is enabled, there will be a Dictate button in the lower right corner of most of the keyboard layouts. You can also activate Dictation with a two-finger double tap but you must be in an edit field for this to work. When your device is ready to receive your dictation, you will hear a tone; it's a single version of the Siri tone.

To dictate text, just speak the words along with all necessary punctuation. You can format your text with commands like "new line" and "new paragraph." In addition to punctuation, you can enter symbols such as degree sign, trademark sign, and forward slash.

You can find a complete list of punctuation, symbols, and formatting commands that can be used while dictating at Use Dictation on your iPhone, iPad, or iPod touch.

When you have finished your dictation, do another two-finger double tap. You will hear another single tone, higher than the first one. In some apps, VoiceOver says "inserted..." followed by your dictation. Sometimes, you have to touch the screen to hear your dictation.

Conclusion

Becoming comfortable with entering text on an iPhone can take some time and a bit of patience. It can be time well spent to open the Notes app, create a new Note and just practice typing. You may be amazed at how quickly your proficiency improves and your confidence soars.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Previous Article

Next Article

Back to Table of Contents

The Vale: Shadow of the Crown – An Audio Game with Excellence

Jamie Pauls

When I first began writing articles for AccessWorld, I was a game reviewer. Over the years, I have written about many other things—audio description, Braille, Diabetes management—but I have always returned to gaming. When I first began using a computer in the 90s, I listened to friends play immersive games and wished I could do the same with my computer. Over the years, many games have been developed for vision impaired players, and many of them have been quite good. Other games, while not designed with the needs of blind people in mind, have been accessible by accident, if you will. Still other games have started out without accessibility, but blind people were able to have productive discussions with the game developers which resulted in an accessible game play experience further into the life of the game.

Recently, a new audio game has been released that takes the game play experience for blind and sighted players alike to a very high level of excellence.

The Vale: Shadow of the Crown by Falling Squirrel requires no sight for game play because the main character in the story is blind. Born into a royal family, the blind princess whose story unfolds over many hours of game play is sent to a border region of the land after the death of her father, the king, and her brother’s subsequent ascension to the throne. On her way to her new home, the party with whom she travels is overtaken by a group of rebel warriors. She finds herself alone, with the exception of a shepherd who befriends her.

Never fear, however. Alex, the princess, has some rather unique abilities that place her in a better position to survive than one might first think. Just because her father the king thinks a blind girl can’t succeed in the world, let alone rule a kingdom, doesn’t make it so, and Alex proves this over and over again as the story of The Vale unfolds.

Who is Alex, really? What powers does she actually possess? Finally, why are dark forces from another realm desperate to stop Alex from surviving, let alone returning to her home in the capital?

Over the course of at least five hours of game play, all these questions will begin to be answered in some intriguing and even startling ways.

Obtaining the Game

The Vale is available on several platforms including Steam, Xbox, and PC. Steam is not recommended for screen reader users. I played the game on a Windows 10 PC. You can purchase the game here for $19.99 at the time of this writing.

The game is totally self-voicing, and a screen reader is not needed. I set NVDA’s speech mode to off and had no difficulty playing the game, but if JAWS is your screen reader, you will have the best results if you unload it completely.

Playing The Vale

It is possible to use a game controller when playing The Vale, and you will receive haptic feedback, but I used a keyboard with no difficulty. There is a beginner’s user guide available for this game, but you could easily delve into game play without reading it first. Intuitive tutorials take you through various elements of game play as the story begins. The human voice of a narrator will give you commands such as W to walk forward, S to move backward, and the letters A and D to side-step. There are a couple of tutorials available from the main menu that take you through important elements of game play, including the use of weapons. One nice touch is that these tutorials actually take you through scenes from Alex's memories, which sets the context for the commands you are learning.

Much of the game play is very linear, and you need do nothing more than sit back and listen as the story unfolds. At times, your character will need to fight, and your skill in wielding your character’s weapons will determine the outcome of the story. If your character dies, and she most likely will do so frequently as you gain skill in playing The Vale, the most recent scene restarts seamlessly. Finally, there are times when your character will need to explore her surroundings and make decisions that will enhance and vary the game play experience. I have seldom experienced game play that makes such good use of sound cues during exploration, and I can safely say that I have played no game that does this better than The Vale. Important sounds are easily isolated yet blend nicely with other ambient sounds. Sound cues can include music, clanging hammers, people talking, and other-worldly sounds that are uniquely important to this game.

There are three difficulty levels of game play, and I chose normal. I found game play to be easy enough not to discourage me from finishing the story, yet challenging enough that I needed to take a break at times to rest my brain!

The musical score for this game is varied and beautiful, and I have found no voice acting in any game that is better than The Vale. I can only assume that professional acting talent was used throughout the entirety of the game.

There are many places in the game where Alex, the main character, flashes back to earlier events in her life or is given insight into things that are yet to occur. These cut scenes are handled as smoothly as anything I have ever heard, and they make game play a real treat.

In preparing to play the game, I read a mention of its binaural game play experience. This is the technique of giving the player the illusion that things are taking place not only on either side of them but also in front of or behind them. Done properly, this can be a startling effect, and I was pleased to note that The Vale handles it superbly. I don’t own the most expensive headphones available, but they are decent, and I was literally able to turn my character 180 degrees and place a sound source behind her successfully. There are a couple of underwater fight scenes that made my jaw drop during game play as well.

The Vale contains RPG elements, including building up character stats and acquiring everything from magic to better weapons and armor. As weapons and abilities are added throughout the game, the tutorial tells the player how best to use them. There are various interactions with other people that your character can have, including quests that can both improve character stats and enhance game play. The up and down arrow keys let you move through these interactions, and it would be nice if the player could be notified of how many interactions there are in a given scene, for example, "one of four quests available." This is not a major point of concern, but rather an enhancement that I believe would make game play even better than it already is.

The Bottom Line

The Vale: Shadow of the Crown is an audio-only game designed without a video component for economic reasons. However, the game developer soon realized that the blind community would benefit from this game, and he reached out to the Canadian National Institute for the Blind for input. The result is a game that has a rich plot, a beautiful musical score, and some of the best sounds I have ever heard.

All elements of game play are well thought out and, for me at least, easy to carry out. There was enough of a challenge to keep me interested in the game but not such an element of difficulty to make me want to throw up my hands in frustration and walk away. I didn’t mind paying the almost 20 bucks for this game, and should Falling Squirrel produce a sequel to The Vale or another audio game in the future, you can be sure I will lay down the money for the new title.

I found an interesting article on how The Vale came to be, and I include it here in case you are interested as well.

The Vale might remind you of an earlier game I reviewed for AccessWorld. You can read my review of A Blind Legend here. Another audio game that comes to mind when playing The Vale is A Hero’s Call, which was reviewed for AccessWorld by Aaron Preece. You can read Aaron’s review of A Hero’s Call here.

Compared to other role-playing games such as A Hero's Call, The Vale is a simpler and shorter game. That being said, it provides some of the best sound design and voice acting you will find in audio gaming and can be enjoyed by beginners and experienced audio gamers alike.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Previous Article

Next Article

Back to Table of Contents

AccessWorld News: What's New in Windows 11 for Screen Reader Users?

Judy Dixon

In this piece, I will briefly review the major new features in Windows 11 that will affect screen reader users. Many of the changes from Windows 10 to 11 are visual and will have little to no impact on those who rely on screen readers. The new interface is meant to be softer, friendlier, and more discoverable. Changes such as the Start menu sitting in the middle of the taskbar will not even be discernible to those who do not see the screen. There are, however, a few differences that will be noticeable, but the overall experience of using Windows 11 is very similar to Windows 10 and should not present any difficulties.

Upgrading to Windows 11

More than six years after the release of Windows 10, Microsoft released Windows 11 to the public on October 5, 2021. Not everybody is being offered the free update immediately; Microsoft is rolling it out over a period of months. But if you just can't wait, you can download and install it whenever you like.

There are several prerequisites that a computer must meet to be eligible for Windows 11. Details on these are listed at Windows 11 System Requirements.

If you would like to know whether your computer is eligible for Windows 11, Microsoft recommends installing the accessible PC Health Check app, which can be downloaded from the same system requirements page.
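For readers who are comfortable running a short script, here is a minimal sketch of my own, not anything Microsoft provides, that reports whether the copy of Windows you are running right now is already Windows 11. It relies on the fact that Windows 11 build numbers start at 22000; it does not check hardware eligibility for the upgrade, so the PC Health Check app remains the recommended tool for that.

```python
# A minimal sketch, not a replacement for Microsoft's PC Health Check app.
# It only reports whether the currently running copy of Windows is already
# Windows 11 (build 22000 or later); it says nothing about whether your
# hardware is eligible to upgrade.
import sys

def windows_11_or_later() -> bool:
    """Return True if the running OS reports a Windows 11 build number."""
    if sys.platform != "win32":
        raise RuntimeError("This check only makes sense on Windows.")
    # Windows 11 releases begin at build 22000; Windows 10 builds are lower.
    return sys.getwindowsversion().build >= 22000

if __name__ == "__main__":
    try:
        build = sys.getwindowsversion().build
        label = "Windows 11" if windows_11_or_later() else "Windows 10 or earlier"
        print(f"{label} (build {build})")
    except (AttributeError, RuntimeError):
        # sys.getwindowsversion() does not exist on non-Windows systems.
        print("Not running on Windows, so there is no build number to check.")
```

Running the script from a command prompt simply prints something like "Windows 11 (build 22000)"; again, a result of "Windows 10 or earlier" does not mean your machine cannot be upgraded, only that it has not been yet.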

New computers are now shipping with Windows 11. If your first encounter with Windows 11 is on a new computer, Windows 11 will tell you as you boot the computer that you can launch Narrator to go through Setup. If you are not a regular Narrator user and choose to do this, I recommend brushing up on your Narrator keyboard shortcuts first. You can get the Complete guide to Narrator from the Microsoft website.

New Keyboard Shortcuts in Windows 11

There are a few new keyboard shortcuts in Windows 11.

The Action Center has been split into two parts, Quick Settings and the Notification Center:

Windows key + A opens Quick Settings. Here you can manage Wi-Fi, Bluetooth, and audio; edit the Quick Settings menu; and open the All Settings interface.

Windows key + N opens the Notification Center. The Notification Center includes all your notifications and a full-month calendar view.

Windows key + C opens Chat. Microsoft Teams is now built into Windows, and this keystroke opens its Chat feature.

Windows key + W opens the Widgets interface. Widgets is a new feature similar to "news and interests" in Windows 10. By default, it includes widgets for weather, news, sports, and so forth. You can add and delete items as you wish.

Windows key + Z opens the Snap layouts menu. Snap layouts is a way to visually position the windows on your screen.

Voice Typing

Voice Typing is the dictation feature built into Windows 11. Windows 10 had a rudimentary dictation feature, but it has been significantly enhanced in Windows 11 and is now a smooth, seamless experience that is available any time you can enter text from the keyboard.

Simply press Windows key + H and Voice Typing will be activated. Even if you are running a screen reader such as JAWS, the words are not spoken as they are entered, so the additional speech does not interfere with the dictation. If you are using a braille display, the dictated words scroll onto the display as you speak them, so you can follow along as they are written. You will need an Internet connection for Voice Typing to work. Pressing any key on the keyboard will turn Voice Typing off.

Autopunctuation, which is off by default, can punctuate your dictation as you speak it. To turn it on while Voice Typing is active, press Alt + Windows + H. Shift-Tab once to reach Settings and press Enter, then tab to Autopunctuation and press the spacebar to turn it on. As of this writing, the Alt + Windows + H hotkey does not bring up this dialog when JAWS is running, but if you unload JAWS and launch Narrator with Control + Windows + Enter, you will be able to do it. Once turned on, autopunctuation stays on until you turn it off.

A Few Small Changes Worth Noting

Ease of Access has been renamed Accessibility.

There is a new sound scheme. The sounds are softer and much more subtle than the sounds in Windows 10, and light mode has one set of sounds while dark mode has a separate set.

Focus Assist now lets you set up a Focus Session through the Clock app, choosing an amount of time during which notifications are limited, with options to include scheduled breaks and associate specific tasks.

New Features on the Horizon

Additional features have been rolled out to Windows Insiders, the beta testers and developers who try out new features before general release. Two of particular interest are the ability of Windows 11 to run Android apps, and Voice Access, which will allow users to control a PC by talking to it.

Conclusion

In summary, Windows 11 is a smooth experience for screen reader users with a few nuggets worth getting excited about. For a good set of lessons describing the new features and changes in Windows 11 with JAWS, see Windows 11 Training - dSurf.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Previous Article

Next Article

Back to Table of Contents

Letters to the Editor

In this section, we publish letters sent to us here at AccessWorld. If you would like to submit a letter, you can do so by activating the "Comment on this Article" link at the bottom of each article or by sending an email directly to the Editor, Aaron Preece, at apreece@afb.org.

Dear AccessWorld Editor,

This message is in response to the October 2021 Editor's Page.

I am 73 and was employed in very difficult times. I was a social worker and had to go where no one wanted to live just to work. With no public transportation and no paratransit, life was challenging.

When I worked as a Vocational Rehabilitation Counselor, I found many people unwilling to risk moving to an unfamiliar place.

I also think that SSI can be a disincentive. If your subsidized apartment is $75 and you receive $625, it's all a matter of scale. It is hard to risk failure.

I was the lucky recipient of affirmative action. Even so, I found I had to put in three times the amount of effort to get the job done. Before computers, I had to work after closing, dictating information that a reader would put on a form.

When CETA ended, I asked the agency to pay for reading. I had to go to the Equal Employment Opportunity Commission about this, and the agency administration made my life miserable. I ended up changing jobs.

I worry about people receiving services from TVIs only once a week. I went to a resource room and then out to regular classes for six years. When I went to Holy Names Academy in 1960, I had all of my Braille and typing skills. I was fortunate enough to have a mother who learned Braille and did a lot of brailling for me. 

Belief in oneself is critical but so are good orientation and mobility skills as well as a solid education. We have to be twice as competent to get a whack at the ball. Although I believe attitudes are improving, obtaining competitive employment continues to be a real challenge.

Happy Holidays,

Alco

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.

Comment on this article.

Previous Article

Back to Table of Contents