• Seeking a new VPN provider, does anyone have suggestions?

    I’m looking for a new VPN provider and am curious if any of my readers use one that they would recommend. I’ve been extremely happy with my current VPN provider, ExpressVPN, but $116.95 billed every 12 months is frankly more than I want to pay for a VPN service that I only use occasionally. Ideally, the perfect VPN for me would:

    • Be accessible on Mac, Windows, iOS, and Android.
    • Allow third-party clients to be used if needed, such as if their main client becomes inaccessible because of an update or something.
    • Alert me if the connection has been dropped. After all, a VPN that would allow traffic to proceed normally if the VPN connection drops is not very useful. My work VPN has no trouble managing this, but I haven’t encountered a consumer VPN product yet that does this right — maybe I’ve just been using the wrong ones?

    When reading various reviews of VPN services, I find they often list a number of features that don't particularly interest me, although I guess they would be bonuses:

    • Ability to direct traffic through a specific country.
    • Ability to circumvent geolocation restrictions.
    • Additional products unrelated to the VPN itself such as password managers and the like.

    If you use a VPN service that you like, I would definitely love to learn more. And if you can get some sort of referral credit by sharing an affiliate link, feel free to pass that along as well.

  • Playing around with Aiko, an amazing, accessible transcription app for Mac and iOS

    I recently heard about this fantastic app, available for both macOS and iOS, called Aiko, which leverages AI technology to transcribe audio. A few of the things that set Aiko apart from similar solutions:

    • It's free, totally free.
    • Audio can be dictated directly into the app, or a pre-recorded file can be imported. I'm particularly excited about this second piece.
    • Everything happens on the end-user's device; nothing is sent to the cloud.
    • Multiple languages are supported; we're talking a lot of languages: 100, according to Aiko's home page.

    I was excited to test out this fascinating technology, so to really put it through its paces I decided to record some audio using my Apple Watch while standing outside with lots of traffic and other background noise, a decidedly sub-optimal recording environment. What follows is the unedited output of my little experiment. I'm also including the actual recorded audio, so that you can get a sense of the crummy audio I gave Aiko to work with.

    Hello, and thanks for joining me today.

    I'm playing with an app called AIKO.

    It's an app that leverages Whisper, which is a technology made by OpenAI, the folks that brought us ChatGPT.

    Now unless you've been living under a rock for the past couple of months, I'm sure you've heard quite a lot about ChatGPT and the fascinating possibilities it opens up to us.

    Anyway, Whisper, and on top of that this AIKO app, allow transcription of audio.

    The interesting thing about it is that you can record directly in the AIKO app, or you can import audio, say from a file that was pre-recorded.

    For example, you might have a pre-recorded audio file of a lecture or a class.

    You would be able to import it into this AIKO app, transcription would happen, and then you would have the output as text.

    For my test today, I'm standing outside in front of my house recording on my Apple Watch with traffic going by.

    And the reason I'm doing this is because I wanted to come up with a very sub-optimal recording environment, just to better understand how the technology would deal with audio recorded in such an environment.

    I'm also trying to speak as naturally as I can without saying words like um and uh, things that I think often get said when speaking.

    The interesting thing about AIKO and the way that it transcribes audio is that it supposedly is able to insert punctuation correctly.

    I'm not sure if it does anything about paragraphs or not, but as the speaker, I don't have any way of controlling format.

    Once you run a file or recording through AIKO, the output is rendered as text.

    However, there are a few things you can do with it.

    First, you can of course copy the text into some other application.

    The other thing that you can do is have the text be timestamped.

    The reason that this can be handy is that you can use that then to create files that can be used as closed captioning for videos.

    Anyway, it is kind of loud out here, and so I will go back inside.

    I also didn't want to make this too long because I'm not sure if it'll work at all or how accurate it'll be, but my plan is to post this to the blog without editing it.

    Stop, stop, stop.

    Aiko-generated transcription from my Apple Watch recording.

    One final note: the dictation ends with the words "stop, stop, stop." I didn't actually speak those words, but because I have VoiceOver activated on my Apple Watch, they were picked up in the recording as I located and activated the stop button. This is definitely incredible technology, and the price certainly can't be beat. From an accessibility perspective, I found Aiko to be extremely accessible with VoiceOver on both Mac and iOS, and since it is a native app using native controls, I feel confident that it will work with other assistive technologies as well. You can find more information about Aiko, including FAQs, links to App Store pages and more here.
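
    Since the transcript above talks about using Aiko's timestamped output to create closed captions, here's a tiny sketch of what that step could look like. This is purely my own illustration, not Aiko's code; the TranscriptSegment type and the sample timings are made up.

    ```swift
    import Foundation

    // Hypothetical segment type: start/end in seconds plus the transcribed text.
    // Aiko's actual export format may differ; this is only an illustration.
    struct TranscriptSegment {
        let start: TimeInterval
        let end: TimeInterval
        let text: String
    }

    // Format seconds as the HH:MM:SS,mmm timestamps that SRT captions expect.
    func srtTimestamp(_ seconds: TimeInterval) -> String {
        let totalMillis = Int((seconds * 1000).rounded())
        let h = totalMillis / 3_600_000
        let m = (totalMillis / 60_000) % 60
        let s = (totalMillis / 1000) % 60
        let ms = totalMillis % 1000
        return String(format: "%02d:%02d:%02d,%03d", h, m, s, ms)
    }

    // Build the SRT document: a numbered cue, a time range, the text, a blank line.
    func makeSRT(from segments: [TranscriptSegment]) -> String {
        segments.enumerated().map { index, segment in
            "\(index + 1)\n\(srtTimestamp(segment.start)) --> \(srtTimestamp(segment.end))\n\(segment.text)\n"
        }.joined(separator: "\n")
    }

    // Example usage with made-up timings.
    let segments = [
        TranscriptSegment(start: 0.0, end: 2.5, text: "Hello, and thanks for joining me today."),
        TranscriptSegment(start: 2.5, end: 5.0, text: "I'm playing with an app called Aiko."),
    ]
    print(makeSRT(from: segments))
    ```

    Save that output with an .srt extension and most video players and captioning tools will pick it up.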

  • When Success Means Buying a Smaller Suit

    Recently, I got to participate in the Parallel podcast talking about, of all things, accessibility and fitness. The reason I phrase it this way is that anyone who knows me probably knows that fitness and I don't normally go together in the same sentence, let alone the same podcast. From the show description:

    Starting or maintaining a fitness program is a challenge for anyone. If you have accessibility needs, you might experience barriers related to touchscreen devices, coaching that doesn't address a hearing or visual disability, or a need for accommodations related to physical limitations. With its Fitness+ service, Apple has taken on some of these issues, and opened up the program to many more people with disabilities. We'll talk with a Fitness+ user, and someone who has worked on Apple accessibility teams.

    https://www.relay.fm/parallel/80

    Talking about anything fitness related has always been challenging for me, and so I want to particularly thank the ever-awesome Shelly Brisbin for being brave enough to include me. I also want to especially thank Sommer Panage and the other unsung heroes who dare to dream of a more accessible world, and work so hard to make that a reality.

    Parallel can be found everywhere great podcasts can be found. More info about the episode, and how to subscribe to Parallel (which you should totally consider doing whether you listen to this episode or not), can be found on Parallel's home page.

  • 100 Days of SwiftUI, my foray into understanding a bit more about how iOS works

    Ever since I was able to accessibly use an iOS device, an iPhone 3GS, I've imagined how awesome it would be to develop my own applications. That excitement was very short-lived though, as I soon became aware of just how complicated developing an application really is. It's a very involved process -- or so it seemed to me -- and for someone who hasn't written any code since C++ was the talk of the town, it seemed like an impossibility. I wrongly assumed this was especially true for iOS because apps are often very visual and interactive, and I just couldn't imagine how I'd tackle that without vision. And so I quickly decided that iOS app development was just not for me.

    Fast-forward quite a few years, and Apple released Swift, a more modern and approachable programming language, along with SwiftUI, a framework that -- at the risk of oversimplifying things quite a bit -- aims to make building apps far more natural. Put another way, Swift and SwiftUI are intended to make application development easy enough for just about anyone to learn and do. Being a natural skeptic, I doubted that it could be quite as easy as Apple seemed to suggest, but the idea behind it seemed really interesting to me; indeed, Swift and SwiftUI have taken the iOS development community by storm, with entire applications being developed using them. With only so many hours in the day though, my challenge was going to be finding the time to devote to learning it. And so again, I set the idea aside, figuring I might look into it whenever I had more time.

    I'm not proud of this, but I have a long list of things I want to do when I have more time. The thing is, the longer I wait to do any of the stuff on that list, the less time I'll actually have to do any of it.

    I initially learned about 100 Days of SwiftUI from Darcy and Holly of the Maccessibility Roundtable podcast. The idea behind this course is simple: learn SwiftUI gradually -- you guessed it -- over 100 days. The course suggests devoting an hour per day to learning and practicing the material. An hour per day doesn't seem that bad to me; I probably spend at least an hour per day thinking about all the stuff I'd love to do, if only I had an hour per day. :) While looking at the contents of the course is a little scary for someone like me who is just beginning, I love that there are days set aside for review and practice. In addition, there is an emphasis on not trying to go it alone: students are encouraged to share progress and help one another. That sharing-progress thing is actually one of the two rules of the course, as it can help with accountability and can also help the student make connections with others who are also learning.

    So, what do I hope to ultimately accomplish? Sure, I'd absolutely love to get to the point where I can start developing or working on apps that are useful to someone, but that's not actually my goal. I want to understand more about iOS apps because so often, when I report an accessibility issue, I feel like I really don't have a way to describe what's not working for me other than to say that something just isn't working. I'm hoping that by learning the basics of SwiftUI, I might be in a slightly better position to provide more constructive feedback. Whether I'm able to develop my own apps, or help other developers improve theirs, I figure it's a win either way and so I'm excited to get to learning. For anyone else who might also be interested, let's definitely connect and learn together.
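
    To give a flavor of the kind of thing I'm hoping to learn, here's a tiny SwiftUI sketch of my own (not something from the course) showing the sort of detail I'd love to be able to point at when reporting accessibility issues: an icon-only button that VoiceOver can't describe meaningfully until the developer adds a label.

    ```swift
    import SwiftUI

    struct RefreshButton: View {
        var body: some View {
            // An icon-only button: without extra information, VoiceOver has
            // little more than the image name to announce.
            Button {
                // refresh action would go here
            } label: {
                Image(systemName: "arrow.clockwise")
            }
            // One modifier gives VoiceOver something meaningful to speak.
            .accessibilityLabel("Refresh")
            .accessibilityHint("Reloads the current list")
        }
    }
    ```

    Being able to say "this button needs an accessibility label" is exactly the kind of constructive feedback I'd like to be able to give.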

  • Tip: Does the FaceTime control bar sometimes get in your way? There's an accessible way to dismiss it.

    One of the new features introduced in iOS 15 is the call control bar, which displays FaceTime audio controls across the top of the iOS screen during a FaceTime audio call.

    Screen shot of Steve's very messy iOS home screen with the FaceTime control bar across the top. Visible controls, from left to right, are leave call, open messages, audio route, mute, camera, share content.
    Screen shot of FaceTime control bar

    I actually really like this new control bar because it gives me the option to mute/unmute from wherever I am and for me, this is much faster than having to switch back to the FaceTime app each and every time. That said, there are times when this control bar gets in the way. For example, sometimes I'll be in an application and I know there's a "back" button, but I can't get to it with VoiceOver because it's obscured by the FaceTime audio control bar. I mentioned my frustration about this to a sighted friend and she told me that visually, it's possible to swipe this control bar away. At first, I thought we might have an accessibility issue of some sort as I could not find a way to do this when using VoiceOver. Eventually, I remembered the two-finger scrub gesture and like magic, away it went.

    For anyone unfamiliar with it, the two-finger scrub gesture is a VoiceOver command that can be used in a few different ways depending on context. If a keyboard is visible, the two-finger scrub gesture will dismiss it. If an application has a "back" button, the two-finger scrub gesture will perform that action. The easiest way to think about the purpose of this gesture is that it can help you get out of something by dismissing a control, navigating back, or closing a pop-up or menu -- in many ways, similar to what might happen when pressing the Escape key in a desktop application. To perform this gesture, place two fingers on the screen and move them quickly in a scrubbing motion such as right, left, right.
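
    For any developers reading along, my understanding is that apps can support this same "get me out of here" behavior themselves; in SwiftUI, for example, I believe it looks something like the sketch below. The banner view here is hypothetical and only meant as an illustration.

    ```swift
    import SwiftUI

    struct DismissibleBanner: View {
        // Controls whether the hypothetical banner is shown.
        @Binding var isPresented: Bool

        var body: some View {
            Text("Some banner that can get in the way")
                .padding()
                // The two-finger scrub ("Z" gesture) maps to the escape action,
                // so VoiceOver users get the same dismissal behavior as a
                // visible close control.
                .accessibilityAction(.escape) {
                    isPresented = false
                }
        }
    }
    ```

    With that one modifier in place, the two-finger scrub does the same thing as tapping a visible close button.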

    Putting it all together

    If you ever have a reason to temporarily dismiss the FaceTime Audio call control bar and need to do so using VoiceOver, here's how to do it.

    1. Touch the FaceTime Audio control bar with one finger; this will set VoiceOver's focus to the correct place. This is important because otherwise, VoiceOver's focus will remain on your home screen or on whatever application screen you have open, and the scrub gesture will not dismiss the control bar.
    2. Perform the two-finger scrub gesture. If successful, the control bar will go away. If not, double-check that you have correctly set VoiceOver focus to the control bar as just described. If the two-finger scrub gesture isn't performed correctly, it is possible that focus may inadvertently move away from the FaceTime Audio control bar.

    A few more things to note. First, I don't know of a way to permanently dismiss the FaceTime Audio control bar and so you will have to repeat these steps whenever you need to dismiss it. Second, if you dismissed the control bar and then want to have it back, you can make it reappear by double tapping the call indicator located on the iOS status bar.

    I really like the new FaceTime Audio control bar and find it super useful to have call controls available regardless of which app I'm in or which screen I'm on. For those times though where it might come in handy to move that bar out of the way, I'm glad there's an accessible way to do so.

  • On this Thanksgiving, a quick note of thanks

    As we celebrate Thanksgiving here in the US today, I wanted to send out a quick note of thanks to all of you: for reading my words, for providing encouragement as I continue my blogging journey, and for engaging in some really amazing conversation along the way. I have a lot to be thankful for this year, but there is one group of folks I want to recognize in particular: those developers who work extra hard to ensure their apps are accessible.

    There are many developers who work tirelessly to make their apps accessible, not because they necessarily have to, but because they simply realize it’s the right thing to do. There are many accessibility resources out there that can help developers make their apps accessible, but finding those resources, understanding them, and figuring out how to implement them can be a real challenge, especially for developers with extremely limited resources.

    I’d like to encourage everyone to think about an app that makes a real difference to them, whether for accessibility or other reasons, and consider writing the developer a positive review of thanks today. I’ve spoken with many developers who have indicated to me that while it may seem like a small thing, positive reviews make a real difference. First, the more stars an app receives, the more likely it will be discovered by others. Second, a kind review is a great way to show appreciation in a public way. And finally, your review might make a difference to someone who appreciates the hard work a developer has put into making their app accessible — I know I’ve felt more comfortable purchasing apps when I see a review like, “works well with VoiceOver” or “very accessible”. Writing a quick review is a great way to say thank you, it’s something that makes a real difference, something that is appreciated, and something that only takes a few minutes to do.

    Again, thank you all for reading my words, supporting me, and for continuing the conversation. To those who celebrate, have a happy Thanksgiving.

  • Quick tip: how to get rid of the iOS bubble sound when typing or using Braille Screen Input

    I've been using Braille Screen Input on iOS for years, as it helps me to type more efficiently. One thing that has bothered me though, whether typing with Braille Screen Input or the on-screen keyboard, is this bubble sound that VoiceOver occasionally makes. While that sound does have a purpose and an important one at that, I find it distracting and have always lamented that I didn't have a way to disable it. Little did I know that there actually is a way to disable it.

    [twitter.com/SteveOfMa...](https://twitter.com/SteveOfMaine/status/1434219256439320579)

    I received many replies on Twitter, some from people experiencing the same frustration as me, and others offering a solution I likely never would have found on my own.

    [twitter.com/walkside3...](https://twitter.com/walkside3/status/1434222462418386946)

    As it turns out, there are actually a lot of sound customizations that can be made in VoiceOver, many of which are off by default and so I never even knew they existed. Not only that, but it's possible to preview each of the VoiceOver sounds which is a great way to learn what they actually mean. I recorded a brief video showcasing these settings in the hopes it might be useful to others.

    Demo of the VoiceOver sounds dialog

    Disabling the VoiceOver auto fill sound has made a world of difference for me. Now I can use Braille Screen Input without being distracted every couple of words. In fact, I've written this very entry solely using Braille Screen Input.

    I would like to thank Rachel, Matthew, and Kara, for getting back to me so quickly with what proved to be the perfect solution. Twitter can be an awesome place for conversation and I'm glad these awesome people are a part of it.

  • An open thank-you to @YNAB for improving accessibility in an incredibly meaningful way.

    For years, I've been a fan of YNAB, You Need a Budget. I love the principles behind their budgeting methodology, I love the app, I love the company, I'm just a really huge fan. YNAB has helped me to pay down debt, feel more confident about where my money is coming from, and going to, and generally feel way more in control of my financial life. Unfortunately, on 07/14, I downloaded an update to YNAB's iOS app, an update that contained significant accessibility issues.

    [twitter.com/SteveOfMa...](https://twitter.com/SteveOfMaine/status/1415307175648452612)
    Tweet from Steve

    I was hurt. I was upset. I was not sure if I had set enough money aside to cover my rent payment -- in short, it was not a very good day.

    When it comes to making products and services accessible, it's really important to understand that accessibility isn't a nice-to-have, or a feature request. This is especially true if you offer a product or service that people might depend on. Sure I could have changed to another app, but I also would likely have had to change my budgeting method to one that would align with whatever new app I had chosen. That would have been especially challenging as I couldn't put my financial life on pause while I figured it all out.

    Fortunately, the fine folks at YNAB were extremely responsive and understanding, indicating that they were already working on fixes and, more importantly, were working to ensure that issues like this wouldn't happen again.

    And so here we are, roughly three weeks later, and I again get a notification that an update to the YNAB app is available. Even better, the "What's new" section of YNAB's App Store entry mentions:

    • Two major accessibility wins:

    ◦ We made many improvements to VoiceOver interactions.

    ◦ We changed our background colors so that the Increase Contrast accessibility setting will now apply to the YNAB app and actually increase the contrast.

    What's New section in YNAB's Apple App Store entry for version 3.01
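
    As an aside for developers, I believe the Increase Contrast change YNAB describes can be detected from SwiftUI's environment; the sketch below is only my guess at the general approach and is certainly not YNAB's actual code. The view and the dollar amount are made up.

    ```swift
    import SwiftUI

    struct BalanceCard: View {
        // Reflects the system "Increase Contrast" accessibility setting.
        @Environment(\.colorSchemeContrast) private var contrast

        var body: some View {
            Text("To be budgeted: $123.45")
                .padding()
                // Swap in a higher-contrast background when the user asks for it.
                .background(contrast == .increased ? Color.black : Color.gray.opacity(0.3))
                .foregroundColor(contrast == .increased ? .white : .primary)
        }
    }
    ```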

    I downloaded the app and was absolutely blown away. YNAB is now more accessible than ever; it's a complete accessibility transformation. Because of their work on accessibility, I can use the YNAB iOS app way more efficiently than ever before.

    [twitter.com/SteveOfMa...](https://twitter.com/SteveOfMaine/status/1423032166942924806)
    Steve's tweet thanking YNAB after being blown away with their 3.01 update

    So, what does all this mean? Certainly this is a win for me personally, but it goes way beyond that. By

    working to improve our approach to accessibility concerns to prevent instances like this in the future

    Mentioned by YNAB in a follow-up tweet

    YNAB has helped ensure that I remain a loyal customer: I'm happy to continue using their product, and I'm happy to continue paying for it, because I feel listened to and I feel valued. And that's a big part of accessibility that often gets left out of the conversation: accessibility is about equal access to products and services, but it's also about listening to, and responding to, customer needs. Isn't that a key component of any great brand? When a company values me, and goes the extra mile to show me that I am valued, it creates loyalty, because like most consumers, I appreciate companies and brands that appreciate me.

    So thank you, YNAB team, for your work on accessibility over these past few weeks. Not only have you transformed your iOS app in an incredible way, but you've also demonstrated, by taking action, that you value me and others who use assistive technologies. I'm proud of the tremendous amount you've accomplished and am excited to see what comes next.

  • Can I use Apple's new MagSafe to attach an iPhone to my fridge? Yes, yes I can.

    I was curious if I could use Apple's MagSafe technology to attach an iPhone to my fridge, and apparently I can. I'm not sure how posting a YouTube video to WordPress works, so I hope this comes through.

    www.youtube.com/watch

  • The surprising accessibility possibilities of mobile check deposits

    Recently, I had a conversation with a blind friend of mine who finds herself in an interesting situation. She has received paper checks; however, because everything is locked down, depositing them has become a real issue. That got me wondering how accessible mobile check deposits might be; it seems that just about every bank offers this option, but is it an accessible one? Thinking it over, a few possible challenges immediately came to mind:

    1. Knowing exactly where to endorse the back of the check and writing “for mobile deposit” or similar which many banks now require.
    2. Aligning the camera so that the front and back images of the check are properly captured.
    3. Knowing one way or the other that the deposit has been accepted.

    While I certainly can’t test every banking app out there, I did try a test with Wells Fargo’s app and was extremely impressed.  Wells Fargo has somehow implemented camera guidance, so that VoiceOver helps the user position the camera correctly for the check image to be captured.  Even better, when everything is aligned, the photo is automatically taken and, before final submission, the user gets notified if the photos need to be re-taken because of quality or other factors.  

    So, how does it work? First, the app asked me to capture the front of the check. I discovered that I needed to hold my phone in landscape orientation (left to right), which is something I hadn't expected. Since a check is small, I had assumed, wrongly it would seem, that the phone could be held in portrait orientation. As I lifted my camera away from the front of the check, VoiceOver started providing me with guidance information, "move closer," "move right," "move down," and finally, the picture was taken. The process then repeated itself to capture the image of the back of the check. Unfortunately, the part that remained inaccessible for me was properly endorsing the back of the check and writing "For mobile deposit only," which the bank requires. Maybe this could have been accomplished with the help of a service like Be My Eyes or Aira?


    I was surprised that the process of mobile check deposits, at least with Wells Fargo, was not as inaccessible as I feared.  Unfortunately, I tried with a few other banking apps and met with very different results.  I also did not test with Android.  In summary though, the process of mobile check deposits can be made mostly accessible as demonstrated by Wells Fargo’s app.  If you try this with your bank and meet with different results, it might be worth sending them a support message and encouraging them to further investigate the possibilities of making their process more accessible.  While the technical details surpass my development abilities, my understanding is that Apple makes various APIs available to developers who want to incorporate camera guidance in their applications.  
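
    For what it's worth, I believe one of the relevant building blocks is VisionKit's built-in document camera, which handles edge detection and automatic capture. The sketch below is only my illustration of the general idea, not how Wells Fargo's app is actually built.

    ```swift
    import UIKit
    import VisionKit

    final class CheckCaptureViewController: UIViewController, VNDocumentCameraViewControllerDelegate {

        // Present Apple's document camera, which guides the user while it
        // detects and automatically captures the page (or, here, a check).
        func startCapture() {
            guard VNDocumentCameraViewController.isSupported else { return }
            let camera = VNDocumentCameraViewController()
            camera.delegate = self
            present(camera, animated: true)
        }

        // Called once the user finishes; each captured page comes back as a UIImage.
        func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                          didFinishWith scan: VNDocumentCameraScan) {
            for pageIndex in 0..<scan.pageCount {
                let image = scan.imageOfPage(at: pageIndex)
                // A real deposit flow would upload or further process the image here.
                _ = image
            }
            controller.dismiss(animated: true)
        }

        func documentCameraViewControllerDidCancel(_ controller: VNDocumentCameraViewController) {
            controller.dismiss(animated: true)
        }
    }
    ```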

    Has anyone else tried mobile check deposit recently?  If so, what have your experiences been?


  • How Microsoft's accessible OneNote helps me to manage a medical crisis

    On April 25, our daughter, Gabrielle, was rushed to the hospital by ambulance after having a breathing episode. Gabrielle (Gabby) has a condition which unfortunately causes her to have many such episodes; however, this time was different, as she had a seizure and lost consciousness, twice. To say that the weeks since have been a nightmare would be a huge understatement.


    In addition to all the emotional stuff, the sheer volume of incoming information soon became overwhelming. Multiple doctors conducting multiple tests, prescribing multiple medications, making multiple changes to her diet, proposing multiple theories as to what might be going on with her. Our focus needed to be on Gabby and on the situation, and yet we also needed to do our best to stay on top of the ever-growing pile of information if we were to have any hope of making informed decisions about her care. How to manage it all?


    Information was coming in all sorts of formats. "Call me with any questions," said many of the doctors while handing me their printed cards. "Here's a bunch of articles I've printed out for you to read," said others. My own frantic research attempts were turning up links and information at a staggering rate. And of course there were the actual meetings with her medical team that required me to write stuff down very quickly and without much time to prepare. I have a plethora of scanning and note-taking apps, but I really needed everything centralized in one place. Not only that, but I needed to make sure my wife and I could share information back and forth without giving any thought to the actual logistics of making that happen.


    I've been a huge OneNote fan ever since learning of the Microsoft Accessibility team's efforts to make it accessible. I use OneNote primarily for work, but also use it to keep track of various things going on in my personal life. Still, I've always had the luxury of knowing that if OneNote failed me, I could use a backup option, and while it might be less convenient, I could certainly make do. Within hours, I no longer felt like I had that luxury: I needed a system that would work for more than just me. I needed a system that would be dependable. I needed a system that would allow me to use my phone, my computer, or anything my wife, Jenn, might have access to from the hospital. OneNote met all those requirements, but OneNote's accessibility is relatively new; should I really trust it for something like this?


    Dealing with all the print.


    Microsoft makes a product called Office Lens which allows a photo to be taken of a printed page. The text in that photo can then be recognized using optical character recognition and the results read aloud. One of the really awesome things about Office Lens, at least on iOS, is that I get spoken feedback when positioning the camera. I can also send the original image, along with the recognized text version, to OneNote. Whenever given something in print, whether a sheet of paper or a business card, I tried to immediately capture it using Office Lens. Being wired on caffeine and adrenaline, I'm amazed I was able to hold my phone's camera steady enough to capture anything, but Office Lens talked me through the positioning and, for the most part, it worked great. Certainly I didn't get 100% accuracy, but I got names and numbers and enough text to get the gist. Microsoft also makes a version of Office Lens for Windows 10, which I was very excited about until I realized it wouldn't let me use my flatbed scanner; apparently, like the mobile versions, it's really designed to use a camera. I found a work-around by scanning pages using an alternative app and importing the images into Office Lens, but maybe someone out there knows of a better way? During this past CSUN, Microsoft demonstrated the ability to scan a document using their Surface Pro; I may need to add this thing to my Christmas list if it really works.
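
    I obviously have no idea how Office Lens works internally, but to show roughly what the recognition step involves on iOS, here's a minimal sketch using Apple's Vision framework to pull lines of text out of an image. Treat it purely as an illustration, not as Office Lens's implementation.

    ```swift
    import UIKit
    import Vision

    // Recognize printed text in an image and hand back the lines found.
    // This is a generic Vision example, not Office Lens's actual code.
    func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
        guard let cgImage = image.cgImage else {
            completion([])
            return
        }

        let request = VNRecognizeTextRequest { request, _ in
            let observations = request.results as? [VNRecognizedTextObservation] ?? []
            // Take the top candidate string for each detected line of text.
            let lines = observations.compactMap { $0.topCandidates(1).first?.string }
            completion(lines)
        }
        request.recognitionLevel = .accurate

        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        DispatchQueue.global(qos: .userInitiated).async {
            try? handler.perform([request])
        }
    }
    ```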


    Quickly writing stuff down.


    I don't know how many times I've heard the saying "there's never a pen around when you need one," but it's true. No matter how prepared I think I am to write something down, it almost never fails that someone has information for me when I'm in the most inconvenient place to receive it. One great aspect of OneNote is that there are numerous ways to quickly take down information. On iOS, there's a OneNote widget that allows me quick access from any screen: I can pull down the notification center, swipe right, and my OneNote widget is the first widget on my list. I simply select the type of note I wish to write -- text, photo, or list -- and get a blank page for writing. I have the option of titling my page, although if I'm in a hurry, I've found it easier to just write whatever it is down and title the page later. If I'm not in a position to type, or if there's simply too much information, OneNote gives me the option to attach a voice recording to a note.


    If I'm at my computer, I have a really great option for taking a quick note: the OneNote desktop app, which is bundled as part of Office, has a feature called Quick Note. From anywhere, I simply press Windows+N and I'm placed in the title of a new blank page. I can write a title or title it later; most important, I'm at a place where I can just start writing. When I close the window, my note is saved and I'm returned to wherever I was when I hit Windows+N. This makes it possible for me to take down a note literally at a moment's notice; I don't even have to cycle through open windows, which is great since I generally have a ton of those open at any given time. My only gripe is that OneNote stores these quick notes in their own notebook and I have to move them to the correct place later. I'm hopeful there's a setting somewhere which will allow me to configure this behavior, but if not, I consider it a very small price to pay for an ultra-convenient way to take a quick note.


    Managing Gabby's home care.


    While Gabby still has a long medical journey ahead, she is stable and is able to be home with medication, monitors and other supports in place. Coordinating which medications she needs to take and when, in addition to tracking other aspects of her condition, is again something we're managing to accomplish with OneNote. First, we created a to-do list of all her medications to use as a sort of template. We then copied this list, renaming each copy with a corresponding date. In this way, we can keep track, day-to-day, of which medications have been taken and which remain; no back-and-forth between Jenn and me around whether Gabby has taken a specific medication or not. There are a few drawbacks to this system, most notably that if any of her medications change, we'll need to delete and re-create all the future pages in her journal section. There are certainly other to-do apps that could help us more easily manage recurring to-dos like this, but by using OneNote, we're able to keep all her information centralized and synchronized. In addition, using OneNote makes it easy for us to track events such as breathing episodes and other real-time observations, which we could not properly capture in a to-do app. As we continue to work toward figuring out the best next step for Gabby, we have a central place to compile research. Also, as medical bills and insurance claim determinations start arriving by mail (amazing how fast that happens), we have a way to organize those as well.


    Problems and challenges.


    I don't regret my decision to use OneNote to help me manage these past few weeks, not even a little. That said, I have encountered some challenges and feel they're worth mentioning. To be fair, I see that OneNote for iOS actually has an update today, so some of these may no longer exist.


    On the iOS app, when using a Bluetooth keyboard, editing text doesn't seem to work as expected. Specifically, when I arrow around, sometimes I find myself on a different line, sometimes on a different word, and commands to move word by word don't seem to work as I think they should. My stopgap solution has been to simply not edit on iOS; I hit the asterisk '*' key a few times to mark that there's a problem, hit enter and just keep on typing. While editing would be great on iOS, and maybe it's just me who's doing something wrong, my primary interest is in capturing the information, knowing that I can always clean it up and edit it later on my PC. When using Braille Screen Input, my preferred method of typing on iOS, I sometimes need to double tap the text area even though the keyboard is visible. I'm not sure why this is the case, but it's an easy fix to a strange problem.


    On the PC side, working with the Windows 10 OneNote application is far easier than working with the OneNote desktop application provided as part of Office. That said, the Quick Note functionality is only available in the Office version, not the Windows 10 app version. For the most part this doesn't cause any problems; it's just a little confusing in that, if you want to use Quick Notes, you have to make sure the Office version of OneNote is installed even if, like me, you don't use it for anything else.

    My other frustration with the Quick Notes functionality of the Office app, as mentioned above, is that I can't seem to change where it wants to actually put my quick notes. I want them in the cloud, within a specific notebook, and Office wants them on my local machine, in a different notebook. Fortunately, it's very easy to move notes from one place to another; it's just one more thing I need to remember to do, and if I forget, those notes won't be synchronized to my phone and to Jenn.

    Currently, in the Windows 10 OneNote app, I cannot figure out how to check items off the to-do lists.  I can read the lists just fine, but can’t tell what’s checked and what isn’t.  My solution for this is to simply use iOS for now when checking off Gabby’s medication.


    Office Lens has got to be one of the coolest apps ever, especially on iOS where it provides fantastic guidance for positioning the camera. On Windows, Office Lens seems very accessible, although I haven't figured out how to make it work with my flatbed scanner. I don't know if there's a way to fix this, or if I need to find another way to import scanned images into the Windows 10 OneNote app such that text within the image is recognized.


    Summary


    Throughout my life I've done many things to prepare for all sorts of emergencies, starting as far back as fire drills in elementary school, but I've never given a great deal of thought to what, for now, I'll call informational preparedness. The following are a few questions you may wish to consider, as having the answers now, when they're not needed, is much better than not having them later, when they might be.

    • If I were in a situation where I needed to write something down, right now, how would I do it?
    • Am I dependent on one device?  Put another way, if I drop my phone or laptop and it smashes, what does that mean for the information that's important to me?
    • Do I have the contact numbers for family, friends, doctors, transportation services, and any others I might need, and can I access them quickly? Do I have these on more than one device, and do I know how to access them wherever they are?
    • Do I have a way to share information with someone else in a way that makes sense to me and them? Who might that someone else be and have we discussed this specifically?
    • How do I handle materials that are in a format inaccessible to me in an urgent situation? It might be fine for my neighbor to help me read my mail, but they may not be available to me all day, every day.
    • Does my doctor/pharmacy/healthcare provider have a way to send me information in a more accessible format? Many places are using online systems similar to MyChart, but getting that set up when it's actually needed is not fun -- it's really not.

    I'm sure there are many other questions that should be asked, but the above list should be a good starting point. Let's definitely keep the conversation going; if there are others, put them in the comments and I can add them to the list.


    Finally, I want to thank the OneNote team and countless others who have been working to make technology accessible.  Technology is truly an equalizer in ways that, even as a member of the accessibility field, continue to amaze me and I couldn’t be more appreciative.


  • It may be just an app, but sometimes, it's why my life sucks.

    Update

    On July 26, I received yet another support email saying, in part:

    Dear Steve, Thank you for contacting Weight Watchers. My name is [Name redacted] and I will be more than happy to assist you with troubleshooting your application. I do apologize for this inconvenience. Your email has been escalated to me. In order for us to be sure we offer you the best support for Weight Watchers Mobile, please answer the following questions for us:

    • Are you using a mobile device or a computer?
    • What is your device model and Operating System?
    • If you are using an iPhone, iPad or iPod, please confirm whether you are using the Weight Watchers Mobile app for iPhone App or accessing our mobile site [a.weightwatchers.com](http://a.weightwatchers.com/)?
    • If you are using a computer, what internet browser are you using?
    • If you have not already done so in your initial Email to us, please let us know what error you are receiving.
    • If your issue is technical in nature and you have not already done so in your initial Email to us, please describe as best you can what is occurring and what steps you took prior to running into the problem. Also please provide any error messages you may have received.

    As soon as we receive your response we will investigate on your behalf.

    OK, clearly, they're still confused. That said, this issue is obviously on someone's radar, as their most recent app update has fixed the reading of SmartPoints values on foods. The daily and weekly totals still don't read correctly, but at least now I am no longer disillusioned by chocolate cake having a 0 point value. :)


    While the title of this post may seem a bit dramatic, I assure you it isn't, at least not to me. In a nutshell, the situation is this: I pay for an app or service, use the app or service, and then, with one update, it suddenly becomes impossible to use the app or service any longer. This may not seem like that big a deal to those who are able to see, but for those of us who depend on VoiceOver or other assistive technologies, it's a situation that is very real.


    As many of my social media followers know, I've been a member of Weight Watchers for quite a few months. After all, I can definitely stand to lose a few pounds, and I've seen many people benefit greatly from the program. I was also very encouraged to learn that Weight Watchers has a page dedicated to accessibility which says, in part:

    In our ongoing commitment to help as many people as possible to lose weight, including those with disabilities, Weight Watchers is dedicated to improving accessibility for people with visual impairments in the following ways.

    The page then goes on to describe how to use the Weight Watchers online service with the JAWS screen reader, with VoiceOver and Safari, how to request information in alternative formats, how to optimize the Tracker for accessibility and much more. I felt their commitment to accessibility to be genuine, and in all fairness, their web site and iOS app worked extremely well -- that is, until the latest version.


    For those unfamiliar with Weight Watchers, the program is essentially a points-based system whereby individuals are allocated a number of points to be used throughout the day, and foods are also given a point value, healthier foods receiving lower values than less healthy foods. A person can eat whatever they wish, the goal being to stay within their allocated number of points. In short, it's totally fine to have a big slab of chocolate cake, but because that slab of cake has a high point value, a smarter decision might be to opt for different, healthier foods. Using their iOS app, it's possible to look up a food's point value and to track it against the daily total. Not only is this an efficient system, but the app can be instrumental in making healthy food choices by allowing the user to look up point values before deciding what to eat.


    Like many of their customers, I update the Weight Watchers app regularly. I certainly didn't anticipate any problems when installing the latest version, described as:

    What's New in Version 4.9.1: Fixed an issue with the barcode scanner.

    We’re always working to improve the app and maximize your experience — thanks for sharing your thoughts so we can make it even better. More exciting improvements to come!

    Imagine my surprise when, after installing this harmless-looking update, all the point values suddenly started reading as '0'.


    After getting over my initial euphoria over chocolate cake suddenly having a ‘0’ point value, I realized that the problem was in fact an accessibility one.  For whatever reason, VoiceOver is no longer able to read point values accurately.  What this means is that in search results, when adding foods, when reviewing meals and anywhere else a point value might present itself, it is simply read as ‘0’.  Given the critical part the point values play in the program, this is a real problem.  How can I utilize a system based on points when I can’t read the actual points?
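
    I can only guess at what broke under the hood, but to illustrate the general class of bug, here's a hypothetical SwiftUI sketch (emphatically not Weight Watchers' actual code) of a points badge whose spoken value can drift away from what's shown on screen when the accessibility value isn't set from the real data.

    ```swift
    import SwiftUI

    // Hypothetical points badge, purely for illustration.
    struct PointsBadge: View {
        let points: Int

        var body: some View {
            Text("\(points)")
                .padding(12)
                .background(Circle().fill(Color.blue.opacity(0.2)))
                // Collapse the badge into a single element and expose the real
                // number. If a custom control skips this step, or hard-codes a
                // placeholder, VoiceOver can end up announcing "0" while the
                // screen shows the correct value.
                .accessibilityElement(children: .ignore)
                .accessibilityLabel("SmartPoints")
                .accessibilityValue("\(points)")
        }
    }
    ```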


    So, what to do? My first step was to utilize the live chat functionality which is built directly into the Weight Watchers app. This chat system is pleasantly accessible, and since it's available around the clock, I thought it would be a quick way to describe the issue and see if it had already been reported. After explaining the situation to the chat representative, my chat was "transferred"; I never knew a chat could be transferred. Anyway, I get a new representative to whom I again explain the situation, only to have my chat disconnected. By this point my hands hurt from all the typing, in addition to my already-mounting frustration, so I figure the next best thing to do is to contact them via the web site. I do this, being sure to mention that I'm blind and that this is an accessibility issue, followed by a descriptive explanation of the problem. Over a day later, I receive this response:

    Dear Steve, Thank you for contacting Weight Watchers. My name is [name redacted] and I'm sorry about the challenges that you have encountered in accessing your account through the WW Mobile App. Rest assured, that I will help you with your concern. I appreciate your subscription with our Online Plus  plan.

    We want to take this opportunity to thank you for trying our site and for making us a part of your weight loss journey. Please try the following troubleshooting steps:

    1. Please log out from the App and log back in.
    2. If that does not work, force close the App if you have an Android device. Then relaunch the App. For iOS, close the App by double-clicking on the home button, swipe up on app snapshot, and click home button. Then relaunch the App.
    3. If steps 1 and 2 do not work, delete the App and reinstall. Please note that recently scanned items are stored locally on the device and will be lost when you uninstall. If you would like to keep a recently scanned item, please save it as a favorite. The Mobile App requires iOS 8.0 or later. It is compatible with iPhone, iPad, and iPod touch. For Android users, it requires Android 4.0.3 and up. While it might also work on an Android tablet, it is not yet fully supported and may not be compatible. Let us know how things go! If the troubleshooting steps do not help, please reply here with details about what you are experiencing. We’ll investigate further and reach out should we need to gather additional details.

    Clearly the rep misunderstands what's meant here by "accessibility," despite my having mentioned being blind, using VoiceOver, and referencing their own accessibility page in my request. No matter; I decide to be a trooper and try all the steps, which, as expected, don't accomplish anything at all. I've sent an even more descriptive reply and, as of this writing, have heard absolutely nothing.

    So why the dramatic post title? It'd be one thing if this were a situation pertaining to one specific company or app, but this is a situation that occurs again and again. Right now on my phone, I have an entire folder of apps that fall into this category: apps that I either want to use or that I've come to depend on, which have become partially or completely useless to me. Some of these apps are health-related, some are social, and more disturbingly, some are productivity apps that help me maintain employment. The company may change, the app or web site may change, but what it all amounts to is that I spend a lot of time feeling frustrated and navigating the realm of tech support when, like everyone else, I just want to live my life. It's especially sad in this case, given Weight Watchers'

    "ongoing commitment to help as many people as possible to lose weight, including those with disabilities."


  • iFidget, an app to help blind people stop rocking, good idea or bad?

    My mouth dropped open in disbelief when a friend, Grace, told me about an app designed to help the blind stop rocking back and forth, something that many blind people do. There are lots of reasons for the rocking that I won't go into here, but suffice it to say it's one of those habits that parents, educators and other adults try to curb in children in an effort to help them be more "socially acceptable." Well, move over, parents, educators and other adults, because as Apple would say, "there's an app for that."

    Brought to us by the New Mexico Commission for the Blind:

    iFidget is an app designed to help people with a range of habits from rocking back and forth to restless leg syndrome or even just constant fidgeting. It has an incredibly simple design, but it has a very big future.

    iFidget is designed to be used while you’re sitting. It can be set to vibrate or play a sound when it detects that you aren’t sitting still. iFidget attempts to tell the difference between somebody who is rocking, fidgeting or moving constantly vs somebody who is just shifting their weight at a table.

    The description goes on from there describing how the app can be a “therapeutic tool” that can help people who subconsciously engage in this behavior and wish to stop. So how does it work? Basically, the app runs on an iOS device and when motion is detected, it vibrates to provide the user with a subtle reminder, presumably to be still. The app can also play a sound effect if vibration isn’t an option or isn’t desired. In addition, the app’s sensitivity can be adjusted to ensure that a greater or lesser amount of motion is needed to trigger the alert. But wait, that’s not all. iFidget also gives the user – or someone working with the user – the ability to see a graph showing just how much rockin' is happenin'.
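
    Out of curiosity about how an app like this might be built, here's a rough sketch of the general technique using Core Motion: watch the device's acceleration and fire a gentle haptic when movement crosses a threshold. This is only my guess at the approach; I have no idea how iFidget is actually implemented, and the threshold value here is made up.

    ```swift
    import CoreMotion
    import UIKit

    final class FidgetMonitor {
        private let motionManager = CMMotionManager()
        private let haptics = UIImpactFeedbackGenerator(style: .light)

        // Motion (in g) above this is treated as rocking or fidgeting rather
        // than ordinary small shifts; a real app would make this adjustable.
        var sensitivityThreshold: Double = 0.15

        func start() {
            guard motionManager.isDeviceMotionAvailable else { return }
            motionManager.deviceMotionUpdateInterval = 0.1

            motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
                guard let self = self, let acceleration = motion?.userAcceleration else { return }
                // Magnitude of user-generated acceleration, gravity already removed.
                let magnitude = sqrt(acceleration.x * acceleration.x +
                                     acceleration.y * acceleration.y +
                                     acceleration.z * acceleration.z)
                if magnitude > self.sensitivityThreshold {
                    self.haptics.impactOccurred()   // the subtle "be still" nudge
                }
            }
        }

        func stop() {
            motionManager.stopDeviceMotionUpdates()
        }
    }
    ```

    That simple threshold check also hints at why I got "massaged" just walking upstairs: ordinary movement easily crosses the same bar as rocking.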

    As a long-time, hard-core rocker myself, I had mixed feelings when I heard about iFidget, the first one being absolute horror that kids could potentially be forced to use this app in school settings "for their own good." Would a child see this as a gentle reminder or as a means of negative reinforcement? And what about the potential humiliation of needing to share the graph with an educator or therapist of some kind? Second, the app just doesn't seem very practical to me. I've been using it throughout the day and initially found that the app alerted me to any motion, including when I'd engage in such socially unacceptable tasks as reaching for my coffee cup. Adjusting the sensitivity helped with this; however, the app would still alert me to major motion such as standing up to walk into another room. In fact, I got quite the massage walking from my basement office to my upstairs kitchen. The app also doesn't run in the background and can't be configured to run when the iOS device starts up. Oh yes, and if the device's screen locks, the app stops working as well. One other discovery I made is that if I put the device in my pants pocket, I could rock with my upper body all I wanted – how long before kids figure that one out?

    I posted a link to the app on Twitter and the response was swift and immediate.

    twitter.com/arwen3791…

    twitter.com/allisonfm…

    twitter.com/amy0223/s…

    twitter.com/ladymem/s…

    twitter.com/lisasali/…

    twitter.com/lulu_bear…

    The tweets go on and on and on … the above is just a small sampling … clearly this is an emotionally charged issue. While I'm certainly not opposed to apps that help people self-improve, I remain concerned about the potential long-term effects this could have on blind kids if they are forced to use this app. Oh, and one more thing: while the description may claim that this app "has a very big future," the app itself hasn't been updated since November 20, 2014. So, positive or negative, what do you think?

  • Demo of Direct Touch Typing on iOS8

    In this audio demo, I discuss the new Direct Touch Typing input method introduced in iOS8 and show how it works with VoiceOver.

  • My demo of the Zoom IQ5 stereo mic for Lightning devices

    In this episode, I demonstrate the Zoom IQ5 stereo mic for Lightning devices. WARNING, as this is a live demo, there are severe fluctuations in audio volume.

  • Just held my first Nano SIM and wow is it tiny

    I just received an iPhone 5, and of course, before powering it up, I had to take it apart, at least a little. One thing that has made me curious about the iPad Minis and the new iPhone 5 is Apple's use of a new Nano SIM chip. For those unfamiliar, the SIM chip is what contains all your personal cell carrier data, i.e., which carrier your phone should work on, your phone number and other settings. One major advantage to using SIM chips is that you, the user, can easily remove the chip from one device and use it in another, especially handy if you often switch devices or have multiple devices. You might also have more than one SIM chip, in theory allowing you to use one device with multiple carriers, great for international travelers. Anyway, the Nano SIM represents the fourth generation of the SIM chip, the original first generation being roughly the size of a credit card. Each generation has gotten progressively smaller than the last while retaining the same core form factor. So, this being the fourth generation, you can naturally understand why I was curious about its size ... what's that? You can't? Huh.

    I’m not super good at judging sizes, but I’d guess that the Nano SIM is roughly 12.3 mm by 8.8 mm by 0.67 mm. In the interest of fairness, I should mention that these measurements are available from multiple sources, so I’m probably better at using Google than I am at guessing sizes. Anyway, if you’ve ever held the SIM chip contained in the GSM version of the iPhone 4, imagine something even smaller and thinner. If you were to drop both this and a needle in a haystack, I suspect the needle would be the easier of the two to find.

    As one of those who actually does remember the first generation credit-card-sized SIM, it's kind of amazing to me to see just how much smaller this generation has become. Another neat thing – I think it's neat anyway – is that by using a cutter or a razor blade, you can actually cut a previous generation SIM down to Nano SIM size and, assuming you don't cut the gold contacts or your finger off, it'll actually work. This could come in incredibly handy if your particular carrier doesn't offer Nano SIM chips, or if you just like playing with razor blades.

    What's that? You want to learn more about SIM chips? As mentioned above, there are lots of great, dare I say interesting, resources on the net, such as this one. Isn't technology great?
