Seeking a new VPN provider, does anyone have suggestions?

I’m looking for a new VPN provider and am curious if any of my readers use one that they would recommend. I’ve been extremely happy with my current VPN provider, ExpressVPN, but $116.95 billed every 12 months is frankly more than I want to pay for a VPN service that I only use occasionally. Ideally, the perfect VPN for me would:

  • Be accessible on Mac, Windows, iOS, and Android.
  • Allow third-party clients to be used if needed, such as if the provider’s main client becomes inaccessible after an update or something.
  • Alert me if the connection has been dropped. After all, a VPN that would allow traffic to proceed normally if the VPN connection drops is not very useful. My work VPN has no trouble managing this, but I haven’t encountered a consumer VPN product yet that does this right — maybe I’ve just been using the wrong ones?

Reviews of VPN services often mention a number of features that don’t particularly interest me, although I suppose they would be bonuses:

  • Ability to direct traffic through a specific country.
  • Ability to circumvent geolocation restrictions.
  • Additional products unrelated to the VPN itself such as password managers and the like.

If you use a VPN service that you like, I would definitely love to learn more. And if you can get some sort of referral credit by sharing an affiliate link, feel free to pass that along as well.

Playing around with Aiko, an amazing, accessible transcription app for Mac and iOS

I recently heard about a fantastic app called Aiko, available for both macOS and iOS, which leverages AI technology to transcribe audio. A few things set Aiko apart from similar solutions:

  • It’s free, totally free.
  • Audio can be dictated directly into the app, or a pre-recorded file can be imported. I’m particularly excited about this second piece.
  • Everything happens on the end-user’s device; nothing is sent to the cloud.
  • Multiple languages are supported. We’re talking a lot of languages: 100, according to Aiko’s home page.

I was excited to test out this fascinating technology, and to really put it through its paces in a sub-optimal recording environment, I decided to record some audio using my Apple Watch while standing outside with lots of traffic and other background noise. What follows is the unedited output of my little experiment. I’m also including the actual recorded audio, so that you can get a sense of the crummy audio I gave Aiko to work with.

Hello, and thanks for joining me today.

I’m playing with an app called AIKO.

It’s an app that leverages Whisper, which is a technology made by OpenAI, the folks that brought us ChatGPT.

Now unless you’ve been living under a rock for the past couple of months, I’m sure you’ve heard quite a lot about ChatGPT and the fascinating possibilities it opens up to us.

Anyway, Whisper, and on top of that this AIKO app, allow transcription of audio.

The interesting thing about it is that you can record directly in the AIKO app, or you can import audio, say from a file that was pre-recorded.

For example, you might have a pre-recorded audio file of a lecture or a class.

You would be able to import it into this AIKO app, transcription would happen, and then you would have the output as text.

For my test today, I’m standing outside in front of my house recording on my Apple Watch with traffic going by.

And the reason I’m doing this is because I wanted to come up with a very sub-optimal recording environment, just to better understand how the technology would deal with audio recorded in such an environment.

I’m also trying to speak as naturally as I can without saying words like um and uh, things that I think often get said when speaking.

The interesting thing about AIKO and the way that it transcribes audio is that it supposedly is able to insert punctuation correctly.

I’m not sure if it does anything about paragraphs or not, but as the speaker, I don’t have any way of controlling format.

Once you run a file or recording through AIKO, the output is rendered as text.

However, there are a few things you can do with it.

First, you can of course copy the text into some other application.

The other thing that you can do is have the text be timestamped.

The reason that this can be handy is that you can use that then to create files that can be used as closed captioning for videos.

Anyway, it is kind of loud out here, and so I will go back inside.

I also didn’t want to make this too long because I’m not sure if it’ll work at all or how accurate it’ll be, but my plan is to post this to the blog without editing it.

Stop, stop, stop.

Aiko-generated transcription from my Apple Watch recording.

One final note: the dictation ends with the words “Stop, stop, stop.” I didn’t actually speak those words, but because I have VoiceOver activated on my Apple Watch, they were picked up in the recording as I located and activated the stop button. This is definitely incredible technology, and the price certainly can’t be beat. From an accessibility perspective, I found Aiko to be extremely accessible with VoiceOver on both Mac and iOS, and since it is a native app using native controls, I feel confident that it will work with other assistive technologies as well. You can find more information about Aiko, including FAQs, links to App Store pages, and more here.
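The transcript above mentions that timestamped output can be turned into closed-caption files for videos. As a rough illustration of that idea (this is not Aiko’s actual code), here’s a minimal Python sketch that converts Whisper-style segments — dictionaries with start and end times in seconds plus text, the shape produced by the open-source whisper package — into the SRT caption format. The sample segments below are hypothetical.

```python
def srt_timestamp(seconds: float) -> str:
    """Format a time in seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def segments_to_srt(segments) -> str:
    """Render Whisper-style segments (dicts with start, end, text) as SRT blocks."""
    blocks = []
    for i, seg in enumerate(segments, start=1):
        blocks.append(
            f"{i}\n"
            f"{srt_timestamp(seg['start'])} --> {srt_timestamp(seg['end'])}\n"
            f"{seg['text'].strip()}\n"
        )
    return "\n".join(blocks)

# Hypothetical segments, shaped like the whisper package's transcribe() output.
segments = [
    {"start": 0.0, "end": 2.4, "text": "Hello, and thanks for joining me today."},
    {"start": 2.4, "end": 5.1, "text": "I'm playing with an app called Aiko."},
]
print(segments_to_srt(segments))
```

Each SRT block is just an index, a timestamp range, and the caption text, separated by blank lines, which is why timestamped transcription output maps onto captions so naturally.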

When Success Means Buying a Smaller Suit

Recently, I got to participate in the Parallel podcast, talking about, of all things, accessibility and fitness. The reason I phrase it this way is that anyone who knows me probably knows that fitness and I don’t normally go together in the same sentence, let alone the same podcast. From the show description:

Starting or maintaining a fitness program is a challenge for anyone. If you have accessibility needs, you might experience barriers related to touchscreen devices, coaching that doesn’t address a hearing or visual disability, or a need for accommodations related to physical limitations. With its Fitness+ service, Apple has taken on some of these issues, and opened up the program to many more people with disabilities. We’ll talk with a Fitness+ user, and someone who has worked on Apple accessibility teams.

https://www.relay.fm/parallel/80

Talking about anything fitness related has always been challenging for me, and so I want to particularly thank the ever-awesome Shelly Brisbin for being brave enough to include me. I also want to especially thank Sommer Panage and the other unsung heroes who dare to dream of a more accessible world, and work so hard to make that a reality.

Parallel can be found everywhere great podcasts can be found. More info about the episode, and how to subscribe to Parallel (which you should totally consider doing whether you listen to this episode or not), can be found on Parallel’s home page.

100 Days of SwiftUI, my foray into understanding a bit more about how iOS works

Ever since I was able to accessibly use an iOS device, an iPhone 3GS, I’ve imagined how awesome it would be to develop my own applications. That excitement was very short-lived though, as I soon became aware of just how complicated developing an application really is. It’s a very involved process — or so it seemed to me — and for someone who hasn’t written any code since C++ was the talk of the town, it seemed like an impossibility. I wrongly assumed this was especially true for iOS because apps are often very visual and interactive, and I just couldn’t imagine how I’d tackle that without vision. And so I quickly decided that iOS app development was just not for me.

Fast-forward quite a few years, and Apple releases Swift, a modern programming language, and SwiftUI, a framework for building user interfaces. At the risk of oversimplifying things quite a bit, together they are intended to make application development easy enough for just about anyone to learn and do. Being a natural skeptic, I doubted that it could be quite as easy as Apple seemed to suggest, but the idea behind it seemed really interesting to me; indeed, Swift and SwiftUI have taken the iOS development community by storm, with entire applications being developed using them. With only so many hours in the day though, my challenge was going to be finding the time to devote to learning. And so again, I set the idea aside, figuring I might look into it whenever I had more time.

I’m not proud of this, but I have a long list of things I want to do when I have more time. The thing is, the longer I wait to do any of the stuff on that list, the less time I’ll actually have to do any of it.

I initially learned about 100 Days of SwiftUI from Darcy and Holly of the Maccessibility Roundtable podcast. The idea behind this course is simple: learn SwiftUI gradually — you guessed it — over 100 days. The course suggests devoting an hour per day to learning and practicing the material. An hour per day doesn’t seem that bad to me; I probably spend at least an hour per day thinking about all the stuff I’d love to do, if only I had an hour per day. 🙂 While the contents of the course look a little scary to someone like me who is just beginning, I love that there are days set aside for review and practice. In addition, there is an emphasis on not trying to go it alone: students are encouraged to share progress and help one another. Sharing progress is actually one of the two rules of the course, as it can help with accountability and can also help the student make connections with others who are also learning.

So, what do I hope to ultimately accomplish? Sure, I’d absolutely love to get to the point where I can start developing or working on apps that are useful to someone, but that’s not actually my goal. I want to understand more about iOS apps because so often, when I report an accessibility issue, I feel like I really don’t have a way to describe what’s not working for me other than to say that something just isn’t working. I’m hoping that by learning the basics of SwiftUI, I might be in a slightly better position to provide more constructive feedback. Whether I’m able to develop my own apps, or help other developers improve theirs, I figure it’s a win either way and so I’m excited to get to learning. For anyone else who might also be interested, let’s definitely connect and learn together.

Tip: Does the FaceTime control bar sometimes get in your way? There’s an accessible way to dismiss it.

One of the new features introduced in iOS 15 is a call control bar, which provides FaceTime audio controls across the top of the iOS screen during a FaceTime audio call.

Screen shot of Steve's very messy iOS home screen with the FaceTime control bar across the top. Visible controls, from left to right, are Leave Call, Open Messages, Audio Route, Mute, Camera, Share Content.
Screen shot of FaceTime control bar

I actually really like this new control bar because it gives me the option to mute/unmute from wherever I am and for me, this is much faster than having to switch back to the FaceTime app each and every time. That said, there are times when this control bar gets in the way. For example, sometimes I’ll be in an application and I know there’s a “back” button, but I can’t get to it with VoiceOver because it’s obscured by the FaceTime audio control bar. I mentioned my frustration about this to a sighted friend and she told me that visually, it’s possible to swipe this control bar away. At first, I thought we might have an accessibility issue of some sort as I could not find a way to do this when using VoiceOver. Eventually, I remembered the two-finger scrub gesture and like magic, away it went.

For anyone unfamiliar with it, the two-finger scrub gesture is a VoiceOver command that can be used in a few different ways depending on context. If a keyboard is visible, the two-finger scrub gesture will dismiss it. If an application has a “back” button, the two-finger scrub gesture will perform that action. The easiest way to think about the purpose of this gesture is that it can help you get out of something by dismissing a control, navigating back, or closing a pop-up or menu — in many ways, similar to what might happen when pressing the Escape key in a desktop application. To perform this gesture, place two fingers on the screen and move them quickly in a scrubbing motion such as right, left, right.

Putting it all together

If you ever have a reason to temporarily dismiss the FaceTime Audio call control bar and need to do so using VoiceOver, here’s how to do it.

  1. Touch the FaceTime Audio control bar with one finger; this will set VoiceOver’s focus to the correct place. This is important because otherwise, VoiceOver’s focus will remain on your home screen or on whatever application screen you have open, and the scrub gesture will not dismiss the control bar.
  2. Perform the two-finger scrub gesture. If successful, the control bar will go away. If not, double-check that you have correctly set VoiceOver focus to the control bar as just described. If the two-finger scrub gesture isn’t performed correctly, it is possible that focus may inadvertently move away from the FaceTime Audio control bar.

A few more things to note. First, I don’t know of a way to permanently dismiss the FaceTime Audio control bar and so you will have to repeat these steps whenever you need to dismiss it. Second, if you dismissed the control bar and then want to have it back, you can make it reappear by double tapping the call indicator located on the iOS status bar.

I really like the new FaceTime Audio control bar and find it super useful to have call controls available regardless of which app I’m in or which screen I’m on. For those times though where it might come in handy to move that bar out of the way, I’m glad there’s an accessible way to do so.

On this Thanksgiving, a quick note of thanks

As we celebrate Thanksgiving here in the US today, I wanted to send out a quick note of thanks to all of you: for reading my words, for providing encouragement as I continue my blogging journey, and for engaging in some really amazing conversation along the way. I have a lot to be thankful for this year, but there is one group of folks I want to recognize in particular: those developers who work extra hard to ensure their apps are accessible.

There are many developers who work tirelessly to make their apps accessible, not because they necessarily have to, but because they simply realize it’s the right thing to do. There are many accessibility resources out there that can help developers make their apps accessible, but finding those resources, understanding them, and figuring out how to implement them can be a real challenge, especially for developers with extremely limited resources.

I’d like to encourage everyone to think about an app that makes a real difference to them, whether for accessibility or other reasons, and consider writing the developer a positive review of thanks today. I’ve spoken with many developers who have indicated to me that while it may seem like a small thing, positive reviews make a real difference. First, the more stars an app receives, the more likely it will be discovered by others. Second, a kind review is a great way to show appreciation in a public way. And finally, your review might make a difference to someone who appreciates the hard work a developer has put into making their app accessible — I know I’ve felt more comfortable purchasing apps when I see a review like, “works well with VoiceOver” or “very accessible”. Writing a quick review is a great way to say thank you, it’s something that makes a real difference, something that is appreciated, and something that only takes a few minutes to do.

Again, thank you all for reading my words, supporting me, and for continuing the conversation. To those who celebrate, have a happy Thanksgiving.

Quick tip: how to get rid of the iOS bubble sound when typing or using Braille Screen Input

I’ve been using Braille Screen Input on iOS for years, as it helps me to type more efficiently. One thing that has bothered me though, whether typing with Braille Screen Input or the on-screen keyboard, is this bubble sound that VoiceOver occasionally makes. While that sound does have a purpose and an important one at that, I find it distracting and have always lamented that I didn’t have a way to disable it. Little did I know that there actually is a way to disable it.

I received many replies on Twitter, some from people experiencing the same frustration as me, and others offering a solution I likely never would have found on my own.

As it turns out, there are actually a lot of sound customizations that can be made in VoiceOver, many of which are off by default and so I never even knew they existed. Not only that, but it’s possible to preview each of the VoiceOver sounds which is a great way to learn what they actually mean. I recorded a brief video showcasing these settings in the hopes it might be useful to others.

Demo of the VoiceOver sounds dialog

Disabling the VoiceOver auto fill sound has made a world of difference for me. Now I can use Braille Screen Input without being distracted every couple of words. In fact, I’ve written this very entry solely using Braille Screen Input.

I would like to thank Rachel, Matthew, and Kara, for getting back to me so quickly with what proved to be the perfect solution. Twitter can be an awesome place for conversation and I’m glad these awesome people are a part of it.

An open thank-you to @YNAB for improving accessibility in an incredibly meaningful way.

For years, I’ve been a fan of YNAB, You Need a Budget. I love the principles behind their budgeting methodology, I love the app, I love the company; I’m just a really huge fan. YNAB has helped me to pay down debt, feel more confident about where my money is coming from and going to, and generally feel way more in control of my financial life. Unfortunately, on 07/14, I downloaded an update to YNAB’s iOS app, an update that contained significant accessibility issues.

Tweet from Steve

I was hurt. I was upset. I was not sure if I had set enough money aside to cover my rent payment — in short, it was not a very good day.

When it comes to making products and services accessible, it’s really important to understand that accessibility isn’t a nice-to-have or a feature request. This is especially true if you offer a product or service that people might depend on. Sure, I could have switched to another app, but I would likely also have had to change my budgeting method to one that aligned with whatever new app I had chosen. That would have been especially challenging, as I couldn’t put my financial life on pause while I figured it all out.

Fortunately, the fine folks at YNAB were extremely responsive and understanding, indicating that they were already working on fixes and, more importantly, were working to ensure that issues like this wouldn’t happen again.

And so here we are, roughly three weeks later, and I again get a notification that an update to the YNAB app is available. Even better, the “What’s new” section of YNAB’s App Store entry mentions:

• Two major accessibility wins:

◦ We made many improvements to VoiceOver interactions.

◦ We changed our background colors so that the Increase Contrast accessibility setting will now apply to the YNAB app and actually increase the contrast.

What’s New section in YNAB’s Apple App Store entry for version 3.01

I downloaded the app and was absolutely blown away. YNAB is now more accessible than ever; it’s a complete accessibility transformation. Because of their work on accessibility, I can use the YNAB iOS app way more efficiently than ever before.

Steve’s tweet thanking YNAB after being blown away with their 3.01 update

So, what does all this mean? Certainly this is a win for me personally, but it goes way beyond that. By

working to improve our approach to accessibility concerns to prevent instances like this in the future

Mentioned by YNAB in a follow-up tweet

YNAB has helped ensure that I remain a loyal customer: I’m happy to continue using their product, and I’m happy to continue paying for it, because I feel listened to and I feel valued. And that’s a big part of accessibility that often gets left out of the conversation: accessibility is about equal access to products and services, but it’s also about listening, and responding, to customer needs. Isn’t that a key component of any great brand? When a company values me, and goes the extra mile to show me that I am valued, it creates loyalty, because like most consumers, I appreciate companies and brands that appreciate me.

So thank you, YNAB team, for your work on accessibility over these past few weeks. Not only have you transformed your iOS app in an incredible way, but you’ve also demonstrated, by taking action, that you value me and others who use assistive technologies. I’m proud of the tremendous amount you’ve accomplished and am excited to see what comes next.

The surprising accessibility possibilities of mobile check deposits

Recently, I had a conversation with a blind friend of mine who finds herself in an interesting situation. She has received paper checks; however, because everything is locked down, depositing them has become a real issue. That got me wondering how accessible mobile check deposits might be; it seems that just about every bank offers this option, but is it an accessible one? Thinking it over, a few possible challenges immediately came to mind:

  1. Knowing exactly where to endorse the back of the check, and writing “for mobile deposit” or similar, which many banks now require.
  2. Aligning the camera so that the front and back images of the check are properly captured.
  3. Knowing one way or the other that the deposit has been accepted.

While I certainly can’t test every banking app out there, I did try a test with Wells Fargo’s app and was extremely impressed.  Wells Fargo has somehow implemented camera guidance, so that VoiceOver helps the user position the camera correctly for the check image to be captured.  Even better, when everything is aligned, the photo is automatically taken and, before final submission, the user gets notified if the photos need to be re-taken because of quality or other factors.  

So, how does it work? First, the app asked me to capture the front of the check. I discovered that I needed to hold my phone in landscape orientation (left to right), which is something I hadn’t expected. Since a check is small, I assumed — wrongly it would seem — that the phone could be held in portrait orientation. As I lifted my camera away from the front of the check, VoiceOver started providing me with guidance: “move closer,” “move right,” “move down,” and finally, the picture was taken. The process then repeated itself to capture the image of the back of the check. Unfortunately, the part that remained inaccessible for me was properly endorsing the back of the check and writing “For mobile deposit only,” which the bank requires. Maybe this could have been accomplished with the help of a service like Be My Eyes or Aira?

 

I was surprised that the process of mobile check deposit, at least with Wells Fargo, was not as inaccessible as I had feared. Unfortunately, I tried a few other banking apps and met with very different results. I also did not test with Android. In summary though, the process of mobile check deposit can be made mostly accessible, as demonstrated by Wells Fargo’s app. If you try this with your bank and meet with different results, it might be worth sending them a support message encouraging them to investigate making their process more accessible. While the technical details surpass my development abilities, my understanding is that Apple makes various APIs available to developers who want to incorporate camera guidance in their applications.

Has anyone else tried mobile check deposit recently?  If so, what have your experiences been?