-
How Microsoft's accessible OneNote helps me to manage a medical crisis
On April 25, our daughter, Gabrielle, was rushed to the hospital by ambulance after having a breathing episode. Gabrielle (Gabby) has a condition which unfortunately causes her to have many such episodes; this time was different, however, as she had a seizure and lost consciousness, twice. To say that the weeks since have been a nightmare would be a huge understatement.
In addition to all the emotional stuff, the sheer volume of incoming information soon became overwhelming. Multiple doctors conducting multiple tests, prescribing multiple medications, making multiple changes to her diet, proposing multiple theories as to what might be going on with her. Our focus needed to be on Gabby and on the situation, and yet we also needed to do our best to stay on top of the ever-growing pile of information if we were to have any hope of making informed decisions about her care. How to manage it all?
Information was coming in all sorts of formats. “Call me with any questions,” said many of the doctors while handing me their printed cards. “Here’s a bunch of articles I’ve printed out for you to read,” said others. My own frantic research attempts were turning up links and information at a staggering rate. And of course there were the actual meetings with her medical team that required me to write stuff down very quickly and without much time to prepare. I have a plethora of scanning and note-taking apps, but I really needed everything centralized in one place. Not only that, but I needed to make sure my wife and I could share information back and forth without giving any thought to the actual logistics of making that happen.
I’ve been a huge OneNote fan ever since learning of the Microsoft Accessibility team’s efforts to make it accessible. I use OneNote primarily for work, but also use it to keep track of various things going on in my personal life. Still, I’ve always had the luxury of knowing that if OneNote failed me, I could use a backup option, and while it might be less convenient, I could certainly make do. Within hours, I no longer felt like I had that luxury: I needed a system that would work for more than just me. I needed a system that would be dependable. I needed a system that would allow me to use my phone, my computer, or anything my wife, Jenn, might have access to from the hospital. OneNote met all those requirements, but OneNote’s accessibility is relatively new; should I really trust it for something like this?
Dealing with all the print.
Microsoft makes a product called Office Lens which allows a photo to be taken of a printed page. The text in that photo can then be recognized using optical character recognition and the results read aloud. One of the really awesome things about Office Lens, at least on iOS, is that I get spoken feedback when positioning the camera. I can also send the original image, along with the recognized text version, to OneNote. Whenever given something in print, whether a sheet of paper or a business card, I tried to immediately capture it using Office Lens. Being wired on caffeine and adrenaline, I’m amazed I was able to hold my phone’s camera steady enough to capture anything, but Office Lens talked me through the positioning and, for the most part, it worked great. Certainly I didn’t get 100% accuracy, but I got names and numbers and enough text to get the gist. Microsoft also makes a version of Office Lens for Windows 10 which I was very excited about until I realized it wouldn’t let me use my flatbed scanner; apparently, like the mobile versions, it’s really designed to use a camera. I found a work-around by scanning pages using an alternative app and importing the images into Office Lens, but maybe someone out there knows of a better way? During this past CSUN, Microsoft demonstrated the ability to scan a document using their Surface Pro; I may need to add this thing to my Christmas list if it really works.
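For the programmatically inclined, here’s one possible workaround for the flatbed problem. This is a minimal sketch, not Office Lens itself: it runs a scanned image through the open-source Tesseract OCR engine so the recognized text can be saved and pasted into OneNote alongside the original scan. It assumes the pytesseract and Pillow packages are installed and that the Tesseract engine is on the system path; the file names are hypothetical.

```python
# Rough sketch: OCR a flatbed scan with the open-source Tesseract
# engine (an alternative to Office Lens, not the same tool).
# Assumes: pip install pytesseract pillow, plus a Tesseract install.
from PIL import Image
import pytesseract

def ocr_scan(image_path: str, text_path: str) -> None:
    """Recognize the text in a scanned page and write it to a .txt file."""
    text = pytesseract.image_to_string(Image.open(image_path))
    with open(text_path, "w", encoding="utf-8") as out:
        out.write(text)

# Hypothetical file names; point these at a real scan.
ocr_scan("scan.png", "scan.txt")
```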
Quickly writing stuff down.
I don’t know how many times I’ve heard the saying “there’s never a pen around when you need one,” but it’s true. No matter how prepared I think I am to write something down, it almost never fails that someone has information for me when I’m in the most inconvenient place to receive it. One great aspect of OneNote is that there are numerous ways to quickly take down information. On iOS, there’s a OneNote widget that allows me quick access from any screen. I can pull down the notification center, swipe right, and my OneNote widget is the first widget on my list. I simply select the type of note I wish to write (text, photo, or list) and get a blank page for writing. I have the option of titling my page, although if I’m in a hurry, I’ve found it easier to just write whatever it is down and title the page later. If I’m not in a position to type, or if there’s simply too much information, OneNote gives me the option to attach a voice recording to a note.
If I’m at my computer, I have a really great option for taking a quick note: the OneNote desktop app, which is bundled as part of Office, has a feature called Quick Note. From anywhere, I simply press Windows+N and I’m placed in the title of a new blank page. I can write a title or title it later; most important, I’m at a place where I can just start writing. When I close the window, my note is saved and I’m returned to wherever I was when I hit Windows+N. This makes it possible for me to take down a note literally at a moment’s notice; I don’t even have to cycle through open windows, which is great since I generally have a ton of those open at any given time. My only gripe is that OneNote stores these quick notes in their own notebook and I have to move them to the correct place later. I’m hopeful there’s a setting somewhere which will allow me to configure this behavior, but if not, I consider it a very small price to pay for an ultra-convenient way to take a quick note.
Managing Gabby's home care.
While Gabby still has a long medical journey ahead, she is stable and is able to be home with medication, monitors and other supports in place. Coordinating which medications she needs to take and when, in addition to tracking other aspects of her condition, is again something we’re managing to accomplish with OneNote. First, we created a to-do list of all her medications to use as a sort of template. We then copied this list, renaming each copy to a corresponding date. In this way, we can keep track, day-to-day, of which medications have been taken and which remain; no back-and-forth between Jenn and me about whether Gabby has taken a specific medication or not. There are a few drawbacks to this system, most notably that if any of her medications change, we’ll need to delete and re-create all the future pages in her journal section. There are certainly other to-do apps that could help us more easily manage recurring to-dos like this, but by using OneNote, we’re able to keep all her information centralized and synchronized. In addition, using OneNote makes it easy for us to track events such as breathing episodes and other real-time observations which we could not properly capture in a to-do app. As we continue to work toward figuring out the best next step for Gabby, we have a central place to compile research. Also, as medical bills and insurance claim determinations start arriving by mail (amazing how fast that happens), we have a way to organize those as well.
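For anyone curious about automating the template-copying step, here’s a minimal sketch, not what we actually did by hand in the app, of creating one dated checklist page per day through the Microsoft Graph OneNote API. The access token, section ID and medication names are all placeholders; re-running something like this after a medication change would regenerate the upcoming pages.

```python
# Sketch: generate dated medication-checklist pages in a OneNote
# section via Microsoft Graph. ACCESS_TOKEN, SECTION_ID and the
# medication list below are placeholders, not real values.
from datetime import date, timedelta
import requests

ACCESS_TOKEN = "..."  # placeholder: obtain via OAuth
SECTION_ID = "..."    # placeholder: ID of the journal section
MEDICATIONS = ["Medication A", "Medication B"]  # hypothetical list

def create_checklist_page(day: date) -> None:
    """Create one page titled with the date, one to-do tag per medication."""
    items = "".join(f'<p data-tag="to-do">{m}</p>' for m in MEDICATIONS)
    html = (
        f"<html><head><title>{day.isoformat()}</title></head>"
        f"<body>{items}</body></html>"
    )
    resp = requests.post(
        f"https://graph.microsoft.com/v1.0/me/onenote/sections/{SECTION_ID}/pages",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/xhtml+xml",
        },
        data=html,
    )
    resp.raise_for_status()

# Generate the coming week's pages; rerun after a medication change.
for offset in range(7):
    create_checklist_page(date.today() + timedelta(days=offset))
```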
Problems and challenges.
I don’t regret my decision to use OneNote to help me manage these past few weeks, not even a little. That said, I have encountered some challenges and feel they’re worth mentioning. To be fair, I see that OneNote for iOS actually has an update today, so some of these may no longer exist.
On the iOS app, when using a Bluetooth keyboard, editing text doesn’t seem to work as expected. Specifically, when I arrow around, sometimes I find myself on a different line, sometimes on a different word, and commands to move word by word don’t seem to work as I think they should. My stopgap solution has been to simply not edit on iOS; I hit the asterisk ‘*’ key a few times to mark that there’s a problem, hit Enter and just keep on typing. While editing would be great on iOS, and maybe it’s just me who’s doing something wrong, my primary interest is in capturing the information, knowing that I can always clean it up and edit it later on my PC. When using Braille Screen Input, my preferred method of typing on iOS, I sometimes need to double tap the text area even though the keyboard is visible. I’m not sure why this is the case, but it’s an easy fix to a strange problem.
On the PC side, working with the Windows 10 OneNote application is far easier than working with the OneNote desktop application provided as part of Office. That said, the Quick Note functionality is only available in the Office version, not the Windows 10 app version. For the most part this doesn’t cause any problems; it’s just a little confusing in that if you want to use Quick Notes, you have to make sure the Office version of OneNote is installed even if, like me, you don’t use it for anything else.
My other frustration with the Quick Notes functionality of the Office app, as mentioned above, is that I can’t seem to change where it wants to actually put my quick notes. I want them in the cloud, within a specific notebook, and Office wants them on my local machine, in a different notebook. Fortunately, it’s very easy to move notes from one place to another; it’s just one more thing I need to remember to do, and if I forget, those notes won’t be synchronized to my phone and to Jenn.
Currently, in the Windows 10 OneNote app, I cannot figure out how to check items off the to-do lists. I can read the lists just fine, but can’t tell what’s checked and what isn’t. My solution for this is to simply use iOS for now when checking off Gabby’s medication.
Office Lens has got to be one of the coolest apps ever, especially on iOS where it provides fantastic guidance for positioning the camera. On Windows, Office Lens seems very accessible, although I haven’t figured out how to make it work with my flatbed scanner. I don’t know if there’s a way to fix this, or if I need to find another way to import scanned images into the Windows 10 OneNote app such that text within the image is recognized.
Summary
Throughout my life I’ve done many things to prepare for all sorts of emergencies, starting as far back as fire drills in elementary school, but I’ve never given a great deal of thought to what, for now, I’ll call informational preparedness. The following are a few questions you may wish to consider; having the answers now, when they’re not needed, is much better than not having them later, when they might be.
- If I were in a situation where I needed to write something down, right now, how would I do it?
- Am I dependent on one device? Put another way, if I drop my phone or laptop and it smashes, what does that mean for the information that's important to me?
- Do I have the contact numbers for friends, family, doctors, transportation services and any others I might need, and can I access them quickly? Do I have these on more than one device and do I know how to access them wherever they are?
- Do I have a way to share information with someone else in a way that makes sense to me and them? Who might that someone else be and have we discussed this specifically?
- How do I handle materials that are in a format inaccessible to me in an urgent situation? It might be fine for my neighbor to help me read my mail, but they may not be available to me all day, every day.
- Does my doctor/pharmacy/healthcare provider have a way to send me information in a more accessible format? Many places are using online systems similar to MyChart, but getting that set up when it's actually needed is not fun -- it's really not.
Finally, I want to thank the OneNote team and countless others who have been working to make technology accessible. Technology is truly an equalizer in ways that, even as a member of the accessibility field, continue to amaze me and I couldn’t be more appreciative.
-
My Frustrations with Android Notifications
Notifications: They tell me when I’ve missed a call, gotten an Email, received a text message and so much more. Notifications have become a critical part of how I work and play, and without them, I sometimes wonder if I’d know where to begin.
The Lock Screen
On iOS, the first way I likely encounter notifications is on my lock screen. Quite simply, when I wake my phone, notifications show on my lock screen in the order received, oldest to newest. So, when I wake up in the morning, or come out of a meeting and grab my phone, I can quickly skim through whatever notifications I've missed overnight. On Android, the experience is very different. First, my lock screen shows notifications; however, they do not seem ordered in any particular way. For example, looking at my lock screen right now, I see a Facebook notification that came in an hour ago followed by a Skype notification telling me about a message I received three minutes ago. Next to both of these notifications, I have an "expand" button which, if activated, will show me additional notifications from that application. Put another way, the notifications seem to be grouped even if the groups themselves don't seem sorted in any particular order. On the one hand, this grouping thing is kind of neat as I can quickly see the apps that have sent me notifications and, if I'm interested in the particulars of any, I can expand them. The problem is that this too doesn't seem standardized between applications: some applications group notifications as just described, others don't. In addition, some applications have a specific button that says "expand" to which I can swipe, and others require me to tap on the notification itself and go on faith that it will expand to show additional content. Others say "dismissable," although I haven't figured out how to actually dismiss them. Much as I like the concept of grouped notifications, the inconsistencies I've observed so far make it more confusing than anything else. One cool thing that Android seems to have on the lock screen, though, is this thing I'm calling the notification summary bar. If I explore by touch, moving upward from the bottom of the lock screen, I encounter a line that, when touched, reads a number followed by a detailed listing of all my notifications. I'm not sure what this looks like visually as there's just no way all the content that gets read aloud would fit on the lock screen, let alone a single line. Still, it's a good way to quickly get an overview of all notifications.
Notification Center and the Notification Shade
Both iOS and Android have a way to display notifications once the device is unlocked; iOS calls this the Notification Center and Android (at least TalkBack) calls this the Notification Shade. On iOS, the Notification Center is opened by using a three-finger swipe down gesture from the top status bar. On Android, there are two ways to access the Notification Shade: either a TalkBack-specific swipe right then down gesture, or a two-finger swipe down from the top of the screen. I'm improving; in the beginning, however, it was a bit challenging for me to perform either of these gestures reliably. When the Notification Shade is activated, I first encounter the time, followed by my Wi-Fi status and a control to disable Wi-Fi, then my cellular signal status, then my battery status, then my Bluetooth status, then my screen orientation, and then my notifications. While this is quite a bit to have to go through, having a sort of quick control center easily available is neat. As with the lock screen, notifications are grouped, or at least they attempt to be, and like the lock screen, the grouping doesn't seem consistent. On the shade, I have a Gmail notification that says "nine more notifications inside." Other notifications, though, don't tell me how much additional content they may or may not include, and I only know they are expandable because they are followed by a button that says "expand." This button isn't programmatically associated with the notification though, so unless I swipe through the shade, I'm not sure which notifications are associated with buttons to expand additional content. The Notification Shade also contains a few details that don't appear on my lock screen: one is my local weather and another is an Android notification advising me that I can enable the ability to unlock my phone with my voice. While it doesn't really bother me, the weather appearing here is a bit incongruous with the other types of notifications present. At the very end of the Notification Shade is an unlabeled button which I've discovered is a clear-all-notifications button of some sort. I know it's possible to clear all notifications on iOS if using an iDevice with 3D Touch; however, this seemingly simple and logical feature has existed on Android for a long time now and it could almost be fantastic. I say almost because, when I activate this button, my phone starts going crazy and counting down messages while playing a notification tone, "82 messages, 81 messages, 80 messages, 79 messages, 78 messages..." with a tone for each one. I've discovered that if I lock my screen at this point, the countdown seems to proceed much faster, probably because TalkBack isn't trying to read the number of messages. I really have no idea why this is happening, but while the clear all notifications feature is a good one, I definitely hesitate before using it.
Sounds, vibrations and other observations
One of the more baffling things I've noticed about notification sounds on Android is that, at least on the devices I've tried, they always play through both the headphones (assuming headphones are plugged in) and the phone's speaker. So, let's say I'm in a meeting and I decide to have a text conversation with someone -- strictly a hypothetical situation in case my boss happens to be reading this blog post. :) I plug headphones in and send a text. When I receive an answer though, the notification sound is played through both the headphones and the phone's speaker. I can set my notification alerts to vibrate only and solve this problem, but it still strikes me as odd that I can't make notification sounds play strictly through headphones.
Conversely, if I'm on a call, phone/Skype/WebEx/other, I don't hear any notification sounds at all. Presumably the thinking here is that I wouldn't want my call interrupted with additional sounds being played; however, I find those notification sounds very helpful for determining which notification I just received. If I get a notification while on a call, indicated by a vibration, the only thing I can do is open the Notification Shade and hope that the most recent notification is on top, or at least not grouped with other notifications. In reality, this has proven extremely problematic for me, almost to the point of being a complete deal breaker. Part of the reason this doesn't work as smoothly as it possibly could is because TalkBack forces me to make a very difficult choice: whether notifications should be read aloud when the phone's screen is locked. If I enable this feature, all my notifications get read aloud when the screen is locked, including sensitive content such as text messages, Hangouts conversations and so forth. If I disable this feature, TalkBack stays quiet when notifications appear on the lock screen; however, as the screen automatically locks after a short period of time when on a call, this means nothing gets read, which isn't helpful since I don't get the sounds in the call scenario either.
But let's push that entire mess to the side for just a moment and talk a little about notification sounds themselves. One of the really cool things about Android is that many apps allow their notification sound to be customized. This means that unlike in iOS, where many applications use the default tri-tone notification sound, Android applications allow the user to pick from whatever text/notification sounds exist on the device. This is one feature I absolutely love, or at least I would if Android would stop resetting certain sounds to the default sound. For example, I configured my device to play one notification sound when receiving work Email and another sound when receiving personal Email. That worked fantastically for three days or so, but now I'm getting the default notification sound regardless of whether Email is received on my work or personal accounts. Other apps which have unique notification sounds on iOS don't seem to have any unique sounds on Android; either that, or they do have the same unique sounds, but the default notification sound is being played for reasons I can't explain. For example, there's an accessible dice game called Dice World which has a notification sound of dice rolling when an opponent has played their turn. Initially, this sound would play just fine on my device, but now I just get the standard notification sound and don't seem able to change it.
Quick side note: Yes, I do have the "play notification sound" setting enabled in Dice World. Same situation with Tweetings, a very powerful Twitter client that has built-in notification sounds that initially played, but which now no longer do. The point here is that the ability to customize notification sounds is extremely powerful, but I'm not sure how stable it is. In addition, not all apps allow notification sounds to be customized in the first place.
As I wrap up this blog post, I’m left with the feeling that I’m barely scratching the surface of Android notifications. I say this because I’ve gotten feedback on Twitter and elsewhere that others are not having the same experiences as me. For example, some people claim to have an edit button on their Notification Shade which allows them to specify how notifications get sorted, while others do not. I’m also not sure if anyone else is experiencing the same inconsistencies as me with regard to notification sound preferences resetting themselves to default. In the end though, I remain confident that I can find workable solutions to these challenges; how difficult those solutions may be to implement remains to be seen.
-
While all men may have been created equal, all Android devices are not.
I wanted to write about something which Android folks probably take incredibly for granted, but which might be a bit perplexing to users coming from iOS. And it’s one of those things that, as I write it, seems incredibly silly and yet really isn’t. In fact, it’s one of those twists of irony that make Android an attractive option in the first place. First, some background.
As written previously, I’ve been learning on a Motorola G4 Play which I picked up on Amazon Prime for around $149. I really love this particular device: it’s small, it’s lightweight, texture-wise it’s easy to grip, and you just can’t beat the cost. Still, one thing that’s been frustrating me is that occasionally, when I double tap something, the phone doesn’t register a double tap. In addition, I find that sometimes I’ll swipe my finger from left to right and yet the phone will act as if I hadn’t swiped at all. I was complaining about this to a friend of mine, David, one night.
“Gosh,” I complained, “how can anyone take Android seriously when it can’t even reliably recognize a swipe gesture?”
After a bit of a pause, David cautiously replied, “Do you think it could possibly be your hardware?”
Honestly, in my frustration, I hadn’t even considered the hardware angle and how that might have a real functional impact on things like gesture recognition. But it stands to reason that the hardware differences between my $149 Moto G4 Play and David’s $649 Pixel just might be a factor somehow.
Going down the rabbit hole
I wanted to get an idea of just how much of a difference different hardware configurations might make, especially in terms of device accessibility. Obviously devices with faster processors and more RAM are going to perform at greater speeds, but what about touch screen sensitivity, ROM customizations and anything else that might impact accessibility?
I started out by purchasing an Asus ZenFone 3 Laser because not only is its metal construction extremely solid, but, well, it just has a really awesome name. OK, all that is true, but I really bought it because it has a larger display, more RAM and a slightly faster processor than my Moto G4 Play. After getting through the setup process, I was introduced to what Asus calls ZenUI 3.0. ZenUI is basically the Asus customization of Android, including a number of applications and widgets, a redesigned home screen, a customized dialer, custom sound effects for things like locking and unlocking the screen and for notifications, and other tweaks to make their phone unique. Coming from iOS, the idea that the entire home screen, notifications and even the phone dialer can be customized is very unsettling. After all, if I talk to another iPhone user, I can walk them through how to place a call because I do it exactly the same way on my own device. The Asus customizations, however, were so significant that I was unable to figure out how to access my menu of applications. I want to be clear here: I’m not necessarily saying that finding the application menu is inaccessible; it just wasn’t at all intuitive and definitely wasn’t the same experience as on my Moto G4 Play. What I soon learned, though, is that Android allows for the installation of what are known as Launchers. My understanding thus far is that Launchers basically define things like the home screen layout. After installing the Google Now Launcher, which is apparently installed by default on my G4 Play, my application menu appeared where I was expecting it and some of the other random dialogs that had started popping up simply went away. In the end, I experienced similar frustrations to those I had been facing with the G4 Play, with the additional frustrations of figuring out how to get my home screen and other aspects into a state where I could use them. As awesome as its name is, the ZenFone soon found its way back to the store.
Next up, I purchased a Blu 5R which is also a solidly built phone – yeah, I tend to gravitate toward phones that are solid, heavy and which feel like they won’t fall apart at the drop of a hat. As with the Asus model that I returned, the Blu phone has a larger screen and slightly better specs than my G4 Play. While the Blu had its share of customizations, such as rather cute startup and shutdown sounds and a number of pre-installed applications, my experience was a very positive one. Although not perfect, I experienced fewer issues with gesture recognition, I loved the fingerprint sensor (the G4 Play doesn’t have one) and the speaker, once I realized it was initially covered by a sticker, is really fantastic. If anyone reading this is looking for a budget entry-level phone, the Blu 5R should definitely be considered. I wound up returning mine, but only because I couldn’t justify keeping it given that I already own the G4 Play.
And so it was with great anticipation that I awaited the arrival of my latest phone from somewhere in China, the OnePlus 3T. I’d never heard of the company, OnePlus, but they are a startup specializing in high-end, high-performance devices at mid-level prices. The specifications of the OnePlus 3T rival those of the Nexus at just over half the price, and the reviews are fantastic. If I decide to seriously make the switch to Android, the 3T, with its super-fast battery charging capability, 6 GB of RAM, convenient slider to quickly enable do-not-disturb and amazing form factor, is a device I could see myself using day-to-day. More importantly though, gestures are definitely recognized, accurately and consistently.
What have I learned?
I’ve actually learned a lot over these past few weeks, beyond the fact that my local electronics shop has a really great return policy. First, when purchasing a new Android device or when seeking assistance, it’s important to remember that Android devices can be different, sometimes vastly so. If you’re coming from iOS, this is extremely important because, for the most part, iOS devices and how they operate are pretty similar across the board. Another thing I learned is that when an Android user says, “hmm, I’m not experiencing that issue,” it could really mean that, given their specific hardware and software, which may be different from yours, they’re genuinely not experiencing the same issue you are. It’s been my experience that sometimes, when an iOS user says they’re not experiencing an issue, it’s meant as a mild rebuke: something along the lines of, “I’ve got the same hardware as you, I’ve got the same software as you, it’s working fine on my end, clearly it’s a problem on yours.” Looking over this paragraph, I realize I’m overusing the word experience, but in a way, that’s exactly what we’re talking about here. One of the very things that makes Android such an attractive option is the flexibility to customize just about every part of the experience. This comes at a cost though, the cost being fragmentation between what I might experience and what you might experience.