Accessibility in Action


Formal Metadata

Title
Accessibility in Action
Number of Parts
46
License
CC Attribution 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.

Content Metadata

Abstract
Most developers agree that their apps should be “accessible,” but what does that even mean? Even if you’d like to have an accessible application, you find roadblocks along the way: lack of documentation, push-back from product priorities, no standards for mobile devices, and perhaps most importantly, not understanding what users really want. As Accessibility Lead for iTriage, Kelly has worked with advocacy groups such as the Blind Institute of Technology to uncover what makes an app truly accessible, and the actual developer time investment required. Spoiler alert: it’s easier than you might think! Join us as we discuss common pain points suffered by some technology users, how the Android platform is quickly gaining on iOS in accessibility features, and how you can make your app accessible to all. We’ll finish up with some code samples of common pain points and see how easily they can be fixed.
Transcript: English(auto-generated)
Thank you so much everybody for coming. It's wonderful to be here at Droidcon Berlin. My name is Kelly Schuster, and I will be talking about programming accessibility for Android development. So, a quick rundown of what we're talking about today. First, we're going to talk about
what is accessibility? What do we mean when we say that? Android platform versus iOS platform, where each of them are, and the differences and similarities between the ecosystems. We'll take a look at the accessibility features that are available to Android users, and then finally we'll finish up with some code samples.
So just before I continue on, so I know like where to tailor my talk specifically, a couple questions, so please be honest. How many people here are Android developers? Yay! How many people are on the design UX UI side? Awesome, okay. How many people have heard of TalkBack? How many people have used TalkBack for more than an hour?
Awesome, cool. And how many people know a blind friend or colleague who uses an Android device? Awesome, okay, great. Thank you very much. So,
a little bit about me. I'm an Android developer. I work in Denver, Colorado, USA at a place called iTriage. We make a medical app for the healthcare industry with over 5 million downloads from the Play Store, and we're owned by an American health insurance company called Aetna,
and so we're also helping them out with some of their new products. And before I was an Android developer, I did firmware in C and C++, and my degree is in electrical engineering from the University of Colorado at Boulder. And during my time in undergrad, that's actually when I started caring about
accessibility. So a friend of mine was studying architecture, and he had a big presentation that he had to give to his class, so he wanted to give it to me first and get my feedback. So his study was actually about the built environment accessibility for wheelchairs, things like that.
And his presentation was awesome. He did a great job, but one slide from his presentation really stood out to me, and that was this one. This is an aerial view of the campus, and this piece right here is the central corridor of campus. And each of the red dots signifies a doorway that's inaccessible
to a wheelchair. The red dots with a black circle are doorways that are inaccessible by just one stair. When I saw this, I was astounded that almost half of the inaccessible doorways were due to a single stair. And I got really mad for two reasons. The first reason was, this is ridiculous. It's just one stair.
It's not like a grand architectural statement or a spiral staircase that's very ornate. It would have been very easy to take that into account, just slope the ground up and get rid of that stair. But the second reason that I was very upset was that I had never noticed. I had been going to school here for four years,
and I never noticed that this was even a problem. This specific doorway, I went in well over a hundred times and never thought about it. And I think that we face a similar situation as Android developers when it comes to accessibility of software. There are a lot of "one stairs" that we have in our code
that we can fix by just a single line of code. It's very easy. It's not technically difficult. But if we don't experience the pain point, we don't know it exists, and so we forget about it. So I hope at the end of this talk, you'll take away that programming for accessibility is very simple. And the hardest part about it is just remembering to do it.
All right, so when we talk about accessibility in like a really broad sense, we're talking about people all over the world, every person could use our software. But usually accessibility for software focuses on impairments. And there are three major types of impairments that you need to keep in mind when you're programming. Mobility, audio and vision impairments.
So each of these impairments exists along a spectrum. So for mobility impairment, you might have some small dexterity issues, maybe your hand is shaking, all the way to complete paralysis. And there are technical solutions for this. So you always want to make sure you're following the 48dp rhythm.
It's a design pattern in Android that states that any actionable items should not be smaller than 48dp in width or height. Also, in the accessibility settings menu, users can set a time length for what a short press and what a long press means. That way they can have more time to do a short or long press if they need it.
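As a rough sketch of that 48dp rule, a visually small icon button can still reserve a minimum touch target in layout XML like this (the id, drawable, and string names here are invented for the example):

```xml
<!-- Hypothetical example: a small icon that still gets a 48dp touch target -->
<ImageButton
    android:id="@+id/btn_close"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:minWidth="48dp"
    android:minHeight="48dp"
    android:src="@drawable/ic_close"
    android:contentDescription="@string/close_button" />
```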
You go all the way on the other side to complete paralysis. There's something called switch access where you have a button and you can highlight views and then double tap the button to select the views. So you could do this with a finger or your head even.
You can do the same navigation of an app with a straw by sipping or blowing into the straw. And I even saw an EMG sensor where someone had a patch that was picking up flexing and relaxing of their chest muscle. And they were completely paralyzed to control a visual display and they were navigating even between their computer and their smartphone.
It was quite brilliant. There's a really good example of what this switch access looks like in this year's Google I/O Android talk, which was recorded. Audio impairment is also on a spectrum, from some hearing loss to complete deafness. And remember, all of your audio notifications must also have a visual component.
And users can also enable system wide closed captioning on their Android device. Visual impairment is the most difficult one to design for and that's what my talk is going to focus on today. If you were in the keynote yesterday or many of the other talks today, everything is focused on visual representations, visual pattern grouping.
All of her mentions about affordance, all these things were very focused on like when I see something, how do I react? And even the hardware itself, like phones keep getting bigger and bigger. And it's mostly because people want bigger screen real estate, material design pattern.
It's all about animations and floating pieces of paper, and every single one of these is a visual component. So it's actually getting harder and harder as a developer to design an app that's not completely focused on the visual spectrum. So if somebody is colorblind, there are settings in the accessibility settings where you can pick which type of color blindness you have, and it will adjust the hue of colors so that you can still see everything, and that's system wide. There are also contrast settings. And for low vision and for complete blindness, you would be using TalkBack, which is a screen reader that reads through the elements on the phone.
So a little bit about Android versus iOS. iOS has been the big player predominantly in accessibility. They put a lot of infrastructure in place at the very beginning. And so they're kind of like the original big player. However, Android is catching up.
So TalkBack, the screen reader for Android, actually started all the way back in Donut. It's evolved a lot since then, and VoiceOver for iOS started in iOS 3, but VoiceOver is also kind of leveraged from the original VoiceOver, which is on every MacBook or Mac computer.
Unfortunately, as we all know, Android doesn't have great documentation in many things and accessibility is not an exception. And so this goes for documentation, both in how to develop for accessibility as well as for users. And there's also a smaller user community. So if a user can't figure out how to access some piece of their phone, usually they reach out on blogs and forums.
The community is smaller for Android. iOS, of course, better documentation and a much larger community. So if you're having trouble developing or using for iOS, there are a lot of people that can help you out.
There are two major features that are, I think, missing from Android. One is screen blanking. So if you can't see your screen, why waste your battery power displaying it? So you can turn your screen off. On iOS devices, it's very easy to toggle the screen on and off while you have accessibility enabled. On Android, you can turn the screen off in regular settings, but it's, you know, buried in the regular settings.
It's not like a very quick action. And this is important because as a blind person, occasionally you do need to share your phone with someone who's sighted and show them what you're talking about. And so in these instances, you would want to be able to quickly turn your screen on and off.
The actual screen reader also is easy to turn on and off on iOS. And the same reason, if the screen reader is on, then different taps mean different things. And that's very confusing for somebody if they're not vision impaired and they're not used to using the screen reader. Android claims to support this, but it's quite complex.
And I've worked with several blind Android users, and myself, and we've all tried to do it. And occasionally it just accidentally turns your phone off instead, and it's kind of annoying. So I did ask about both of these features during the Google I/O session on accessibility.
I was there last week. They said that they're hoping to support screen blanking soon in the future. So who knows what that means? And they said that we already do support the quick on and off capability, but I would claim it's still in beta. So the big advantage, though, that we have as Android developers is the Android operating system is open.
iOS gives you a lot of features up front, but then that's all you get. Android developers have an opportunity to create what's called an accessibility service, which is very similar to a regular Android service. And you can create custom apps that are focused for people who are vision impaired.
And I think this is a great opportunity. A lot of people aren't ready to switch over from iPhones to Android phones, but they're all starting to explore the environment by purchasing tablets that are Android. So the market is growing. This talk is not about building an accessibility service, though. It's about making your existing app already accessible.
All right, so I'm going to turn TalkBack on on my phone. Turn the volume up here. Actually, let's see. Is there a way to get audio support for my phone?
I could just hold the microphone next to it, if that's all right. Great, thank you so much.
All right, so accessibility settings are in your regular settings towards the bottom. They usually have a hand icon. And you just go to TalkBack, switch it on.
Let's see. It's usually hit or miss whether the volume is for TalkBack or for...
TalkBack, navigate up. There we go. Can you hear this? OK, great. All right, so TalkBack has a couple of different ways that you can interact with it. One is touch to explore. So you're going to drag your finger along the screen and it will read things out to you. So when TalkBack is on, your device provides spoken feedback to help blind and low...
TalkBack, navigate. When TalkBack is on, you're... So when I first started developing for accessibility, I thought that this was the main way that blind users use their phones. And I was 100% wrong. This is actually used more as a last resort. The more common way is actually to do this swipe, where you swipe left and right to access items.
And you can swipe right and left anywhere on the screen and it'll run through the items. Accessibility, services, TalkBack on, BrailleBack off, system. And then if you want to select something, you can just double tap.
BrailleBack on, TalkBack on. So a single tap will just select something to be read out, and a double tap will actually select something to be actionable. If you want her to be silenced because she's reading something that's too long and you're done with it, you can do a single tap.
TalkBack on, switch. When TalkBack is on. So I just muted her right then. And the last one is there are these clues called earcons. And this is something that we are all very familiar with. We know that our Hangouts messages sound different than a Facebook notification, than a Twitter message.
They all have these different little chimes. And that's the similar thing with TalkBack. They all have different types of sounds. So one of my favorites is the list view. So as I scroll up and down the list view, I do it with two fingers instead of one. The xylophone tone will get low and high
depending on where I am at the list. So that's kind of interesting. So there are a lot of special gestures included in accessibility, meaning that if you move your finger along the screen
in these 90 degree motions, you'll get special menus. One of these, the most important one I think, is the global context menu. So I will just demo to you what that looks like. So if I go, the global context menu is down and then to the right.
So you can see now there's like this circle here. And if I tap on the circle, then off to my left there's the quick navigation option, which is the cool one. So I'm gonna slide over there. Quick navigation menu. Okay, now this navigation menu contains every one of the actionable items
that's currently on the screen in this spoked wheel. So I can just go through. Magnification gesture, large text, not checked. Power button ends call, check. Auto rotate screen, not checked. And then I can exit the menu just by coming back to the center of the circle. Auto rotate screen, not checked. So this is really useful because the screen is really big
and they're getting bigger. And so it's difficult to know everything that's on the screen but this menu, everything is just literally at your fingertips. And this is one that's really commonly used that was taught to me by Amelia, one of the blind women that we work with at the Blind Institute of Technology. And so it's very handy to know. All right, so now onto the code samples
what everyone wants to hear. So basically the main idea of programming for accessibility is to set what's called a content description on your view and the content description is part of the main view hierarchy or the main view platform. So anything that inherits from view, you can set a content description on.
So image view, text view, layouts, all of it is included. So I did have a few tips and tricks. Awkwardly enough, turning TalkBack on and off back and forth will sometimes turn off your USB debugging in your developer settings for you. This took me a long time to figure out.
I kept restarting ADB, unplugging, replugging my device, what's going on. But actually, yes, this is what happens. So forewarning. TalkBack's not supported in an emulator, hence my awkward camera setup right here. So you do have to run things on a device. There are no screenshots allowed
with the hardware buttons when you're running TalkBack. So if you do want to do a screenshot of something in TalkBack specifically, you can hook your phone up to your computer using Android Screen Monitor or Droid@Screen, which basically just mirrors your Android screen, and then take a screenshot on your computer. And please work with QA.
So if you have a QA team that's doing automated tests, they should be using the Android ID. It's there for them and that's what it's for. A lot of times QA likes to hijack the content description and use it for their own purposes. But don't let them do that because then your users are actually going to hear that when they're going through your app.
And it's not like a lose-lose situation, an us versus them. I've been able to several times work with my QA team about what kind of information they wanted from this view and phrase it in a way that was actually meaningful to the user. So QA got what they want and the vision impaired users got what they want.
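As a sketch of that win-win, the idea is that test automation targets the stable resource id while TalkBack users get a meaningful spoken label (the names below are invented for illustration):

```xml
<!-- QA automation can locate the view by its android:id,
     while TalkBack speaks the user-facing contentDescription. -->
<ImageButton
    android:id="@+id/btn_refresh_results"
    android:layout_width="48dp"
    android:layout_height="48dp"
    android:src="@drawable/ic_refresh"
    android:contentDescription="Refresh search results" />
```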
So it's not really an us versus them. There are win-win situations. So I'm gonna demo an app that I wrote for accessibility to show some of these items. And the most common one, image view and image button, I feel like this is the most common lint warning that I've seen in my entire life.
And most people just kind of like skim over it and ignore it and be like, oh yeah, image button is always highlighted for some reason but I don't really know why, let's just move on. So I will show you how you can get rid of that once and for all.
So the first example is for an image view that actually has some meaning to the user. So this is a quiz that I wrote. What type of Pokemon is this? And I would like to guess what type of Pokemon this is but I have to know that it's Bulbasaur in order to guess.
So I'm gonna guess it's probably dual type grass and poison but I'm not sure. If you did not know that this Pokemon was Bulbasaur and you could not understand what that image was, you would have no way to participate in this quiz. So I've set a content description on the image so that when you take the quiz as a vision impaired user,
you could successfully attempt to answer. Image view examples. What type of Pokemon is this? Bulbasaur. Reveal answer button. Bulbasaur is dual type grass and poison. Yay, so I got it right and I was able to know that it was Bulbasaur. So this is quite simple. Here's the XML code for the image
and then you just simply set the content description as Bulbasaur. If there was no content description on that image, TalkBack would have just jumped right over it. You wouldn't even know it was there. You wouldn't even know you were missing out on anything so it's important to set that. The next example is a decorative image. So sometimes you have images like in the background or as like a borderline.
For example, I have this image here of balloons, and it's just used as a division between these two sections of the app, and you shouldn't set a content description on that, because it's hard enough to go through an app painstakingly with a screen reader. You don't need to clutter it up with junk. But you don't wanna just not set a content description
because you're still gonna get the lint error because we're not guaranteed that TalkBack is going to read nothing. It might pull something from a system and say like unlabeled image 46 and that's also confusing. So you have to explicitly tell TalkBack don't say anything.
So again, quite simple. Here's your image view code, and you just set the content description to @null. Now Google has since upgraded this idea to include an XML attribute called importantForAccessibility, which is effectively a yes/no flag. And in this situation you would set importantForAccessibility to no.
However, this attribute is only supported for API 16 and up, so I usually still use @null for the content description. The next situation is states. A lot of times we have images that change based on state. So you can think of the little Facebook globe: when you have no notifications it's blue.
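Both variants for a purely decorative image might look like this (the drawable name is made up; note that in XML the importantForAccessibility attribute takes yes/no values):

```xml
<!-- Decorative divider: explicitly tell TalkBack to say nothing -->
<ImageView
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:src="@drawable/balloons_divider"
    android:contentDescription="@null" />

<!-- API 16+ alternative using importantForAccessibility -->
<ImageView
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:src="@drawable/balloons_divider"
    android:importantForAccessibility="no" />
```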
When you have a notification it's white and blue with like a little red number at the top. Or Pandora was a great example. You know when you pause and play that button is in the same location but it changes from a triangle to the two lines. So you need to also be thinking that if you have a button that changes states
you also need to update the content description. So let me do my demo of this. So I have a Pokeball down here at the bottom. It's closed right now. I'm gonna tap on it and it will open and reveal the Pokemon that are inside. And so the image will change and you will also notice the content description
changes with the image to tell you what action to take based on the different states. Show Pokemon button. Now showing Pokemon. So it said show Pokemon button. Now when I tap on this button. Hide Pokemon button. Hide the Pokemon, okay.
Put them away for now. So here's the code for it. This is the Java code. So these two first items, all Pokemon was that layout that contains all of my Pokemon. And the Pokemon toggle button is the one that we're focusing on right now. Here's our on click listener. And so when the Pokemon are currently visible
and you clicked on the button you're going to want to hide them. So you set the image resource to the closed Pokeball and then you just set the content description to show Pokemon. Because now you're talking about the next action that the user would take. So this is how you set a content description in Java. Quite simple again.
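A condensed sketch of that click listener (the view and drawable names follow the talk's example but are assumptions, and this is Android framework code rather than a standalone runnable sample):

```java
pokemonToggleButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        if (allPokemon.getVisibility() == View.VISIBLE) {
            // Hide the Pokemon and describe the NEXT action the user can take
            allPokemon.setVisibility(View.GONE);
            pokemonToggleButton.setImageResource(R.drawable.ic_pokeball_closed);
            pokemonToggleButton.setContentDescription("Show Pokemon");
        } else {
            allPokemon.setVisibility(View.VISIBLE);
            pokemonToggleButton.setImageResource(R.drawable.ic_pokeball_open);
            pokemonToggleButton.setContentDescription("Hide Pokemon");
        }
    }
});
```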
And then of course if you want to hide it, you would set the description to hide Pokemon. So there's another state here. It's the Pokemon visibility, when the Pokemon appear or hide. And here we can say there are two ways to do this. So this one I like better: announce for accessibility.
This means that this content description isn't necessarily attached to a view. So you are kind of attaching it to a view but then it'll just read it out on some event. It'll just say something and then when the user swipes back through and to navigate through they won't hear it again. So it's only sort of like an event specific item on that view.
This is really awesome, but again, it's only supported for Jelly Bean and up, and there's no backwards compatibility for this one. So if you're not on Jelly Bean or higher, you will do this. You just set the content description on the entire view itself, and then you call sendAccessibilityEvent with the view focused event type.
So you are forcing talk back to center on that view specifically and then it'll read it out. Now this content description will stick around so as the user scrolls back and forth then they're going to hear this again but that's kind of the only way around it.
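Putting the two approaches together, a hedged sketch of that version check might look like this (Android framework code; the view name and message are placeholders):

```java
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN) {
    // One-off announcement; not repeated when the user swipes back through
    pokemonContainer.announceForAccessibility("Now showing Pokemon");
} else {
    // Pre-Jelly Bean fallback: attach the text and force TalkBack focus.
    // The description sticks, so it will be re-read on later traversals.
    pokemonContainer.setContentDescription("Now showing Pokemon");
    pokemonContainer.sendAccessibilityEvent(AccessibilityEvent.TYPE_VIEW_FOCUSED);
}
```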
Edit text. So edit text is also quite simple. A hint is all you need here. So we all know how to do a hint. There you go, ta-da. It's accessible.
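That really is all it takes; a minimal sketch:

```xml
<!-- The hint doubles as the label TalkBack reads for an empty field -->
<EditText
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:inputType="textEmailAddress"
    android:hint="Email address" />
```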
The next one is, sometimes you want a text view to be different for TalkBack. So here's an example. When we read this in our minds, we don't say "mon, tue, wed, thu, fri" the way TalkBack would pronounce the abbreviations literally. In my mind I read it and I say Monday, Tuesday, Wednesday, Thursday, Friday. So sometimes you want to give a better experience
to a user, and TalkBack is sort of like the voice inside their head. So you want to say something different than what's displayed visually to your users. Monday, Tuesday, Wednesday, Thursday. So this is quite easy also. Here's my text view for this Monday.
I set my string to M-O-N and my content description to Monday. So it will still display the right text to your sighted users, but then your vision impaired users will hear something different. Otherwise the text view default just reads out
like the text that's set to display to visual users. So toasts, when you show a toast you get that toast for free automatically. You don't have to do anything for accessibility. However, if you want to toggle the visibility of an item
I had to programmatically do that so that the accessibility would read out when that item appeared. And this one surprises me that that's not supported by default. So I just set the visibility to visible and then again I do that send accessibility event.
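The weekday TextView from a moment ago might look like this in layout XML (view and string names are illustrative — the display text would resolve to "MON" and the description to "Monday"):

```xml
<!-- Sighted users see "MON"; TalkBack speaks "Monday" instead. -->
<TextView
    android:id="@+id/day_monday"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="@string/mon_abbrev"
    android:contentDescription="@string/monday_full" />
```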
So in the Android talk this year at Google I/O, they talked about setting an item as a live region. You can set this as a live region, and then TalkBack will automatically update for you if the item changes. So I thought, oh, how cool. It's only supported in KitKat and above,
but this sounds awesome. Unfortunately, when I tried it in this example, it didn't work, because it's only for updating content. My text right now says 'this was a hidden text view'. If I updated it and appended 'yay, Droidcon Berlin' to the end of it, then TalkBack would automatically read it out. However, a live region does not do anything
when you toggle between invisible and visible, or gone and visible, which is quite unfortunate, so you still have to announce it manually. You also see here that I set the accessibility live region to polite. If you're being polite with TalkBack, then she'll wait: if she's saying something else, she'll finish saying that
before she says what you wanted her to say. You can also set it as assertive, where she'll just interrupt anything else that's going on. So you should try and be polite if you want to use the live region. Sometimes you have multiple view items — maybe you have, for some reason, an ImageView in between two TextViews — that visually are seen all together.
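The live region just discussed can be declared in layout XML (KitKat/API 19 and up; the view name is illustrative):

```xml
<!-- "polite" waits for current speech to finish; "assertive" interrupts it. -->
<TextView
    android:id="@+id/status_text"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:accessibilityLiveRegion="polite" />
```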
You want to read that all out as one thing, or, in my example, I have multiple TextViews. So let me head over to the layout examples. All right, so here is the original: the top line is multiple TextViews that I've laid out together.
So this is kind of awkward, and it would be nicer if I could just read it as a single item with TalkBack, like this: 'Read me as a single item.' You can do that. It's a little bit of a workaround, but this is how it's done.
You do need to include the v4 support library to use importantForAccessibility from Java. And basically, what I'm doing is taking each of the text views that make up 'read me as a single item' — these are the four TextViews —
and saying these are no longer important for accessibility, so don't read the text, because otherwise she'll automatically read out all of it. Then I set a content description on the layout that wraps all of the text items, and I just build out a string by getting the text that's included
in each of the TextViews. So this is kind of a workaround, kind of a bummer, but that is how you would do it. And again, this is sort of extra credit: if you want to make an app accessible and you're trying to do the bare minimum, this isn't really something you need to worry about, but it is a nicer experience
for users. A good example of how you cannot always trust TalkBack is the former Google Now weather card — they have since changed it — but it shows why it's nice to set content descriptions on layouts: you're not guaranteed to know
how TalkBack will read them out. In this app, TalkBack used to read out 'Monday, Tuesday, Wednesday, Thursday, Friday, rainy, cloudy, sunny, sunny, sunny, 55, 41, 34, 46, 48.' So if you wanted to know the weather on Thursday, you had to just count and hope that you did your counting correctly.
So I was like, oh, I totally know what's going on here: they probably did these vertically stacked layouts, and that's why TalkBack is reading it out that way. So I turned on 'Show layout bounds'. Nope — all the layouts were actually columns. But TalkBack is optimized to be sort of intelligent and tries to read things like normal text,
so she was reading it in the wrong order. In this situation, what you could do is set all of these as not important for accessibility and then set a content description on the layout, and then it would be read left to right, as 'Monday, cloudy, high of 55, low of 28.'
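The grouping technique just described might be sketched like this (layout and view names are hypothetical; ViewCompat comes from the v4 support library):

```java
// Sketch: hide the children from TalkBack, then describe the parent once.
LinearLayout row = (LinearLayout) findViewById(R.id.weather_row);
StringBuilder description = new StringBuilder();
for (int i = 0; i < row.getChildCount(); i++) {
    View child = row.getChildAt(i);
    // Stop TalkBack from reading each child individually.
    ViewCompat.setImportantForAccessibility(
            child, ViewCompat.IMPORTANT_FOR_ACCESSIBILITY_NO);
    if (child instanceof TextView) {
        description.append(((TextView) child).getText()).append(' ');
    }
}
// TalkBack now reads the whole row as one item, e.g. "Monday cloudy 55".
row.setContentDescription(description.toString().trim());
```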
All right, so there are some very interesting behaviors with CardView. I actually was able to go to the accessibility booth during Google I/O and ask them why this view behaves so interestingly with TalkBack, and they were quite surprised. So I think these are less features and more bugs,
but they're important to know about for accessibility. So, on to the material design demo. Android access. Okay, so this first CardView that I have is just a regular CardView with items in it.
There is no on-click listener or special treatment of this CardView. This is a CardView with no on-click listener. So it reads out each of the individual elements. Now this next Pikachu card is the same exact layout,
same text views, still no content description but I've set an on-click listener on the CardView. Pikachu, electric Pokemon. This CardView layout is identical except it has an on-click listener. So she just reads out everything in the card all at once which is very interesting.
The Google team was surprised and they didn't really know why that was happening so that's just something to be aware of that this will be read out differently if some of your CardViews have on-click listeners and some of them don't. Another one that's interesting is this Charmander card. It has an on-click listener and I've set a content description on the entire card.
Usually when you set a content description on a surrounding layout, it will read that content description first and then traverse through the content descriptions of the elements that are contained within that surrounding layout. Interestingly enough for CardView, that's not the case.
It will only read the content description that you've set on the entire card: 'Charmander is a fire type Pokemon. It is orange in color and has a flaming tail.' So I added some extra information about this image so you would get some information about Charmander, but then it didn't read out any of the other details that you can see here.
So that's something also to be aware of. And just a quick example of setting the content description on the card — yes, just regular Java setContentDescription. So, the floating action button. This is a really big one. The floating action button
is down here at the very bottom, right? So after I've traversed through — Jigglypuff is my last card — she then reads out the name of this button, which I've called 'floating action button'. That's fine. The floating action button is supposed to be one of your most prominent actions in your app. But what happens if you have an endlessly loading list view?
As soon as you get to the bottom of a list in TalkBack and you swipe again, it auto-loads the next section of your list. So your user may never actually find your floating action button. They'll be forced to start trying to traverse with touch-to-explore, which is very difficult and very annoying.
And why would you make the most prominent action of your app the most difficult thing to figure out? So I actually, finally, did get a good answer from the Google team at Google I/O, yay: you can change the traversal order for accessibility. You might be thinking, why can't you just set something like a next-focus order?
Unfortunately, when you do that in XML, it does not translate to accessibility, so that never worked in the past. But there is a new way. It's only available for API 22 and above, but you can still do it. So for the floating action button, here's the important part:
set accessibilityTraversalBefore to 'scroll cards', where 'scroll cards' is my scrolling list layout. That way, after you read through the action bar, you'll hear the action of your floating action button, and then you can go through the normal list order. I've open-sourced this app with all the examples, so if you want to go back and refer to the code samples,
you're more than welcome to. Also, I will post a link to these slides on the GitHub readme page, so that you can go through the slides and the code examples together if you like. But with that, I am done with my talk, and if you have any questions, I'd love to hear them. — Thank you very much, Kelly. — Thank you.