Encore x Accessibility: A Balancing Act
On the Encore team (Spotify’s design system), there’s nothing we love more than collaborating with our fellow squads — (1) because they have awesome squad names and (2) because they have awesome people. Enter our most recent collab with the Mandalorian squad, which is in charge of accessibility at Spotify — and aptly named, because accessibility is “the Way.”
The goal of this project was to enable any team to get as much out-of-the-box accessibility in their components as possible, and when not possible, at least offer the best guidance for achieving fully accessible experiences. Practically, this means enforcing accessibility where possible and providing documentation when not. You’ll read more later about the tension between these two poles and why we can’t always enforce accessibility at the component level.
Recently, I hosted a conversation between our in-house web accessibility expert, Tamas (who is blind); Encore engineer Rose; and myself (Arielle), a product manager on Encore, about our recent joint project to uplevel the accessibility of our design system.
We discussed what made this collaboration unique and what other teams can take away from this experience, especially those working on design systems. Below is a transcript of our discussion.
Arielle: What were some of the methods that enabled you both to work together, and what were some of the surprises that you encountered working together for the first time?
Rose: As far as tools went, that took a little bit of iteration and Tamas explaining to me what was accessible and what wasn’t. Because, to be honest, I just kind of assumed that everything was in a better place than it was, as far as software that’s live and that we’re paying for. Those tools are being used by so many people. So, sure, they must be accessible! I learned that that’s definitely not always the case. I mean, there are aspects of them that are accessible.…
Tamas: Yes, absolutely. So when we first started out, we literally used a question-and-answer document, and that was a very good way for me to start scoping out what the exact situation was. It wasn’t until I started seeing the progress-tracker spreadsheet, which, I felt, detailed [the status] really well, that I was like, “OK, here’s a really good breakdown of which tickets are currently open.” That was very powerful to start with, but I knew that it would be hard for me alone, as an engineer, to submit PRs [pull requests] directly without having that whole context of the culture and how Encore PRs work. That probably requires more of a deeper embed. So that was a tricky area, determining how best to work with the team.
My very first pairing session was with Rebecca [another Encore engineer]. We sat down for a huddle session for about an hour and just walked through some scenarios and different code, how it could be written, and what the different outcomes could be. That was really when things ramped up and I was getting approached by the other developers. So it ballooned out to where I could actually look through pull requests and work with the developers directly in chat conversations to solve it. It’s really difficult to work with a lot of these tools that may have been created by smaller startups that have only been around for maybe four or five years.
Rose: Can you talk a little bit more about the pairing you did? Was it mainly conversation based?
Tamas: What we would do is paste in code snippets through chat. Then I would paste that chat into my VS Code instance, and then I could share my screen and say, “Does this look similar to what you’re writing?” And it does sort of create that pairing effect. It’s also really hard with job interviews because you have to do full paired programming sessions, typically, and I can’t track [the interviewer’s] cursor. If they’re modifying something slightly in my code, I get disoriented.
There is some research on newer ways to do this, like using sound cues and stereo positioning to indicate where someone is in the document. And it’s really helpful in Google Docs, because I’m able to use this add-on. Through loudness and quiet volumes, I can tell how far they are from my cursor point at all times. You can have someone be a guitar, a trumpet, a drum; you can have different people be different sound effects for the collaboration. And that really gives you an option to, in real time, get that feedback that screen readers just miss, because you can’t just have a live alert region firing all the time. So I think solving some of these challenges is not so straightforward.
Arielle: I want to pick up on what you just said — that it’s not black and white, which to me was a takeaway from this project. I really thought that with accessibility you were compliant or not compliant. There is actually a lot of gray area, and there’s a limit to how much we can bake into the components themselves. Sometimes, we have to just trust Product teams to actually follow our guidance, which we put into documentation.
Rose: Yeah. That’s definitely more of a challenge for design system accessibility than for product accessibility: evaluating our code means figuring out how to look at things in context. I think it really is context based. A lot of the time, pure compliance doesn’t account for things like a tabindex that is a valid number but doesn’t make sense to use there. That’s a struggle with design systems, where everything is so modular and piecemeal, and everything in our library is stateless. I think the most challenging problems for us on Encore Web to learn about and fix have all been the interactivity bits.
The other part worth touching on is that there’s compliance, which is there for a reason and is important — but to really do due diligence on accessibility, we’ve started thinking about it more like user experience. We’re thinking about users, who are not always the typical users that people are thinking about, and we’re just trying to make the best experience. So it’s less about passing an automated scan. You’re not going to put your design through an automated scan and be like, “Yes, this is an optimal navigation layout for users.” Tamas has helped us understand patterns that are available, like, “Oh, hey, maybe tabbing through all the interactive items isn’t the only option. What about left and right keyboard arrows? That might be a better interaction here,” or something that is more standard for this type of element.
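The arrow-key pattern Rose describes is commonly implemented as a "roving tabindex": only one item in a group is reachable with Tab, and arrow keys move focus within the group. As a minimal sketch (the function name and shape are ours for illustration, not Encore's actual API), the heart of it is just deciding which item should become tabbable next:

```typescript
// Sketch of the index math behind a roving tabindex. In the real pattern,
// exactly one item has tabindex="0" and the rest have tabindex="-1";
// arrow keys move that "rove" so Tab doesn't stop on every item.
type ArrowKey = "ArrowLeft" | "ArrowRight" | "Home" | "End";

function nextRovingIndex(current: number, key: ArrowKey, count: number): number {
  switch (key) {
    case "ArrowRight":
      return (current + 1) % count;          // wrap from the last item to the first
    case "ArrowLeft":
      return (current - 1 + count) % count;  // wrap from the first item to the last
    case "Home":
      return 0;
    case "End":
      return count - 1;
  }
}
```

A component's keydown handler would set `tabindex="0"` on the item at the returned index, `tabindex="-1"` on the others, and call `.focus()` on the new item.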
Tamas: That’s a really good point — the modular nature — you’re spot on with that. That’s the biggest challenge: when a component can be rendered in multiple forms based on the props that are passed, it can be hard to predict how it’s going to get used. I think we’re still going to run into this with the chip component, which tends to be used everywhere for a bunch of random little things.
But with Encore, it’s definitely the challenge of unpredictability, of knowing what they’re going to do with it, how it’s going to get applied. What safeguards can we put in place to make sure that there’s at least some kind of warning that if a customer does these kinds of no-nos, it could have a bad effect on accessibility?
Rose: Right. The more I learn about this, the more I’m realizing what we can do in Encore — especially with our libraries currently being stateless and trying to be pretty flexible with our components — is rely on examples and guidance. If we were to build accessibility into every component out of the box, we would need to build every version, and that’s just not scalable. So that is something I feel like I’ve had to come to terms with a little bit. We can’t have it both ways. I want everybody to learn about accessibility and take ownership over this stuff themselves and not rely on Encore to do it all for them or Mandalorian to tell them when they’re doing something wrong. But on the flip side, I realized, if we just expect everybody to do that, it’s not going to happen everywhere. It’s not going to be the quality level that we want. So we also need to bake in as much as we can, and it’s just a balancing game.
Tamas: That’s such a good point, getting that balance right. Now, we do have to get a lot of this guidance documented and get the culture started. It’s going to take a while for us to really make this a conversation that doesn’t just happen when you’re asked about it, but to actually have accessibility be part of day-to-day conversations. That’s why I was so excited when I saw some of these conversations happening, because it’s clear to me that people are asking the right questions. As we launch [accessibility] courses in partnership with Fable Tech Labs (an accessibility testing platform powered by people with disabilities), for all these different types of roles, I think it’s going to really accelerate that culture. And hopefully, over time, there will be less of a need to document, and folks will understand that a component could be used in the wrong way, and that they are the ones responsible for their overall site health and maintaining that accessibility.
Arielle: For the wider audience of designers and engineers who work at other companies, who don’t have the expertise or access to the expertise you both have, what’s the best advice you would give them to build accessible experiences?
Rose: My skills are 1/100th or less of what Tamas is capable of doing with a screen reader, a keyboard, and a braille display. I’m still very much at the level of turning a screen reader on and off, tabbing through, and doing maybe a couple of other little things. Getting past the point of it feeling scary matters, because it is really kind of overwhelming if you’re not used to it: something starts talking really quickly at you. But just learning the basics of all the different ways that users interact with your product makes the problems really glaringly obvious.
It’s hard to look at code and know what the experience is, but if you just use [a screen reader], then it’s easier. Unless you have access to do real user testing or have a subject matter expert, just be your own user and realize you’re not going to be able to make everything perfect, just as all designs can always be tested more and iterated on. It’s important to start making incremental improvements and not expect that to ever end. You’re not going to make it accessible, like, check, you’re done! Just expect that this is now a part of your workflow.
Arielle: I love that idea — “you’re never done.” And we talked about this in our project, that the guidepost will always be changing for the better, but you’re always going to try to keep up, and you’re always going to find ways to improve the experience, and you have to be tuned in to the advice that is coming out. To your point, Tamas, make use of the tooling that’s becoming more available to people now.
Tamas: Yeah. A lot of the information I pulled up was literally knowledge that was changing on the spot. So on top of things that you could be doing to maintain the product, there might also be newer and simpler ways to do something in terms of accessibility. On screen readers, my main mantra is you don’t have to be an expert to be able to test and to understand the basic flow and the basic experience of what a screen reader outputs and how someone may interact with it. And I think the biggest thing is not making generalizations like, “All keyboard users are screen reader users exclusively.” Well, no. There are a lot of people who may have fatigue, arthritis, or invisible disabilities that force them to use the keyboard because mice require very precise movement that becomes painful for them. So it’s about removing those personal biases and understanding that there are different ways people interact. There are a lot of good getting-started guides that you can look at side by side. Maybe even print off a sheet with all the commands.
Something really interesting and worth reading came out from Fable: its year in review. According to Fable, the most difficulty is experienced by users of screen readers, and that’s why, even if you just look at that one aspect, it’s such a big chunk. WebAIM has been around for so long, and they’ve been doing this thing called the WebAIM Million, which charts the top million websites and what those inaccessible experiences are like. Missing alt text tends to still be a top [issue]; contrast tends to be a top [issue]. So there are clear patterns between some of the Fable data and the WebAIM Million data.
You can go back to the 2021 and 2020 trends and see how web accessibility has evolved or devolved. In some cases, things haven’t really improved much at all. Incorrect ARIA is actually getting worse, because more sites are now applying bad roles and bad semantics with their divs, and they don’t know which ARIA rules to apply.
Rose: So they’re making an effort, but not always in the most effective way.…
Tamas: The biggest advice is to look at things holistically and try to consider as many groups as you can. If [you have] a low budget, start with at least one expert on the team who can spread some knowledge and documentation for others to learn from, and pass on links and information so you can have multiple experts. Because I think any developer can at least do basic screen reader tab-and-arrowing; that doesn’t require 50 commands (more like 10 or 15), and it’s really a five-minute kind of thing. And as you said, it’s really impactful when you can directly do that, since the code sometimes doesn’t give you that full picture.
Rose: I remember realizing that unless I learned more, trying harder to produce more-accessible code by adding more attributes was not making it more accessible; it was making it worse. And that was an interesting learning. There is a dangerous place of caring about being accessible but not testing accessibility or really knowing what you’re doing yet. This is where your efforts could actually make things worse. I definitely fell into that category for a while, and I’m hopefully clawing myself out of it.
You touched on CSS grid, and that reminded me that we can’t expect all new technology that comes out, even something that’s a browser standard, to be accessible. Flexbox made it way easier to reorder content in an inaccessible way. And you think, well, Flexbox is huge, everybody says to use it, so it’ll just be fine. It’s a reminder for developers that just because something is a new standard that everybody can use, it doesn’t mean that it can’t promote bad accessibility practices too.
Tamas: Yeah, that’s why I pushed back against using [certain functions or properties popular with web developers].
Arielle: This conversation is gold. I love it. One thing I just wanted to mention: even as a product manager on this project, I picked up a couple of things from working with a blind engineer. What you said about the spreadsheet was interesting because I realized it’s important to say what you’re talking about, not just point to your screen, rely on cursors, and assume people are going to know what you mean. I remember using a lot more explicit language when we were working with you, and that can benefit everyone.
And the other thing I had noticed was when you have watercooler chat at the beginning of a meeting — that is a moment to be more inclusive. One day I was wearing a green shirt and someone complimented it. And you said, “Oh, that sounds nice,” and I described the shirt I was wearing. And it was really nice to talk to you about that and to include you in that conversation. Your empathy when it comes to working with us has been so special, and you’re a really unique person in that way.
Rose: It’s been really great. You could have been much more impatient with us, and it would have been understandable. You’ve been very inclusive of us taking time to understand things. So, thank you.
Tamas: I really appreciate hearing that because sometimes it’s a little nerve-racking because other people can see me, but I can’t perceive them with the same senses. I know that there’s, like, a lot of sunlight coming in from my left, but I have the blinds fairly closed to block glare. But being aware of my own visual environment and making sure that I’m, like, not coming to meetings in raggedy PJs is important too. I have an app that tells me if my face is centered in the camera view, which is really useful. I have some idea of visuals of myself, but it’s definitely a world where of course I’m going to want to be a part of it, but I can’t necessarily see it in the same way as others are seeing me in it. This has been amazing, and I love this open conversation.
Arielle: Well, this is really awesome. Thanks, Tamas and Rose. It’s great to talk to you both and reflect on the project. I’m excited for what’s next!
Now that our design system is in a much better state of accessibility, we’re thinking about how to improve our product development cycle to include accessibility considerations from the start. With our two latest components, Tamas helped us think through all the complex interactions we needed to anticipate, and we are still learning more every day.
This conversation has been edited for brevity and clarity.