Reading Inclusively

Kevin Mar-Molinero
6 min read · Sep 29, 2020

A weekly collection of things I’ve read on inclusive design, accessibility and ethics; things that were interesting to me and hopefully to you too.

Another Tuesday, another round-up of things I’ve found online that have tickled my fancy.

If I’m totally honest, I’m surprised I’ve managed to get this far in. Mind you, in the times of Covid and lockdown there’s something cathartic in writing, whether you know it’s being read or not (I’ll assume not for the most part, and I’m fine with that; if it is, it’s a bonus).

Anyway, in this week’s collection there are bits on increasing disabled access to art, a great “tear down” of the new “plug and play” accessibility tools, racial bias in health algorithms and much more.

Inclusive Design

Sign language avatar creator wins $30,000 grant from Possibility Fund — This is a lovely bit of design/innovation. Kara Technologies have invented an AI/ML-driven avatar that can translate digital content into sign language to assist people who are deaf. Working with the deaf community, and led by a founder who’s partially deaf himself, Kara Technologies decided to do something meaningful with tech that is often no more than a gimmick.

Digital poverty and digital capital — It’s entirely possible these days to start pretty much everything with “in the times of Covid”, so I’m going to avoid that trope, even if the article itself does not, and instead point to this piece on the reality of digital exclusion and what it means for the future of work and access to jobs. The article is right, however, in saying that if we do not address these disparities now, we are potentially creating a new type of digital poverty and a new structure of inequality and entrenched class divides.

P&G Shares diversity data after it’s own marketing raises expectations — Marketing is in a phase of what many call “virtual signalling”, we see campaigns and comms that speak about inclusion and diversity but at the same time read reports of how the self same companies doing it don’t live by their public personas. The news that P&G have been driven by their own marketing to not just publicly publish their own diversity data, but also hold their hands up that they’re not as good as they should be should be seen as a positive, and something to be congratulated on.

Please Touch, making art accessible to the blind — This is wonderful. As someone for whom visual impairment has always been a factor (my uncle is fully blind, my mum registered blind but with some sight), I have something of a “bee in my bonnet” about creating a more egalitarian world for people with visual impairments. What I find most wonderful about this is that people often focus on things that have “business value” or “economic benefit”, forgetting that joy is just as important a part of all of our lives, and not every piece of access has to have a financial value attached to it.

Accessibility

CEOs must be mindful of web accessibility — Whilst there may well be little in here that’s new to many (and the reference to AccessiBe is a worry; more on tools like that later), it’s encouraging to see that we are starting to get articles about accessibility written for the people who hold the purse strings.

Styling for Windows high contrast with new standards for forced colors — Internet Explorer, the IE of old, is soon to be dead. Microsoft are leading the way in many areas, and in particular they’re one of the few big tech giants taking accessibility and inclusivity seriously; their new Chromium-based browser has been built with that in mind. This article documents the features Edge has for people who use their machines in high contrast mode, and alongside the code examples it gives design examples to help people contextualise what it means as a visual output.
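
For a flavour of what the forced colours standard looks like in practice, here’s a minimal sketch of my own (not taken from the article) that reads the `forced-colors` media query from script, so that things CSS can’t restyle on its own (canvas charts and the like) can respect the user’s high contrast settings. The `data-forced-colors` hook is purely illustrative; the CSS-only equivalent the article covers is `@media (forced-colors: active)`.

```typescript
// Minimal sketch: detect Windows high contrast / forced colours from script.
const forcedColors: MediaQueryList = window.matchMedia('(forced-colors: active)');

function reflectForcedColors(active: boolean): void {
  // Illustrative hook: flag the preference on <html> for your own styles or scripts to read.
  document.documentElement.dataset.forcedColors = active ? 'active' : 'none';
}

reflectForcedColors(forcedColors.matches);

// Respond if the user switches high contrast mode on or off while the page is open.
forcedColors.addEventListener('change', (event) => reflectForcedColors(event.matches));
```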

Bolt-on accessibility, 5 gears in reverse — AccessiBe and various other “tools” promise that a single line of JavaScript will fix all of your accessibility and ADA (they’re primarily American at the moment) “woes”. These tools seem to be gathering pace as a quick-fix remedy, but the reality is that they’re more of a P.T. Barnum “medicine” for people watching the bottom line than anything of value to the actual people they purport to help (and I’d personally question whether helping people with access needs is even their intention). Stephen Faulkner, a man who knows accessibility better than most, does a wonderful “tear down” here of why a cheap fix is a bad fix.

ARIA Grid as an anti-pattern — Adrian Roselli is one of those people on social media who, if you don’t already follow, you should. His blog highlights all sorts of things we should know, from his brilliant insight into the new changes in WCAG requirements to his devastating exposés of accessible toolbars; his content is a wealth of information, tips and tricks that I personally find indispensable.

How ARIA fits in the layers of accessible technologies — ARIA is one of the most misused technologies in accessible development: it’s often wielded as a blunt instrument to “fix” poor practice, rather than a surgeon’s knife to be used delicately to eradicate problems. I’m personally of the opinion that this is because many developers simply don’t understand how it interacts with assistive technology, or why it should be used, and video guides like this one are part of the journey to learning when it should be introduced (and if you’re in any doubt, always refer to the 1st rule of ARIA).
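
To make the “surgeon’s knife, not a blunt instrument” point concrete, here’s a small sketch of my own (not from the video): the 1st rule of ARIA is to prefer native HTML, reaching for ARIA only for the states and properties HTML doesn’t already express.

```typescript
// Anti-pattern: a <div> pretending to be a button, with ARIA papering over the gaps.
// You still have to add Enter/Space key handling and focus styling yourself.
const fakeButton = document.createElement('div');
fakeButton.setAttribute('role', 'button');
fakeButton.setAttribute('tabindex', '0');
fakeButton.textContent = 'Notifications';

// 1st rule of ARIA: don't use ARIA where native HTML already does the job.
// A real <button> exposes the right semantics, focus and keyboard behaviour for free;
// ARIA is then used sparingly, only for the toggle state HTML doesn't express.
const toggleButton = document.createElement('button');
toggleButton.textContent = 'Notifications';
toggleButton.setAttribute('aria-pressed', 'false');
toggleButton.addEventListener('click', () => {
  const pressed = toggleButton.getAttribute('aria-pressed') === 'true';
  toggleButton.setAttribute('aria-pressed', String(!pressed));
});
document.body.append(toggleButton);
```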

Lighthouse HR & payroll company sued due to lack of access — Recently I was having a discussion with a colleague about how we started to make our company a non-ableist workplace and what that would look like in reality, and one of the points we touched upon was the lack of accessibility to most internal HR tools, well it seems we are not in recognising this problem, as one of the leading HR & payroll tools in the United States, Lighthouse, has had a lawsuit raised against it for it’s lack of accessibility by blind and partially sighted managers.

Ethics

Lawmakers demand scrutiny of racial bias in health algorithms — So much is made of AI/ML data and ethics right now, and rightly so. As we’ve seen from recent feats of journalism, data has been used to target people from certain racial groups and deter them from voting, or, in the case of Twitter’s recent scandal, models have been trained on data sets that rate images with more white faces in them as of more interest than those with more Black faces. This story of how a group of NFL players filed a lawsuit against the league because the algorithm it was using held a racist bias is yet another prime example. I may sound like a broken record here, but as we invite more and more AI into our lives we MUST also co-create the datasets and algorithms with diverse groups, or this will keep happening.

There is a racial divide in speech recognition systems — And speaking of things that keep on happening… Amazon, Apple, Google, IBM and Microsoft all find that their voice assistants misidentify words from people of colour on average 35% more often than they do for people who are white. It’s another classic example of the lack of inclusion and diversity in the space of AI/ML, and not through a lack of talent in that sector, I should add, but through the hiring policies of the companies that are pushing the technology.


Kevin Mar-Molinero

Director of Experience Technology at Kin and Carta Connect and Member of BIMA’s Inclusive Design Council