BIRD UX - Beyond Interfaces, Real delight


A petition for a digital inclusivity countdown

22 April 2024 | Research and Evaluation

Reading time: 8 minutes

A few weeks ago, we attended an event organised by the Digital Media Women Rhine-Neckar and the Business Professional Women Mannheim-Ludwigshafen: a "Future Talk" panel on the digital gender gap, which describes, among other things, the differing degrees of digitalisation between men and women (read more at initiatived21.de). On the panel were Maren Heltsche, co-founder of speakers.org and special representative for the policy field of digitalisation in the German Women's Council, and Johanna Illgner, city councillor for the city of Heidelberg and co-founder of Plan W - Agency for Strategic Communication.

During the hour-long discussion, Maren and Johanna explored the question of why the digital gender gap exists and discussed possible solutions. The discussion centred on inequality within the workforce of digital teams, but also on the so-called "digital gender data gap". The digital gender data gap refers to the state of the data that feeds algorithms and is used to train AIs. These data are currently quite one-sided and disadvantage marginalised groups. In the course of the discussion, these topics led us to a question: if we have an online accessibility countdown, why don't we actually have a countdown for digital inclusivity - an online inclusivity countdown?

What is the online accessibility countdown?

The online accessibility countdown refers to a law passed in 2021 that, in short, requires digital services in the EU to be accessible to people with disabilities. Annika Brinkmann subsequently put a page online that shows how many days are left until all websites in the EU must be accessible. The page describes the countdown as follows: "The European Accessibility Act (EAA) comes into force on 28 June 2025. By then, the websites of companies with more than 10 employees and more than EUR 2 million in annual turnover must be accessible, as must any websites published after that date."
However, the online accessibility countdown is not the topic of this article - it is only a template for us, an inspiration for a truly accessible and inclusive digitalised world.

Why do we need a digital inclusivity countdown?

Reason #1: If we want to be progressive, we must not cling to the status quo

There are many aspects of the digital gender gap one could focus on. One example is the homogeneous, largely male-dominated teams within the tech industry. There is inequality within the workforce of tech teams, with the proportion of women in the industry varying somewhat depending on geographical location. According to www.womentech.net, it will take between 53 years (in Latin America and the Caribbean) and 189 years (in East Asia and the Pacific) to close this gap. This has a direct and indirect impact on the products and services that are developed, as it creates a relatively one-sided perspective.

Although the above figures are shocking, our petition for a digital inclusivity countdown will focus on closing the so-called digital gender data gap. The reason is that the rapid development of AI threatens a regression or standstill in the inclusivity of digital services and products instead of an improvement, which requires immediate action.

Currently, there is a very high probability that the data relied on by algorithms and used to train AIs is biased, sexist and therefore harmful or discriminatory towards marginalised groups. This state of affairs is historically conditioned. During the panel discussion, Maren Heltsche rightly said that in an unfair world with unfair data, fair systems seem utopian.

An additional problem: training data is not regulated. One of the reasons for this is economic: the business models and competitive advantage of most AI companies would be destroyed if others could gain access to their data. But if it now seems that we have agreed that regulation at the international level is impossible solely in the interests of AI companies' success, that is only half the truth, because the question remains: who, i.e. which authority, should be responsible for regulating data internationally?

If you believe that it will take some time before the consequences of this development take on ugly proportions, you should take a look at a recent UNESCO report that already finds evidence of regressive gender stereotypes in generative AI (www.unesco.org/en/articles/generative-ai-unesco-study-reveals-alarming-evidence-regressive-gender-stereotypes) - AI that children and young people, for example, are already using to do their homework.

Reason #2: Discrimination means missing out on (economic) opportunities.

27 % of people living in the EU live with one or more disabilities. While not all people with disabilities are impaired when interacting with digital products, the number of people affected by non-inclusive systems is large enough that the EU took action and enacted the European Accessibility Act (EAA).

It is therefore all the more surprising that there is still no online inclusivity countdown, given that women make up the majority of the EU's population, at an average of almost 51 %. Against this background, it seems downright absurd that we are not even trying to regulate the data models used in algorithms - data models that, in fact, discriminate against the majority of people.

No matter what role women - i.e. half of the world's population - play in your daily work, whether as customers, employees, patients or voters: if you rely on discriminatory data when interacting with these people, you will miss out on opportunities and possibilities, e.g. to attract customers, employees or voters.

Reason #3: Discrimination is expensive

If the argument that more than 50 % of the population is being discriminated against seems too weak, the financial impact of discrimination may be a more convincing argument.

Costly development fiascos such as the Amazon recruiting tool or the Austrian Labour Market Service's Labour Market Opportunity Assistance System (AMAS) (Stefanie's talk at WUD Berlin addresses this tool: youtu.be/PndW3UR_p1s?si=uRhYHQJpB-DSrp2k&t=483) could have been avoided if the underlying data had been analysed and adapted before the tools were developed.

If you think that this type of discrimination won't have a negative impact on your budget because you are not developing tools yourself, you are mistaken. If, for example, you use the internet to advertise to your target group, discriminatory algorithms could prevent you from reaching the people you want to reach. Find out more: algorithmwatch.org/en/automated-discrimination-facebook-google.

We are sure that there are many more reasons for an online inclusivity countdown, but listing them will only be relevant if we find an answer to the following question: How do we design non-discriminatory systems and services when our tools are flawed?

The current international consensus seems to be that data cleansing and enrichment, as well as the regulation of training data, are not feasible. We will therefore have to develop our own solutions.

Designing just systems in an unjust world

#1 Get to know your data

Whether you are basing a design on analytics data or training an AI with data sets, take a close look at your data set and put it to the test. Find out how, when and by whom the data was collected. Ask what kind of data was collected and, perhaps more importantly, what was not collected. If you don't do this, you are more or less flying blind. This would be comparable to analysing the performance of a website that has not been cleaned of spam traffic.
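To make this first step concrete, here is a minimal sketch in Python/pandas of what such a data audit could look like. The dataset, the column names and the 30 % representation threshold are purely illustrative assumptions of ours, not a reference implementation - the point is simply to check who is in your data, who is missing, and how outcomes differ per group before you build on it.

```python
import pandas as pd

# Hypothetical training data; in practice you would load your own dataset.
df = pd.DataFrame({
    "gender": ["female", "male", "male", "male", "female", "male", "male", "male"],
    "hired":  [0, 1, 1, 0, 1, 1, 0, 1],
})

# 1. How is each group represented in the data?
representation = df["gender"].value_counts(normalize=True)
print(representation)

# 2. How does the outcome variable differ per group?
outcome_by_group = df.groupby("gender")["hired"].mean()
print(outcome_by_group)

# 3. Flag groups that fall below an (arbitrary, assumed) representation threshold.
THRESHOLD = 0.3
underrepresented = representation[representation < THRESHOLD].index.tolist()
print("Underrepresented groups:", underrepresented)
```

An audit like this won't tell you how, when or by whom the data was collected - that still requires asking the people who collected it - but it makes imbalances visible before they are baked into a design or a model.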

#2 Get to know your users

User research can do more than just inform user-centred design. It also has the potential to complement and challenge existing data. This is important because data is rarely neutral and unbiased. Real insights can only be secured by asking the people you are designing for. To prevent these insights from becoming another source of biased data, they should be thoroughly documented and explain, for example, how design decisions are influenced by them.

#3 Diversify your teams

Every person is ultimately the result of "Nature and Nurture". Different people, shaped by different environments, ask different questions and develop different solutions. This is an advantage that should definitely be utilised.

While we can do nothing but hope that one day the online inclusivity countdown will start ticking, we won't sit idly by and wait for that day! If you need help creating inclusive design solutions, please get in touch! The first Geekettez claim was "We Design For Humans". We have remained true to this claim since 2012. #PowerToThePeople

