
There is a huge market for tech that doesn't suck

Roel Verrycken, journalist at De Tijd (Mediafin)

On 19 March 2026, Meredith Whittaker (president of Signal) will take to the stage at the UBA Trends Day. The topic of her keynote? 'AI, power & ethics: tech that serves people, not the other way around.' Get a sneak preview here.

The messaging app Signal aims to be a privacy-friendly alternative to big tech platforms. Meredith Whittaker, the chair of the foundation behind it, takes that fight personally. 'People with power who have access to intimate thoughts: that's a lot of control.'

'Did you also receive that alert on your phone? It gave me quite a fright.' Meredith Whittaker descends the stairs into the lobby – more of a small sitting area – of the extremely charming boutique hotel Rosalia's Menagerie in Amsterdam. It is just after noon on the first Monday of the month, when the Netherlands tests the system designed to warn of disasters. Her spokesperson reassures her that she doesn't need to do anything. 'Okay. Reassuring, I guess?'

Even though we're indoors, the American head of Signal – the most secure and privacy-friendly messaging app – shivers in the biting wind and keeps her thick coat on throughout the interview. But nothing stops her from spending an hour shining her critical light on big tech and the business model Whittaker calls 'surveillance capitalism': making insane profits at the expense of users' most intimate personal data.

Whittaker, so keen on privacy that even her age is a secret, is only too happy to dig deep into the luxurious fur of Silicon Valley. Signal is run by a foundation, of which she is the chairwoman, and does things differently from the big platforms. The app was created by cryptographer Moxie Marlinspike and co-funded by Brian Acton, a co-founder of WhatsApp. Whittaker, formerly of Google, came on board in 2022 and is the face and evangelist of Signal.

Its trademark is the guarantee of absolute privacy, which makes Signal attractive to a diverse group: activists, journalists and the oppressed, but also the most powerful people on earth. The latter became apparent in March, when an American journalist was accidentally added to a chat group in which leading figures in the Trump administration exchanged war plans on Signal.

From Google rebel to Signal chairwoman
Meredith Whittaker grew up in California and studied literature and rhetoric at Berkeley. After graduating, she joined Google, where she founded the Google Open Research group, which aimed to promote collaboration between the academic and open-source worlds. In 2018, she was one of the organisers of a major protest at Google against the company's collaboration with the Pentagon.

After leaving Google, she co-founded the AI Now Institute at New York University, which studies the social impact of artificial intelligence; she remains its chief advisor. Since 2022, she has chaired the Signal Foundation, the foundation behind the popular messaging app Signal, which has around 100 million users and is known as the most secure chat app in the world. Whittaker shares nothing about her age or private life.

There are no figures available for Belgium, but Signal has around 100 million users worldwide. That's peanuts compared with Meta's apps such as WhatsApp. But only Signal is the right choice for truly secure chat traffic, says Whittaker. 'Imagine you have 100 per cent pure gold, and you have gold-coloured metal that looks the same but is actually 70 per cent lead. Which would you choose? Yes, WhatsApp has a bit of encryption – it uses our standard protocol, by the way – to protect a small portion of data. But Signal does that 100 per cent, or as close to it as possible. We encrypt metadata – information such as when and with whom you exchange messages – from which so much can be deduced. We don't want that access. If you do have it, you could one day be pressured into giving it up.'

Meta does have one huge advantage: the network effect. Everyone uses WhatsApp because everyone uses WhatsApp. How do you compete when people value ease of use over privacy?
Meredith Whittaker: 'That does play a huge role. Throughout the history of communication, the network effect has always led to monopolies. But people don't choose an app because they want to use it, they choose it because they want to talk to each other. That's why we've always opted for a design that is as simple and human as possible. No matter how pure your privacy principles are, you want people – from your little brother to your landlord – to use your app.

'We are ready to support everyone in their fundamental right to privacy. For example, we see people switching to us in times of political instability, or when big tech messes up again. Or when an authority such as the European Union recommends us.'

Does Signal prove that a different business model is possible? Large companies collect your data, sell it and earn loads of money. Signal is a non-profit organisation that relies on donations. Is that sustainable?
Whittaker: 'We prove that there is a massive desire for – and a huge market for – tech that doesn't suck. People want tech that is effective, cool and innovative. The monoculture of the surveillance tech industry, where more is always better, leaves a lot to be desired. It's not very innovative. We need much safer technology, much more control over critical infrastructure, and so on. We need services that work better for the people who use them.'

Yet there is a lot of jealousy in Europe because our continent cannot produce a single large communication platform.
Whittaker: 'It would be very good if Europe could cast that fear aside. It is better to focus on the things that the hyperscalers are not doing well, but which need to be done. Things that are critical to our social, economic and political infrastructure and that require real innovation. Instead of hoping that the monopolies that have been ingrained for decades can be broken down.'

Can you live a modern life without the apps of the big players?
Whittaker: 'Oh God, no. I don't know anyone who can. It's not a matter of choice: they form the infrastructure. I'm not trying to live the purest life, I'm just trying to be honest about what I see and about where I can push things in a better direction. I'm very aware of the monopolies and adjust the settings as much as possible to my advantage. But if you try to stay away from big tech, you can't function, live and have relationships in today's world. Nor can we frame it as an individual choice. It's like the climate debate. The idea of the "personal ecological footprint" is an invention of the fossil fuel industry, a way of placing the responsibility on each individual consumer. That distracts from the simple fact that there is hardly any choice. Everyone feels very guilty, but the problem lies elsewhere.'

Signal is one of the major opponents of the European Union's “chat control” plan: the controversial bill that requires platforms to grant access to chats in the fight against child abuse. Are you pleased with the recent compromise that removed that requirement?
Whittaker: 'That bloody endless process, it never stops. It's very good that the obligation has been removed; it's now optional. That's one less huge problem. But there is still room for it to be reintroduced. Signal is classified as "high risk". We disagree with that, because what we do is very simple and very necessary. It's magical thinking to believe that there is such a thing as a backdoor that only the government can use. There is enormous technological consensus on this: a backdoor makes the entire network and all the information on it vulnerable. Fortunately, people now realise this well enough. That's a good sign. I am really surprised that this idea keeps coming back, with such arrogant assumptions. If they had continued with this, it would have been a disaster. When people in power have access to the most intimate and personal thoughts, that is a huge degree of control.'

To play devil's advocate for a moment...
Whittaker: 'Please do. The devil could use a lawyer.'

The law must combat child abuse. What if we could curb it further, in exchange for a little less privacy?
Whittaker: 'My God, what a false dichotomy. The next time you hear someone from the government defending something like that, ask them how much funding goes to protecting women or children in vulnerable positions. How much funding does the police have to investigate abuse? I'd like to see those figures, and I know they'll be very meagre. If it's really about the children, follow the money. Follow the policy. Do we really care about children, or are we using them as a political pretext so that the intelligence services can get something off their wish list? I think the devil loses that argument.'

We need to talk about AI. From WhatsApp to Messenger to Snapchat, all the major messaging apps now have a built-in bot for chatting. Why not Signal?
Whittaker: 'Because everyone hates them. They're also completely useless. We're lucky: Signal hasn't pumped insane amounts of capital into a gamble on AI that would force us to deploy it where people don't want it. Much of what you see consists of desperate attempts to find a market for AI. It doesn't belong in our private communications. Normal people – 99 per cent of the world, that is – don't really need an AI summary of their mother's messages, do they? It's not a response to consumer demand, just a need on the part of the companies themselves, who have to justify their investments.'

It has been another year of incredible excitement surrounding AI. How do you view this?
Whittaker: 'It's a monoculture: one very limited approach out of the many that exist. That approach is: the more computing power and data there is, the better the neural networks (AI models inspired by the human brain, ed.) will become. That's the path that was taken a decade ago, when it became apparent that the technology could be useful for some companies. Fundamentally, AI in this era was born out of platform monopolies. There is no innovation that we can view separately from that.'

How could it be any other way?
Whittaker: 'There are many other possible ways to use AI that do not involve massive supermodels built on huge investments, and that could be very interesting in many areas. But for that to happen, the term AI actually needs to be reclaimed. It is not a technical term; it is a desire, an aspiration. Since the term AI was first used in the 1950s, the technology has been applied in many different ways, such as through neural networks and deep learning (neural networks that can independently learn complex patterns, ed.). But it has now been reduced to large language models (the kind of model on which ChatGPT is built, ed.) based on neural networks. There are many interesting things we can do, but to do so, we need to move away from the bigger-is-better paradigm. That only benefits the established monopolies. Only they have all the ingredients to create and distribute this type of AI.'

Do you believe in something like superintelligence?
Whittaker: 'That's desire upon desire. What is it even, man? So no, I don't believe in it. Is that "divine AI" a mythical realisation of a superior consciousness that will surpass all human capabilities and to which we must simply bow down? No, but it is a very dangerous narrative. We can have some philosophical debates; that's fine and perhaps interesting. But the risk I see is that we are attributing almost religious and supernatural powers to a technology that is controlled by a handful of companies whose goals may not be aligned with what is good for everyone. And implicitly, we are being told not only to submit to this technology, but also to assume that it is correct and superior to our own judgement. That is a catastrophic mindset.'

You are particularly critical of 'agentic AI': programmes that act as a kind of butler, autonomously making decisions for us, such as booking a restaurant. Don't these pose a threat to privacy? It recently emerged that Apple feeds WhatsApp voice messages to Siri.
Whittaker: 'In order to do what is promised, enormous amounts of data are required. If an agent is going to act on your behalf, it needs access to information about you: your credit card, your calendar, your browser history, you name it. So that it can act as a kind of fake version of you, contacting restaurants and making reservations. In that paradigm, an agent also needs access to your messages from Signal, for example. The technology is not built with security in mind, but with autonomy and ease of use. The architecture is very similar to that of malware. It creates very disturbing possibilities for attacks that go against the best practices in cyber security that have been in place for years.'

Isn't there a risk that increasingly intelligent AI will eventually defeat all encryption?
Whittaker: 'We are obviously concerned about a future in which quantum computers could undermine encryption. We were the first messaging app to implement technology against this: we protected Signal against "harvest now, decrypt later" attacks, which collect data now in order to decrypt it in the future, once quantum technology exists. There is also AI that is good at detecting vulnerabilities in encryption technology. We take that threat very seriously.'
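The idea behind this protection is "hybrid" key agreement: Signal's post-quantum protocol (PQXDH) combines a classical key exchange with a post-quantum one, so an attacker who records traffic today would have to break both to read it later. The toy sketch below is not Signal's actual implementation; it only illustrates the combination step, with random placeholder bytes standing in for the real shared secrets.

```python
import hashlib
import os

def derive_session_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    """Hash both shared secrets together into one session key.

    An attacker who harvests ciphertext now and later breaks the
    classical exchange still lacks the post-quantum component,
    so the derived key (and the recorded traffic) stays safe.
    """
    return hashlib.sha256(classical_secret + pq_secret).digest()

# Placeholders: in the real protocol both parties arrive at these
# values via their respective key exchanges.
classical = os.urandom(32)  # stand-in for a classical (e.g. X25519) secret
pq = os.urandom(32)         # stand-in for a post-quantum KEM secret

key_alice = derive_session_key(classical, pq)
key_bob = derive_session_key(classical, pq)
assert key_alice == key_bob and len(key_alice) == 32
```

Because the secrets are simply hashed together, changing either input changes the session key entirely; that is what makes the scheme only as weak as its strongest surviving component.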

On another note, you studied literature and rhetoric, which is not an obvious choice for a career in tech.
Whittaker: 'Yes, I love that stuff. I think it's an extremely valuable education. If you can read well, you can learn anything. That's an essential skill. But I don't really understand the obsession with how your university education will determine the rest of your life. I graduated, posted my CV on a site and was recruited by Google. I needed money – if that hadn't been the case, I probably would have become a poet. At Google, I was lucky to be surrounded by very capable people who were very good at computer science. I was able to learn from them.'

Is all lost for those who do not opt for STEM courses in this digital age, with its focus on mathematics, science and technology?
Whittaker: 'Don't write off the humanities. We need them. People are so hungry for something real.'



This article has been reproduced with the publisher's permission, all rights reserved. Any reproduction must be subject to specific permission from the management company License2Publish: info@license2publish.be.
