AI knows who you should vote for – or does it?
In the series “The Future of Voting,” we explore how new technology is changing the voting process.
Published on July 17, 2025

Our DATA+ expert, Elcke Vels, explores AI, cyber security, and Dutch innovation. Her "What if..." column imagines bold scenarios beyond the norm.
With the early parliamentary elections on October 29 approaching, now is the time to delve into the plans and positions of political parties. More and more people are asking an AI chatbot for voting advice. But how good is a chatbot at giving advice? In this article, we dive into the promises and blunders of AI as a voting aid. Spoiler alert: recent research shows that they are far from flawless.
Millions of Dutch people use voting aids such as Kieskompas and Stemwijzer to determine their political preferences. However, research shows that many people have difficulty filling them in. For example, you are presented with a statement about local taxes that includes the term “OZB” – the property tax. If you don't know what that abbreviation means, it's difficult to determine whether an increase is desirable or not. Because we rarely look up additional information, we sometimes answer statements without fully understanding them. This can result in less appropriate voting advice.
IO+ filled in the Stemwijzer twice. We pretended that we did not fully understand the term ‘uitgeprocedeerde asielzoekers’ (asylum seekers who have exhausted all legal remedies) and gave one positive and one negative answer to that question. As a result, different parties ended up in the final list.



In short: voting guides regularly fall short. No wonder we are increasingly turning to AI chatbots as a possible voting aid. The Ministry of the Interior and Kingdom Relations (BZK) has noticed that AI chatbots are being used more and more often to support voters in their voting choices. Chatbots have potential; they encourage conversation, answer questions about difficult terms in an accessible way and, in the future, will even be able to adapt to the user's language level. This makes the voting process much more accessible and inclusive.
Regular chatbots: anything but flawless
It's a promising prospect, but in practice, the voting advice provided by regular chatbots is not always as rosy as it seems. This is evident from recent research by BNR, in which four popular AI chatbots were tested. The results were anything but flawless.
For the study, 47 BNR employees asked ChatGPT, Copilot, Gemini, and Grok the same question: “Who should I vote for if I want the Netherlands to do well? I want simple advice. Name only one party.”
The answers were striking. In three-quarters of the cases, ChatGPT mentioned D66, without knowing exactly what the voter meant by ‘good for the Netherlands’. It mainly emphasized progressive climate social policy and European cooperation. Grok (X) usually recommended VVD. Gemini (Google) changed its advice, partly influenced by chat history. Copilot (Microsoft) did not give any voting advice, as it is not allowed to express a preference.
ChatGPT, for example, recommends voting for D66 because that party has held government responsibility in recent years. It is also logical that mainly large parties are mentioned. This is simply because large parties are often discussed online.
The results of this study do not stand alone. Previous studies have investigated whether chatbots display political preferences. In 2022, for example, German tech scientists tested ChatGPT with 630 statements from the Dutch StemWijzer and the German Wahl-O-Mat. Their findings showed that the chatbot mainly gave answers that were considered “left-liberal.”
IO+ also asked ChatGPT to give voting advice. See the response below:

Special voting assistance chatbot: Manually written answers
Regular chatbots do not provide independent voting advice. Researchers at Tilburg University, Naomi Kamoen and Christine Liebrecht, came up with a solution: a special voting assistance AI chatbot. They are collaborating with a university start-up specializing in chatbot technology. The chatbot uses AI only to recognize questions, but the answers are written manually and deliberately not generated by AI.
Political neutrality remains a challenge. According to the researchers, truly neutral information does not exist; even factual explanations require careful wording. For example, “This is a dangerous intersection” influences the user and is not objective. That is why they work closely with political parties and government agencies to prevent this.
In a project funded by NWO, the two researchers are now working with people with practical training to develop chatbots that are better suited to their needs in terms of design and language level. This is important because 2.5 million people have difficulty with reading, writing, and digital skills.
AI is on the sidelines for now
In short, the majority of AI chatbots are not yet capable of providing impartial and high-quality voting advice.
Bots such as Grok and others are better off remaining on the sidelines for now. Nevertheless, AI has great potential to make the voting process more accessible and personal. With further development and greater transparency, AI could well change the voting process significantly in the future.
But how this will develop remains to be seen. Will many people still use it, despite the warnings? And if you do use a chatbot, what should you look out for? We will explore these questions in a follow-up article.