Folketingets wicked Week 2025: Responsible AI
We rethink the political process with a focus on youth involvement through three formats: the Political Laboratory, the Youth Summit and the final Youth Hearing at Christiansborg.
Period: 2024-2025
Partners: SAGA
Supported by the Tuborg Foundation
The project in 5 points:
The project is a pilot test of the new democracy format "Folketingets wicked Week", developed in collaboration with SAGA and supported by the Tuborg Foundation
The focus is to engage young people (aged 18-30) in the responsible use and development of artificial intelligence
More than 150 young people took part in the three events, and the project as a whole involved over 700 young people across the country
The Political Laboratory was a two-day foresight workshop
The Youth Summit and Youth Hearing took place at Christiansborg and resulted in five key dilemmas that were presented to policy makers
About the project
With support from the Tuborg Foundation and in collaboration with SAGA, the Institute for wicked Problems has piloted a new democracy format: The Danish Parliament's wicked Week. The aim is to create political action through foresight and citizen involvement, especially in relation to complex societal challenges - the so-called wicked problems.
During the test period 2024-2025, responsible AI was explored as a case study, with a special focus on engaging young people aged 18-30. Over 150 young people participated in a process that combined survey analysis, future lab, summit and political consultation.
The ambition is that Folketingets wicked Week will eventually become an annual democracy festival with Christiansborg as the setting - inspired by the British Evidence Week - where citizens, decision-makers and experts together develop solutions to today's most complex problems.
The anatomy of the wicked problem: Responsible AI
- Responsible AI spans many areas, including technology, ethics, law, culture, labor, security and politics. There is no clear professional framework or central authority that can take responsibility for the entire field. The problem therefore arises from its complexity and lack of clear boundaries.
- There is disagreement about what "responsible AI" really means. Different actors - from tech companies to governments and civil society - have divergent priorities and values, such as innovation, data security, fairness and transparency, which creates conflict over how to address the challenges.
- AI involves a global network of actors with different roles and interests. Power and responsibility are unevenly distributed, and no single actor can solve the problem alone. The problem needs to be tackled at multiple levels - from local to international - which makes coordination difficult.
- AI evolves rapidly and unpredictably, making it impossible to set definitive solutions or rules. Responsible AI requires continuous negotiation, flexibility and the ability to deal with new dilemmas as they arise - it is a problem we must live with and navigate going forward.
How did we do it?
Folketingets wicked Week 2025: A democratic process in four steps
1. The survey: What do young people think about AI?
As a starting point, we conducted a national survey among young people aged 18-30. The survey was conducted in collaboration with Epinion in the fall of 2024 and included over 500 respondents from across the country.
The questions uncovered young people's attitudes towards artificial intelligence and accountability. The analysis was carried out using the Institute for wicked Problems' own wicked problems model and showed that young people are particularly concerned with transparency, control and ethical frameworks for AI. This became the guiding thread for the rest of the process.
2. The Political Laboratory - November 2024
Two-day future lab focusing on responsible AI, with guest presentations and workshops by Christian Villum and Nicklas Larsen.
Held in November 2024 with participation from:
Young AI enthusiasts aged 18-30
Experts from the tech industry and civil society
Practitioners and researchers from universities, consultancies and more
The lab used foresight methods to develop scenarios for responsible AI in Denmark towards 2040.
3. Youth Summit - February 1, 2025, Christiansborg
150 young people gathered in the Parliament Hall for a full day of debate, democratic formats and artistic inputs.
Opening panel:
Oliver Anton (philosopher and debater)
Nafisa Fiidow (tech consultant, Deloitte)
Dan Rose Johansen (CEO, Todai)
Activities:
Participants wrote letters to their future selves
Interaction with Face2Wiki - an algorithm-based work by Andreas Refsgaard
Exhibition of AI dilemmas in the Parliament Hall
Video greetings with participants' concerns and hopes for the future
Vote:
The day ended with the participants voting on the five dilemmas that would go forward for political consideration in the Youth Hearing.
4. Youth Hearing - March 18, 2025, Christiansborg
Final hearing with 150 participants - including politicians, officials, youth organizations and participants from the process.
Opening:
Keynote by Minister for Digitalization Caroline Stage Olsen (M)
Moderators:
Selma Voldtofte Rian and Magnus Lambæk Lund from SAGA
Pitch:
The five chosen dilemmas were presented to a panel from the Danish Parliament's Committee for Digitalization and IT:
Lisbeth Bech-Nielsen (SF, chair of the committee)
Per Huset (S, AI spokesperson)
Kim Edberg Andersen (DD, digitalization spokesperson)
Jeppe Søe (independent, member of the committee)
The hearing marked the end of the process and was documented both on video and in a catalog.
Our team
SAGA
Naima Yasin (steering committee)
Mathilde Dyhr (steering committee)
Magnus Lund (co-creator)
Selma Voldtofte Rian (co-creator)
Ida Pedersen (communication)
INVI
Tine Maria Borresø (project manager)
Sofie Burgos-Thorsen (project manager)
Frederik Langkjær
Anders Degn
Elisabeth Rebel
Kristine Fisker
Linn Lüders
Louise Enevoldsen
Mads Højmark Pedersen
Mette Susgaard
Sam Rahbar
Sina Haubjerg Jæger