In recent years, many civil society organizations have emerged, demanding from below greater respect for fundamental rights in the digital sphere. Their work has pushed institutions to regulate these spaces, and it has also set an example for launching initiatives that defend the same banner in forums that are sometimes closed to grassroots associations.
One such initiative is the Digital Future Society, a collaboration between the Ministry of Economic Affairs and Digital Transformation and Mobile World Capital Barcelona, the organizer of the famous technology congress held in the Catalan capital. Created in 2018 as a think tank on digital rights and “technological humanism”, it recently had its agreement renewed by the two institutions to keep it active for another two and a half years, with €7 million in funding.
Its director, Cristina Colom, was the only Spaniard at the last meeting of the World Economic Forum (the Davos Forum), which focused on governance and technology. There she defended the need for “digital inclusion to be a priority on every agenda”, she explains in an interview with elDiario.es. “We need to mitigate the negative impacts of this digital transformation, such as the digital divide. Not just access, but the gaps that arise because of gender, age or ability. Leave no one behind,” she says.
What is discussed at meetings like the World Economic Forum that you attended?
Various advisory committees are organized. The one I participated in, on the connected world, covers everything we call the Internet of Things. It started out as a more technological committee, but it has become more humanistic, in the sense that it no longer works from the perspective of “this is great, we have new gadgets” or “the Internet of Things will cause technological disruptions”, but rather asks how we plan and implement these technologies with the citizen in mind.
The last day of the event was devoted mainly to what will happen with the new technologies entering what we now call the metaverse or immersive reality. How do we make their design inclusive and sustainable? That was a kind of leitmotif. About 250 people from around the world took part in the meeting to discuss, as I say, these use cases.
Can you give examples of how decisions made in these spaces affect the population?
The question is how technology can help us deal with the crisis situations we face, such as something as basic as the fires that break out in areas of severe drought, across the Mediterranean, in Turkey or in California. Data can help prevent fires, and we can work on pilot tests through which technology is put to use for the common good.
Don’t you think institutions sometimes focus on finding technological solutions to problems that could be fixed with more resources, such as prevention and personnel in the case of fires?
We must work in parallel, and each institution must play its role. For example, when we were confined during the pandemic, it was obviously important that our children were fed and housed. But it was also important that they could continue their studies. Our message here is that this digital transformation is unstoppable, and what it is causing is a societal divide between those who live in a connected world and those who live in a disconnected one.
Digital inclusion should be a priority on every political agenda, whether of local, regional, state or supranational governments. My view is that we live in what I call a digital emergency that affects every area of our daily lives: how we move around the city, remote work, how we communicate, the scarcity of materials, the energy consumption of technology companies. Digitization is in everything.
If we can also get this predictive use of data applied to shared services, we will be on the right track, and I think we have a chance to achieve anything. The negative impacts of this digital transformation, such as the digital divide, must be reduced. Not just access, but the gaps that arise because of gender, age or ability. We must harness technology to benefit everyone and, above all, leave no one behind.
Do you think governments have already become aware of the problems that digitization will bring?
Admittedly, a few years ago no one prioritized solving these dilemmas: the digital divide, algorithmic discrimination, unequal opportunities due to technological issues, all these challenges that are now amplified. I’m not saying that no one was working on them, of course, but they were not a priority. We all know very well that because of the pandemic we had to digitize much faster: the public sector, obviously small and medium-sized companies, but also citizens, just to keep up our minimum daily activity.
This concern has now reached institutions that did not address the issue of digital inclusion until relatively recently. What cannot happen is that, for example, technology discriminates against you because you are a woman, or because of a different skin color, or because of certain religious beliefs. Since this is happening, I think we need to act, because if not, we may manage to put out the occasional fire, but the long-term challenges will be insurmountable.
The digital emergency is a reality that, if we know how to address it, can help us address other emergencies, such as the climate emergency itself. Behind technological responsibility also lies responsibility for the sustainability of the planet, both because of the materials technology requires and because of the energy consumption of the technology sector, which is very high. This is something technology itself can change, by working towards a more efficient and circular economy, using renewable energy, and so on. In technology lies both the risk and the opportunity to reduce the impact of climate change.
You mentioned that the metaverse was also discussed at the World Economic Forum meeting. It’s a term that receives a lot of criticism for being used to sell projects that are still far from being implemented.
Sometimes words don’t do it justice, as the term metaverse is used to lump a lot of technology into one bag. At the end of the day, it covers the new technology trends related to immersive reality, augmented reality and, obviously, the metaverse itself, which we still don’t fully know what it will bring. At the meeting there was an exhibition of devices, spectacular from my point of view, of everything this immersive reality can be. They worked closely with the American business and academic world, with some students from Harvard showing the various devices they were working on. It was by no means just a showcase of gadgets; above all, it showed how far these devices could go and the need to discuss the ethics and responsibility involved in designing this metaverse.
I experienced it firsthand. They showed us immersive experiences of flying, falling off a cliff, interacting with digital characters. But they also showed me virtual harassment, with a man attacking me with insults and aggression. I suffered attacks and experiences that I have not personally had on social networks, although we have all seen how they work. They did this so that you understand the harm that can also exist in this metaverse. We need to draw the limits and the ethical codes, and not just talk about technical standards.
What do you think of the criticism that this is a technology that will expand the extraction of personal data for commercial purposes?
We need to be aware of how companies use our data, and I think the pandemic has contributed to that awareness. Reading books like Carissa Véliz’s is highly recommended. I always try to look at the positive side, and I think it is much more common now for people to refuse to hand over personal data, whereas before it was a problem no one discussed. We handed over all our data, everything, even an ID card.
All this information has been used unethically; that is the reality, and that is why the new regulation was necessary. I think that Europe, in the overall scenario, has come a long way in terms of data protection, which, by the way, states like California or India are taking as a reference for their own privacy regulations. The goal is not to arrive late again, as has happened to us before.
But you also have to remember that there are examples of data being used in the opposite direction, such as medical data that have served to greatly improve medicine. There are also other examples, such as BlaBlaCar, which has worked towards a completely transparent use of data, rather than a model of strict surveillance of our information.
Source: El Diario