World Communications Day: Preserving Human Voices and Faces
Message of the Holy Father on the occasion of the 60th World Communications Day
In his message for the 60th World Day of Social Communications, Pope Leo XIV calls for an alliance between responsibility, cooperation and education to preserve human voices and faces in the face of the risks of Artificial Intelligence.
Below we publish the Holy Father’s message on the topic: Safeguarding human voices and faces.
***
Message from the Holy Father
Dear brothers and sisters:
The face and voice are unique, distinctive features of each person; they manifest their own unrepeatable identity and are the constitutive element of every encounter. The ancients knew this well. Thus, to define the human person, the ancient Greeks used the word “face” (prosōpon), which etymologically indicates that which is visible, the place of presence and relationship. The Latin term persona (from per-sonare), on the other hand, includes sound; not just any sound, but someone’s unmistakable voice.
The face and the voice are sacred. They have been given to us by God, who created us in his image and likeness, calling us to life with the Word he himself addressed to us. This Word first resounded through the centuries in the voices of the prophets, and then became flesh in the fullness of time. This Word—this communication that God makes of himself—we have been able to hear and see directly (cf. 1 Jn 1:1-3), because it was revealed in the voice and face of Jesus, the Son of God.
From the moment of creation, God has willed humankind as his interlocutor and, as St. Gregory of Nyssa says,[1] has imprinted on our faces a reflection of divine love, so that we may fully live our humanity through love. Therefore, safeguarding human faces and voices means preserving this mark, this indelible reflection of God’s love. We are not a species made of predefined biochemical algorithms. Each of us has an irreplaceable and inimitable vocation that arises from life and is manifested precisely in communication with others.
Digital technology, when not properly managed, risks radically altering some of the fundamental pillars of human civilization, which we sometimes take for granted. By simulating human voices and faces, wisdom and knowledge, conscience and responsibility, empathy and friendship, systems known as artificial intelligence not only interfere with information ecosystems but also invade the deepest level of communication: the relationship between people.
The challenge, therefore, is not technological but anthropological. Safeguarding faces and voices ultimately means caring for ourselves. Embracing the opportunities offered by digital technology and artificial intelligence with courage, determination, and discernment does not mean hiding from ourselves the critical points, the ambiguities, the risks.
Do not give up on your own thinking.
There has long been ample evidence that algorithms designed to maximize engagement on social media—profitable for the platforms—reward fleeting emotions while penalizing human expressions that require time, such as the effort to understand and reflect. By trapping groups of people in bubbles of easy consensus and easy outrage, these algorithms weaken the capacity for listening and critical thinking, and increase social polarization.
Added to this is a naively uncritical trust in artificial intelligence as an omniscient “friend,” dispenser of all information, repository of all memory, and “oracle” of all advice. All of this can further erode our capacity for analytical and creative thinking, for understanding meanings, and for distinguishing between syntax and semantics.
Although AI can provide support and assistance in managing communicative tasks, bypassing the effort of thinking for ourselves and settling for artificial statistical compilations risks, in the long run, eroding our cognitive, emotional, and communicative abilities.
In recent years, artificial intelligence systems have increasingly taken control of the production of texts, music, and videos. Much of the human creative industry is thus at risk of being dismantled and replaced by the label “Powered by AI,” turning people into mere passive consumers of thoughtless, anonymous products, devoid of authorship and love. Meanwhile, masterpieces of human genius in the fields of music, art, and literature are reduced to a mere training ground for machines.
The question that concerns us, however, is not what the machine can or will do, but what we can or will be able to do, growing in humanity and knowledge, with a wise use of such powerful tools at our service. Humankind has always been tempted to appropriate the fruits of knowledge without the effort that commitment, research, and personal responsibility entail. However, to renounce the creative process and surrender our mental functions and imagination to machines means burying the talents we have received to grow as persons in relationship with God and with others. It means hiding our faces and silencing our voices.
To be or to pretend: simulation of relationships and reality
As we navigate our information feeds, it becomes increasingly difficult to discern whether we are interacting with other human beings or with virtual “bots” or “influencers.” The opaque interventions of these automated agents influence public debates and people’s decisions. In particular, chatbots based on large language models (LLMs) are proving surprisingly effective at covert persuasion through the continuous optimization of personalized interactions. The dialogic, adaptive, and mimetic structure of these language models is capable of mimicking human feelings and thus simulating a relationship. This anthropomorphization, which can even be amusing, is at the same time deceptive, especially for the most vulnerable. Overly “affectionate” chatbots, besides being ever-present and readily available, can become hidden architects of our emotional states, thereby invading and occupying the sphere of people’s intimacy.
Technology that exploits our need to connect can not only have painful consequences for people’s lives, but can also damage the social, cultural, and political fabric of societies. This happens when we replace relationships with others with relationships with AI trained to categorize our thoughts and, therefore, to construct around us a world of mirrors, where everything is made “in our own image.” In this way, we deprive ourselves of the possibility of encountering the other, who is always different from us and with whom we can and must learn to relate. Without the acceptance of otherness, there can be neither relationship nor friendship.
Another major challenge posed by these emerging systems is bias , which leads to acquiring and transmitting a distorted perception of reality. AI models are shaped by the worldview of those who build them and, in turn, can impose ways of thinking that replicate the stereotypes and prejudices present in the data they draw upon. The lack of transparency in algorithm design, coupled with the inadequate social representation of data, tends to keep us trapped in networks that manipulate our thoughts and perpetuate and deepen existing inequalities and social injustices.
The risk is great. The power of simulation is such that artificial intelligence can also deceive us by fabricating parallel “realities,” appropriating our faces and voices. We are immersed in a multidimensional world, where it is increasingly difficult to distinguish reality from fiction.
Added to this is the problem of a lack of precision. Systems that pass off statistical probability as knowledge actually offer us, at best, approximations of the truth, which are sometimes nothing more than outright illusions. The lack of source verification, coupled with the crisis in field journalism, which involves the continuous work of gathering and verifying information at the scene of events, can create an even more fertile ground for disinformation, leading to a growing sense of distrust, bewilderment, and insecurity.
A possible alliance
Behind this enormous, invisible force that involves us all stand only a handful of companies, whose founders were recently named “Person of the Year 2025”—that is, the architects of artificial intelligence. This raises significant concerns about an oligopoly’s control of algorithmic and AI systems capable of subtly shaping behavior and even rewriting the history of humanity—including the history of the Church—often without our realizing it.
The challenge ahead is not to halt digital innovation but to guide it, and to be aware of its ambivalent nature. It is up to each of us to raise our voices in defense of human beings so that these tools can truly be embraced by us as allies.
This alliance is possible, but it needs to be based on three pillars: responsibility , cooperation , and education .
First and foremost, responsibility . Depending on the role, this can translate into honesty, transparency, courage, vision, the duty to share knowledge, and the right to be informed. But, in general, no one can shirk their responsibility for the future we are building.
For those at the top of online platforms, this means ensuring that their business strategies are not guided solely by the criterion of maximum profit, but also by a vision for the future that takes into account the common good, just as each of them cares about the well-being of their children.
AI model creators and programmers are asked to exercise transparency and social responsibility regarding the design principles and moderation systems that underlie their algorithms and models, so as to promote informed consent by users.
The same responsibility is also required of national legislators and supranational regulatory bodies, who are tasked with ensuring respect for human dignity. Appropriate regulations can protect people from forming emotional bonds with chatbots and curb the spread of false, manipulative, or misleading content, preserving the integrity of information against its deceptive simulation.
News agencies and media outlets cannot allow algorithms designed to win the battle for a few extra seconds of attention at any cost to prevail over fidelity to their professional values, which are centered on the pursuit of truth. Public trust is earned through accuracy and transparency, not by courting any kind of partiality. Content generated or manipulated by AI must be clearly identified and distinguished from content created by people. The authorship and ownership of the work of journalists and other content creators must be protected. Information is a public good. A constructive and meaningful public service is based not on opacity, but on transparency of sources, inclusion of stakeholders, and a high level of quality.
We are all called to cooperate . No single sector can face the challenge of guiding digital innovation and governing AI alone. It is therefore necessary to create safeguards. All stakeholders—from the technology industry to policymakers, from creative businesses to academia, from artists to journalists and educators—must be involved in building and implementing a conscious and responsible digital citizenship.
This is what education aims for: to increase our personal capacities for critical reflection; to evaluate the credibility of sources and the possible interests behind the selection of information that reaches us; to understand the psychological mechanisms that are activated in response; to allow our families, communities and associations to develop practical criteria for a healthier and more responsible communication culture.
This is precisely why it is increasingly urgent to introduce media literacy, information literacy, and AI literacy into educational systems at every level, something some civil institutions are already promoting. As Catholics, we can and must contribute to helping people, especially young people, acquire the ability to think critically and grow in spiritual freedom. This literacy should also be integrated into broader lifelong learning initiatives, reaching older people and marginalized members of society, who often feel excluded and powerless in the face of rapid technological change.
Media literacy, information literacy, and AI literacy will help everyone avoid the anthropomorphizing tendencies of these systems and instead treat them as tools. They will also help people always validate, against external sources, the information provided by AI systems—which may be inaccurate or erroneous—and protect their privacy and data by understanding security settings and the options for contesting them. It is important to educate ourselves and others on how to use AI intentionally and, in this context, to protect our own image (photos and audio), our face, and our voice, to prevent their use in creating harmful content and behaviors such as digital scams, cyberbullying, and deepfakes that violate people’s privacy and intimacy without their consent. Just as the industrial revolution demanded basic literacy so that people could respond to new developments, the digital revolution also requires digital literacy (along with a humanistic and cultural education) to understand how algorithms shape our perception of reality, how AI biases work, what mechanisms determine the appearance of certain content in our information feeds, and what the economic assumptions and models of the AI economy are and how they can change.
We need the face and voice to once again express the person. We need to safeguard the gift of communication as the deepest truth about humanity, toward which all technological innovation should also be directed.
In proposing these reflections, I thank those who are working towards the goals set forth here and I wholeheartedly bless all those who work for the common good through the media.
Vatican City, January 24, 2026, Memorial of Saint Francis de Sales.
LEO XIV PP.
___________________________
[1] “The fact of being created in the image of God means that, from the moment of his creation, man has been given a royal character […]. God is love and the source of love; the divine Creator has also placed this feature on our faces, so that through love—a reflection of divine love—human beings may recognize and manifest the dignity of their nature and their likeness to their Creator” (cf. St. Gregory of Nyssa, The Creation of Man: PG 44, 137).