Report — International Meeting on Diversity of Content in the Digital Age

Ottawa, February 7–8, 2019

Disclaimer

This meeting report has been prepared by the Department of Canadian Heritage and the Canadian Commission for UNESCO. Please note that this report summarizes discussions and exchanges between participants from private sector corporations, academia, civil society organizations, and governments. The report aims to accurately reflect what participants discussed. The report's content cannot be deemed to reflect official positions or opinions from organizations represented by participants, the Department of Canadian Heritage or the Canadian Commission for UNESCO.

Acknowledgements

The Department of Canadian Heritage and the Canadian Commission for UNESCO would like to thank the following people for their contribution to the success of the February 7 and 8, 2019 international meeting on diversity of content in the digital age.

Thank you to the members of the Advisory Committee who helped set the direction of the meeting and ensure relevant participation:

  • Kelly Beaton, Interim Director General, Creative Marketplace and Innovation, Department of Canadian Heritage;
  • Sébastien Goupil, Secretary General, Canadian Commission for UNESCO;
  • Jason Kee, Counsel for Public Policy and Government Relations, Google Canada;
  • Tammy Lee, CEO, Culture Creates;
  • Alban de Nervaux, Head, International and Legal Affairs, Ministère de la Culture et de la Communication, France; and
  • Charles Vallerand, Consultant, Editor, Cultures in the Digital Era newsletter.

Thank you to all the participants, with special thanks to those who played specific roles during the meeting, whether as panelists, moderators or grand rapporteurs. Finally, thank you to all the staff members of both the Department of Canadian Heritage and the Canadian Commission for UNESCO who supported the organization of the meeting.

Introduction

Quantity of content does not equal diversity of content

We live in exciting but tumultuous times. The digital age has brought tremendous opportunities for people to learn and communicate with others around the world and in their community. Social media and content platforms have grown exponentially over the past few years, offering consumers, through constant and cutting-edge innovation, unparalleled opportunities to see, hear and read an overwhelming quantity of digital content.

Search and recommendation systems, also called algorithms, guide users through this extraordinary volume of content by means of a curation process that selects and promotes certain content. These algorithms are designed to build a convenient and pleasant experience for the user while maximizing engagement on a given platform.

However, challenges have appeared over time, highlighting limits and dangers associated with the new realities of the digital age. Although more and more cultural content is being produced and experienced, it may be more challenging for citizens to find or be exposed to a diversity of content. Users rely almost exclusively on these algorithms, and many voice concerns that only global content from culturally dominant countries is prominent on digital platforms, making it more difficult for users in smaller countries to be exposed to local or national content. It is even more challenging for people from a linguistic minority or a minority cultural community.

It is often stated that, in order for cultural diversity to be present, an equivalent diversity of content creators must be in place to share stories and experiences from a wide range of perspectives. Yet content creators face significant challenges. The current digital environment has disrupted traditional business models and has redistributed power, and the accompanying profitability, among actors in the creative sector. We often hear of the difficulties content creators experience in obtaining fair or sustainable remuneration for their work. This is perceived to threaten the long-term financial sustainability of content creators and to lead to a world where only a select few will benefit from the digital environment, whereas the great majority of creators will find little incentive to pursue their creative efforts.

Algorithms may also have led to complex phenomena labelled as "filter bubbles" or "echo chambers" where one may only be exposed to content that aligns with what the system understands as being one's point of view or taste. This may result in a narrowing of perspective. In addition, worrying trends have appeared, such as "fake news" or disinformation, as well as possible election interference in various countries. These phenomena are also sometimes linked to a growing polarization of views, with resulting tensions in the social fabric.

A question of democratic resilience

Access and exposure to diverse content, including reliable information and news, is a central component of democratic resilience. A healthy democracy requires its citizens to have access to reliable information as well as content representing a wide range of opinions, points of view and experiences. Exposure to this diversity of content will contribute to a healthy public discourse, greater social inclusion and a better understanding between countries, cultures and communities.

These concepts of social inclusion and greater understanding have long been part of the impetus behind the UNESCO Convention on the Protection and Promotion of the Diversity of Cultural Expressions, which has noted the distinctive nature of cultural activities as "vehicles of identity, values and meaning", including in the digital world.

The notion of "diversity of content" broadens the concept of cultural diversity by including content such as information and news, and by putting an emphasis on diverse points of view and perspectives, rather than on diversity from a purely cultural aspect.

Need for a multi-stakeholder approach

Issues in the Internet environment involve unique governance challenges. Online platforms operate at massive scale, across continents, languages and societies. This global, transborder dimension means that many Internet challenges require approaches that go beyond the influence of a single country, civil society organization or platform. Rather, a wide range of stakeholders, originating from different sectors such as civil society, the private sector, academia, and governments, must work together. This concept of multi-stakeholder engagement is one relevant method of making progress on Internet-related issues.

An ongoing conversation

Since 2017, Canadian Heritage has recognized the need for a global conversation to better understand the challenges and opportunities that the digital environment brings to cultural diversity, as well as unforeseen impacts on democratic resilience.

An important step of this global conversation took place on February 7 and 8, 2019 with the organization of an International Meeting on Diversity of Content in the Digital Age in Ottawa. The meeting was co-organized by the Department of Canadian Heritage and the Canadian Commission for UNESCO.

The Canadian Commission for UNESCO's mandate involves raising awareness and engaging civil society at large in promoting UNESCO's priorities in Canada and abroad, while maintaining a continuous and politically neutral dialogue with government partners who are working hard to ensure that Canada continues to uphold international standards in UNESCO's mandated areas. The partnership with the Canadian Commission for UNESCO brought a deep understanding of cultural diversity, in a global context, to the meeting.

Seventy-two participants attended the meeting, representing a wide and inclusive range of stakeholders from private-sector online platforms, civil society, governments, and academia. Participants came from all parts of Canada, some representing linguistic minority groups, others, Indigenous perspectives. Many came from abroad, notably Europe, South America and the U.S. The meeting's objectives were to deepen the collective understanding of the issues linked to diversity of content, as well as identify lessons learned, best practices and potential actions that could be designed to protect and promote diversity of content.

The meeting was structured under three main themes: discoverability of local, regional and national content; remuneration and economic sustainability of content creators; and, finally, algorithms and integrity of the digital public sphere. This report is structured accordingly.

To underpin discussions, the organizers invited five academic experts to write thought leadership papers presenting issues, reflections and potential ways forward in terms of actions that various stakeholders could take on the questions at play. The report only briefly summarizes certain elements of the papers, and the reader is invited to consult them for a more thorough contextualization of the issues. Hyperlinks to the thought leadership papers can be found in the Annex.

The meeting was held under the Chatham House Rule, a recognized framework often used in Internet governance contexts. The Rule is designed to ensure frank and open exchanges between participants: participants may use the information shared during the meeting, but without attributing that information to a specific person or organization. Accordingly, the present report summarizes the participants' discussions and exchanges without attribution, with the exception of some elements from the thought leadership papers.

Creation, access and discoverability of local, regional and national content

The issues

The first theme explored during the meeting was the creation, access, and discoverability of local, regional and national content. Dr. Philip M. Napoli of Duke University presented an overview of his paper titled Diversity of Content in the Digital Age: Discoverability of Diverse Local, Regional, and National Content, while Dr. Mira Burri of the University of Lucerne presented her paper titled Discoverability of Local, National and Regional Content Online: Mapping Access Barriers and Contemplating New Orientation Tools. They argued that the vast quantity of digital content is increasingly being curated by a small number of gatekeepers with a negative impact on discoverability of local or national content.

During the panel discussions and workshops on that first theme, participants made the following observations.

Some felt that democratic values and strong communities are at stake when we explore concepts of availability and discoverability of diverse content. Diverse groups solve problems better than non-diverse groups and have greater empathy for others. People exposed to content from a narrow range of perspectives risk becoming entrenched in their own ideas and beliefs and having difficulty developing empathy towards others. In other words, being exposed to diverse content is about fostering a more empowered society that will hopefully become more inclusive and more tolerant of differences.

A common theme expressed throughout the meeting was the tension between private interests and public interests. Private corporations are accountable to owners or shareholders who want to see a return on their investment, whereas the public good may not coincide with financial imperatives. Business models demonstrate innovation and meet users' needs in an impressive fashion, but sometimes at the cost of negative consequences for society, outside the immediate purpose of the platforms and the content they make available. For example, even if civil society and governments were to push for promotion of local content for the public good, there is likely no incentive for the private sector to do so unless a business case can be made. In addition, users may find ways to circumvent this promotion and locate and consume other types of content, perhaps not deemed desirable by a "local content policy".

Another recurring topic from one workshop to the other was the sense that there is no "one-size-fits-all" answer or solution to discoverability. Each creative industry is different (for example, music vs. movies vs. books) and their specific circumstances make a general or global policy intervention challenging. In addition, each geographical market, for example a smaller market such as the Canadian French-language market outside Quebec or the national market of a mid-size or small country, involves a different set of circumstances and issues that fall outside the global solutions at scale sought by the large online platforms when they design their algorithms.

Concretely, it means that it is very challenging for content from a small market to be discoverable not only globally, but also in that small market itself. Finally, each online platform presents different opportunities and challenges, depending on its business model (for example subscription-based vs. advertising-based).

The concentration of media sources was also deemed to negatively impact cultural diversity and the possibility of being exposed to a diversity of content. Some participants said that major systemic issues remain in the representation of minority or marginalized groups that often cannot see themselves reflected in digital content. For example, in a Canadian context, it is very challenging for Indigenous communities, whether in urban settings or in the North, or for French-language minority communities, to create, access and make discoverable content that reflects their realities and points of view.

Other participants presented a different perspective. They challenged the assumption that the analog world offered a strong or adequate presence of cultural diversity. They disputed the notion that there was an ideal level of diversity before and that the digital world had only worsened the situation.

Some argued that "imposing" cultural diversity or certain types of content to users demonstrates a paternalistic attitude from governments and civil society representatives of industry interests. Platforms respond to users' needs as it is in their business interest to do so; directing people to content based on a policy goal of promoting local content could have negative impact, not only on the business interests of platforms but also on user engagement.

Some participants highlighted the fact that individual users have an ability to curate their online environment themselves through the people or sources they follow on social media, for example. Other participants expressed a divergent perspective by stressing that the majority of users probably do not have the interest or ability to intentionally curate their digital content and are happy to rely on algorithms to do so.

Discussions on discoverability were often focused on the "black box" phenomenon with respect to online platforms, whereby users do not have the capacity or tools to understand how algorithms function, including how they recommend content. How users interact with these recommendations and the impact they have on what users consume is not well understood and has been repeatedly mentioned as a topic deserving further research.

Avenues for actions and collaborations

Participants focused on identifying potential actions or interventions that all or some of the stakeholders could adopt to better protect and promote diversity of content online. Most participants acknowledged that the challenges are significant and the solutions, complex.

Regulatory approaches

Dr. Burri proposed actions along two main concepts:

  • governance of algorithms, which would involve typical market regulations as well as various self- and co-regulation initiatives, depending on the specific issues being targeted; and
  • governance through algorithms, meaning targeted interventions with tools that would promote exposure to diversity of content by increasing the visibility and discoverability of certain types of content through editorial processes performed by algorithms.

Dr. Napoli highlighted the importance of considering vertical integration as a potential area of regulation, given its impact on search and recommendation systems and how they promote or "push" internally-produced content at the expense of others.

Many participants expressed support for governments to consider regulating the digital sphere on matters of content, data privacy, reliable information, and others. Many expressed the view that self-regulation by online platforms had shown its limits and that it was time to start a new conversation about regulation, and more specifically about the objectives a proposed regulatory framework would try to achieve, as well as how regulations would work concretely.

Some participants saw a way forward with governments setting general objectives or targets regarding discoverability of local, regional and national content on digital platforms, which each platform could meet in different ways according to its business model and specific circumstances. Setting objectives or targets would require some kind of transparency on the part of platforms, in terms of sharing data that would allow an evaluation or audit to assess whether or not the targets or objectives are met.

This notion of transparency was raised repeatedly over the course of the meeting. Some felt that online platforms are not transparent as they refuse to share data related to consumer behaviour or consumption. On that subject, many people expressed concerns over the fact that data is the key to the profit-making capacities of private sector platforms and that sharing this data would jeopardize their entire business models. Others stated that governments were also not transparent in how they hold public consultations and in policy development activities that follow.

Data trusts

The possibility of setting up a data trust was deemed to offer a potential solution to issues of transparency. A data trust can be understood as a fiduciary that would govern a shared resource; in this case, digital content-related data from online platforms, governments or civil society, as applicable. A data trust can be either a concrete organization or a series of agreements or contracts that designate a group of individuals or organizations as trustees, with the authority to manage or make decisions about how data can be used or shared. As an example, this could mean that private-sector organizations would share some elements of data with a data trust, allowing others, such as civil society or governments, to evaluate or verify certain elements of consumers' behaviours.

Content quotas

The matter of imposing quotas relative to the presence of national content on online platforms was raised. Quotas have long been seen as a way of fostering cultural diversity, although their effectiveness in the digital environment has not yet been demonstrated. The European Union is examining the introduction of content quotas on streaming services, requiring that 30 percent of their catalogues come from European countries. Canada could adopt the same approach to ensure the presence of Canadian content online. However, mere presence may not be sufficient to make sure that users would be recommended that content by algorithms. The matter of prominence of local or national content was deemed to be as important as presence itself. Prominence would mean that national or local content is given particular visibility or is recommended with a certain priority over other content, as the sketch below illustrates.
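
To make the presence/prominence distinction concrete, here is a minimal, hypothetical sketch in Python. The catalogue, titles and the 30 percent threshold are invented for illustration (the percentage simply mirrors the EU example above); this is not drawn from any platform's or regulator's actual methodology.

```python
# Illustrative sketch of the presence vs. prominence distinction discussed above.
# All titles and numbers are invented; the 30% figure simply mirrors the EU example.

from typing import Callable, List

def presence_share(catalogue: List[str], is_local: Callable[[str], bool]) -> float:
    """Share of the full catalogue made up of local or national titles."""
    return sum(1 for title in catalogue if is_local(title)) / len(catalogue)

def prominence_share(recommended: List[str], is_local: Callable[[str], bool]) -> float:
    """Share of the titles actually surfaced to users (e.g. home-screen rows) that are local."""
    return sum(1 for title in recommended if is_local(title)) / len(recommended)

if __name__ == "__main__":
    catalogue = ["ca_film_1", "ca_film_2", "ca_film_3", "us_film_1", "us_film_2",
                 "us_film_3", "us_film_4", "us_film_5", "us_film_6", "us_film_7"]
    recommended = ["us_film_1", "us_film_2", "us_film_3", "us_film_4", "ca_film_1"]
    is_canadian = lambda title: title.startswith("ca_")

    print(presence_share(catalogue, is_canadian))      # 0.3 -> a 30% presence quota is met
    print(prominence_share(recommended, is_canadian))  # 0.2 -> yet local titles remain less visible
```

In this invented example, a presence quota is satisfied even though local titles occupy a smaller share of what users actually see, which is why participants treated prominence as a distinct policy question.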

Metadata

Another avenue of intervention could be a form of support for the development and implementation of metadata associated with diverse cultural content. Metadata can be defined as information that is understandable by machines and that can be shared and reused between applications or platforms. This would help ensure that algorithms would be able to "recognize" cultural content as part of their curation processes. In turn, this would improve the discoverability of such content.
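
As an illustration of the kind of machine-readable metadata discussed here, the following sketch shows a hypothetical record and a simple check a curation system could run against it. The field names and the policy function are invented for this example and do not reflect any existing standard.

```python
# Hypothetical example of machine-readable metadata attached to a cultural work.
# Field names are illustrative only; a real deployment would follow an agreed standard.

from typing import Any, Dict, List

song_metadata: Dict[str, Any] = {
    "title": "Exemple de chanson",
    "creator": "Artiste fictif",
    "content_type": "music",
    "language": "fr",            # ISO 639-1 language code
    "origin_country": "CA",      # ISO 3166-1 country code
    "origin_region": "Ontario",
    "themes": ["francophone minority", "local history"],
}

def matches_local_content_policy(record: Dict[str, Any],
                                 country: str,
                                 languages: List[str]) -> bool:
    """Return True if a curation system could flag this record as local content."""
    return record.get("origin_country") == country and record.get("language") in languages

if __name__ == "__main__":
    # A platform surfacing French-language Canadian content could use such a check.
    print(matches_local_content_policy(song_metadata, country="CA", languages=["fr"]))  # True
```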

Support for creators

Providing support to small and mid-sized content creators and providers, to ensure they have access to large markets and can make their content available and discoverable on these markets, was stated as worthwhile. Continued support for public service broadcasters was deemed important in terms of promoting quality and reliable content that could be made more present on online platforms through collaborative approaches. There could be fruitful collaboration between the academic sector and content creators to help them learn more about opportunities presented by the digital environment. Partnering with educational organizations in that respect would contribute to fostering the development of content creators' skills and lead to greater discoverability of diverse content.

Content creators could also benefit from closer interaction with online platforms to build their digital skill set and ensure they get the most out of all the possibilities offered by the online world. This could involve equipment and technological training, as well as training on monetization of content and maximizing audience impact.

Partnership opportunities already exist between traditional broadcasters, including public broadcasters, and online platforms to maximize the dissemination of Canadian content, in Canada and abroad. These partnerships could become more regular and extensive.

Creation, Access and Discoverability of Local, Regional and National Content

Summary list of avenues for actions and collaborations

  • Regulatory approaches on discoverability of content, data privacy, reliable information, vertical integration
  • Setting targets on discoverability of local, regional and national content
  • Greater transparency through sharing of data
  • Data trusts
  • Support to smaller-sized content creators to ensure access and discoverability on large markets and support to build content creators' digital skill set
  • Content quotas on presence and prominence of local, regional and national content on platforms
  • Support for development and implementation of metadata associated with digital content
  • Partnerships between traditional broadcasters and digital platforms

Remuneration and economic sustainability of content creators

The issues

The second main theme of the meeting focused on issues around remuneration and economic sustainability of content creators. Dr. Giuseppe Mazziotti of New York University presented his paper titled Remuneration of Content Creators in the Digital Space: Challenges, Obstacles and a Common Language to Foster Economic Sustainability and Cultural Diversity. According to Dr. Mazziotti, the economic situation of individual creators, broadly defined to include all copyright holders and the creative sector as a whole, is particularly relevant to discussions about diversity of content, since maintaining a diversity of cultural expressions relies on the artistic and intellectual labour of individuals.

Participants discussed key issues relating to copyright and remuneration of creators in a context where online platforms have become the primary intermediaries for the dissemination of cultural content, and ways forward for ensuring the economic sustainability of creators and creative industries.

With respect to copyright, creators are able, in some instances, to leverage it to receive remuneration for their work, but as individuals they have little to no bargaining power against enormous technology firms. Participants also noted that this imbalance may even exist for the larger organizations, such as collecting societies representing a large number of rights holders, when negotiating with platforms.

Content creators also face challenges with respect to enforcing copyright, as protections vary from country to country. Moreover, many platforms rely on a "notice and takedown" approach to copyright enforcement, which is challenging for individual creators to use. Some platforms have designed mechanisms to help creators identify and manage the use of their content online, which were highlighted as useful tools for creators seeking to enforce copyright to ensure remuneration for their work. Again, these mechanisms may be more available and useful for large organizations than for smaller players or individual creators.

Monetization schemes hold the promise of remuneration for creators on some platforms, but this may chiefly be of advantage to "superstars" with significant commercial appeal. Most content creators using online platforms receive little remuneration because their works are oriented towards local or niche audiences. In addition, monetization schemes may vary greatly depending on the content creator and the platform in question. Outside online platforms, creators must also contend with piracy and uncertain remuneration.

Participants agreed that there is a role for governments to play in ensuring fairness and transparency, and supporting copyright collectives who can represent large groups of artists in negotiations. Significant obstacles remain, though, in the form of a lack of data from platforms, a lack of standards for rights management, and the sheer scale and power of online platforms, which far outpaces that of individual creators or collectives.

Participants wrestled with many questions related to remuneration of creators in this new digital environment. The first issue was around defining who can now be considered a content creator, and which conception of content creators would ensure economic sustainability and diversity of cultural creation in the long term. Most participants agreed that the platforms challenged traditional notions of "professionalism" for creators of cultural content. They settled on a broad definition of creators on online platforms as anyone with the time, resources and inclination to create content. Few tools are required beyond a smartphone or other connected device and access to the necessary bandwidth, and, on most platforms, there are few or no "gatekeeping" functions that limit participation.

When one talks about professional content creators, on the other hand, one comes up against the diversity inherent in this category. It includes individual artists, producers, and large corporations. It includes world-renowned stars and emerging artists. It also includes journalists and academics. It includes—particularly important in the Canadian context—linguistic minorities and Indigenous peoples. The power and remuneration situation will vary tremendously from creator to creator.

While content creators of all kinds may use platforms to reach an audience, professional creators also seek financial remuneration from, and to maintain some measure of control over, their creative work. This may involve moving beyond more straightforward notions of "creator" to that of "creative entrepreneur", with its requisite skill sets.

Professional creators continue to play an important role in society, and participants upheld the value and contribution of artists—and underscored that they deserve to be remunerated fairly for their creative work. In Canada and in many other countries, governments at all levels create interventions to foster and promote the creative work of artists and cultural industries. It was noted that the 2005 UNESCO Convention on the Protection and Promotion of the Diversity of Cultural Expressions affirms the strong relationship between cultural diversity and sustainable development. There is a direct link between an artist's ability to achieve local success and their contribution to a global diversity of cultural expressions.

Participants highlighted the tremendous opportunities afforded to professional content creators by online platforms, in terms of reaching a public. There are also new opportunities for remuneration from platforms.

There are nonetheless significant challenges. Several participants highlighted the concerns of minority linguistic communities. It can be very challenging for creators working in such a context to gain sufficient remuneration through online platforms. Culture and language are intertwined, and diversity of content plays an important role in transmitting culture to youth.

Another broad challenge to creator remuneration is the widely perceived devaluing of content itself, which leads many users to expect content to be free or accessible for a very small subscription cost.

Copyright laws and regulations have played a historic role in protecting creators' work and enabling them to be remunerated—and this continues in the digital environment. However, creators do not always understand what copyright can or cannot do for them, and they often do not have the resources to follow up and enforce copyright on their own. This is also perceived as problematic by platforms seeking to enter into good faith negotiations with large numbers of rights holders. Participants agreed that more copyright literacy for creators is required, as well as improved transparency and dialogue mechanisms between rights holders and platforms.

As several participants noted, however, copyright is not the same as remuneration. This has always been the case. In the digital sphere, remuneration can only happen when a creator's work is used (viewed, listened to, or otherwise streamed or downloaded), and there is some scheme in place to generate revenues for that use. Platforms provide innovative approaches to remuneration and some can offer significant revenues to those posting content.

New technologies, such as blockchain and innovative uses of metadata, may offer new remuneration approaches to content creators. Several participants noted that content creators may need to be better equipped with knowledge and skills to derive remuneration from their content. This may be in the context of an online platform, but may also be in an offline context (for example, live performances or merchandise).

Copyright as a source of remuneration also has some inherent shortcomings:

  • In some industries, where content creation is complex and involves many actors, copyright may not be available to the artists or creative workers developing the content.
  • It also does not protect the collective cultural and intellectual property rights of Indigenous peoples. It can be very complex and difficult for individual creators to navigate.
  • Finally, copyright was born of another era and is challenged by the fast-moving digital environment; for instance, its reliance on the notice and takedown approach to copyright infringement on online platforms.

Several participants noted that copyright may need to be reconsidered—and may not be the solution to all remuneration issues. For example, government approaches to taxation or competition may complement copyright in helping ensure fair remuneration.

Many participants highlighted the immense power of platforms to set out the terms of participation and the remuneration scheme. There was concern that artists may not be getting their fair share in streaming services. Copyright collectives are long-standing ways through which creators work together to negotiate and manage payments, and they may have an important role to play moving forward. They already play a strong role with respect to music platforms. It was also noted that the situation varies from one field to another.

Avenues for actions and collaborations

Dr. Mazziotti articulated areas worthy of further exploration, such as:

  • Who is a creator in the online environment and how could we ensure that different categories of creators would be compensated for the exploitation and dissemination of their works?
  • Should collecting societies and others who license rights from content creators be required to disclose data that helps creators know whether the remuneration offered is fair?
  • How can the limits of copyright be addressed with respect to the levels of remuneration across value chains of content production?
  • Is a reconciliation possible between the corporate and cultural perspectives on copyright?

Greater transparency

With respect to potential actions or interventions that could be explored, participants identified some possibilities. For example, there is a need to make progress towards understanding the inner workings of online platforms. Neither researchers nor copyright collectives can access sufficient data, nor can these actors clearly understand the way that remuneration is allocated to creators. Having access to more data could be very helpful in better understanding the remuneration situation.

Lack of data also prevents content creators from understanding how the remuneration schemes function. That knowledge would help content creators to make the most of all the options available. The various remuneration schemes should be made more transparent and easily understandable to content creators.

Aspects of remuneration

Some participants wondered whether the presence of content on a platform, rather than its use, could be a basis for remuneration (i.e. models based on creation/production, rather than on consumption alone). The platforms often use a threshold based on a large number of units streamed or played before payments start being made to the content creator. Having part of the remuneration associated with presence on the platform would contribute to financial sustainability for some content creators. Others disputed these notions, expressing support for equality of opportunity but not necessarily equality of outcomes.

Some participants supported extending taxation to online platforms in order to provide funding to support the creation of content and the remuneration of creators. Some participants also suggested that financial support could help mitigate the risks associated with adopting new business models.

Training and awareness

Many participants raised the fact that the typical user of digital content expects content to be free. It seemed to many that raising public awareness on the costs related to professional content creation and taking measures to uphold the status of professional content creators would contribute to economic sustainability for content creators.

On the other hand, many content creators feel that they lack the skills needed to get the most out of the various opportunities that exist on digital platforms, notably the different aspects of content remuneration. Many considered it essential to provide ways for professional creators to build their skill sets, facilitate the sharing of best practices, and benefit from collective bargaining.

Remuneration and Economic Sustainability of Content Creators

Summary list of avenues for actions and collaborations

  • Explore how to compensate different categories of creators
  • Greater transparency with access to remuneration data to better understand various remuneration schemes
  • Enhanced knowledge and skills for creators
  • Facilitated sharing of best practices
  • Exploring different remuneration schemes (e.g. based on presence rather than consumption alone)
  • Taxation to fund creation of content and remuneration of creators
  • Other financial support to creators
  • Raise public awareness on the costs related to professional content creation

Algorithms and integrity of the digital public sphere

The third and final theme was the impact of algorithms and the integrity of the digital public sphere. At the outset of that session, Dr. Fenwick McKelvey and Robert Hunt of Concordia University summarized important elements of their paper titled Algorithmic Accountability and Digital Content Discovery. Algorithms suggest content to a given user based on selected criteria, for example previous content consumed by the user or content experienced by other users with a similar "profile". Algorithms can be seen as new intermediaries between the content creator and the user and they are perceived as having a significant impact on the discoverability of content as well as on the integrity of the digital public sphere.
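
As a simplified illustration of the kind of curation described in the paper, the sketch below suggests content to a user based on overlap with the consumption histories of other users with a similar "profile" (a minimal user-based collaborative filtering approach). All data and function names are invented; real platform systems are far more sophisticated and are not described at this level of detail in the paper.

```python
# Minimal sketch of user-based collaborative filtering: recommend content that
# similar users have consumed. Purely illustrative; real systems are far more complex.

from collections import Counter
from typing import Dict, List, Set

# Hypothetical consumption histories (user -> set of content IDs).
histories: Dict[str, Set[str]] = {
    "alice": {"doc_a", "song_b", "film_c"},
    "bob":   {"doc_a", "song_b", "film_d"},
    "chloe": {"film_c", "film_d", "doc_e"},
}

def jaccard(a: Set[str], b: Set[str]) -> float:
    """Similarity between two users based on the overlap of what they consumed."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user: str, k: int = 3) -> List[str]:
    """Suggest content the user has not seen, weighted by similar users' histories."""
    scores: Counter = Counter()
    for other, items in histories.items():
        if other == user:
            continue
        sim = jaccard(histories[user], items)
        for item in items - histories[user]:
            scores[item] += sim
    return [item for item, _ in scores.most_common(k)]

if __name__ == "__main__":
    print(recommend("alice"))  # ['film_d', 'doc_e']
```

Even this toy example makes the "filter bubble" concern visible: recommendations are drawn only from what similar users have already consumed, so content outside those shared histories never surfaces.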

Dr. Taylor Owen of McGill University provided an overview of his paper titled Six Observations on Securing the Integrity of the Digital Public Sphere, noting that algorithms are designed to maximize the time a user spends on a platform. This has led to an environment where recommended content is often polarizing or sensationalistic. In addition, algorithms have a tendency to recommend content closely aligned with an individual user's perspective and may reinforce or confirm existing biases.

The issues

Algorithms

An overarching issue about algorithms is the "black box" phenomenon and the fact that data on algorithms is proprietary, meaning that platforms own data collected on their users' behaviours. Given that this data is considered intellectual property and a key asset of the corporations' business model, it is not meant to be shared with others.

It is thus argued that it has become difficult to identify how to hold online platforms accountable for the impact of their algorithms. There is no public contribution or deliberation in how algorithms function and what results they achieve. Algorithms are optimized for certain results: maximize the time a user spends on a platform or expose him or her to as much advertising as possible for greater profitability. This optimization, while beneficial in many instances, has created externalities or negative consequences, not only for users but also for society in general.

Many participants raised an important paradox: the technological capacity to deal with these negative consequences resides with platforms that are not accountable to the public interest.

Participants nonetheless expressed the point of view that algorithms may be adjusted or modified to meet public policy goals or objectives such as the discoverability of local, regional or national content or the presentation of trusted and reliable information. As such, the issue is not the algorithms themselves but rather what we ask them to do.

Strong arguments were put forth from two opposite perspectives. On the one hand, private corporations need to be accountable and responsive to their shareholders. Profitability is the key metric used in that context. It was argued that a service that did not meet users' needs and interests would not achieve financial sustainability. On the other hand, negative consequences (terms like "harms" were also used) are externalized, meaning they do not show on the corporation's balance sheet but nevertheless negatively impact society as a whole. This led to exchanges on whether responses exist that would reduce democratic or societal harm without undermining the duty to shareholders.

Many people present at the meeting expressed skepticism that government interventions on these matters would be beneficial or relevant, with the possible exception of combating election interference. Some noted that algorithms do not generate fake news or disinformation even though they may contribute to its rapid and extensive dissemination. To these participants, the problem lies elsewhere, with the individuals or groups creating this disinformation.

Integrity of the digital public sphere

A second topic explored as part of this main theme was the integrity of the digital public sphere, which involves the need for a safe and open space for public discourse and the need for reliable information. Traditional intermediaries that were responsible for the mediation of information are being disrupted by the digital environment. Up to a few years ago, it seemed that platforms had achieved the feat of providing consumers across the world with easy access to information, often not mediated by traditional gatekeepers. This raised hopes of democratizing the production and dissemination of information or news. However, more recently, it has been shown on multiple occasions that this remarkable access to information has resulted in a series of negative consequences, for example, the spread of disinformation and concerns about the reliability of news and information, as well as toxic and fragmented public discourses whereby citizens are exposed to a narrow range of perspectives.

Avenues for actions and collaborations

Governance and accountability

Three elements of algorithmic accountability were suggested to participants by Dr. Fenwick McKelvey and Robert Hunt:

  • the setting of data standards and metrics related to audience engagement and measurement;
  • the adjustment of algorithms' optimization towards public policy purposes; and
  • the conduct of impact assessments on algorithms before they are deployed.

Participants highlighted on multiple occasions the challenges around governance of algorithms. Some supported impact assessments of algorithms where, at the design phase, information about the purpose, reach and potential impact of an algorithm would be made public to allow governments and the public to better understand what an algorithm does and the impacts it would have on users, including management of data about them. Any step in this direction would allow progress on the transparency and accountability principles that were raised multiple times over the course of the meeting.

Dr. Owen articulated a number of possible policy responses to integrity and information issues, which he separated into two categories. The first category involves significant impact, consensus and minimal risks:

  • increased advertising transparency;
  • stronger data privacy legislation;
  • identification of automated accounts (bots or other form of artificial intelligence impersonating humans);
  • modernization of tax and competition policies for the digital economy;
  • review of the role of public broadcasters such as the Canadian Broadcasting Corporation in relation to reliable information; and
  • digital literacy initiatives.

The second category of policy options would be more challenging as they involve less consensus and thornier discussions between different policy areas:

  • identifying approaches to speech moderation, for example how to moderate content from millions of users in real time;
  • re-examining liability, for example whether online platform companies should be held accountable for legal breaches committed by users while using their functionalities; and
  • providing support for professional journalism.

Some participants strongly articulated the need to bring back ethical dimensions to the conversation on governance and accountability of algorithms. Algorithms by themselves do not necessarily take into account matters of ethics in their design even though their operations often have ethical implications.

Adjusting algorithms

Giving users the power to adjust algorithms according to their preferences and concerns also seemed like a promising avenue to mitigate concerns raised by the "black box" phenomenon associated with algorithms. Users should have the choice to opt in, opt out, or adjust algorithmic content curation based on their priorities and preferences, whether related to diversity of content or other principles. Some participants highlighted the fact that some platforms already offer some choices and controls for users to be better informed about the impact of algorithms on the content they are exposed to.

Others raised issues stemming from the potential biases in the design and implementation of algorithms which result from the perceived limited diversity in the workforce and from the lack of training offered to that workforce to ensure a greater awareness of diversity issues. It was hoped that some flexibility could be achieved in the design of algorithms and how they use data. Others mentioned the need to "decolonize" how we organize and categorize data to better address systemic biases. Metadata used by algorithms is generated with sociological or political assumptions that derive from these systemic biases.

Digital and media literacy

The majority of participants mentioned the need for greater support for digital and media literacy, with a special emphasis on algorithm literacy as relevant to the issues raised by algorithms. Citizens in the digital sphere have a role and a responsibility to use their critical judgment and to question the content they access. Many felt that users should be offered a "plain language" explanation of how algorithms curate content.

However, putting that burden solely on individuals seemed unreasonable to some, given the massive amount of information and skills a person would need to have to make this assessment, for example an extensive understanding of disinformation techniques, "deep fakes" methods and others.

Traditionally, governments and civil society have played a key role in media literacy initiatives, but exchanges centered on how they could work with online platforms for greater impact and relevance. Civil society organizations could receive government funding and collaborate with platforms on such initiatives. Again, many expressed concerns about the differences between platforms as each raises different issues in terms of digital literacy. Literacy initiatives will also need to take into account important differences among users, especially from an age perspective, as youth or senior users offer distinct challenges. Other participants wondered whether it would be realistic or feasible to have the platforms contribute financially towards digital and media literacy.

Data trusts

The data trust solution previously outlined in the "Creation, access and discoverability of local, regional and national content" section was again mentioned as one option for addressing some of the transparency and accountability challenges linked with algorithms. Some participants indicated that separating consumer data from recommendation systems data would allow some form of high-level sharing that would respect privacy requirements, while allowing other stakeholders to have access to the preferences of the audience and better understand how people consume content. Some participants wondered whether governments could require online platforms to share some data in the public interest. Having access to data would allow stakeholders to better assess the impact of algorithms and possibly develop initiatives to alleviate externalities resulting from the lack of diversity such as risks to social cohesion or democratic resilience.
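
To picture what such "high-level sharing" might look like, the following hypothetical sketch aggregates individual consumption records into counts by content language and origin, suppressing small groups, before anything leaves the trust. The records, field names and minimum-count threshold are invented; a real data trust would rely on a formally defined privacy regime rather than this simple rule.

```python
# Hypothetical sketch: a data trust aggregates individual consumption records
# into high-level statistics before sharing them, rather than exposing raw data.
# The k_min threshold is a stand-in for a real privacy-preserving regime.

from collections import Counter
from typing import Dict, List

# Invented individual-level records held by a platform (never shared directly).
raw_records: List[Dict[str, str]] = [
    {"user": "u1", "content_language": "fr", "origin": "CA"},
    {"user": "u2", "content_language": "fr", "origin": "CA"},
    {"user": "u3", "content_language": "en", "origin": "US"},
    {"user": "u4", "content_language": "en", "origin": "US"},
    {"user": "u5", "content_language": "cr", "origin": "CA"},  # e.g. Cree-language content
]

def aggregate_for_trust(records: List[Dict[str, str]], k_min: int = 2) -> Dict[str, int]:
    """Count consumption by (language, origin) and suppress groups below k_min."""
    counts = Counter((r["content_language"], r["origin"]) for r in records)
    return {f"{lang}/{origin}": n for (lang, origin), n in counts.items() if n >= k_min}

if __name__ == "__main__":
    # Only aggregates large enough to avoid identifying individuals are released.
    print(aggregate_for_trust(raw_records))  # {'fr/CA': 2, 'en/US': 2}
```

In this sketch, only the aggregated dictionary, not the raw records, would leave the trust, which is the kind of separation between consumer data and recommendation-system data that participants described.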

Standards and indicators

Regarding the integrity of the digital public sphere, notably the reliability of news and information, it was expressed that civil society has come up with constructive initiatives such as fact-checking or trust indicators that allow citizens to better distinguish between credible and less credible sources. Public broadcasters, support for high-quality journalism and the setting of journalistic standards were raised as important forms of government action that would be beneficial and relevant.

Algorithms and Integrity of the Digital Public Sphere

Summary list of avenues for actions and collaborations

  • Data standards and metrics related to audience engagement and measurement
  • Algorithms' optimization adjusted towards public policy purposes
  • Algorithm impact assessments (e.g. purpose, reach and potential impact made public at the design phase)
  • Increased advertising transparency
  • Identification of automated accounts
  • Modernized data privacy, tax and competition policies
  • Review role for public broadcasters (in relation to reliable information)
  • Digital and media literacy initiatives
  • Identify approaches to speech moderation
  • Review legal liability for online platforms
  • Support for professional journalism
  • Ethical dimensions of algorithm governance and accountability
  • Data trusts
  • Power to adjust algorithms according to a user's preferences and concerns
  • Diversity training for workforce
  • Development of trust indicators and standards related to information

Conclusion

The great majority of participants expressed satisfaction with the format of the meeting, involving the four categories of stakeholders. Many stated that it is not common to have in the same room representatives from online platforms, academia, civil society, and governments to discuss these questions. Having said that, it was noted that improvements could be made regarding the participation and representation of citizens' perspectives. Many participants were also pleased by the breadth and complexity of conversation on the three themes as the interrelationship among them was explored and clarified.

There was a clear interest from the majority of participants in remaining involved in next steps and continuing the discussion. There is a willingness to pursue avenues of collaboration and make progress towards addressing the various issues related to diversity of content. Participants look forward to further opportunities in that respect to keep the momentum gained at the meeting.

It was clear that a multi-stakeholder approach is well suited to tackling the issues at play, given the fact that power to understand and intervene is distributed across a wide range of players. However, a multi-stakeholder approach involves going beyond consultations in order to co-create policies or regulations with others; it involves a loss of control for governments and creates a more challenging policy process.

Participants expressed on multiple occasions that similar meetings would need to occur on a regular basis, preferably in a smaller format, to allow stakeholders to focus on more specific questions and to delve further into a narrower range of issues and solutions.

A survey was distributed to participants a few weeks after the meeting. Ninety-three percent of respondents were either satisfied or very satisfied with the meeting and 95% found the format of the meeting either excellent or very good, while 75% of participants felt they were better informed on issues related to diversity of content after the meeting and felt better equipped to begin new collaborations or initiatives on these issues. Finally, 98% of participants were interested in pursuing the discussion on diversity of content through a multi-stakeholder network.

The availability of data is essential if we are to make meaningful progress on issues related to diversity of content. Data is at the core of online platforms' business models and sharing this data presents significant issues in terms of privacy, innovation and competition. A balance will need to be struck between private-sector innovation and human dignity and the right to access one's own culture.

The behaviour and role of the consumer of content are not well known outside of online platforms themselves. How users interact with and are influenced by recommendation systems driven by algorithms is still unclear. In addition, the correlation between exposure to diverse content and democratic engagement still needs to be further researched. The academic sector would be uniquely positioned to collaborate with private sector online platforms, governments and civil society to further investigate these questions.

The statement by many participants that the era of self-regulation has come to an end does not translate into a clear way forward in terms of legislative or regulatory interventions.

The meeting was considered a success by participants. It validated the relevance and importance of the issue of diversity of content in the digital age. It is a complex subject that touches on a multitude of facets of the online world, but it is a topic rich in meaning that deserves close attention on an ongoing basis from all stakeholders involved, in a collaborative spirit.

Moving forward

An outcome of the meeting is the confirmation that there is not one global solution to the various issues at play. It seems more likely that a range of initiatives or vectors of action will need to be planned and implemented to address in a more precise and concrete manner elements of the issues identified over the course of the meeting.

These various initiatives could be spearheaded by different stakeholders and one of the main challenges will be to identify mechanisms of coordination and collaboration among the various actors involved. Stakeholders, whether individually, bilaterally or collectively, are invited to continue the reflection and identify how to adapt their activities and priorities according to their specific knowledge, expertise and interests along the main themes covered during the meeting.

The Department of Canadian Heritage, along with its partners, intends to pursue the engagement on diversity of content in the digital age. The Canadian government's Budget 2019 included a proposal to invest in an international initiative to design guiding principles on diversity of content as an element to strengthen citizen resilience to online disinformation.

These guiding principles will provide a framework for assessing and orienting policy actions and operational projects that would touch on matters related to diversity of content. The principles will represent a non-binding instrument providing a set of high-level elements that all categories of stakeholders should use in their planning and delivery of services, programs and policies.

These principles would represent an important step in ensuring that citizens have access and are exposed to diverse content that would allow them to experience various points of view and perspectives and to develop a greater understanding of and empathy for other peoples, cultures and communities.

Online platforms, civil society, academia and governments each have unique and valuable contributions to make towards protecting and promoting diversity of content. Meeting participants confirmed that ongoing collaboration between all stakeholders is the best way to move forward to ensure citizens have access and are exposed to a diversity of content online.

Annex

The five thought leadership papers prepared for the meeting can be found at the following location:
