Prof. Dr. Radu N. Catană
Dr. Adina Ponta
Center for Business Law & Information Technology
Faculty of Law, Babeș-Bolyai University, Cluj-Napoca
The rising number and influence of digital companies, especially social media platforms, has generated numerous legal controversies in recent years. As principles of fair digital markets become intertwined with European law, the most pressing question has been whether the ex-post application of EU competition law is effective enough to keep pace with Big Tech companies and fast-evolving digital markets. Aiming to extend the applicable instruments, the European Commission proposed in December 2020 two reforming instruments, the Digital Services Act (DSA) and the Digital Markets Act (DMA). One way to moderate the conduct of Big Tech is to identify how these companies can internalize the costs they impose on users, through surveillance, dependence and manipulation, by assigning them new social responsibilities. This approach creates a new class of fiduciary duties for those who collect, store and use data provided by online users. In this context, the compatibility of the new duties imposed on “information fiduciaries” with traditional corporate directors’ fiduciary duties and with legal provisions in other branches of law can be assessed.
The growth of Big Tech giants such as Google, Amazon, Facebook, Apple and Microsoft, and the integration of the services they offer into our daily lives, have raised various questions and generated many legal controversies. The fight against the anti-competitive behavior of Big Tech companies has kept both the European Commission and national competition authorities particularly busy in recent years. Although the Commission was initially passive in the face of these developments, its 2018 decision sanctioning Google in the Android case appears to have been an important starting point for the application of European competition rules to these entities.
A worrying aspect of Big Tech behavior is the ability of these companies to exploit user data in order to manipulate users’ preferences. User dependence and low standards of privacy and data protection generally reinforce these business models. Digital companies collect, store and use huge amounts of data, and in view of the continuous development of their surveillance and control capacities, they must also assume new legal responsibilities. These companies hold information about their users, which they may use in various ways unknown to those users, making the latter increasingly vulnerable. Although there is little transparency regarding the concrete activities of digital companies, users are expected to trust that these services will not betray or manipulate them for their own purposes.
A new concept of information fiduciaries proposes to alleviate this imbalance, distinguishing between “predatory” and “non-predatory” advertising in order to regulate the use of data. The model is motivated by the desire to create a role-based status that would impose on platforms the obligation to protect the people whom their business model uses as a resource, even against the platforms’ own tactics. In short, certain for-profit entities, such as Facebook, can resolve the inherent conflict between their business model and their role as information fiduciaries. Should such a fiduciary role be assumed by them or imposed on them, platforms would have to engage in content filtering, supplementing the protections and safeguards provided by consumer protection laws.
New European regulations in the field of Big Tech
Digital services include a wide range of online activities, from simple websites to internet infrastructure services and online platforms. The DSA aims to create a uniform EU-wide framework for managing illegal or potentially harmful online content, holding intermediaries accountable for third-party content, protecting users’ fundamental rights and resolving information asymmetries between online intermediaries and users. The main provisions of the DSA include a modernized liability regime for online intermediaries, new transparency obligations for online platforms and new obligations regarding online advertising.
In order to prevent anti-competitive behavior, as opposed to merely sanctioning this type of conduct, the DMA focuses mainly on platforms it considers to be “gatekeepers”. These are providers of core platform services with a significant impact on the internal market, including, but not limited to, online intermediation services (such as the App Store or Google Play), online search engines (such as Google), social networks (such as Facebook), video-sharing services (such as TikTok), number-independent interpersonal communication services (such as WhatsApp) and operating systems (such as iOS or Android). The term “gatekeeper” in the DMA differs from what the DSA calls “very large online platforms”. If a platform qualifies as a gatekeeper, it must comply with the obligations provided by Articles 5 and 6 of the DMA. Some of these obligations concern behaviors similar to those that have given rise to unfair competition cases involving Big Tech companies in recent years.[i]
New fiduciary duties for directors of Big Tech companies?
Although the jurisprudence on fiduciary duties in civil law jurisdictions is not as rich as in common law countries, fiduciary relations have a distinct legal tradition. Their specificity stems from moral foundations, and the term “fiduciary duties” covers what the Delaware Supreme Court called the “triad of fiduciary duties”: the duty of care, the duty of loyalty and, according to some authors, the duty of good faith. Doctrine and case law recognize these as the standards which shareholders and courts identify and analyze when assessing the conduct of corporate directors.
There are two main ways to moderate the conduct of Big Tech companies. The first model relies on competition law and the encouragement of free markets. Stronger enforcement of existing competition law rules and new policies could restructure digital advertising, breaking the current deadlock and generating revenue for a greater variety of social media companies. Such regulation could also lead to the break-up of large companies, as more social media companies translate into more platforms for innovation, different software functions and greater accessibility. Facebook and Google, for example, have often bought potential competitors before these could grow large enough to threaten them.
A second approach is to prompt social media companies to internalize the costs they impose on society through surveillance, dependency and manipulation, by giving them new social responsibilities. The short-term goal is to counter the most egregious examples of abusive behavior; in the long run, legal incentives can be created for these companies to develop a public-oriented culture and norms. In this regard, a return to the traditional legal institutions governing relations of dependency, namely fiduciary duties, has been proposed.
In 2016, an innovative approach was proposed at Yale University: to treat digital companies as “information fiduciaries” of their users. This model imposes fiduciary duties of care, loyalty, good faith and, implicitly, confidentiality on companies that hold a significant amount of data, especially social media companies and large online sellers and distributors.[ii] The approach is intended to ensure that tech giants do not violate the trust of their users after collecting and using their personal data. At a time when content moderation and data protection are important points on legislative agendas, academic discussion on both sides of the Atlantic has begun to flourish.
Fiduciary relations involve asymmetries of power, information and transparency. Customers need to trust their fiduciaries and expect that the latter will not betray them. Fiduciaries, in turn, must act in good faith, especially with regard to the information they obtain about users. If a fiduciary relationship involves the collection and use of meaningful information about the customer, we can speak of information fiduciaries. The contractual relationship is, in fact, merely one component of a relationship of trust which gives rise to fiduciary duties.
As information fiduciaries, social media companies would have certain obligations towards their users. The duty of care requires fiduciaries to secure customer data and not to disclose it to third parties who do not assume similar obligations; in other words, this duty travels with the data. The duty of loyalty requires fiduciaries not to obtain benefits at their users’ expense and to avoid conflicts of interest, that is, not to betray for their own benefit the trust placed in them; it is closely linked to an element of good faith, the duty of confidentiality. Fiduciary duties become relevant in situations where social media companies have strong market incentives not to protect their users, for example when they grant third parties access to data without adequate safeguards against end-user manipulation or when they conduct social science experiments on users.
One of the critiques of this theory is its alleged incompatibility with the Delaware principles of corporate governance: the model would impose new, previously non-existent fiduciary duties on Big Tech companies, while corporate law already imposes fiduciary duties on directors. These duties are owed to the shareholders (in common law countries) or to the company (in civil law countries), creating a “problem of conflicting fiduciary duties”. It is thus feared that, to the extent that the interests of shareholders and users diverge, directors would be placed in the difficult position of breaching their fiduciary duties (towards shareholders or the company) in order to fulfill the fiduciary duties owed to customers (whether buyers or users of digital platforms) under the proposed new body of rules. These issues are more nuanced in the continental legal system than in common law countries, because in the civil law system directors owe fiduciary duties to the company as a legal entity, not to its shareholders. Moreover, in the context of the new corporate governance culture promoted, among others, by European lawmakers through new regulations on sustainability, corporate social responsibility and respect for social values, directors’ fiduciary duties towards the company already imply compliance with new standards, through diligent and cumulative efforts to implement these European values. Furthermore, it could be argued that reforms promoting the best interests of customers, by reducing dependency and protecting privacy, would also promote the best interests of online platforms and their shareholders, as fostering trust can attract new users.
The question arose whether Big Tech companies have a genuine fiduciary relationship with their users and, if so, whether their business model is compatible with this legal relationship. Traditionally, fiduciary duties have involved a financial relationship between the parties, but this does not preclude their application to social media companies. Users generate or provide data whose value is substantial for those who purchase it for marketing purposes in order to target their ads. Trust is a significant factor in people’s willingness to share personal information on social networks, so the fiduciary model is not only compatible with this relationship but also urgently needed. Fiduciary rules typically operate where obligors have strong incentives to serve their own interests rather than those of their clients, and it is precisely this drive to serve one’s own interests to the detriment of a third party that creates the need for fiduciary protection.
Applying the duty of care in the social media context means accepting these platforms’ function of collecting and using data, that is, obligations of the same kind as those imposed by the DSA and the DMA. Social network users do not produce confidential information, but generate Big Data. The situation differs from the relationship between a lawyer and a client, where the former must make reasonable efforts to avoid accidental disclosure of confidential information. The duty of care understood in this context can thus be adapted to the Big Tech business model, requiring these companies to verify the third parties with which they interact. This model would encourage social media companies to engage in data sharing with users, while limiting the risk of data being compromised.
[i] https://www.lexology.com/library/detail.aspx?g=b56a8bdc-6cc4-4b98-bca8-08636a5ca233 [accessed 17.10.2021]
[ii] https://www.hoover.org/sites/default/files/research/docs/balkin_webreadypdf.pdf [accessed 17.10.2021]