As with all rapidly developing trends, the lack of a sound regulatory framework creates both new risks and new opportunities for the stakeholders involved. At Robeco, we recognized many years ago that data privacy posed a material business risk for internet and telecom companies: some business models are built entirely around gathering, using or selling consumer data. Cybersecurity and the social risks of artificial intelligence (AI) are further issues we focus on when it comes to digital human rights.
Covid-19 response puts digital human rights to the test
In the Covid-19 crisis – as with many other material ESG topics, by the way – digital human rights are being put to the test. On the one hand, apps to track down infected people can save lives and help unlock societies, in turn helping the economy. On the other hand, if tracking is not done carefully, people’s privacy is at risk.
Some people might argue that privacy should be sacrificed for this purpose. However, AccessNow, a leading not-for-profit organization in this space, claims that strong digital rights can actually help improve public health.2
We fully support that notion. If digital rights are not properly protected, people will refuse to adopt infection-tracking apps voluntarily, and the apps will then only succeed if they are made mandatory. In many countries this will not be accepted, leaving governments unable to implement digital health tools successfully.
Issues around digital rights
But there are also other, less obvious issues. Health data is one of the most sensitive types of data that exists, so data protection is paramount. Tracking health data can be necessary for authorities to respond to a fast-moving outbreak, but mismanagement of this data can lead to mistrust and lower uptake of digital health tools.
Another relevant topic is surveillance. Governments may use this crisis to implement surveillance tools on a wide scale that could be considered controversial. Facial recognition, for example, is already playing a role in the surveillance, monitoring and control of people’s movements in the coronavirus outbreak. China is using it to track infected individuals and identify those not wearing masks.
In Moscow, Russian authorities are reportedly using surveillance cameras, facial recognition systems and geolocation to enforce the city’s quarantine regime and to track infected individuals and their family members.3 Even though we understand that individualism is valued more in Western countries than elsewhere, we believe these practices risk severely violating the basic human right to privacy without a clear benefit to the people.
Censorship and disinformation
ICT companies face an increasing number of orders from governments around the world that seek to restrict access to services and to disrupt networks. The consequences of disruptions include restricting internationally recognized rights to free expression, preventing access to vital emergency, payment and health services, and disrupting contact with family members and friends.
In some cases, these mandates pose an additional risk of human rights breaches when they restrict the free flow of information in the run-up to elections, or are used to target particular regions, districts or ethnic groups. In this crisis we have seen authorities in China, Iran and even the US trying to control the information that was shared on social media by journalists and health officials.
On the other hand, we have seen a great deal of misinformation being published, ranging from breath-holding advice to all kinds of medicines being hailed as the solution to this pandemic. In response, large platforms such as Facebook, Google and Twitter are proactively pointing users to reliable sources such as health authorities.
What can companies do?
In the absence of a good regulatory framework, we see digital human rights as presenting risks to the companies we invest in. Exposure to issues such as data privacy, cybersecurity or the social impacts of AI can seriously harm their business, so good management of these issues can set companies apart.
Therefore, in our fundamental investment processes, we systematically analyze how companies are managing these issues. To assess these risks, we examine not only the strength of their editorial and information security policies, but also the processes and outcomes when it comes to breaches and fines. Some companies are more transparent about these outcomes than others.
Combining this analysis with other material issues such as corporate governance and human capital management, we assess the impact on the value drivers of the companies in these sectors. Data privacy, cybersecurity and the social risks of artificial intelligence are often difficult to quantify in terms of revenue or cost drivers. So, based on our analysis, we mostly adjust the company’s cost of capital as an estimate of its reputational, legal and business risks.
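To illustrate why such an adjustment matters for valuation, consider a purely stylized perpetuity-growth example. The cash flow, growth rate and discount rates below are hypothetical assumptions chosen for illustration only, not Robeco’s actual inputs or methodology.

% Hypothetical inputs: expected free cash flow of 100, long-term growth of 2%
% A one-percentage-point risk premium added to the cost of capital (8% to 9%)
\[
V_{\text{base}} = \frac{FCF_1}{r - g} = \frac{100}{0.08 - 0.02} \approx 1667,
\qquad
V_{\text{adjusted}} = \frac{100}{0.09 - 0.02} \approx 1429
\]

In this sketch, the higher discount rate cuts the estimated value by roughly 14% even though the cash flow forecast itself is unchanged – which is how a cost-of-capital adjustment can capture risks that are hard to map onto individual revenue or cost lines.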
The tip of the iceberg
In our investment analysis and engagement efforts, we focus on imminent risks and challenges, such as algorithmic bias leading to discrimination or breaches of privacy. We also look at the effectiveness of corporate governance around digital human rights, their implications for human capital, and the opportunities that AI offers. All these issues are financially material to a company’s business.
We ask companies to build knowledge at board level, and to implement robust policies and processes that respect human rights in this increasingly digitized world. We also ask them to be transparent if they encounter issues or breaches. And let’s be clear: an app that could help track down and contain Covid-19 is of course a good idea, but its success will depend on how well human rights are protected.
Robeco believes that sustainable investing is not black and white, nor is it about shortcuts or easy answers. We also believe that some widely accepted assumptions about sustainability might actually be wrong. Robeco therefore likes to use its leading position in the field of sustainable investing to educate people about the topic.
In our new series of SI Openers, we look at the eye-opening issues facing sustainable investing and human progress, beginning with the digital threat to privacy and human rights. With this series, Robeco would like to draw attention to the fact that sustainable investing is complex, and that substantial expertise and experience are needed to make the right investment decisions.
SI Opener: Seeing is believing.
Footnotes
1 Source: 25+ Impressive Big Data Statistics for 2022
2 Source: Protect digital rights, promote public health: toward a better coronavirus response
3 Source: AccessNow.org