28-05-2020 · Sustainable investing – SI Opener

SI Opener: Protecting digital human rights in the post-Covid-19 era

Digital data has expanded beyond imagination. It can create some serious issues for privacy and human rights, especially when it comes to the sensitive health data now being used in the battle against Covid-19. The amount of data in the world will reach about 40 trillion gigabytes in 2020, or more than 5,000 gigabytes for every person on the planet.1 That’s five times the capacity of the average personal computer… and much of it is about YOU.


  • Masja Zandbergen-Albers - Head of Sustainability Integration

  • Daniëlle Essink-Zuiderwijk - Engagement Specialist

As with all rapidly developing trends, the lack of a decent regulatory framework leads to new risks and new opportunities for all the stakeholders involved. At Robeco, we recognized many years ago that data privacy posed a material business risk for internet and telecom companies. Some business models are built entirely around gathering, using or selling consumer data. Cybersecurity and the social risks of artificial intelligence (AI) are therefore further issues we focus our attention on when it comes to digital human rights.

Covid-19 response puts digital human rights to the test

In the Covid-19 crisis – as with many other material ESG topics, by the way – digital human rights are being put to the test. On the one hand, apps to track down infected people can save lives and help unlock societies, in turn helping the economy. On the other hand, if such tracking is not done carefully, people’s privacy is put at risk.

Some people might argue that privacy should be sacrificed for this purpose. However, Access Now, a leading not-for-profit organization in this space, claims that strong digital rights can actually help improve public health.2

We fully support that notion. If digital rights are not properly protected, people will refuse to voluntarily adopt apps that track down infected people, and such apps will then only be successful if they are made mandatory. In many countries this will not be accepted, and it would leave governments unable to successfully implement digital health tools.


Issues around digital rights

But there are also other, less obvious issues. Health data is one of the most sensitive types of data that exists, so data protection is paramount. Tracking health data can be necessary for authorities to respond to a fast-moving outbreak. However, mismanagement of this data can lead to mistrust and lower uptake of digital health tools.

Another relevant topic is surveillance. Governments may use this crisis to implement surveillance tools on a wide scale that could be considered controversial. Facial recognition, for example, is already playing a role in the surveillance, monitoring and control of people’s movements in the coronavirus outbreak. China is using it to track infected individuals and identify those not wearing masks.

In Moscow, Russian authorities are reportedly using surveillance cameras, facial recognition systems and geolocation to enforce the city’s quarantine regime and track infected individuals and their family members.3 Even though we understand that individualism is valued more in Western countries than elsewhere, we are of the opinion that these practices run the risk of severely violating the basic human right to privacy without a clear benefit to the people.

Censorship and disinformation

ICT companies face an increasing number of orders from governments around the world that seek to restrict access to services and to disrupt networks. The consequences of disruptions include restricting internationally recognized rights to free expression, preventing access to vital emergency, payment and health services, and disrupting contact with family members and friends.

In some cases, these mandates pose an additional risk of human rights breaches when they restrict the free flow of information in the run-up to elections, or are used to target particular regions, districts or ethnic groups. In this crisis we have seen authorities in China, Iran and even the US trying to control the information that was shared on social media by journalists and health officials.

On the other hand, we have seen a lot of misinformation being published, ranging from breath-holding tests to all kinds of medicines being hailed as the solution to this pandemic. In response, large platforms such as Facebook, Google and Twitter are proactively pointing consumers to reliable sources such as health authorities.

What can companies do?

In the absence of a good regulatory framework, we see digital human rights as presenting risks to the companies we invest in. Exposure to issues such as data privacy, cybersecurity or the social impacts of AI can seriously harm their business, so good management of these issues can set companies apart.

Therefore, in our fundamental investment processes, we systematically analyze how companies are managing these issues. To assess these risks, we examine not only the strength of the editorial and information security policies, but also the processes and outcomes when it comes to breaches and fines. Some companies are also more transparent on these outcomes than others.

Combining this analysis with other material issues such as corporate governance and human capital management, we assess the impact on the value drivers of the companies in these sectors. Data privacy, cybersecurity and social risks in artificial intelligence are often difficult to quantify in terms of revenue or cost drivers. So, based on our analysis, we mostly adjust the company’s cost of capital as an estimate of the reputational, legal and business risks involved.
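The effect of such a cost-of-capital adjustment can be illustrated with a toy discounted-cash-flow (DCF) calculation. The figures below – the flat cash flows, the 8% baseline and the one-percentage-point risk premium – are purely hypothetical for illustration, not Robeco’s actual model parameters:

```python
# Illustrative sketch: how a risk premium added to the cost of capital
# lowers a simple discounted-cash-flow valuation. All numbers are hypothetical.

def dcf_value(cash_flows, cost_of_capital):
    """Present value of a list of yearly cash flows, discounted at cost_of_capital."""
    return sum(cf / (1 + cost_of_capital) ** t
               for t, cf in enumerate(cash_flows, start=1))

cash_flows = [100.0] * 10           # ten years of flat cash flows
base_wacc = 0.08                    # baseline cost of capital
risk_premium = 0.01                 # uplift for unmanaged digital-rights risk

base_value = dcf_value(cash_flows, base_wacc)
adjusted_value = dcf_value(cash_flows, base_wacc + risk_premium)
haircut = 1 - adjusted_value / base_value

print(f"Base value: {base_value:.1f}")
print(f"Risk-adjusted value: {adjusted_value:.1f}")
print(f"Valuation haircut: {haircut:.1%}")
```

Even a modest one-point premium produces a valuation haircut of a few percent here, which is why a qualitative judgment on digital-rights risk can translate into a materially different fair value.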

The tip of the iceberg

In our investment analysis and engagement efforts, we focus on imminent risks or challenges, such as algorithm bias leading to discrimination or breaches of privacy. We also look at the effectiveness of corporate governance around digital human rights, the implications for human capital, and the opportunities that AI offers. All these issues are financially material to a company’s business.

We ask companies to build knowledge at board level, and to implement robust policies and processes that will respect human rights in this increasingly digitized world. We also ask them to be transparent if they encounter issues or breaches. And let’s be clear: an app that could help track down and contain Covid-19 is of course a good idea, but its success will depend on how well human rights are protected.

Robeco believes that sustainable investing is not black and white, nor is it about shortcuts or easy answers. We also believe that some widely accepted assumptions about sustainability might actually be wrong. Robeco therefore likes to use its leading position in the field of sustainable investing to educate people about the topic.

In our new series of SI Openers, we look at the eye-opening issues facing sustainable investing and human progress, beginning with the digital threat to privacy and human rights. With this series, Robeco would like to draw attention to the fact that sustainable investing is complex, and that you need substantial expertise and experience to be able to make the right decisions in investing.

SI Opener: Seeing is believing.


This document was prepared by Robeco Overseas Investment Fund Management (Shanghai) Co., Ltd. (“Robeco Shanghai”). Its content is for reference only and does not constitute a recommendation, professional opinion, offer, solicitation or invitation by Robeco Shanghai to any person to buy or sell any product. It should not be regarded as a recommendation to buy or sell any investment product or to adopt any investment strategy. Nothing in this document may be construed as legal, tax or investment advice, nor does it indicate that any investment or strategy is suitable for your individual circumstances or otherwise constitute a personal recommendation to you. The information and/or analysis contained herein has been prepared on the basis of information obtained from sources that Robeco Shanghai believes to be reliable. Robeco Shanghai makes no representation as to its accuracy, correctness, usefulness or completeness, and accepts no liability for any loss arising from the use of the information and/or analysis in this document. Neither Robeco Shanghai nor any of its affiliates, nor their directors, officers or employees, shall be responsible or liable to any person for any direct or indirect loss, damage or other consequence arising from reliance on the information contained herein. This document contains forward-looking statements and forecasts concerning future business, objectives, management discipline or other matters; such statements involve assumptions, risks and uncertainties and are based on information available as at the date this document was written. Accordingly, we cannot guarantee that these forward-looking scenarios will materialize, and actual outcomes may differ from the statements made here. We cannot guarantee that the statistical information in this document is accurate, appropriate and complete in any particular circumstance, nor that this information, or the assumptions on which it is based, reflects the market conditions or future performance that Robeco Shanghai may encounter. The information in this document is based on current market conditions, which are likely to change as a result of subsequent market events or for other reasons; the content of this document may therefore not reflect the latest situation. Robeco Shanghai is not responsible for updating this document or for correcting any inaccurate or omitted information in it.