In the Covid-19 crisis – as with many other material ESG topics, by the way – digital human rights are being put to the test. On the one hand, apps to track down infected people can save lives and help unlock societies, in turn helping the economy. On the other hand, if this tracking is not done carefully, people's privacy is at risk.
Some people might argue that privacy should be sacrificed for this purpose. However, AccessNow, a leading not-for-profit organization in this space, claims that strong digital rights can actually help improve public health.2
We fully support that notion. If digital rights are not properly protected, people will refuse to voluntarily adopt apps that track down infected people, and such apps will then only succeed if they are made mandatory. In many countries this will not be accepted, leaving governments unable to successfully implement digital health tools.
But there are also other, less obvious issues. Health data is among the most sensitive types of data that exist, and protecting it is paramount. Tracking health data can be necessary for authorities to respond to a fast-moving outbreak, but mismanagement of this data can breed mistrust and lower the uptake of digital health tools.
Another relevant topic is surveillance. Governments may use this crisis to implement surveillance tools on a wide scale that could be considered controversial. Facial recognition, for example, is already playing a role in the surveillance, monitoring and control of people’s movements in the coronavirus outbreak. China is using it to track infected individuals and identify those not wearing masks.
In Moscow, Russian authorities are reportedly using surveillance cameras, facial recognition systems and geolocation to enforce the quarantine regime and track infected individuals and their family members.3 Even though we understand that individualism is valued more highly in Western countries than elsewhere, we believe these practices risk severely violating the basic human right to privacy without a clear benefit to the people.
ICT companies face an increasing number of orders from governments around the world that seek to restrict access to services and to disrupt networks. The consequences of disruptions include restricting internationally recognized rights to free expression, preventing access to vital emergency, payment and health services, and disrupting contact with family members and friends.
In some cases, these mandates pose an additional risk of human rights breaches when they restrict the free flow of information in the run-up to elections, or are used to target particular regions, districts or ethnic groups. In this crisis we have seen authorities in China, Iran and even the US trying to control the information that was shared on social media by journalists and health officials.
On the other hand, we have seen a lot of misinformation being published, ranging from breath-holding 'tests' to all kinds of medicines hailed as cures for this pandemic. In response, large platforms such as Facebook, Google and Twitter are proactively pointing consumers to reliable sources such as health authorities.
In the absence of a good regulatory framework, we see digital human rights as presenting risks to the companies we invest in. Exposure to issues such as data privacy, cybersecurity or the social impacts of AI can seriously harm their business, so good management of these issues can set companies apart.
Therefore, in our fundamental investment processes, we systematically analyze how companies are managing these issues. To assess these risks, we examine not only the strength of the editorial and information security policies, but also the processes and outcomes when it comes to breaches and fines. Some companies are also more transparent on these outcomes than others.
Combining this analysis with other material issues such as corporate governance and human capital management, we assess the impact on the value drivers of the companies in these sectors. Data privacy, cybersecurity or social risks in artificial intelligence are often difficult to quantify in terms of revenue or cost drivers. So, we mostly adjust our estimate of the company's cost of capital based on our analysis, as a proxy for reputational, legal and business risks.
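To illustrate how a cost-of-capital adjustment flows through a valuation, here is a minimal sketch using a simple perpetuity-growth (Gordon-growth) model. All figures – the 8% base cost of capital, the 50-basis-point risk premium, the cash flow and growth rate – are hypothetical assumptions for illustration only, not actual parameters from our investment process.

```python
# Sketch: adding a risk premium to the cost of capital to reflect
# reputational, legal and business risks, and seeing its effect on
# a perpetuity-growth valuation. All numbers are illustrative.

def perpetuity_value(cash_flow: float, cost_of_capital: float, growth: float) -> float:
    """Gordon-growth value: CF / (r - g). Requires r > g."""
    assert cost_of_capital > growth, "discount rate must exceed growth rate"
    return cash_flow / (cost_of_capital - growth)

base_wacc = 0.08     # unadjusted cost of capital (hypothetical)
esg_premium = 0.005  # +50 bps for digital human rights risk (hypothetical)
cf, g = 100.0, 0.02  # next-year free cash flow and long-run growth

v_base = perpetuity_value(cf, base_wacc, g)                # 100 / 0.06
v_adj = perpetuity_value(cf, base_wacc + esg_premium, g)   # 100 / 0.065

print(f"base value: {v_base:.0f}, risk-adjusted: {v_adj:.0f}, "
      f"haircut: {1 - v_adj / v_base:.1%}")
```

Even a modest 50-basis-point premium produces a valuation haircut of roughly 8% in this setup, which shows why hard-to-quantify risks can still be material through the discount rate.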
In our investment analysis and engagement efforts, we focus on imminent risks or challenges, such as algorithmic bias leading to discrimination or breaches of privacy. We also look at the effectiveness of corporate governance around digital human rights, their implications for human capital, and the opportunities that AI offers. All these issues are financially material to a company's business.
We ask companies to build knowledge at board level, and to implement robust policies and processes that will respect human rights in this increasingly digitized world. We also ask them to be transparent if they encounter issues or breaches. And let's be clear: of course an app that could help track down and contain Covid-19 is a good idea, but its success will depend on how well human rights are protected.
1 Source: https://techjury.net/stats-about/big-data-statistics/#gref
2 Source: https://www.accessnow.org/protect-digital-rights-promote-public-health-towards-a-better-coronavirus-response/
3 Source: AccessNow.org