Our new bite-sized module covers the ethical implications of digitalisation and of the development and implementation of solutions
by Leah Clarkson
Ethics refers to the moral principles that determine how we make decisions and lead our lives; these principles are influenced by culture, religion, upbringing and education, according to our Digital Ethics Professional Refresher.
Digital ethics is not a new concept but merely “the application of ethical standards in a digital environment”, says the module. Just as in wider society, certain standards of ethical behaviour are expected across the financial sector, such as the requirements of the CISI Code of Conduct.
The module looks at some of the impacts of digitalisation, individual and corporate digital ethics, and the legal and regulatory framework. It explains how the CISI Code of Conduct can be applied in a digital environment. Below is a preview of some of the points.
Impact of digitalisation
Environmental
The CISI Code of Conduct calls for all members to “respect others and the environment”, but the digital world, says the module, leaves a “massive physical imprint”. All digital products consume electricity, and the vast server farms used by global corporations such as Amazon consume “colossal quantities” of it. Meanwhile, dependence on batteries has led to unethical and hazardous mining practices that put human lives and the environment at risk.
Social
Another impact, relating to the ‘social’ element of environmental, social and governance, concerns digital communications, which can enable online bullying and cyber abuse. The CISI Code of Conduct requires members to “Speak up & listen up” and to be active in promoting a safe environment for all.
Artificial intelligence
As the module points out, AI is not intelligent “in and of itself”. Any real harm suffered because of AI (for example, from a self-driving car) raises questions of responsibility. However, it says, self-learning algorithms are able to improve their performance over time by “constantly evaluating outcomes to improve results”. In content recommendation, this often leads to ‘filter bubbles’, which display content based on what the user interacts with the most. This narrows the range of content to the extent that a reader may come to believe that their own views, even if extreme, represent the majority.
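To illustrate the mechanism, the sketch below (not taken from the module) simulates a hypothetical recommender that favours whichever topic a user has clicked most, with only a small chance of exploring anything else; the catalogue, topics and explore_rate parameter are all illustrative assumptions.

```python
import random
from collections import Counter

# Hypothetical content catalogue: each topic maps to a few items
# (topics and items are illustrative assumptions, not from the module).
CATALOGUE = {
    "markets": ["m1", "m2", "m3"],
    "politics": ["p1", "p2", "p3"],
    "sport": ["s1", "s2", "s3"],
}

def recommend(click_history, explore_rate=0.1):
    """Mostly recommend from the topic the user has clicked most;
    only occasionally (explore_rate) try a random topic."""
    if click_history and random.random() > explore_rate:
        topic = Counter(click_history).most_common(1)[0][0]
    else:
        topic = random.choice(list(CATALOGUE))
    return topic, random.choice(CATALOGUE[topic])

# Simulate a user who consumes everything they are shown.
random.seed(1)
history = []
for _ in range(50):
    topic, item = recommend(history)
    history.append(topic)

# After a few steps one topic dominates the feed: a simple 'filter bubble'.
print(Counter(history))
```

Because every click reinforces the most-clicked topic, the feedback loop quickly crowds out other content, which is the narrowing effect the module describes.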
Regulators have not yet caught up with these developments, and effective regulation is required. The module provides the example of a dynamic, self-learning credit-scoring app: with enough cases, the algorithm will detect a higher probability of female professionals suffering a loss of monthly income during maternity leave. While the data alone would suggest that declining credit in this case is rational, it is immoral (and in many cases unlawful).
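As a purely illustrative sketch (not part of the module), the code below builds a hypothetical synthetic data set in which income dips are recorded far more often for applicants on parental leave, then shows how a naive, purely data-driven rule reproduces the discriminatory outcome described above; all field names and rates are assumptions.

```python
import random

random.seed(0)

# Hypothetical synthetic applicant records: a monthly income dip is recorded
# far more often for applicants flagged as being on parental leave
# (all rates here are illustrative assumptions).
applicants = []
for _ in range(1000):
    on_leave = random.random() < 0.20
    income_dip = random.random() < (0.60 if on_leave else 0.10)
    applicants.append({"on_leave": on_leave, "income_dip": income_dip})

def dip_rate(group):
    """Share of a group with a recorded income dip."""
    return sum(a["income_dip"] for a in group) / len(group)

leave = [a for a in applicants if a["on_leave"]]
others = [a for a in applicants if not a["on_leave"]]
print(f"Income-dip rate, on leave:   {dip_rate(leave):.0%}")
print(f"Income-dip rate, all others: {dip_rate(others):.0%}")

# A purely data-driven rule 'learns' to decline the higher-risk group outright,
# using the protected characteristic as a proxy -- the outcome the module
# describes as immoral and, in many cases, unlawful.
def naive_decision(applicant):
    return "decline" if applicant["on_leave"] else "approve"

print(naive_decision({"on_leave": True, "income_dip": False}))   # decline
print(naive_decision({"on_leave": False, "income_dip": False}))  # approve
```

Simply removing the protected field may not fully solve the problem where other variables act as proxies for it, which is why, as the module notes, effective regulation rather than data alone has to set the boundary.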
Individual and corporate digital ethics
Digital self-determination is “a cornerstone” of digital ethics, meaning that individuals have a right to know how their data is collected by online tools such as website cookies and used by decision-making algorithms, which in turn support targeted marketing. Laws such as the EU’s and the UK’s General Data Protection Regulation (GDPR) aim to regulate these corporate activities and so protect users.
Nevertheless, questions remain even as regulations about data and profit are continually revisited. For example, although social media is free for the user, social media companies pursue an alternative source of revenue by tracking the habits and interests of their users to enable targeted marketing, and “most organisations now try to pool their available data and monetise them”. These data pools raise the question of data ownership. The module cites the example of data tracked through a mobile phone: who owns that data? The phone provider, the app provider, or the owner of the phone?
Customers now expect instant, targeted services that cannot be delivered unless companies harvest and store data. The ethical dilemma is that, with no multinational regulator and therefore no consistent regulatory approach to the use of customer data, it falls to individuals, through their own consumer behaviour, to decide how much personal data they are willing to give up in exchange for the convenience of such services. The module encourages customers to be proactive by “trying to understand their data as an asset and manage it accordingly.”