Call for papers | Issue 4(2024): Ethical, Regulative and Legislative Perspectives on Emerging Technologies and Education

2024-02-16

[Call for papers – JEHE Issue 4(2024), May–June: publications@globethics.net] Note by the Managing Editor (July 2024): Issue No. 4 has been released!

Ethical considerations in emerging technologies for education entail clear positions on at least four different sets of concepts:

  1. Emerging technologies, such as artificial intelligence (AI), are transforming the landscape of education. However, their adoption raises ethical questions. One challenge probably lies in obtaining informed consent from students in the changing environment of the places where they study, in close interaction with AI-driven technological innovations.

  2. Regulations and legal perspectives bring both solutions and challenges.

  3. Education and science, as distinct from technology, may need to be redefined from a philosophical perspective.

  4. Information technologies open a new dimension of regulatory and legal questions, in relation to bringing more justice on a global scale.

Technology and science are two distinct fields, each entailing a different set of disciplines. These days, public media give us the impression that supercomputers, used to analyse tens of millions of risk scenarios, will resolve most of our concrete risks; but is it always useful to rely on raw computing power to maximise societal stability and prevent concrete harms?

Before we think about regulating emerging technologies, we might need to think about what our problems actually require. There is often a delicate balance: knowing the reality of the problem before we think of the solution, and checking whether we have mastered some physical, reality-based application of our problem-solving thinking. On the one hand there are rapid technological advancements; on the other, the scientific use of intelligence to solve concrete problems, which is not only a matter of technology but often a matter of defining the right pragmatic use of the sciences. The predictability of risks, for example, has always been a complex issue, and technologies, in the past as today, have posed challenges as much as they have brought possible solutions. There are many situations in which we can describe a set of phenomena but cannot predict them, like tossing a coin in the air and trying to guess which way it will land; but this little game has no consequences. In mathematical calculation, precision could be defined by recognising that our measurements have inherent uncertainty: rather than obsessing over infinite decimal places, we should focus on the practical precision needed for meaningful results. Poor statistical approaches can lead to overstating the importance of computer-based precision without having defined the right variables necessary to achieve the expected result.
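As a minimal worked illustration of this point about practical precision (the figures are hypothetical and chosen only for the example): if a quantity is measured as 12.3 ± 0.1, then a derived value such as 4 × 12.3 = 49.2 inherits an uncertainty of at least ±0.4; printing the result as 49.2000000 adds decimal places but no knowledge, and the honest statement remains 49.2 ± 0.4.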

Before building an aqueduct on an embankment, the Romans, who did not have fast computers, had a simple and effective way of finding out whether the embankment was solid: they observed whether it had functioned as a dyke and had not been submerged by floodwater for at least half a century. The assumption was that if an embankment had stayed dry, anything built on it had a real chance of lasting for hundreds of years, if not a millennium; yet the usual rule of defining a few tests to assess the risk (or the cost of a guarantee, or the cost of preventive maintenance) ought not to be forgotten. If a depot is placed under an arch of the aqueduct three hundred years after it was built [to standard on the embankment], this simple modification of the overall conditions could weaken the structure of the aqueduct.

In the case of education, as probably in engineering, health, business and most other major areas of human economic and social activity affected by emerging technologies, solutions are deployed to deal with risks, and similar reflections therefore apply, provided we know whether we are facing ordinary risks, extreme risks, or both.

Legal and regulatory challenges: soft law and timely legal responses are welcome, but crafting effective regulations is complex. Policymakers must consider complex statistical data before proposing a rule-based framework, yet the collection and interpretation of these data are not easy, and conclusions should not be transferred across very different situations. Uncertainties, and insufficiently critical analysis of conditions treated as independent when they are not, may occur and are a matter for expert analysis. Extreme phenomena may require redefining ordinary risks as extreme risks, and large international panels are often convened to deal with very large-scale issues, such as international natural disasters, war, or even the role of AI across continents when large conventions are built to mitigate poverty and provide more secure access to basic goods.
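As a minimal, hypothetical illustration of the independence pitfall (the probabilities are invented for the sake of the example): if two safeguards each fail with probability 0.01 and are assumed independent, the estimated probability of both failing together is 0.01 × 0.01 = 0.0001; but if both depend on a single common cause, say one shared power supply, the true joint failure probability can be close to 0.01, a hundred times higher than the model suggests.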

Privacy, data security, intellectual property, and accessibility add further difficulties, even though the promise is to better harness uncontrolled innovation when the competition for power turns against basic rights what was originally meant to provide adapted protection. Moreover, international collaboration becomes crucial, as emerging technologies transcend national boundaries; yet expertise, international representation, and comparison are distinct levels of comprehension, cooperation, and action, and none of them should be underestimated or reduced to a simple placeholder or empty shell. Legal frameworks ought to be adapted to the dynamic nature of technology while safeguarding ethical principles, even though principle-based solutions have lost some popularity in the mass media.

Social implications and the moral vacuum: on the one hand, emerging technologies impact society beyond their legal and ethical dimensions; on the other hand, we often feel that the emperor has no clothes and that promises are not fully kept. When released without adequate consideration, these technologies risk creating a moral, policy, and legal vacuum. Scholars emphasize the need for interdisciplinary dialogue involving educators, policymakers, technologists, and ethicists. Addressing social implications requires anticipating unintended consequences: it is often pointed out, for instance, that AI-driven personalized learning systems may perpetuate biases or infringe on student privacy. Balancing innovation with ethical responsibility is essential, and by fostering awareness, promoting transparency, and engaging stakeholders, we can navigate the complex interplay of ethics, regulations, and legal perspectives in the ever-evolving landscape of educational technologies. Is that all? It seems that, on the other side, transparency may also lead to all sorts of difficulties across the various national legal definitions of accountability, difficulties which should not deter SMEs, business leaders, or public services on empty, purely managerialist and legalist grounds.

ICTs also have both potential and limits in providing new ways of access to justice, for instance in current (hopefully soon) post-conflict situations. In summary, as emerging technologies continue to shape education, a holistic approach that integrates ethics, regulations, and legal considerations is vital but insufficient. By bringing all the different aspects of emerging technologies together, we can hope to begin to harness their potential while safeguarding individual rights, societal values, and educational equity. To realize this objective effectively, we need to deepen character education, probably add a pinch of scientific audacity, and apply this audacity as well to entering into dialogue with the other, as this last, purely moral dimension of social life remains central and an apodictic condition for success in living in this new world.

Short bibliography:

Banks, N. and Hulme, D. (2012). 'The role of NGOs and civil society in development and poverty reduction'. Brooks World Poverty Institute Working Paper, University of Manchester Global Development Institute. https://www.gdi.manchester.ac.uk/research/publications/gdi-working-papers/bwpi-wp17112/

Günther, Gotthard (2021). Das Bewußtsein der Maschinen. Eine Metaphysik der Kybernetik [The Consciousness of Machines: A Metaphysics of Cybernetics]. With an afterword by Peter Trawny. 4th edition, based on the 2nd, expanded edition of 1963. Klostermann Rote Reihe 133, p. 200. ISBN 978-3-465-04564-9.

Kendal, E. (2022). 'Ethical, Legal and Social Implications of Emerging Technology (ELSIET) Symposium'. Bioethical Inquiry 19, 363–370. https://doi.org/10.1007/s11673-022-10197-5

Mathematical Modelling Company, Corp. Tools for decision help since 1995. Risk analysis. https://scmsa.eu

Wiik, Astrid (2021). 'Friend or Foe? Examining the Potential and Pitfalls of ICTs in Improving Access to Justice in Post-Conflict Countries'. In Technology, Innovation and Access to Justice: Dialogues on the Future of Law, edited by Siddharth Peter de Souza and Maximilian Spohr, 208–226. Edinburgh University Press. http://www.jstor.org/stable/10.3366/j.ctv1c29sj0.21

Yuk Hui (2016). On the Existence of Digital Objects. University of Minnesota Press, 336 pp. ISBN 978-0-8166-9891-2.