Crime Prevention and Criminal Justice in an Artificial Intelligence-enabled Future
The world is undergoing a massive technological transformation that touches every level of public and private life. Developments in artificial intelligence (AI) are challenging traditional perspectives, boundaries and methods in virtually every sector, calling for new approaches, standards and metrics. The disruptive nature of these technologies is widely discussed, but much work remains to be done to advance understanding of this change and to prepare society for it, particularly from the perspective of policy and legislation: not only to safeguard human rights, but also to ensure that (inter)national governance frameworks remain relevant given the pace of technological innovation.
UNICRI, through its Centre for AI and Robotics in The Hague, invites interested stakeholders to contribute papers to a UNICRI Special Collection on Artificial Intelligence to be released in early 2020.
Empirical contributions dealing with real-world examples and current cases are encouraged. Strictly theoretical papers may also be accepted, but should remain within the bounds of reasonable speculation. Papers should be concise (no more than 4,000 words), practical, thought-provoking and accessible to readers from a variety of fields (policy makers, academia, the private sector, etc.). The thesis should reflect a multicultural perspective as far as possible and be internationally relevant.
Areas of inquiry should focus on AI and related technologies (in particular, but not limited to, computer vision, natural language processing [NLP], audio/visual surveillance, resource optimization, ubiquitous internet connectivity, the internet of things [IoT], advanced computation), as concerns the following:
- Risks, gaps and opportunities for crime prevention and criminal justice; use by law enforcement and the judiciary; and the risk and threat of malicious use by criminal and terrorist groups.
- The role in advancing progress toward Goal 16 of the United Nations 2030 Agenda for Sustainable Development on peace, justice and strong institutions; ethical, legal and social implications.
- How new technologies can be kept in check for compliance with human rights frameworks; how to avoid the creation of ‘police states’; personal data protection; accountability; transparency; explainability; trust; responsibility; legislation and its relevance.
- Global integration and collaboration between the public sector, industry, academia and other sectors; how the harmful effects of disruption can be addressed and mitigated; and other frontiers and opportunities in horizontal organization.
- Similar historical technological events (especially in terms of volatility, uncertainty, complexity and ambiguity [VUCA]), and lessons learned.
- Addressing unintended effects; the impact of massive unemployment on social order and stability (particularly, migration flows, crime rates, the obsolescence of borders); the digital divide; macro- and microsystems and their integration in a hyperconnected world.
- Abstracts should be submitted to email@example.com and firstname.lastname@example.org by 30 November 2019, with the subject line “CfP UNICRI AI”.
- Abstracts will be reviewed by UNICRI, and the authors of selected submissions will be contacted to develop full manuscripts by 31 December 2019.
- Abstracts and manuscripts will undergo double-blind review.
- Manuscripts will be reviewed by both UNICRI and a committee of independent external reviewers: eminent stakeholders from the public and private sectors identified and selected by UNICRI.
- Manuscripts must be original, unpublished and not under consideration elsewhere.
- First page: title, author name(s), institutional affiliation(s) and contact information.
- Second page: title and a brief (100-word max.) biography of the author(s).
- Third page: title, a 400-word abstract and 4 to 6 keywords (N.B. the author’s name and affiliation must not appear on this page).
For further information please contact: email@example.com