The demand for transparency, validation and explainability of automated advice systems is not new. Back in the 1980s, extensive debates were held between proponents of rule-based systems and those of systems based on statistical analysis, partly concerning which approach was more transparent and how each should be evaluated. More recently, Onora O'Neill's emphasis on demonstrating trustworthiness, and her idea of 'intelligent transparency', has focused attention on the ability of algorithms to, if required, show their workings.
In this talk, Professor Spiegelhalter will argue that we should ideally be able to check (a) the basis for the algorithm, (b) its past performance, (c) the reasoning behind its current claim, and (d) its uncertainty around its current claim, and (e) that these explanations should be open to different levels of expertise. These ideas will be illustrated by the Predict system for women choosing follow-up treatment after surgery for breast cancer, which has four levels of explanation of its conclusions.
David Spiegelhalter is Winton Professor for the Public Understanding of Risk at Cambridge University. He works to improve the way in which risk and statistical evidence are taught and discussed in society, and makes frequent media appearances. In 2017-2018 he was President of the Royal Statistical Society, and in 2011 he came 7th in an episode of Winter Wipeout.
About the Turing
The Alan Turing Institute, headquartered in the British Library, London, was created as the national institute for data science in 2015. In 2017, as a result of a government recommendation, we added artificial intelligence to our remit.
The Institute is named in honour of Alan Turing (23 June 1912 – 7 June 1954), whose pioneering work in theoretical and applied mathematics, engineering and computing laid the foundations of the fields of data science and artificial intelligence.
To learn more about what we do, watch our new video, 'What is The Alan Turing Institute?':
https://www.youtube.com/watch?v=IjS2sVPR2Zc