Sciencewise evaluates all its activities - both the individual public dialogue projects for which it provides funding and the programme overall - for the following reasons:
• Improving the effectiveness of future practice and policy around public dialogue, by developing and sharing knowledge and evidence of 'what works' from project evaluations
• Demonstrating the value of public dialogue to encourage its wider use, by providing evidence of the impacts and benefits
• Increasing the transparency and accountability of the Sciencewise programme, by openly reporting what is being done and what is being achieved
• Improving the effectiveness and impacts of the Sciencewise programme in meeting its overall objective, by measuring progress against agreed metrics of success.
How Sciencewise approaches evaluation, both in commissioning and overseeing the independent evaluations of all public dialogue projects it funds and in evaluating the Sciencewise programme itself, is described here.
There are five main elements to Sciencewise work on evaluation:
• Individual public dialogue project evaluations. Sciencewise provides detailed guidance on the key questions and principles for project evaluations, and publishes all the resulting evaluation reports. Sciencewise also provides a list of evaluators with relevant experience; the list is for information only and does not imply recommendation.
• Following up longer-term project impacts. We monitor and follow up each public dialogue project supported by Sciencewise to explore the longer-term impacts of the dialogue on policy and on the organisation that ran it.
• Sciencewise programme evaluation. The final report of the independent evaluation of the Sciencewise programme from 2012 to 2015, by Risk and Policy Analysts (RPA), was published in March 2015. It builds on the Theory of Change process undertaken in 2013 to support Sciencewise strategic planning in 2014-2015 and to provide a framework for future evaluation. Earlier programme evaluations fed into these recent activities.
• Quality in public dialogue. Sciencewise has recently developed a framework for assessing the quality of public dialogue, based on earlier evaluations and on additional research and consultation. The resulting working paper provides a set of key questions to guide future evaluations of public dialogue and to support high-quality design and delivery of dialogue in future.
• Learning from practice. A series of internal and external guidance notes has been produced, drawing on lessons identified through formal project and programme evaluations. In addition, specific research and development has been undertaken on the costs and benefits of public dialogue, resulting in two reports: the recent Valuing Dialogue: economic benefits and social impacts, by Robin Clarke (2015), and the earlier Evidence Counts: understanding the value of public dialogue, by Diane Warburton (2010). Sciencewise has also collaborated with a number of academic research projects on learning from practice, including a PhD on organisational learning in and around the Sciencewise programme in 2013. A short report back to Sciencewise on this research was published in July 2015, to coincide with publication of the full PhD thesis.