The first attempt globally to comprehensively capture the socio-economic impact of research across all disciplines was undertaken for the Australian Research Quality Framework (RQF), using a case study approach. By allowing impact to be placed in context, case studies answer the 'so what?' question that can result from quantitative data analyses, but is there a risk that the full picture may not be presented in order to show impact in a positive light? In the UK, the Russell Group universities responded to the REF consultation by recommending that no time lag be placed on the delivery of impact from a piece of research, citing examples such as the development of cardiovascular disease treatments, which take between 10 and 25 years from research to impact (Russell Group 2009). As a result, numerous and widely varying models and frameworks for assessing impact exist. The exploitation of research to provide impact occurs through a complex variety of processes, individuals, and organizations; attributing the contribution made by a specific individual, piece of research, funding, strategy, or organization to an impact is therefore not straightforward. There is a distinction between academic impact, understood as the intellectual contribution to one's field of study within academia, and external socio-economic impact beyond academia. Evidence of academic impact may be derived through various bibliometric methods, one example of which is the h-index, which incorporates factors such as the number of publications and citations. Although metrics can provide evidence of quantitative changes or impacts arising from our research, they cannot adequately evidence the qualitative impacts that take place and hence are not suitable for all of the impact we will encounter. In undertaking excellent research, we anticipate that great things will come, and as such one of the fundamental reasons for undertaking research is that we will generate and transform knowledge that will benefit society as a whole. The Payback Framework enables health and medical research and impact to be linked and the process by which impact occurs to be traced. In the Brunel model, depth refers to the degree to which the research has influenced or caused change, whereas spread refers to the extent to which the change has occurred and influenced end users. Despite the concerns raised, the broader socio-economic impacts of research will be included, and will count for 20% of the overall research assessment, as part of the REF in 2014. The time lag between research and impact also varies enormously: the development of a spin-out can take place in a very short period, whereas it took around 30 years from the discovery of DNA before technology was developed to enable DNA fingerprinting.
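As a concrete illustration of how the h-index combines publication and citation counts, the short sketch below computes it for a hypothetical publication record. The citation figures are invented for illustration and the function name is our own, not part of any bibliometric toolkit.

```python
def h_index(citations):
    """Return the largest h such that at least h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have at least `rank` citations
        else:
            break
    return h

# Hypothetical record of eight papers: four of them have at least four citations each,
# so the h-index is 4.
print(h_index([25, 8, 5, 4, 3, 3, 1, 0]))  # prints 4
```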
From the outset, we note that the understanding of the term 'impact' differs between users and audiences. Capturing knowledge exchange events would greatly assist the linking of research with impact. To be considered for inclusion within the REF, impact must be underpinned by research that took place between 1 January 1993 and 31 December 2013, with the impact occurring during an assessment window from 1 January 2008 to 31 July 2013. Thalidomide, notorious for the harm it caused when prescribed as a treatment for morning sickness, has since been found to have beneficial effects in the treatment of certain types of cancer. Johnston (1995) notes that by developing relationships between researchers and industry, new research strategies can be developed. What indicators, evidence, and impacts need to be captured within developing systems? Impact has become the term of choice in the UK for research influence beyond academia. There are areas of basic research where the impacts are so far removed from the research, or are so impractical to demonstrate, that it might be prudent to accept the limitations of impact assessment and provide the potential for exclusion in appropriate circumstances. It is time-intensive to both assimilate and review case studies, and we therefore need to ensure that the resources required for this type of evaluation are justified by the knowledge gained. It is therefore in an institution's interest to have a process by which all the necessary information is captured, so that a story can be developed in the absence of a researcher who may have left the employment of the institution. It is acknowledged in the article by Mugabushaka and Papazoglou (2012) that it will take years to fully incorporate the impacts of ERC funding. The Social Return on Investment (SROI) guide (The SROI Network 2012) suggests that 'The language varies: "impact", "returns", "benefits", "value", but the questions around what sort of difference and how much of a difference we are making are the same'. Although the range of impacts derived from research in different disciplines is likely to vary, one might question whether it makes sense to compare impacts within disciplines when the range of impact can vary enormously, for example, from business development to cultural change or saving lives. Impact is not static: it will develop and change over time, and this development may be an increase or decrease in the current degree of impact. In the UK, more sophisticated assessments of impact incorporating wider socio-economic benefits were first investigated within the fields of biomedical and health sciences (Grant 2006), an area of research that wanted to be able to justify the significant investment it received.
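To make the SROI approach concrete, the ratio at its core can be written as below. The figures in the example are entirely hypothetical, and the expression is deliberately simplified: a full SROI analysis also discounts future value and adjusts for deadweight, displacement, and attribution.

```latex
\mathrm{SROI\ ratio} = \frac{\text{value of social outcomes created}}{\text{value of inputs invested}}
\quad\text{e.g.}\quad \frac{\pounds 350{,}000}{\pounds 100{,}000} = 3.5
```

A ratio of 3.5 would be read as roughly £3.50 of social value created for every £1 invested.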
Aspects of impact, such as the value of intellectual property, are currently recorded by universities in the UK through their Higher Education Business and Community Interaction Survey return to the Higher Education Statistics Agency; however, as with other public and charitable sector organizations, showcasing impact is an important part of attracting and retaining donors and support (Kelly and McNicoll 2011). Given that the type of impact we might expect varies according to research discipline, impact-specific challenges present us with the problem that an evaluation mechanism may not fairly compare impact between research disciplines. A collation of several indicators of impact may be enough to convince an assessor that an impact has taken place. Media coverage is a useful means of disseminating our research and ideas and may be considered, alongside other evidence, as contributing to or an indicator of impact. The ability to write a persuasive, well-evidenced case study may influence the assessment of impact. The transition to routine capture of impact data requires not only the development of tools and systems to help with implementation but also a cultural change, so that practices currently undertaken by a few become standard behaviour among researchers and universities. In the majority of cases, a number of types of evidence will be required to provide an overview of impact. Throughout history, the activities of a university have been to provide both education and research, but the fundamental purpose of a university was perhaps best described in the writings of mathematician and philosopher Alfred North Whitehead (1929). The understanding of the term 'impact' varies considerably, and as such the objectives of an impact assessment need to be thoroughly understood before evidence is collated.
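As an illustration of the kind of information such systems might need to capture and link, the sketch below defines a minimal, hypothetical data structure associating underpinning research with indicators, evidence, and a claimed impact. The field names and categories are our own illustrative assumptions, not a prescribed taxonomy or the schema of any existing system.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Evidence:
    """A single piece of evidence supporting a claimed impact (e.g. a report, testimonial, or dataset)."""
    description: str
    source: str          # who provided or holds the evidence
    date_collected: date

@dataclass
class ImpactRecord:
    """Links underpinning research to indicators and evidence of a claimed impact."""
    research_reference: str                 # e.g. a publication or grant identifier
    impact_type: str                        # e.g. "economic", "societal", "cultural", "policy"
    baseline: str                           # the state of affairs before the change occurred
    indicators: List[str] = field(default_factory=list)    # measurable signs that change has occurred
    evidence: List[Evidence] = field(default_factory=list)

# Hypothetical example: a policy impact underpinned by a single publication.
record = ImpactRecord(
    research_reference="publication-2010-001",
    impact_type="policy",
    baseline="No national guidance on the topic before 2011",
    indicators=["research cited in a government consultation document"],
    evidence=[Evidence("Consultation document citing the research",
                       "Government department", date(2012, 6, 1))],
)
print(record.impact_type, len(record.evidence))  # policy 1
```

Capturing records of this kind as interactions occur, rather than retrospectively, is what makes it possible to assemble a case study even after the original researcher has left the institution.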
As an assessment method, the case study has recognized benefits and drawbacks. Its benefits are that it allows evidence to be contextualized and a story to be told, that it enables assessment in the absence of quantitative data, and that it preserves a distinctive account or disciplinary perspective. Its drawbacks are that automated collation of evidence is difficult, that incorporating perspective can make a case study difficult to assess critically, and that the approach rewards those who can write well and/or can afford to pay for external input. In 2009-10, the REF team conducted a pilot study for the REF involving 29 institutions, which submitted case studies to one of five units of assessment (clinical medicine; physics; earth systems and environmental sciences; social work and social policy; and English language and literature) (REF2014 2010). The range and diversity of frameworks developed reflect the variation in the purpose of evaluation, including the stakeholders for whom the assessment takes place, along with the type of impact and evidence anticipated.
This distinction between academic and socio-economic impact is not so clear in impact assessments outside the UK, where academic outputs and socio-economic impacts are often viewed together to give an overall assessment of the value and change created through research. Evaluation of impact in terms of reach and significance allows all disciplines of research and types of impact to be assessed side by side (Scoble et al. 2010). In the RQF, researchers were asked to evidence the economic, societal, environmental, and cultural impact of their research within broad categories, which were then verified by an expert panel (Duryea et al. 2007). Baselines and controls need to be captured alongside change to demonstrate the degree of impact. Case studies are often written with a reader from a particular stakeholder group in mind and will present a view of impact from a particular perspective. A concern is that we will focus attention towards generating results that enable boxes to be ticked rather than delivering real value for money and innovative research. The Payback Framework is possibly the most widely used and adapted model for impact assessment (Wooding et al. 2005). Providing advice and guidance within specific disciplines is undoubtedly helpful. The difficulty then is how to determine what the contribution has been in the absence of adequate evidence, and how we ensure that research resulting in impacts that cannot be evidenced is still valued and supported. There has been a drive from the UK government, through the Higher Education Funding Council for England (HEFCE) and the Research Councils (HM Treasury 2004), to account for the spending of public money by demonstrating the value of research to tax payers, voters, and the public in terms of socio-economic benefits (European Science Foundation 2009), in effect justifying this expenditure (Davies, Nutley, and Walter 2005; Hanney and González-Block 2011). In viewing impact evaluations, it is important to consider not only who has evaluated the work but also the purpose of the evaluation, in order to determine the limits and relevance of an assessment exercise. A very different approach, known as Social Impact Assessment Methods for research and funding instruments through the study of Productive Interactions (SIAMPI), was developed from the Dutch project Evaluating Research in Context; its central theme is the capture of productive interactions between researchers and stakeholders by analysing the networks that evolve during research programmes (Spaapen and Drooge 2011; Spaapen et al. 2011). Capturing data, interactions, and indicators as they emerge increases the chance of capturing all relevant information, and tools that enable researchers to record much of this would be valuable. What emerged on testing the MICE taxonomy (Cooke and Nadim 2011), by mapping impacts from case studies, was that detailed categorization of impact proved too prescriptive.
These metrics may be used in the UK to understand the benefits of research within academia and are often incorporated into the broader perspective of impact seen internationally, for example, within Excellence in Research for Australia and Star Metrics in the USA, in which quantitative measures such as publications, citations, and research income are used to assess impact. In some cases, a specific definition of impact may be required: for example, the Research Excellence Framework (REF) Assessment framework and guidance on submissions (REF2014 2011b) defines impact as 'an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia'. RAND selected four frameworks to represent the international arena (Grant et al. 2009). Two areas of research impact, health and biomedical sciences and the social sciences, have received particular attention in the literature by comparison with, for example, the arts. Perhaps it is time for a generic guide based on types of impact rather than research discipline? The case study does present evidence from a particular perspective and may need to be adapted for use with different stakeholders. A petition organized by the University and College Union, calling for the assessment of impact to be withdrawn from the REF proposals, was signed by 17,570 academics (52,409 academics were returned to the 2008 Research Assessment Exercise), including Nobel laureates and Fellows of the Royal Society (University and College Union 2011). This work was supported by Jisc [DIINN10].