Debate: Should we measure the impact of social and human sciences?
New public management means we have to define and measure the impact of research. Should the humanities play along, and affirm their values through such evaluations?
Let there be no special treatment! The humanities and social sciences should be treated the same as the natural and technological sciences with which they are in competition, and with which they are inevitably compared – be it explicitly or implicitly. A refusal to measure their societal impact would create the (false) impression that they have none. But the opposite is very much the case. They have a considerable impact, because the successful societal implementation of knowledge presupposes a critical, concept-based process of assessment and interpretation. Fact-based, data-based evaluation can make this impact visible and prove it systematically.
The significance of data-based impact indicators has increased greatly in recent years in the fields of politics and administration, but also in companies and non-profit organisations – not least on account of societal demands for more transparency, and because of Article 170 of the 1999 Federal Constitution, which mandates the evaluation of the effectiveness of public measures. For this reason, the Swiss State Secretariat for Education, Research and Innovation (SERI) regularly carries out data-based impact analyses of measures taken in the fields of science and research – such as on the societal impact of the national research programmes run by the Swiss National Science Foundation. At the same time, data-based impact analyses are supported by new, comprehensive data sources, indicators and methods (e.g., big data).
The broader public, along with stakeholders in society, is an important target group for the humanities and social sciences. One of our tasks is to provide the public with orientation knowledge – in other words, with a critical, in-depth understanding of developments in our societal, cultural, economic and political contexts. The literature shows that researchers in these disciplines generally regard their influence on society as greater than do researchers in the natural and technological sciences. Precisely because of this, it is all the more important to measure that impact and to substantiate it with facts and data.
Current impact indicators are often based on simple counting procedures that cannot capture the complexity of the phenomenon to be measured – in this case, the societal impact of scholarship. Yet developing differentiated measurement procedures is one of the core competences of the humanities and social sciences, especially for complex phenomena such as cultural identity, social integration, innovation and, indeed, societal impact itself. To this end, we first need to clarify the fundamental concepts. Only in this manner can we distinguish transparently between the aspects of societal impact that are measurable and those that are not. Such an explicit concept of impact would define what concrete influence is to be achieved in which societal target group. This would provide the basis for reliable, precise impact indicators that would enable us to demonstrate the undeniable impact of the humanities and social sciences in a fact-based, data-based manner.
Christian Suter runs the Sociological Institute of the University of Neuchâtel. He is a member of the Scientific Board of the Swiss Centre of Expertise in the Social Sciences (FORS), and from 2001 to 2010 was a member of the National Research Council of the Swiss National Science Foundation (SNSF).
The idea that we should measure the social impact of research in the social and human sciences (SHS) is understandable. Quantification is central to policy-making, despite its well-documented distortions, simplifications and biases. If the SHS want to increase their accountability, visibility and participation in public debate, so the argument goes, then they must submit to quantification. However, this view is based on a three-fold error and should be rejected.
First, we simply do not know how to quantify “societal impact” in a meaningful way. The modes of interaction between the SHS and society are too varied to be reduced to standardised indicators. A six-year research programme (“Research performances in the humanities and social sciences”), financed by Swissuniversities, recently concluded that the scientific impact of SHS can only be apprehended through discipline-specific, peer-determined standards in which quantification should play only a secondary role.
Secondly, even if we could rely on quantitative measurement, this would be a wrong-headed approach to SHS’s role in democratic society. Measuring “impact” presupposes that it is a force for good: the more people touched by our research, the merrier. This mass-marketing logic prevents us from raising more important questions about the kind of impact that SHS should have: on whom and for whom. In sum, it risks replacing democratic debate about our research results with a popularity contest based on hits and likes.
Thirdly, even if we could and should measure societal impact, this would not address the underlying issue of how SHS should respond to legitimate calls for accountability in the use of public funds. If we favour narrow accounting techniques over a broad, political discussion of accountability, then we will have done nothing to allay the doubts of those who are already sceptical about the value of scientific research. The popular mistrust of science reflects fears about elitism, hyper-specialisation and exclusionary forms of expertise. Addressing these concerns through even more sophisticated and counterintuitive tools of quantification is neither good science nor good strategy.
Instead, we should promote the public evaluation of SHS research by rewarding how we communicate with civil society and decision-makers. Switzerland is well positioned here thanks to its robust system of tertiary education, which encourages engagement with local actors on issues of concern to them. Our first priority should be to track these exchanges through qualitative studies that highlight the many contributions made by SHS to public debate; quantitative assessment should play a secondary role. The future of the social and human sciences lies in increased democratisation, not technification.
Ellen Hertz is a professor of anthropology at the University of Neuchâtel. She is the founder of the Swiss Graduate Programme in Anthropology and a member of the Swiss Science Council (SSC).