The San Francisco Declaration: An Old Debate from the Latin American Context


Authors:
López-López, Wilson; Pontificia Universidad Javeriana
Resource type:
article
Publication date:
2013
Institution:
Pontificia Universidad Javeriana
Repository:
Repositorio Universidad Javeriana
Language:
spa
OAI Identifier:
oai:repository.javeriana.edu.co:10554/33369
Online access:
http://revistas.javeriana.edu.co/index.php/revPsycho/article/view/5998
http://hdl.handle.net/10554/33369
Rights
openAccess
License
Atribución-NoComercial-SinDerivadas 4.0 Internacional
dc.title.none.fl_str_mv The San Francisco Declaration: An Old Debate from the Latin American Context
Declaración de San Francisco: una vieja discusión desde el contexto latinoamericano
dc.creator.none.fl_str_mv López-López, Wilson; Pontificia Universidad Javeriana
description In the past few weeks, new criticisms of the impact factor have emerged from research communities. What is surprising in this case is that the criticisms come from scholars in the misnamed "hard" sciences, and that they present a set of complaints that are new neither in their meaning nor in their formulation. We have long known that a quantitative indicator such as the Impact Factor is not only insufficient but also very vulnerable: because it is a ratio of citations to articles, a journal with few articles and a well-controlled quantity of citations, generated by groups interested in raising the journal's indicator, can indeed climb in the ranking, an increase I call a "bubble" effect. However, the creators of these indicators have already taken measures to prevent such manipulation, ranging from warning editors to creating more indicators and more diverse ways of measuring impact. It has also long been evident that scientometric indicators measure communication between academic peers; they do not measure the social appropriation of knowledge or the impact of professional training, and different measures are probably needed for those contexts. Yet it is the academic communities themselves, especially those in the hard sciences and in the countries with the highest output, the very communities now rediscovering these facts, that have played the legitimizing role for these indicators as a criterion of quality. The problem is not the indicators per se. Rather, it is the academic communities which, within universities and the organizations that allocate research funding, have given the indicators significant weight both in research assessment and in resource allocation. At least in our context, it is clear that the final decision on whether a researcher receives resources does not depend on the IF but on a complex peer review system that relativizes the weight of the indicators. Nor can we ignore the role that incentive systems within universities and institutions have assigned to these indicators, for researchers as well as for research groups. Such indicators cannot reflect the whole spectrum of their efforts or the dynamics of knowledge-producing communities in the early stages of development. Nevertheless, we now have enough evidence that once communities are consolidated, these indicators do provide information of certified quality. That is why most of these measures offer several informative dimensions and are useful: they give transparency to the processes that account for research activity, which we would otherwise have no way to understand. Moreover, open-access systems such as REDALYC and SciELO, distinguished projects committed to promoting open access, have been facing the challenge of improving access for communities that simply cannot afford to pay for knowledge. For more than ten years these initiatives have been fighting for quality and for the democratisation of access to knowledge, and REDALYC in particular has also proposed alternative indicators for the region's academic communities. Until now these voices had been ignored, and it is only today, when mainstream scholars raise their own, that something we have been debating for years in our region is heard again.
Furthermore, at the global level the SCImago group has done outstanding work that complements the measurement of isolated indicators by developing multiple, complex measures of the production, impact and use of knowledge. I think we also need to ask ourselves what forces and interests lie behind this discussion today. Could it be the emergent communities that are now driving these declarations calling for a change in the rules of the game? Is this rediscovery so important now that these emergent academic communities have a voice and citations? In this sense, these rebellions must be taken with a pinch of salt: we have been reflecting on this subject for years, and it is not a discovery for us. We have long discussed the need for other measures and, of course, we have been clear about the importance of the social impact of knowledge. Wilson López López, Editor
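To make the citations-to-articles ratio the editorial refers to concrete, the standard two-year impact factor can be written as below; the numbers in the accompanying example are hypothetical and serve only to illustrate the "bubble" effect described above.
\[
\mathrm{IF}_{y} = \frac{C_{y}}{P_{y-1} + P_{y-2}}
\]
Here $C_{y}$ is the number of citations received in year $y$ by items the journal published in the two preceding years, and $P_{y-1}$, $P_{y-2}$ are the counts of citable items published in those years. With hypothetical figures, a journal that published only 20 citable items over two years reaches $\mathrm{IF} = 60/20 = 3.0$ with just 60 coordinated citations, whereas a journal with 400 items would need 1,200 citations for the same score, which is why a small, motivated citing group can inflate the indicator.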
dc.date.none.fl_str_mv 2013-08-07
2018-02-24T16:06:04Z
2018-02-24T16:06:04Z
2020-04-15T18:30:29Z
2020-04-15T18:30:29Z
dc.type.none.fl_str_mv http://purl.org/coar/version/c_970fb48d4fbd8a85
Journal article
http://purl.org/coar/resource_type/c_6501
info:eu-repo/semantics/article
info:eu-repo/semantics/publishedVersion
dc.identifier.none.fl_str_mv http://revistas.javeriana.edu.co/index.php/revPsycho/article/view/5998
2011-2777
1657-9267
http://hdl.handle.net/10554/33369
dc.language.none.fl_str_mv spa
dc.relation.none.fl_str_mv http://revistas.javeriana.edu.co/index.php/revPsycho/article/view/5998/4871
Universitas Psychologica; Vol. 12, Núm. 2 (2013); 323-326
dc.rights.none.fl_str_mv Atribución-NoComercial-SinDerivadas 4.0 Internacional
info:eu-repo/semantics/openAccess
http://purl.org/coar/access_right/c_abf2
dc.format.none.fl_str_mv PDF
application/pdf
dc.publisher.none.fl_str_mv Pontificia Universidad Javeriana
dc.source.none.fl_str_mv reponame:Repositorio Universidad Javeriana
instname:Pontificia Universidad Javeriana
instacron:Pontificia Universidad Javeriana