The San Francisco Declaration: An Old Debate from the Latin American Context

In the past few weeks, new criticisms of the impact factor have been raised by research communities. What is surprising in this case is that these criticisms come from scholars in the misnamed "hard" sciences, and that the complaints they present are new neither in their meaning...


Authors:
López-López, Wilson; Pontificia Universidad Javeriana
Resource type:
Journal article
Publication date:
2013
Institution:
Pontificia Universidad Javeriana
Repository:
Repositorio Universidad Javeriana
Language:
spa
OAI Identifier:
oai:repository.javeriana.edu.co:10554/33369
Online access:
http://revistas.javeriana.edu.co/index.php/revPsycho/article/view/5998
http://hdl.handle.net/10554/33369
Rights
openAccess
License
Atribución-NoComercial-SinDerivadas 4.0 Internacional
dc.title.spa.fl_str_mv The San Francisco Declaration: An Old Debate from the Latin American Context
dc.title.english.eng.fl_str_mv Declaración de San Francisco: una vieja discusión desde el contexto latinoamericano
description In the past few weeks, new criticisms of the impact factor have been raised by research communities. What is surprising in this case is that these criticisms come from scholars in the misnamed "hard" sciences, and that the complaints they present are new neither in substance nor in formulation. We have long known that a quantitative indicator such as the Impact Factor is not only insufficient but also highly vulnerable, since the citations-to-articles ratio can be gamed: a journal with few articles but a well-controlled supply of citations, generated by groups interested in raising the journal's indicator, can inflate its impact, an increase I call a "bubble" effect. However, the creators of these indicators have already taken measures designed to prevent such manipulation, ranging from warning editors to creating more indicators and more diverse ways of measuring impact. It has also long been evident that scientometric indicators measure communication between academic peers, not the social appropriation of knowledge or the impact of professional training; different measures are probably needed for those contexts. Yet it has been academic communities themselves, especially those in the hard sciences and in the countries with the highest output, such as those now rediscovering these facts, that have legitimized these indicators as a criterion of quality. The problem is not the indicators per se. Rather, it is academic communities, operating from universities or other institutions, that have become the problem by giving indicators significant weight in both research assessment and resource allocation.
At least in our context, however, it is clear that the final decision on whether a researcher receives resources does not depend on the IF, but on a complex peer-review system that relativizes the weight of indicators. Nor can we ignore the role these indicators have been given by incentive systems within universities and other institutions, both for researchers and for research groups. These indicators cannot reflect the full spectrum of their efforts, nor the dynamics of knowledge-producing communities in the early stages of development. Nevertheless, we now have ample evidence that once communities are consolidated, these indicators provide information of certified quality. Most of these metrics offer several informative dimensions and are useful because they bring transparency to the processes that account for research activity; without them, we would have no other way to understand it. Moreover, open-access systems such as REDALYC and SciELO, distinguished projects committed to promoting open access, have been confronting the challenge of improving access for communities that simply cannot afford to pay for it. These initiatives have fought for quality and for the democratization of access to knowledge; REDALYC in particular has also proposed alternative indicators for the region's academic communities. Until now those voices had been ignored, and only today, when mainstream scholars raise their own, is a debate being heard that we have carried on for years in our region. Furthermore, at the global level the SCImago group has done outstanding work that complements the measurement of isolated indicators by developing multiple, complex measures of the production, impact and use of knowledge. I think we also need to ask ourselves what forces and interests are driving this discussion today.
Could it be that emergent communities are unleashing these declarations calling for a change in the rules? Is this rediscovery important now that these emergent academic communities have a voice and citations? In this sense, these rebellions must be taken with a pinch of salt: we have been reflecting on the subject for years, and it is no discovery for us. We have long discussed the need for other measures and have long been clear about the importance of the social impact of knowledge. Wilson López López, Editor
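The "bubble" effect the editorial describes follows directly from the arithmetic of the two-year impact factor (citations received in a year to items published in the previous two years, divided by the citable items in those years). A minimal sketch with hypothetical figures — the function and all numbers below are illustrative, not taken from the editorial — shows why a small journal is far more sensitive to a coordinated block of citations than a large one:

```python
def impact_factor(citations_to_prior_two_years: int, citable_items: int) -> float:
    """Two-year impact factor: citations in year Y to items published in
    years Y-1 and Y-2, divided by the citable items from those two years."""
    return citations_to_prior_two_years / citable_items

# A large journal: 400 citations to 200 citable items.
large = impact_factor(400, 200)            # 2.0

# A small journal: 30 citations to 20 items...
small_before = impact_factor(30, 20)       # 1.5

# ...plus a coordinated group adding 60 citations: the indicator triples,
# the "bubble" effect, while the same 60 citations would barely move
# the large journal's figure.
small_after = impact_factor(30 + 60, 20)   # 4.5

print(large, small_before, small_after)
```

The denominator is what makes the gaming cheap: the fewer the citable items, the larger the swing each orchestrated citation produces.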
dc.date.created.none.fl_str_mv 2013-08-07
dc.date.accessioned.none.fl_str_mv 2018-02-24T16:06:04Z
2020-04-15T18:30:29Z
dc.date.available.none.fl_str_mv 2018-02-24T16:06:04Z
2020-04-15T18:30:29Z
dc.type.coar.fl_str_mv http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.hasversion.none.fl_str_mv http://purl.org/coar/version/c_970fb48d4fbd8a85
dc.type.local.spa.fl_str_mv Artículo de revista
dc.type.coar.none.fl_str_mv http://purl.org/coar/resource_type/c_6501
dc.type.driver.none.fl_str_mv info:eu-repo/semantics/article
dc.identifier.none.fl_str_mv http://revistas.javeriana.edu.co/index.php/revPsycho/article/view/5998
dc.identifier.issn.none.fl_str_mv 2011-2777
1657-9267
dc.identifier.uri.none.fl_str_mv http://hdl.handle.net/10554/33369
dc.language.iso.none.fl_str_mv spa
dc.relation.uri.none.fl_str_mv http://revistas.javeriana.edu.co/index.php/revPsycho/article/view/5998/4871
dc.relation.citationissue.spa.fl_str_mv Universitas Psychologica; Vol. 12, Núm. 2 (2013); 323-326
dc.rights.licence.*.fl_str_mv Atribución-NoComercial-SinDerivadas 4.0 Internacional
dc.rights.accessrights.none.fl_str_mv info:eu-repo/semantics/openAccess
dc.rights.coar.spa.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.format.spa.fl_str_mv PDF
dc.format.mimetype.spa.fl_str_mv application/pdf
dc.publisher.spa.fl_str_mv Pontificia Universidad Javeriana
repository.name.fl_str_mv Repositorio Institucional - Pontificia Universidad Javeriana
repository.mail.fl_str_mv repositorio@javeriana.edu.co