Scientific production evaluation: the role of journals and scientific articles
Keywords:
scientific articles, scientific journals, scientific production evaluation

Abstract
This essay draws on an analysis of the relevant literature and on the author's experience in science policy evaluation processes, in the evaluation of stricto sensu graduate programs, and in editorial work for scientific journals published in Brazil and abroad. The evaluation of scientific production through scientific articles has two main aspects: the evaluation of the articles themselves and the evaluation of the journals in which they are published. To evaluate the articles themselves, three distinct and complementary strategies can be used: classification according to the formal characteristics of the article, counting the citations received, and qualitative assessment based on post-publication peer review. In evaluating journals as indirect indicators of production quality, different bibliometric indicators are used, either as weighting factors for article counts or as criteria for building classifications such as the CAPES Qualis journal ranking. As with any complex issue, there is no single way of carrying out the evaluation; the use of different strategies can offset the limitations of each one, giving evaluators a more comprehensive and multifaceted perspective.
License
Copyright (c) 2023 Perspectivas em Ciência da Informação
This work is licensed under a Creative Commons Attribution 4.0 International License.