A MEDIDA NAS PESQUISAS EM EDUCAÇÃO: EMPREGANDO O MODELO RASCH PARA ACESSAR E AVALIAR TRAÇOS LATENTES

MEASUREMENT IN EDUCATION RESEARCH: APPLYING THE RASCH MODEL TO ACCESS AND EVALUATE LATENT TRAITS

Authors

  • Amanda Amantes Ribeiro, Universidade Federal da Bahia (UFBA)
  • Geide Rosa Coelho, Universidade Federal do Espírito Santo (UFES)
  • Rafael Marinho, Instituto Federal de Minas Gerais (IFMG)

DOI:

https://doi.org/10.1590/1983-21172015170306

Keywords:

Rasch Model; Latent Traits; Qualitative-Quantitative Methodology.

Abstract

In this paper, we reflect on the potential of combining qualitative and quantitative methods to address specific research questions and to achieve greater internal consistency in educational research. Our focus is on the Rasch model as a tool for accessing latent traits: we present an example of how the model can be a promising way to work with measures and to answer questions of a causal nature, such as those concerned with identifying effects and changes.
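For context, the dichotomous Rasch model referred to in the abstract is conventionally written as the probability that person $n$ succeeds on item $i$, given the person's ability $\theta_n$ and the item's difficulty $\delta_i$. The formulation below is the standard one, stated here for the reader's convenience rather than quoted from the article:

$$P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{e^{\theta_n - \delta_i}}{1 + e^{\theta_n - \delta_i}}$$

Because ability and difficulty are expressed on the same logit scale, ordinal raw scores can be transformed into interval-level estimates of the latent trait, which is what makes the model suitable for measuring effects and change.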




Published

2015-10-08 (updated 2021-04-28)

Issue

Section

RELATOS DE PESQUISAS / RESEARCH REPORTS