Assessment in the Classroom

This was an interactive session on theory, methods, and empirical findings for research on and evaluation of classroom assessment. The aspects of assessment validity discussed were those especially relevant to assessment used for diagnostic and formative purposes and to the integration of instruction and assessment. A theoretical framework for classroom assessment and its relation to external assessments was presented, followed by a panel discussion in which PIs with assessment-related projects described how the framework applies to their own work. A key question was how these approaches to classroom assessment differ from traditional notions of validity as applied to large-scale assessments, and how they add value to the evaluation of classroom assessment and its integration with instruction and learning. All participants were given an opportunity to react and to relate the ideas to their own research. Presentations included:

  • Lou DiBello and Jim Pellegrino, A Framework for Considering the Validity of Assessments Intended for Use in the Classroom, with examples from Embedded Assessments and Concept Inventories
  • Daniel Hickey, Participatory Assessment: Rethinking Formative & Summative Assessment Functions
  • Maria Araceli Ruiz-Primo, Issues in Developing and Evaluating Instructionally Sensitive Assessments
  • Mike Timms and Edys Quellmalz, Foundations of 21st Century Science Assessments: Establishing validity for simulation-based assessments
  • Ravit Duncan, The Devil's in the Data: Validating a Genetics Learning Progression
  • Jim Minstrell (Moderator/Discussant), The Diagnoser Project: Using Online Tools to Support Assessment for Learning

About the Speakers

Lou DiBello is a Research Professor and Associate Director of the Learning Sciences Research Institute at the University of Illinois at Chicago. DiBello's research interests include applied psychometrics and educational measurement, with a focus on the development of statistical models, methods, and software for psychometrically based diagnostic assessment and their applications in formative and summative assessment settings. His pioneering research and development work with colleagues has led to the Fusion Model and related software, which provide psychometric modeling and analytic capabilities for diagnostic assessment. His professional experience includes twelve years in research and development in the testing industry, including at Educational Testing Service, where he served as a research director. He co-leads the Informative Assessment Initiative and is a PI or co-PI on several assessment research and evaluation projects funded by NSF and IES.

Jim Pellegrino is Liberal Arts and Sciences Distinguished Professor and Distinguished Professor of Education at the University of Illinois at Chicago, where he also serves as Co-director of the interdisciplinary Learning Sciences Research Institute. Dr. Pellegrino's research and development interests focus on children's and adults' thinking and learning and on the implications of cognitive research and theory for assessment and instructional practice. Much of his current work analyzes complex learning and instructional environments, including those incorporating powerful information technology tools, with the goal of better understanding the nature of student learning and the conditions that enhance deep understanding. Dr. Pellegrino has headed several National Academy of Sciences/National Research Council study committees: he chaired the Study Committee for the Evaluation of the National and State Assessments of Educational Progress, co-chaired the NRC/NAS Study Committee on Learning Research and Educational Practice, and co-chaired the NRC/NAS Study Committee on the Foundations of Assessment, which issued the report Knowing What Students Know: The Science and Design of Educational Assessment.

Daniel Hickey is an Associate Professor in the Learning Sciences program at Indiana University Bloomington. Dr. Hickey completed his Ph.D. in psychology at Vanderbilt University and a two-year postdoctoral fellowship at the Center for Performance Assessment at Educational Testing Service. He studies participatory approaches to assessment, feedback, evaluation, and motivation, with a particular focus on new models of evidential and consequential validity. He has directed projects in this area funded by the National Science Foundation, NASA, and the MacArthur Foundation, and has published practical and theoretical papers in leading journals.

Maria Araceli Ruiz-Primo is an Associate Professor at the School of Education and Human Development, University of Colorado Denver, where she is Director of the Research Center and of the Laboratory for Educational Assessment, Research, and Innovation (LEARN). She specializes in educational assessment. Her research focuses on the development and technical evaluation of innovative science learning assessment tools, including performance tasks, concept maps, and student science products, and on the development of a conceptual framework of academic achievement. She has conducted research on the instructional sensitivity of assessments and their proximity to the enacted curriculum, and has also worked on the evaluation of curriculum implementation and formative assessment practices in the classroom. She participated in the development of the science teacher certification assessment of the National Board for Professional Teaching Standards and developed an approach to evaluating teacher enhancement programs for elementary science teachers. She has conducted research on educational evaluation for over ten years. She is the first author of the student guide Statistical Reasoning for the Behavioral Sciences and of book chapters and published papers in the field of science assessment.

Mike Timms serves as Associate Director of WestEd's Mathematics, Science & Technology Program, which is committed to increasing mathematics, science, and technology literacy among the nation's youth. Timms received his Ph.D. in education from the University of California, Berkeley, a B.A. in geography from Portsmouth University, and a certificate in public administration from East Ham College. In collaboration with the Program Director, Timms oversees the program's services, materials, and strategies that support teachers' professional growth in content knowledge, effective teaching practices, appropriate assessment, and evaluation. He specializes in assessment development for e-learning and in the evaluation of educational technology, and his research interest is the application of educational measurement in intelligent learning systems. He serves as Co-PI on several major projects in the SimScientists program, which focuses on the use of computer simulations for learning and assessment, and as Managing Director of the agency's Center for Assessment and Evaluation of Student Learning (CAESL), a project funded by the National Science Foundation to improve student learning and understanding in science through a focus on effective assessment. In 2006, Timms and his fellow National Assessment of Educational Progress (NAEP) Framework Team members received WestEd's Paul D. Hood Award for Distinguished Contribution to the Profession for their work developing the science assessment framework for the Nation's Report Card, which informs the public about the academic achievement of elementary and secondary students in the United States.

Edys Quellmalz is the Director of Technology-Enhanced Assessment and Learning Systems in WestEd's Mathematics, Science & Technology Program. Quellmalz leads SimScientists projects, funded by NSF and the U.S. Department of Education, on simulation-based science curricula and assessments for formative and summative purposes that can serve as components of balanced state science assessment systems. Dr. Quellmalz is recognized nationally and internationally as an expert in technology-supported assessment and has published her research widely. She co-directs the development of the framework and specifications for the 2012 Technological Literacy National Assessment of Educational Progress and served on the Steering Committee for the 2011 NAEP Writing Framework. She has consulted for numerous state, national, and international assessment programs. Previously, she was Associate Director of the Center for Technology and Learning at SRI International and its Director of Assessment Research and Design, served on the faculty of the Stanford School of Education, and was research faculty at the UCLA Graduate School of Education.

Ravit Duncan is an Assistant Professor of Science Education at Rutgers University, appointed jointly in the Graduate School of Education and the Department of Ecology, Evolution, and Natural Resources. She earned an M.S. in biology from the University of Illinois at Chicago and a B.Sc. in biology from Hebrew University in Jerusalem. Dr. Duncan's research focuses on student cognition, curriculum development, and teacher education. She is the recipient of two early-career awards: one from the National Academy of Education/Spencer Foundation to study a learning progression in genetics, and one from the Knowles Science Teaching Foundation to study the development of preservice teachers' pedagogical content knowledge. With Cindy Hmelo-Silver, Dr. Duncan was guest editor of a special issue on learning progressions that recently appeared in the Journal of Research in Science Teaching. In the context of her REESE project, she organized, with Joe Krajcik (Co-PI), a workshop on the assessment of learning progressions.

Jim Minstrell is a Senior Research Scientist and co-founder of FACET Innovations. He received his B.A. in mathematics education from the University of Washington, his master's in science education from the University of Pennsylvania, and his Ph.D. in Curriculum and Instruction/Science Education from the University of Washington. He spent 30 years teaching mathematics, physics, and integrated science and mathematics at the high school level. About ten years into his teaching career, Jim became aware of "misconceptions" research and began studying the learning and teaching of his own students and those of his teacher colleagues. The negative connotation of the term "misconceptions," and the realization that students' understandings contain strengths as well as problematic thinking, prompted Jim and colleagues to coin the term "facets of student thinking." Since retiring from the classroom, Jim has remained very active in STEM education. He has been a PI or co-PI of various research and development projects related to assessment, classroom practice, professional development, and the Diagnoser project (see www.Diagnoser.com), and he serves as an external evaluator and/or advisor on several other STEM projects. Jim's ongoing interest in formative assessment and the promotion of deep conceptual understanding and reasoning in learners informs all of his work.