For the first time, students who took the National Assessment of Educational Progress test completed interactive computer simulations.
And as states move toward computer-based assessments for the Common Core State Standards, this test of more than 6,000 fourth, eighth and 12th graders is giving a sneak peek at what future assessments could look like.
The National Assessment of Educational Progress is the largest nationally representative, ongoing assessment of what U.S. students know and can do across major subject areas. The project is overseen by the head of the National Center for Education Statistics in the U.S. Education Department.
More than 360,000 students took the 2009 paper-and-pencil science assessment.
A separate sample of more than 6,000 students took the hands-on and interactive computer tasks assessment. These hands-on and interactive computer tasks measure what students know and can do in complex, real-world situations — something that simple drawings, images and text-based questions can't do.
"The computer tasks let us use some things that, although they are simulated, in some ways appear more authentic than trying to make a physical, hands-on task appropriate for the classroom 40-minute administration," said Jack Buckley, commissioner of the National Center for Education Statistics in the U.S. Education Department.
With the computer simulations, students design experiments on their own, change them and redo them. Then they collect and analyze data.
"This is a set of skills that in the real world is invaluable — and which before this we've never been able to know if students really could do this or not," said Alan Friedman, chair of the Assessment Development Committee of the National Assessment Governing Board, which sets the assessment's policy.
The results of this assessment were released on Tuesday, June 19, in "The Nation's Report Card: Hands-On and Interactive Computer Tasks from the 2009 Science Assessment."
For example, fourth-graders performed a 40-minute computer task with Mystery Plants. This task first asked how much sunlight plants need to grow well. Fifty-nine percent of fourth-graders answered correctly that different plants need different amounts of sunlight, demonstrating "complex" prior knowledge of the topic. They also were asked how much fertilizer plants need to grow well.
Then they did three experiments in a virtual greenhouse on what effect sunlight and fertilizer have on plants. The greenhouse had three areas: lots of sunlight, some sunlight and a little sunlight. The students had six trays of plants to experiment with.
The first two experiments dealt with sunlight's effect on plants. The third one asked students to discover how much fertilizer helped the plants grow best. The instructions told students to look at the leaves and flowers to see how well the plant grew.
After putting the plants in different areas of the greenhouse, the fourth-graders clicked "do experiment" to see how they grew. They watched the plants grow and could view a data table with the number of leaves, flowers and height of the plant.
After finishing each experiment, students drew conclusions and explained how they reached them. This proved the most difficult part.
While 93 percent of fourth-graders correctly identified that the plant in the first experiment loved sun, only 36 percent could explain their conclusion with evidence from their investigation.
A task for 12th-graders showed a similar trend. In the Energy Transfer task, students had 20 minutes to figure out whether copper or aluminum made a better surface for the bottom of a cooking pan.
The 12th-graders used a simulated calorimeter, an instrument that measures the thermal energy transferred between substances in contact. By changing the temperature and mass of the different metals, they could see how much thermal energy passed between the metal and the water.
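The energy balance behind a calorimeter reading can be sketched in a few lines. This is only an illustration of the underlying physics, not the NAEP simulation itself, whose internal model isn't public; the specific heats are standard textbook values, and the masses and temperatures in the example are made up.

```python
# Calorimetry sketch: a hot metal block placed in cool water exchanges heat
# until both reach one final temperature. Specific heats in J/(kg*K) are
# standard textbook values, not data from the NAEP task.
SPECIFIC_HEAT = {"copper": 385.0, "aluminum": 897.0, "water": 4186.0}

def equilibrium_temp(metal, m_metal, t_metal, m_water, t_water):
    """Final temperature when the metal and water reach thermal equilibrium.

    Energy conservation: heat lost by the metal equals heat gained by the
    water, i.e. m_m*c_m*(T_m - T_f) = m_w*c_w*(T_f - T_w); solve for T_f.
    """
    c_m = SPECIFIC_HEAT[metal]
    c_w = SPECIFIC_HEAT["water"]
    return (m_metal * c_m * t_metal + m_water * c_w * t_water) / (
        m_metal * c_m + m_water * c_w
    )

def heat_transferred(metal, m_metal, t_metal, m_water, t_water):
    """Thermal energy (joules) that flows from the hot metal into the water."""
    t_f = equilibrium_temp(metal, m_metal, t_metal, m_water, t_water)
    return m_water * SPECIFIC_HEAT["water"] * (t_f - t_water)

# Same mass and starting temperature for both metals: aluminum's higher
# specific heat means it stores and releases more heat per kilogram.
for metal in ("copper", "aluminum"):
    q = heat_transferred(metal, 0.5, 100.0, 1.0, 20.0)
    print(f"{metal}: {q:.0f} J transferred to the water")
```

Note that the pan question also hinges on thermal conductivity (how fast heat moves), which this energy balance alone doesn't capture; the sketch only shows the heat-lost-equals-heat-gained reasoning the task's questions probed.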
Fifty-five percent of the 12th-graders observed what happened and answered questions correctly. But just 27 percent could explain how the heat actually transferred from the warmer to cooler substance.
About 2,000 students in each grade level completed the hands-on and computer-based tasks. Friedman hopes that by 2014 the assessment will be administered entirely on computers from eighth grade up.
The U.S. Education Department and contractors tested the assessments for years and found no negative effect of computer-based delivery on students' results, Friedman said. In fact, students enjoyed working on the computers and took quickly to the assessment.
"They are colorful, they are animated, they're just fun to do as well as having a lot of serious ideas in them," Freedman said. "There's even a bit of whimsy and humor in them."
But it comes at a price: development costs for the computer-based assessments are dramatically higher, Friedman said. A written question is typically drafted by one person, commented on by others, then fine-tuned, tested and reviewed.
For the interactive computer simulations, developers must also consider whether different graphics, symbols and icons will mislead students. If the flowers in the fourth-grade experiment were bigger or smaller, would students be confused? And whenever someone requests a change, everyone needs to review the item again.
"A bunch of skills goes into creating these, which means it takes more time and it is more expensive," Freedman said. "But the bottom line is we learn so much more that we couldn't have learned from those paper-and-pencil tests."