Industry Group's Education Study Draws Conclusions and Critics

New York Times, August 30, 2000

By REBECCA S. WEINER

A new analysis of recent research into the use of technology in the classroom is being praised by some educators and policymakers, but critics are already questioning its conclusions and its credibility.

The report, paid for by a software industry group, suggests that teacher training, sophisticated software and increased attention to student behavior are the keys to improving the effectiveness of technology in the classroom.

"Technology is most effective when the right conditions are set," said Melinda George, education division director for the Software and Information Industry Association, which commissioned the analysis.

Still, some observers are not convinced that the increased use of computers in classrooms can produce dramatic increases in educational achievement.

Kirk Johnson, a policy analyst for the Washington-based Heritage Foundation, recently analyzed National Assessment of Educational Progress data to determine whether student computer use boosted reading scores. That study, Johnson said, suggests that while computers might be able to help students master basic skills, they may be far less effective in helping them with more advanced levels of learning.

"When it comes down to it, teachers and parents are still the most important factors in influencing children's education," he said.             "If computers are so great, where are the national gains in test scores?"

While George said it is still too early to suggest that computer use in schools could be a source of nationwide increases in test scores, she did point out that localized studies in West Virginia and Idaho have attributed testing gains in those states to an increased use of educational technology.

"One of the things about technology is it's an ongoing investment. You won't see changes immediately," she said. "If you're measuring the effectiveness of a tool, it's hard to pinpoint the effectiveness of one element."

The analysis - which examined the results of recent research on how technology is being used in schools - concluded that the effectiveness of technology depends on several factors, including the goals of the instruction, the design of the software, the implementation of the technology by educators and the characteristics of the students involved.

The SIIA study concluded that students whose teachers have completed 10 or more hours of technology training perform better than those whose teachers received five hours or less. Federal and state policymakers have been pushing for more technology-targeted teacher training as a means of ensuring that investments in computer hardware and software are put to use.

While questioning the assumption that computer usage can lead to increases in test scores, Johnson said there is clearly a role for computers in education.

"Virtually everyone agrees that there should be computer literacy taught because of the true power of Internet technology for research purposes," he said. "I don't think we should pack up all of the computers and send them back to Dell and Gateway. It should be a tool, not a crutch."

William Rukeyser, coordinator of the California-based Learning in the Real World, a nonprofit organization that has often questioned the value of technology in schools, was more skeptical of the SIIA study, primarily because of its source.

"It is incumbent on educators and policy leaders to adopt a caveat emptor attitude," he said. "The bottom line on education technology is that the jury is still out and more objective, arm's length research is needed."

George said it is fair to question the software industry group's role in promoting the study, but stressed that the report is an analysis of independent research on the subject.

 "The research we have cited in this report is independent studies," she said. "We're just compiling them."

Linda Roberts, director of the Education Department's Office of Educational Technology, said the compilation of existing research should be helpful to educators.

  "While yes, this is funded through the Software Information and Industry Association, the studies they are examining are studies being conducted by those in the field," she said. "It's a real contribution."

The Education Department is working to gauge the effectiveness of computers in schools by requiring technology grant recipients to track their programs' success. It also is working with the National Science Foundation to fund basic research in learning and is hosting a fall conference of researchers and educators to study the issue.

"We're doing evaluations of what's happening in the field, we're focused on evaluating the programs we fund and we're focused on what we need to do in the long term," Roberts said.

While the SIIA study adds to the current body of research, Roberts said it is made up mostly of "after-the-fact" research, which makes it difficult to tease out which educational improvements can be tied to technology.