If you or someone in your life is getting ready to take a standardized test that includes an essay, the way that test is scored may soon be antiquated. Instead of having a set of human eyes comb over thousands upon thousands of essays every year, it could soon be that the heart and soul you pour onto that perfectly printed essay paper will be fed into and scored by a machine in a matter of seconds (or even less).
An organization called Open Education Solutions took 16,000 essays released by six states and used automated scoring software to produce a quantifiable grade for this very qualitative testing material. It was described as the “first comprehensive, multivendor trial” of the computerized scoring solutions currently available. According to the results, computers tend to pick up on the same strengths and weaknesses in an essay that human graders do.
Study co-author Mark Shermis said, “The results demonstrated that overall, automated essay scoring was capable of producing scores similar to human scores for extended-response writing items with equal performance for both source-based and traditional writing genre.” Do you like the idea of a uniform, unbiased system for scoring essays on standardized tests, or would you rather have someone who can truly pick up on nuance and other qualitative components take on that task? If you went with the latter, your opinion may very well be moot.
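To make the “scores similar to human scores” claim concrete: studies of automated scoring typically measure agreement between a machine's scores and a human rater's scores on the same essays. The article doesn't say which metric this trial used, so the sketch below is a hypothetical illustration using quadratic weighted kappa, a common agreement statistic for ordinal scores; the function name and all score data are invented for this example.

```python
# Hypothetical sketch: quadratic weighted kappa (QWK), one common way to
# compare automated essay scores against human scores on the same essays.
# All names and data here are invented; the study's actual metric may differ.

def quadratic_weighted_kappa(human, machine, min_score, max_score):
    """Agreement between two raters on an ordinal scale; 1.0 = perfect."""
    n = max_score - min_score + 1
    # Observed confusion matrix: how often each (human, machine) pair occurs.
    observed = [[0] * n for _ in range(n)]
    for h, m in zip(human, machine):
        observed[h - min_score][m - min_score] += 1
    total = len(human)
    # Marginal histograms, used to build the chance-agreement baseline.
    hist_h = [sum(row) for row in observed]
    hist_m = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            # Quadratic penalty: disagreeing by 2 points costs 4x as much as by 1.
            w = ((i - j) ** 2) / ((n - 1) ** 2)
            expected = hist_h[i] * hist_m[j] / total
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

# Invented example: eight essays scored on a 1-6 scale by a human and a machine.
human_scores = [4, 3, 5, 2, 4, 6, 3, 5]
machine_scores = [4, 3, 4, 2, 5, 6, 3, 5]
kappa = quadratic_weighted_kappa(human_scores, machine_scores, 1, 6)
```

A kappa near 1.0 would indicate the machine scores essays almost exactly as the human did, while a value near 0 would mean agreement no better than chance.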
“Open Education Solutions study finds computers grade essays like humans” is written by Mark Raby and originally posted on SlashGear.
© 2005 - 2012, SlashGear. All rights reserved.