
Testing Your Tech Smarts
By Amit Asaravala

Story location: http://www.wired.com/news/culture/0,1284,67156,00.html

02:00 AM Apr. 08, 2005 PT

Normally, the prospect of taking a two-and-a-half-hour test produced by the
Educational Testing Service, makers of the dreaded Scholastic Aptitude Test,
would have kept me awake for nights.

But the Information and Communication Technology literacy assessment is no
SAT. For one, it's not an admissions requirement for U.S. universities -- at
least, not yet. More important, it's not trying to determine whether I
remember how to calculate the surface area of a sphere, or whether I know
what "obsequious" means. (I don't.)

Rather, the ICT probes a skill I think I've nearly perfected over the past
decade -- the ability to make sense of the multiple streams of information
that our computers throw at us every day.

ETS designed the test for colleges and universities that want to find out
where their students stand when it comes to thinking critically and solving
problems in today's tech-saturated world, according to ETS spokesman Tom
Ewing. Schools administer the tests on campus and receive an aggregate
score.

"With today's technologies, you have to be able to extract information,
summarize it and communicate it," said Ewing. "The test is sort of measuring
your ability to blend these things."

Ewing said he doubts the ICT will ever play as large a role in college
admissions as the SAT, but he acknowledged that ETS plans to develop a
version of the test for businesses. Plus, the company will offer individual
test scores at some point in 2006. That leaves open the possibility that
companies and technical colleges might one day use the test to vet
employment candidates and college applicants.

Unlike many other standardized tests, the ICT assessment is not based on
multiple-choice questions or essays. Instead, the ICT asks test-takers to
complete tasks with the help of common internet technologies. Those who take
it may find themselves scanning e-mail messages for important attachments,
looking for documents using a search engine and picking the most
authoritative sources out of a set of search results, among other things.

If these tasks seem trivial, consider the number of people who still fall
for e-mail phishing scams. Or the people who buy products from a shady
website and then find that their identity has been stolen. Even good
reporters get duped once in a while, believing documents and websites to be
authoritative when they're actually elaborate hoaxes.

Those who are better at avoiding these mistakes -- those who efficiently
separate the wheat from the chaff -- will perform better in today's
information-based society, ETS claims; that's why it says a test like the
ICT is needed.

While the reasoning makes sense, I was skeptical that a test could actually
measure what seems to be a somewhat amorphous skill. So when I learned that
ETS was conducting a trial of the ICT between Jan. 24 and April 15 at select
colleges around the United States, I called to see if I could take the test.

After some pleading on my part, the company agreed to let me sit in on the
trial, provided that I not tell any of the other test-takers that I was a
reporter until after the test. This was to ensure that I didn't make the
volunteers nervous or otherwise affect their behavior, explained a spokesman
for the company. I also couldn't reprint entire questions from the test
here, lest I give future test-takers an unfair advantage.

I'll admit I felt nervous as I walked to the classroom at San Francisco
State University where the test was to be held. It had been years since I
was on a university campus, and memories of grueling exams -- some of which
I was unprepared for -- still haunted me.

Thankfully, these fears disappeared once I got to the classroom -- a
computer lab -- and remembered that I was going to be in my element. I had,
after all, been using computers since the early 1980s and the internet since
1994. Surely, a test full of web searches would be just like another day in
the office.

My assumptions were correct -- almost.

Because the test is web-based, it's served as a series of interactive forms
and applications inside an Internet Explorer window. This means that
test-takers are stuck in the simulated computer environment created by ETS.
Forget about using your own e-mail client or the Mozilla web browser. And
don't even think about reaching for any of the dozens of keyboard shortcuts
that you've grown accustomed to on your own computer.

While this didn't pose too much of a problem when I had to use ETS'
simulated search engine to find the most relevant documents on a given
topic, it without a doubt affected the speed at which I performed
e-mail-related tasks. For instance, when one question asked me to send a
message to a specific person, I wasted time searching for an address book in
ETS' simulated e-mail program.

Interfaces aside, I also had trouble with some of the tasks the test wanted
me to perform. For instance, one question asked me to select images and text
for a document aimed at adolescent schoolchildren. Given that I've forgotten
exactly what it was like to be that age -- mainly on purpose -- I had a hard
time figuring out what would fit best.

I wondered: Would a grade-school teacher have had an unfair advantage in
completing that task? And how much better would I have done if I could use
familiar software to perform familiar tasks, like researching facts for an
article?

I wasn't the only one who had these concerns. "The e-mail program was a
little complicated," said Ann Pattison, an SFSU junior who took the test
when I did. "A lot of students don't use e-mail programs that look like that
-- they use Hotmail or Gmail."

Pattison added that the wording of the questions was often convoluted or
geared toward business situations she wasn't familiar with. "I looked at
some of them for a long time, thinking, 'What are they talking about?'" said
Pattison. "I think they have to work on it."

ETS research scientist David Williamson confirmed that the company plans to
use feedback from test-takers like Pattison to refine the test before the
official release. That, after all, was the point of running the trial.

But he noted that the software interfaces were not likely to change to look
more like Outlook or any other program. In fact, they were purposely
designed to be vendor-neutral. This, he said, places the emphasis on the
task and not the technology.

"We went to great efforts to make it not like any commercial product," said
Williamson. "There are already plenty of commercial product certifications
out there that can measure how adept you are at using software. But what
we're trying to target is providing only the minimal software functionality
that's required to get the task done."

The tasks-not-technology approach is also reflected in the scoring of the
tests, which is completely automated, according to Williamson. The scoring
engine doesn't care where test-takers put the mouse cursor or how many times
they click on a button, he said. But it does build up points incrementally
based on the types of documents you view and the number of times you refine
your search and so on.

This means two test-takers could select the same document from a list of
search results, but be graded differently because of the way they got there.

"If two students both select the same resource to share to an audience,
they're equal in their ability to know when a resource is an appropriate
resource," said Williamson. "But if they take different paths, what that
implies is the more efficient person is better able to distinguish from the
summary paragraphs that the source is more likely to contain relevant info."
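The incremental, path-aware scoring Williamson describes could be sketched roughly as follows. This is a hypothetical illustration only; ETS has not published its actual scoring engine, so the point values, function name and inputs here are all invented for the sake of example:

```python
# Hypothetical sketch of path-aware incremental scoring, in the spirit of
# Williamson's description. All point values and parameters are invented;
# ETS has not disclosed how its real engine weights these signals.

def score_search_task(docs_viewed, relevant_docs, refinements,
                      final_choice, correct_choice):
    """Accumulate points from the path a test-taker took, not just the answer."""
    points = 0
    # Credit for each relevant document the test-taker actually opened.
    for doc in docs_viewed:
        if doc in relevant_docs:
            points += 2
    # Credit for refining the search query, capped so repetition can't farm points.
    points += min(refinements, 3)
    # The final selection is still worth the most.
    if final_choice == correct_choice:
        points += 10
    # An efficient path, with no detours into irrelevant documents, earns a bonus.
    detours = sum(1 for doc in docs_viewed if doc not in relevant_docs)
    if detours == 0:
        points += 2
    return points
```

Under a rule like this, two test-takers who end up selecting the same correct document can receive different scores: the one who opened only relevant sources along the way comes out ahead of the one who wandered through irrelevant results first.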

Given this, it's not likely I lost points for wasting time in my search for
that address book. But I probably did get docked for forgetting to send an
attachment with the e-mail as required by the test question.

Unfortunately, I couldn't see my test score because ETS doesn't yet offer
individual results for the ICT. I guess that means I can just tell my editor
that I got an A.

Whatever my score, the question remains: Does the ICT literacy assessment
really measure a person's ability to make critical judgments and solve
problems in today's tech-oriented world?

It's possible that no one will know for sure until researchers have had time
to follow test-takers to see whether high scorers really end up doing better
in school and in the work force. Of course, by then information and
communication technologies might look completely different than they do
today, and ETS may find itself designing a whole new test.

At the moment, that doesn't seem to bother the company.

"We're responding to a need that colleges have now," said Ewing. "We'll
adapt as the situation presents itself."
