Only in America

Hussar

Legend
I strongly disagree, DA. All testing reveals is how well someone can perform in a completely unnatural situation that barely relates to anything you will encounter in the real world.

Heck, one only has to look at TOEIC standards, where second-language learners can score very highly on the test yet utterly lack basic fluency.
 

Ahnehnois

First Post
Standardized tests mostly just measure how good you are at taking tests.

Their validity arises mostly from the fact that to go down some professional paths, you're going to need to take a lot of them. They're self-justifying. People who ace the SAT may one day get the opportunity to take the MCAT/LSAT/etc., and then the boards/bar/etc. None of those tests correlate strongly with real world performance in the relevant field, but if you're not a good enough test taker you may not get the chance to do anything.

To me the rationale for all this testing is tenuous at best.
(Not sour grapes; I'm incredibly good at standardized tests myself).
 

Dannyalcatraz

Schmoderator
Staff member
Supporter
The quality of testing depends on how the test is constructed, true.

But again, how do you ascertain math proficiency without testing? How do you discern whether someone has read The Odyssey, understands Oil & Gas Law, or can do basic economic or accounting analyses without some kind of demonstration? How do you cull those fundamentally unfit to be engineers, MDs, etc. from those who are fit without comparing their abilities to a known baseline of competency in certain areas?

Again, standardized tests may not be perfect, but what do you propose as an alternative?
 

Hussar

Legend
Empowering teachers to make that determination. Trusting and training educators to be able to determine competency in a field.

How do I know you can do something? By watching you do it and evaluating your ability in a controlled situation.

The problem is what I propose would be far too expensive to ever be used.
 

Dannyalcatraz

Schmoderator
Staff member
Supporter
Hussar said:
Empowering teachers to make that determination. Trusting and training educators to be able to determine competency in a field.

How do I know you can do something? By watching you do it and evaluating your ability in a controlled situation.

The problem is what I propose would be far too expensive to ever be used.

Most professions have practicals at some point that weed people out, but only after the mass winnowing that occurs at the level of the LSAT, MCAT, etc. There simply aren't enough teachers either to do that mass sort or to adequately assess the skills of the masses via practicals absent tests like the MCAT and its ilk. And the economic reality of the huge numbers of would-be students tells us there never will be.

Even after those tests you decry, most law schools get 100+ applicants per slot. Most, obviously, don't get in, and my entering class at UT Austin was nearly 600 students. Of the admittees who make it through 3+ years of law school, 40% don't pass the bar the first time.

Odds are even longer for would-be doctors.
 

Hussar

Legend
Dannyalcatraz said:
Most professions have practicals at some point that weed people out, but only after the mass winnowing that occurs at the level of the LSAT, MCAT, etc. There simply aren't enough teachers either to do that mass sort or to adequately assess the skills of the masses via practicals absent tests like the MCAT and its ilk. And the economic reality of the huge numbers of would-be students tells us there never will be.

Even after those tests you decry, most law schools get 100+ applicants per slot. Most, obviously, don't get in, and my entering class at UT Austin was nearly 600 students. Of the admittees who make it through 3+ years of law school, 40% don't pass the bar the first time.

Odds are even longer for would-be doctors.

See, but, that's the point. What is the point of the MCAT? To simply winnow out? You really believe that a standardised test does that?

But, even before that level, testing is still not really proving anything. Take a 4th-year uni student and give him a test in his field. He does pretty well; we'll say 95%. Fantastic mark. Six months later, after graduation, sit him down and give him the exact same test, and his mark, if he passes at all, will be by the skin of his teeth. It's been proven that you lose about 75-80% of what you learned in uni within 6 months of graduation.

So, what did that test actually prove?

Or maybe another example. We've all taken high school-level math; at least, most of us likely have. Solve the following:

A^2+b=17

How much do you want to bet that while some of you can do this (I certainly can't), most can't? This despite the fact that we had testing, likely multiple tests, showing that we could do that while we were in high school. All the testing shows is that you are able to retain, in short-term memory, a selection of skills that will fade within a few months after testing.
 

Dannyalcatraz

Schmoderator
Staff member
Supporter
Hussar said:
See, but, that's the point. What is the point of the MCAT? To simply winnow out? You really believe that a standardised test does that?

Yes, it does, clearly and effectively.

About 70,000 people take the MCAT annually. 50% of them score below the minimum standards of any med school in the USA.

Is everyone who doesn't get a certain score on the MCAT unqualified for med school? No. But most are. And they get to try something else rather than take up space that could be allocated to someone with better odds of success in med school. Is everyone who does get a certain score on the MCAT qualified for med school? No. But most are, and the med schools winnow out a certain portion as well...and then the medical board exams (standardized at the state level) and residencies and internships further winnow.

Unqualified people still slip through, to be sure, but not as many as would if, instead of 35,000 potential students, there were 70,000 because there was no MCAT.

And again, what method do you propose to replace it?


Hussar said:
But, even before that level, testing is still not really proving anything. Take a 4th-year uni student and give him a test in his field. He does pretty well; we'll say 95%. Fantastic mark. Six months later, after graduation, sit him down and give him the exact same test, and his mark, if he passes at all, will be by the skin of his teeth. It's been proven that you lose about 75-80% of what you learned in uni within 6 months of graduation.

So, what did that test actually prove?

Assuming that is true (I know of no such study), it is probably because most of what we learn in college is not used after college and/or wasn't learned well in the first place, both hurdles to retention in long-term memory.

And according to a 2001 Dartmouth paper on the nature of memory, even that forgetfulness is not as thorough as you might think: "...'forgotten' material can be relearned in less time than is required for the original learning, even after many years' disuse."

So, while the test may only prove that you knew something on test day, echoes of that knowledge linger in the mind, more easily recalled or retrained than learning it anew.

Hussar said:
Or maybe another example. We've all taken high school-level math; at least, most of us likely have. Solve the following:

A^2+b=17

How much do you want to bet that while some of you can do this (I certainly can't), most can't? This despite the fact that we had testing, likely multiple tests, showing that we could do that while we were in high school. All the testing shows is that you are able to retain, in short-term memory, a selection of skills that will fade within a few months after testing.

Again, math beyond the most rudimentary stuff is not used by most people (myself included), so there is no reason to retain it. Heck, most of us know that even as we're learning it. So it fades with disuse, and our attitude towards math while learning it contributes to the speed with which we lose it. But, as the Dartmouth study points out, I could relearn it in less time now, having learned it once, as opposed to having never learned it at all.
 

Ahnehnois

First Post
Dannyalcatraz said:
About 70,000 people take the MCAT annually. 50% of them score below the minimum standards of any med school in the USA.

Is everyone who doesn't get a certain score on the MCAT unqualified for med school? No. But most are. And they get to try something else rather than take up space that could be allocated to someone with better odds of success in med school. Is everyone who does get a certain score on the MCAT qualified for med school? No. But most are, and the med schools winnow out a certain portion as well...and then the medical board exams (standardized at the state level) and residencies and internships further winnow.
I don't take most of that as given at all. Is there any evidence that people who score low (or at least lower than the 50th percentile) are less likely to succeed in med school?

Moreover, does the test information tell us anything that a reasonably intelligent person could not already conclude from looking at their transcripts? To me, looking at grades, courses taken, and the rigor of where they were taken is a far better way of assessing the same thing. I doubt that the MCAT weeds out many people who have a good academic record but then bomb this particular test for some reason.

And most doctors will tell you that they've forgotten most of what they learned in medical school. The point of standardized professional education is essentially to combat fraud (given what constituted a "doctor" before the Flexner report), which is a legitimate problem. However, I remain unconvinced that the standardized tests are really part of the solution.

Dannyalcatraz said:
And again, what method do you propose to replace it?
How about nothing? I don't think colleges and professional schools would be unable to make admissions decisions without the tests, and I'm not convinced that their decisions would be any worse.

The other major point of these tests, professional licensure and certification, is largely a peer evaluation anyway; I don't see that cutting out the exams would really change the professions that much.
 


pedr

Explorer
I think the last few posts show the difference between the centrally set/marked qualifications I am used to and the standardised tests which were noted as the US equivalent.

English qualifications are content-driven, in that the curriculum is designed by educators to teach key elements of each subject and then to test understanding and application of those elements. This seems to be different from the US approach, which has parallel tracks: a curriculum designed by individual teachers or schools and assessed locally, followed by a separate test which is disconnected from the design of the curriculum. In one sense, English education "teaches to the test" at least from age 14, but the test is aligned directly to the curriculum and designed by and in conjunction with teachers and educators. I'm sure there are educational theories which differentiate between curriculum assessment and aptitude assessment, with the SAT intended to do the latter, but it appears as if this has the effect of influencing teaching as much as a central curriculum design would, without the same degree of teacher buy-in.

Perhaps these are differences without substance, but I still find it notable in comparing the systems.
 
