Community
General Tabletop Discussion
*Geek Talk & Media
Only in America
<blockquote data-quote="Dannyalcatraz" data-source="post: 6274230" data-attributes="member: 19675"><p>This study concluded that, while the MCAT is not a good predictor by itself and should be used in conjunction with analysis of undergraduate GPA, it is indeed a good overall predictor of success in med school, and a better one than undergraduate GPA alone. IOW, while using both is best, if you're only going to consider one, use the MCAT; if assigning weight to both, weight the MCAT more heavily.</p><p></p><p><a href="http://medical-mastermind-community.com/uploads/47-MCAT-predicts-med-school-academics-better-than-uGPAs.pdf" target="_blank">http://medical-mastermind-community.com/uploads/47-MCAT-predicts-med-school-academics-better-than-uGPAs.pdf</a></p><p></p><p>This study, 4 years later, supports the conclusions of that one, and made additional findings that are being incorporated to improve the MCAT. (Certain things will be de-emphasized as their relevance decreases; certain subsections found to be poorly correlative will be radically revised; the gender bias- the test actually predicts women's success in med school <em>better</em> than men's- will be addressed, etc.)</p><p></p><p><a href="http://journals.lww.com/academicmedicine/Fulltext/2010/06000/The_Predictive_Validity_of_Three_Versions_of_the.20.aspx" target="_blank">http://journals.lww.com/academicmedicine/Fulltext/2010/06000/The_Predictive_Validity_of_Three_Versions_of_the.20.aspx</a></p><p></p><p></p><p></p><p>I'm an attorney who is the son of an MD. We both hold that same position. So did my law-school teachers, who said that they couldn't have passed the bar without studying for it.</p><p></p><p>And the primary reason why goes back to the Dartmouth study: most of what we learned, we don't use in daily practice. (The secondary reason is that what we learned at that level has often changed in validity, completeness, and relevance- there's no need to know info that is no longer good.)</p><p></p><p>But again, as the Dartmouth study stated, because we have learned it once, we will find it easier to recall that information than someone seeking to learn it from scratch.</p><p></p><p>To put it in practical terms, if you're in a rural area with only one MD for a huge geographic region, you'd want someone who took a broad base of classes in med school rather than someone who focused narrowly on a specialty. While neither may have ever performed a particular operation or treated a particular disease, the generalist will get up to speed much faster than the one who never studied it at all.</p><p></p><p></p><p></p><p>When you double (or more) the number of students they must consider for admissions, you're going to increase the error rate by simple statistics. My dad served on the admissions board for Tulane med school for a while, pre-MCAT. It was a nightmare of looking at huge stacks of applications that were often essentially indistinguishable. MCATs- and similar tests- give you another evaluative tool, one proven to work (see above).</p><p></p><p></p><p></p><p>The Bar exams got their name because their intent is to "bar" the unqualified from practice. They're like any other standardized test out there, just harder. In my state, they recently added a practical; that's not the norm yet, AFAIK.</p><p></p><p>But what is the quality of that peer evaluation, really? It's assessing your skills against a known standard, just like the standardized tests claim to do...sometimes without the safeguards of an anonymous standardized test, and at greater expense.</p><p></p><p>The Texas practical for dentists, for instance, assesses your skills on live patients. That means a certain number of people have to volunteer to have their mouths poked around in by unlicensed, unproven dentists. Without weeder tests, you'd have to increase the number of volunteers...and examiners. There simply aren't enough qualified dentists willing to take the time to evaluate the prospective licensees- they have their own patients to treat. Then there are the costs of the infrastructure for the additional spaces in which to test the applicants, each of which needs tools, dental chairs, water, sanitization, x-rays, etc.</p><p></p><p>Furthermore, unlike the anonymous standardized tests, anyone can be failed at any time during the practical, and you will not be told why you got the tap on the shoulder. It could be because you were incompetent. It could be because they know you're from out of state. It could be because the examiner is a racist, or because he knows you dated his cousin. You'll never know. Which means, as you sit out a year waiting for the next test date, you won't know what areas you need to study harder in order to pass...or whether you should even bother.</p><p></p><p>The tests aren't perfect, no. But they do help.</p></blockquote><p></p>