What does "doesn't test well" mean?



Well, my kids don't see a lot of tests or quizzes in high school...it's a lot of portfolio/project/paper/discussion to make the grade. My older one felt very unprepared for college tests, simply because of the lack of experience in high school. High school grading is a joke...for example, it's quite possible to have a quarter's grade be based on things other than tests. Quite a few kids here get their butts kicked on Regents exams...and I mean, walking in with a 100 class average and feeling lucky to score a 65 on the Regents...they simply do not have the test experience.

I'm glad you posted, because I was trying to figure out how a kid who doesn't test well on standardized tests could still be successful in high school, because when I was in school, it seemed like we were constantly taking tests and quizzes.

 

Additionally, I was thinking that most tests and pop quizzes in high school are "timed," just on the basis of the classes being a specified length, so I don't think I was understanding some of the responses. This thread has been very interesting for me to read.

 

Thanks! :)



Are people only referring to standardized tests?

How can a student who doesn't test well end up with good grades (i.e., 4.0 and above)?

For students I know who "don't test well," it is across the board: standardized tests and school tests and quizzes. So "doesn't test well" means the student fails, barely passes, squeaks by, etc., on tests and quizzes, even with preparation.

And school tests and quizzes are a huge part of their grades.

In college, these kids struggle even more than in high school because grades are based mostly on tests and, compared to high school, very little homework.

 

 

You asked how people define not testing well. You stated that your experience is across the board not testing well. I responded that my kids perform well across the board but not as well on standardized tests.

 

 

That's the attitude here that I'm reading from people.

Maybe people should think of how offensive that comes across.

Simply articulating factual information about our own experiences does not make any sort of judgment about anyone else's experiences. Claiming that by sharing those experiences we are saying "too bad" to those students who struggle across the board is offensive, because I never said anything of the kind. Nor did I see anyone else say that either.


I'm glad you posted, because I was trying to figure out how a kid who doesn't test well on standardized tests could still be successful in high school, because when I was in school, it seemed like we were constantly taking tests and quizzes.

 

Additionally, I was thinking that most tests and pop quizzes in high school are "timed," just on the basis of the classes being a specified length, so I don't think I was understanding some of the responses. This thread has been very interesting for me to read.

 

Thanks! :)

Fwiw, that is not our experience. The SAT/ACT are in a class of their own. I do have a dd like Lolly's who struggles with MC and TF tests, too. But they were few in number during her college experience, and she did well on her licensing exam.

In what testing situation other than the ACT/SAT are students given 3 hours of items that are meant to be read as quickly as possible and answered in a few seconds to under a minute? (It would be a poorly designed college exam!)

 

Exams for some professional credentials and licenses also require this. There are some jobs that require the ability to do this as well--such as some financial market trading jobs.


I think that there is a thing called "Test Sense" or a "Test IQ". Mine is pretty high. Honestly, I can pass any class based on multiple choice without studying. Many times I've taken a test on a book I never opened and scored > 90%. Fortunately, I am studious by nature, so I generally studied anyway. Even on non-multiple-choice tests, I get a sense of them after the first one. Several times I thought, "Something is weird with this problem," and later learned from the professor that it had originally been something else but was changed. Like the Linear Algebra test where a super-easy problem was where a hard-ish problem should have been. I later learned from the professor that he put that problem in to figure out which students were letting their calculators do all the work.

 

I could see how someone could be the opposite. For example, my eyes can't change focus on balls coming toward me, so to me a volleyball looks about 3 feet wide. I honestly thought that other people were just better at guessing where the ball was. I remember once in gym "learning" how to bat. The teacher said to "keep my eye on the ball." I said, "What ball?" I'd never seen it. As far as I could tell, the pitcher and catcher were just playing with me. It wasn't until I read that really good baseball hitters can see the threads on the ball that I realized maybe I had a problem.

 

Back when I was teaching, I was buddies with a teacher across the hall who told me about a class he took on tests.  It was how to write them, not how to take them.  The professor said at the beginning that anyone could bring any multiple choice test to him from any other professor anywhere, and he would make at least a 70 on it. The class made it their mission to find one that he'd fail.  They never did.  

 


In what testing situation other than the ACT/SAT are students given 3 hours of items that are meant to be read as quickly as possible and answered in a few seconds to under a minute? (It would be a poorly designed college exam!)

 

 

My licensing exams in medical school (NBME/USMLE Steps 1-3) were similar in timeframe. Now they have added a PE component to Step 2, so that is different, but it is in addition to the timed, computer-based, single-best-answer exam for Step 2, and both must be passed independently. Step 3 gets a little creative, combining the timed, single-best-answer computer exam with a second phase of cases/simulations.

 

I would say that the written component of my specialty (post residency) boards also resembled this.  In our specialty passing the written exam gives us the opportunity to take our oral boards.  We must pass both to be board certified.  


Maybe a little off topic, or perhaps not, but I also think that some kids are better at coming across questions they don't know without it throwing them. They may take a quick educated guess and mark it for later, if they have time to go back and see if they can reason it through, but move on to answer what they do know without trauma. This is something that my stepson and eldest daughter have always been very good at doing. If time allows, both also kind of get into the challenge, see it as a puzzle, and want to work through it (and often can), but they have no problem prioritizing skipping it to get to the questions they do know and can do well with.

 

For example, when our eldest daughter was in 8th grade, she was taking algebra, for which our state requires the student to pass a comprehensive exam to get credit. For various reasons she was basically teaching herself, with backup from DH or myself as needed, but to get credit for the course she was following the high school course schedule, using their chosen textbook, doing (and turning in) the assigned problem sets, and taking all of the class exams on campus. She was doing very well in the course (she may have even had a 100% average) when midterms rolled around. The course instructor didn't feel like making a midterm exam, so he opted to give the students the state exam even though they hadn't finished the course. The deal was he would give everyone enough points for whoever had the highest score to earn 100 on the exam. Our daughter ended up scoring an 85 on the exam. Since she was working independently to an extent, she had gotten ahead of the class, so she had been exposed to some material other kids hadn't, but she certainly hadn't been exposed to the entire course. What really allowed her to excel was that she just took a deep breath and moved along when she saw something she hadn't learned. She had time at the end of the exam, so she went back to the stuff that was unfamiliar and figured out some of the questions just by extrapolating from what she did know. She came out of that exam all excited about stuff she had "learned" while taking the test and went into her textbook to check and see that she was correct. Her older brother had a much better math teacher who didn't put him in that situation, but I know he would have reacted similarly.

 

At the same time, both our foster son (who struggled some academically but ultimately made it through his bachelor's and is a very competent detective with our state police) and DD14 are much more fazed when they come up against stuff that they are not completely certain of. DD14 is very bright and had an amazingly high IQ when tested as a preschooler, but DH and I both took time with her in the car before dropping her off for her AP exams last spring to remind her that she wasn't expected to know all the questions and that it would be ok, just breathe, etc. She rocked her AP exams, and I think for her some of this really is a remnant of the self-esteem issues we're still helping her work on overall.

 

 


Regular tests don't typically have "distractors," possible answers designed to trick the test-taker. Math tests in high school and college generally give the student a problem, and the student either solves it correctly or not.

 

The friend I mentioned in Joanne's thread got A's in honors math courses at our high school and did fine in the math courses she took at the community college she attended before transferring to a 4-year selective college. But she bombed the math portion of the SAT because she had anxiety and the "distractor" answers tended to confuse her.


I'm glad you posted, because I was trying to figure out how a kid who doesn't test well on standardized tests could still be successful in high school, because when I was in school, it seemed like we were constantly taking tests and quizzes.

 

Additionally, I was thinking that most tests and pop quizzes in high school are "timed," just on the basis of the classes being a specified length, so I don't think I was understanding some of the responses. This thread has been very interesting for me to read.

 

Thanks! :)

 

Some people are very good guessers on multiple-choice tests/exams. I barely know anything about pro football but I recently got 15/15 on some BuzzFeed NFL quiz because I was able to make good guesses. If someone had asked me the exact same questions on an open-ended quiz where I had to supply the answers I probably would've scored more in the 2 or 3 out of 15 range. The SAT and GRE very likely overestimated my actual academic potential simply because I'm a good guesser on MC exams.

 

My high school friend was the opposite- a very poor MC test-taker. I graduated ahead of her in our class, but our GPAs were much, much closer than our SAT scores. If you looked at our test scores, you would think that we had wildly different academic potentials, and that simply wasn't the case. I was a bit stronger of a student, but only by a bit. Think the difference between an A- and a B+.


I wrote this first post, which you quoted and replied to below:

 

And all those things...anxiety, processing speed, etc. etc. etc. impact the kids I know in their day-to-day school work. For the kids I know, it isn't ONLY on SAT and ACT day.

In what testing situation other than the ACT/SAT are students given 3 hours of items that are meant to be read as quickly as possible and answered in a few seconds to under a minute? (It would be a poorly designed college exam!)

 

Fwiw, my older 2 kids both graduated with their degrees with honors. Our current college student makes high As on exams where the class avg might be a 60 or lower.

Is that your criterion for when kids are allowed to have test anxiety and be impacted by slow processing speed and the myriad other issues that can lead to *not testing well*? Only tests with 3 hours of items that are meant to be read as quickly as possible and answered in a few seconds to under a minute? Did you not see that when you ask for examples that are ONLY LIKE THE ACT and SAT, you are implying that these are the only tests that should be causing kids problems?

 

It is funny that I am the one who can understand that kids can have problems in all testing situations, for various reasons, not just the SAT and ACT...but I'm offensive.


When I read your reply to my post: 

And all those things...anxiety, processing speed, etc. etc. etc. impact the kids I know in their day-to-day school work. For the kids I know, it isn't ONLY on SAT and ACT day.

I read it as an implication that it canNOT possibly be true that kids with anxiety, slow processing speeds, etc. are ONLY affected by SAT/ACT-type tests. My response to you was an explanation of WHY it IS true, not a judgment against kids who struggle with additional academic areas. You are misinterpreting what people are attempting to explain. Plenty of kids struggle with both. But, yes, there are some kids who flip in the other directions: good test takers with poor academic performance and kids with high academic performance who are poor test takers. Can't imagine why all 3 scenarios can't simultaneously co-exist.

 

 

I wrote this first post, which you quoted and replied to below:



Is that your criterion for when kids are allowed to have test anxiety and be impacted by slow processing speed and the myriad other issues that can lead to *not testing well*? Only tests with 3 hours of items that are meant to be read as quickly as possible and answered in a few seconds to under a minute? Did you not see that when you ask for examples that are ONLY LIKE THE ACT and SAT, you are implying that these are the only tests that should be causing kids problems?

It is funny that I am the one who can understand that kids can have problems in all testing situations, for various reasons, not just the SAT and ACT...but I'm offensive.

 


While I have no problems getting good test scores, I was allowed to eat M&Ms and drink coffee during 3-hour exams (not in the States) in the big exam hall. There are too many reasons for not testing well. My younger so far has consistently scored less well on timed tests, while my older is less affected.

 

I wonder if anti-anxiety drugs or supplements would help any of the poor test takers.

It takes time to find the correct drug and the correct dosage for a person. A dosage that works for daily anxiety may not be high enough for a high stakes test.


I was thinking about this thread last night.

 

And I don't think I'm misinterpreting the attitude of "too bad" and "oh well" toward those who test badly across the board.

 

If I ask how people get a 4.0 if they don't test well and the answer is that people who test poorly leave school, I think that is a dismissive comment, a la "too bad, oh well."


I was thinking about this thread last night.

 

And I don't think I'm misinterpreting the attitude of "too bad" and "oh well" toward those who test badly across the board.

 

If I ask how people get a 4.0 if they don't test well and the answer is that people who test poorly leave school, I think that is a dismissive comment, a la "too bad, oh well."

 

The internet boards are difficult. Tone is taken from posts that is not intended. The fact is that people who do not do well in school usually do end up leaving. They either fail out, or they just get tired of having to work around the clock to maintain fair grades. I am not being dismissive; it is just the way it is. (And I have a dd who I fully expect to follow that path. I don't see how she can make it. Currently, she is battling forward, but... Then, again, she is a fighter, so who knows!) The problem is that there just aren't that many options to taking tests. Extra time can be granted with documented need. Even that isn't always enough. My dd has to take a very low class load. She knows she cannot handle more than 12 hours because those 12 hours are like a "normal" student taking 20. It is going to take her a long time to earn a degree. That kind of pressure is really hard to deal with. Add in the additional costs, and sometimes I wonder if it is even worth it.


Regular tests don't typically have "distractors", possible answers designed to trck the test-taker. Math tests in high school and college generally give the student a problem and the student either solves it correctly or not.

 

The friend I mentioned in Joanne's thread got A's in honors math courses at our high school and did fine in the math courses she took at the community college she attended before transferring to a 4 year selective college. But she bombed the math portion of the SAT's because she had anxiety and the "distractor" answers tended to confuse her.

 

 

Yes. I'm reminded of my College Algebra class. The whole grade was 50/50 midterm and final. Both tests were 25 questions each, MC, answered on a scantron - and 3 of 4 options were very good distractors. Talk about unforgiving.

 

The other thread got me thinking about this class. Do colleges which don't look at standardized test scores discourage their own profs from grading based solely on tests? Because it seems like attracting students who "don't test well" and then requiring that they pass a class like my College Algebra would just be a recipe for failure.


The internet boards are difficult. Tone is taken from posts that is not intended. The fact is that people who do not do well in school usually do end up leaving. They either fail out, or they just get tired of having to work around the clock to maintain fair grades. I am not being dismissive; it is just the way it is. (And I have a dd who I fully expect to follow that path. I don't see how she can make it. Currently, she is battling forward, but... Then, again, she is a fighter, so who knows!) The problem is that there just aren't that many options to taking tests. Extra time can be granted with documented need. Even that isn't always enough. My dd has to take a very low class load. She knows she cannot handle more than 12 hours because those 12 hours are like a "normal" student taking 20. It is going to take her a long time to earn a degree. That kind of pressure is really hard to deal with. Add in the additional costs, and sometimes I wonder if it is even worth it.

FWIW, I realize that it is a possibility.

 

And I do appreciate your posts and explanations, along with many other posts and explanations in the thread.

 

Just a specific question (that you can answer in a general way if you don't want to talk specifically about your DD) about credits:

 

Does your DD ever start with 15 credits so she has some wiggle room to drop if one class isn't working out in order to stay full time?

 

Good luck to her!


Yes. I'm reminded of my College Algebra class. The whole grade was 50/50 midterm and final. Both tests were 25 questions each, MC, answered on a scantron - and 3 of 4 options were very good distractors. Talk about unforgiving.

 

The other thread got me thinking about this class. Do colleges which don't look at standardized test scores discourage their own profs from grading based solely on tests? Because it seems like attracting students who "don't test well" and then requiring that they pass a class like my College Algebra would just be a recipe for failure.

 

I teach a highly mathematical finance class with up to 300 students per section. I give multiple choice exams. I don't use distractor answers to trick or confuse students, but I do include among the answer choices wrong answers that I know students typically get when they work the problem.

 

I was talking to someone who teaches the same class at another university who said that they do not even have a multiple choice grading machine at their school. All problems must be worked out. So there is a big difference from school to school. I would think, on average, that smaller, liberal arts schools rely less heavily on multiple choice exams, but I have never seen the correlation between admissions criteria and the typical grading in classes.


I was thinking about this thread last night.

 

And I don't think I'm misinterpreting the attitude of "too bad" and "oh well" toward those who test badly across the board.

 

If I ask how people get a 4.0 if they don't test well and the answer is that people who test poorly leave school, I think that is a dismissive comment, a la "too bad, oh well."

 

Yes, I think there is a sense that if your kid has always struggled with something as important as tests, you should have already figured out that she's not good college material, and you don't get to grieve that as your kid approaches college age.  But if your kid is doing well in school, you have a right to get your hopes up for a fancy college and then grieve that when your child bombs the big test.

 

As the parent of a child who is bright and has many positive qualities, but has trouble with various school evaluation tools, I still have hopes for her.  I hope that she has the opportunity to meet her considerable potential and enjoy a positive college experience.  I am not going to give up, grieve, and get over it before she is a full-fledged adult.


Are people only referring to standardized tests?

 

How can a student who doesn't test well end up with good grades (i.e., 4.0 and above)?

For students I know who "don't test well," it is across the board: standardized tests and school tests and quizzes. So "doesn't test well" means the student fails, barely passes, squeaks by, etc., on tests and quizzes, even with preparation.

And school tests and quizzes are a huge part of their grades.

In college, these kids struggle even more than in high school because grades are based mostly on tests and, compared to high school, very little homework.

 

In my son's case, it means that fill-in-the-bubble standardized testing does not accurately reflect his true capabilities. He has a combination of test anxiety and a tendency to over-think questions that depresses his scores. (The over-thinking thing is very common with extremely bright students, by the way.)

 

He maintained very good grades during his high school years (graduated with an unweighted GPA of 3.84), even in online courses, because his grades on regular, daily assignments were consistently excellent. His scores on multiple-choice quizzes and tests were never as good as those on other kinds of assignments, but they were balanced out by the very high scores on everything else.

 

His ACT scores were fine, above average, but not stellar. 

 

He pulled A's and B's in his year of dual enrollment at the community college without breaking a sweat. He's in his first semester of his official freshman year now and doing really well. His lowest mid-term grade was an 87. He's found, so far, in his program, that standardized tests of the kind that cause him trouble really aren't a big part of the grading. Instead, he writes papers and gives presentations, both of which are tasks at which he shines. And as a performing arts major, he has a lot of tests that are hands (or feet) on, where he has to show what he has learned and what he can do, which is, again, an area of strength for him.


It sounds like people are discounting the idea that the students I know who "don't test well" are actually NOT TESTING WELL in classes. And it is IMPACTING their grades.

 

:confused:

 

I don't see that at all in the replies I've read so far. I know that not testing well has had an impact on my own son's grades. However, because fill-in-the-bubble multiple choice testing is only one part of a grade for most classes, he has been able to do well enough on other assignments to compensate.


(What kind of score would this be: "SAT I score of 1550 out of 2400 or ACT Composite score of 22"? High, or a good average?)

 

 

Slightly above average, but not high. According to this chart from the ACT site, a composite of 22 would put the student in the 62nd percentile in the U.S.:

 

http://www.actstudent.org/scores/norms1.html


I teach a highly mathematical finance class with up to 300 students per section. I give multiple choice exams. I don't use distractor answers to trick or confuse students, but I do include among the answer choices wrong answers that I know students typically get when they work the problem.

 

Yes.

 

I don't think that tricking students is the issue, but giving answers that can clearly be ruled out imo tests test-taking skills more than anything else.

 

For a specific example: Let us say that I give the problem 1/2 + 1/3. 

 

Now, if I give the possible answers as 5/6, 1/4, 4/7, and 7/5, it is pretty obvious that the last 3 cannot be right. The student doesn't really need to know how to add fractions.

 

But if I give the possible answers as 5/6, 2/5, 1/6, and 3/2, I've included every way that I've seen a student get this problem wrong. I guess these might be distractors, but if a student works out the problem correctly, they will see the correct answer on the list and mark it. I'm not trying to trick people, but I am trying to see whether they can actually add fractions.

 

I'm fortunate enough to be at a small enough school that I don't have to use mc tests, but I don't see distractors as inherently a bad thing. 
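For anyone who likes to see the idea concretely: below is a minimal Python sketch of how distractors like the ones in the fraction example above can come from common error patterns rather than from random wrong numbers. The function name and the specific mistakes modeled (adding numerators and denominators to get 2/5, multiplying instead of adding to get 1/6) are my own illustration of the example, not anything from an actual test bank; the 3/2 choice isn't modeled because the example doesn't say which error produces it.

```python
from fractions import Fraction

def choices_for_fraction_sum(a, b):
    """Build multiple-choice options for the problem a + b.

    Each wrong answer is produced by a common student mistake, so a
    student who makes that mistake sees "their" answer on the list.
    Purely illustrative.
    """
    correct = a + b                                   # 1/2 + 1/3 = 5/6
    added_tops_and_bottoms = Fraction(a.numerator + b.numerator,
                                      a.denominator + b.denominator)  # 2/5
    multiplied_instead = a * b                        # 1/6
    options = sorted({correct, added_tops_and_bottoms, multiplied_instead})
    return correct, options

correct, options = choices_for_fraction_sum(Fraction(1, 2), Fraction(1, 3))
print("correct answer:", correct)                          # 5/6
print("answer choices:", ", ".join(str(f) for f in options))  # 1/6, 2/5, 5/6
```

A student who works the problem correctly still finds 5/6 on the list; the other choices only "attract" the students who made one of those specific errors.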


I think that there is a thing called "Test Sense" or a "Test IQ". Mine is pretty high. Honestly, I can pass any class based on multiple choice without studying. Many times I've taken a test on a book I never opened and scored > 90%.

 

Me, too. I'm a natural test taker. I can look at a test question and know what the test maker wants the answer to be, even when I don't agree or would have worded it differently or realize that the question is poorly designed. The correct answer just kind of leaps out at me a lot of the time.

 

Neither of my kids has that gift. Each of them had to learn how to do reasonably well on multiple choice tests, and neither has ever done as well on those kinds of tests as one would expect given how bright and otherwise academically capable they are. 

 

Like so many other things, some people's brains are wired to handle this stuff better, and others are not. It doesn't mean it can't be gotten around or compensated for; it just means that person's brain isn't naturally set up to shine on those kinds of tests. 


Slightly above average, but not high. According to this chart from the ACT site, a composite of 22 would put the student in the 62nd percentile in the U.S.:

 

http://www.actstudent.org/scores/norms1.html

Half of all students taking the ACT will, by definition, score below the median. In my local school districts, over half of students graduate with an A average (90 and above). Many of these excellent students will have ACT scores closer to the average ACT score, while only a handful will be in the top 10% of ACT scores.


But if I give the possible answers as 5/6, 2/5, 1/6, and 3/2, I've included every way that I've seen a student get this problem wrong. I guess these might be distractors, but if a student works out the problem correctly, they will see the correct answer on the list and mark it. I'm not trying to trick people, but I am trying to see whether they can actually add fractions.

 

I'm fortunate enough to be at a small enough school that I don't have to use mc tests, but I don't see distractors as inherently a bad thing. 

 

Don't you see how somebody with test anxiety might be able to find the correct answer when it is asked as an open-ended question, but get confused by the distractors in a multiple-choice exam? My high school friend was like that. If you just asked her to solve the math problem, she'd generally get it right. But when there were incorrect answers presented, she'd get flustered and start second-guessing herself.

 

With computerized exams now available, I hope that admissions testing starts moving towards open-ended problems and away from MC ones.

 


It is interesting how schools differ in regard to ACT/SAT scores and GPAs.

 

At my son's public high school, for last year's graduating class of over 1,000 students, the average composite score for the ACT was 27.5. (All juniors must take the ACT plus Writing as part of our state's testing requirements.) 22 students received perfect ACT scores.

 

Perfect GPAs out of 1,000+ students? Two. In my son's class of over 1,000 students, there are also only two who have perfect GPAs so far. Average ACT scores won't be available for a while.


Don't you see how somebody with test anxiety might be able to find the correct answer when it is asked as an open-ended question, but get confused by the distractors in a multiple-choice exam? My high school friend was like that. If you just asked her to solve the math problem, she'd generally get it right. But when there were incorrect answers presented, she'd get flustered and start second-guessing herself.

 

With computerized exams now available, I hope that admissions testing starts moving towards open-ended problems and away from MC ones.

 

At times, I have given exams that are part multiple choice and part open-ended problems.  The results from the two different parts of the exam tend to be highly correlated.  In almost thirty years, I can't remember a single case that I have seen in which a student could consistently work open-ended questions but miss questions on identical material when it was tested using a MC question.  

 

I think that being confident in one's answer is part of knowing the material that I am testing.  If a student works a problem and gets an answer of "120" but then looks and sees that in addition to "120" there is an answer of "140" and gets confused and starts second-guessing himself, then the student is not confident in his answer.  In the business world, people will have to be able to make choices and decisions when they are presented with a lot of "wrong answers" they need to avoid.


At times, I have given exams that are part multiple choice and part open-ended problems.  The results from the two different parts of the exam tend to be highly correlated.  In almost thirty years, I can't remember a single case that I have seen in which a student could consistently work open-ended questions but miss questions on identical material when it was tested using a MC question.  

 

I think that being confident in one's answer is part of knowing the material that I am testing.  If a student works a problem and gets an answer of "120" but then looks and sees that in addition to "120" there is an answer of "140" and gets confused and starts second-guessing himself, then the student is not confident in his answer.  In the business world, people will have to be able to make choices and decisions when they are presented with a lot of "wrong answers" they need to avoid.

 

I am sure you don't mean it this way, but the bold is very insulting to those who suffer with the brain disease of anxiety. Anxiety is not reasonable or logical. And often has **nothing** to do with the quality of preparation.

 

"Being confident in one's answer" is not going to heal the anxious brain.


Yes.

 

I don't think that tricking students is the issue, but giving answers that can clearly be ruled out imo tests test-taking skills more than anything else.

 

For a specific example: Let us say that I give the problem 1/2 + 1/3. 

 

Now, if I give the possible answers as 5/6, 1/4, 4/7, and 7/5, it is pretty obvious that the last 3 cannot be right. The student doesn't really need to know how to add fractions.

 

But if I give the possible answers as 5/6, 2/5, 1/6, and 3/2, I've included every way that I've seen a student get this problem wrong. I guess these might be distractors,

 

 

So, what's a distractor, and why is it bad to have them? If you have to have multiple choice tests, it seems like Kiana's list of frequent wrong answers is exactly the thing you want on the test to make sure the student can get the right answer.


With computerized exams now available, I hope that admissions testing starts moving towards open-ended problems and away from MC ones.

 

I'm worried that computerized exams bring with them a whole new set of problems. Even doing Khan Academy online quizzes, I've had to teach: type your answer in. STOP. Read it back to yourself, so that you make sure you have correctly typed in what you want and there are no double keystrokes or missing ones. Computerized grading of essays seems horrible to me. Adaptive online tests, where you can't skim the whole test and then go back and do the easy ones, require new testing strategies.


I think that being confident in one's answer is part of knowing the material that I am testing.  If a student works a problem and gets an answer of "120" but then looks and sees that in addition to "120" there is an answer of "140" and gets confused and starts second-guessing himself, then the student is not confident in his answer.  In the business world, people will have to be able to make choices and decisions when they are presented with a lot of "wrong answers" they need to avoid.

 

One of the skills often taught in test prep classes is predicting answers before a student even looks at the options. In math, "predicting" would mean working the problem and coming up with an answer. Once that is done, the student should only need to look for the choice that matches his or her answer and mark it. The approach saves time, because students don't need to consider and discard incorrect choices, and also increases the chance of getting the correct answer, since the student doesn't need to second guess.

 

The same principle works for vocabulary questions, the writing/grammar questions, and reading comprehension. The approach usually taught is that the student reads the question but not the answer choices first, then looks at the associated passage or sentence and predicts an answer, then locates the choice that most closely matches the prediction, marks it, and moves on.

 

It's such a simple concept, but one that doesn't come naturally to many students. 


I was talking to someone who teaches the same class at another university who said that they do not even have a multiple choice grading machine at their school. All problems must be worked out. So there is a big difference from school to school. I would think, on average, that smaller, liberal arts schools rely less heavily on multiple choice exams, but I have never seen the correlation between admissions criteria and the typical grading in classes.

 

When Calvin was looking at universities, he checked carefully how classes were graded and how his final degree 'class' (grade) would be arrived at.  He already knew that he worked well with tight timetables and a lot of structure, and with essay-based exams.  In his high school subjects, he had been able to pull marks up much higher using exams than he could have using long-term class-work grades.  It's important to know one's strengths and choose a course/college accordingly.

 

L


FWIW, I realize that it is a possibility.

 

And I do appreciate your posts and explanations, along with many other posts and explanations in the thread.

 

Just a specific question (that you can answer in a general way if you don't want to talk specifically about your DD) about credits:

 

Does your DD ever start with 15 credits so she has some wiggle room to drop if one class isn't working out in order to stay full time?

 

Good luck to her!

 

She did try that. She ended up dropping the extra class as soon as it got iffy. Now, she only takes 12 at a time so that she has to push through. She feels like she will always drop if things get hard in a class, and that will ALWAYS happen. Personally, I don't like her logic, but it isn't my choice to make. Time will tell if her strategy works or not... 


I guess my point is that if the MC answers aren't distractors, at least to some degree, it is a very bad MC test.

 

As jdahlquist says, it is vanishingly rare that a student does well on open-ended questions and poorly on MC. I do not discount the possibility, but it is extremely rare. In graduate school, I taught the same class both with and without MC tests. In both cases, the quizzes (which were hand-graded and never MC) were highly predictive of test scores -- the correlation was virtually identical. The test means and grade distributions were also virtually identical.

 

The sort of student who second-guesses because of the presence of distractors also, unfortunately, tends to second-guess in the absence of distractors, erasing correct answers and writing incorrect ones.
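To make the "correlation was virtually identical" comparison concrete, here is a rough sketch with entirely made-up numbers (not anyone's actual gradebook): correlate hand-graded quiz averages with exam scores separately for an offering examined with MC tests and one examined with open-ended tests, and see whether the two correlations differ.

```python
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length lists of scores."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical gradebooks: quiz averages vs. exam scores for two offerings
# of the same course, one with MC exams and one with open-ended exams.
# The claim in the post is that the quiz-to-exam correlation comes out
# about the same either way.
quiz_mc, exam_mc = [72, 85, 91, 60, 78, 88], [70, 83, 94, 58, 75, 90]
quiz_open, exam_open = [74, 83, 90, 62, 79, 87], [71, 80, 93, 60, 77, 89]

print("MC-exam offering:    r =", round(pearson(quiz_mc, exam_mc), 3))
print("Open-ended offering: r =", round(pearson(quiz_open, exam_open), 3))
```

If the two r values land in the same neighborhood, the exam format is not changing who does well, which is the point being made above.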


In what testing situation other than the ACT/SAT are students given 3 hours of items that are meant to be read as quickly as possible and answered in a few seconds to under a minute? (It would be a poorly designed college exam!)

 

I'm not an ACT/SAT apologist, but this design helps to elicit a distribution of scores at the top end. Many more students could do well on these tests if given more time or fewer questions. That's not the point.

(Now, what these tests are actually revealing about students - that's another discussion. But as for one reason *why* they are designed this way, this is certainly a reason.)


I am sure you don't mean it this way, but the bold is very insulting to those who suffer with the brain disease of anxiety. Anxiety is not reasonable or logical. And often has **nothing** to do with the quality of preparation.

 

"Being confident in one's answer" is not going to heal the anxious brain.

No, I don't mean it to be insulting, and I am not sure why it is insulting. Examinations are a measure of someone's performance and not simply a measure of the quality of preparation. If a business student is going to become an auditor, they will be presented with "wrong" answers that they have to identify as wrong in the work world. Being able to say "this is the correct answer; I am not going to be influenced by someone else's wrong answer" is an important skill to have. Personally, I prefer open-ended questions in that I can see where an individual student went wrong and I don't have any students who made high grades "guessing" on an MC test. My point regarding MC questions having distractors, however, is that this is not simply a testing phenomenon; the real world in which decisions will have to be made also contains many distractors.


At times, I have given exams that are part multiple choice and part open-ended problems.  The results from the two different parts of the exam tend to be highly correlated.  In almost thirty years, I can't remember a single case that I have seen in which a student could consistently work open-ended questions but miss questions on identical material when it was tested using a MC question. 

 

The stakes are much lower for a regular course test situation in high school or college than for the SAT/ACT/GRE/PSAT taken for National Merit consideration, etc. Students with test anxiety may be better able to manage it for a normal course exam. I know that when I was in high school, it felt like my entire future was at stake when taking the SAT.

 

The one and only panic attack I've ever experienced in my life came during the SAT my senior year. I thought I was having a heart attack at first. Fortunately it came during the break between sections and I was able to calm down enough to finish the test. I never felt any kind of anxiety like that during a normal course exam because even with a final exam, it was only one component of my grade (an important one to be sure, but not the only thing that mattered).

 


So, what's a distractor, and why is it bad to have them? If you have to have multiple choice tests, it seems like Kiana's list of frequent wrong answers is exactly the thing you want on the test to make sure the student can get the right answer.

 

It is not necessarily "bad," but it shifts the test from being content-driven to being driven to a higher percentage by test-*taking* skills. Many brains do just fine with it, but for those who have anxiety or some other barrier to success on these types of tests, the test does NOT measure that student's actual aptitude or knowledge content.


No, I don't mean it to be insulting, and I am not sure why it is insulting. Examinations are a measure of someone's performance and not simply a measure of the quality of preparation. If a business student is going to become an auditor, they will be presented with "wrong" answers that they have to identify as wrong in the work world. Being able to say "this is the correct answer; I am not going to be influenced by someone else's wrong answer" is an important skill to have. Personally, I prefer open-ended questions in that I can see where an individual student went wrong and I don't have any students who made high grades "guessing" on an MC test. My point regarding MC questions having distractors, however, is that this is not simply a testing phenomenon; the real world in which decisions will have to be made also contains many distractors.

 

 

You keep equating "real world" scenarios with the intentional distractors in standardized tests. It's not comparable.

 

If you don't see how your statements might be insulting to those who suffer with anxiety, I don't know how to make that more clear to you.

 

The examinations CAN BE a measure of performance, but depending on the student and test, that is not always the case.

 

 

 


I'm not an ACT/SAT apologist, but this design helps to elicit a distribution of scores at the top end. Many more students could do well on these tests if given more time or fewer questions. That's not the point.

(Now, what these tests are actually revealing about students - that's another discussion. But as for one reason *why* they are designed this way, this is certainly a reason.)

I have had the opportunity to serve on graduate admissions committees for a number of years.  Generally, we were dealing with GRE and GMAT scores and college GPAs rather than ACT/SAT and high school grades.  At this level, you often have the opportunity to see which students do perform well in the academic studies.

 

Students with high test scores and high GPAs tended to do well in the programs; they had the background knowledge and skills to be successful; a few were not successful because of life events or because of lack of ambition, drive, etc. Students at the low end of the distribution for grades and test scores tended to do poorly. Those who were more in the middle were where things became a bit trickier. We found that a good predictor of success for this group was asking them a question during the interview process: "What do you read?" Candidates who revealed that they did not read much tended not to be able to keep up with the workload and pace of the program.


I have had the opportunity to serve on graduate admissions committees for a number of years. Generally, we were dealing with GRE and GMAT scores and college GPAs rather than ACT/SAT and high school grades. At this level, you often have the opportunity to see which students do perform well in the academic studies.

 

Students with high test scores and high GPAs tended to do well in the programs; they had the background knowledge and skills to be successful; a few were not successful because of life events or because of lack of ambition, drive, etc. Students at the low end of the distribution for grades and test scores tended to do poorly. Those who were more in the middle were where things became a bit trickier. We found that a good predictor of success for this group was asking them a question during the interview process: "What do you read?" Candidates who revealed that they did not read much tended not to be able to keep up with the workload and pace of the program.

Is this graduate school in the business/accounting/finance field?


She did try that. She ended up dropping the extra class as soon as it got iffy. Now, she only takes 12 at a time so that she has to push through. She feels like she will always drop if things get hard in a class, and that will ALWAYS happen. Personally, I don't like her logic, but it isn't my choice to make. Time will tell if her strategy works or not...

Thanks for the reply. I really hope things work out for her.


You keep equating "real world" scenarios with the intentional distractors in standardized tests. It's not comparable.

 

If you don't see how your statements might be insulting to those who suffer with anxiety, I don't know how to make that more clear to you.

 

The examinations CAN BE a measure of performance, but depending on the student and test, that is not always the case.

No examination or classroom experience is exactly comparable to real-world scenarios. What I am saying is that people have to deal with distractors in the real world. These distractors are often much more intentional in the real world, in that their purpose is to mislead or confuse. Often on exams these distractors are not "intentional" in the sense that their intent is to mislead or distract--they simply represent wrong answers that many students who incorrectly work the problem will get.

 

The results of an exam are a measure of performance--they are a measure of the performance on that exam under the circumstances the exam was given.  How that performance should be interpreted and whether it measures how an individual would perform in a different situation are questionable.  


It is not necessarily "bad," but it shifts the test from being content-driven to being driven to a higher percentage by test-*taking* skills. Many brains do just fine with it, but for those who have anxiety or some other barrier to success on these types of tests, the test does NOT measure that student's actual aptitude or knowledge content.

 

I still don't understand.  Consider the question:  1/2 + 1/3.

 

One set of possible answers, with "distractors," is (Kiana's example):

 

5/6, 2/5, 1/6, and 3/2

 

another set of possible answers is

 

5/6, 0, 100, pi

 

I would say that the first one (with "distractors") is just a better test. With the second one, a student with better test-taking skills could perhaps get the right answer without even doing the problem. Are you saying that your daughter would do substantially better on the second test because of anxiety? I would say the first test is more content-driven and the second test can be gamed more easily.


Yes, I think there is a sense that if your kid has always struggled with something as important as tests, you should have already figured out that she's not good college material, and you don't get to grieve that as your kid approaches college age. But if your kid is doing well in school, you have a right to get your hopes up for a fancy college and then grieve that when your child bombs the big test.

 

As the parent of a child who is bright and has many positive qualities, but has trouble with various school evaluation tools, I still have hopes for her. I hope that she has the opportunity to meet her considerable potential and enjoy a positive college experience. I am not going to give up, grieve, and get over it before she is a full-fledged adult.

Yeah, it is pretty discouraging to hear that people who don't test well shouldn't be on the College Board in the first place.

 

Couple that with the recent (last couple of decades) push that today's BS/BA is the equivalent of a high school diploma (required for many, many basic and entry-level positions now when it wasn't in years past), and it is even more discouraging...


Yeah, it is pretty discouraging to hear that people who don't test well shouldn't be on the College Board in the first place.

My brother didn't test well all his academic life. He went to a hands-on community college equivalent and then on to university to get a BEng. He would not have survived in an academically structured (less hands-on testing) engineering course. For some people it is really all about fit, like Laura said.

 

My younger, who is slower in speed than his brother, is very good at guessing and eliminating on multiple choice. MCQ tests are really a bad indicator of what he knows (as in, he scores a lot higher than what he knows).


You keep equating "real world" scenarios with the intentional distractors in standardized tests. It's not comparable.

 

If you don't see how your statements might be insulting to those who suffer with anxiety, I don't know how to make that more clear to you.

 

The examinations CAN BE a measure of performance, but depending on the student and test, that is not always the case.

 

I don't understand why it is necessary to walk on eggshells when discussing the design of an objective test.

 

I thought the whole point of objective tests is that they take out the whole "personal" aspect and level the playing field in many ways.

 

A child getting a low or high MC test score, or finding an objective test difficult, should not be viewed as a measure of that child's value, personality, or anything else worth getting insulted about. It's just a score. It just is. If it isn't an accurate reflection of ability, find a way to demonstrate that, but don't get insulted by the test or the test designers, etc.

