Hi all... sorry if this isn't the right forum; I wasn't sure where else to post this.
I am a law student at a very prestigious university, and we just had a grading scandal in one of our classes. I am not really grasping the formula the professor used (though I don't think he had any idea what he was doing, either).
There were 3 sections on the exam: A, B, and C, and each section had a raw score. The professor used an Excel sheet to figure out the standard deviation of each section. The formula he used to calculate the final grade was this:
(ARawScore / Standard Deviation of A) + 3 x ((BRawScore + CRawScore) / Standard Deviation of (BRawScore + CRawScore))
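If it helps to see it concretely, here's a quick Python sketch with made-up numbers (not the actual class scores), just showing what I think his formula computes for one student:

```python
import statistics

# Made-up numbers just to illustrate the mechanics (not the real class scores).
a_scores  = [70, 80, 90, 60, 75]       # hypothetical Section A raw scores
bc_scores = [150, 140, 160, 130, 145]  # hypothetical (B + C) combined raw scores

sd_a  = statistics.stdev(a_scores)     # standard deviation of Section A
sd_bc = statistics.stdev(bc_scores)    # standard deviation of (B + C)

# The professor's formula, as I read it, for the first student:
prof_grade = a_scores[0] / sd_a + 3 * (bc_scores[0] / sd_bc)
print(prof_grade)
```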
Here is what I don't get: is there any mathematical reason to ever divide a raw score by the standard deviation of the set of all raw scores for that section? It really makes no sense to me. What the professor purported to do was weight Section A 25%, Section B 25%, and Section C 50%, but I don't see how that formula gets that result. I think the correct formula should be:
(ARawScore x 0.25) + (BRawScore x 0.25) + (CRawScore x 0.50), which is quite easy... so either I'm totally lost, or the professor has no idea what he is doing.
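Running the same made-up student through the weighted average, so you can compare it with the number the professor's formula spits out above:

```python
# Same hypothetical student as before (b + c = 150 matches bc_scores[0] above).
a, b, c = 70, 75, 75                     # hypothetical raw scores for one student
weighted = 0.25 * a + 0.25 * b + 0.50 * c
print(weighted)                          # 73.75
```

As far as I can tell, dividing by the standard deviation just rescales each section by how spread out the class happened to be on it, which is not the same thing as applying fixed percentage weights.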