I mentioned a story to some people yesterday about the argument over grade inflation in Irish third-level institutions.
The story is on the Irish Times website and is called: "Are standards in our third-level colleges slipping?"
Read the 'Yes' and 'No' sides to the argument here.
4 comments:
A similar process is underway in the Leaving Certificate, where courses have been pared down to the nub over the last couple of decades and, not surprisingly, grades are on the rise. The stated motivation, in the case of the science subjects at least, is to make them "more attractive" to students, which seems to be a synonym for "less hard". However, our second-level system is in no way in the same poor condition as the English system, which last year could not distinguish at all among roughly the top 10% of students, whereas our system had 20 potential levels to discriminate amongst high achievers for courses such as medicine, dentistry etc.
What would be really interesting would be to see the predictive validity of third-level grades in terms of job performance. I reckon the level of prediction is extremely poor. Given the variation between colleges (and I'm sure also within colleges) mentioned in the article, and given that a highly standardised system (the SATs) predicts about 15% (a generous estimate) of the variance in third-level performance (more like 10% for A-level grades and somewhat better, about 20%, for the Leaving Cert), it is unlikely that an unstandardised system would yield better predictions, bringing one to question the value of having anything more than a pass/fail degree system for supplying the job market!
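To put those variance figures in perspective, here is a minimal sketch of the arithmetic: variance explained is the square of the correlation, so the rough 10%/15%/20% estimates above correspond to correlations of only about 0.3 to 0.45. The figures are the comment's rough estimates, not exact published values.

    import math

    # Rough variance-explained (R^2) figures quoted above -- the commenter's
    # estimates, not exact published values.
    r_squared = {
        "SATs -> third-level performance": 0.15,
        "A-levels -> third-level performance": 0.10,
        "Leaving Cert -> third-level performance": 0.20,
    }

    # Variance explained is the square of the correlation, so even the best
    # of these corresponds to a correlation of only about 0.45.
    for predictor, r2 in r_squared.items():
        print(f"{predictor}: R^2 = {r2:.2f}, r = {math.sqrt(r2):.2f}")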
I'm not sure I understand the comparison with the UK. If the "20 potential levels" refers to points: is there really a difference between the cognitive ability & long-run productivity of people on 540 and 560 points? I doubt it. The difference would be swamped by school & SES effects.
It's like rankings of universities or departments: one can only quantify so far (in the latter case, to a degree that is not useful).
My experience as a lecturer suggests that there is a big and genuine difference between people who get 1sts and those who get 2.2s, say. Of course it is a noisy signal, and employers know this.
That said, I don't know much about the returns to a "good degree". This may be data-driven, but it's also tied up with the perennial signaling vs. human capital issues.
One could, data permitting, do a regression-discontinuity design study comparing the earnings of those who just missed a 1st with those who just got it, etc. (a rough sketch of the idea follows below).
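Purely as an illustration of that design, here is a minimal sketch. Everything in it is hypothetical: the variable names (final_mark, earnings), the cutoff of 70 and the bandwidth of 5 are assumptions, not features of any real dataset.

    import pandas as pd
    import statsmodels.formula.api as smf

    def sharp_rdd(graduates: pd.DataFrame, cutoff: float = 70.0, bandwidth: float = 5.0):
        """Local linear RDD estimate of the earnings jump at the 1st-class cutoff."""
        # Keep only graduates whose final mark is within the bandwidth of the cutoff.
        local = graduates[(graduates["final_mark"] - cutoff).abs() <= bandwidth].copy()
        local["centred"] = local["final_mark"] - cutoff           # running variable
        local["got_first"] = (local["centred"] >= 0).astype(int)  # 1 if awarded a 1st
        # Separate slopes on each side of the cutoff; robust standard errors.
        model = smf.ols("earnings ~ got_first * centred", data=local).fit(cov_type="HC1")
        # The coefficient on got_first is the estimated discontinuity in earnings.
        return model.params["got_first"], model.bse["got_first"]

    # e.g. effect, se = sharp_rdd(graduates_df, cutoff=70, bandwidth=5)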
The 20 levels point referred to the increments on the Leaving Certificate scale which distinguish those scoring in the top 10% of students. As this is a test of achievement, it can only be used as a rough proxy for the main predictors of job performance: cognitive ability (25% of variance) and conscientiousness (16% of variance). The increments are mainly useful as a cheap way of choosing amongst candidates for high-points courses, rather than engaging in an expensive (and probably flawed) interview process or paying the likes of ACER hundreds of euro to provide aptitude tests (the predictive validity of which has not been determined in an Irish context). Both interviews and aptitude tests are influenced by school and SES factors also.
It would definitely be interesting to see the predictive value of college and departmental rankings. In the case of secondary schools, we correlated the school rankings (based on the number of third-level entrants) year on year over three years and found an average correlation of .5, indicating very large student cohort effects (and perhaps, to a small extent, staff turnover effects). A rough sketch of that calculation is at the end of this comment.
It makes you wonder, in the face of such large student cohort effects, whether it would really be beneficial to cap quotas of firsts, 2.2 degrees etc., as has been done at Princeton.
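For what it's worth, here is a minimal sketch of that year-on-year ranking correlation, assuming a hypothetical table with one row per school and one column per year of third-level entrant counts.

    import pandas as pd
    from scipy.stats import spearmanr

    def yearly_rank_stability(entrants: pd.DataFrame) -> float:
        """Average rank correlation between consecutive years' school rankings."""
        years = list(entrants.columns)
        rhos = []
        for y1, y2 in zip(years, years[1:]):
            rho, _ = spearmanr(entrants[y1], entrants[y2])  # rank correlation of the two years
            rhos.append(rho)
        return sum(rhos) / len(rhos)

    # An average around .5 means the rankings reshuffle substantially from one
    # cohort to the next, consistent with large student cohort effects.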