Tuesday, October 31, 2006

Our Survey Says...Exactly What We Wanted (6)

Today's Survey...Universities UK produce a survey which doesn't say anything...

Student work rates 'vary widely'

http://news.bbc.co.uk/1/hi/education/6099768.stm

The work put in by students in England varies greatly between both subjects and institutions - bringing the value of a degree into doubt, a report says.

Er, no it doesn't. The BBC even contradicts this claim later in the article: "The report does not prove that the degree classification system is flawed, but it certainly raises questions that need to be addressed".

Of course, as anyone who's been reading this blog over the past week will realise, the devil is in the detail. In the preamble to the report, Universities UK qualify the results thus:

"5. The survey provides the most detailed account yet of what students receive when they study at an English university. Inevitably, though there are limits to the conclusions which can be drawn on the basis of the survey. The paragraphs below set out the most important of these considerations.

a. The survey reports the responses students gave to the questions asked about the number of hours of teaching they received, their own academic effort and their own satisfaction with their experiences. It may not, therefore, provide a definitive quantification of the amount of teaching provided in English universities; the accounts students give may be unreliable.

So the data they are basing their conclusions on could be unreliable.

b. The survey has produced a set of quantitative indicators which describe what is provided in English universities but there is no suggestion that these are indicators of the quality of education. That is quite a different matter, and the formal teaching students receive -- and the amount of private study they undertake - are just some of the inputs that go towards determining the quality of the experience.

The data has been quantified in a way that does not reflect the quality of the education the students actually receive.

c. The measures of satisfaction reported here are not intended to replicate or substitute for those provided by the National Student Survey (the latter provide a guide to overall levels of student satisfaction). They have been included to enable us to establish whether there is a link between the quantity of the different types of provision students report receiving and their satisfaction with it.

Eh? So the satisfaction figures aren't meant to stand in for the National Student Survey's; they're only there to test for a link with the quantity of provision.

d. Whilst the sample is large, it is not large enough to provide reliable information on every subject offered in every institution. Because we required a minimum level of response before the results were treated as reliable there are many institutions where results are not shown. However, sufficient are shown to enable lessons to be drawn about provision across the sector as a whole. Annex B provides information about the sample.

It doesn't cover what it claims to cover, i.e. all the universities in England.

e. Where students are asked to reply in terms of activity in a week, it should be borne in mind that universities have different numbers of weeks in an academic year (and in particular Oxford and Cambridge have fewer than others). These results (in these and other respects) cannot therefore be taken as saying all there is to say about the amount of provision that students receive.

And it's been skewed by other factors localised to certain universities.

So let me get this straight. The data is potentially unreliable, doesn't measure quality, uses satisfaction measures that aren't meant to stand on their own, doesn't cover everything it purports to, and is skewed by local circumstances.

How the hell can its results be treated with anything other than scepticism? To paraphrase - "The report is flawed, and it certainly raises questions that need to be addressed" - such as why the hell don't journalists bother to check things before publishing stories based on surveys, when even the most cursory glance would demonstrate that this one is flawed to the point of being utterly useless?
