Students take to Twitter to express anger about GCSE Biology paper

Janet Downs

‘It had nothing to do with biology!’ a 15-year-old told me after she’d taken her AQA GCSE Biology Higher Unit 1 paper today.

She’d spent time revising; she made mind maps; she downloaded past AQA Biology papers to practise.  She thought she'd done enough.

But she’s not happy.  She, like thousands of her peers, feels she hasn’t been given a chance to shine.

One of the questions contained a table showing the percentage of girls and boys who drank particular alcoholic drinks: beer, lager, cider.   She was asked to compare the amount of alcohol consumed and then use the data to disprove a statement like, ‘All boys drink alcohol’.

This sounds more like statistics than biology. 

Another question was about ‘an independent company’.   Business studies, surely?

The paper created a Twitter storm.   One student was furious that she’d ‘wasted the last two years learning the carbon cycle, IVF and hormones for no reason.’   Another wanted a refund for the money spent on a revision guide which included nothing that appeared in the exam.  A third wrote:

‘NOTHING on hormones/menstrual cycle. NOTHING on cloning.  NOTHING on vaccination.  Who designed this exam, the drunk hamsters?’

An AQA spokesman told the Telegraph: ‘Exams aren't meant to be easy and students are obviously going to tweet about that, but there was nothing wrong with this paper.’

Candidates disagree.

The Telegraph quotes ex-teacher Pete Langley, founder of Get Revising, who advised students ‘not to panic’.  "They’re not alone in feeling this way, we are seeing a record number of posts about the AQA GCSE Biology exam on The Student Room which can provide some reassurance post-exam.”

Langley said it sounded like the exam was ‘really tough’ which could leave pupils ‘feeling disappointed after all the hard work they’ve put into preparing for it.’

It doesn’t appear to be just disappointment over a ‘tough’ exam.  The students on Twitter felt the exam hadn’t given them the chance to demonstrate the work they’d put in.

Students have two more AQA Science papers to take: Chemistry and Physics.   Today’s experience will have left many feeling angry and wondering if the next two papers will be as poorly written as the Biology paper appeared to be.

This isn’t the first time AQA has faced criticism over a science paper.  Last year their GCSE Chemistry exam contained a passage which answered one of the questions.  On that occasion, pupils used Twitter to express their glee.  Today, however, pupils are tweeting their anger.

If exams are to be of any use to pupils, future employers, university admission tutors or companies offering apprenticeships, they must be reliable.  If papers seem unfair, overly difficult or too easy, if grade boundaries fluctuate from year to year, then confidence in the exam system is destroyed.  And the greatest losers are the young people, like the one I spoke to today, who take exams which don't appear to be fit for purpose.

AQA needs to explain and justify today's Higher Unit 1 Biology paper.

UPDATE 30 May 08.45   Schools Week published this comment by a parent of a pupil taking AQA Biology in its Readers' Reply column on 27 May 2016 (comment not available online):

'My son checked the cover of his GCSE biology paper twice during the exam as he thought he must have been given the wrong paper by mistake.  So did his friend and so, according to social media, did other students.  It wasn't that the exam was hard, it was that it was odd and lacked opportunities to show off the biological knowledge they had been learning.  He felt short-changed.  He's really good at maths so being asked to do lots of data analysis wasn't hard for him.  If anything it made the paper easier.  But he wanted to be tested on the biology he had worked hard to learn - or at least some of it.'   (My highlights).

 


Comments

rogertitcombe
Wed, 18/05/2016 - 17:04

This issue draws attention to the lack of clarity of the purpose of education and exams. We are drowning in the consequences of the marketisation paradigm that incentivises behaviourist approaches to teaching, learning and testing. While not defending the exam board, which appears to be just as confused, the response of the pupils is even more depressing.

“It’s my first GCSE exam that I’ve done and it’s like ‘are all GCSE exams going to be stuff that I haven’t been taught’?”

If only! That would indicate a healthy education system that encouraged deep, developmental learning for understanding. Is this why the English system performs so poorly on the PISA tests? These are designed to test understanding, not rote learning. The questions deliberately involve contexts unlikely to have been specifically covered by a syllabus, but which a candidate with good understanding can answer.

The marketisation paradigm is corroding standards and this is getting worse. For example, making little kids remember formal grammatical terms in KS1 is NOT a route to deep developmental learning. See

https://www.washingtonpost.com/blogs/answer-sheet/post/how-germ-is-infec...

We are in a truly dire state if this level of confusion exists between exam boards, schools, teachers and pupils. See LSN posts

http://www.localschoolsnetwork.org.uk/2016/02/educational-lysenkoism-is-...

and

http://www.localschoolsnetwork.org.uk/2016/03/telling-is-not-teaching-an...

The comments of pupils actually reveal how badly they have been taught. The questions they are complaining about are really easy! I fear that the expectation of teachers and pupils is that 'teaching IS telling, and listening, remembering and regurgitating IS learning'.

Of course it is not the students' fault that the entire school system is now so poorly focused on deep learning. Things will only get worse.


Janet Downs
Thu, 19/05/2016 - 08:19

Roger - the GCSE paper under discussion didn't just expect pupils to regurgitate facts (although I know that's increasingly likely in one-off summative exams).  It contained data analysis (as have past AQA papers).  However, candidates expect exam papers to concentrate on what they have studied so they can apply that knowledge.  But if the exam asks questions about topics which weren't on the syllabus, then this isn't a fair test of what pupils know, understand and can do.  It's as if pupils went into an English exam expecting to be tested on Macbeth but faced questions about Henry V.

The AQA question re teenage preferences for particular alcoholic drinks apparently asked pupils to use the given statistics to disprove a statement like, 'All boys drink alcohol'.  It appears this question could have been answered by anyone who could analyse data - it needed no prior knowledge.  If my assessment is correct (and I haven't seen the paper, I'm relying on what I've heard), then this isn't a fair question to ask in a biology exam.  It doesn't give candidates a chance to apply their biological knowledge.

As far as PISA is concerned, science is the one subject in which UK 15-year-olds score above the OECD average.  Scoring above an average isn't performing 'poorly'.  And I would argue that scoring at the average in reading and maths isn't performing 'poorly' either - it's not brilliant but it's not dire.

The pupil I spoke to didn't expect a Higher GCSE exam to be easy - but she did expect it to test her knowledge of biology and how well she applied this.   It appears the AQA exam did not do so.  She wasn't complaining that the questions were too hard but that they didn't relate to the expected subject matter and, in the case of the alcohol question, could have been answered by someone with no biological knowledge.

The AQA exam currently being assessed doesn't rely solely on a one-off, end-of-course exam.  It also has a controlled assessment element.


rogertitcombe
Thu, 19/05/2016 - 11:51

I fear you are quite wrong about this. Science exams have always asked questions that test the application of knowledge in a context not specifically covered by the teacher or mentioned on the syllabus. Physics GCSE exams can and should ask questions about the interpretation of data in the form of graphs that relate to experiments pupils may never have heard of. In a biology GCSE exam pupils might be given a 'food web' containing animals they have never heard of, let alone studied, and then be asked to interpret the effects on future populations if one species or another were to be culled. Quantitative chemistry calculations can be set on reactions involving elements and compounds never studied. That is actually at the core of all science, because science is completely meaningless as a mere collection of memorised facts about 'carbon cycles' or 'water cycles' and the like.

In my view these were good questions because not only did they include data analysis in a biological context but they were about contemporary issues like nutrition and substance abuse.

This example highlights the failures of our marketised education system just as clearly as the Perry Beeches and other academy fiascos. In fact it is much more important than stories about financial malpractice, because it relates to our children being defrauded out of a proper developmental education through the deliberate corruption and degradation of the teaching methods in our schools by our GERM-infected, ideologically obsessed government.

I would go further. As you know, I argue that our behaviourist-corrupted education system is making our kids dimmer. It is no criticism of this exam that students who have been properly taught in a decent school with a proper developmental pedagogy can answer those data analysis questions; but it says everything about the teaching of pupils who have studied maths and science and still can't. They have been made dimmer by their behaviourist school experience, as I predicted in 'Learning Matters' would happen as a result of the failures of our marketised school system.

Look how far we have regressed even from Bloom in the early 1950s. Bloom's taxonomy has 'Knowledge Recall' as the basement level. Two layers above is 'Application', still in the bottom half of the pyramid, yet these students cannot manage the very easy questions that test Application in this GCSE Biology exam. Where are the questions that test the higher Bloom levels of 'Analysis', 'Synthesis' and 'Evaluation'?

See Section 1.7, p31 in your copy of 'Learning Matters'.

Are we saying that the higher levels of Bloom are no longer appropriate at GCSE level? This was not the case in the late 1980s with the Leicestershire Modular Framework. See Section 5.7, p125 of 'Learning Matters'. In this Mode 3 GCSE, 'Application' (which these complaining students couldn't do at the simplest level) would only get you a GCSE E/D grade. For grades A to C, some evidence of performance at the top three Bloom levels was needed.

Section 1.10, p36 of 'Learning Matters' is a historical analysis of GCSE, CSE and GCE grades setting out the levels of knowledge and understanding that used to be needed. This too makes it clear that the top three Bloom levels were always needed for grades A-C. This was debased mainly following the post-1997 Labour governments' degradation of standards, which followed from the failure of so many schools to get anywhere near the government's newly introduced 'zero tolerance of failure' benchmarks. Now the C grade has dropped through 'average attainment' down to the 'expected' attainment of every child without severe learning difficulties who doesn't take a family holiday for two weeks of term time in Benidorm.

Then we come to your usual attacks on conclusions drawn from the English system's PISA performance. Yes, the first round had some statistical issues, but given that the UK was one of the first countries to bring about universal state education in the 19th century, and the brilliant work being done at that time by Richard Dawes in King's Somborne as described in Section 5.8, p132 of 'Learning Matters', the PISA performance of English schools in the 21st century really is poor. Richard Dawes's country kids would have done better, as a close study of how and what was taught in his school reveals.

Gove was right about two things only. That the English state education system is underperforming - even worse now following his disastrous marketisation reforms. And that the 'vocational equivalents' introduced by Blair to justify his 'zero tolerance of failure' policies were a scam and an educational disaster.


Janet Downs
Thu, 19/05/2016 - 13:34

Roger - I haven't attacked PISA, just pointed out what the results in 2006, 2009 and 2012 said: 15 year-olds in the UK score at the OECD average for reading and maths, and above average for science.  If quoting PISA results accurately is attacking PISA, then I'm at a loss.

 


Janet Downs
Thu, 19/05/2016 - 13:38

Roger - Re the AQA question.  The candidates expected data analysis which related to science - they'd been taught it and practised it.  But the alcohol question contained a table about alcoholic drinks preferred by teenagers.  It was not scientific data.  It was not experimental data (unless it was part of blind tasting for marketing purposes, but that is neither scientific nor ethical if teenagers were involved in tasting the booze).  It's like including a table showing which brands of chocolate are favoured by men and women and then asking candidates to use the data to disprove the statement, 'Women dislike chocolate'.  Where exactly is the science in that?

The pupils were not saying they couldn't do the analysis - they were complaining about the lack of scientific knowledge needed to do the analysis.  And they certainly evaluated the question - they decided it didn't test what it was supposed to test: knowledge, understanding and analysis of a science problem.

 


rogertitcombe
Thu, 19/05/2016 - 14:37

From the comments in the Guardian on 18 May, all the complaints from the students that I read were saying how upset they were because, after all their revision, they could not do these questions because they were not on the syllabus and their teacher had not taught them the answers (presumably to remember in their revision). I feel very sorry for these students. In fact I am upset and angry on their behalf. It is certainly the case that science teachers, the Association for Science Education and Ofsted have been complaining for years about the drift away from practical science in favour of pupils being sat in rows being told stuff.


rogertitcombe
Thu, 19/05/2016 - 15:20

Janet, it is a matter of interpretation. The following is from the official OECD 2012 data summary booklet. I have emboldened all the countries with positive mentions about anything.

What the data tell us:

• Shanghai-China has the highest scores in mathematics, with a mean score of 613 points – 119 points, or the equivalent of nearly three years of schooling, above the OECD average. Singapore, Hong Kong-China, Chinese Taipei, Korea, Macao-China, Japan, Liechtenstein, Switzerland and the Netherlands, in descending order of their scores, round out the top ten performers in mathematics.

• Of the 64 countries and economies with trend data between 2003 and 2012, 25 improved in mathematics performance.

• On average across OECD countries, 13% of students are top performers in mathematics (Level 5 or 6). They can develop and work with models for complex situations, and work strategically using broad, well-developed thinking and reasoning skills. The partner economy Shanghai-China has the largest proportion of students performing at Level 5 or 6 (55%), followed by Singapore (40%), Chinese Taipei (37%) and Hong Kong-China (34%). At the same time, 23% of students in OECD countries, and 32% of students in all participating countries and economies, did not reach the baseline Level 2 in the PISA mathematics assessment. At that level, students can extract relevant information from a single source and can use basic algorithms, formulae, procedures or conventions to solve problems involving whole numbers.

• Between 2003 and 2012, Italy, Poland and Portugal increased their shares of top performers and simultaneously reduced their shares of low performers in mathematics.

• Boys perform better than girls in mathematics in only 37 out of the 65 countries and economies that participated in PISA 2012, and girls outperform boys in five countries.

• Shanghai-China, Hong Kong-China, Singapore, Japan and Korea are the five highest-performing countries and economies in reading in PISA 2012.

• Of the 64 countries and economies with comparable data throughout their participation in PISA, 32 improved their reading performance.

• On average across OECD countries, 8% of students are top performers in reading (Level 5 or 6). These students can handle texts that are unfamiliar in either form or content and can conduct fine-grained analyses of texts. Shanghai-China has the largest proportion of top performers – 25% – among all participating countries and economies. More than 15% of students in Hong Kong-China, Japan and Singapore are top performers in reading, as are more than 10% of students in Australia, Belgium, Canada, Finland, France, Ireland, Korea, Liechtenstein, New Zealand, Norway, Poland and Chinese Taipei.

• Between the 2000 and 2012 PISA assessments, Albania, Israel and Poland increased their shares of top performers and simultaneously reduced their shares of low performers in reading.

• Between 2000 and 2012 the gender gap in reading performance – favouring girls – widened in 11 countries.

• Shanghai-China, Hong Kong-China, Singapore, Japan and Finland are the top five performers in science in PISA 2012.

• Between 2006 and 2012, Italy, Poland and Qatar, and between 2009 and 2012, Estonia, Israel and Singapore increased their shares of top performers and simultaneously reduced their shares of low performers in science.

• Across OECD countries, 8% of students are top performers in science (Level 5 or 6). These students can identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations.

OK, the list is dominated by China and other Asian countries, but plenty of European countries get a positive mention for improvement.

We get no mention at all.

In the tables it is interesting to note that our performance in maths and science has declined.

And all this over almost two decades of frantic benchmarking, academisation, zero tolerance of failure, forced school closures, and £billions of public money spent on megasalaries for 'hero school leaders' and on academies and free schools. Don't forget that in all this time, in the vast majority of these other countries, there have been no major educational initiatives at all, just schools getting on with their work, never making the news or causing any controversy. The exception is Sweden, which went into steep decline after implementing education policies very like those being forced onto our school system.

Would any analysis of these data suggest that our schools have much to show for all the pain of the last 20 years?

Never averse to sticking my neck out, I predict that our performance in the next PISA round will be worse still.

In relation to the AQA biology GCSE exam, it seems to me more likely that the students' complaints reflect bad teaching than that the exam board set questions 'not on the syllabus'. I repeat the last line of the OECD analysis of the PISA science results.

Across OECD countries, 8% of students are top performers in science (Level 5 or 6). These students can identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations.

AQA are clearly trying and are getting slated for their efforts. I remain surprised at such criticism from LSN.

 


Janet Downs
Thu, 19/05/2016 - 16:27

First key finding from OECD publication re PISA 2012 results for UK:

'The United Kingdom performs around the average in mathematics and reading and above average in science, compared with the 34 OECD countries that participated in the 2012 PISA assessment of 15-year-olds.'

You might be right, however, about results in the next round of PISA declining.  Andreas Schleicher, the OECD's PISA statistics guru, says this is likely in maths because of superficial teaching.

 


Janet Downs
Thu, 19/05/2016 - 16:45

To repeat: the AQA exam has a practical element - it's a controlled assessment.  Re 'not on the syllabus' - it's to be expected, surely, that an exam sticks to the syllabus by asking pupils what they know, understand and can do with the information learnt via the syllabus.  If, for example, the syllabus did not include, say, genetic modification of food, then it would be unacceptable if the exam contained a question on GM crops.  And it would be unfair to conclude from the pupils' inability to answer such a question that they had all been taught 'badly'.

That said, AQA told the Guardian that all questions were connected with the syllabus.   Even the 'alcohol' question could be said to be connected with the syllabus because it covered drugs etc.   But the crux of the matter is that the alcohol question did not seem to have anything to do with science.    As the latest Guardian comment says (and from someone who's not sympathetic towards the students):

'The teenage alcoholic drinks question didn't require knowledge of any thing, it was mostly graph reading skills... I'm not saying this is right, it does have nothing to do with biology.'

'...nothing to do with biology'.   What justification can there be for including a question on a biology paper which had nothing to do with biology? 


rogertitcombe
Thu, 19/05/2016 - 17:08

Just because genetic modification of food is not on the syllabus would not rule out a data analysis question about a trial related to pesticide resistance or yield.

I have to repeat that science exams have always contained data analysis questions in contexts not mentioned on the  syllabus. I know because I used to write them. In the distant past I was a Chief Examiner for GCSE and CSE.


Janet Downs
Thu, 19/05/2016 - 17:25

Roger - that's true, and that would be fair enough, because such a question (a) gives scientific data, and (b) requires scientific analysis.  But I understand the alcohol question gave no scientific data - it was about teenage preferences.  And the question required no knowledge of science to answer.  It's about as daft as including a table of which sweets children like best and asking candidates to use the data to prove 'children like sweets'.

Candidates expect to analyse scientific data in contexts unknown but not to analyse data which has nothing to do with science and then answer a question which requires no knowledge of science.  How is that allowing candidates to show their scientific understanding and analysis?


rogertitcombe
Thu, 19/05/2016 - 18:22

My understanding of the question was that it was testing the ability to make logical deductions rather than illogical inferences from data.

'Fish are animals that live in the sea, but just because something is an animal and it lives in the sea does not mean it is a fish'. I haven't seen the question, but I suspect it is that kind of thing.
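To make the distinction between valid and invalid inferences concrete, here is a minimal sketch with invented figures (the real table and percentages are not reproduced anywhere in this thread, so the numbers below are purely hypothetical):

```python
# Hypothetical percentages of 100 boys who said they drink each drink.
# (Illustrative figures only; the actual exam data are not reproduced here.)
boys = {"beer": 40, "lager": 30, "cider": 20}

# Invalid inference: "no single figure is 100%, so not all boys drink".
# That alone proves nothing, because one boy may drink several drinks.

# Valid deduction: every boy who drinks anything appears in at least one
# category, so the share drinking any alcohol is at most the sum of the
# percentages (the union of overlapping groups cannot exceed the sum).
upper_bound = min(100, sum(boys.values()))

if upper_bound < 100:
    print(f"At most {upper_bound}% of boys drink alcohol, "
          "so 'all boys drink alcohol' cannot be true.")
```

With these made-up figures the bound is 90%, which is enough to refute the statement; the point is the form of the deduction, not the numbers.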

This sort of thing comes up all the time in biology and is at the core of all science.


Janet Downs
Fri, 20/05/2016 - 09:29

Except that the question appeared to be about which alcoholic drinks were most preferred by teenagers.  If the same were applied to your fish example, the table wouldn't include biological information about fish but would list which fish were the most popular in chip shops.  It's hardly the core of science to base analysis on consumer preferences.


Tatiana
Fri, 20/05/2016 - 09:48

Roger
To decide whether the paper was adequate or not, we should at least have a look at it. Judging whether the questions were easy or not relying solely on students' complaints might be erroneous. It could be that the paper as a whole didn't reflect the specification, but the frustrated students just cannot convey that impression, complaining instead about some insignificant details.
On a different note, I have come across inconsistencies and absurd expected answers in the mark schemes for A level physics. For example, when essentially the same question comes up in two different papers of the same specification, the mark scheme for one year might require a specific term, while the mark scheme for another year says that term is inappropriate and would score zero marks. Even in mathematics, where one would expect the mark scheme to be the most straightforward, I remember a wrong answer in the mark scheme for an Edexcel A level paper.
So I wouldn't be too optimistic about the quality of exam papers.


rogertitcombe
Fri, 20/05/2016 - 12:36

Janet - I really need to see the paper and the syllabus to be sure about my response to your points. However from what I have picked up, this question has got nothing to do with consumer preferences but with what can be logically inferred from a set of data provided about them and what cannot. Much more than physics and chemistry, biology involves the observation of things that vary in all sorts of ways. It has its origins in 'Natural History' and did not become a 'hard predictive science' until Darwin and genetics. Much mainstream experimentation in biology involves observational pattern data just like that presented in the question, so it is perfectly reasonable to set a data analysis question that tests the ability to come to valid conclusions from that sort of data.

My understanding is that the syllabus has a foreword that lists assumed, 'skills and understanding common to the interpretation of any scientific data'. This is normal, indeed essential, for all science exams.

It may indeed reflect my prejudice in relation to how marketisation has corrupted teaching methods, but this controversy does seem to me to indicate biology lessons bereft of practical lab work and the routine interpretation of the data it produces. From what the students say in their Guardian comments, there is a clear expectation on their part that if they memorise all the stuff told to them by the teacher and learn all the stuff handed out, then they should be able to answer the questions in the exam. But telling is not teaching and listening is not learning.

 

 


rogertitcombe
Fri, 20/05/2016 - 13:12

I am not complacent about the quality of exam papers. I spent much of my career complaining about their shortcomings. I am also sure that privatising the exam boards has lowered standards.

I used to be infuriated by 'closed' mark schemes in science exams. I remember this (rather trivial) question.

"Why can't penguins fly?"

The marking scheme provided by the Chief Examiner allowed only for the following.

'because their wings are too small' - 1 mark or

'because their wings are too small for their weight' - 2 marks

I came across this response

'because penguins eat fish they catch in the sea and they don't have to be able to fly to do this, they use their wings to swim instead' - ZERO MARKS!!

This for a student showing a greater depth of understanding than the person who set the question.


Janet Downs
Fri, 20/05/2016 - 13:24

'Skills and understanding common to the interpretation of any scientific data'  - the key word is 'scientific'.  If the data presented doesn't appear to contain any scientific data, then it's not testing the related skills and understanding...  

This particular exam DID have a practical element - a controlled assessment was done a few weeks ago.  In this case, lessons were not 'bereft of practical lab work' which, as you rightly say, is essential for scientific understanding.

You also assume that the pupils were told stuff rather than having studied stuff.  There is, as you rightly point out, a difference.  But just because candidates complain the test didn't cover what they studied does not necessarily mean they just expected to regurgitate stuff told to them.   It could do... but then again, it could not.   


rogertitcombe
Fri, 20/05/2016 - 14:48

This is what a student (Jessica Clark) wrote in the Guardian:

The teenage alcoholic drinks question didn't require knowledge of any thing, it was mostly graph reading skills- all the data was given, it's part of a "how science works" section that is in every exam

All science exams need to test understanding of 'how science works'. I understand that the syllabus states this overtly.

"3.2 
How Science Works 
This section is the content underpinning the science that candidates need to know and understand. Candidates will be tested on How Science Works in both written papers and the Controlled Assessment."

Another student (Libby T) wrote this in the Guardian:

There wasn't a question about which drink boys preferred. The question featured a graph with the percentages of 100 15 year old boys and 100 15 year old girls who liked different types of drinks. Then there was a statement asking why it wouldn't be true to say that 'all boys drink alcohol'. It was an interpretation question; evaluation and basic science skills are right at the start of most biology textbooks.

I would go so far as to say that this is an excellent science question. Any graphical interpretation data would have done, but the examiner took the trouble to find some data related to teenage drinking habits. Why wouldn't any teacher seek to develop understanding of data interpretation, an essential core skill, using a context familiar to the age group? The examiner is just doing the same - well done. The key word is 'develop'. Facts can be 'taught'; concepts have to 'develop' in the mind of the learner - unless you believe in behaviourism, which asserts that if you tell a student something enough times so that they remember what they have been told, then that is the same as understanding a concept. It is not.

This is how Vygotsky put it:

As we know from investigations of concept formation, a concept is more than the sum of certain associative bonds formed by memory, more than a mere mental habit; it is a genuine and complex act of thought that cannot be taught by drilling, but can only be accomplished when the child’s mental development has itself reached the requisite level.

Clearly no amount of revision would help secure the conceptual understanding needed to answer this question. That is why the behaviourist pedagogy of marketisation fails. This is what my book, 'Learning Matters' is all about.

I really do not understand your distinction between 'scientific' and 'unscientific' data. Data are just data, and all data are scientific data: if they are data about the real world out there, that is science. If they are data about the behaviour of living things out in the world, that is biological science. This really is important. For example, it is quite wrong to think that chemistry is about 'chemicals'. Chemicals are all the 'stuff' out there that the universe is made of, not just the stuff in labelled bottles and jars in chemistry labs.


Janet Downs
Sat, 21/05/2016 - 09:15

'The teenage alcoholic drinks question didn't require knowledge of any thing, it was mostly graph reading skills- all the data was given, it's part of a "how science works" section that is in every exam.'

Exactly.  It 'didn't require knowledge of anything'.  It was data analysis which could be done by anyone who could analyse data.  I expect I could have made a decent response.  But how would that have tested my knowledge of biology, since that is woefully lacking?

Libby T says there wasn't a question about which drinks boys preferred.  She follows this by saying there was a graph which showed percentages of boys/girls who 'liked different types of drinks'.    That, surely, is preference.  There is no scientific basis for the data - it merely shows, er, preferences.   The statement to be disproved, 'all boys drink alcohol', could again be tackled by someone with no knowledge of biology.

If the question had a table, say, of units of alcohol in drinks which teenagers drink and then linked, say, to a question about relative damage of different drinks to young, developing bodies, this would be valid.  But, as Libby T has confirmed, the question could have been answered by someone with no knowledge of biology.  As the purpose of the exam was to test knowledge, understanding and application of biology, it would appear this question didn't do so.
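To see just how little biology such a question needs, here is a minimal sketch in Python. The percentages are invented for illustration only - they are not the actual AQA data:

```python
# Invented figure standing in for the exam table - NOT the actual AQA data.
# Suppose the table reported that 56% of boys drink some alcoholic drink.
boys_drinking_any_alcohol = 56.0

def disproves_all_boys_drink(pct: float) -> bool:
    """'All boys drink alcohol' is disproved if fewer than
    100% of boys drink any alcoholic drink at all."""
    return pct < 100.0

print(disproves_all_boys_drink(boys_drinking_any_alcohol))  # → True
```

No biological concept appears anywhere in that logic, which is exactly the point being made.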

Definition of 'unscientific': 'not done in a way that agrees with the methods of science'.  Gove's dodgy surveys come to mind.


rogertitcombe's picture
Sat, 21/05/2016 - 10:36

No, it did not require knowledge of anything. We agree on that. But inability to answer a question like that would seriously limit the capability of any student to understand much about any science, including biology, regardless of the amount of factual knowledge they had amassed and how many hours of revision they had carried out. That is why the syllabus states that this is essential understanding on which students can be expected to be tested in any of the written papers. Biology without understanding is no more than 'Natural History of Selborne'-type nature study. (Nothing wrong with that of course - it makes delightful reading, but it is not science.) Before Darwin all biology was like that.

The question about the preferences for drinks involves exactly the same principle. Of course animals' (including humans') food and drink preferences are about biology. David Attenborough based his brilliant career and TV programmes on exactly that.

But it would not matter if the questions had absolutely no link to biology at all.

Your point about Gove's dodgy survey's is spot on. What sort of 'broad and balanced' education do all pupils need to enable everybody, not just you and other LSN posters, to recognise the fallacies in what our lying politicians peddle to us? That is why all students should have to study proper, at least double award, science at KS4, as TVEI and the National Curriculum once required, whether a C grade at GCSE can be gained or not. That is one of the main reasons why 'vocational' non-academic streaming, or selection into particular schools, should be banned in the state education system.

If students are not to develop these abilities in maths and science, then where in the school curriculum might they be found? 


rogertitcombe's picture
Sat, 21/05/2016 - 10:38

Sorry typo - Gove's surveys


Janet Downs's picture
Sat, 21/05/2016 - 11:24

Roger - the pupil I spoke to wasn't unable to do the question but was insulted by its triviality.  As I keep repeating, it could be done by someone who could analyse data.  It did not require students to have studied biology.   It did not require pupils to understand biology. As such, I would have been awarded marks.

 It cannot be justified to include in a Biology exam, particularly a Higher Tier one, a question which didn't need any background in biology.  How can students adequately show what they know, understand and can do if they're faced with a question which someone like me, without this background, could answer?

Of course it matters if the questions have some link to biology.    That's what's being examined - biology.  I get your point about analysis, it's an essential skill, but it's to be expected that the data to be analysed in a biology exam bear some passing resemblance to biology and don't, as I keep saying, lead to questions which could be tackled by someone with no knowledge of biology.    And knowledge isn't totally a dirty word - pupils need both: knowledge and the ability to do something with that knowledge.  Analysis doesn't happen in a vacuum.  And analysis in an exam should not be so dumbed-down that it could be answered by someone without the relevant background knowledge.

 


Janet Downs's picture
Sat, 21/05/2016 - 11:27

Roger - on the wider curriculum, of course all pupils should study science to the end of KS4 (and all the Humanities and Arts subjects) irrespective of whether they're examined.    And of course pupils should be taught data analysis.  They should also be taught how language can be used to mislead, persuade and manipulate.   That can be covered in subjects such as English, History and the much-maligned Media Studies (a cynic might say hostility to this subject from politicians and papers like the Mail is fuelled by the concern that pupils might be less likely to be taken in by biased rhetoric if they'd followed a course in Media Studies).


rogertitcombe's picture
Sat, 21/05/2016 - 12:14

Janet - Your persistence on this and your references to Gove's misuse of data prompt me to make further comment.

 I was a career-long science teacher. I taught GCSE science groups at KS4 right through my headship years. For me, science education has always been 'general education for everybody', not primarily preparation for A levels or careers in science or technology, although it can be that as well.

 As you will have picked up from my book, 'Learning Matters', like Carol Dweck and the 'growth mindset' movement, I reject the pedagogy of 'fixed ability and potential'. I do not reject the concept of 'general intelligence', which is why a lot of lefties are uncomfortable with my book. Like Shayer and Adey I believe in 'plastic intelligence' and am convinced by both the academic soundness and the educational efficacy of their approach, which was founded in science education, and which I have seen work in my headship school to great effect.

They both started out as teachers of chemistry who struggled to come to terms with the issue of 'difficulty': why some students couldn't understand hard stuff like 'the Mole concept' (nothing to do with the critters that dig up our lawn), and some could. The answer lies in 'general intelligence', which the political right loves, because it means there is no point wasting an academic education on 'dim kids'. Nor is there, unless there is an approach to schooling that can make dim kids bright. Shayer and Adey devoted their careers to showing how this can be done and proving that their methods work. Dweck and all the subsequent followers build on their work.

 In my headship school, all students took double award GCSE science. Separate sciences were available in addition in the 'extended curriculum'. Double award science could get you into academic science and engineering degrees in Russell Group universities. Our son got AA and A (no A* then) in science and maths and did just that. I also taught physics and chemistry GCSEs in my office to small groups of students in our 'study club' extended curriculum.

 In double award science we often taught things that were not on the curriculum at all. This applied not just to the knowledge content, but also to some aspects of data analysis that went well beyond that required in the AQA biology exam. An example was our 'What can a detective learn from footprints?' unit.

 The 'excuse' (not that we needed one) for this was 'Forensic Science'. The first bit of practical was making plaster casts of real footprints - great fun. Then the science: what can the detectives learn from the footprint and how certain can they be?

 Every student then measured their height and the length of their own feet and filled in their personal data on a large chart on the lab blackboard. When it was completed students made copies in their notebooks. Then the data were plotted onto 'scatter-graphs' using Excel. Some class discussion elicited the suggestion that boys' and girls' data needed to be treated separately, 'because boys had bigger feet than girls'. We then discussed scatter-graphs in general and the concept of 'correlation'. Excel was then used to produce the correlation coefficients. The class scatter-graphs were printed out with a copy given to each student to stick in their notebook. Very rich class discussion then followed.

 Was it possible to work out the most probable height of the owner of a footprint? If so how? Could we be sure of the sex of the person? What about the age? Could the footprint of a boy be confused with that of a woman?

 How certain could we be about our predictions?

 How could this be improved? Would it be better to combine the data for the whole year group? Why would this be important?

 Absolutely none of this was on the GCSE syllabus. Was it science? - Certainly. Was it worth doing? - You judge.

 Is this sort of teaching now happening in Academy schools? - You tell me.

 Were the girls that were complaining in the Guardian likely to have been taught like this? - The answer is obvious.

 So are the complaints about this AQA exam justified? - Certainly not, but they say a great deal about the debased and degraded state of our education system as a result of GERM and the marketisation paradigm.

Still more important, could Gove and his like get away with their lies if all our school students had a proper broad and balanced education?

As for the student you spoke to, she was not representative of the complainants. All the ones I read were complaining that they couldn't do it because it was not on the syllabus and so they hadn't revised for it. It was therefore 'not fair'. That was certainly the line of the Guardian article. I haven't even gone into the 'trails' (sic) question that they also complained about for the same reason.


mike d's picture
Fri, 20/05/2016 - 01:23

USUAL AQA ARROGANCE


rogertitcombe's picture
Fri, 20/05/2016 - 13:13

I don't know who that is aimed at on here. I have got nothing to do with AQA.


Janet Downs's picture
Fri, 20/05/2016 - 09:29

BBC reports that a Northern Ireland school intends to complain to AQA about the Biology paper.


agov's picture
Fri, 20/05/2016 - 11:49

So, hypothetically, is it absolutely fine if an exam paper has nothing at all to do with the prescribed syllabus because the questions might be testing something or other about something or other that students might happen to know from some other random source, or might somehow or other be able to work out from the information given in the question?


rogertitcombe's picture
Fri, 20/05/2016 - 13:00

No agov. Biology is a branch of science. Science is about observation, evidence, prediction and the interpretation of data. So any question in any science paper can and should expect an appropriate level of understanding and skill in all the foundation tenets of science. It is not about 'knowing anything' from any source, random or otherwise. It is essentially about how it is possible to decide whether a generalisation or pattern about the universe, the earth, life on earth including, in the case of biology,  how humans and other animals behave, is true or not. Richard Dawkins is especially good on what science is and what it isn't. This does include being able to 'work things out' from information provided in the question. If it was content specified in the syllabus it would not be necessary to provide the information in the question.

Even if English science exams are not like this, PISA science questions certainly are. They have to be 'content free' so that all students from different countries and cultures can be validly compared and assessed. See this LSN article for an explanation.

Surely the primary purpose of all 16+ GCSE science courses is to ensure an appropriate  degree of 'scientific literacy' in school leavers. The specific content in biology, chemistry, physics and integrated science courses is just the knowledge base used to result in this outcome. 


Janet Downs's picture
Sun, 22/05/2016 - 08:19

Agov - Perhaps we can look forward to the time when candidates entering an exam room won't need prior knowledge of a subject because they'll be given some data to analyse which will test data analysis only.   No need, then, to learn anything so irrelevant as biological terms for biology, the Periodic Table in chemistry, or ohms in Physics (my 'ohmwork', sadly not completed, to my shame).  Similarly in history:  no need to learn about what happened in the past (causes, effects etc);  the exam paper will provide some passages for analysis.    English wouldn't require pupils to have studied books but would be reduced to comprehension passages.

When GCSE was first introduced (when it was a human attempt to provide a qualification showing each pupil's achievement in subjects studied over two years), we were told GCSEs would show what pupils 'know, understand and can do'.    That is: knowledge, understanding and doing (ie interpreting, analysing, evaluating etc).

It's only to be expected, then, that candidates taking an exam expect questions which test these three.

 


Janet Downs's picture
Fri, 20/05/2016 - 13:28

And we've already established that UK 15 year-olds score above the OECD average in PISA Science.  


agov's picture
Sat, 21/05/2016 - 11:23

I suppose ancient Greek theorists might say something similar but I thought you were a hardened fan of the Galilean/Newtonian revolution and actually knowing stuff rather than just talking theoretically irrespective of any knowledge of observed measurable phenomena. If GCSE science courses are actually a bit like a methodology course perhaps I should consider a new career as a science teacher - not that I know anything about gaseous domains. I don't know why the 'knowledge base' should be dismissed as assumed to exist when the exam testing it has (or so it is claimed in this case) no way of exploring whether candidates know anything about it.


rogertitcombe's picture
Sun, 22/05/2016 - 09:51

Now you are being silly, Janet.  All subjects, including all those that you mention, must include all the aspects of the subjects that you also mention.

All subjects require factual knowledge. Who has ever denied that? However the factual knowledge is never enough for any subject.

Take your examples.

A biology student's studies cannot be restricted to knowing the name of everything.

It is not necessary to 'remember' the Periodic Table except on 'University Challenge'. A copy is always provided in chemistry exams. It is essential to understand the patterns revealed in the rows and columns.

In physics, units are named after scientists. The Ohm is the unit of electrical resistance. The crucial understanding is that Resistance = Volts/Amps. It is far more important that you understand that than which scientist the unit of resistance is named after. You would not lose any marks for not knowing the name as long as you used the right symbol for the unit - the Greek capital Omega.
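A worked example of that relationship, with illustrative numbers only:

```python
def resistance_ohms(volts: float, amps: float) -> float:
    """Resistance (ohms) = potential difference (V) / current (A)."""
    return volts / amps

# A 12 V supply driving a current of 4 A through a component:
print(resistance_ohms(12.0, 4.0))  # → 3.0 (ohms)
```

The point stands: applying the relationship matters more than recalling who the unit is named after.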

In history it is not possible to 'know' anything about what happened in the past without evidence. The further you go back in time the more important the evidence becomes. Even many World War II 'facts' are now disputed and require evidence; massacres in Poland for example. Arguments are going on now in the International War Crimes Tribunal about events in the Balkans only a few years ago. We can anticipate much more of this about Syria when those wars are finally over.

In English there is no point studying any book if you can't comprehend (make meaning from) reading it.

'Knowledge, understanding and doing' are indeed all involved in all studies of everything. They are all essential.

However Bloom has six levels in his pyramid. Knowledge is the basement layer. Harder questions test aspects of study further up the pyramid.

I think Bloom is right. There is an extended discussion of this in Section 1.7 of 'Learning Matters', p31. You can even read it free by using the 'turn the pages' function on Amazon.

Have we exhausted this now Janet?


rogertitcombe's picture
Sat, 21/05/2016 - 12:19

Then let us hope that the students who take the next round of PISA science tests have been taught a lot better than those who are complaining about the AQA exam. Given the scale of Academisation, I am not hopeful.


rogertitcombe's picture
Sat, 21/05/2016 - 12:41

agov - I am sure you would make an excellent science teacher. The Galilean/Newtonian/Popper revolution stresses the vital role of hypothesising, experimenting and learning from mistakes. There is no shortage of facts in science or science exams and never will be. Obviously students have to learn about something, but it is the meaning that results that is most important. That is why 'birdwatching' is not 'biological science' and 'trainspotting' is not 'engineering science'. However I confess to having been an avid trainspotter in my youth. I was in good company with the likes of Pete Waterman (not personally). See his brilliant TV programmes.

The point about experiments is that they produce data. The data are no good unless proper conclusions are drawn. Galileo's famous Tower of Pisa weight-dropping experiment is an excellent example. If as a teacher you try to repeat it, the students are always quick to tell you that it doesn't work because the objects do not hit the ground together, for all sorts of reasons that are likely to confuse. Reality is always complicated.

So there is nothing more important than developing the skill of interpreting the results of experiments. It is always really difficult and fraught with traps. That is why Janet is so wrong in agreeing with the student that the AQA data analysis questions were 'trivial' and not worthy of all her efforts in mugging up the facts of her biology course. 

They could both not be more wrong.


Janet Downs's picture
Sun, 22/05/2016 - 08:02

Roger - And let's hope that the next round of PISA (already taken) didn't contain data in the science paper which showed, say, the percentage of cats which liked fish or chicken in jelly and then ask students to use the data to disprove the statement that 'all male cats like jelly'.   It might show skill at data analysis but doesn't say much about the biology of cats, fish or jelly.


Janet Downs's picture
Sun, 22/05/2016 - 08:07

Roger - Of course interpreting experimental data is important in science.   But data which have more to do with marketing (preferences for drinks) do not come from a scientific experiment, in the same way that Gove's surveys weren't scientific (although they were presented as having validity).


rogertitcombe's picture
Sun, 22/05/2016 - 08:58

Janet - A mainstream biology GCSE experiment for decades has involved 'Choice Chambers'. These experiments are entirely about the preferences of animals. They involve providing an enclosed space divided into chambers that have different environmental conditions and, every minute (say), counting the number of animals found in each chamber. For school purposes small creepy-crawlies such as maggots or woodlice are used, with environments like dark/bright, wet/dry etc., but the teaching purpose is general not specific and can apply in principle to all animals.

The practical biological examples are vast. In ecology they may be related to habitat studies in order to understand changes in populations of animals related to their foodstuffs. Many wild members of the cat family could legitimately be studied. The similarities between wild and domestic cats are a legitimate scientific study in relation to evolutionary studies.

In human housing pest eradication the application is obvious. Some pests like fleas, mosquitos and rats carry serious diseases. Major epidemics that have in the past wiped out millions of people are now prevented by means of such biological experiments. Food hygiene is a further application.

In farming - birds (chickens), fish (salmon), mammals (rabbits) and so on - these are vital issues related to food productivity. In parts of Asia I have visited, farmed insects are also eaten.

In plants greenhouse and field trials are mainstream biological agricultural science.

In medicine, bacterial choice chamber experiments are essential for finding cures for bacterial diseases. That is how antibiotics are developed.

There is an important pedagogic principle of introducing pupils to scientific concepts first through examples set in a familiar context. Such PISA (or GCSE) questions would therefore be absolutely fine.

Science (like maths) always gets very hard very quickly. Counting beans soon gets into prime numbers and Nobel prizes. Science and maths teachers have a hard job and for me there is clear evidence that the behaviourist teaching favoured by Academies (see my articles on the EEF research) is becoming less effective. This really matters if the government, through Academy management and ownership is forcing our teachers to teach badly. That is why I am so persistent about the issue on LSN, my website and in my book.

 


rogertitcombe's picture
Sun, 22/05/2016 - 09:12

Why do you assume that the drinks preferences in the AQA question were about 'market preferences'? They could have been about 'health education'.

I am no 'free market' neoliberal, but I don't see why market preference data shouldn't be included in science exam questions. The division of science into biology, chemistry and physics is purely arbitrary. There are no clear boundaries. Physicists tend to believe that all sciences are branches of physics anyway. Good science teaching should always reject boundaries that try to dictate what science can be taught.

New branches of science are emerging all the time - eg nanoscience.

Also, it has never been the right of non-scientists (eg priests, politicians or journalists) to tell scientists what they can and cannot teach about their subject.

I recall the following advice to school students who fall asleep in science lessons and need to work out when they wake up whether they are in a biology, chemistry or physics class.

If it moves, it's biology.

If it stinks, it's chemistry.

If the teacher can't get it to work, it's physics.


agov's picture
Sun, 22/05/2016 - 12:10

Of course observations are not science without a theory to explain them but these exam candidates do not seem to have been asked to produce an explanatory hypothesis for a data set nor, apparently, to discuss anything to do with any established biological theory. The data analysis question could just as well, or better, have been for a maths or statistics exam. Why have subject specific exams at all if the only thing students are asked to demonstrate is an ability to sensibly discuss general non-subject specific concepts and interpret a provided set of data? Testing understanding is no doubt a good thing but surely in a subject specific exam it should, largely at least, relate to specific subject knowledge?

You seem to be defending science, good science teaching and good education against bad teaching, narrow training, and marketisation and privatisation: what's that got to do with this particular examination into nothing in particular?

It may be true that some students could not answer these questions because they had not been specifically instructed in how to do it. If so, then that is likely a poor reflection on the general state of education. It still demonstrates nothing about what they know (or memorised, though I'm not quite sure why that is necessarily a bad thing) about a specific subject and still means that people can pass subject specific exams without knowing anything about the subject.

I'm reminded of a philosophy professor who related that one of his students had said that to do philosophy you don't actually have to know anything at all, you just have to be very very clever (- not that the professor agreed). Is that what you propose for all subjects? I'm not sure Newton etc would approve.


rogertitcombe's picture
Sun, 22/05/2016 - 14:16

agov - If the questions had been for a maths or statistics exam they would have been much harder. Maths and science are linked. You certainly need to understand some maths in order to apply biology. You do not need to understand any biology in order to apply maths. It is therefore necessary that the maths needed to apply biological knowledge appears in the biology GCSE exam.

In the light of the response to this AQA paper it is probably wise for the examiner to frame the maths questions in the context of biology. I am taking a more extreme 'Roger Titcombe' position in arguing that this is not necessary. However, in this AQA exam all the maths questions were framed in the context of biology. Biology is the study of living things. All the data analysis questions were framed in the context of data about living things. Nor, contrary to what Janet argues, were these contexts trivial, as I have tried to explain.

Your penultimate paragraph distorts the position of science GCSE exams. I haven't seen the whole paper, but I would be very surprised if most of it wasn't stuffed with biology content knowledge. I don't know what you mean by 'a pass'. I assume you mean a C grade. The level of understanding needed for C grades has been grossly reduced. I strongly suspect that students can now get a C grade in GCSE biology, including in this AQA exam, without even attempting these very easy data analysis questions. They would not be able to achieve this without knowing a load of the biology content.

I like your last paragraph about the study of philosophy, but the emphasis is the wrong way round. It is not so much that you need to be clever in order to understand philosophy  as that students become clever as a result of studying philosophy. The same is true for science if it is taught (and examined) properly. That's a good thing isn't it?

That's why I am strongly in favour of 'philosophy' replacing statutory RE in our school system.

The arguments are also made very well in this LSN post that I quote from as follows.

This is from the Independent of 6 February. 

"In 1909, when four old Etonians decided that it was time to put something back into the community to offer fresh hope for teenagers from disadvantaged homes, they probably never thought their actions would still be having a profound impact on lives in the 21st century." 

"The four friends – Arthur Villiers, Gerald Wellesley, Alfred Wagg and Sir Edward Cadogan – thought that the best they could do for the youngsters was to give them a chance to succeed at sport by setting up the Eton Manor Boys' Club in the East End of London." 

"At the turn of the century, the name of the charity was changed to the Villiers Park Education Trust, in memory of the key role that Arthur Villiers played in its development. Today, the trust offers a unique programme to persuade the brightest pupils living in some of Britain's most "forgotten" disadvantaged areas to seek places at the country's most selective universities. Essentially, it helps those who could be considered to be the "hidden" poor to lift their academic expectations." 

"According to Richard Gould, the chief executive of the trust: 'The starting point is: we want to help them develop a passion for their subject. And the way of doing that is by not getting them to do anything that's in their exam specification.'" 

"He cites research showing that pupils who take part in such enrichment classes away from the rigid exam syllabus end up getting better grades at A-level than their counterparts who sat in rows listening to teachers trying to 'teach to the test'.

"The best way to secure a top grade pass, Gould argues, is not necessarily by putting nose to the grindstone or sticking it in a book that just deals with the exam syllabus. A survey of those who finished the Scholars' Programme last year revealed that 78 per cent gained a place at university and 71 of the grades they achieved at A-level were either A*, A or B grades. Eighty-six per cent said that the programme had bolstered their confidence and 89 per cent believed that it would lead to better job prospects." 

This is in direct contradiction to the pedagogy of the Core Knowledge Foundation of E D Hirsch, so beloved of Michael Gove and David Green of Civitas. Gove wants to force state schools to emulate the likes of Eton College, but I wonder if the teaching in such institutions really is anything like that which they praise and endorse.

The impression I get is that there is a great deal of pupil-pupil and pupil-teacher talk in the context of Vygotsky's Zone of Proximal Development (ZPD). 

All very Shayer and Adey and Blob-like. 

This is certainly the argument put by the Old Etonians' Villiers Park Trust. The Trust is currently involved in a project in Hastings. 

"On the residential course, they tackle issues such as how to break codes and a day in the life of a media journalist, during which they look at the different slants that can be put on an individual story. They do presentations to the rest of their year group to show how they have tackled problems, which is a key element in boosting their confidence and the communication skills they will need at job and university admissions interviews." 

"I feel sorry for the teachers at the schools because the syllabus is so dense they don't have the time to depart from it," says Gould. "They're risk-averse and they haven't got time to do anything that's outside the syllabus." 

Just how Blobby can you get? 

I didn't expect to get from Eton College such powerful support for my arguments in favour of developmental teaching and condemnation of the behaviourism of the knowledge-focussed rote learning advocated by Toby Young, David Green and Michael Gove.


agov's picture
Wed, 25/05/2016 - 14:23

(Sorry, duty called.)

The maths needed for that question is of course required to do science and it makes no difference in what context the question is set and it might just as well have been about widgets. If it were only a matter of one question on maths being answerable independently of actually knowing any biology you would be right. The concern is that the exam required little or no subject-specific knowledge.

It is not that I am distorting the position of science GCSE exams but that students claim this paper distorts the role of subject knowledge in GCSE science exams by not requiring any. You say you "would be very surprised if most of it wasn't stuffed with biology content knowledge". The allegation by some seems to be that it wasn't.


Janet Downs's picture
Mon, 30/05/2016 - 08:49

See UPDATE in main article for comment published in Schools Week 27 May from parent of candidate.


agov's picture
Mon, 30/05/2016 - 09:51

rogertitcombe's picture
Mon, 30/05/2016 - 11:54

agov - In the words of Shania Twain, 'that don't impress me much' - the complaint on 'Schools Week' I mean, not the article by Laura McInerney, with which I am in full agreement. This is what I wrote on 'Schools Week':

"I am very much with Laura McInerney on this. The English education system is blighted by teaching to the test. In my view all GCSE exams should contain questions that their teacher could not possibly have prepared them for. This is what PISA attempts to do, and quite right too. It is the only way to test deep understanding. If this came about then all the escalating damage to our pupils and the continuing degradation of our education system caused by academisation and marketisation would rapidly be halted. The Global Education Reform Movement (GERM) would have to be rapidly abandoned. The quality of exam questions cannot be judged from the reactions of students on Twitter on finding that their cramming and revision has been in vain. The purpose of schooling is not to enable the maximum number of students to pass the maximum number of exams, but to produce cleverer, wiser and more creative school leavers who can apply their knowledge in the unpredictable and challenging future that faces us all.

That is what my book, 'Learning Matters', is all about."


agov's picture
Thu, 02/06/2016 - 11:36

Well, now that this LSN article is available again after failing to load for a few days, I can again say that you make an excellent case and one that has much merit apart from the continuing fact that it seems to have nothing to do with the complaint being made. In this case the mother explicitly says "It wasn’t that the exam was hard, it was that it was odd and lacked opportunities to show off the biological knowledge they had been learning" and "it made the paper easier for him". It may be that there are also other (unfounded) criticisms about not having been specifically drilled for such questions but that doesn't eliminate the possibility of this paper having no subject-specific content.


rogertitcombe's picture
Thu, 02/06/2016 - 15:56

agov - I strongly suspect that the paper contained loads of subject content, but I do not know for sure any more than you because we have not seen the paper. Some of the students were complaining that they couldn't do the very easy data analysis questions because the material was not specifically on the syllabus. Eg they didn't know what a 'trail' (sic - trial) was, and they didn't know what an 'independent company' was in relation to who carried out the trial. This brought about the initial complaint that it was a 'Business Studies' exam, not Biology, because the term 'Independent Company' in the context of the 'trail' (sic) did not appear in the syllabus. The Guardian reported this straight. Presumably the reporter didn't know what an 'independent trial' was either.

