Learning from assessment: insights about student learning from programme level evidence

Published on: 05-Jan-2016


DESCRIPTION

Learning from assessment: insights about student learning from programme level evidence. Dr Tansy Jessop, TESTA Project Leader. Launch of the Teaching Centre, School of Politics and International Relations, University of Nottingham, 15 May 2014.

Transcript

The TESTA process: key ingredients in its spread to over 70 programmes in more than 20 universities

Learning from assessment: insights about student learning from programme level evidence
Dr Tansy Jessop, TESTA Project Leader
Launch of the Teaching Centre
School of Politics and International Relations
University of Nottingham
15 May 2014

TESTA premises

• Assessment drives what students pay attention to, and defines the actual curriculum (Ramsden, 1992).
• Feedback is significant (Hattie, 2009; Black and Wiliam, 1998).
• The programme is central to influencing change.

Students spend most time and effort on assessment. Assessment is the cue for student learning and attention. It is also the area where students show least satisfaction on the NSS: other factors return about 85% good rankings, whereas only 75% of students rate assessment and feedback as good. We often think the curriculum is the knowledge, content and skills we set out in the planned curriculum, but from a student's perspective the assessment demands frame the curriculum. Looking at assessment from a modular perspective leads to myopia about the whole degree and the disciplinary discourse, and often prevents students from connecting and integrating knowledge and meeting progression targets. It is very difficult for individual teachers to change the way a programme works through exemplary assessment practice on single modules; it takes a programme team and a programme to bring about changes in the student experience. Assessment innovations at the individual module level often fail to address assessment problems at the programme level, some of which, such as too much summative assessment and not enough formative assessment, are a direct consequence of module-focused course design and innovation.

Thinking about modules

• modulus (Latin): small measure
• interchangeable units
• standardised units
• sections for easy construction
• a self-contained unit

This raises the question: are there problems with the packaging? It works for furniture; does it work for student learning? The assumptions of modularity are that units are self-contained, disconnected and interchangeable. The next slide sets out some of the tensions of packaging learning in modules, and the tensions inherent in the metaphor.

How well does IKEA 101 packaging work for Sociology 101?

  Furniture                          Student learning
  Bite-sized                         Long and complicated
  Self-contained                     Interconnected
  Interchangeable                    Distinctive
  Quick and instantaneous            Slow, needs deliberation
  Standardised                       Varied, differentiated
  Comes with written instructions    Tacit, unfathomable, abstract
  Consumption                        Production

Modular packaging was originally used for furniture and for prefab and modular homes; how well does it suit educational purposes? I am not taking issue with modules per se, but I want to highlight that there have been some unintended consequences, some good, some bad, of using modular systems. Many programmes have navigated through them; some haven't. Anyone who has built IKEA furniture knows that the instructions are far from self-evident, and we have translated a lot of our instructions, criteria, and programme and module documents for students in ways that may be just as baffling for them.
Have we squeezed learning into a mould that works better for furniture?

What is TESTA? Transforming the Experience of Students through Assessment

• HEA-funded research project (2009-12)
• Seven programmes in four partner universities
• Maps programme-wide assessment
• Engages with Quality Assurance processes
• Diagnosis, intervention, cure

TESTA in the Cathedrals Group universities and beyond: Edinburgh, Edinburgh Napier, Greenwich, Canterbury Christ Church, Glasgow, Lady Irwin College (University of Delhi), University of the West of Scotland, Sheffield Hallam.

There is a huge appetite for programme-level data in the sector: TESTA has worked with more than 100 programmes in 40 universities internationally. Its timing coincided with many universities revisiting the design of degrees, thinking about coherence, progression and the impact of modules on student learning, and with the problems that follow from the confluence of modules and semesterisation: the lack of slow learning, silo effects, and the pointlessness of feedback that arrives after the end of a module.

"TESTA is a way of thinking about assessment and feedback" (Graham Gibbs)

What started as a research methodology has become a way of thinking. As David Nicol puts it, this is about changing the discourse, the way we think about assessment and feedback: not only the technical work of research and mapping, but also the shaping of our thinking through evidence and assessment principles.

Based on assessment principles

• Time on task
• Challenging and high expectations
• Students need to understand goals and standards
• Prompt feedback
• Detailed, high-quality, developmental feedback
• Dialogic cycles of feedback
• Deep learning beyond factual recall

TESTA research methods (drawing on Gibbs and Dunbar-Goddet, 2008, 2009): the Assessment Experience Questionnaire (AEQ), focus groups and a programme audit feed into a programme team meeting and a case study.

The approach rests on robust research methods applied to whole programmes: 40 audits, 2,000 AEQ returns and 50 focus groups. The two triangulating methodologies, the AEQ and the focus groups, capture student experience data, the student voice: a three-legged stool. These three elements of data are compiled into a case profile which captures the interaction between academics' view of the programme, the official line or discourse of assessment, and how students perceive it. This is a very dynamic rendering, because the student voice is explanatory but also probes some of our assumptions as academics about how students work and how assessment works for them. Finally, the case profile is discussed and contextualised by insiders, the people who teach on the programme, who prioritise interventions.
Case Study X: what's going on?

• Mainly full-time lecturers
• Plenty of variety in assessment, no exams
• A reasonable amount of formative assessment (14 instances)
• 33 summative assessments
• Masses of written feedback on assignments (15,000 words)
• Learning outcomes and criteria clearly specified

It looks like a model assessment environment. But students:

• don't put in a lot of effort, and distribute their effort across few topics;
• don't think there is a lot of feedback or that it is very useful, and don't make use of it;
• don't think it is at all clear what the goals and standards are;
• are unhappy.

Why? A large programme; modular approaches; marker variation; late feedback; dependency on tutors.

Case Study Y: what's going on?

• 35 summative assessments
• No formative assessment specified in documents
• Learning outcomes and criteria wordy and woolly
• Marking by global, tacit, professional judgement
• Teaching staff mainly part-time and hourly paid

It looks like a problematic assessment environment. But students:

• put in a lot of effort, and distribute their effort across topics;
• have a very clear idea of goals and standards;
• are self-regulating and have a good idea of how to close the gap.

Two paradigms: the transmission model and the social constructivist model.

Activity: in pairs or groups, read through quotes from student focus group data on a particular theme. What problems does the data imply? What solutions might a programme develop to address some of these challenges? A3 sheets are provided to tease out challenges and solutions from the focus group data.

Theme 1: Formative is a great idea, but...

• "If there weren't loads of other assessments, I'd do it."
• "If there are no actual consequences of not doing it, most students are going to sit in the bar."
• "I would probably work for tasks, but for a lot of people, if it's not going to count towards your degree, why bother?"
• "The lecturers do formative assessment but we don't get any feedback on it."

Theme 2: Assessment isn't driving and distributing student effort

• "We could do with more assessments over the course of the year to make sure that people are actually doing stuff."
• "We get too much of this end-of-term or halfway-through-the-term essay-type thing. Continual assessments would be so much better."
• "So you could have a great time doing nothing until like a month before Christmas and you'd suddenly panic. I prefer steady deadlines, there's a gradual move forward, rather than bam!"

Student workloads are often concentrated around two summative points per module. There are sequencing, timing and bunching issues, and students "tick off" completed modules, so they don't pay attention to feedback at the end point.

Theme 3: Feedback is disjointed and modular

• "The feedback is generally focused on the module."
• "It's difficult because your assignments are so detached from the next one you do for that subject. They don't relate to each other."
• "Because it's at the end of the module, it doesn't feed into our future work."
• "You'll get really detailed, really commenting feedback from one tutor and the next tutor will just say 'Well done'."

Theme 4: Students are not clear about goals and standards

• "The criteria are in a formal document so the language is quite complex and I've had to read it a good few times to kind of understand what they are saying."
• "Assessment criteria can make you take a really narrow approach."
• "I don't have any idea of why it got that mark. They read the essay and then they get a general impression, then they pluck a mark from the air. It's a shot in the dark."
• "We've got two tutors; one marks completely differently to the other and it's pot luck which one you get."

Explicit criteria have their limitations, and marker variation is huge, particularly in humanities, arts and professional (non-science) courses. Students haven't internalised standards, which are often tacit. Possible responses include marking workshops, exemplars and peer review.

Main findings

• Too much summative assessment; too little formative
• Too wide a variety of assessment
• Lack of time on task
• Inconsistent marking standards
• Ticking modules off
• Poor feedback: too little and too slow
• Lack of oral feedback; lack of dialogue about standards
• Instrumental reproduction of materials for marks

Summative-formative issues

• Students and staff can't do more of both.
• Reductions in summative assessment: how many is enough?
• Increase formative assessment and make sure it is valued and required.
• Debunking the myth of two summative assessments per module.
• Articulating the rationale with students, lecturers, senior managers and QA managers.

1. Examples of ramping up formative assessment

• The case of the under-performing engineers (Graham, Strathclyde)
• The case of the cunning (but not litigious) lawyers (Graham, somewhere)
• The case of the silent seminar (Winchester)
• The case of the lost accountants (Winchester)
• The case of the disengaged Media students (Winchester)

2. Examples of improving time on task

• The case of low effort on Media Studies
• The case of bunching on the BA Primary

In seminars: YouTube presentations; teaching students to "map my programme"; a journal club for the under-confident but keen. The principles: make it authentic; make it multi-stage; use public work to create social pressure; spread and co-ordinate hand-in dates; make formative work required; use peer marking and accountability; use sampling; set first-year expectations; keep it brief, frequent, innovative and developmental.

3. Engaging students in reflection through improving feedback

• The case of the closed door (Psychology)
• The case of the one-off in History (Bath Spa)
• The case of the Sports Psychologist (Winchester)
• The conversation gambit

4. Internalising goals and standards

• The case of the maverick History lecturer (a dove)
• The case of the highly individualistic creative writing markers

Impacts

• Improvements in NSS scores on assessment and feedback, from the bottom quartile in 2009 to the top quartile in 2013
• Three programmes with 100% satisfaction ratings post-TESTA
• All TESTA programmes show some upward movement on assessment and feedback scores
• Programme teams are talking about assessment, feedback and pedagogy
• Periodic review processes are changing for the better

www.testa.ac.uk

TESTA was a Higher Education Academy NTFS project, funded for three years in 2009, with four partner universities and seven programmes in the Cathedrals Group. It gathers data on whole-programme assessment and feeds this back to teams in order to bring about changes. In the original seven programmes, before-and-after data were collected.

References

Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education, 1(1), 3-31.

Gibbs, G. & Dunbar-Goddet, H. (2009) Characterising programme-level assessment environments that support learning. Assessment & Evaluation in Higher Education, 34(4), 481-489.

Hattie, J. & Timperley, H. (2007) The power of feedback. Review of Educational Research, 77(1), 81-112.

Jessop, T. & Maleckar, B. (in press) The influence of disciplinary assessment patterns on student learning: a comparative study. Studies in Higher Education.
Jessop, T., El Hakim, Y. & Gibbs, G. (2014) The whole is greater than the sum of its parts: a large-scale study of students' learning in response to different assessment patterns. Assessment & Evaluation in Higher Education, 39(1), 73-88.

Jessop, T., McNab, N. & Gubby, L. (2012) Mind the gap: an analysis of how quality assurance processes influence programme assessment patterns. Active Learning in Higher Education, 13(3), 143-154.

Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5), 501-517.

Sadler, D.R. (1989) Formative assessment and the design of instructional systems. Instructional Science, 18, 119-144.
