PEC Student Evaluation Results

This post details my use of a student survey to evaluate my performance on the PEC course at MMU. It shows some work I have carried out under dimension A5 of the UK PSF, as I am evaluating my professional practices with a view to improving pedagogy. I believe it also demonstrates dimension K5 (methods for evaluating the effectiveness of teaching) and K6 (quality assurance and quality enhancement for academic and professional practice).

Evaluation design

As this process constitutes a simple data collection to feed into a reflection on my teaching rather than a formal evaluation, much of the available literature is something of an ‘overkill’. However, Gravestock & Gregor-Greenleaf provide an excellent summary of the benefits and pitfalls of student evaluation, and I have endeavoured to follow some of the principles they outline, including educating students on the use of the data and avoiding a “poor presentation and contextualisation of evaluation data” (2008:6).

Sharp (1990) advises an ‘illuminative’ style of evaluation, with varied data collection and explicit involvement from students. In this case I shall be collecting simple numerical ‘Likert-type’ data (Likert, 1932), but also encouraging students to provide a more detailed commentary in a text box for each rating. Such qualitative data is important because the low number of students on the course gives a ‘small N’, making entirely quantitative judgements invalid (Boysen et al., 2014).

Numerical data

Students scored each course activity on a scale from “1: of no use and/or uninteresting” to “5: extremely useful and/or very interesting”. Each numerical option was given a comments box, and students were encouraged to use it to explain their choice. They were also advised that the evaluation was entirely anonymous and would only be used to improve the course for forthcoming years.

Scriven (1995) highlights a range of common errors related to the use of course ratings data, including the use of scores without regard to distribution. The mean score for each activity is given below, and I have included the range of scores for each to give some idea of the distribution; a short sketch of the calculation follows the list.

  • Physics sessions: 4.8 (4 – 5)
  • Biology sessions: 4.3 (2 – 5)
  • Chemistry sessions: 2.5 (1 – 3)
  • Maths sessions: 2.8 (1 – 4)
  • Moodle resources: 4.1 (3 – 5)
  • Mid-Course Online Assessments: 4.3 (4 – 5)
  • Personal project: 4.3 (3 – 5)
  • Portfolio completion: 4.6 (4 – 5)
  • Your project presentation: 4.6 (4 – 5)
  • Observation of others’ presentations: 4.7 (4 – 5)
  • Educational visits: 3.8 (3 – 5)
  • External sessions with school pupils: 3.5 (3 – 5)
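For transparency, the sketch below shows how these summary figures were produced: the mean of each activity’s 1–5 ratings reported alongside the minimum and maximum, following Scriven’s caution about quoting averages without any sense of the distribution. It is written in Python, and the ratings shown are illustrative placeholders only; the genuine responses are anonymous and are not reproduced here.

    from statistics import mean

    # Illustrative placeholder ratings only; not the real survey responses.
    ratings = {
        "Physics sessions": [5, 5, 4, 5, 5],
        "Chemistry sessions": [1, 3, 2, 3, 3],
    }

    for activity, scores in ratings.items():
        # Report the range alongside the mean so the distribution is not hidden.
        print(f"{activity}: {mean(scores):.1f} ({min(scores)} – {max(scores)})")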

Overall I was very pleased with the high marks for university physics sessions; they were mainly 5s, with entirely positive comments including “well-structured”, “high-quality”, “absolutely essential”, and “so much information and really well-delivered”. It was also interesting to see students picking up on the variety of pedagogical approaches; “many useful tips that I can take into my teaching career”, “useful to observe [tutors’] different teaching styles”, and “…a great help in understanding the physics principles and the practical skills required for teaching.” No suggestions for improvement were received from students at this point. Whilst this does not preclude changes being made, it does suggest that physics instruction may be a strength of the course.

In order to further confirm students’ positive reporting of physics sessions I observed a colleague teaching on the course. The content delivered was at an appropriate level, and used a range of suitable strategies (presentations, animations, demonstrations, independent research, and ‘hands-on’ practical work). It was interesting to note that he adopted a very similar manner with students to my own approach, being quite informal and conversational rather than didactic. Whilst this is perfect for such a small group, as it allows them to be more comfortable asking questions and seeking help, it would not be appropriate for larger groups or lectures. The tutor I observed had a different approach to practical work, in that rather than demonstrate an activity for students to replicate, he frequently had a sample method and additional activities printed on a single side of A4. This is something I have since used in my own practice, and it is a more efficient use of time that promotes an independent approach to practical work. He was also very proactive in making connections between the theory being studied and the wider world through the use of recent news articles and interesting examples. This added a lovely colourful dimension to the session, and is something that I have tried to emulate.

There was variability in students’ responses to the other subject sessions, a small number of which were included either to prepare students for qualification as general science teachers or to help embed mathematical skills associated with the subject.

Biology was generally well-received, with comments such as “enjoyable content and appropriate pace”, “excellent… clear and concise”, and “nice to have additional information”. This tutor had explicitly structured sessions to include PEC students in an existing course, which was not the case in chemistry. This may be reflected in the low feedback scores and somewhat negative comments; “not great joining people already on the course”, “part of an ongoing course so wasn’t targeted at our needs”, “not enough chemistry as was part of [existing course] not aimed at us”. This suggests a need to work on the integration of the chemistry portion, and an opportunity for the biology tutor to share good practice.

The student experience was again different in the maths sessions, with a larger range of scores reflected in the written feedback; “maths was atrocious”, “not specific enough to us”, “my maths is strong but if it wasn’t these sessions wouldn’t have helped improve it”, “good for one session, but became very ’samey’ “. I know that one aim of the maths sessions was to develop individual thinking and problem-solving skills, and to highlight how these can be promoted in schools. Perhaps insufficient communication of these aims to the physics students devalued the experience somewhat.

I was pleased with the Moodle resources and online assessment scores, especially given the efforts expended in improving this aspect; some concerns were raised around the speed at which materials were shared with students, but otherwise feedback was very positive. The variety of assessments was appreciated, and seen as useful in portfolio building.

The portfolio was deemed to be highly beneficial, and many students produced excellent materials. Two wrote their own version of a physics textbook, and several others filled three lever arch files full of quality material that will be transferable to their own teaching. This reflects the positive outcomes suggested in an earlier post (see section ‘Assessments and Misconceptions’).

Whilst I am trying not to read too much into small differences, feedback regarding the research projects was generally good. Independence was celebrated, as was the chance to apply learning; “good to have lots of freedom”, “great to have an objective to help keep focused”, and “gave me a chance to explore physics and open my mind”. The chance to practise presentation skills was also appreciated; “helped to prepare [me] for teaching, and offered a chance to find and show a passion for what you’ve learnt!”, “nice to see everybody’s progress from when we first began, also a good practice for teaching in front of a group of people”.

The visits were self-organised, so I imagine the students’ experience depended upon the venue selected. There were some suggestions for improvement around paperwork and connections to wider aspects of physics, which will be addressed.

Experiences interacting with school pupils were seen quite negatively, and I think this is due to poor organisation on our part, especially with the deployment of PEC students in such situations. Feedback echoes this, as these experiences were “Interesting, but feel it would be good for PEC students to have more of an input” and “didn’t feel like we were needed”. Whilst further involvement with school pupils may be difficult given the health and safety aspects of such interactions, I take on board the request for more involvement.

Action plan

  • Maintain integrity of physics instruction – this is the main body of the course so remains a key priority
  • Increase the range of resources and assessments available on Moodle, and ensure resources are shared sufficiently quickly
  • Ensure all additional subject input is integrated with the physics student experience, and relevance to the course is regularly emphasised
  • Provide more structure and a flexible word count for writing up educational visits
  • Consider more active deployment of students at events involving school pupils

1348 words

References

Boysen, G. A., Kelly, T. J., Raesly, H. N., and Casner, R. W. (2014) ‘The (mis)interpretation of teaching evaluations by college faculty and administrators’. Assessment & Evaluation in Higher Education 39(6), pp. 641-656.

Gravestock, P., and Gregor-Greenleaf, E. (2008) Student Course Evaluations: Research, Models and Trends. Toronto: Higher Education Quality Council of Ontario.

Likert, R. (1932) ‘A technique for the measurement of attitudes’. Archives of Psychology 22(140), pp. 1-55.

Scriven, M. (1995) ‘Student Ratings Offer Useful Input to Teacher Evaluations’. Practical Assessment, Research & Evaluation 4(7).

Sharp, A. (1990) ‘Staff/student participation in course evaluation: a procedure for improving course design’. ELT Journal 44(2), pp. 132-137.