
Abstract

Understanding students’ learning techniques and learning difficulties is essential in evaluating the success of both the achievement of learning outcomes and the overall teaching-learning process in engineering programmes. Whilst the effectiveness of module delivery is closely related to teaching, less emphasis has been placed on the evaluation of modules as compared to the evaluation of teaching by students. The study aims to demonstrate how a novel student evaluation of module (SEM) questionnaire can be used to obtain students’ input and feedback on modules with a view to improving teaching and learning. Learning difficulties, potential solutions to overcome these perceived difficulties and ways to increase students’ interest in the modules have been explored through this alternative SEM. The proposed SEM allowed efficiency in data collection and analysis, with the added advantage of a global analysis of all modules offered in an engineering programme. Conclusions drawn from the SEM results can be used as guidelines for teaching-learning improvement, as well as for curriculum development.

Introduction

Today, Higher Education Institutions (HEIs) are expected to deliver high quality teaching to facilitate learning amongst students as a matter of course. As a result, performance indicators have been developed to assess teaching and learning in higher education, of which the most common method is student ratings. Student ratings of teaching effectiveness have been shown to be multidimensional, reasonably valid and useful to students, faculty and administrators (Marsh, 1987). Feedback from students is generally used firstly for administrative decisions and secondly for instructional improvement (Fresko and Nasser, 2001).

A study by Brennan et al. (2003) on student feedback in HEIs in the UK indicates that feedback from students is collected at a number of different levels. The levels include an individual teacher, a module, a semester, a programme, a subject, a department and a faculty. Of these, the module level form of feedback is the most common, especially for programmes with modular structures. Module feedback is perceived to be most effective in obtaining relevant information for relatively immediate implementation of improvements to the teaching and learning process.

Questionnaires are the main mechanism used for collecting student feedback. The two systems available for module evaluation via questionnaires are the standardised and diversified systems (Szwelnik, 2005). In the former, a standard questionnaire applied to all modules is designed by the management team responsible for quality assurance. As for the latter, various methods of evaluation may be used for different modules, depending on the discretion of the lecturer. Nevertheless, “questionnaire fatigue” is an issue of concern (as raised by staff at UK HEIs), especially with module level feedback (Brennan et al., 2003). In a modularised system based on semesters a student may be required to complete up to 12 questionnaires. Completion and use of data from these questionnaires can become ritualistic for both staff and students. From the students’ point of view, the purpose and use of “satisfaction” questionnaires is not always fully explained and students are not always convinced that their feedback is important. Another typical complaint by students is insufficient time given for completion of questionnaires. A report to the Higher Education Funding Council for England stated that, to improve learning within a module, the use of direct, qualitative feedback is preferable to questionnaires (Harvey, 2001). Qualitative discussion between staff and students about modules provides prompt and in-depth understanding of both positive and negative aspects. Since module level feedback is important for the on-going evolution of modules, teaching staff must be responsive to both formal and informal feedback.

At the University of Nottingham Malaysia Campus, an alternative SEM questionnaire was developed in order to identify learning difficulties faced by engineering students and the potential ways in which they could be overcome. This paper describes the reasoning behind the development of this SEM and its pilot testing within the engineering schools. The results of the alternative questionnaire are discussed and the implications of introducing a new SEM practice are examined.

Practice, format and analysis method of existing SEM

At the University of Nottingham, SEM has been a mandatory mechanism since the academic session 1996/97. At the same time, the student evaluation of teaching (SET) was introduced as a university-wide practice. However, prior to SEM being made compulsory for all schools (see Note 1), many had been collecting data on module satisfaction informally. The main purpose of the implementation of SEM was to obtain feedback from students for curriculum development. The Malaysia Campus has used the same system of SEM and SET since the campus opened in the academic session 2000/01.

The current SEM practice for the Faculty of Engineering is that each module is evaluated individually. SEM questionnaires are distributed to the students by the lecturers concerned during the last four weeks of each semester. Each lecturer is expected to explain to the students the purpose of SEM, assuring them that their views are confidential and that the quantitative data is processed centrally. A student volunteer is selected to collect the questionnaires at the end of the evaluation and the students are left to complete the questionnaires for approximately 15-20 minutes without the lecturer present. The Faculty of Engineering employs a standardised system for SEM; the same set of questionnaires is used for the evaluation of all modules in each level of study for every programme per semester. This means that, for all of the engineering programmes, between four and seven SEMs are completed by each student in addition to the SET form. The current SEM questionnaire consists of 19 close-ended questions covering module delivery, learning outcomes and facilities. The questions are listed below:

  • The module has given me a good understanding of the subject
  • The module was well organised
  • The module has developed my interest in the subject
  • I have learnt a lot from this module
  • The library resources fully supported this module
  • The module helped me think critically
  • The objectives of the module were made clear
  • The handouts and other support materials were helpful
  • The pace of the module was just right
  • The work-load was just right
  • I learned to apply principles, concepts, generalisations and theories
  • I learned important techniques
  • The module has improved my communication skills
  • The size of the class facilitated effective learning
  • This module complements others I have studied
  • I have had enough opportunity to demonstrate what I have learned in this subject
  • The method(s) of assessment are appropriate to the content of this module

A Likert rating scale response is used whereby students are required to select one of the following: 1 for ‘strongly disagree’, 2 for ‘disagree’, 3 for ‘neutral’, 4 for ‘agree’ and 5 for ‘strongly agree’ (see Note 2). The rating for each question is obtained by averaging all the ratings provided by students’ responses to that particular question. An overall rating for the module is also computed whereby the module is considered to be satisfactory if its rating is higher than 3. Also included in the SEM are open-ended questions on what the students liked about the module, how the module could be improved and further comments.
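
The rating computation described above can be sketched as follows. This is an illustrative example only, using hypothetical response data (the actual processing is carried out centrally by the Faculty); only five of the close-ended questions are shown for brevity.

```python
# Hypothetical Likert responses: each row is one student's answers
# (1 = strongly disagree ... 5 = strongly agree) to five questions.
responses = [
    [4, 5, 3, 4, 4],  # student 1
    [3, 4, 4, 5, 3],  # student 2
    [5, 4, 4, 4, 4],  # student 3
]

# Rating for each question: the mean of all students' responses to it.
n_students = len(responses)
question_ratings = [
    sum(row[q] for row in responses) / n_students
    for q in range(len(responses[0]))
]

# Overall module rating: the mean across question ratings;
# the module is deemed satisfactory if this exceeds 3.
overall = sum(question_ratings) / len(question_ratings)
satisfactory = overall > 3
print(question_ratings, round(overall, 2), satisfactory)
```

As the paper notes, such averages tend to cluster above the satisfactory threshold, which is precisely why they give little diagnostic information about difficult modules.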

Motivation behind the alternative SEM

By analysing the existing SEM results for each individual module in all engineering schools at the Malaysia Campus, several questions were raised, as follows:

  • Does the existing SEM reflect problems encountered by students in their learning process?
  • Does the existing SEM identify modules in which the majority of students are challenged and unable to cope?
  • Does the existing SEM provide adequate module-related information to lecturers in order to further understand the needs of students?

The existing SEM results provided no direct answers to the first and second points since it was not designed with these questions in mind. The SEM ratings were generally above the satisfactory level for all schools, hence it was hard to determine modules which were perceived as difficult in each programme. Within the School of Chemical and Environmental Engineering, for example, the study revealed that the module with the highest SEM rating (overall score of 4.5 out of 5) was viewed as difficult by students. Interestingly, the lecturer concerned was already aware, through informal qualitative feedback in class, that students found the module demanding. Discussions with teaching staff from the School of Mechanical Engineering and the School of Electrical and Electronic Engineering indicated that the existing SEM provided insufficient information for module review. It also omitted questions pertaining to the improvement of teaching-learning effectiveness. For the purposes of curriculum development, this is an important issue which must be addressed.

From a practical point of view, it was observed that the large number of surveys distributed to students (either by individual lecturers or through university-wide practice) towards the end of each semester has resulted in “questionnaire fatigue”. The idea of introducing one SEM per semester came as a solution to this problem. This alternative SEM, which comprises all the modules taught in the semester, will facilitate the entire process of response collection, data processing and results analysis. Feedback on all modules from the same group of students concurrently can ensure consistency in terms of data analysis for every school and hence provide a channel for comparison among modules in each programme. Additionally, one SEM instead of four to six per semester for every school will reduce greatly the time and effort required of the Faculty administrative staff.

Design of the alternative SEM

With the above concerns in mind, several requirements were considered when developing the alternative SEM. Firstly, the proposed SEM must be able to identify “difficult” modules and the associated reasons for this perception. In particular, the questionnaire should identify why students are struggling with these modules, as well as whether they have been appropriately equipped with sufficient background knowledge for the current level of studies. Also of interest were ways to aid students in overcoming these difficulties and to increase their interest in learning. As the emphasis of modern engineering education has shifted to what is being learned instead of what is being taught, the new SEM has to focus on learning outcomes. Lastly, this SEM should provide students with a platform for feedback on supporting facilities such as libraries, computing provision and experimental laboratories.

An example of the alternative SEM for the School of Mechanical Engineering can be found in Appendix 1. As shown in the questionnaire, students are required to rank the modules taught in their corresponding programmes according to their level of difficulty. Subsequently, they are asked to answer questions set to explore the types of difficulties, the potential ways to overcome these and the perceived approach to increasing their interest in the modules. Lastly, questions pertaining to background knowledge, learning outcomes and facilities are included in the questionnaire.

To facilitate the entire process of answering these questions, a series of answers have been carefully formulated for each question. Interviews with lecturers, particularly the more experienced, within the Faculty of Engineering provided the framework for the choice of answers. Some choices were obtained via direct and indirect student feedback during lectures, tutorial sessions and office consultations with students. Additionally, open-ended comments were allowed for each question. To further ensure that all key issues were addressed, the new questionnaire was first appraised by selected third year engineering students for immediate feedback through direct informal communication. Preliminary results analysis was subsequently performed. The SEM questionnaire was finally revised and handed out to second year engineering students.

Format and analysis of the alternative SEM

Unlike the existing SEM, whereby the rating is given according to how much one agrees with a statement, the analysis for this alternative SEM is dependent on the percentage of students agreeing with a statement. Students only select answers relevant to them from the list of choices for every question. Both single and multiple answers are allowed and the option for students to express any views for each question is also available. The main reason for adopting the “percentage of students” method is to minimise the subjectivity of opinion present when a rating scale is used. This method also limits the frustration associated with the five point rating scale system, whereby the respondents have to deal with determining how much they agree or disagree with each question. From past experience, the rating scale system does not provide sufficient information to allow changes and improvements to be made.

This alternative SEM questionnaire, consisting of twelve questions in total, offers a more streamlined and effective approach to collecting meaningful feedback from students than the existing 19-question SEM. By formulating the questions under different categories, different levels of analysis are made possible if required. For instance, analysis can be carried out based on the category of “identified difficult modules” in each programme. In this case, any inadequacy which may have caused the modules to be perceived as demanding by the majority of students will be made clear. Alternatively, analysis can also be based on the category of “student academic results”. Here a better understanding of the major difficulties faced by students with different academic capabilities can be achieved.
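
The “percentage of students” method can be sketched as follows, again with hypothetical data and illustrative answer-choice labels (not the questionnaire’s actual wording). Each student may tick several choices for a question; the reported figure for a choice is the share of respondents who selected it.

```python
from collections import Counter

# Hypothetical selections: each set holds the answer choices one
# student ticked for a given question (multiple answers allowed).
selections = [
    {"understanding theories", "fast pace"},       # student 1
    {"understanding theories"},                    # student 2
    {"applying theories", "incomplete handouts"},  # student 3
    {"understanding theories", "applying theories"},
]

# Count how many respondents ticked each choice, then convert to
# a percentage of all respondents.
counts = Counter(choice for chosen in selections for choice in chosen)
n = len(selections)
percentages = {choice: 100 * c / n for choice, c in counts.items()}
print(percentages)
```

Because every respondent either did or did not tick a choice, the result is a simple proportion per choice, avoiding the subjectivity of deciding how strongly one agrees with a statement.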

Analysis of results from pilot study

The target students for this study were second year undergraduates of the academic session 2006/07 in the Faculty of Engineering at the University of Nottingham Malaysia Campus. They included students from the School of Chemical and Environmental Engineering, the School of Electrical and Electronic Engineering and the School of Mechanical Engineering. This particular group of students was chosen for this pilot study because direct feedback and observation by lecturers indicated that Year 2 is often a problem year, with students having more difficulties in coping with the modules. Appendix 2 lists the results obtained from the SEM carried out in the autumn semester. A total of 142 valid responses were obtained, approximately 52% of the total Year 2 students from all the Schools combined. Of these, 39% were from the School of Chemical and Environmental Engineering, 37% from the School of Mechanical Engineering and 24% were from the School of Electrical and Electronic Engineering. The ratio of international students to local Malaysian students in these Schools was 1:5, 1:4 and 1:3 respectively. Interviews with students revealed that the lower number of responses from the School of Electrical and Electronic Engineering may have been due to the fact that students in this School were relatively socially inactive when compared to students in the other two Schools and hence were more passive towards participation in this volunteer-based survey. The higher ratio of international to local students in the School of Electrical and Electronic Engineering could possibly have been a contributing factor to the lower number of responses, since international students tend to be reluctant to participate in surveys (Szwelnik, 2005).

Identification of difficult modules

In each academic year covering two semesters (autumn and spring), students complete 120 credits whereby one credit corresponds to ten hours of work on the part of the student. Students normally take 60 credits per semester (as in the case of the School of Chemical and Environmental Engineering and the School of Mechanical Engineering). For the School of Electrical and Electronic Engineering, some modules are offered over a complete academic year. In general, the majority of the 10 credit modules are delivered by 24 hours of lectures and 12 hours of example classes, though variations exist across the modules. Example classes are normally conducted with the aid of demonstrators to support student learning. Students are also expected to attend compulsory engineering laboratory sessions for some modules.

Although the assessment for each module varies according to module content, a typical 10 credit module will involve coursework and a two-hour formal examination. The type of coursework is dependent on the module and includes laboratory reports and problem-solving assignments. These are normally assessed during the semester to give rapid feedback to students. Formal examinations are conducted after the teaching semester is over and following a study break. Feedback on the general performance of students in the examination is made available after the examination papers are marked.

Through the pilot study, two to three modules in each School were identified as “difficult” by students. Here, a module was deemed difficult if over 25% of the total responses identified the module as demanding, or if the module had a significantly higher percentage of responses compared to other modules (a difference of more than 5%). For the School of Electrical and Electronic Engineering, Signal Processing and Control Engineering and Mathematical Techniques for Electrical and Electronics Engineers were the perceived difficult modules. For the School of Chemical and Environmental Engineering, Materials and Separation Processes 2 were identified. Mechanical Engineering students found Structural Vibration 1 and Solid Mechanics 2 difficult.
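
The identification rule described above can be sketched as follows, with hypothetical module names and vote percentages used purely for illustration.

```python
# Hypothetical data: percentage of responses naming each module as demanding.
votes = {"Module A": 32.0, "Module B": 18.0, "Module C": 12.0}

def difficult_modules(pct_by_module, threshold=25.0, margin=5.0):
    """Flag a module as 'difficult' if more than `threshold` per cent of
    responses name it, or if its percentage exceeds every other module's
    by more than `margin` percentage points."""
    flagged = {m for m, p in pct_by_module.items() if p > threshold}
    for m, p in pct_by_module.items():
        others = [q for name, q in pct_by_module.items() if name != m]
        if others and p - max(others) > margin:
            flagged.add(m)
    return flagged

print(difficult_modules(votes))
```

The second condition catches a module that stands clearly apart from its peers even when no module crosses the 25% threshold outright.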

By examining the modules identified as difficult, a common characteristic could be picked up: that the modules require in-depth understanding of concepts, followed by heavy mathematical or computational manipulations in the applications of these concepts. This preliminary examination has shown that a more comprehensive study of these modules is necessary in order to understand the reasons behind these findings.

Learning difficulties perceived by students

More than two-thirds of the respondents from all schools combined found understanding theories and concepts to be the most difficult part of their learning process, followed by the application of these theories and concepts and, finally, solving problem sets or assignments (which was found difficult by approximately half of the participants). The number of students who chose “understanding theories and concepts” as their main learning obstacle was approximately the same irrespective of the engineering disciplines. Mechanical Engineering students exhibited a slightly higher percentage in difficulties in applying theories and concepts than others, while Electrical and Electronics Engineering students topped the list for difficulties in working on problem sets or assignments.

These difficulties were noted by lecturers, especially for modules that involved extensive understanding of theories and concepts. The high percentage of students experiencing difficulty with understanding demonstrated that the efforts on the part of academic staff to impart understanding of engineering principles were still insufficient. Additionally, more than a third of all students identified that incomplete handouts and the fast pace of module delivery caused difficulties in their learning process – in particular, half of the Mechanical Engineering students cited incomplete handouts as the key cause of learning difficulties. A quarter of engineering students reflected that they encountered difficulties in producing ideas, with Chemical Engineering students topping the list. Overall, students were less likely to have problems during the laboratory and computing sessions. The number of respondents who felt that the workload and curriculum were heavy and that the levels of their modules were too high was relatively small.

Perceived solutions to learning difficulties

More than three-quarters of all respondents believed that attempting more worked examples would be the solution to their learning difficulties. They also identified that further explanations in theories and concepts, coupled with complete handouts, could help them overcome these difficulties. Alternatives such as supplementary references and reading material, as well as more contact hours received passive responses – this was especially noticeable for Chemical Engineering students.

Capturing students’ interest in learning

As a means of capturing students’ interest in learning, across all the engineering schools, multimedia illustration of difficult concepts gained the highest vote, followed by links to industrial practice or applications and visits to industrial sites. Other suggestions such as updates on research and technology and historical notes were unpopular options with the students.

All of the engineering students were notably more concerned about the applications of their studies and were more career-oriented than research-oriented in their outlook. Compared to other schools, Chemical Engineering had a higher ratio of female to male students, which may explain the difference in responses. Multimedia illustration as a way to capture interest was understandable, as students nowadays are highly IT-oriented. Therefore, with these indicators for how to capture students’ interest in learning, lecturers can create a more successful classroom teaching-learning experience.

Learning outcomes

In terms of learning outcomes, the majority of engineering students perceived that they had acquired skills relating to problem solving techniques, application of theories and concepts and problem recognition. The students also believed that they had acquired analytical and critical thinking skills through the modules.

It should be noted here that, although these results might not necessarily reflect the overall view of learning outcomes as expected, they do highlight the fact that engineering students received better training in problem solving techniques than in other areas. Greater emphasis should also be placed on training the students to be able to think analytically and critically. As ascertained through this SEM, the key weaknesses among most engineering students at this stage were their communication skills and knowledge of professionalism and ethical codes. Ways to achieve these learning outcomes should be taken into consideration when revising module curricula.

Laboratory facilities

Feedback on learning support facilities showed that nearly half of all students found laboratory facilities to be sufficient, while a third indicated inadequacy in software and engineering equipment.

Discussions and conclusions

An alternative SEM questionnaire has been developed to identify learning difficulties faced by engineering students and the potential ways to overcome them. Its pilot testing at the University of Nottingham Malaysia Campus has been carried out and the results of this alternative SEM have been analysed. Compared to the existing SEM practice, 78% of students responded positively that the alternative SEM questionnaire provided more comprehensive feedback on the modules and available facilities. As discussed, this SEM not only allows a global view on various aspects of the engineering programmes but also facilitates comparisons between modules in each engineering programme. Comparisons between modules are possible because the same group of students answer the SEM concurrently.

Nonetheless, one drawback of the alternative SEM (from which the existing SEM also suffers) is the limited capacity of second year undergraduates to make judgements on module organisation and learning outcomes. For instance, in the existing SEM students may believe that they understand a subject but then show that they clearly do not when this understanding is assessed. Likewise, using the alternative SEM they may perceive that they have gained analytical or critical thinking skills but the reality is that they have not. However, students are more likely to provide meaningful responses to questions posed in the alternative SEM on difficult modules, overcoming learning difficulties and increasing their interest in modules. By providing solutions to overcoming difficulties and ways to increase interest in learning, students would be encouraged to select what applies to them. This contrasts with the existing SEM where they are asked to determine how much they agree or disagree with statements concerning the module delivery. Admittedly, these perceptions may still not necessarily reflect the true situation, but more concrete feedback which can be used to support the learning of difficult modules is provided through the alternative SEM.

With the implementation of one SEM questionnaire per semester for each programme, “questionnaire fatigue” is avoided, which increases the chance of students taking the exercise more seriously and providing meaningful responses. The administrative load is also reduced as less data collection and processing is required with the alternative SEM. Further potential developments from this pilot study include conducting the same study at the UK campus to assess the impact of different teaching and learning styles on the effectiveness of the alternative SEM. In addition, the success of using the results of the alternative SEM to implement changes to module syllabi and delivery would need to be analysed for more insights into improving SEM practice.

References

Brennan, J., Brighton, R., Moon, N., Richardson, J., Rindl, J. and Williams, R. (2003) Collecting and using student feedback on quality and standards of learning and teaching in higher education: a report to the Higher Education Funding Council for England.

Centra, J. (1993) Using student evaluations: guidelines and benefits. In: Reflective faculty evaluation. San Francisco: Jossey Bass.

Fresko, B. and Nasser, F. (2001) Interpreting student ratings: consultation, instructional modification and attitudes towards course evaluation. Studies in Educational Evaluation 27, 291-305.

Harvey, L. (2001) Student feedback. A report to the Higher Education Funding Council for England.

Lim, P.H., Gan, S., Hartley, M. and Cloke, M. (2008) Student evaluation of module for improving teaching-learning effectiveness. Inaugural Universitas 21 Conference on Teaching and Learning, 21-22 February 2008, Glasgow, UK.

Marsh, H.W. (1987) Students’ evaluations of university teaching: research findings, methodological issues and directions for future research. International Journal of Educational Research, 11 (3), 255-388.

Morrison, T. (1995) Analyzing qualitative responses on student evaluations: an efficient and effective method. Rockingham: Higher Education Research and Development Society of Australasia.

Szwelnik, A. (2005) Module evaluation and feedback. Research report for Quality in Business Education, UK.

Notes

  1. All schools under the Faculty of Engineering have been restructured as departmental units since the academic session 2008/09. Since this study was conducted prior to this, the term ‘schools’ remains as it is in the discussions presented in this paper.
  2. Since the academic session 2008/09 the SEM rating scale has been inverted to 1 for ‘strongly agree’, 2 for ‘agree’, 3 for ‘neutral’, 4 for ‘disagree’ and 5 for ‘strongly disagree’ in order to be consistent with the rating scale of the UK campus.
  3. Unless otherwise stated, all results are presented as percentage of students over overall participants in each programme.
  4. Direct entry students refer to students admitted directly to Year 2 programmes from other colleges.
  5. Average results classification: 70-100% = first class; 60-69% = second class upper; 50-59% = second class lower; 40-49% = third class.

Contact details

Poay Hoon Lim School of Computer Science, The University of Nottingham, UK.
Email: phl@cs.nott.ac.uk

Suyin Gan (Corresponding author), Faculty of Engineering, The University of Nottingham Malaysia
Campus, Malaysia. Email: suyin.gan@nottingham.edu.my

Hoon Kiat Ng Faculty of Engineering, The University of Nottingham Malaysia Campus, Malaysia.
Email: hoonkiat.ng@nottingham.edu.my

Appendices

Appendix 1. Example of the alternative SEM questionnaire for the School of Mechanical Engineering

Appendix 2. Results obtained from all schools of engineering (Electrical & Electronic, Chemical & Environmental
and Mechanical) in the University of Nottingham Malaysia Campus, second year programmes,
autumn semester, session 2006/07 (see Note 3)

The appendices are available as part of the PDF download of this paper.



ISSN 1750-0052

Creative Commons Licence

This work is licensed under a Creative Commons Attribution 3.0 License.