Frequently asked questions about the CEQ


  1. Why does the ITL (re)analyse the CEQ data?
  2. When are the CEQs filled in?
  3. What CEQ data does the ITL report?
  4. What is the difference between the CEQ and the SCEQ?
  5. What are the CEQ Factor Scales?
  6. How valid is the CEQ?
  7. Who can I talk to about the Faculty's CEQ results?
  8. How many graduates respond to the survey?
  9. How are the CEQ Field of Study Categories assigned to Faculties?

1. Why does the ITL (re)analyse the CEQ data?

The Course Experience Questionnaire gathers data on the teaching and learning experiences of graduates of all Australian universities. The national data set is analysed and reported, by institution and by field of study, by a national body, Graduate Careers Australia (GCA).

In addition to this national reporting system, the Institute for Teaching and Learning analyses and reports the CEQ data from University of Sydney graduates provided by the GCA. The data reported on the GCA website is based on field of study and discipline, rather than faculty, because students indicate their field of study on the CEQ rather than their course or faculty. Different 'fields of study' fall within different faculties at the various Australian universities. As such, the 'Field of Study' data reported on the GCA website does not always translate directly to the University of Sydney faculties. In order to allow faculties to make sense of the University's CEQ data, the ITL analyses and reports the University's CEQ data using the Field of Study to Faculty mappings provided by the University's Planning and Information Office.

The ITL provides this CEQ information for the university community to use as a basis for planning improvements to teaching at a faculty level. The data reported on the ITL website is intended to support faculties in researching and developing teaching and learning improvements, and is not used by the ITL for any other purpose.


2. When are the CEQs filled in?

On 31 October each year, the AGS (of which the CEQ is one of two components) is mailed to students who completed the requirements for their course in Semester 1 of that year. Students who complete their requirements at the end of Semester 2 are sent the AGS on 30 April the following year. The October and April survey rounds together form the graduate population for any one year. Survey data collection is finalised in September, and the GCA publishes the results of the AGS in February the following year. The ITL reports utilise the GCA data and are therefore usually published shortly after the GCA reports. This means that reports of graduating students' experiences are not usually available until at least a year after they finish their courses.

For example, the 2013 AGS investigates the experiences of 2012 graduates, who are sent the survey in October 2012 (Semester 1, 2012 graduates) and April 2013 (Semester 2, 2012 graduates). Data collection continues until September 2013, after which the data is collated and analysed by the GCA, with results published in February 2014.


3. What CEQ data does the ITL report?

In completing the survey, graduates are asked to indicate the extent to which they agree or disagree with each of 13 propositions (25 until 2001) using a five-point scale, where '1' represents strong disagreement and '5' represents strong agreement. The intervening points on the scale (2, 3 and 4) do not have value anchors. The items relate to three (five until 2001) aspects of students' experience of their courses.
Respondents are also asked to answer two questions in their own words: 'What were the best aspects of your course?' and 'What aspects of your course are most in need of improvement?'

The ITL reports the University of Sydney results for the items and the CEQ factor scales, both for the University as a whole and for each faculty. Various report formats are available. The Factor Scale Report summarises the data for each factor scale, reported as the proportion of students who agreed or disagreed that their experience of their course was educationally positive. There is now a considerable body of research linking students' positive perceptions on these key factors with more effective approaches to learning and better quality learning outcomes. The Item Report and the Detailed Report summarise the data for each item, both as the proportion of students who assigned each rating and as the mean and standard deviation of the responses.
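
To make the report formats concrete, here is a minimal sketch in Python (not ITL code) of how one item's responses could be summarised in each of the ways described above. The sample ratings are invented, and counting ratings of 4 and 5 as agreement and 1 and 2 as disagreement is an assumption about how broad agreement is defined.

    # Minimal sketch (not ITL code): summarising one hypothetical CEQ
    # item the way the reports are described above. Ratings are assumed
    # to be coded 1-5; the sample data below is invented.
    from collections import Counter
    from statistics import mean, stdev

    responses = [5, 4, 4, 3, 2, 5, 4, 1, 3, 4]  # one item's ratings

    counts = Counter(responses)
    n = len(responses)

    # Proportion of students assigning each rating (Item Report style)
    for rating in range(1, 6):
        print(f"rating {rating}: {counts[rating] / n:.0%}")

    # Broad agreement vs disagreement (Factor Scale Report style),
    # assuming 4-5 counts as agreement and 1-2 as disagreement
    agree = sum(1 for r in responses if r >= 4) / n
    disagree = sum(1 for r in responses if r <= 2) / n
    print(f"agree: {agree:.0%}, disagree: {disagree:.0%}")

    # Mean and standard deviation of the responses (Detailed Report)
    print(f"mean: {mean(responses):.2f}, sd: {stdev(responses):.2f}")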

The students' comments for the two open-response items are typed up by the ITL and provided to individual faculties as a searchable Access database file.


4. What is the difference between the CEQ and the SCEQ?

The CEQ is a national survey administered by the independent body Graduate Careers Australia (GCA). It surveys graduates of all Australian universities and higher education institutions. CEQ results are reported at least a year after graduates actually finish their courses.

The SCEQ (Student Course Experience Questionnaire) is a survey of current University of Sydney students. It uses the CEQ factor scale items plus some additional items drawn from CEQ survey research and development work, as well as items that are particularly relevant to the University of Sydney context. The SCEQ was developed by the Institute for Teaching and Learning for use at the University of Sydney. The ITL administers the SCEQ biennially, in odd-numbered years. (The SCEQ was administered annually between 1999 and 2003.)


5. What are the CEQ Factor Scales?

(Source: Ainley & Johnson (2000), The Course Experience Questionnaire 2000 Interim Report, ACER).

Up to 2001, the CEQ gathered data on students' perceptions of their course using 25 items, which aggregate to five factors: 1) Good Teaching, 2) Clear Goals and Standards, 3) Appropriate Workload, 4) Appropriate Assessment and 5) Generic Skills.

From 2002, the University of Sydney has gathered students' perceptions on Good Teaching, Generic Skills and overall satisfaction. The SCEQ is used to survey students' perceptions on the five original CEQ scales, plus the Learning Community Scale, developed in 2000 by the University of Melbourne Centre for the Study of Higher Education.

Below are the items that make up the five original CEQ scales. Item numbers relate to those used in the CEQ up to and including 2001. Numbers in brackets relate to the item numbers on the CEQ from 2002 onward.

1. The Good Teaching Scale (GTS) - six items

3. (10.) The teaching staff of this course motivated me to do my best work.
7. (1.) The staff put a lot of time into commenting on my work.
15. (27.) The staff made a real effort to understand difficulties I might be having with my work.
17. (3.) The teaching staff normally gave me helpful feedback on how I was going.
18. (15.) My lecturers were extremely good at explaining things.
20. (16.) The teaching staff worked hard to make their subjects interesting.

The Good Teaching Scale is characterised by practices such as providing students with feedback on their progress, explaining things, making the course interesting, motivating students, and understanding students' problems. There is a body of research linking these practices to learning outcomes. High scores on the Good Teaching Scale are associated with the perception that these practices are present. Lower scores reflect a perception that these practices occur less frequently.

2. The Clear Goals and Standards Scale (CGS) - four items

1. It was always easy to know the standard of work expected.
6. I usually had a clear idea of where I was going and what was expected of me in this course.
13.r It was often hard to discover what was expected of me in this course.
24. The staff made it clear right from the start what they expected from students.

(r = item scoring reversed to allow for negative phrasing)
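
As an illustration of what 'reversed' means here: on a five-point scale, the usual convention is to map a raw rating to 6 minus that rating, so that a high score always indicates a positive experience before items are aggregated into a scale. The Python sketch below is hypothetical, not GCA or ITL code.

    # Hypothetical sketch of reverse scoring on a five-point scale:
    # 1 <-> 5, 2 <-> 4, 3 unchanged, i.e. reversed = 6 - raw.
    REVERSED_ITEMS = {13}  # items marked 'r', e.g. item 13 above

    def scored(item_number: int, raw: int) -> int:
        """Return the rating, reversing negatively phrased items."""
        return 6 - raw if item_number in REVERSED_ITEMS else raw

    assert scored(13, 5) == 1  # strong agreement with a negative item
    assert scored(1, 5) == 5   # positively phrased items are unchanged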

Even though the establishment of clear goals and standards in a course could be considered part of good teaching in a broader sense, it would be possible to utilise the practices encompassed by the Good Teaching Scale but fail to establish clear goals for the course and clear expectations of the standard of work required from students.

3. The Appropriate Assessment Scale (AAS) - three items

8.r To do well in this course all you really needed was a good memory.
12.r The staff seemed more interested in testing what I had memorised than what I had understood.
19.r Too many staff asked me questions just about facts.

(r = item scoring reversed to allow for negative phrasing)

This scale concentrates on one particular aspect of assessment and is not exhaustive in its measurement of assessment approaches. It focuses on the extent to which assessment emphasised recall of factual information rather than higher-order thinking. Embedded in the Appropriate Assessment Scale is the assumption that assessment which does not focus on factual recall concentrates instead on higher-order processes.

4. The Appropriate Workload Scale (AWS) - four items

4.r The workload was too heavy.
14. I was generally given enough time to understand the things I had to learn.
21.r There was a lot of pressure on me to do well in this course.
23.r The sheer volume of work to be got through in this course meant it couldn't all be thoroughly comprehended.

(r = item scoring reversed to allow for negative phrasing)

High scores on the Appropriate Workload Scale indicate reasonable workloads: these are graduates who disagree with the proposition that 'The workload was too heavy' and who agree that 'I was generally given enough time to understand the things I had to learn'. The evidence from research on student learning is that heavy workloads push students towards an approach to learning which emphasises skimming across the surface of topics, without spending the time needed to truly engage with and understand the material they are meant to be learning.

5. The Generic Skills Scale (GSS) - six items

2. (23.) The course developed my problem-solving skills.
5. (14.) The course sharpened my analytic skills.
9. (6.) The course helped me develop my ability to work as a team member.
10. (42.) As a result of my course, I feel confident about tackling unfamiliar problems.
11. (32.) The course improved my skills in written communication.
22. (43.) My course helped me to develop the ability to plan my own work.

The Generic Skills Scale is an attempt to take into account the extent to which university courses add to the generic skills that their graduates might be expected to possess. Discipline-specific skills and knowledge are often crucial to prospects for employment and further study. Nevertheless, the emphasis on generic skills stems from the belief that knowledge quickly becomes obsolete, and generic skills that may have been acquired in the learning process should endure and be applicable in a broader context. Skills typically identified in this context include communication skills, the capacity to learn new skills and procedures, the capacity to make decisions and solve problems, the ability to apply knowledge to the workplace, and the capacity to work with minimum supervision.

There is one additional item, not used in the analysis. Item 16, 'The assessment methods employed in this course required an in-depth understanding of the course content', is a new item being piloted to replace an item which did not load unambiguously on any single scale.

The Overall Satisfaction Item (OSI)

25. (49.) Overall, I was satisfied with the quality of this course.

This single item asks graduates about their overall level of satisfaction with their degree course.


6. How valid is the CEQ?

The CEQ is based on over 20 years of international survey development and research, and is possibly the best-researched student survey tool in use in Australia. It continues to be refined through the development of additional items and scales. Information on the survey's psychometric properties, including reliability and validity, is available on the GCA website.

The CEQ has attracted critics over the years; however, it has stood the test of time. Numerous research studies have concluded that the factor scales, and the survey items from which they are derived, are valid across repeated administrations to different cohorts.


7. Who can I talk to about the Faculty's CEQ results?

Faculty Associate Deans (Learning and Teaching) coordinate how their faculty responds to both CEQ and SCEQ results. Your Faculty A/Dean L&T is the person to whom the CEQ open-response comments are returned, and with whom the ITL liaises to support the faculty in using data such as the CEQ results to inform teaching improvement initiatives.

If you would like to get involved in your faculty's discussion of the results please contact your Faculty A/Dean L&T.

From 2000 to 2008 the ITL convened the EQA working group to support faculties in making the best use of data from surveys like the SCEQ. The resources developed by the EQA working group, along with notes from its meetings, are available on the EQA Working Group website and can provide ideas on how to respond to SCEQ results.


8. How many graduates respond to the survey?

Each year the AGS is mailed to over 11,000 University of Sydney graduates. The University of Sydney consistently achieves a response rate of greater than 50%.


9. How are the CEQ Field of Study Categories assigned to Faculties?

The data returned to the University by the GCA is based on field of study, rather than faculty, because students indicate their field of study on the CEQ form and provide feedback based on that field of study, not their course or faculty. Different 'fields of study' fall within different faculties at the various Australian universities. As such, the 'Field of Study' data reported on the GCA website does not always translate directly to the University of Sydney faculties. In order to allow faculties to make sense of the University's CEQ data, the ITL analyses and reports the University's CEQ data using the Field of Study to Faculty mapping provided by the University's Planning and Information Office.

The mapping of a graduate's field of study to a University of Sydney faculty may differ from that suggested by their enrolment. For example, a graduate who was enrolled in Arts but indicated their field of study on the CEQ form as Psychology is included in the Science faculty, based on the Planning and Information Office's mapping. As another example, all Arts graduates who indicated their first major as Personnel Management or Banking and Finance are assigned to Business.

Table 1 shows how fields of study are allocated to Faculties by the Planning and Information Office.

Some multidisciplinary courses and fields of study clearly represent the involvement of more than one University of Sydney faculty. Table 1 also shows the way that the University of Sydney Planning and Information Office field of study mapping is used to assign the CEQ responses, where more than one faculty is involved. The data may be allocated to up to three faculties; for example, Medical Science as a field of study is assigned to Science, Medicine, and Health Sciences.

Some other examples:

  • Responses from graduates who indicated Music as their field of study are counted in both Sydney College of the Arts and Arts.
  • Responses listing Horticulture as their field of study are counted in both Agriculture and Rural Management (for responses from years prior to 2005, when these were two separate faculties).
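
As a sketch only (this is not the Planning and Information Office's actual mapping table), the examples in this section could be represented in Python as a mapping from field of study to one or more faculties:

    # Hypothetical representation of a Field of Study to Faculty
    # mapping; the entries below come from the examples in this section.
    FIELD_TO_FACULTIES = {
        "Psychology": ["Science"],
        "Medical Science": ["Science", "Medicine", "Health Sciences"],
        "Music": ["Sydney College of the Arts", "Arts"],
        "Horticulture": ["Agriculture", "Rural Management"],  # pre-2005
    }

    def faculties_for(field_of_study: str) -> list[str]:
        """Faculties in which a graduate's response is counted."""
        return FIELD_TO_FACULTIES.get(field_of_study, [])

    print(faculties_for("Medical Science"))
    # ['Science', 'Medicine', 'Health Sciences']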

If you would like to discuss the mapping conventions used by the University, please contact the Planning and Information Office.