Institutions involved in the provision of tertiary education across Europe are feeling the pinch. European universities, and other higher education (HE) institutions, must operate in a climate where the pressure of government spending cuts (Garben, 2012) stands in stark contrast to the EU’s strategy of driving and sustaining growth in student numbers in the sector (Eurostat, 2015).
In order to remain competitive, universities and HE institutions are making ever-greater use of electronic assessment (E-Assessment) systems (Chatzigavriil et al., 2015; Ferrell, 2012). These systems are attractive primarily because they offer a cost-effective and scalable approach to assessment. In addition to scalability, they also offer reliability, consistency and impartiality; furthermore, from the perspective of a student they are popular because they can offer instant feedback (Walet, 2012).
There are disadvantages, though.
First, feedback is often returned to a student immediately on completion of their assessment. It is possible to disable the instant feedback option (this is often done during an end-of-semester exam period, when assessment scores must be ratified before release); however, this tends to be a global ‘all on’ or ‘all off’ setting, controlled centrally rather than configurable on a per-assessment basis.
If a formative in-term assessment is to be taken by multiple groups of students, each at different times, this restriction means that answers to each question will be disclosed to the first group of students undertaking the assessment. As soon as the answers are released “into the wild” the academic integrity of the assessment is lost for subsequent student groups.
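As a minimal sketch of what a per-assessment (rather than global) feedback setting might look like, the fragment below models each assessment carrying its own feedback flag plus an optional embargo time, so that feedback can be withheld until the last student group has finished. All names here (`Assessment`, `can_show_feedback`) are illustrative, not part of any existing E-Assessment system.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Assessment:
    """Hypothetical per-assessment feedback policy (names are illustrative)."""
    name: str
    feedback_enabled: bool                      # per-assessment, not system-wide
    feedback_release_at: Optional[datetime] = None  # optional embargo time

def can_show_feedback(assessment: Assessment, now: datetime) -> bool:
    # Feedback is shown only if this assessment allows it AND any embargo
    # (e.g. until the final student group has sat the test) has passed.
    if not assessment.feedback_enabled:
        return False
    if assessment.feedback_release_at is not None:
        return now >= assessment.feedback_release_at
    return True
```

Under such a scheme the embargo time for an in-term test would simply be set to the end of the last group's sitting, preserving academic integrity without sacrificing instant feedback for single-sitting assessments.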
Second, the style of feedback provided to a student for each question is often limited to a simple ‘correct’ or ‘incorrect’ indicator. While this type of feedback has its place, it often does not provide a student with enough insight to improve their understanding of a topic that they did not answer correctly.
Most E-Assessment systems boast a wide range of question types including Multiple Choice, Multiple Response, Free Text Entry/Text Matching and Numerical questions. The design of these types of questions is often quite restrictive and formulaic, which has a knock-on effect on the quality of feedback that can be provided in each case.
Multiple Choice Questions (MCQs) are the most prevalent, as they are the most prescriptive and therefore straightforward to mark consistently. They are also the question type most amenable to providing meaningful, relevant feedback for each possible choice.
Text-matching questions tend to be more problematic due to their free-text-entry nature. Common misspellings or case-sensitivity errors can often be accounted for by the software, but such measures are by no means foolproof: it is very difficult to predict in advance the full range of variations on an answer that a manual marker of a paper-based equivalent of the same question would consider worthy of marks.
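The kind of tolerance described above — case normalisation plus allowance for minor misspellings — can be sketched with a similarity threshold, here using Python's standard-library `difflib`. The function name and threshold are illustrative choices, not taken from any particular E-Assessment product.

```python
from difflib import SequenceMatcher

def matches_answer(student: str, accepted: list, threshold: float = 0.85) -> bool:
    """Case-insensitive match against a list of accepted answers,
    tolerating small misspellings via a similarity ratio."""
    s = student.strip().lower()
    for answer in accepted:
        a = answer.strip().lower()
        if s == a:
            return True
        # SequenceMatcher.ratio() is 1.0 for identical strings;
        # a threshold just below 1.0 tolerates minor typos.
        if SequenceMatcher(None, s, a).ratio() >= threshold:
            return True
    return False
```

Even with such a scheme the marker must still enumerate every acceptable answer in advance, which is exactly the limitation noted above.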
Numerical questions are similarly restricted. An answer can be checked for accuracy, or for whether it falls within a certain range of the correct answer, but unless it is a special-purpose mathematical E-Assessment system, the system is unlikely to have computational capability and so cannot, for example, award the “method marks” commonly given in paper-based marking.
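The two numerical checks just described — accuracy within a tolerance, and membership of a range — might be sketched as follows (function names are illustrative):

```python
import math

def mark_numeric(student_value: float, correct: float,
                 rel_tol: float = 0.01, abs_tol: float = 0.0) -> bool:
    """Accept an answer that is exactly right or within a tolerance band,
    e.g. rel_tol=0.01 accepts answers within 1% of the correct value."""
    return math.isclose(student_value, correct, rel_tol=rel_tol, abs_tol=abs_tol)

def mark_in_range(student_value: float, low: float, high: float) -> bool:
    """Accept any answer in the closed interval [low, high]."""
    return low <= student_value <= high
```

Note that both checks inspect only the final value; neither can see the working that produced it, which is why method marks remain out of reach for general-purpose systems.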
From a pedagogical perspective, the importance of providing useful formative feedback to students at the point when they can most benefit from it and put it to use cannot be overstated (Grieve et al., 2015; Ferrell, 2012).
In this work, we propose a number of software-based solutions that overcome the limitations and inflexibilities of existing E-Assessment systems.
Background: High fidelity patient simulation (HFPS) includes the use of computerised manikins to simulate real-life clinical scenarios, providing experiences and educational outcomes comparable to those of a clinical placement (Hayden et al., 2014). Educationalists have increasingly been using experiential teaching approaches such as HFPS to deliver end-of-life education; however, there is a dearth of robust evidence to support the effectiveness of such an approach. Pilot data suggest it is a viable approach to teaching end-of-life care; however, further research is warranted on a larger scale to confirm its effectiveness and to capture qualitative data on the student experience (Lewis et al., 2016). Here we report on the use of Studiocode to assist in the qualitative analysis of the HFPS video recordings that will be captured as part of this larger study.
Methods: Undergraduate nursing and medical students from Queen’s University Belfast will be recruited to take part in interprofessional HFPS sessions focusing on end-of-life care. Each session will be video-recorded with the students’ prior consent. The video-analysis software Studiocode will be used to label and sort the data, which will be analysed using thematic analysis informed by symbolic interactionism (Blumer, 1969).
Expected outcomes: By recording the simulation sessions, a greater understanding of the use of HFPS for end of life education and its impact on interprofessional practice will be obtained. The analyses of these videos will allow for a qualitative exploration of interprofessional teamwork, communication, and social organisation, in caring for the simulated dying patient.
Quantitative skills such as data visualisation, statistical analysis and results reporting are becoming highly sought-after by employers and form a core component of assessment for our degree pathways. Statistical analysis is advancing at such a rapid pace, due to both theoretical and computational advances, that the ability to perform statistical analysis has become inextricably linked with the ability to operate the (sometimes complex) software capable of undertaking that analysis. In order to make our students quantitatively literate, we need them both to understand the fundamental concepts of statistical inference and to be able to operate the statistical software that implements the analysis. However, students clearly have difficulty grasping the reality that the skills needed to operate a statistical software package are just as essential as being able to run a PCR machine. Thus far, teaching statistics and statistical software through lectures and tutorials has had mixed results, and we must ask: how can we best leverage the time we have to teach students about statistics while making sure they are skilled in statistical software? Qubstats.com is a website that has been set up to offer a vibrant online community for first-year QUB students taking Biodiversity (BIO1305) and to act as their “one-stop shop” for statistical programming and data analysis. The most important innovation the site brings to the learning experience is a dynamic question-and-answer portal designed to foster a community atmosphere of peer mentoring and support. In this talk, I will detail how the site is used and how successful it has been in improving quantitative literacy within the School of Biological Sciences.
The classroom of the future is one that is connected, collaborative, and built around digital devices, and the textbook of the future is an immersive and interactive experience that can offer personalised content tailored to individual wants, needs and methods of learning. The recent changes to the academic structure at QUB prompted the development of new Level 1 40-CAT modules with large numbers of students from different degree pathways and across different schools. In order to capitalise on digital technology in these new learning environments, one of the changes introduced in the development of one of these modules was the introduction of the SmartBook. The SmartBook is an interactive, adaptive e-textbook and a unique way to engage students in their learning. Students are assigned reading from the e-book, but the lessons change and are individualised in response to the student’s answers to pop-up quizzes. When students get questions wrong, the textbook can direct them to a text, graphical, or video explanation that attempts to better explain the concept in question. This personalised learning experience is designed to give students who are struggling time to understand subjects, while faster learners can surge ahead without getting bored. The interactive feature encourages students to read and explore more, and provides flexible access so students can read anywhere. Textbook material is more easily integrated into the course, and the text can be annotated to guide the students’ learning. The interactive e-textbook provides a continually adaptive reading experience, integrated learning resources and a visual analytics dashboard that delivers at-a-glance information on student performance, study behaviour and effort; it should expedite feedback to the students and facilitate distance learning.
Here we report on how this technology was introduced into a Level 1 Biological Sciences module, and on student opinions of the SmartBook’s efficacy in enhancing learning.
We demonstrate the use of the open-source digital tool Numbas (www.numbas.org.uk), which has been used for computer-based assessment of more than 150 Level 1 students in the School of Mathematics and Physics at Queen's in the academic year 2016/2017. Traditional multiple-choice approaches suffer the common flaws of being predictable, with the correct solution easy to spot. Other approaches require the solution to be numerical but do not allow for algebraic formulae. The novel approach, based on Numbas, allows the student to typeset their solution as a formula. Numbas interprets the student's solution and compares it with the lecturer's. This allows mathematics instructors to test students' knowledge in topics like calculus and linear algebra as in traditional pen-and-paper examinations. Another advantage is the capacity to randomise questions in a test, selecting them from a large set, and the capability to use random numerical values in a variety of questions. In this way the students cannot rely on memorising the answers to all possible questions, and can be examined on different questions at the same level with little risk of collaboration among the students during the test. The automated nature of this type of assessment, and the low workload associated with running it, enable us to set a higher pass threshold for the test (80%, given the basic nature of the questions we ask, as in a driving theory test), mitigated by the opportunity for the students to take the test multiple times without incurring a penalty.
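Numbas has its own marking engine, so the sketch below is not its implementation; it only illustrates the general technique of judging two algebraic expressions equivalent when they agree at randomly sampled points. The use of `eval()` here is for brevity only — a real system needs a proper, sandboxed expression parser.

```python
import math
import random

def expressions_equivalent(student_expr: str, model_expr: str,
                           var: str = "x", samples: int = 20,
                           tol: float = 1e-9) -> bool:
    """Compare two single-variable expressions by evaluating both at
    random sample points; agreement everywhere suggests equivalence."""
    # Restricted namespace: no builtins, just a few maths functions.
    env = {"__builtins__": {}, "sin": math.sin, "cos": math.cos,
           "exp": math.exp, "log": math.log, "sqrt": math.sqrt}
    rng = random.Random(0)  # fixed seed for reproducibility
    for _ in range(samples):
        x = rng.uniform(0.1, 2.0)  # avoid obviously singular points like x = 0
        scope = dict(env, **{var: x})
        try:
            a = eval(student_expr, scope)
            b = eval(model_expr, scope)
        except (ValueError, ZeroDivisionError):
            continue  # skip points outside both expressions' domains
        if not math.isclose(a, b, rel_tol=tol, abs_tol=tol):
            return False
    return True
```

This sampling approach also shows why randomised numerical values deter memorisation: two syntactically different but algebraically equal answers pass, while a memorised wrong formula fails at almost every sampled point.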
We present our results from Level 1 examinations in Pure and Applied Mathematics courses, showing that this digital platform can also be employed in other disciplines: biology, chemistry, engineering, computer science and economics.
Interprofessional Learning (IPL) is crucial for students in healthcare, as effective interaction within a multidisciplinary healthcare team is essential for patient safety.
As part of the 2017 Development Weeks, all first-year students from Pharmacy, Medicine and Nursing were invited to participate in a joint workshop on the topic of healthcare ethics. Due to the number of students potentially participating, it was decided that it would be practically infeasible to carry this out in a physical space, and that an online workshop would be best suited to the exercise.
A number of components made up the workshop, with an online discussion group forming the crux, where all of the interprofessional interaction took place. A website was developed to host the workshop with its main design goals being to ensure easy navigation and usability by students, who were all participating remotely.
Once logged on to the workshop website, each student completed a registration form, choosing a username which would identify them in their interprofessional discussion group (the students having been pre-allocated to groups). Case-study reading material on the topic of ethics in healthcare was then presented to the students, which they were given approximately 45 minutes to read.
At a specified time, the students opened their groups’ online discussion rooms, where they discussed the case studies for approximately one hour, directed by questions included in the reading material.
Staff members logged on to facilitate the groups’ discussion in each session. Each facilitator covered two to four tutorial groups’ online discussions simultaneously. If a student required individual assistance they could open a private messaging board to chat with one of the facilitators.
Following discussion, the students participated in a “sli.do” poll with questions relating to the discussion, listened to pre-recorded facilitator feedback on both cases and had the opportunity to download a certificate on completion of the workshop.
Much previous research has indicated that where a student sits in a university lecture theatre correlates with their final grade. Frequently, students who regularly sit in the front rows have been reported to achieve the highest grades. However, most of this research restricted student seat movement, which is unnatural and may have adversely influenced the results. A previously reported, unique unrestricted seat-tracking investigation by the authors of this research used a web and mobile software tracking application (PinPoint) to investigate seating-related student performance in a 12-week Java programming university module. The PinPoint investigation concluded that the best assessment results were achieved by the students in the front rows, and that assessment scores degraded the further students sat from the front. Additionally, while the most engaged students were found to regularly sit at the front, the same was not true of the most academically able or of those with the greatest prior programming experience. This paper presents a further analysis of the PinPoint data, focusing on assessment performance within similar groups (academic ability, engagement and prior programming experience), and additionally presents the results of a temporal movement study and a qualitative analysis of group and individual student seating decisions. It concludes that a comparison of student assessment performance within each of the peer groups, in every instance, found that the front-row students outperformed their peers sitting further back. This strongly suggests that there is a benefit to sitting at the front regardless of academic ability, engagement or prior subject knowledge. It also points to other, untested factors that may be positively influencing front-row performance.
In 2007 the School of History & Anthropology introduced a stand-alone dissertation. Prior to this, only Single Honours History students were required to do a dissertation, and it had to be based on a module taken in the first semester. There was a limited number of dissertation-related modules available, so student choice was restricted, and those not taking Single Honours were unable to avail of the opportunity to do original research. Since opening the module to all students in History and widening the choice to any topic of the student’s interest that a staff member is able to supervise, the module has proven very popular and successful; it is now taken by almost all Joint Honours students, as well as by Single Honours students, for whom it remains compulsory. Approximately 130 Level 3 students do the dissertation each year.
Widening the choice of topics available to students was facilitated by increasing access to digital primary resources. The dissertation must be primary source based, usually using manuscript sources. Until recently students would have had to travel to archival repositories in Ireland and Britain to access much of this material and in the case of dissertations on American or medieval European topics it would simply not have been possible to undertake such topics.
My presentation will show how the recent availability of digitised primary source material has facilitated original, high-quality research among undergraduate history dissertation students. Examples of collections made available online in recent years, to which our students now have free access, include:
- The Thatcher Foundation Archives
- The Irish Bureau of Military History and Military Service Pensions Collection
- The Irish Census of Population (1901 and 1911)
- Attestation papers for Canadian and Australian soldiers who served in the First World War
- The Book of Kells
- Digital collections of the Library of Congress
- Proceedings of the Old Bailey Online
The overall aim of the presentation is to demonstrate how the recent availability of digitised primary sources has transformed the nature and breadth of topics that undergraduate students can pursue for their History dissertations.
This paper describes the use of technology to deliver an undergraduate history module taught and assessed substantially through visual materials. The module aims to develop historiographical and transferable skills in digital and visual literacy, as well as to make Chinese history research accessible to undergraduate students.
The convenors taught similar material to postgraduate students and to undergraduate interns through direct access to thousands of rare historical photographs of China held at Queen’s. We observed that using the photographs gave students confidence to explore Chinese history, and enabled them to develop visual methodological skills which were otherwise not well covered at either UG or PGT level.
To scale up this approach into a Level 2 module, technology was needed to give students convenient access to photographs held all over the world. We also saw this as an opportunity to develop students’ transferable digital skills, building on the interests and experience of a generation exposed to technology since childhood.
The module is research-led for both staff and students, since it arises from the convenors’ ongoing research project and is taught substantially through group workshops and independent student research. Students’ learning for each unit begins with a selection of visual materials curated and annotated by the lecturers in Microsoft Sway. Staff guide students through each unit’s materials in a workshop in the Flexible Teaching Space, where students use tablets to work collaboratively exploring the assigned photographs and beyond to photograph databases around the world, including the lecturers’ own database on ContentDM. The module includes training in both using and creating digital visual materials, and the assessment includes the creation of a digital poster. The module therefore develops both digital and visual literacy, accompanied by traditional historical knowledge, but delivered through a more engaging and immediate approach.