CONTEMPORARY EDUCATIONAL TECHNOLOGY, 2016, 7(2), 138-159

Gamification in Science Education: Gamifying Learning of Microscopic Processes in the Laboratory

Katja Fleischmann, James Cook University, Australia
Ellen Ariel, James Cook University, Australia

Abstract

Understanding and trouble-shooting microscopic processes involved in laboratory tests are often challenging for students in science education because of the inability to visualize the different steps and the various errors that may influence the test outcome. The effectiveness of gamification, or the use of game design elements and game-mechanics, was explored in a year-long research project which saw the development of a web-based learning tool that visualized Enzyme Linked Immunosorbent Assays (ELISAs) in a digital laboratory. A cohort of 30 students from the Bachelor of Medical Laboratory Science trialed the first prototype and provided feedback in a survey on their learning experience and the extent to which the learning tool enhanced their learning. This article provides additional insights into likely future trends of substituting traditional learning modes such as lectures and practical laboratory classes with gamified content.

Keywords: Gamification; Game-based learning; Higher education; Science education

Introduction

Understanding and trouble-shooting microscopic processes involved in laboratory tests are often challenging for students in Biomedical Science, Medicine, Medical Laboratory Science and Veterinary Science. This is because of the inability to visualize the different steps and the various errors that may influence the test outcome. A typical test these students have to learn and master in the laboratory is the Enzyme Linked Immunosorbent Assay (ELISA). ELISAs are designed to detect specific antigens from, for example, a viral agent such as the Dengue fever virus. The detection consists of several steps, where reagents in each step link onto the previous if they are compatible.
After an incubation period the excess reagent is washed away and the next step is applied. This linking and washing continues until an indicator system can be applied, creating a color change proportional to the number of positive reactions in previous steps (Figure 1).

The current learning environment does not provide a visualization of this microscopic process, and students are faced with success or failure at the end of the test, with little chance for them or the educator to find out where they may have gone wrong. Such an experience leaves students disillusioned and calls into question the value of the practical application of these assays in a learning environment.

Figure 1. ELISA Plates Hosting the Microscopic Test Procedure (bottom image public domain, from http://en.wikipedia.org/wiki/ELISA#/media/File:ELISA.jpg)

This hurdle in teaching and learning spurred a cooperation between students and educators of the School of Veterinary and Biomedical Sciences, the School of Creative Arts and the School of Business at James Cook University, north Queensland, Australia, in order to develop a web-based learning tool for students of the Bachelor of Medical Laboratory Science and Bachelor of Veterinary Science. The tool was designed to visualize the microscopic ELISA process during its different steps, providing immediate feedback on the application of the various reagents. It was envisioned that the learning tool would be available online to allow students to explore the ELISA in their own time and at their own pace. This year-long project was integrated into the existing curriculum, with design students, information technology students and medical laboratory science students collaborating on the development and design of the learning tool under the supervision of three educators from the respective disciplines.
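The link-and-wash sequence described above is essentially a chain-building process, and can be sketched as a toy simulation. The reagent names and the compatibility map below are illustrative assumptions only, not a description of the authors' learning tool:

```python
# Toy model of the ELISA step logic: each reagent binds only to the layer
# already on the plate; an incompatible reagent is simply washed away.
# Reagent names and the BINDS_TO compatibility map are hypothetical.

BINDS_TO = {
    "plate": "capture_antibody",
    "capture_antibody": "antigen",
    "antigen": "detecting_antibody",
    "detecting_antibody": "conjugate",
}

def run_elisa(steps):
    """Apply reagents in order; return the final bound layer, or None as
    soon as an incompatible reagent fails to link and is washed away."""
    layer = "plate"
    for reagent in steps:
        if BINDS_TO.get(layer) == reagent:  # incubate: reagent links on
            layer = reagent
        else:                               # wash: unbound reagent removed
            return None
    return layer

def colour_change(final_layer):
    """The indicator produces a colour only if the full chain built up."""
    return final_layer == "conjugate"

correct = ["capture_antibody", "antigen", "detecting_antibody", "conjugate"]
print(colour_change(run_elisa(correct)))                   # True: positive test
print(colour_change(run_elisa(["antigen", "conjugate"])))  # False: wrong order
```

This toy model also illustrates the teaching problem the article describes: a single omitted or out-of-order step yields a negative result with no visible indication of where the chain broke.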
In order to develop a learning tool that is engaging and motivates students to learn, the student and educator teams looked into the concept of gamification of learning, which is the use of game design elements and game-mechanics in a learning context (Kapp, 2012). Some authors see great potential in games for science education in particular; for example, Honey and Hilton (2011) suggest that they enable learners "to see and interact with representations of natural phenomena that would otherwise be impossible to observe" (p. 1). However, despite an increased interest in gamification of learning in higher education and an increased application of game-based learning in the classroom, there is still not enough evidence showing the effectiveness of games in science instruction or learning (Ke, 2009, in Morris et al., 2013, p. 11; O'Neil et al., 2005). The lack of evidence for the efficacy of gamification is not specific to science education but applies to higher education in general. Recent studies mapping the existing literature (e.g., Borges et al., 2014; Dicheva et al., 2015; Hamari et al., 2014) generally came to the conclusion that there is a growing interest in gamifying educational content; indeed, more papers report on gamification in higher education than on marketing or customer engagement. Furthermore, there is often a lack of empirical evidence to support claims about positive impacts on learning outcomes, and research on gamification in education is still in its infancy. It is particularly important to explore the contrasting opinions on the effectiveness of gamification in education that have emerged (e.g., Hanus & Fox, 2015). Dicheva et al. (2015) note that while the concept of gamification may look simple, "the analyzed [sic] work demonstrates that gamifying learning effectively is not" [simple] (p. 84).
Therefore, the authors of this paper set out to gain a better understanding of gamification in science education and, in particular, to evaluate the extent to which the gamification of the ELISA can enhance student learning. The developed web-based learning tool was trialed with a cohort of students in the Bachelor of Medical Laboratory Science and evaluated in terms of learning enhancement and user experience.

Beyond Sugaring the Pill: Gamification in Higher Education

Utilizing games for learning in higher education has developed significantly since its beginning, from what Jong, Lee and Shang (2013) called "sugaring the pill" for the purpose of making learning more interesting. While Ebner and Holzinger (2007) found little evidence of the use of games in higher education in 2007, game-based learning or gamification of learning has since received increasing attention in the higher education context. Gamification was, until recently, broadly defined as the use of game design elements and game-mechanics in a non-game context with the goal of engaging users in solving problems (Deterding, Dixon, Khaled & Nacke, 2011; Huotari & Hamari, 2012). Typical game elements applied in a non-game context are, for example, point scoring, competition with others and rules of play (Oxford Dictionary, 2012). Wu (2012) offers a more comprehensive definition reflecting recent and rapid developments: "Gamification is the use of game attributes to drive game-like player behavior [sic] in a non-game context." This definition has three components:

1. The use of game attributes, which includes game mechanics/dynamics, game design principles, gaming psychology, player journey, game play scripts and storytelling, and/or any other aspects of games
2. To drive game-like player behavior [sic], such as engagement, interaction, addiction, competition, collaboration, awareness, learning, and/or any other observed player behavior [sic] during game play
3.
In a non-game context, which can be anything other than a game (e.g. education, work, health and fitness, community participation, civic engagement, volunteerism, etc.)

Gamification is a rather young phenomenon. According to Fitz-Walter (2013) the term was first coined in 2003 by Nick Pelling, while Deterding, Dixon, Khaled and Nacke (2011) argue that its first recorded use was in 2008. However, gamification has gathered momentum in recent years, in particular as an online marketing strategy to encourage engagement with a product or service. Well-known examples are frequent flyer programs (gamification of enterprises) and Nike+ (gamification of health) (Dicheva et al., 2015; Kim, 2015). Gamification of content occurs in a variety of contexts today and is widespread, with "over 50 percent of organizations [sic]" using gamification to engage employees (Swann, 2012), and increasingly in a learning and training context. In higher education, gamification is most often discussed in relation to the engagement of students in the learning process. According to Renaud and Wagoner (2011) and Villagrasa and Duran (2013), providing the opportunity to acquire knowledge through the use of game mechanics is a major driver to engage and motivate students to learn, a view which is supported by the findings of Poole et al. (2014) and the review by Hamari et al. (2014). Another benefit of gamification highlighted in some studies is improved learning, which Borges et al. (2013) classify as "the gamified solutions to enhance the way students learn, maximizing the results of the learning process" (p. 221). Hamari et al. (2014) also conclude in their review that gamified interventions increased students' knowledge of the topic. Dicheva et al. (2015) found in their systematic mapping study that, although many papers lack research rigor, gamification has the potential to improve learning "if it is well designed and used correctly" (p.
83). Instant and frequent feedback takes on an important role in gamified learning and can create positive outcomes (e.g., Hanus & Fox, 2015; Nah et al., 2014). Nah et al. (2014) state in their review of the literature that "the more frequent and immediate the feedback is, the greater the learning effectiveness and learner engagement" (p. 406). Interestingly, some features of gamification appear to elicit opposing outcomes in individual studies. These include features or elements that are widely used for gamifying learning content, such as leaderboards, badges and rewards (Hanus & Fox, 2015; Nah et al., 2014; Poole et al., 2014). These features can not only engage students more actively but also increase competition. While Buckley and Doyle (2014) consider the competitive aspect motivating and exciting for individual students, Hanus and Fox (2015) suggest that encouraging competition harms motivation, and Hamari et al. (2014) note that some studies also report increased competition between students as a negative outcome. Some authors raise the concern that the results of gamification may not be long-term but produced by "the novelty effect" (Hamari et al., 2014, p. 3028). Similarly, De-Marcos et al. (2016) question, as a result of their study, the long-term sustained engagement of students with gamified learning content and suggest that more research is needed to reject or support claims that gamification leads to shallow learning.

Gamification in Biomedical Science Education

Learning and teaching in the biomedical sciences is largely structured around on-campus face-to-face learning experiences where information transmission occurs in lectures, tutorials and/or laboratory sessions. Digital game-based learning or gamification of learning is seen by some authors as the next step in offering contemporary science education, in particular with today's students, who are described as "sophisticated consumers of digital content" (Epper, Derryberry & Jackson, 2012, p. 8).
The Federation of American Scientists (2006) recognized early on that "people acquire new knowledge and complex skills from game play," suggesting gaming could help address "one of the nation's most pressing needs: strengthening our system of education and preparing workers for 21st century jobs" (p. 3). As an outcome of a summit on educational games, the Federation of American Scientists (2006) acknowledged several attributes of games that would be useful for application in learning environments. These include, among others:

contextual bridging (i.e., closing the gap between what is learned in theory and its use); high time-on-task [high engagement time]; motivation and goal orientation, even after failure; providing learners with cues, hints, and partial solutions to keep them progressing through learning; personalization of learning; and infinite patience. (Federation of American Scientists, 2006, p. 5)

Requiring students to problem-solve by presenting a problem within its wider, real-life context and integrating game-like elements in learning is not new in higher education. For example, digital simulations provide a learning environment in which the student can practice "difficult, exacting, life-threatening, or mission-critical skills" (Epper, Derryberry & Jackson, 2012, p. 2). Digital simulations provide real-life experiences in which students "[can] immerse themselves in a realistic simulated setting without the fear of real life consequences" (Ebner & Holzinger, 2007, p. 3). Nevertheless, the difference between simulations and gamified learning content is that simulations usually lack the competitive character of games or the "by chance" element (Epper, Derryberry & Jackson, 2012, p. 2). Other features or game mechanics that can assist and motivate students to learn are narratives, progress mechanics, immediate feedback and player control (Honey & Hilton, 2011; Morris et al., 2013).
A game-like environment also allows the construction of a problem context and lets the learner approach a problem via multiple pathways (Ebner & Holzinger, 2007). Although digital games for learning can have powerful features, they must be driven by learning outcomes (De-Marcos et al., 2016; Federation of American Scientists, 2006; Hanus & Fox, 2015; Jong, Lee & Shang, 2013). Epper, Derryberry and Jackson (2012) argue that distinctions between the real world and the online world are "blurry" for students (p. 9). Therefore, digital game-based learning or gamification of learning is seen by some authors as a step toward contemporary science education. Morris et al. (2013) in particular argue that scaffolding the development of scientific thinking can be achieved via educational and cultural tools in which gamification will play a major role as a contributor to science education. Honey and Hilton (2011) argue that game-based learning could be part of a new approach to science education in which teachers "spark students' interest by engaging them in investigations, helping them to develop understanding of both science concepts and science processes while maintaining motivation for science learning" (p. 1). Morris et al. (2013) support this argument and see games in science education as tools to create positive user experiences whose "motivational and learning potential can be repurposed to enhance science education" (p. 2). More specifically, Morris et al. (2013) see potential benefits of game-based learning in science education in that students are provided with "an authentic context in which players can demonstrate what they have learned, as opposed to standardized [sic] tests" (p. 2); that learning can be scaffolded to provide knowledge according to the learner's development; and that students are allowed to fail without penalty (as opposed to an exam), which encourages the students' curiosity.
Overall, gamification of learning is considered to support the enquiry-based approaches found in science education, with some authors arguing that science education can be enhanced through gamification of learning. A hands-on example is described by Geelan et al. (2015), who studied the use of gamification to teach the fundamentals of bioscience to first-year undergraduate students. In the study, gamification was used to engage students more actively in the unit content, improve content retention and increase the productivity of study time for students (p. 3). The study reported that students found the system to be fun and interesting, and "it was both reported and observed that learning outcomes were significantly improved" (p. 16). Geelan et al. (2015) attribute these positive outcomes of classroom gamification to the inclusion of game-based elements that "reduce the likelihood that students would engage in playing the game rather than in learning through the game" (p. 16).

The ELISA Learning Tool: Project Development

The project, called A Collaborative Approach to On-line Learning of Diagnostic Test Function, was funded by a JCU Teaching and Learning Development grant and involved a collaboration between academics and students from the School of Veterinary and Biomedical Sciences, the School of Creative Arts and the School of Business. The aim was to develop a web-based learning tool that would make microscopic processes visible to foster learning and understanding of the ELISA process; use principles of gamification to enhance the learning experience and motivate students to learn; allow students to learn and practice the ELISA independently of the laboratory environment and without the need for the educator to be present; and, therefore, allow students to learn at any time and in any place.
The project ran for a year, with students from three disciplines (media design, information technology and medical laboratory science) collaborating on the design and development of the ELISA Learning Tool. Throughout the development process the designs were tested by medical laboratory (MedLab) science students and their feedback considered to improve the product. At a final presentation, the best product from seven student teams was chosen, tested again and finalized as a prototype to be trialed with a cohort of MedLab Science students (Figure 2).

Figure 2. Students from Media Design, IT and MedLab Science Collaborating and User-Testing the ELISA Learning Tool Prototype

The ELISA Learning Tool: Prototype

The ELISA Learning Tool is a web-based interactive multimedia application which allows students to learn about the ELISA and check their knowledge in a digital environment. Because of the complexity of such a project, which requires the collaboration of designers, IT and content specialists within particular budget and time specifications, the educator team decided to develop a prototype with limited functionality first, before engaging in the full development of the application. This decision was also guided by the lack of existing research concerning hands-on experience with gamification in science education. To work towards a long-term solution, it was decided to explore this new territory through an iterative process, which required testing the prototype throughout its developmental stages and testing the final prototype with a cohort of MedLab Science students. The first prototype introduced students to a home screen where they could select to play ELISA or to access information on the ELISA. The latter provided students with information on the theory and procedure of the ELISA (Figure 3).
When students decided to play ELISA they were able to start testing their knowledge in two sections, Antigen Detection and Antibody Detection, with four possible ELISAs (or challenges) provided in each section. In the case of Antibody Detection, students could design an ELISA to detect an antibody against one of the following viruses: Chicken Pox, Measles, Hepatitis A and Rubella (Figure 4). Once a test or challenge was selected, students entered the virtual laboratory, in which larger-than-life visualizations of ELISA elements were presented (e.g. Capture, Antigen, Detecting and Conjugate) (Figure 5). The laboratory environment also displayed control buttons such as incubate, wash and reset, symbolizing actions to be undertaken during an ELISA in the real-world laboratory environment. The look and feel of the application was kept game-like by using a distinct illustrative style. This style was playful, but at the same time effective information design was used to create icons and symbols that were informative at first sight (Figure 6). Rollover actions revealed further information on elements available for the test, and context-sensitive help was provided throughout the game through mouse movement (Figure 7). Students could access the information icon throughout the game, which also led them to explanations on How to Play ELISA (Figure 8). Animations and sound were integrated into the ELISA Learning Tool to enhance the learning experience and to further engage students in this style of learning. After each test was concluded, students received instant feedback on their result and were offered the chance to try again or move on to a different challenge.

Figure 3. Home Screen and ELISA Background Information

Figure 4. After Selecting Antibody Detection Students Can Select One of the Four Challenges Presented

Figure 5.
Antibody Detection for Chicken Pox: Visualizing Microscopic Processes through a Distinct Illustration Style and User-Friendly Interface

Figure 6. Control Buttons and Icon Design

Figure 7. Context-Sensitive Help Text Triggered by Mouse Movements Such as Clicking on Elements or Rollover Actions Is Provided Throughout the Game

Figure 8. Explanations on How to Play ELISA Are Provided

Research Methods

A cohort of third-year MedLab Science students was invited to participate in different learning activities around the ELISA and to provide feedback on their learning experience. The students participated in a lecture, which introduced them to the theory of the ELISA, and a practical class in the laboratory in which they could practice what they had learned. Subsequently, students used the ELISA Learning Tool to consolidate and test their knowledge further. In order to evaluate the value and effectiveness of the ELISA Learning Tool, students (n=30) were invited to provide feedback in an anonymous questionnaire after all learning activities were completed. The questionnaire explored the extent to which learning the ELISA can be enhanced through gamification of learning utilizing the developed web-based learning tool. In particular, the following questions were investigated:

1. How successful was the ELISA Learning Tool as a learning experience for students?
2. Would students use the ELISA Learning Tool for their learning, and hence could it substitute for the practical laboratory classes?

Questions in the questionnaire were designed in the form of multiple-choice questions (e.g. Did you enjoy learning the ELISA with the help of the learning tool? Yes/No). The questions were often followed by an open-ended question which asked students to explain their answer choices in more detail (e.g.
Why did you or did you not enjoy using the ELISA Learning Tool? Please explain). Quantitative data are presented as response counts, percentages and a tally of response totals. Qualitative data collected were analyzed by looking for common topics or coding themes, which are overarching themes that can be identified within the data (Punch, 2009). The coding categories (topic coding) were not pre-established but left to emerge. Typical student comments are presented to illustrate each theme.

Findings: Student Feedback on the Learning Value and Experience of the ELISA Learning Tool

Learning Value of Activities

A cohort of 30 students participated in the learning activities (lecture, practical class, ELISA Learning Tool) and provided feedback in the questionnaire. Of these students, 29 (97%) attended the lectures, and all students (100%) attended the practical class and trialed the ELISA Learning Tool. The feedback gathered on the effectiveness for learning of the lecture, the practical class and the ELISA Learning Tool provides some interesting insights. Regarding the lecture, 13 students (45%) found it very useful, 11 students (38%) found it useful and five students (17%) were undecided. One student did not answer this particular question. Student feedback on the effectiveness of the practical class in the laboratory yielded similar results, although a slightly higher percentage was undecided: 13 students (43%) found the practical class very useful, nine students (30%) found it useful, and eight students (27%) were undecided. The feedback on the effectiveness of the ELISA Learning Tool deviates from the feedback on the lecture and practical class in the very useful category. Of the 30 students who trialed the ELISA Learning Tool, seven (23%) found it very useful, 13 (43%) found it useful and five (17%) were undecided.
There were also five students (17%) who stated that they found the ELISA Learning Tool not very useful, a category not selected for the lecture and practical class. None of the students selected not useful at all in the survey. Table 1 overviews the feedback on the three learning activities.

Table 1. Student Feedback on the Learning Value of Different Learning Activities

How do you assess this activity in terms of learning value to you?

                                Lecture     Practical class    ELISA Learning Tool
very useful                     13 (45%)    13 (43%)           7 (23%)
useful                          11 (38%)    9 (30%)            13 (43%)
neither useful nor not useful   5 (17%)     8 (27%)            5 (17%)
not very useful                 -           -                  5 (17%)
not useful at all               -           -                  -
n=                              29          30                 30

Authenticity of Learning with the ELISA Learning Tool

It was important to investigate whether students experienced the ELISAs presented in the Learning Tool as an accurate simulation of the real testing process in the laboratory. Table 2 shows that the majority of students (90%) felt the simulations were realistic.

Table 2. Student Feedback on the Effectiveness of the ELISA Simulation in the ELISA Learning Tool

Do you feel the ELISA Learning Tool simulates the real testing process accurately?
Yes    27 (90%)
No     3 (10%)
n=     30

Students were also asked to provide feedback on their learning experience, specifically whether they enjoyed learning with the ELISA Learning Tool. A very high number of students, 27 (90%), stated that they enjoyed learning the ELISA with the help of the ELISA Learning Tool. Only three students (10%) did not enjoy the experience (Table 3).

Table 3. Student Feedback on the Learning Experience

Did you enjoy learning the ELISA with the help of the Learning Tool?
Yes    27 (90%)
No     3 (10%)
n=     30

When analyzing the qualitative feedback regarding students' positive learning experience (90% of students enjoyed using the ELISA Learning Tool), four main themes emerged.
These themes are overviewed in Table 4, with each theme illustrated by typical comments.

Table 4. Student Feedback on Reasons for a Positive Learning Experience while Using the ELISA Learning Tool

Why did you enjoy using the ELISA Learning Tool?

Common theme: Gained a better understanding of the process of the ELISA
Typical comment(s): "I think I actually gained a much better understanding of ELISA." "Clarified the sequence of events."

Common theme: Interactivity and visualizations help to understand and memorize the process
Typical comment(s): "Interactive and it was easy to visualize what actually happens which you can't see in a practical." "Seeing pictures of the animals was useful in remembering; also the info in the beginning was helpful."

Common theme: Timesaving
Typical comment(s): "Cool antibody/antigen animals, and there was also no need to incubate and come back the next day."

Common theme: Learning this way is fun
Typical comment(s): "This game was fun and made understanding ELISA easy." "Allowed me to see what we are learning in a simplistic way, which makes it easier to understand. It is also a game, which makes it fun."

The feedback from the three students (10%) who did not enjoy the learning experience gave insight into why this was the case. Two students found the test exercise too simple and childish for a university student, and also too repetitive to facilitate learning. The other student found the tests provided just too complicated. A deeper insight into what worked best for students when using the ELISA Learning Tool was gained by asking students to provide three reasons for what they liked most about the learning tool. The analysis of the feedback from the 27 students (90%) who enjoyed learning with the help of the Learning Tool revealed the following reasons, with the number of references made shown in brackets.
Information about the ELISAs and instructions provided (13)
Interface design/graphical representation of the antigens/antibodies (13)
Easy to use and understand (11)
Quick (7)
Interactivity/animation/dynamic (5)
Rewarded and no penalties (3)

The ELISA Learning Tool as a Substitute for Students' Learning in the Laboratory

The development, design and production of the ELISA Learning Tool required substantial effort and resources from all involved. It was therefore important to evaluate whether the traditional way of teaching the ELISA, through lectures and a practical class in the laboratory, should continue, and whether it would be worthwhile to further develop the Learning Tool as a potential substitute for practical laboratory classes and/or lectures. Table 5 overviews student feedback on the likelihood of using the ELISA Learning Tool to learn or practice the ELISA.

Table 5. Student Feedback on the Potential Use of the ELISA Learning Tool

How likely is it that you would use the ELISA Learning Tool for study purposes?
I would definitely use it to practice and learn.    7 (23%)
I would most likely use it.                         12 (40%)
I am not sure if I would use it.                    7 (23%)
I would not use it.                                 4 (13%)
n=                                                  30

It is encouraging that 63% of students would definitely or most likely use the ELISA Learning Tool for study purposes. The analysis of the 24 responses regarding why students would or would not use the learning tool revealed various perspectives. The main reason to use the learning tool was seen in its ability to make the ELISA process visible (mentioned by eight students), which helps in understanding the process. The following comments illustrate this:

"The learning tool allows me to understand what is happening in the ELISA that you cannot see in the practicals."
"The graphics stimulate learning and made it easier, especially to understand ELISA."
"Clear visualization of ELISA step and process help in understanding the ELISA."
Four students highlighted the ability to test their acquired knowledge with the ELISA Learning Tool:

"You can test your knowledge."
"It confirms knowledge that you are unsure of."
"It tells me whether I did the ELISA correctly."
"I would use it to help me understand the practical classes."

While one student explained that it is "better than reading about it," others (3) stated their preference for learning about the ELISA from books or lecture notes. As one student commented: "Books and lectures and practical classes tell you all this and way more. It's not necessary." Two students did not find the ELISA demanding enough, as expressed in this comment: "My understanding of ELISA is beyond this game and I don't need it."

Discussion

The student feedback gave valuable insight into their learning experience when using the ELISA Learning Tool and hence into the value of gamification for enhancing the process of learning for students. The feedback showed that students still valued a lecture or practical laboratory class slightly higher than learning and practicing the ELISA by using the ELISA Learning Tool: the combined very useful/useful ratings were 83% for the lecture and 73% for the practical class, an average of 78%, compared with 66% for the Learning Tool, a difference of 12%. The lecture was rated highest for learning value by students. The practical laboratory class was rated less useful than the lecture but more useful than the ELISA Learning Tool. The lower rating of the practical application of the ELISA, in either the real world (laboratory) or the virtual world (ELISA Learning Tool), might point to a problem that students have in putting theory into practice. Experiencing the value of the lecture, practical laboratory class and ELISA Learning Tool differently can also relate to students' different learning styles (e.g. Honey & Mumford, 1986; Kolb, 1984), resulting in preferences for different learning modes (Buch & Bartley, 2002). Nevertheless, of the 30 students who trialed the ELISA Learning Tool, 66% found the learning activity useful or very useful.
This is still a positive outcome considering that this first prototype was not fully functional. Nevertheless, a third of the students provided less positive feedback: 17% stated that they found the learning activity not very useful and another 17% were undecided regarding its value for their learning.

Interestingly, at the same time a very high number of students from the same cohort (90%) stated that they enjoyed learning the ELISA with the help of the Learning Tool. This is a striking discrepancy: while only 66% of students found the learning activity very useful or useful, 90% stated that they enjoyed learning with the learning tool. Some students even described learning with the Learning Tool as fun. Why is that? Some explanations can be offered by analyzing the qualitative feedback from students and looking at the features of the ELISA Learning Tool that were trialed.

First of all, the ELISA Learning Tool was tested as a prototype, and hence not all features that make gamification of learning beneficial were fully developed and implemented. For example, the prototype offered students the opportunity to test their knowledge in eight different test scenarios or challenges. In reality (and envisioned as part of a future prototype) there are 40 tests with varying degrees of difficulty that students would need to learn and perform. Feedback from some students highlighted this particular matter: they felt the game was repetitive, and for some students it was not challenging enough (e.g. "My understanding of ELISA is beyond this game"). It is a known issue in any higher education learning environment that students can have diverse levels of pre-existing knowledge and hence engage differently in learning activities.
Furthermore, students can have diverse learning styles and speeds of learning, which can also lead them to reflect differently on the usefulness of learning activities. However, gamification of learning is meant to address exactly such matters: implementing game-like features, such as difficulty levels in a learning application, caters for the different developmental stages of student learning. Students could then start exploring topics, such as learning the ELISA, according to their knowledge level, and hence would be neither overwhelmed nor unchallenged. The trialed prototype did not include this particular feature. Although it was seen as essential by the educator team and envisioned to be included, the iterative approach to the development of the Learning Tool meant that this feature was not implemented in the first prototype. This weakness of the trialed prototype is also reflected in student feedback on the future use of the ELISA Learning Tool. It can be assumed that students who were not challenged by its use were among those who stated that they would not use the ELISA Learning Tool for study purposes (13%). Similarly, it is suggested that students who found the learning activity useful or very useful (66%) were also inclined to use the Learning Tool in the future, with 63% of students stating they would either definitely or most likely use it.

Secondly, the high satisfaction of students in learning the ELISA using the Learning Tool (90%) can clearly be linked to the gamification of the ELISA. Students stated that they gained a better understanding of the process of the ELISA and that the interactivity and visualization helped them to understand and memorize the process. As suggested in the literature (Geelan et al., 2015; Honey & Hilton, 2011; Morris et al., 2013), the benefits of using game mechanics were clearly highlighted in students' feedback.
Students were in favor of receiving immediate feedback, and the graphics helped them to visualize the process, making visible a process that is otherwise invisible in the laboratory. The feedback from students regarding a better understanding of the ELISA and of what was going on in the test was particularly powerful, as it emerged in answers to several different questions. Furthermore, it was encouraging that a high percentage of students (90%) found that the ELISA Learning Tool simulated the laboratory test authentically. This is particularly positive as some students saw the use of the virtual test as time saving (quick) and appreciated the instant feedback. The incubation time in a real-world laboratory, for example, could require students to wait for long periods or even to come back the next day to see the result, and an educator would need to be present to provide feedback (if possible) on the success or failure of a test outcome. Avoiding frustration about not being able to find out what went wrong in an ELISA is a particularly positive result of applying a game-based learning approach. Providing instant feedback in combination with additional contextual information enables students to learn from mistakes and can enhance their understanding of the process. Contextual information was, however, another feature not fully developed at all levels in the first prototype, which means that not every mistake that could be made triggered feedback in the application. Students nevertheless commented positively on the instructions and information about the ELISA provided throughout the game. Other typical benefits highlighted in the existing literature on gamification of learning also emerged in this research (e.g. Hanus & Fox, 2015; Lee & Hammer, 2011). For example, students appreciated that they could test their knowledge without fear of failure or penalties.
It was also positively mentioned that learning was made interactive and dynamic and was enhanced through animations and sound. Overall, a large number of students commented on the effectiveness of the graphics, which helped to stimulate learning and to understand the ELISA.

Conclusion

The development of the web-based learning tool aimed to make the microscopic processes of the ELISA visible through the gamification of learning content, in order to help students understand these processes and consequently make learning the ELISA easier. The design of the application used principles of gamification, such as a game-like interface design, a choice of challenges or test scenarios, instant feedback, repeat on failure or progress on success, and enhancing features such as sound and animation. Student feedback from the first prototype trial can overall be seen as positive, in particular in relation to the effectiveness of the application in visualizing the otherwise invisible processes of ELISAs. The feedback from Medical Laboratory Science students was encouraging with regard to enhancing their understanding of Enzyme Linked Immunosorbent Assays (ELISAs). In the case of the ELISA Learning Tool, gamification of learning content is seen as a successful approach to enhancing student learning by helping students understand the microscopic laboratory process of ELISAs. Furthermore, the gamification of content led to the large majority of students enjoying learning with the learning tool. Student feedback also highlights the potential for further developments, which would include the introduction of various difficulty levels to cater for the diverse developmental stages of students. Other features of gamification, such as collecting points or high scores, could be implemented to spur competitiveness among students and hence further their engagement with the learning tool. More research needs to be done.
For example, it is not clear to what extent learning was inhibited by usability issues of the application itself rather than by challenges in comprehending the content. Future trials with a fully functional prototype would need to test the actual learning that had occurred, through either an external exam or one completed as part of the learning tool. Although the ELISA Learning Tool is currently seen as an additional learning activity to complement lectures and practical laboratory classes, the initial positive feedback, and hence the success of gamification of learning in this particular case, suggests that with further careful development the web-based ELISA Learning Tool could substitute for the traditional modes of learning in the long term.

References

Borges, S. D. S., Durelli, V. H. S., Reis, H. M., & Isotani, S. (2014). A systematic mapping on gamification applied to education. Proceedings of the 29th Annual ACM Symposium on Applied Computing (SAC '14), March 24-28, Gyeongju, Korea.

Buch, K. & Bartley, S. (2002). Learning style and training delivery mode preference. Journal of Workplace Learning, 14(1), 5-10. doi: 10.1108/13665620210412795

Buckley, P. & Doyle, E. (2014). Gamification and student motivation. Interactive Learning Environments, 22(6), 1-15. doi: 10.1080/10494820.2014.964263

De-Marcos, L., Garcia-Lopez, E., & Garcia-Cabot, A. (2016). On the effectiveness of game-like and social approaches in learning: Comparing educational gaming, gamification & social networking. Computers & Education, 95, 99-113. doi: 10.1016/j.compedu.2015.12.008

Deterding, S., Dixon, D., Khaled, R., & Nacke, L. (2011). From game design elements to gamefulness: Defining gamification. Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments.

Dicheva, D., Dichev, C., Agre, G., & Angelova, G. (2015). Gamification in education: A systematic mapping study. Educational Technology & Society, 18(3), 75-88.

Ebner, M. & Holzinger, A. (2007). Successful implementation of user-centered game based learning in higher education: An example from civil engineering. Computers & Education, 49(3), 873-890.

Epper, R. M., Derryberry, A., & Jackson, S. (2012). Game-based learning: Developing an institutional strategy (Research Bulletin). Louisville, CO: EDUCAUSE Center for Applied Research.

Federation of American Scientists. (2006). Harnessing the power of video games for learning. Summit on Educational Games. Washington, DC: FAS.

Fitz-Walter, Z. (2013, January 23). A brief history of gamification. Retrieved on 21 August 2015 from http://zefcan.com/2013/01/a-brief-history-of-gamification/

Geelan, B., Salas, K. d., Lewis, I., King, C., Edwards, D., & O'Mara, A. (2015). Improving learning experiences through gamification: A case study. Australian Educational Computing, 30(1). Retrieved on 11 April 2016 from http://journal.acce.edu.au/index.php/AEC/article/view/57/pdf

Hamari, J., Koivisto, J., & Sarsa, H. (2014). Does gamification work? A literature review of empirical studies on gamification. 47th Hawaii International Conference on System Sciences (HICSS), January 6-9, Waikoloa, HI, USA.

Hanus, M. D. & Fox, J. (2015). Assessing the effects of gamification in the classroom: A longitudinal study on intrinsic motivation, social comparison, satisfaction, effort, and academic performance. Computers & Education, 80, 152-161. doi: 10.1016/j.compedu.2014.08.019

Honey, M. A. & Hilton, M. L. (Eds.). (2011). Learning science through computer games and simulations. Washington, DC: National Academies Press.

Honey, P. & Mumford, A. (1986). The manual of learning styles. Berkshire, UK: Peter Honey.

Huotari, K. & Hamari, J. (2012). Defining gamification: A service marketing perspective. Proceedings of the 16th International Academic MindTrek Conference.

Jong, M. S., Lee, J. H., & Shang, J. (2013). Educational use of computer games: Where we are, and what's next. In Reshaping learning (pp. 299-320). Berlin: Springer.

Kapp, K. M. (2012). The gamification of learning and instruction: Game-based methods and strategies for training and education. San Francisco: Pfeiffer.

Kim, B. (2015). Gamification in education and libraries. Library Technology Reports, 51(2). Chicago.

Kolb, D. A. (1984). Experiential learning. Englewood Cliffs, NJ: Prentice Hall.

Lee, J. & Hammer, J. (2011). Gamification in education: What, how, why bother? Academic Exchange Quarterly, 15(2), 2.

Morris, B. J., Croker, S., Zimmerman, C., Gill, D., & Romig, C. (2013). Gaming science: The gamification of scientific thinking. Frontiers in Psychology, 4, Article 607. doi: 10.3389/fpsyg.2013.00607

Nah, F. F.-H., Zeng, Q., Telaprolu, V. R., Ayyappa, A. P., & Eschenbrenner, B. (2014). Gamification of education: A review of literature. HCIB 2014, held as part of HCI International 2014, June 22-27, Crete, Greece.

Oxford Dictionary. (Ed.) (2012). Oxford Dictionary.

Poole, S. M., Kemp, E., Patterson, L., & Williams, K. (2014). Get your head in the game: Using gamification in business education to connect with Generation Y. Journal for Excellence in Business Education, 3(2).

Punch, K. (2009). Introduction to research methods in education. London: Sage.

Renaud, C. & Wagoner, B. (2011). The gamification of learning. Principal Leadership, 12(1), 56-59.

Rossman, G. B. & Wilson, B. L. (1985). Numbers and words: Combining quantitative and qualitative methods in a single large-scale evaluation study. Evaluation Review, 9(5), 627-643.

Swann, A. (2012, July 16). Gamification comes of age. Forbes. Retrieved on 16 February 2016 from http://www.forbes.com/sites/gyro/2012/07/16/gamification-comes-of-age/

Villagrasa, S. & Duran, J. (2013). Gamification for learning 3D computer graphics arts. Proceedings of the First International Conference on Technological Ecosystems for Enhancing Multiculturality.

Wu, M. (2011, August 29). What is gamification, really? Retrieved on 16 February 2016 from http://community.lithium.com/t5/Science-of-Social-blog/What-is-Gamification-Really/ba-p/30447

Correspondence: Katja Fleischmann, Associate Professor in Media Design, College of Arts, Society and Education, James Cook University, Townsville City, Queensland, Australia
