Gaming The System: Measuring Gamification’s Impact On Student Learning

Abstract

The rapid advancement of digital technology has had undeniable transformative effects on numerous industries, including education. Gamification, a more recent technological innovation shaping the academic landscape, involves integrating game design principles into non-gaming contexts to improve learning outcomes and experiences. In this study, students enrolled in a private university finance class were taught either with gamified lessons and tools or through traditional, non-gamified methods. Assessment and analysis of their learning outcomes indicate that gamification is effective and has a positive impact on education: the students in the gamified methods group demonstrated significantly greater improvement in their understanding of finance and other business concepts. This study highlights the growing interest in the gamification of education, evidenced by an increasing body of literature, and the immense potential of gamification as a powerful educational tool. 

Introduction

There is an ongoing trend at institutions of higher education: these institutions tend to show more regard to faculty members with longer careers and more experience than to those who have been hired recently and are not far past their own graduation. The reasoning behind this is clear, and many of the established instructors are excellent educators who deserve higher esteem and appreciation. However, the more experience a teacher gains, the greater the distance grows between them and the student experience. A problem arises when the main factor driving career advancement and faculty retention is seniority, particularly when faculty members cling to traditional, tried-and-true instructional methods rather than adapting to change. 

Unfortunately, the prioritization of seniority can discourage bold changes or innovation by junior faculty members, and it can incentivize veteran instructors to continue teaching students the way they themselves were taught, perpetuating the cycle. While this has obvious implications for faculty members throughout their careers, it also has a profound impact on the learning experience of the students, who have voiced their frustration. For decades, students have submitted complaints that the teaching methods used in their courses fail to meet their expectations.

But it is not up to the students to chart the course. It is up to each institution to decide whether to embrace new teaching methods, which might allow a deeper connection with students and could in turn result in lower attrition and increased student satisfaction. It is up to each institution to turn things around. 

Literature Review

Student Expectations and Metrics

Higher levels of student satisfaction would not only change the number, or at the very least the nature, of student complaints; they would also lead to greater persistence through, and completion of, programs of study. Al Hassani and Wilkins (2022) found a “strong and positive relationship between student satisfaction and retention” (p. 1048). In other words, students who experience higher levels of satisfaction are more likely to persevere in their chosen programs of study and follow through to graduation (Wong & Chapman, 2023). Moreover, students who experience higher satisfaction also exhibit higher levels of engagement and motivation while enrolled (Annamdevula & Bellamkonda, 2016). 

Additionally, higher student satisfaction has been shown to influence other factors that continue even outside the classroom and beyond graduation, such as student loyalty, alumni engagement, peer support, and the overall financial well-being of the university (Ensmann & Whiteside, 2022). All of this being the case, universities should find the demonstration of a statistically significant positive correlation between student satisfaction and gamified teaching methods sufficiently compelling to explore this educational approach to improve student experiences and learning outcomes. 

Adoption of Gamification as a Pedagogical Modality

Over time, gamification has been increasingly adopted by K-12 educators to enhance their course content, with varying levels of success (Dichev & Dicheva, 2017). The aim of this modality is to improve critical thinking skills, which would in turn increase knowledge retention. Though popular among students, adoption of gamification remains relatively low in higher education. Various factors are believed to contribute to this low integration, including resistance to overhauling a course curriculum and trepidation about taking on the technological demands of gamification. Additionally, some educators hold the belief that gamification may not be suitable for university-level students, and more specifically, that it might oversimplify the curriculum and diminish its rigor. Regardless of the reasoning, gamification of teaching methods has yet to gain real traction in university education systems (Khaldi et al., 2023). 

However, these barriers can and should be overcome. Gamification benefits students in higher education. The rewards, incentives, and flexibility embedded in the dimensions of gamification improve student satisfaction and performance (Allan & Fryer, 2011). Applications based on gamified instruction concepts, such as Duolingo, have allowed users of all ages to learn new languages and other material; these concepts have also been used to provide training in professional seminars (Shortt et al., 2023). The evidence for the potential of gamification to benefit learners is compelling. 

Background of Gamification

Gamification involves the application of game design elements and techniques to non-game contexts. Gamification of teaching methodologies, therefore, involves incorporating techniques such as rewards, simulation, and roleplaying, to name a few, to enhance the learning process. It is by no means a new concept; however, it is still rather novel in the university setting. Gamification in the classroom has been gaining popularity due to its potential for accommodating different learning styles, which the rigidity of traditional lecturing and testing methods does not allow for (Nieto-Escamez & Roldán-Tapia, 2021). 

Teaching methods have adapted to changes in student preferences, and as technology has allowed. Older learners, such as Baby Boomers and members of Generation X, typically thrive when presented with clear tasks, deadlines, and an authoritative style of instruction. Younger learners, such as Millennials (the last generation to experience an environment that was not “phygital”) and members of Generation Z, show better learning outcomes with a blend of technology and strong peer connections. These younger learners also tend to value solving simulated problems (Blevins, 2021). But regardless of the age or background of the learner, gamification of teaching methods is based on time-tested learning principles and can thus benefit all learners. 

Dimensions of Gamification 

The gamification model used in this study hinges upon five dimensions chosen to reflect different aspects of learning: ecological, social, personal, performance, and fictional. Although these were suitable for examining the impact of a gamified curriculum on learning outcomes in this study, it should be noted that there are other classifications which could be used (Toda et al., 2019a; Toda et al., 2019b). The five-dimension model aligns with the objectives in the business curriculum and focuses on improving comprehension, engagement, and mastery of key concepts. It is a powerful tool, capable of ensuring a comprehensive approach and enhancing education experiences and learning outcomes for business students. 

Performance dimension. Gamified courses provide immediate rewards tied to performance, which can motivate students and keep them engaged, ultimately improving their learning outcomes and education experiences (Johnson, 2011; Rasheed et al., 2022). Student engagement is an important indicator of performance in higher education, and many metrics are directly affected by it (Bowden et al., 2021). Performance rewards provided to the students in the experimental group included experience points and badges. 

Ecological dimension. Ecology is an important component of learning, particularly gamified learning, because “chance, imposed choice, economy, rarity, and time pressure,” which can be effective components of game design, can also motivate students to thoughtfully complete tasks (Toda et al., 2019, p. 6). Students in the experimental group in this study earned rewards, including snacks and experience points, for completing assignments rooted in the ecological dimension. Students also randomly received negative utility rewards, incorporating an element of chance. Negative rewards included a gallon of tomato juice, a can of sardines, and other similar items (Santos et al., 2021). Student grades were not affected by negative utility rewards; these were intended to provide an element of fun. 

Social dimension. The social dimension, which includes cooperation, competition, social pressure, and reputation, was another important component incorporated through gamification (Toda et al., 2019). The students in the experimental section were divided into teams for a group assignment in which each team prepared a product launch pitch and presented it in front of the class. The assignment had a twist: each team had to propose a strategy that was financially viable but ethically problematic. The team with the most compelling pitch earned experience points. 

Fictional dimension. Introducing a fictional element in the classroom can help students develop their critical thinking and evaluation skills (Jarvis, 2020). Gamification of education is particularly effective when students’ learning objectives are reframed using hypothetical scenarios. In this study, groups in the experimental section were tasked with formulating a proposal for a new program, athletic sport, or intramural offering, outlining projected revenues and expenses over the next decade. Each team then presented their proposal and analysis to a business faculty member who played the role of the university president. Experience points were awarded to the team whose proposal displayed the most promising fiscal outlook and alignment with the university’s core values. 

Personal dimension. The personal dimension of gamification provides a balance to the social component, making the course content more engaging by tying student progress to individual efforts rather than allowing them to rely on team success (Opdecam & Everaert, 2018). A major concern with group work in the classroom is that each member’s effort and contribution can vary (Bishnoi, 2017), which often leads to conflict within the group (Burke, 2011). The personal dimension, on the other hand, provides incentives for personal accomplishments. For this dimension, the experimental group students completed interactive video quizzes, multiple-choice tests, and surveys, and they participated in graded collaborative video forums. 

Increased Content Mastery and Comprehension Through Gamification

Underpinning this study is the proposition that student perception of the learning process has a significant and positive correlation with their satisfaction levels and academic performance. This relationship explains the influence the learning process has on students’ overall academic success and highlights the importance of their perception of their education experience (Banahene et al., 2018, p. 6). Additionally, research suggests that student satisfaction leads to increased engagement, which in turn leads to a stronger understanding of concepts (Hijazi & Naqvi, 2006), even if the student does not perceive the highest level of satisfaction (Yüner et al., 2023). 

Gamification of teaching approaches represents a logical means of increasing student satisfaction. In a study conducted in Glasgow, students reported higher satisfaction scores when they were provided with prompt, comprehensive, and helpful feedback from their instructors. Specific analysis from the Glasgow study showed that student satisfaction scores increased significantly when instructor responses were prompt (Gartland et al., 2016). Therefore, in the experimental course in this study, students were given feedback almost instantly, either through the gradebook or the leaderboard. 

Methodology

This section describes the research design, data collection procedures, and methods of analysis used in this study to examine the effects of gamified classroom content on student learning outcomes and experiences. 

Research Questions and Hypotheses

In this study, the impact of gamification in education on student mastery of content was examined by testing two hypotheses. The research questions and related hypotheses appear in Table 1. Results from an externally administered assessment tool were used to test the hypotheses. 

Table 1 – Research Questions and Hypotheses

RQ1 – Does the implementation of gamified teaching methods significantly affect student mastery of course content shared by the business finance curriculum and the division’s professional core curriculum?
H1 – Third-party assessment results will show that the percentage of change for the business students in the experimental section will be significantly higher than for those in the control group in the “total” subject areas of emphasis in the business core curriculum.
RQ2 – Does the implementation of gamified teaching methods significantly affect student mastery of finance-specific course content?
H2 – The third-party assessment results will show that the percentage of change for the business students in the experimental section will be significantly higher than for those in the control group in the finance-specific subject area. 

Measuring Student Knowledge Acquisition

During the first week of class, students from an experimental and a control group were given a standardized assessment to gauge their comprehension of key business concepts. This inbound assessment established a baseline measuring their mastery of business finance, business communications, and economics concepts, each of which is an area of emphasis in both the business division’s common professional core and the business finance curriculum. Students transitioning from the scholastic environment to the workforce must have a firm understanding of these concepts (Lusardi, 2019). During the last week of class, the assessment was repeated, allowing a comparison of the inbound and outbound assessment scores. This comparison provides important insights about the knowledge gained in the business subject matter areas. 

The Peregrine Finance Assessment

The assessment is administered externally by a third-party company, Peregrine Global Services (PGS). Because of its ability to assess student knowledge in a range of subject matter areas, many accredited universities rely on PGS’s evaluation tools and other resources (Bright et al., 2019). To ensure the security and confidentiality of student results, the assessments used in this study were administered via a PGS microsite that was not integrated with the private university’s learning management system (LMS). Students did not pay any fees for this assessment; fees were paid by the university. The PGS assessment has been shown to be statistically valid and reliable (Oedekoven et al., 2019). It consists of multiple-choice questions covering four areas of emphasis; for each area, 10 questions are randomly selected from a test bank containing over 200 questions related to that area’s topics.

Finance testing facet. In this section of the assessment, students are given questions on topics related to the area of business finance, including financial statements, financial ratios, valuation of assets, inventory, calculation of interest, cash flows, taxes, and valuation of bonds (Peregrine Global Services, 2020). Sample assessment questions related to the finance facet are shown in Figure 1. 

Figure 1 – Sample Finance Questions

Business communications facet. Questions from this section are meant to assess student knowledge in subjects including cross-cultural, verbal, non-verbal, and written business communications. Students are also tested on their knowledge of the process of communication and organizational flow of information (Peregrine Global Services, 2020). Figure 2 shows sample questions from the business communications section. 

Figure 2 – Sample Business Communications Questions

Macroeconomics facet. Knowledge of concepts including unemployment, inflation, interest rate changes, protectionism, and gross domestic product (Goldman, 1991) is tested in this section (Peregrine Global Services, 2020). For example, a question about how scarcity impacts demand on a macro level would appear in this section of the assessment. Sample macroeconomics questions are shown in Figure 3. 

Figure 3 – Sample Macroeconomics Questions

Microeconomics facet. This section comprises questions focusing primarily on firm-level economic phenomena. Concepts assessed in this section include pricing, revenue, supply, demand, and trends with business-level implications (Peregrine Global Services, 2020). Sample microeconomics questions are given in Figure 4. 

Figure 4 – Sample Microeconomics Questions

Overall results. In addition to providing results for each of the four sections individually, the assessment includes an overall score, providing a combined measure (Peregrine Global Services, 2020). Because this study focused on students enrolled in a business finance course, only the business finance section of the assessment was analyzed individually, to test Hypothesis 2. Business communications and economics are not points of focus in the business finance course, but they are areas of emphasis shared with the division’s professional core; therefore, the assessment results from these areas were not analyzed individually but were included in the overall assessment total used to test Hypothesis 1. 

Research Design

This study used a quasi-experimental design and analyzed results from the PGS assessment tool to examine the impact of gamified teaching methods on content mastery for business students at a private university. To evaluate the impact of implementing gamification, the mean assessment scores of two distinct groups were compared: students enrolled in sections of a course employing gamified teaching methods (the experimental group), and students enrolled in sections of the course where gamified teaching methods were not introduced (the control group). 

The Principles of Finance class was selected for this study, as all undergraduate business students at this university are required to take this course as part of the division’s common professional core. To minimize selection bias, course listings in the student registration materials for both the experimental and the control sections contained no indication whether gamified methods would be used. 

Two interactive gradebooks were used for this study; each was synchronized to a central feedback tool for students. A third-party plugin integrated with the university’s Moodle-based LMS allowed tracking and display of experience points. Additionally, students were given weekly individual reports summarizing their grades to date and their previous performance. 

Participants

Participants were drawn from undergraduate students completing a business-related academic program who were enrolled in different sections of the Principles of Finance course at a private university located in Kansas. Students were selected from different sections of the same finance course to ensure accurate representation of the diverse student population. 

The experimental group comprised a total of 33 students, and the control group comprised 31. This sample size was sufficient to evaluate the effects of implementing gamified teaching techniques in the classroom as compared to traditional modes of instruction. 

Procedure

For this study, careful consideration was given to course design for the different sections. In the experimental section, gamified elements were integrated into the course. These included, among other things, the use of leaderboards, badges, and interactive quizzes. Traditional teaching methods were used in the control section; gamified methods were not introduced in this section. There were, of course, elements that were common to both the experimental and the control sections. Furthermore, both sections maintained the same curriculum and identical student learning objectives. 

Experimental Group Elements

The course design of the experimental section differed from that of the control section in two important ways: pacing and reward structure. In the gamified section, students were given more assignments, but each assignment was smaller. With more assignments, the stakes were lower, because a low score on any single assignment had less impact on the final grade. 

The gamified course section also had an enhanced reward structure. In addition to the points given on graded assignments, as is done per the traditional modality, students in the gamified section were awarded experience points for optional tasks. Engagement with ungraded content was encouraged as a means of increasing the students’ perceived value of the LMS. 

Experience points. Although grades are generally an effective motivational tool for students, rewards that are not directly tied to grades can also motivate students to learn course material (Zebing, 2019). Furthermore, grade inflation can occur if rewards are directly tied to grades. In developing the course, it was critical to create incentives that would motivate students without affecting their grades. Experience points were used to incentivize students in the experimental group to complete ungraded course tasks. Because it was hypothesized that students would be likely to ignore assignments that did not affect their final grade, students in this section could earn experience points only as a reward for ungraded tasks. 

In this study, students in the experimental section were awarded experience points for completing reading assignments, watching videos, and performing other tasks that are typically available but optional in courses employing a traditional modality. There were also levels associated with the number of points earned; the more points a student earned, the higher the level achieved. All students in this section began at the first level, “Bookkeeper Clerk,” but each could potentially reach Level 10, “Warren Buffett.” The levels and their associated titles and point requirements are shown in Figure 5. 

Figure 5 – Experimental Group Levels

Achieving higher levels not only conferred greater academic status; it also exempted students from certain assignments if they reached those levels by specific prescribed times. For example, students who reached Level 6 could opt out of a quiz that was to be given during the risk mitigation section of the course. Assignments eligible for student exemption were selected during course development. 

The university integrated Level Up XP, a Moodle plugin, into its LMS to manage the tracking and distribution of experience points. Level Up XP also allows customization of the levels, reporting, and maintaining a leaderboard for the class. The Department of Instructional Design at the university managed the technical aspects of the Level Up XP plugin. 
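The level logic that Level Up XP manages can be illustrated with a short sketch. The point thresholds and intermediate titles below are hypothetical placeholders (the actual values appear in Figure 5); only the first and last titles come from the course described above.

```python
# Hypothetical level table: (minimum points, title). Only "Bookkeeper Clerk"
# (Level 1) and "Warren Buffett" (Level 10) come from the paper; everything
# else, including the thresholds, is an illustrative assumption.
LEVELS = [
    (0, "Bookkeeper Clerk"),
    (100, "Junior Analyst"),
    (250, "Analyst"),
    (450, "Senior Analyst"),
    (700, "Associate"),
    (1000, "Portfolio Manager"),
    (1350, "Vice President"),
    (1750, "Managing Director"),
    (2200, "CFO"),
    (2700, "Warren Buffett"),
]

def level_for_points(points):
    """Return the (level number, title) a student has reached."""
    level, title = 1, LEVELS[0][1]
    for i, (threshold, name) in enumerate(LEVELS, start=1):
        if points >= threshold:
            level, title = i, name  # highest threshold met so far
    return level, title
```

A course designer could pair such a table with the exemption schedule, checking, for example, whether a student reached Level 6 before the risk mitigation unit began.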

Weekly debates. Course concepts were integrated into the student experience through group activities in the experimental section. In the weekly debates, groups had a limited amount of time at the end of each week to review and propose recommendations for issues aligned with an assigned topic. Care was taken during course development to select issues that would be complementary to the course content, timely and topical, and interesting to the students. 

As an example, groups in this section had to consider and propose a different currency to replace the U.S. dollar as the world’s reserve currency. Each group had to determine which currency had the best chance of maintaining a stable global economy in the long term. Recommendations from the groups included the euro, the Chinese yuan, and Bitcoin, and the group with the most persuasive presentation was awarded experience points.

HTML5 package (H5P) videos. Many online courses, including courses in higher education, incorporate videos as an integral part of the curriculum. “Video-based learning is defined as the learning through the use of videos, which is comprised of visual and audio cues to present information for learners. It is recognized as a powerful resource for online teaching and learning” (Yu et al., 2023, p. 2). Research has shown that students respond favorably to videos rendered with H5P formatting (Kosmaca et al., 2023). Videos were used in both the experimental and control sections in this study, and aside from the formatting, the videos were identical for both sections; however, only the experimental group’s videos included interactive H5P formatting. 

“HTML5 package (H5P) videos allow the creators to add messages, questions, links and other interactions” (Supper et al., 2021, p. 1315). H5P formatting was used to embed questions throughout the videos for the experimental section. The H5P video questions were multiple-choice, and student answers were sent directly to the LMS gradebook. 

Video quizzes. Students could have points added to their final grade calculation by watching a brief video aligned with the weekly lesson plan subject matter and taking a quiz based on the video. Every week, 5 to 10 videos were made available. These video quizzes were optional, but they were sequenced and interdependent: a student could not proceed to the next video without completing the quiz for the previous video, and without earning a perfect score on that quiz. 
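The sequential gating rule described above can be sketched as follows; the function name and score representation are illustrative assumptions, not the LMS’s actual implementation.

```python
# Sketch of the gating rule: a student may start video n+1 only after
# scoring 100% on the quiz for video n. quiz_scores[i] is the score
# (0.0-1.0) on video i's quiz, or None if not yet attempted.
def next_unlocked_video(quiz_scores, total_videos):
    """Return the 0-based index of the next video the student may watch,
    or total_videos if the whole weekly series is complete."""
    for i in range(total_videos):
        score = quiz_scores[i] if i < len(quiz_scores) else None
        if score is None or score < 1.0:  # unattempted or imperfect score
            return i                      # student is gated at this video
    return total_videos                   # entire weekly series completed
```

For example, a student who aced the first two quizzes but scored 80% on the third remains gated at the third video until earning a perfect score.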

Impossible questions. Upon completing the optional quizzes for every video in a week’s series, a student earned the chance to answer a question based on the learning objectives for the week’s topic. These were questions of extreme difficulty. Students earned extra course points for successfully answering these impossible questions; the greatest reward reported by students, however, was a sense of personal accomplishment. 

Control Group Elements

Compared to the experimental group, the control group utilized more traditional teaching modalities. The pedagogical elements described below were employed exclusively in the control group section. 

Discussion forums. To facilitate student-led learning, collaborative forums relating to the weekly course topics were used in the control section of the course. Each student was required to respond to a discussion topic posted by the instructor and to at least two responses posted by classmates. A rubric, which included facets such as spelling, grammar, content, outside sources, timeliness, and critical thinking, was used to grade the responses and was made available for student review. 

Unit tests. Whereas the experimental group was given video quizzes, students in the control section were given unit tests, which were larger but less frequent. These time-restricted tests covered approximately three weeks’ worth of course material and constituted a significant percentage of each student’s final grade. They were administered electronically using an online question pool. 

Homework assignments. Students in the control group were assigned homework, including content knowledge assessment tools such as written assignments, quizzes, and long-format assignments. These assignments demanded a significant amount of individual work from each student. 

Common Section Elements

Many elements were shared by the experimental and control sections, though their practical implementation may have differed depending on factors such as class size and specific student needs. The weight of these assessments in the calculation of grades was the same for both sections. 

Midterm and final tests. Both groups were given midterm and final exams, which were standardized to ensure consistent format and content for all students. Questions in these knowledge assessment instruments were primarily multiple-choice, a format deliberately chosen to mitigate the scoring errors associated with grading essays and other answer types (Uysal & Dogan, 2021). 

Reading assignments. Students in both sections were required to read materials including textbook chapters and other content related to the weekly course topics. The reading assignments were identical for the experimental and control groups. 

End of course evaluation (IDEA). At the conclusion of the course, students in both sections were asked to submit course evaluations. The university uses the IDEA Student Ratings of Instruction (SRI) system, in which students evaluate “their own work habits, motivation levels, and the extent of their background preparation. The aggregated course-specific averages derived from these self-ratings hold significant importance, serving as pivotal variables employed in the calibration of scores related to student evaluations of progress concerning pertinent learning objectives (PRO), instructor excellence, and overall course excellence” (Benton & Li, 2017, p. 1). 

In sum, although gamified teaching methods and more traditional teaching modalities each have distinct elements unique to their approach, they can still share identical course functions and elements. Figure 6 shows the common and unique elements used for the two groups in this study. 

Figure 6 – Modality Elements

Data Collection

During the first week of class, students enrolled in both the experimental and control sections of the Principles of Finance course were notified of the pending assessment and provided a hyperlink through the assignments module of the university’s LMS. Through this link, each student completed an authentication process linking them to the university and the class, then completed the inbound assessment online. Figure 7 shows the PGS assessment selection page. During the last week of class, this process was repeated, and the students completed the outbound assessment. 

Figure 7 – Peregrine Assessment Registration

Materials and Measures

Students were instructed to answer each question thoughtfully. All questions were multiple-choice, and students had 3 minutes to answer each one. The questions for each business subject matter area were drawn from a pool of topic-related questions. Students were given two 15-minute breaks and were allowed a total of three attempts at the assessment. The assessments were not graded; students were only required to submit confirmation of completion for the task to be counted as complete. 

Data Collection Timeline

The inbound assessment was posted during the first week of class, and students had until Sunday at 11:59 PM of that week to complete it. The outbound assessment was posted during the final week of class, and the students again had until Sunday at 11:59 PM of that week to complete it. Upon the course’s completion, results were retrieved from the secured PGS client administration site using the reports module. 

Data Gathering

Results were obtained from the report in a pairwise format. In this type of study, pairwise comparison of the results allows for the highest validity and reliability of statistically significant trends (Garofalo et al., 2022). Results were converted to a comma-separated values (CSV) file, from which they were manually transferred to a spreadsheet, maintaining the CSV format. Randomly selected spreadsheet entries were validated to ensure that student confidentiality was not compromised. 
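A minimal sketch of loading such pairwise results for analysis is shown below; the column names (`inbound`, `outbound`) are illustrative assumptions, not Peregrine’s actual export schema, and the sample rows are fabricated.

```python
import csv
import io

def load_pairwise_changes(csv_text):
    """Compute each student's outbound-minus-inbound score change.

    The column names here are illustrative assumptions, not the actual
    Peregrine export schema.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [float(row["outbound"]) - float(row["inbound"]) for row in reader]

# De-identified sample rows (fabricated for illustration):
sample = "student,inbound,outbound\nS1,40,55\nS2,50,50\nS3,35,60\n"
```

The per-student change scores produced this way are the quantities compared between groups in the statistical analysis.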

Statistical Analysis

The spreadsheet data was processed using JASP, a free, open-source statistical processing program selected for its ability to perform a range of basic statistical tests (Love et al., 2019). JASP is an intuitive tool that peer-reviewed researchers have touted as being effective, even for advanced statistical analysis (Kelter, 2020).

To test the hypotheses, the mean scores for the finance-specific assessment results and for the total overall assessment results were compared using independent t-tests. This provided a measure of content-mastery improvement for both the experimental and the control groups. Effect sizes (Cohen’s d) were then calculated to quantify the magnitude of any statistically significant differences. A significance level of α = 0.05 was employed for all statistical tests performed in this study. This level was chosen for its demonstrated ability to balance the risks of false positive (Type I error) and false negative (Type II error) results (Miller & Ulrich, 2019). 
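The comparison described above can be sketched with SciPy; the score arrays in the usage below are illustrative placeholders rather than the study’s data, and the one-sided alternative mirrors the directional hypothesis.

```python
import numpy as np
from scipy import stats

def compare_groups(experimental, control, alpha=0.05):
    """Independent-samples t-test with a pooled-SD Cohen's d."""
    exp = np.asarray(experimental, dtype=float)
    ctl = np.asarray(control, dtype=float)
    # One-sided alternative: the experimental mean exceeds the control mean.
    t_stat, p_value = stats.ttest_ind(exp, ctl, alternative="greater")
    n1, n2 = len(exp), len(ctl)
    # Pooled standard deviation across both groups (Bessel-corrected variances).
    pooled_sd = np.sqrt(((n1 - 1) * exp.var(ddof=1)
                         + (n2 - 1) * ctl.var(ddof=1)) / (n1 + n2 - 2))
    cohens_d = (exp.mean() - ctl.mean()) / pooled_sd
    return {"t": float(t_stat), "df": n1 + n2 - 2, "p": float(p_value),
            "d": float(cohens_d), "significant": p_value < alpha}
```

JASP performs the same computation through its graphical interface; the sketch simply makes the underlying test and effect-size formulas explicit.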

Results

This study demonstrates that gamification of teaching methods in a business finance class impacts student learning outcomes. Students who completed the gamified section of the course showed greater improvement in their understanding of finance-specific concepts. Furthermore, the analysis shows that these students also displayed greater improvement in the total assessment score, which evidences a higher improvement in their understanding of concepts across all assessed areas of emphasis, including business finance, business communications, and micro- and macroeconomics. 

Finance-specific Results

In an independent t-test, the mean score of the experimental group (n = 33) was 20.606 (SD = 22.212, SE = 3.867, CV = 1.078), while the mean score of the control group (n = 31) was 0.000 (SD = 22.949, SE = 4.122, CV = ∞). Results of the t-test (t(62) = 3.650, p < .001, d = 0.913) support the hypothesis that the experimental group would have a higher mean score than the control group. The finance-specific assessment results are captured in Table 2, and associated charts are presented in Figure 8. 
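As a check, the reported t and d values can be recovered from the summary statistics alone (means, standard deviations, and group sizes); the sketch below recomputes them for the finance-specific comparison.

```python
import math

def t_and_d_from_summary(m1, s1, n1, m2, s2, n2):
    """Pooled-variance t statistic and Cohen's d from group summaries."""
    df = n1 + n2 - 2
    # Pooled variance weights each group's variance by its degrees of freedom.
    pooled_var = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / df
    pooled_sd = math.sqrt(pooled_var)
    se_diff = pooled_sd * math.sqrt(1 / n1 + 1 / n2)
    t = (m1 - m2) / se_diff
    d = (m1 - m2) / pooled_sd
    return t, df, d

# Finance-specific comparison, using the summary statistics reported above:
t, df, d = t_and_d_from_summary(20.606, 22.212, 33, 0.000, 22.949, 31)
# t ≈ 3.650, df = 62, d ≈ 0.913
```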

Table 2 – Finance Category Results

                      t     df     p        Cohen’s d   SE Cohen’s d   95% CI Lower   95% CI Upper
Change – Finance   3.650    62   < .001       0.913        0.276           -∞             0.477


Note.  For all tests, the alternative hypothesis specifies that calculations for the control group would be less than for the experimental group.

Figure 8 – Finance Category Charts

Total Peregrine Assessment Results

In an independent t-test, the mean score of the experimental group (n = 33) was 11.920 (SD = 18.200, SE = 3.168, CV = 1.527), while the mean score of the control group (n = 31) was 3.655 (SD = 19.059, SE = 3.423, CV = 5.214). Results of the t-test (t(62) = 1.774, p = 0.040, d = 0.444) support the hypothesis that the experimental group would have a higher mean score than the control group. The total assessment results are shown in Table 3, and associated charts appear in Figure 9.

Table 3 – Total Peregrine Results

           t     df     p       Cohen’s d   SE Cohen’s d   95% CI Lower   95% CI Upper
Total   1.774    62   0.040       0.444        0.256           -∞             0.025

Note.  For all tests, the alternative hypothesis states that calculations for the control group would be less than those for the experimental group.

Figure 9 – Peregrine Total Charts

Overall Findings

The analyzed data clearly shows a larger increase in content mastery for those whose education included dimensions of gamification. Cohen’s d was calculated at 0.913 and 0.444 (see Table 4) for the tested hypotheses, indicating that the effect on students’ mastery of finance-specific content was large, and the effect on their mastery of all four areas of business emphasis was moderate. This shows that the differences in the means for the experimental and control groups cannot be categorized as small; indeed, the large effect size found for the finance-specific comparison highlights the nuance underlying the reported results (Brydges, 2019). Table 5 summarizes the descriptive statistics for both tested hypotheses.

Table 4 – Cohen’s d Calculations

            Cohen’s d
Finance       0.913
Total         0.444

Table 5 – Cumulative Descriptives

          Group          N     Mean     SD       SE      Coefficient of Variation
Finance   Control        31    0.000   22.949   4.122    ∞
          Experimental   33   20.606   22.212   3.867    1.078
Total     Control        31    3.655   19.059   3.423    5.214
          Experimental   33   11.920   18.200   3.168    1.527

Discussion

Student performance was assessed both for business finance alone and for business finance combined with business communication and micro- and macroeconomics. As was clearly shown, students in the business program who were enrolled in the Principles of Finance course demonstrated higher levels of improvement in their mastery of course content when the material was delivered with the five dimensions of gamification incorporated. Furthermore, achieving higher levels by accruing experience points was observed to be a significant motivational tool for students in the experimental class. In-class observations demonstrated this: although the interactive videos and the group product launch pitch outcome had a minimal impact on final grades, the vast majority of the experimental section students completed the video assignments, and there was fierce competition to present the best pitch.

Reluctance to implement these methods may result from many real or perceived hindrances. Gamifying a traditional university course does not happen at the flip of a switch. It requires the appropriate technology and technological support, thoughtful curriculum design, and modified hiring and employment practices, each of which can affect the faculty culture and carry a heavy financial burden. Indeed, implementation may not be feasible for some institutions due to financial or other constraints. 

Investment 1: Instructional Technology

Successful gamification requires sufficient instructional technology, including an LMS capable of integrating outside software such as Level Up XP. The LMS should be able to integrate performance assessment and tracking for gamified assignments. Many LMSs are able to assess results of traditional assignments, such as exams, discussion forums, and papers; however, because gamification might incorporate non-standard learning tools such as interactive videos, assessing results and tracking data is more challenging. If used, H5P videos should send results directly to the gradebook, giving instructors control over navigation and answer attempts. Of course, gradebooks must have the capability to accept the results, and they must allow flexibility in grading, such as awarding extra credit.
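One way such results could reach a gradebook is via the xAPI statements that H5P content emits; the statement structure below follows the xAPI specification’s Result and Score objects, while the point-conversion function and its extra-credit rule are purely hypothetical illustrations, not any particular LMS’s API.

```python
import json

def xapi_to_points(statement_json, max_points=10.0, bonus=2.0):
    """Convert an xAPI result's scaled score into gradebook points.

    `max_points` and `bonus` are hypothetical parameters for illustration;
    they are not part of any real LMS gradebook API.
    """
    statement = json.loads(statement_json)
    scaled = statement["result"]["score"]["scaled"]  # xAPI: scaled is in [-1, 1]
    points = max(0.0, scaled) * max_points
    if scaled >= 1.0:  # hypothetical extra-credit rule for a perfect score
        points += bonus
    return round(points, 2)

# Minimal xAPI-style statement for an interactive-video question:
sample = json.dumps({
    "verb": {"id": "http://adlnet.gov/expapi/verbs/answered"},
    "result": {"score": {"raw": 8, "max": 10, "scaled": 0.8}},
})
```

A gradebook able to accept such computed values, including the bonus beyond `max_points`, is exactly the flexibility in grading described above.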

Additional necessary technical investments for gamification include equipment and software for video production, sound, lighting, and editing. These should be made easily accessible at faculty workstations. 

Investment 2: Curriculum Design

Even seasoned faculty members have resisted simple additional requirements such as uploading course content to an LMS. Full integration of a gamified teaching modality can be very challenging, but it also has benefits; therefore, universities should consider developing a more robust model of curriculum design. Faculty should receive support to validate essential elements of the course. After validation, and after sufficient training, the instructor can shape the gamification of the course based on their stated needs. 

It might be the case that some courses are just not conducive to gamification. However, this should be tested, not merely accepted. The optimal level of gamification might not be the same for every course. Based on the analysis in this study, it is reasonable to conclude that most courses would benefit from some amount of gamification.

Investment 3: Hiring and Employment Practices

The main considerations in the employment selection of instructors are their educational and professional experiences. For institutions of higher education, the hiring decision should also be based on whether the candidate can demonstrate sufficient proficiency in instructional technology. 

Universities considering implementing gamified teaching modalities should ensure that faculty members recognize and appreciate its value. Educators should not be compelled to use teaching modalities they do not find suitable for the programs they lead. Forcing the implementation of gamified teaching methods may not result in increased student satisfaction if the instructor is resistant to the implementation; in fact, the opposite results would be expected. It could also lead to higher employee turnover, and ultimately the loss of talented, valuable, and passionate instructors (Zalat et al., 2021). Allowing faculty members to teach in accord with their preferred style can make them more effective as educators, which can in turn increase student learning outcomes and student satisfaction. 

Limitations of the Study

Some limitations are recognized regarding the design and procedures in this study. For instance, the learning outcomes achieved in the finance class may have been influenced by the class’s unique context or the instructor’s teaching style. The validity of the study may have been impacted by the subjective nature of the course evaluations. Moreover, the quasi-experimental design may not have eliminated all possible confounding variables, despite efforts to select similar courses and student populations. 

It should be noted that the study was conducted at a faith-based private university located in Kansas. Although the student population is diverse, and the students share many common traits with those at other universities, there may be consequential demographic differences for students in public, for-profit, or secular institutions, or for those in different geographic areas. Notably, the university in this study typically enrolls a large number of student-athletes, who may have a greater appreciation for a competitive learning experience, as is offered with gamification.

Suggestions for Further Research

Additional research would provide a fuller understanding of the benefits, challenges, and nuances of adopting gamified teaching approaches in institutions of higher education. 

LMS Effectiveness

Various LMS platforms are available, but Moodle was the only LMS used in this study. To examine student metrics such as satisfaction in more depth, researchers might assess how these metrics vary when other LMS platforms are used to deliver course content. To ensure the highest degree of validity, researchers would be advised to use equivalent courses across all LMS platforms; after development, the courses should be converted to a common cartridge format and then imported into each platform.

Student Analytics

Various types of student data are used to evaluate the effectiveness of teaching and learning support services in universities. The frequency with which students access their course materials, whether to utilize learning resources, complete assignments, check their grades, or for any other reason, is an important metric in this regard; several studies have shown it to correlate positively with academic success (Van Wart et al., 2020). Investigation into the interaction between gamification and student analytics would provide valuable insights for universities aspiring to improve their learning and teaching support services. 
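The analysis suggested here reduces to correlating access frequency with achievement; the sketch below uses fabricated placeholder numbers purely to show the computation, not data from any study.

```python
from scipy import stats

# Fabricated placeholder data: weekly LMS logins and final course grades.
logins = [3, 5, 8, 2, 10, 7, 4, 9]
grades = [68, 74, 85, 60, 91, 80, 70, 88]

# Pearson correlation between access frequency and achievement.
r, p = stats.pearsonr(logins, grades)
# A positive r with a small p would echo the relationship reported in the literature.
```

Extending this with a gamified/non-gamified grouping variable would directly address the interaction question raised above.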

Conclusion

Gamification of teaching methods has garnered a lot of attention due to its potential to enhance the student experience in higher education (Khaldi et al., 2023). Several factors are important for the successful incorporation of gamification into coursework. First, instructional technology with sufficient capabilities and support must be in place; this includes providing educators with the appropriate hardware and software resources. Second, careful attention should be given to curriculum design; gamification is not a one-size-fits-all solution to improving course content mastery. Finally, experience with gamified theory and techniques should be prioritized in the faculty hiring process; it is important to create a culture where proficiency in instructional technology is valued and faculty members are encouraged to implement gamification to improve learning outcomes. 

Of course, gamified teaching methods are also used to increase student engagement and motivation, and the degree of diversity in the student population is an important factor in this regard. Cultural nuances that influence learning and teaching styles should be thoughtfully considered when incorporating gamified methods. A collaborative, inclusive approach encourages participation by students from different backgrounds, and a gamified course design that includes addressing cultural issues or discussing cross-cultural issues is likely to resonate with a diverse student population. 

Successful implementation of gamification in the classroom benefits students in many ways. Higher engagement in class leads to better comprehension and knowledge retention. It also leads to higher motivation and student satisfaction, and higher motivation and satisfaction have a reciprocal effect on engagement. All of these factors create an environment more conducive to persistence and graduation: in short, better learning outcomes and an enhanced student experience all around, which may ultimately put an end to those student complaints about their courses.

References

  • Al Hassani, A. A., & Wilkins, S. (2022). Student retention in higher education: The influences of organizational identification and institution reputation on student satisfaction and behaviors. International Journal of Educational Management, 36(6), 1046–1064. doi:10.1108/IJEM-03-2022-0123
  • Allan, B. M., & Fryer, Jr., R. G. (2011). The power and pitfalls of educational incentives. The Hamilton Project, 7, 1-36.
  • Annamdevula, S., & Bellamkonda, R. S. (2016). Effect of student perceived service quality on student satisfaction, loyalty and motivation in Indian universities: Development of HiEduQual. Journal of Modelling in Management, 11(2), 488-517. doi:10.1108/JM2-01-2014-0010
  • Banahene, S., Kraa, J. J., & Kasu, P. A. (2018). Impact of HEdPERF on students’ satisfaction and academic performance in Ghanaian universities; mediating role of attitude towards learning. Open Journal of Social Sciences, 6(5), 96-119. doi:10.4236/jss.2018.65009
  • Benton, S., & Li, D. (2017). Validity of the IDEA student ratings of instruction student characteristic items. The IDEA Center, 1-3.
  • Bishnoi, N. (2017). Collaborative learning: A learning tool advantages and disadvantages. Indian Journal of Health & Wellbeing, 8(8), 789–791.
  • Blevins, S. (2021). Learning Styles: The Impact on Education. MEDSURG Nursing, 30(4), 285–286.
  • Bowden, J. L., Tickle, L., & Naumann, K. (2021). The four pillars of tertiary student engagement and success: A holistic measurement approach. Studies in Higher Education, 46(6), 1207–1224. doi:10.1080/03075079.2019.1672647
  • Bright, C. F., Bateh, J., & Babb, D. (2019). The relationship between simulation strategies and exit exam scores: A correlational assessment of Glo-Bus And Peregrine. American Journal of Business Education, 12(4), 53-60.
  • Brydges, C. (2019). Effect size guidelines, sample size calculations, and statistical power in gerontology. Innovation in Aging, 3(4), 1-8. doi:10.1093/geroni/igz036
  • Burke, A. (2011). Group work: How to use groups effectively. Journal of Effective Teaching, 11(2), 87-95.
  • Dichev, C., & Dicheva, D. (2017). Gamifying education: What is known, what is believed and what remains uncertain: A critical review. International Journal of Educational Technology in Higher Education, 14(9), 1-36. doi:10.1186/s41239-017-0042-5
  • Ensmann, S., & Whiteside, A. L. (2022). “It helped to know I wasn’t alone”: Exploring student satisfaction in an online community with a gamified, social media-like instructional approach. Online Learning Journal, 26(3), 22-45. doi:10.24059/olj.v26i3.3340
  • Garofalo, S., Giovagnoli, S., Orsoni, M., Starita, F., & Benassi, M. (2022). Interaction effect: Are you doing the right thing? PLOS ONE, 17(7), 1-19. doi:10.1371/journal.pone.0271668
  • Gartland, K. M., Shapiro, A., McAleavy, L., McDermott, J., Nimmo, A., & Armstrong, M. (2016). Feedback for future learning: Delivering enhancements and evidencing impacts on the student learning experience. New Directions in the Teaching of Physical Sciences, 11(1), 1-6.
  • Goldman, D. P. (1991). Growth economics vs. macroeconomics. Public Interest, 105, 78–92.
  • Hijazi, S. T., & Naqvi, S. (2006). Factors affecting students’ performance: A case of private colleges. Bangladesh e-Journal of Sociology, 3, 1-10.
  • Jarvis, C. (2020). Fiction as feminist pedagogy: An examination of curriculum and teaching strategies embodied in the novel. Studies in Continuing Education, 42(1), 118–132. doi:10.1080/0158037X.2019.1572601
  • Johnson, R. G. (2011). What’s new in pedagogy research. American Music Teacher, 60(4), 56–57.
  • Kelter, R. (2020). Bayesian alternatives to null hypothesis significance testing in biomedical research: A non-technical introduction to Bayesian inference with JASP. BMC Medical Research Methodology, 20(142), 2-12. doi:10.1186/s12874-020-00980-6
  • Khaldi, A., Bouzidi, R., & Nader, F. (2023). Gamification of e-learning in higher education: A systematic literature review. Smart Learning Environments, 10(10), 1-31. doi:10.1186/s40561-023-00227-z
  • Kosmaca, J., Cinite, I., & Barinovs, G. (2023). Exploring interactive H5P video as an alternative to traditional lecturing at the physics practicum. International Baltic Symposium on Science and Technology Education (pp. 1-11). Šiauliai, Lithuania: BalticSTE2023.
  • Love, J., Selker, R., Marsman, M., Jamil, T., Dropman, D., Verhagen, J., . . . Wagenmakers, E. J. (2019). JASP: Graphical statistical software for common statistical designs. Journal of Statistical Software, 88(2), 1-17. doi:10.18637/jss.v088.i02
  • Lusardi, A. (2019). Financial literacy and the need for financial education: Evidence and implications. Swiss J Economics Statistics, 155(1), 1-8. doi:10.1186/s41937-019-0027-
  • Miller, J., & Ulrich, R. (2019). The quest for an optimal alpha. PLoS ONE, 14(1), 1-13. doi:10.1371/journal.pone.0208631
  • Nieto-Escamez, F. A., & Roldán-Tapia, M. D. (2021). Gamification as online teaching strategy during COVID-19: A mini-review. Frontiers in Psychology, 12(648552), 1-9. doi:10.3389/fpsyg.2021.648552
  • Oedekoven, O. O., Napolitano, M., Lemmon, J., & Zaiontz, C. (2019). Determining test bank reliability. Transnational Journal of Business, 4(Summer), 63-74.
  • Opdecam, E., & Everaert, P. (2018). Disagreements about cooperative learning. Accounting Education, 27(3), 223–233. doi:10.1080/09639284.2018.1477056
  • Peregrine Global Services. (2020). Exam summary: Business administration undergraduate level. Gillette, WY: Peregrine Global Services.
  • Rasheed, H. M., He, Y., Khalid, J., Khizar, H. M., & Sharif, S. (2022). The relationship between e‐learning and academic performance of students. Journal of Public Affairs, 22(3), 1-7. doi:10.1002/pa.2492
  • Santos, A. C., Oliveira, W., Hamari, J., Rodrigues, L., Toda, A., Palomino, P., & Isotani, S. (2021). The relationship between user types and gamification designs. User Modeling and User-Adapted Interaction, 31, 907-940. doi:10.1007/s11257-021-09300-z
  • Shortt, M., Tilak, S., Kuznetcova, I., Martens, B., & Akinkuolie, B. (2023). Gamification in mobile-assisted language learning: A systematic review of Duolingo literature from public release of 2012 to early 2020. Computer Assisted Language Learning, 36(3), 517-554. doi:10.1080/09588221.2021.1933540
  • Supper, P., Praschinger, A., & Radtke, C. (2021). Three‐staged key‐feature cases promote interaction in remote education. Wiley-Blackwell, 55(11), 1315-1317. doi:10.1111/medu.14609
  • Toda, A. M., Klock, A. C., Oliveira, W., Palomino, P., Rodrigues, L., Shi, L., . . . Cristea, A. (2019a). Analysing gamification elements in educational environments using an existing gamification taxonomy. Smart Learning Environments, 6(16), 1-14. doi:10.1186/s40561-019-0106-1
  • Toda, A., Palomino, P., Oliveira, W., Rodrigues, L., Klock, A., Gasparini, I., . . . Isotani, S. (2019b). How to gamify learning systems? An experience report using the design sprint method and a taxonomy for gamification elements in education. Educational Technology & Society, 22(3), 46-60.
  • Uysal, I., & Dogan, N. (2021). Automated essay scoring effect on test equating errors in mixed-format test. International Journal of Assessment Tools in Education, 8(2), 222–238. doi:10.21449/ijate.815961 
  • Van Wart, M., Ni, A., Medina, P., Canelon, J., Kordrostami, M., Zhang, J., & Liu, Y. (2020). Integrating students’ perspectives about online learning: A hierarchy of factors. International Journal of Educational Technology in Higher Education, 17(53), 1-22.
  • Wong, W. H., & Chapman, E. (2023). Student satisfaction and interaction in higher education. Higher Education, 85(5), 957–978. doi:10.1007/s10734-022-00874-0
  • Yu, L., Shek, D. T., Wong, T., Li, X., & Yu, L. (2023). Use of instructional videos in leadership education in higher education under COVID-19: A qualitative study. PLoS ONE, 18(9), 1-28. doi:10.1371/journal.pone.0291861
  • Yuen, A. H., Cheng, M., & Chan, F. H. (2019). Student satisfaction with learning management systems: A growth model of belief and use. British Journal of Educational Technology, 50(5), 2520–2535. doi:10.1111/bjet.12830
  • Yüner, B., Eriçok, B., & Ertaş, B. (2023). Examination of variables affecting the perceptions of academic performance of higher education students during the distance education process. Journal of Learning and Teaching in Digital Age, 8(1), 161-168. doi:10.53850/joltida.1097130
  • Zalat, M. M., Hamed, M. S., & Bolbol, S. A. (2021). The experiences, challenges, and acceptance of e-learning as a tool for teaching during the COVID-19 pandemic among university medical staff. PLOS ONE, 16(3), 1-12. doi:10.1371/journal.pone.0248758
  • Zebing, W. (2019). Academic motivation, engagement, and achievement among college students. College Student Journal, 53(1), 99–112.