Bloom’s Taxonomy[1] is the foundation of many discussions in education research, and it is important to know its six levels:
Knowledge: This is the foundational information a learner possesses, e.g. multiplication tables.
Comprehension: This next level (building upon Knowledge) refers to a learner’s ability to grasp the meaning of information. This is often demonstrated by students being able to compare and contrast.
Application: This is where learners can take their comprehension of a topic and apply it to new problems.
Analysis: This is described as “the breakdown of the material into its constituent parts and detection of the relationships of the parts and of the way they are organized”[1]. A learner is performing analysis when they take a passage of text and can break it down and pick out salient points.
Synthesis: At this level a learner can take pieces of information and create something new. As an example, Adams, N.E., cites “the formulation of a management plan for a specific patient”[4].
Evaluation: At this level a learner reaches the top of the taxonomy pyramid and can make judgments of value. For example, after being presented with all the relevant material, a learner would be able to reason their way to a sound medical decision.
Bloom’s taxonomy was a great start, but Anderson, L.W., et al., published a revision in 2001[2]. This revision created a two-dimensional matrix[3], depicted in Figure 1. The first dimension (the Knowledge Dimension) contains four levels of knowledge: factual, conceptual, procedural, and metacognitive. The second dimension (the Cognitive Process Dimension) contains the six levels of cognitive processing: remember, understand, apply, analyze, evaluate, and create. In this revision the fundamental meaning of the six cognitive levels did not change, but the order of the last two levels was swapped. By having the cognitive levels operate along a second dimension, Anderson, L.W., et al., provided a breakdown of the four types of knowledge a learner operates with.
No matter the taxonomy chosen, the “use of the taxonomy encourages instructors to think of learning objectives in behavioral terms to consider what the learner can do as a result of the instruction.”[4]
Figure 1. Revised Bloom’s Taxonomy [2, 3]
Webb, N.L. took a different approach and built a framework specifically for “interpreting and assigning depth-of-knowledge levels to both objectives within standards and assessment items”[5]. He defined four levels:
Recall and Reproduction: At this depth of knowledge a learner simply knows the answer. They are not expected to solve for something.
Skills and Concepts: This depth of knowledge is more complex than Recall and Reproduction. A learner could be expected to solve a problem at this depth. Getting to a final answer may require more than one step. For example, a learner may be expected to collect and chart a data set.
Strategic Thinking: At this depth of knowledge, a student is expected to be able to provide a nuanced/thoughtful explanation of an answer.
Extended Thinking: At this depth of knowledge a student can be expected to “synthesize information from multiple sources.”[5] The tasks performed at this level are complex in nature and tend to require significantly more time to complete.
Using Webb’s categories, instructors can increase rigor in their classrooms by working students through the various depths of knowledge[6].
Discipline-Based Education Research (DBER) “aims to understand teaching and learning in a discipline while considering the priorities, viewpoints, knowledge, and practices of that discipline.”[7] This research can be critical to optimally applying techniques in discipline-specific ways. Unfortunately, DBER is not universally respected, and faculty research in DBER can be viewed negatively by colleagues in a field as not being “real” research.
Defining engagement as “the student’s psychological investment in and effort directed toward learning, understanding, or mastering the knowledge, skills, or crafts that academic work is intended to promote”[8], Martin, F. and D.U. Bolliger find that engagement in online learning environments is important to the success of instruction. They show that the engagement paths of learner-to-instructor, learner-to-learner, and learner-to-content are all important, and that learner-to-instructor engagement is especially so.
This brings the focus to my specific area of research: using gamification in mixed in-class/online instruction of introductory engineering courses.
Gamification is “the use of game design elements in non-game contexts.”[9] While my research is specifically about implementing gamification in online/digital engineering education, the concept can also be applied in the physical world[10]. It has been shown that gamification in educational environments has a positive impact[11], but it is possible that the positive effects gamification has on performance are temporary and wear off as the novelty subsides[12].
As an alternative to gamification, Begg, M., D. Dewhurst, and H. Macleod proposed Game Informed Learning[13]. This approach uses the immersive nature of games to “inform” how lessons built on “traditional” teaching techniques are conducted, e.g., running fully immersive simulations rather than attempting to externally motivate learners with gaming gimmicks like points or badges.
Current published research does not address whether a mixed in-class/online classroom environment can benefit student engagement and performance. By implementing our deliberate practice engine with game features like points and a leaderboard, we have produced a strong gamified quiz/practice engine for learning introductory SQL concepts. Furthermore, the engine provides immediate custom feedback to help learners find errors. Initial runs of the practice engine are executed in a flipped-classroom environment, providing students with the strongly desired learner-to-instructor engagement. Our preliminary studies show positive results, and we are in the process of publishing these results.
Our next step is to take the deliberate practice engine (which we have shown to have a positive impact on performance and engagement) and research the effect of the engine on student task efficiency and its effect on self-efficacy, engagement, and academic emotion. Our hypothesis is that the use of the engine as part of classroom instruction will improve all four metrics.
Additional follow-on research could include topics such as engine behavior refinement and how portable such activities are to other academic institutions.
1. Bloom, B.S., Taxonomy of educational objectives: The classification of educational goals. 1956.
2. Anderson, L.W., et al., A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. 2001.
3. A Model of Learning Objectives based on A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. [cited 2024-03-04]; Available from: https://www.celt.iastate.edu/wp-content/uploads/2015/09/RevisedBloomsHandout-1.pdf.
4. Adams, N.E., Bloom's taxonomy of cognitive learning objectives. J Med Libr Assoc, 2015. 103(3): p. 152-3.
5. Webb, N.L., Depth-of-knowledge levels for four content areas. Language Arts, 2002. 28(March): p. 1-9.
6. Aungst, G. Using Webb’s depth of knowledge to increase rigor. 2014; Available from: https://www.edutopia.org/blog/webbs-depth-knowledge-increase-rigor-gerald-aungst.
7. Paul, R. and R. Brennan, Discipline-based education research (DBER) – what is it, and why should engineering education research scholars be talking about it more? Proceedings of the Canadian Engineering Education Association (CEEA), 2019.
8. Newmann, F.M., Student engagement and achievement in American secondary schools. 1992: ERIC.
9. Deterding, S., et al. From game design elements to gamefulness: defining "gamification". in Proceedings of the 15th international academic MindTrek conference: Envisioning future media environments. 2011.
10. Huotari, K. and J. Hamari. "Gamification" from the perspective of service marketing. in Proc. CHI 2011 Workshop Gamification. 2011.