Learning with Labster: The Lab Experience is Simulated but the Research-Informed Practice is Real

April Ondis and Dr. Rachel Schechter
Product Information

How did the professional pilots, surgeons, and race car drivers of today become experts?  Practice.

How did these experts-in-training practice safely, without hurting themselves or anyone else in the process? Simulations. Incorporating simulated learning experiences into education sounds simple, but is it? No! Simulated learning platforms must be imagined and built, tested and improved, and, importantly to Labster, research-driven.

Call out box: Jet pilots

Labster co-founders Michael Bodekaer Jensen and Mads Bonde identified the potential of high-quality simulations for learning science and then leveraged research to find a solution in virtual labs. Inspired by aviation training’s use of flight simulations, Jensen and Bonde used research-informed practices to design a platform that allows students to practice experiments from home. The co-founders wanted to support students in developing high-level laboratory skills without the typical constraints of in-person lab study, such as scheduling, staffing, supplies, and even safety.

Labster’s inspiration from flight simulators (1) was only the beginning of using research to inform development and product improvements. Recognizing the importance of research-informed learning practices, Labster was built on a rigorous, evidence-based foundation to drive student learning outcomes.

Call out box: Labster was built on a foundation of evidence-based research

Student Engagement and Knowledge Building Informed by Research 

Labster provides a flexible approach to learning that combines the best aspects of explicit instruction, inquiry-based learning, and problem-based learning. Virtual lab simulations help students connect theory with practice and visualize processes, practical laboratory procedures, and instrument techniques. (2)

Inquiry-Based Learning

The Labster team’s goal is to build simulations that confer the benefits of traditional explicit instruction in a setting that empowers students with choice and maintains attention by matching the instructional method to the learner's comfort level. Inquiry-based learning is a student-centered method in which students discover new things by exploring their environment and developing well-justified arguments about the world around them. Research shows that inquiry-based learning is especially effective in science learning (3), since the work is simultaneously creative and practical.

Call out box: Labster's inquiry-based learning approach

Problem-Based Learning

Over the years, Labster's approach to creating curriculum shifted from building simulations for specific school curricula to designing its own learning objectives that employ problem-based learning (PBL). While interconnected with inquiry-based learning, PBL focuses on giving students complex problems to solve, whereas IBL is more commonly associated with giving students their choice of research opportunities. PBL has also been shown to be especially effective for teaching science. (4)

The Labster team integrated the strategy of having students create solutions within realistic constraints. PBL comes to life in Labster’s Behavioral Thermoregulation simulation, where students create their own planet and try to sustain life on it. The key to this learning strategy is that there are multiple solutions rather than a single right or wrong answer. Student engagement and knowledge building increase when students explore real-world problems, such as manipulating water and fertilizer cycles within the nitrogen cycle to maintain crop yields that feed the population without overheating the planet.

Learning Objectives from Labster Behavioral Thermoregulation Graphic

Immediate Feedback 

Labster’s virtual lab environment provides immediate feedback to improve the impact on student achievement (5) and motivation for learning (6). Students receive feedback the moment they complete a step correctly or incorrectly, increasing their rate of learning. Incorporating user feedback, the design team enhanced the way success and failure are communicated within simulation experiences, creating new design guidelines for all simulations that better align with research-informed models of student feedback and evaluation. (7) These feedback loops make active learning productive and advance learning outcomes.

Multimodal Instruction

Labster operationalizes multimodal instruction to meet the diverse needs of learners through multiple points of exposure and forms of practice. Varied activities around the same topic within the simulation experience, including readings, experimental procedures, self-testing, and model making, maximize learning outcomes. Research demonstrates that multimodal instruction allows for increased practice and multiple entry points to the content, which is particularly useful for diverse learners. (8)

Learning Objectives from Labster's Gram Stain Simulation Graphic

Observation & Shadowing 

The Labster team created and uses a research-informed mechanic called Dr. One, a virtual lab assistant, to show students how to complete a task before trying it themselves (similar to a worked example) and to remind them about lab safety. Research suggests that learning with virtual helpers, or “pedagogical agents,” has a positive effect on learning outcomes (9), perhaps by reducing the cognitive load on the learner.

This is particularly true when the pedagogical agent allows for students to experience worked examples in combination with their own problem-solving (10).

Description of Dr. One lab assistant Graphic

Individualization

Labster’s virtual labs and science simulations automatically individualize instruction to a student’s learning needs. Labster’s responsive model and self-paced approach tap into the improved learning outcomes reflected in the research (11) on individualized learning approaches. The integration of Universal Design for Learning throughout the simulations also increases equity of access to Labster’s programming, with specific improvements to hardware and software support for visually and hearing-impaired users.

The Effect Sizes of Labster’s Research-Based Principles

Meta-analyses of the strategies cited above show effect sizes that outperform most educational interventions (an effect size below 0.40 is considered low). Of the pedagogical strategies utilized by Labster, individualization has shown the highest impact within research, followed by problem-based and inquiry-based learning. Simulation and platform learning with immediate feedback also outperform most educational strategies in terms of the magnitude of their effect on learning outcomes.

Chart showing the effect sizes of Labster’s Research-Based Principles in 2022

Labster Research Highlights

Labster is built on a foundation of research, but it is not solely research-based; it is also an evidence-based product that has been rigorously tested and improved over time. The program has been the subject of more than 21 peer-reviewed studies, including nine randomized control trials, three quasi-experimental studies, three case studies, and four survey studies. All studies showed a positive benefit, with the largest benefits in knowledge building, enjoyment, and lab applications. Labster has also been shown to be especially helpful for struggling learners. Take a look at this chart showing some of the impacts found in Labster’s research, ranked by effect size.

Chart showing the impacts found in Labster’s research ranked by effect size


Evidence-Driven, Interactive Learning Platform


Over two thousand learning institutions currently use Labster’s innovative and evidence-driven platform. The Labster team has demonstrated a clear commitment to the science of learning, both by continually incorporating science into product design and through ongoing scientific experimentation on its own product. Scientific research is woven into the culture at Labster as the team designs simulations that engage students, build confidence, and transfer new skills to the real world.


Article References: 

(1) Meyer, G. F., Wong, L. T., Timson, E., Perfect, P., & White, M. D. (2012). Objective Fidelity Evaluation in Multisensory Virtual Environments: Auditory Cue Fidelity in Flight Simulation. PLoS ONE, 7(9). https://doi.org/10.1371/journal.pone.0044381

(2) De Vries, L. E., & May, M. (2019). Virtual laboratory simulation in the education of laboratory technicians–motivation and study intensity. Biochemistry and Molecular Biology Education, 47(3), 257-262.

(3) Aktamis, H., Hiğde, E., & Özden, B. (2016). Effects of the inquiry-based learning method on students' achievement, science process skills and attitudes towards science: A meta-analysis. Journal of Turkish Science Education, 13(4), 248–261. https://doi.org/10.12973/tused.10183a

(4) Ayaz, M., & Söylemez, M. (2015). The Effect of the Project-Based Learning Approach on the Academic Achievements of the Students in Science Classes in Turkey: A Meta-Analysis Study. Eğitim ve Bilim, 40, 255–283.

(5) Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.

(6) Badyal, D. K., Bala, S., Singh, T., & Gulrez, G. (2019). Impact of immediate feedback on the learning of medical students in pharmacology. Journal of advances in medical education & professionalism, 7(1), 1–6. https://doi.org/10.30476/JAMP.2019.41036

(7) Schinske, J., & Tanner, K. (2014). Teaching More by Grading Less (or Differently). CBE life sciences education, 13(2), 159–166. https://doi.org/10.1187/cbe.cbe-14-03-0054

(8) Ryoo, J., & Winkelmann, K. (2021). Innovative learning environments in STEM higher education: Opportunities, challenges, and looking forward (p. 137). Springer Nature.

(9) Schroeder, N. L., & Craig, S. D. (2021). Learning with virtual humans: Introduction to the special issue. Journal of Research on Technology in Education, 53(1), 1-7.

(10) McLaren, B. M., & Isotani, S. (2011, June). When is it best to learn with all worked examples?. In International conference on artificial intelligence in education (pp. 222-229). Springer, Berlin, Heidelberg.

(11) Steenbergen-Hu, S., Makel, M. C., & Olszewski-Kubilius, P. (2016). What One Hundred Years of Research Says About the Effects of Ability Grouping and Acceleration on K–12 Students’ Academic Achievement: Findings of Two Second-Order Meta-Analyses. Review of Educational Research, 86(4), 849–899. https://doi.org/10.3102/0034654316675417


Additional Meta-Analyses References:

Baceviciute, S., Cordoba, A. L., Wismer, P., Jensen, T. V., Klausen, M., & Makransky, G. (2022). Investigating the value of immersive virtual reality tools for organizational training: An applied international study in the biotech industry. Journal of Computer Assisted Learning, 38(2), 470–487. https://doi.org/10.1111/jcal.12630

Baceviciute, S., Terkildsen, T., & Makransky, G. (2021). Remediating learning from non-immersive to immersive media: Using EEG to investigate the effects of environmental embeddedness on reading in Virtual Reality. Computers & Education, 164(Complete). https://doi.org/10.1016/j.compedu.2020.104122

D’Angelo, C., Rutstein, D., Harris, C., Bernard, R., Borokhovski, E., Haertel, G. (2014). Simulations for STEM Learning: Systematic Review and Meta-Analysis. Menlo Park, CA: SRI International.

Dyrberg, N. R., Treusch, A. H., & Wiegand, C. (2017). Virtual laboratories in science education: students’ motivation and experiences in two tertiary biology courses. Journal of Biological Education, 51(4), 358–374. https://doi.org/10.1080/00219266.2016.1257498

Hattie, J. (2022). Inquiry Based Learning. Metax. Retrieved from <https://www.visiblelearningmetax.com/influences/view/inquiry-based_teaching>. 

Kozhevnikov, M., Li, Y., Wong, S., Obana, T., & Amihai, I. (2018). Do enhanced states exist? Boosting cognitive capacities through an action video-game. Cognition, 173, 93. https://doi.org/10.1016/j.cognition.2018.01.006

Makransky, G., Mayer, R., Nøremølle, A., Cordoba, A. L., Wandall, J., & Bonde, M. (2019). Investigating the feasibility of using assessment and explanatory feedback in desktop virtual reality simulations. Educational Technology Research and Development, 68(1), 293–317. https://doi.org/10.1007/s11423-019-09690-3

Makransky, G., Mayer, R. E., Veitch, N., Hood, M., Christensen, K. B., & Gadegaard, H. (2019). Equivalence of using a desktop virtual reality science simulation at home and in class. PLoS ONE, 14(4). https://doi.org/10.1371/journal.pone.0214944

Makransky, G., & Petersen, G. B. (2019). Investigating the process of learning with desktop virtual reality: A structural equation modeling approach. Computers & Education, 134(Complete), 15–30. https://doi.org/10.1016/j.compedu.2019.02.002

Makransky, G., Bonde, M. T., Wulff, J. S. G., Wandall, J., Hood, M., Creed, P. A., Bache, I., Silahtaroglu, A., & Nørremølle, A. (2016). Simulation-based virtual learning environment in medical genetics counseling: an example of bridging the gap between theory and practice in medical education. BMC Medical Education, 16. https://doi.org/10.1186/s12909-016-0620-6

Makransky, G., Terkildsen, T. S., & Mayer, R. E. (2019). Adding immersive virtual reality to a science lab simulation causes more presence but less learning. Learning and Instruction, 60, 225–236. https://doi.org/10.1016/j.learninstruc.2017.12.007

Makransky, G., Thisgaard, M. W., & Gadegaard, H. (2016). Virtual Simulations as Preparation for Lab Exercises: Assessing Learning of Key Laboratory Skills in Microbiology and Improvement of Essential Non-Cognitive Skills. PLoS ONE, 11(6). https://doi.org/10.1371/journal.pone.0155895

Thisgaard, M., & Makransky, G. (2017). Virtual Learning Simulations in High School: Effects on Cognitive and Non-cognitive Outcomes and Implications on the Development of STEM Academic and Career Choice. Frontiers in Psychology, 8. https://doi.org/10.3389/fpsyg.2017.00805

Van der Kleij, F. M., Feskens, R. C., & Eggen, T. J. (2015). Effects of feedback in a computer-based learning environment on students’ learning outcomes: A meta-analysis. Review of educational research, 85(4), 475-511.