

Do self-created metacognitive prompts promote short- and long-term effects in computer-based learning environments?

Correspondence: engelmann@tum.de, Chair for Teaching and Learning with Digital Media, TUM School of Education, Technical University of Munich, Arcisstraße 21, 80333 München, Germany. Present address: Institute of Education, Faculty of Education and Social Sciences, University of Hildesheim, Universitätsplatz 1, 31141 Hildesheim, Germany.

Abstract

Students must engage in self-regulated learning in computer-based learning environments; however, many students experience difficulties in doing so. Therefore, this study aims to investigate self-created metacognitive prompts as a means of supporting students in their learning process and improving their learning performance. We conducted an experimental study with a between-subject design. The participants learned with self-created metacognitive prompts (n = 28) or without prompts (n = 29) in a hypermedia learning environment for 40 min while thinking aloud. In a second learning session (stability test), all participants learned about a different topic without prompts. The results showed no clear effect of the self-created metacognitive prompts on the learning process and performance. A deeper analysis revealed that students' prompt utilization had a significant effect on performance in the second learning session. This study contributes to the research investigating how students can be supported in ways that enhance their learning process and performance.

Keywords: Self-created prompts, Metacognitive prompts, Self-regulated learning, Knowledge acquisition, Long-term effects

Introduction

Self-regulated learning (SRL) is important for successful learning in computer-based learning environments (CBLEs; Sambe, Bouchet, & Labat, 2017), and it is considered to be a crucial skill in lifelong learning (Anthonysamy, Koo, & Hew, 2020). Due to an increased need for flexible learning settings that can be addressed using CBLEs, an increasing number of approaches have been developed in research on self-regulated learning, in particular instructional support in CBLEs (e.g., Azevedo & Aleven, 2013; Hsu, Wang, & Zhang, 2017). SRL describes students' strategic behavior in moving towards a learning goal (Schunk & Zimmerman, 1998). The cyclic model of self-regulated learning (Zimmerman, 2008) describes three phases, each specifying metacognitive activities: goal setting and strategic planning (forethought phase), metacognitive monitoring and control (performance phase), and self-evaluation and reflection (self-reflection phase).
In an ideal setting, students would be able to engage in self-regulated learning in flexible CBLEs during their journey to achieve learning goals throughout their lifetime. However, many students experience difficulties in adequately self-regulating their learning, particularly in complex CBLEs and across multiple settings (e.g., Azevedo, Moos, Johnson, & Chauncey, 2010; Azevedo, Taub, & Mudrick, 2018; Daumiller & Dresel, 2018; Jansen, van Leeuwen, Janssen, Conijn, & Kester, 2020; Pieger & Bannert, 2018). Furthermore, these difficulties are associated with suboptimal learning outcomes (e.g., Azevedo & Cromley, 2004; Bannert, Sonnenberg, Mengelkamp, & Pieger, 2015; Kizilcec, Pérez-Sanagustín, & Maldonado, 2017; Lai & Hwang, 2016). Metacognitive prompts were established as instructional support to help students better self-regulate their learning and thereby achieve higher learning outcomes (e.g., Bannert, 2009; Hsu et al., 2017). Metacognitive prompts are hints, clues, or questions that target the learners' metacognition. Evidence suggests that helping students engage in self-regulated learning successfully fosters their learning outcomes. For example, in a relevant meta-analysis, Zheng (2016) found a medium effect size of d = 0.44 for self-regulated learning scaffolds in CBLEs on academic performance compared with learning without support. Thus, the overall helpful effects of metacognitive prompts appear to be undisputed.

Nevertheless, the extent of the usefulness of metacognitive prompts varies across studies on metacognitive prompting (Daumiller & Dresel, 2018; Zheng, 2016). A possible explanation may be that the effects of metacognitive prompts on learning depend on how the students utilize the offered prompts. To date, the utilization of prompts has scarcely been investigated in the prompting research. To the best of our knowledge, only Daumiller and Dresel (2018) addressed prompt use, whereby the authors conceptualized it in terms of the subjectively assessed frequency of generally following the "incitement" of prompts. Accordingly, the focus of that study was on the handling of prompts in terms of "whether" rather than "how." We sought to close the identified research gap by investigating how students' prompt utilization, which we define in this study as the "way to deal with each specific prompt," affects their learning outcome measured directly after learning and, moreover, in a subsequent learning session 3 weeks later. Moreover, we advance the common prompting approach with so-called self-created metacognitive prompts. By helping students to create their own prompts for a learning session in a CBLE, we transfer one of the most interesting opportunities that CBLEs provide to instructional support: the flexible use of learning time and place and a wider range of options in deciding how to learn.

Supporting self-regulated learning and performance in computer-based learning environments with metacognitive prompts

One approach to facilitating self-regulated learning in CBLEs is metacognitive support, for example, in the form of prompts that induce students' metacognitive activities during learning. Such activities include but are not limited to orientation, goal specification, planning, monitoring and control, and evaluation strategies (Bannert, 2007; Veenman, 1993).
According to available meta-analyses, prompts show a significant, moderate positive effect on academic performance (Belland, Walker, Olsen, & Leary, 2015; Zheng, 2016).

However, to date only a few empirical studies also examine student behavior, such as the number of page visits in an online learning environment, in addition to more descriptive attributes (e.g., Bannert et al., 2015; Daumiller & Dresel, 2018; Lallé, Taub, Mudrick, Conati, & Azevedo, 2017; Wong, Khalil, Baars, de Koning, & Paas, 2019). Bannert et al. (2015), for example, investigated the navigation behavior of students in regulating their learning to gain insight into whether and how students select pages with relevant content from learning websites. In their study, they found nonlinear page selections by students as indicators of strategic navigation behavior (Astleitner, 1997). Moreover, understanding how students use prompts provides deeper insight into the learning effectiveness of prompts (e.g., Bannert et al., 2015; Bannert & Mengelkamp, 2013; Bannert & Reimann, 2011; Moser, Zumbach, & Deibl, 2017). Bannert et al. (2015) found, for example, that within an experimental group that received metacognitive prompts, just under half of the students complied with the prompts as intended, and the others were less compliant or not compliant at all. Their analysis showed an improved learning outcome regarding transfer performance for compliant students.

The relevant research mainly focuses on the effectiveness of prompts with regard to short-term effects on learning outcomes, which means that the learning outcomes are measured at the end of the learning session (e.g., Delen, Liew, & Willson, 2014; Kim & Pedersen, 2011; Moser et al., 2017; Müller & Seufert, 2018; Renner, Prilla, Cress, & Kimmerle, 2016; van Alten, Phielix, Janssen, & Kester, 2020). Only a few studies have analyzed the long-term effects of metacognitive prompts in a second learning session some days or weeks later. A few studies have found sustainable effects of metacognitive prompts (Stark & Krause, 2009; Stark, Tyroller, Krause, & Mandl, 2008); however, in another study, the effect of metacognitive prompts could not be retained in a follow-up session (Hilbert et al., 2008).

Thus far, the long-term effects of prompts have been researched less, and the few existing studies have been conducted in quite varied ways. The classical approach is characterized by presenting another delayed knowledge test on the same learning topic (see Table 1). For example, Daumiller and Dresel (2018) supported the students of three experimental groups with prompts (only metacognitive prompts vs. only motivational prompts vs. metacognitive and motivational prompts), whereas students in the control group learned without prompts about a topic on psychological research methods. Immediately after learning, there was a knowledge test, and the results showed better learning outcomes for all groups of prompted students compared to the nonprompted students. In addition, there was one delayed knowledge test 1 week later and another delayed knowledge test (exam) 10 weeks later on the same learning topic. At least the first of these delayed tests showed better test scores for the prompted groups, whereas the second showed better test scores only for the motivational prompts.
The significant results of the delayed knowledge tests were interpreted as long-term effects (see also the study by Stark & Krause, 2009). In general, this classical approach verifies that the knowledge gained by prompting still exists after a few days and thus measures the long-term effect with regard to knowledge of the content.

Table 1 Types of long-term effects investigated in the literature

                     Learning session 1          Test 1                       Learning session 2          Test 2
Classical approach   Experimental manipulation   Immediate and delayed        –                           –
                     of the prompts              knowledge test(s)
Repeated approach    Experimental manipulation   Immediate knowledge test     Experimental manipulation   Immediate knowledge test
                     of the prompts                                           of the prompts
Stability approach   Experimental manipulation   Immediate knowledge test     Learning without prompts    Immediate knowledge test
                     of the prompts

Another approach uses a repeated design. Müller and Seufert (2018) prompted students of the experimental group in a learning session about empirical research methods and conducted a knowledge test immediately after learning. One week later, they "repeated" the same design by prompting the students of the experimental group again. The learning topic was different; however, the session also addressed the basics of empirical research methods. They found a short-term effect on transfer performance but no long-term effects in the second learning session. With this repeated design, it can be investigated whether the instructional prompts lead to higher learning outcomes in a second learning session with another learning topic under the same learning conditions. In general, this repeated design does not investigate the preservation of the prompts' effects after a period during which the participants were not exposed to prompts. Instead, the repeated design tests the degree to which prompts affect learning in each of several consecutive learning sessions.

The design by Bannert et al. (2015) was similar with regard to the two learning sessions; however, they did not prompt the students in the second learning session, in order to investigate whether the strategies induced by self-directed metacognitive prompts (presented only in the first learning session) would be accessed in another follow-up learning session. They found better navigation behavior and transfer performance not only in the first learning session but also in the second (unsupported) learning session. This stability approach investigates not only the stability of knowledge gain in a similar learning topic but also the stability of the strategy use induced by prompts. In general, this stability approach investigates the long-term effects with regard to self-regulation strategies cued by the metacognitive prompts.

For the purpose of our study, we focused on the stability approach to investigate the long-term effects of self-created metacognitive prompts. By doing so, we analyzed whether the positive effects of self-created metacognitive prompts measured directly after learning could be perpetuated in a follow-up learning session without any additional instructional support, thus affecting learning in future learning sessions beyond the session in which the prompts appeared.
In summary, this study investigates the effects of self-created metacognitive prompts on the learning process and on learning outcomes as well as the long-term effects of self-created metacognitive prompts.

Modifying the design of metacognitive prompts

The individual studies mentioned (e.g., Bannert et al., 2015; Müller & Seufert, 2018; Stark & Krause, 2009) as well as the review and meta-analysis (Belland et al., 2015; Zheng, 2016) paint an overall positive picture of the effect of metacognitive prompts; however, the results are not consistent. The inconsistency could be due to the function of prompts—such as differentiating between prompts supporting students' reflection on their learning behavior and prompts supporting students' cognitive processes—as well as the timing of the support. Both aspects have gained attention in research as potentially influencing the success of prompts (e.g., Azevedo, Cromley, Moos, Greene, & Winters, 2011; Berthold, Nückles, & Renkl, 2007; Kauffman, Ge, Xie, & Chen, 2008; Molenaar, Roda, van Boxtel, & Sleegers, 2012). In most of the studies reported thus far, one type of prompt was implemented at fixed time intervals. Some studies have developed adaptive prompts that enable adjustments to the content and/or timing of the prompts (e.g., Bouchet, Harley, & Azevedo, 2016; Kramarski & Friedman, 2014; Schwonke, Hauser, Nückles, & Renkl, 2006; Thillmann, Künsting, Wirth, & Leutner, 2009). To summarize this research area, the extent to which the prompts should be adapted to gain the best learning results is still an open question; in particular, the content of the prompts requires more research, since the time intervals of prompts have been investigated previously. For example, in the experimental study by Bannert et al. (2015), shortly before the learning session, students could personalize the time intervals at which metacognitive prompts would appear during the learning phase. Similarly, the prompts in the study by Bouchet et al. (2016) were faded out in different ways, meaning that the time interval between the presentations of prompts was adjusted by the students. However, they found no significant effect of fading on learning gains, as expected.

The idea of giving students the ability to create and configure their own prompts is driven not only by the transfer of the flexibility of modern CBLEs directly to their integrated support, as mentioned in the introduction, but also by the phenomenon of students' poor compliance with instructional tools (e.g., Clarebout & Elen, 2006; Lallé et al., 2017; Schworm & Gruber, 2012), particularly with metacognitive prompts (e.g., Bannert & Mengelkamp, 2013). Students in past studies have complained that the prompts restricted their learning or indicated that they (partly) did not use the prompts in the intended manner (e.g., Bannert et al., 2015). Furthermore, the research shows that giving students the opportunity to influence their own learning is positively correlated with students' development of self-regulated learning (Randi & Corno, 2000). With regard to instructional prompts, however, there is hardly any research to date that addresses adaptable prompts and their effects on learning processes and learning outcomes (Bannert et al., 2015; Kramarski & Friedman, 2014).
Based on the scarce research available, adaptable prompts have been found to support learning (e.g., Bannert et al., 2015). In the study by Bannert et al. (2015), students were able to personalize their metacognitive prompts not only with regard to presentation time, as mentioned above, but also with regard to the content of the prompts. This means that the sequence in which the prompted learning activities had to be carried out could be chosen freely. As expected, learning with such self-directed metacognitive prompts significantly improved students' navigation behavior and transfer performance when compared with another group of students learning without prompts. The goal of this study is to further increase the freedom in personalizing one's own learning support and to investigate the effects of such greater learner control.

Similar to self-directed (personalized) metacognitive prompts, self-created (personalized) metacognitive prompts are characterized by students' ability to determine the timing of the prompts' occurrence themselves before beginning the learning process. Nonetheless, self-created prompts differ from self-directed prompts in that they give students more freedom with regard to the personalization of their prompts: As implied by their name, self-created metacognitive prompts are written by the students themselves in advance of the learning process, based on one example of a metacognitive prompt (e.g., Müller & Seufert, 2018). In addition, the learners themselves determine the number of learning activities in which they will engage when self-creating their prompts (within the restrictions of the learning setting).

By asking students to create their own prompts before the learning session and to receive those prompts during the learning session, we expect them to feel supported without being restricted by prompts that have been imposed on them. In this case, prompts could achieve their aim of supporting the SRL process and performance (as shown in the research summarized by Zheng, 2016) without being hindered by students' reluctance. According to the SRL research, this prompting approach appears suited to help students overcome production deficits (Azevedo & Cromley, 2004; Marschner, Thillmann, Wirth, & Leutner, 2012; Nückles, Schwonke, Berthold, & Renkl, 2004; Veenman, 2007; Veenman, Van Hout-Wolters, & Afflerbach, 2006). Prompts will induce the metacognitive activities that students usually do not execute spontaneously during learning situations. Prior research obtained medium effect sizes (0.42 < d < 0.59) for different types of metacognitive prompts on transfer performance compared with a control group learning without prompts (Bannert & Mengelkamp, 2013).

The extant research leads us to develop self-created metacognitive prompts: prompts that students create themselves before they begin to study the learning materials. Such prompts allow students to gain even more control over their learning process. Our hypothesis is that self-created prompts are better adapted to the (perceived) needs of individual students and will lead to higher compliance with metacognitive prompts during the learning phase.
Research questions and hypotheses

In general, prompting self-regulated learning in CBLEs is successful in supporting short-term learning outcomes (Belland et al., 2015; Winters, Greene, & Costich, 2008; Zheng, 2016). The adaptation of prompts has been investigated as an approach for addressing the suboptimal utilization of prompts during learning processes (e.g., Bannert et al., 2015). Thus far, the adaptation of prompts has mainly been based on the timing (e.g., Bouchet, Harley, & Azevedo, 2018) or content of the prompts (e.g., Schwonke et al., 2006). The current research investigates prompts created by the students themselves, whereby, in addition to the timing of the prompts, the students are able to determine the content of the prompts. Thus, this paper adds to the research on metacognitive prompts by investigating prompts that give students more freedom in creating their prompts than any comparable study has done before. This study investigates the effect of these self-created prompts on the learning process and learning outcomes as well as the stability of the potential effects. Moreover, we explore the actual utilization of prompts during the learning process to better understand self-created prompts. Thus, this research poses three research questions:

(1) To what extent do self-created prompts affect the learning process, and can the potential effect be maintained long-term?

(2) To what extent do self-created prompts affect learning performance, and can the potential effect be maintained long-term?

(3) To what extent does prompt utilization influence short- and long-term effects?

The first and second research questions are based on prior research showing, on average, positive effects of prompts on self-regulated learning processes and learning outcomes (e.g., Bannert et al., 2015; Bannert & Reimann, 2011; Winters et al., 2008; Zheng, 2016). Thus, with regard to the first two research questions, we hypothesize that positive effects can also be found when learning with self-created prompts. Accordingly, the hypotheses for the first two research questions are as follows:

(1) Self-created prompts facilitate the learning process: Students learning with self-created prompts will visit relevant webpages more frequently, spend more time on relevant webpages, and navigate the learning environment less linearly than students learning without prompts. Moreover, these effects will be maintained long-term.

(2) Self-created prompts improve learning performance: Students learning with self-created prompts will show higher recall, comprehension, and transfer performance compared to students learning without prompts. Furthermore, these effects will be maintained long-term.

The third research question is also based on research findings suggesting that students utilize prompts differently, and not always to their advantage, which affects their learning outcome (e.g., Bannert et al., 2015; Bannert & Mengelkamp, 2013; Clarebout & Elen, 2006; Randi & Corno, 2000; Schworm & Gruber, 2012). While the meta-analysis by Zheng (2016) differentiated between different functions of prompts, there has been no prior analysis of prompt utilization with regard to how the function of the prompt was interpreted by the students.
Hence, prior studies have analyzed either the function of prompts from a different perspective or different aspects of prompt utilization; therefore, the analysis in this paper is more exploratory and thus nondirected.

Thus, this paper is closely related to the current research investigating the short- and long-term effects of different types of prompts (e.g., Bannert et al., 2015; Müller & Seufert, 2018). It goes beyond the existing research by implementing prompts that are influenced by the learners themselves to a higher degree than in prior studies, and it investigates how these prompts are utilized by students.

Method

Sample and design

Sixty-six German-speaking undergraduate university students participated in the study. The final sample comprised n = 57 participants (Mage = 19.9 years, SDage = 1.58; 72% female) because we had to eliminate 9 participants due to (a) very poor compliance in participating in the study and (b) reducing the number of extreme values in prior knowledge by removing two participants with very low prior knowledge (below 4 on a scale from 1 to 25) and three participants with very high prior knowledge (above 15 on a scale from 1 to 25). The experiment was conducted in a between-subject design comprising three sessions. As in Bannert et al. (2015), the first session was used to measure learner characteristics. Additionally, based on Bannert et al. (2015), the other two sessions (learning session 1 and learning session 2) took place approximately 1 week and 4 weeks later. The manipulation of the independent variable self-created prompts was undertaken only in the first learning session. The manipulation was implemented with two conditions: the experimental condition of learning with self-created prompts (n = 28) and the control condition of learning without prompts (n = 29). The students were randomly assigned to one condition by following a randomized list of the two conditions.

Procedure

The experiment took place in a laboratory over three sessions, separating the pretests, the first learning session, and the second learning session. The total time of the experiment was approximately 5 h, resulting from the duration of the tests and around 40 min of learning time per learning session, a duration chosen to represent a typical teaching time in class. Figure 1 presents an overview of the procedure.

Fig. 1 Procedure of the experimental study over three sessions

In the first session, the learner characteristics were measured. Since we found no difference in the learner characteristics between the two groups, we did not consider the learner characteristics further.

The first learning session was structured into three phases that, in total, lasted approximately 2 h. In the first phase, the participants were introduced to the hypermedia learning environment and randomly assigned to the experimental or control condition as described above. Afterwards, the participants received training that lasted approximately 15 min.
The content of the training was dependent on the condition: the participants in the experimental condition received a short introduction to metacognitive prompts and a model of what a metacognitive prompt looks like, and they subsequently created metacognitive prompts for themselves; the participants in the control condition received alternative training (on ergonomics) to ensure an equal workload for the participants in both conditions. In the second phase, after being trained to think aloud, the participants engaged in self-regulated learning in the hypermedia learning environment for 40 min with or without self-created prompts depending on the condition. The participants learned about the basic concepts of operant conditioning and were free to navigate through the hypermedia learning environment as they wished. All page visits as well as their time and duration were recorded in a log file. Additionally, the participants were asked to think aloud during the entire learning task. The think-aloud protocols and the computer screen were recorded in a video file. In the third phase, directly after the learning phase, the recall, comprehension, and transfer performance of the participants were measured.

The second learning session (stability test) was similar in structure to the first learning session but without the instruction phase, and it lasted approximately 1.5 h. In this session, the participants learned about the basic concepts of motivation psychology. Unlike in the first learning session, there was no manipulation of the independent variable during the second learning session. Every other aspect of the learning phase was similar to the procedure of the learning phase in the first learning session for the control condition. Hence, all participants learned in the hypermedia learning environment without prompts for 40 min and were free to navigate as they wished. Again, all page visits as well as their time and duration were recorded in a log file, and the participants were asked to think aloud during the entire learning task. The think-aloud protocols and the computer screen were recorded in a video file. Immediately after learning, the recall, comprehension, and transfer performance of the participants were measured.

Learning environment

The study included two learning phases: one phase in the first learning session on the topic of operant conditioning and one phase in the second learning session (stability test) on the topic of the psychology of motivation. An analysis of text readability (Michalke, 2015) showed similar levels of difficulty for both learning topics (i.e., the Flesch-Kincaid grade-level score for "learning theories" was 19.01, and that for "psychology of motivation" was 19.14). Both topics were presented in similar computer-based hypermedia learning environments. Each of the two learning environments included a page specifying the learning goal; 10 pages of information relevant to the topic specified in the learning goals (including approximately 2300 words as well as 5 pictures and tables); and approximately 40 pages including overviews, summaries, and pages not relevant to the learning goals. The hypermedia learning environment was designed to provide external validity to the learning process, which usually takes place in a hypermedia learning context with additional, irrelevant information. Moreover, self-regulation and metacognition are important as they help students negotiate learning situations that contain relevant and irrelevant information (among other hurdles). Thus, the students in this study were provided with an externally valid learning experience that made it necessary for them to self-regulate their learning by detecting the relevant learning pages according to their current learning goals. The participants could navigate the learning environment by using a menu bar on the left side of the computer screen, one of 300 hyperlinks, a next-page button and a previous-page button on the top of each page, and the browser buttons (back and forward).
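The readability comparison above relies on the Flesch-Kincaid grade level. As a point of reference only, the following is a minimal Python sketch of the standard (English-language) grade-level formula; the study itself used an R-based text analysis tool (Michalke, 2015) on German texts, so the exact parameters and syllable counting there may differ, and the function names, example sentence, and the naive vowel-group syllable heuristic below are illustrative assumptions rather than the authors' procedure.

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels (illustrative only).
    return max(1, len(re.findall(r"[aeiouyäöü]+", word.lower())))

def flesch_kincaid_grade(text):
    """Standard Flesch-Kincaid grade level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-zÄÖÜäöüß]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

# Example call on an arbitrary sentence (not taken from the learning material).
print(round(flesch_kincaid_grade(
    "Operant conditioning modifies behavior through reinforcement and punishment."), 2))
```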
Manipulation of the independent variable

The manipulation of the independent variable was implemented in two steps during the first learning session. In the first step, the training (before the learning phase) introduced the participants of the experimental condition to the prompts and showed them how to create their own prompts to be used during the upcoming learning phase. Additionally, the participants could familiarize themselves with the structure of the hypermedia learning environment (with different content) to anticipate which prompts they might need and at what point they would like to be supported by a prompt. At the end of the training phase, directly before the start of the learning phase, the participants created their own prompts. To do so, they were introduced to an example prompt listing major metacognitive learning activities. They were completely free in creating their prompts; i.e., they could choose to design prompts that were similar to the example prompt, loosely based on the example prompt, or completely different from the presented example prompt. However, they were asked to base the prompts on the training and thus mainly created metacognitive prompts. Additionally, the participants set timestamps to determine at which points within the 40-min learning phase they wished to receive the self-created prompt. The participants needed to set a minimum of 3 timestamps. By designing the self-created prompts in this way, the students could influence how they interacted with the prompts during the learning phase, the prompts themselves, and their time of appearance.

In the second step, the participants received a think-aloud training and started learning in the environment. The prompts were shown to the participants of the experimental condition during the learning phase at the times determined by the participants. The prompts were presented in a pop-up window displaying the self-created list of metacognitive learning activities; see Fig. 2 for an example. Students were expected to select one or more of the prompted learning activities that they wanted to enact next, submit their list of selected activity/ies, and then continue learning in the hypermedia learning environment while performing the activity/ies (one after the other) during the time following the prompt.

Fig. 2 Screenshot of the learning environment including a pop-up window displaying the self-created prompt by a learner. Note: In this example (which is translated from German for this paper), the learner created the prompt by listing four different learning activities before starting the learning session in the learning environment. In the specific prompting situation, s/he chose "I reflect on previous content" as the learning activity to enact next.

The participants in the control condition received a different training that was irrelevant to the content of the study to ensure a similar work time and workload under both conditions. Specifically, they learned about the major criteria of an ergonomic workplace and how to design their own ergonomic workplace.
They could also familiarize themselves with the structure of the hypermedia learning environment before the learning phases started, using the same content as that presented to the students of the experimental group. Then, they received a think-aloud training and learned in the same hypermedia learning environment as the participants in the experimental condition, without any support by prompts.

Instruments and dependent variables

Learning process: navigation behavior

To investigate the learning process, we analyzed the recorded log files with regard to the students' navigation behavior. The log files collected in this study recorded the pages and times of all webpages visited by the participants as well as the duration of the visits. Moreover, the webpages of the learning environment were categorized into relevant and irrelevant pages (see the description of the learning environment). Meaningful insights could be derived from our analysis of the systematic navigation behavior of the participants based on three parameters that could be drawn from the information given in the log files: (1) the relative frequency of relevant page visits (the number of relevant pages visited divided by the total number of pages visited), (2) the relative time spent on relevant pages (time spent on relevant pages divided by the total learning time), and (3) the frequency of linear navigation steps divided by the total number of navigation steps (a computational sketch of these parameters is given at the end of this Instruments section). The frequency of and time spent on relevant webpages provide insight into navigation behavior because they indicate the degree to which the students used self-regulation strategies to reach their learning goals, selected relevant webpages to study, and did not simply follow the progression suggested by the chapters in the hypermedia learning environment. The last parameter is based on the operationalization of strategic navigation behavior as the number of nonlinear node selections (Astleitner, 1997).

Learning performance

The learning outcomes were measured on three dimensions with three knowledge tests. The dimensions were based on Bloom's taxonomy of cognitive learning objectives (Bloom, 1956) and concentrated on the three least complex components: recall, comprehension, and transfer. Recall was measured by asking the participants to write down the terms and concepts of the topic they studied during the learning phases (operant conditioning in learning session 1 and the psychology of motivation in learning session 2). The performance score for recall was determined by counting the number of correct terms and concepts mentioned by the participants. Comprehension was measured with a multiple-choice test consisting of 22 items for the topic of operant conditioning (learning session 1, α = .73) and 19 items for the topic of the psychology of motivation (learning session 2, α = .40), each with 1 correct and 3 false response options.
Transfer was measured with open questions in which the participants were asked to solve 8 prototypical problems in educational settings that were unknown to the participants and to apply the knowledge of operant conditioning (learning session 1, α = .61) or the psychology of motivation (learning session 2, α = .65). Two research assistants rated each answer independently on a scale from 0 (no answer) to 5 (correct answer) for all data. The interrater reliability was good (learning session 1: Cohen's Kappa = .84; learning session 2: Cohen's Kappa = .74). In case of disagreement between the two raters, an expert of the research team determined the final score.

Prompt utilization

Prompt utilization was measured by coding the think-aloud protocols of the participants in the experimental condition in the first learning session. Because both conditions involved learning without prompts in the second learning session, prompt utilization could be coded only in the first learning session. Coding was completed for each prompt appearance for every participant and then aggregated to yield one score per participant. The interpretation of each prompt was coded in one of two categories: either (a) a cue to reflect on current metacognitive learning activities or (b) a cue to enact a (potentially new) metacognitive learning activity presented in the prompt. We also included a residuum category that was applied in case the prompt utilization could not be coded or the learner did not react to the prompt at all. This categorization was developed inductively after observing that the self-created prompts were used quite differently: (a) some participants seemed to use the prompts as a reflection tool. In the think-aloud protocols, these participants mentioned which (meta-)cognitive strategies they were currently using and compared them with the metacognitive learning activities listed in their prompt. For example, one participant gave this statement while hovering with her cursor over an activity ("I check whether I am reaching my goal") in the prompt window: "I am checking whether I am reaching my goal… Yeah, I actually did that already. So, yes [ticking that box]." (b) Other participants were reminded by their prompt to enact some of the (meta-)cognitive activities mentioned in the prompt. In the think-aloud protocols, these participants did not mention their current learning strategies, often abandoned what they were doing when the prompt appeared, and verbalized their plan to enact one or more of the activities suggested in the prompt. For example, one participant gave the following statement at the time the prompt window opened: "[reading out loud:] An individual sees… [prompt window opens] Okay, learning activities. I'll reflect on the content learned so far [ticking that box] and afterwards I will get an overview of the material [ticking that box]. So, reflecting… I can say that the learning material is about…" This difference in the interpretation of the prompts (including a third category for unclear cases that were excluded from the analysis) was coded by two independent coders (Cohen's Kappa = .78). Because each participant worked with more than one prompt, a score was aggregated for each participant by calculating the mode of each participant's codes. If the mode for a participant was the residuum category or if both categories were coded the same number of times, then the participant was not included in the analysis regarding prompt utilization.
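To make the computed measures in this section concrete, the following is a minimal Python sketch of how the three navigation parameters and the per-participant prompt-utilization score could be derived. The log-file format (a list of page visits with dwell times), the definition of a "linear" step as a visit to the page directly following the current one in the environment's page order, and the category labels are illustrative assumptions for this sketch, not the authors' actual analysis scripts.

```python
from collections import Counter

# One participant's log: ordered page visits with dwell time in seconds.
# "page" is the page's position in the environment's page order; "relevant"
# flags pages categorized as relevant to the learning goals (assumed format).
log = [
    {"page": 1, "relevant": True,  "seconds": 95},
    {"page": 2, "relevant": False, "seconds": 20},
    {"page": 3, "relevant": True,  "seconds": 140},
    {"page": 2, "relevant": False, "seconds": 15},
]

def navigation_parameters(log):
    """Compute the three navigation parameters described above."""
    total_visits = len(log)
    total_time = sum(v["seconds"] for v in log)
    relevant_visits = sum(1 for v in log if v["relevant"])
    relevant_time = sum(v["seconds"] for v in log if v["relevant"])
    # A navigation step counts as linear if the next visit is the page
    # directly following the current one (an assumption of this sketch).
    steps = list(zip(log, log[1:]))
    linear_steps = sum(1 for a, b in steps if b["page"] == a["page"] + 1)
    return {
        "relative_frequency_relevant": relevant_visits / total_visits,
        "relative_time_relevant": relevant_time / total_time,
        "linearity": linear_steps / len(steps) if steps else 0.0,
    }

def prompt_utilization_score(codes):
    """Aggregate the per-prompt codes ("reflection", "action", "residuum")
    into one participant-level category via the mode. Following the rule
    described above, the participant is excluded (None) if the residuum
    category is the most frequent code or if the two substantive categories
    occur equally often."""
    counts = Counter(codes)
    n_reflection, n_action = counts["reflection"], counts["action"]
    if counts["residuum"] > max(n_reflection, n_action) or n_reflection == n_action:
        return None
    return "reflection" if n_reflection > n_action else "action"

print(navigation_parameters(log))
print(prompt_utilization_score(["reflection", "action", "reflection"]))
```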
Statistical analysis

The alpha level was set to 5% for all analyses. As the hypotheses for research question 1 and research question 2 were directional, we used one-tailed hypothesis testing for the analyses regarding these research questions. Research question 3 was open; therefore, we used two-tailed hypothesis testing for the analyses regarding the third research question. To avoid a multiple comparison problem in interpreting the individual comparisons, we applied a Bonferroni correction. For the first and second research questions, we tested six individual comparisons (three learning process parameters and three performance tests); thus, we corrected the significance level for the individual comparisons to 0.0083 (0.05/6). For the third research question, we applied three comparisons per dataset for the individual measures of performance; thus, we set the significance level to 0.017 (0.05/3) for the individual comparisons.

Results

Effects of self-created prompts on the learning process

To analyze the potential effect of self-created prompts on the learning process, we compared the navigation behavior of the students in the experimental condition to the navigation behavior of the students in the control condition. We hypothesized that self-created prompts would increase the frequency of and time the students devote to relevant pages and decrease the linearity with which the hypermedia learning environment is navigated compared with learning without prompts. Furthermore, we hypothesized that these effects would persist in a second learning session (stability test) in which the students in both conditions learned without prompts. Table 2 displays the descriptive values for all navigation parameters in the first and second learning sessions. A multivariate analysis of variance showed that the prompts facilitated navigation behavior in the first learning session with a large effect (F(3, 53) = 6.63, p < .001; Wilks' Λ = 0.73, partial η² = .27). The analysis of the individual parameters showed that the self-created prompts affected only the relative time spent on relevant webpages (see Table 2). We found no overall effect of prompts on the navigation behavior in the second learning session (F(3, 53) = 2.60, p = .062; Wilks' Λ = 0.87, partial η² = .13). In summary, the results partly support our hypotheses regarding our first research question: The self-created metacognitive prompts supported the learning process in the first learning session; however, the effect could not be maintained in the second learning session. The positive effect was found only for one navigation parameter, i.e., the relative time spent on relevant webpages, an individual effect that was also shown in the second learning session.

Effects of self-created prompts on learning performance

To analyze the potential effect of self-created prompts on learning performance, we analyzed students' recall, comprehension, and transfer performance in the experimental condition and compared their learning outcomes with those of the students in the control condition. We hypothesized that self-created prompts would foster learning performance compared with learning without prompts. Furthermore, we hypothesized that these effects would persist in a second learning session in which the students in both conditions learned without prompts.
Table 2 displays the descriptive values for all learning performance parameters in the first and second learning sessions.

Table 2 Descriptive values and inferential statistics for navigation behavior and learning outcomes in learning session 1 and learning session 2

                                                    Learning with            Learning without
                                                    self-created prompts     prompts
                                                    (n = 28)                 (n = 29)
                                                    M        SD              M        SD         t        p (one-tailed)   d
Learning session 1
  Relative frequency of relevant webpages visited   .41      .12             .38      .13        –1.02    .157             0.24
  Relative time spent on relevant webpages*         .62      .13             .49      .16        –3.48    <.001            0.92
  Linearity of navigation steps                     .39      .18             .47      .23        1.59     .117             0.39
  Recall                                            14.61    4.95            12.24    5.70       –1.67    .051             0.44
  Comprehension                                     16.93    2.58            15.38    3.68       –1.85    .036             0.49
  Transfer                                          20.27    3.74            21.24    4.54       0.88     .191             0.23
Learning session 2 (stability test)
  Relative frequency of relevant webpages visited   .32      .11             .32      .11        –0.23    .410             0.00
  Relative time spent on relevant webpages          .58      .17             .50      .16        –1.92    .030             0.51
  Linearity of navigation steps                     .38      .18             .41      .17        0.76     .226             0.17
  Recall                                            11.00    3.43            10.28    4.07       –0.73    .236             0.19
  Comprehension                                     13.00    2.19            11.70    2.06       –1.84    .036             0.49
  Transfer                                          20.66    5.054           20.88    5.39       0.16     .438             0.04

Note. Students of the experimental group were supported by self-created prompts only in learning session 1; in learning session 2, both groups learned without prompts. The Bonferroni-corrected significance level was 0.0083. Comparisons that reached significance are marked with an asterisk (*).

A multivariate analysis of variance showed a large, significant overall effect of the self-created metacognitive prompts on performance in the first learning session (F(3, 53) = 3.78, p = .016; Wilks' Λ = 0.82, partial η² = .18). However, an analysis of the individual parameters showed no differences in recall, comprehension, or transfer (see Table 2). In the second learning session, the self-created metacognitive prompts did not show an overall effect on the performance measures (F(3, 53) = 1.61, p = .198; Wilks' Λ = 0.92, partial η² = .08).

In summary, the results do not support our hypotheses regarding our second research question: The self-created metacognitive prompts positively affected the learning performance in the first learning session; however, the effect could not be maintained in the second learning session, and the individual comparisons do not reveal a positive effect of the self-created metacognitive prompts on recall, comprehension, or transfer.

Effects of prompt utilization on short- and long-term learning performance

For a deeper understanding of how the self-created prompts could affect or fail to affect the students' learning performance, we analyzed the think-aloud protocols to determine how the prompts were interpreted by the participants during learning. Furthermore, we analyzed whether possible effects could persist in a second learning session in which the participants learned without prompts. Table 3 contains the results of this exploratory analysis. Half of the participants used their prompts to reflect on their current metacognitive learning activities (n = 11), i.e., as a reflection request. The other half of the students used their prompts mainly as a cue to enact one or more prompted metacognitive learning activities (n = 12), i.e., as a call to action without deeper reflection on current learning activities.
Table 3 Descriptive values and inferential statistics for the analysis of prompt utilization regarding learning outcomes in learning session 1 and learning session 2

                                        Reflection request     Call to action
                                        (n = 11)               (n = 12)
                                        M        SD            M        SD         t        p (two-tailed)   d
Learning session 1
  Recall                                14.09    3.78          15.17    6.07       .50      .619             –0.21
  Comprehension                         17.00    1.79          16.00    3.10       –.94     .361             0.40
  Transfer                              21.95    1.97          18.92    4.68       –2.06    .059             0.84
Learning session 2 (stability test)
  Recall                                10.09    3.59          11.00    3.74       .59      .559             –0.25
  Comprehension                         14.09    1.22          12.08    2.57       –2.42    .029             1.00
  Transfer*                             23.36    4.21          17.79    3.71       –3.37    .003             1.50

Note: This analysis includes only students of the experimental condition (learning with self-created prompts). Students of the experimental group were supported by self-created prompts only in learning session 1; in learning session 2, both groups learned without prompts. The Bonferroni-corrected significance level was 0.017. All comparisons that reached significance are marked with an asterisk (*).

Table 3 displays the descriptive results and individual differences in the parameters. There were no clear descriptive differences between the groups in terms of recall and comprehension. The descriptive difference in transfer performance in the first learning session was in favor of the students interpreting the prompts as a reflection tool. However, a multivariate analysis of variance showed no main effect of the interpretation of the self-created prompts on performance (F(3, 19) = 2.87, p = .064; Wilks' Λ = 0.69, partial η² = .31). The analysis of the second learning session showed a long-term effect of the interpretation of the self-created prompts on learning performance (F(3, 19) = 6.45, p = .003; Wilks' Λ = 0.50, partial η² = .50). In the individual comparisons, a significant main effect with a large effect size was found for transfer performance in favor of the students interpreting the self-created prompts as a reflection request.

In summary, the results regarding the third research question lead us to the assumption that interpreting self-created prompts as a reflection request is more beneficial for learning performance, particularly for fostering long-term transfer performance, than interpreting prompts as a cue for action.

Discussion

Do self-created metacognitive prompts promote short- and long-term effects in computer-based learning environments? The prior evidence supports the claim that metacognitive prompts generally can facilitate learning in CBLEs (e.g., Bannert et al., 2015; Belland et al., 2015; Bouchet et al., 2016; Delen et al., 2014; Kim & Pedersen, 2011; Winters et al., 2008; Zheng, 2016). We closely examined how a specific type of metacognitive prompt, self-created metacognitive prompts, may help students. Moreover, we investigated the effects of self-created prompts not only on students' short-term learning performance but also on students' learning processes, i.e., how students navigate CBLEs and utilize the self-created metacognitive prompts, and how sustainable the possible effects are in a second learning session without prompts. The results of this experimental study allow several conclusions to be drawn regarding the support of students in CBLEs with self-created metacognitive prompts.

As shown by the results, there are mixed beneficial effects of self-created metacognitive prompts on the learning process and learning performance. In addition to the
significant effects in this regard, we found medium effect sizes for the relative time spent on relevant webpages in learning session 1 as well as descriptive advantages in comprehension in learning session 1 and learning session 2 (see Table 2). Especially when the students' prompt use was characterized by deep reflection (reflection request), we found even larger effect sizes for transfer in learning session 2 (see Table 3).

In addition, the results are interesting in comparison to studies testing different types of (metacognitive) prompts. In a next step, it would be necessary to test the effect of self-created prompts in empirical investigations compared with other metacognitive prompts. The result pattern of this study differs from that of earlier studies with comparable settings (e.g., Bannert et al., 2015; Bannert & Mengelkamp, 2013; Müller & Seufert, 2018), suggesting that the learning mechanism with self-created metacognitive prompts is perhaps slightly different from prompts that students could not adapt (e.g., Müller & Seufert, 2018) or that they could adapt to a lesser degree by setting the timing of the prompts (e.g., Bannert et al., 2015). In the prior studies, setting the timing of the prompts caused students to navigate to relevant webpages more often and for longer periods and increased the students' transfer performance. In this study, the self-created prompts similarly caused students to stay longer on relevant webpages. However, in contrast to similar work (e.g., Bannert, 2007; Bannert et al., 2015; Bannert & Mengelkamp, 2013), the self-created prompts in general did not facilitate transfer performance. Thus, the freedom to manipulate the content of the prompts themselves leads to a difference in the pattern of the results: the students in this study regulated their learning to a lesser degree based on the learning goals, and their performance suffered compared to that of the participants in studies that gave the students less freedom in designing their learning process.

We based the design of this study on the general assumption that metacognitive prompts enhance the process of self-regulation and thus induce metacognitive activities and initiate deeper processing of information that is important for performance and mainly relevant for transfer performance. The results of this study allow us to hypothesize that the self-created metacognitive prompts might not have been sufficiently targeted to facilitate this process. The self-created prompts were intended to improve the poor-to-mixed compliance reported in the literature (Bannert et al., 2015; Bannert & Mengelkamp, 2013; Clarebout & Elen, 2006; Randi & Corno, 2000; Schworm & Gruber, 2012). Our results could suggest that the students might not have been sufficiently knowledgeable to create prompts that would best fit their learning process. While the students were introduced to the creation of prompts, they could not possibly have the same prior knowledge as the researchers who created the prompts in most other studies. Thus, the self-created prompts could not be used to achieve the students' full potential. In future studies, a more in-depth training session should be given so that students better understand how they are going to study and which prompts would be helpful.
Another approach for further investigation would be to allow the students to create or manipulate their metacognitive prompts while they engage with the learning materials.

An alternative explanation for the inconclusive results could lie in the content of the prompts. While the metacognitive prompts created by experts and the self-directed prompts target metacognitive processes, the prompts created by the students might (also) target other learning processes such as cognitive activities. The advantage of purely metacognitive prompts over alternative or mixed prompts is consistent with meta-analytic results showing that only metacognitive prompts significantly foster performance compared with conceptual, strategic, or multiple prompts (Zheng, 2016). While we aimed to investigate self-created prompts by giving more freedom to the students, this approach also allows less control over the content of the prompts in this experiment, which is a major limitation of the study and causes our conclusions to be limited to the intended effect of the self-created prompts and, to a lesser degree, to the actual behavior of the students (except for the coded interpretation of the prompts, i.e., students' prompt utilization).

The research indicates that some students do not utilize prompts as intended (e.g., Clarebout & Elen, 2006; Clarebout, Elen, Collazo, Lust, & Jiang, 2013; Moser et al., 2017; Schworm & Gruber, 2012), which could influence their learning outcomes (e.g., Bannert et al., 2015). Our rather exploratory analysis investigated the utilization of prompts regarding the interpretation of the self-created prompts either as a cue to reflect on current cognitive and metacognitive activities or as a cue to enact a learning activity suggested by the self-created prompt. The results show a trend suggesting a benefit of interpreting the prompts as a cue to reflect on one's own learning compared with using the prompts mainly as a cue to start new learning activities. Such reflection seems to lead to higher transfer performance and a strong effect in the second learning session. This result could support our assumption that self-created prompts indeed address a production deficit (Marschner et al., 2012; Veenman, 2007; Veenman et al., 2006), thus helping students to reflect upon their learning activities and subsequently improve the self-regulated learning process; however, this finding applies only to some students. The students who interpreted the prompts as a call to action might not understand their own learning process well enough to reflect on their current cognitive and metacognitive activities. If this was the case, there was no production deficit (i.e., students possess the strategies but do not apply them spontaneously) to be addressed by the self-created prompts but rather a mediation deficit (i.e., students do not have the strategies) that cannot be addressed by this kind of prompt alone (Bannert, 2007; Veenman, 1993). This analysis might have detected that the underlying assumption of using metacognitive prompts to facilitate the self-regulation process applies to only a subgroup of students: those who are affected by a production deficit (Marschner et al., 2012; Veenman, 2007; Veenman et al., 2006).
The benefit of the interpretation of the self-created prompts as a reflection aid seems to contradict prior studies and meta-analyses (Mäeots et al., 2016; van den Boom, Paas, van Merriënboer, & van Gog, 2004; Zheng, 2016) suggesting that reflection prompts are inferior to metacognitive prompts. However, these studies investigated only the design intended by the researchers, not the students’ interpretation of the prompts. The self-created prompts investigated in this study would probably be categorized as metacognitive prompts and not as reflection prompts. However, as discussed before, the students did not always interpret the prompts as such; as assumed above, this interpretation probably depends on prior knowledge of self-regulation strategies. This conclusion is nevertheless limited because of the exploratory nature of this work. The pattern found here should be tested directly in future confirmatory studies.

One important limitation of this study is the design, which compares learning with self-created prompts to learning without prompts. This design was employed to take a first step in investigating whether self-created prompts would be at all beneficial. For a closer examination of the mechanism behind different types of prompts, the design would need to include at least one more condition in which the prompts were not self-created by the students. Furthermore, the study is limited by two methodological constraints. First, the training before the learning phase and the learning phase itself took only approximately 10 min and 40 min, respectively. This time frame is comparable to one lesson in school or university but did not allow SRL processes to develop over a longer period; therefore, possible changes in the SRL process over time could not be shown in this paper. Second, Cronbach’s alpha for the comprehension test in the second learning session was rather low (.40), indicating low internal consistency of this knowledge test (a brief computational sketch of this coefficient is given below). All other performance tests showed higher internal consistencies for the specific scales, which we consider valid for testing the effect of the intervention.

Conclusion

This study expands our understanding of the support prompts can give to students in CBLEs by showing the partial effects of self-created prompts on the learning process. Furthermore, the study went beyond prior efforts to analyze the impact of different prompts on learning (Zheng, 2016) by analyzing how differently the prompts are utilized and how this utilization affects learning. Based on the results of this study, we can recommend paying closer attention to students’ utilization of self-created prompts: what are the students doing with the prompts, and are they utilizing the prompts as the researchers intended? While the self-creation of prompts might help students tailor prompts to their anticipated needs, this practice also introduces new difficulties that might hinder learning, such as students’ limited knowledge of the support they will need in a subsequent learning session. Moreover, we must further investigate the interrelationship between increasingly involving students in creating their own prompts and the differences in the utilization of prompts, as well as the effect of this factor on the learning process and performance outcomes.
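As referenced in the limitation above, Cronbach’s alpha expresses how much variance the items of a test share. The following minimal sketch (in Python, using the standard item-variance formula) shows how the coefficient is computed from a person-by-item score matrix; the 0/1 answers below are invented toy data, not data from this study.

import statistics

def cronbachs_alpha(score_matrix):
    """Cronbach's alpha for a matrix with one row per person and one column per item."""
    k = len(score_matrix[0])  # number of items
    item_variances = [statistics.variance(column) for column in zip(*score_matrix)]
    total_variance = statistics.variance([sum(row) for row in score_matrix])
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

# Invented 0/1 scored answers of five test takers on a four-item test
answers = [
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
]
print(round(cronbachs_alpha(answers), 2))  # -> 0.44 for this toy matrix

Values in this range indicate that the items share relatively little common variance, which is why the comprehension results of the second learning session should be interpreted with some caution.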
The results warrant careful recommendations for the design and implementation of prompts in learning environments. The study cannot directly support the prior recommendation (e.g., Azevedo, 2005; Bannert et al., 2015; Belland et al., 2015; Zheng, 2016) to design prompts to target metacognitive learning processes. However, it might be beneficial for students to be asked to reflect upon their current learning activities, as our exploratory results suggest that including reflection in the utilization of self-created prompts affects transfer performance, even more so in the long term. In conclusion, we were able to show that self-created prompts partly facilitate the learning process in CBLEs. However, due to the different outcome patterns compared to similar studies in which the students did not create the prompts themselves (e.g., Bannert et al., 2015; Bouchet et al., 2018; Daumiller & Dresel, 2018), we conclude that the involvement of students in creating prompts may influence the way in which the prompts support the learning process. Similarly, our results also show that the utilization of prompts affects the performance of students. While prompts that aim to support reflection did not yield a positive effect from a meta-analytic perspective (Zheng, 2016), self-created metacognitive prompts that were utilized to reflect upon current learning activities led to distinctly better transfer learning in the students of this study. Thus, this study contributes to the body of literature investigating how students can be supported to enhance the learning process and performance outcomes when learning in CBLEs.

Acknowledgements
We would like to thank Dr. Christoph Sonnenberg for his invaluable contributions in collecting, analyzing, and discussing the data, as well as the student assistants Anna Horrer, Stefanie Beck, and Veronika Danner for their help in collecting the data, coding the data, and preparing the manuscript. This research was funded by the German Research Foundation (BA 2044/7-2).

Authors’ contributions
Katharina Engelmann: Analysis of the literature, analysis of the data, and writing the manuscript. Maria Bannert: Analysis of the literature, design, preparation and collection of the data, analysis of the data, and writing the manuscript. Nadine Melzner: Analysis of the literature and writing the manuscript. The authors read and approved the final manuscript.

Funding
This research was funded by the German Research Foundation (BA 2044/7-2). After agreeing to fund the studies (before any data collection), the German Research Foundation did not influence any aspect of the design of the study; the collection, analysis, and interpretation of the data; or the writing of the manuscript.

Availability of data and materials
Since we are still publishing original articles based on the data from this study, we are not making the data and materials available at the moment.

Ethics approval and consent to participate
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Informed consent was obtained from all individual participants included in the study.

Consent for publication
Not applicable.

Competing interests
The authors declare that they have no conflict of interest.
Received: 15 January 2020 Accepted: 28 January 2021

References

Anthonysamy, L., Koo, A. C., & Hew, S. H. (2020). Self-regulated learning strategies in higher education: Fostering digital literacy for sustainable lifelong learning. Education and Information Technologies, 25(4), 2393–2414. https://doi.org/10.1007/s10639-020-10201-8
Astleitner, H. (1997). Lernen in Informationsnetzen: Theoretische Aspekte und empirische Analysen des Umgangs mit neuen Informationstechnologien aus erziehungswissenschaftlicher Perspektive. Frankfurt/M.: Lang.
Azevedo, R. (2005). Using hypermedia as a metacognitive tool for enhancing student learning? The role of self-regulated learning. Educational Psychologist, 40(4), 199–209. https://doi.org/10.1207/s15326985ep4004_2
Azevedo, R., & Aleven, V. (Eds.) (2013). International handbook of metacognition and learning technologies (Springer international handbooks of education). New York: Springer. https://doi.org/10.1007/978-1-4419-5546-3
Azevedo, R., & Cromley, J. G. (2004). Does training on self-regulated learning facilitate students’ learning with hypermedia? Journal of Educational Psychology, 96(3), 523–535. https://doi.org/10.1037/0022-0663.96.3.523
Azevedo, R., Cromley, J. G., Moos, D. C., Greene, J. A., & Winters, F. I. (2011). Adaptive content and process scaffolding: A key to facilitating students’ self-regulated learning with hypermedia. Psychological Test and Assessment Modeling, 53(1), 106–140.
Azevedo, R., Moos, D. C., Johnson, A. M., & Chauncey, A. D. (2010). Measuring cognitive and metacognitive regulatory processes during hypermedia learning: Issues and challenges. Educational Psychologist, 45(4), 210–223. https://doi.org/10.1080/00461520.2010.515934
Azevedo, R., Taub, M., & Mudrick, N. (2018). Understanding and reasoning about real-time cognitive, affective, and metacognitive processes to foster self-regulation with advanced learning technologies. In D. H. Schunk & J. A. Greene (Eds.), Handbook of self-regulation of learning and performance (2nd ed., pp. 254–270). Routledge.
Bannert, M. (2007). Metakognition beim Lernen mit Hypermedien: Erfassung, Beschreibung und Vermittlung wirksamer metakognitiver Strategien und Regulationsaktivitäten (Zugl.: Koblenz, Univ., Habil.-Schr., 2004; Pädagogische Psychologie und Entwicklungspsychologie: Vol. 61). Waxmann. http://deposit.d-nb.de/cgi-bin/dokserv?id=2993278&prov=M&dok_var=1&dok_ext=htm
Bannert, M. (2009). Promoting self-regulated learning through prompts. Zeitschrift für Pädagogische Psychologie, 23(2), 139–145. https://doi.org/10.1024/1010-0652.23.2.139
Bannert, M., & Mengelkamp, C. (2013). Scaffolding hypermedia learning through metacognitive prompts. In R. Azevedo & V. Aleven (Eds.), International handbook of metacognition and learning technologies (Springer international handbooks of education, Vol. 28, pp. 171–186). New York: Springer. https://doi.org/10.1007/978-1-4419-5546-3_12
Bannert, M., & Reimann, P. (2011). Supporting self-regulated hypermedia learning through prompts. Instructional Science, 40(1), 193–211. https://doi.org/10.1007/s11251-011-9167-4
Bannert, M., Sonnenberg, C., Mengelkamp, C., & Pieger, E. (2015). Short- and long-term effects of students’ self-directed metacognitive prompts on navigation behavior and learning performance. Computers in Human Behavior, 52, 293–306. https://doi.org/10.1016/j.chb.2015.05.038
Belland, B. R., Walker, A. E., Olsen, M. W., & Leary, H. (2015). A pilot meta-analysis of computer-based scaffolding in STEM education. Journal of Educational Technology & Society, 18(1), 183–197.
Berthold, K., Nückles, M., & Renkl, A. (2007). Do learning protocols support learning strategies and outcomes? The role of cognitive and metacognitive prompts. Learning and Instruction, 17(5), 564–577. https://doi.org/10.1016/j.learninstruc.2007.09.007
Bloom, B. S. (1956). Taxonomy of educational objectives. In B. S. Bloom, M. D. Engelhart, E. J. Furst, W. H. Hill, & D. R. Krathwohl (Eds.), Handbook I: Cognitive domain (pp. 20–24). New York, NY: Longmans.
Bouchet, F., Harley, J. M., & Azevedo, R. (2016). Can adaptive pedagogical agents’ prompting strategies improve students’ learning and self-regulation? In A. Micarelli, J. Stamper, & K. Panourgia (Eds.), Lecture notes in computer science: Vol. 9684. Intelligent tutoring systems: 13th international conference, ITS 2016, Zagreb, Croatia, June 7-10, 2016. Proceedings (pp. 368–374). Springer International Publishing. https://doi.org/10.1007/978-3-319-39583-8_43
Bouchet, F., Harley, J. M., & Azevedo, R. (2018). Evaluating adaptive pedagogical agents’ prompting strategies effect on students’ emotions. In R. Nkambou, R. Azevedo, & J. Vassileva (Eds.), Lecture notes in computer science. Intelligent tutoring systems (Vol. 10858, pp. 33–43). Springer International Publishing. https://doi.org/10.1007/978-3-319-91464-0_4
Clarebout, G., & Elen, J. (2006). Tool use in computer-based learning environments: Towards a research framework. Computers in Human Behavior, 22(3), 389–411. https://doi.org/10.1016/j.chb.2004.09.007
Clarebout, G., Elen, J., Collazo, N. A. J., Lust, G., & Jiang, L. (2013). Metacognition and the use of tools. In R. Azevedo & V. Aleven (Eds.), International handbook of metacognition and learning technologies (Vol. 28, pp. 187–195). New York, NY: Springer.
Daumiller, M., & Dresel, M. (2018). Supporting self-regulated learning with digital media using motivational regulation and metacognitive prompts. The Journal of Experimental Education, 87(1), 161–176. https://doi.org/10.1080/00220973.2018.1448744
Delen, E., Liew, J., & Willson, V. (2014). Effects of interactivity and instructional scaffolding on learning: Self-regulation in online video-based environments. Computers & Education, 78, 312–320. https://doi.org/10.1016/j.compedu.2014.06.018
Hilbert, T. S., Nückles, M., Renkl, A., Minarik, C., Reich, A., & Ruhe, K. (2008). Concept mapping zum Lernen aus Texten. Zeitschrift für Pädagogische Psychologie, 22(2), 119–125. https://doi.org/10.1024/1010-0652.22.2.119
Hsu, Y.-S., Wang, C.-Y., & Zhang, W.-X. (2017). Supporting technology-enhanced inquiry through metacognitive and cognitive prompts: Sequential analysis of metacognitive actions in response to mixed prompts. Computers in Human Behavior, 72, 701–712. https://doi.org/10.1016/j.chb.2016.10.004
Jansen, R. S., van Leeuwen, A., Janssen, J., Conijn, R., & Kester, L. (2020). Supporting learners’ self-regulated learning in massive open online courses. Computers & Education, 146, 103771. https://doi.org/10.1016/j.compedu.2019.103771
Kauffman, D. F., Ge, X., Xie, K., & Chen, C.-H. (2008). Prompting in web-based environments: Supporting self-monitoring and problem solving skills in college students. Journal of Educational Computing Research, 38(2), 115–137. https://doi.org/10.2190/EC.38.2.a
Kim, H. J., & Pedersen, S. (2011). Advancing young adolescents’ hypothesis-development performance in a computer-supported and problem-based learning environment. Computers & Education, 57(2), 1780–1789.
Kizilcec, R. F., Pérez-Sanagustín, M., & Maldonado, J. J. (2017). Self-regulated learning strategies predict learner behavior and goal attainment in massive open online courses. Computers & Education, 104, 18–33. https://doi.org/10.1016/j.compedu.2016.10.001
Kramarski, B., & Friedman, S. (2014). Solicited versus unsolicited metacognitive prompts for fostering mathematical problem solving using multimedia. Journal of Educational Computing Research, 50(3), 285–314. https://doi.org/10.2190/EC.50.3.a
Lai, C.-L., & Hwang, G.-J. (2016). A self-regulated flipped classroom approach to improving students’ learning performance in a mathematics course. Computers & Education, 100, 126–140. https://doi.org/10.1016/j.compedu.2016.05.006
Lallé, S., Taub, M., Mudrick, N. V., Conati, C., & Azevedo, R. (2017). The impact of student individual differences and visual attention to pedagogical agents during learning with MetaTutor. In E. André, R. Baker, X. Hu, M. M. T. Rodrigo, & B. Du Boulay (Eds.), Lecture notes in computer science. Artificial intelligence in education (Vol. 10331, pp. 149–161). Springer International Publishing. https://doi.org/10.1007/978-3-319-61425-0_13
Mäeots, M., Siiman, L., Kori, K., Eelmets, M., Pedaste, M., & Anjewierden, A. (2016). The role of a reflection tool in enhancing students’ reflection. In L. Gómez Chova, A. López Martínez, & I. Candel Torres (Eds.), INTED2016 proceedings (pp. 1892–1900). IATED. https://doi.org/10.21125/inted.2016.1394
Marschner, J., Thillmann, H., Wirth, J., & Leutner, D. (2012). Wie lässt sich die Experimentierstrategie-Nutzung fördern? Zeitschrift für Erziehungswissenschaft, 15(1), 77–93. https://doi.org/10.1007/s11618-012-0260-5
Michalke, M. (2015). koRpus (version 0.06-3) [computer software]. http://reaktanz.de/?c=hacking&s=koRpus
Molenaar, I., Roda, C., van Boxtel, C., & Sleegers, P. (2012). Dynamic scaffolding of socially regulated learning in a computer-based learning environment. Computers & Education, 59(2), 515–523. https://doi.org/10.1016/j.compedu.2011.12.006
Moser, S., Zumbach, J., & Deibl, I. (2017). The effect of metacognitive training and prompting on learning success in simulation-based physics learning. Science Education, 101(6), 944–967. https://doi.org/10.1002/sce.21295
Müller, N. M., & Seufert, T. (2018). Effects of self-regulation prompts in hypermedia learning on learning performance and self-efficacy. Learning and Instruction, 58, 1–11. https://doi.org/10.1016/j.learninstruc.2018.04.011
Nückles, M., Schwonke, R., Berthold, K., & Renkl, A. (2004). The use of public learning diaries in blended learning. Journal of Educational Media, 29(1), 49–66. https://doi.org/10.1080/1358165042000186271
Pieger, E., & Bannert, M. (2018). Differential effects of students’ self-directed metacognitive prompts. Computers in Human Behavior, 86, 165–173. https://doi.org/10.1016/j.chb.2018.04.022
Randi, J., & Corno, L. (2000). Teacher innovations in self-regulated learning. In M. Boekaerts, M. Zeidner, & P. R. Pintrich (Eds.), Handbook of self-regulation (pp. 651–685). Academic. https://doi.org/10.1016/B978-012109890-2/50049-4
Renner, B., Prilla, M., Cress, U., & Kimmerle, J. (2016). Effects of prompting in reflective learning tools: Findings from experimental field, lab, and online studies. Frontiers in Psychology, 7, 820. https://doi.org/10.3389/fpsyg.2016.00820
Sambe, G., Bouchet, F., & Labat, J.-M. (2017). Towards a conceptual framework to scaffold self-regulation in a MOOC. In C. M. F. Kebe, A. Gueye, & A. Ndiaye (Eds.), Lecture notes of the Institute for Computer Sciences, social informatics and telecommunications engineering. Innovation and interdisciplinary solutions for underserved areas (Vol. 204, pp. 245–256). Springer International Publishing. https://doi.org/10.1007/978-3-319-72965-7_23
Schunk, D. H., & Zimmerman, B. J. (1998). Self-regulated learning: From teaching to self-reflective practice. Guilford Press. http://www.loc.gov/catdir/bios/guilford051/97046438.html
Schwonke, R., Hauser, S., Nückles, M., & Renkl, A. (2006). Enhancing computer-supported writing of learning protocols by adaptive prompts. Computers in Human Behavior, 22(1), 77–92. https://doi.org/10.1016/j.chb.2005.01.002
Schworm, S., & Gruber, H. (2012). E-learning in universities: Supporting help-seeking processes by instructional prompts. British Journal of Educational Technology, 43(2), 272–281. https://doi.org/10.1111/j.1467-8535.2011.01176.x
Stark, R., & Krause, U.-M. (2009). Effects of reflection prompts on learning outcomes and learning behaviour in statistics education. Learning Environments Research, 12(3), 209–223. https://doi.org/10.1007/s10984-009-9063-x
Stark, R., Tyroller, M., Krause, U.-M., & Mandl, H. (2008). Effekte einer metakognitiven Promptingmaßnahme beim situierten, beispielbasierten Lernen im Bereich Korrelationsrechnung. Zeitschrift für Pädagogische Psychologie, 22(1), 59–71. https://doi.org/10.1024/1010-0652.22.1.59
Thillmann, H., Künsting, J., Wirth, J., & Leutner, D. (2009). Is it merely a question of “what” to prompt or also “when” to prompt? Zeitschrift für Pädagogische Psychologie, 23(2), 105–115. https://doi.org/10.1024/1010-0652.23.2.105
van Alten, D. C. D., Phielix, C., Janssen, J., & Kester, L. (2020). Effects of self-regulated learning prompts in a flipped history classroom. Computers in Human Behavior, 108, 106318. https://doi.org/10.1016/j.chb.2020.106318
van den Boom, G., Paas, F., van Merriënboer, J. J. G., & van Gog, T. (2004). Reflection prompts and tutor feedback in a web-based learning environment: Effects on students’ self-regulated learning competence. Computers in Human Behavior, 20(4), 551–567. https://doi.org/10.1016/j.chb.2003.10.001
Veenman, M. V. (1993). Metacognitive ability and metacognitive skill: Determinants of discovery learning in computerized learning environments. European Journal of Psychology of Education, 29(1), 117–137.
Veenman, M. V. J. (2007). The assessment and instruction of self-regulation in computer-based environments: A discussion. Metacognition and Learning, 2(2-3), 177–183. https://doi.org/10.1007/s11409-007-9017-6
Veenman, M. V. J., van Hout-Wolters, B. H. A. M., & Afflerbach, P. (2006). Metacognition and learning: Conceptual and methodological considerations. Metacognition and Learning, 1(1), 3–14. https://doi.org/10.1007/S11409-006-6893-0
Winters, F. I., Greene, J. A., & Costich, C. M. (2008). Self-regulation of learning within computer-based learning environments: A critical analysis. Educational Psychology Review, 20(4), 429–444. https://doi.org/10.1007/s10648-008-9080-9
Wong, J., Khalil, M., Baars, M., de Koning, B. B., & Paas, F. (2019). Exploring sequences of learner activities in relation to self-regulated learning in a massive open online course. Computers & Education, 140, 103595. https://doi.org/10.1016/j.compedu.2019.103595
Zheng, L. (2016). The effectiveness of self-regulated learning scaffolds on academic performance in computer-based learning environments: A meta-analysis. Asia Pacific Education Review, 17(2), 187–202. https://doi.org/10.1007/s12564-016-9426-9
Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. American Educational Research Journal, 45(1), 166–183. https://doi.org/10.3102/0002831207312909

Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 2 of 21 In an ideal setting, students would be able to engage in self-regulated learning in flexible CBLEs during their journey to achieve learning goals throughout their lifetime. However, many students experience difficulties in adequately self-regulating their learning, particu- larly in complex CBLEs and across multiple settings (e.g., Azevedo, Moos, Johnson, & Chauncey, 2010; Azevedo, Taub, & Mudrick, 2018; Daumiller & Dresel, 2018; Jansen, van Leeuwen, Janssen, Conijn, & Kester, 2020; Pieger & Bannert, 2018). Furthermore, these difficulties are associated with suboptimal learning outcomes (e.g., Azevedo & Cromley, 2004; Bannert, Sonnenberg, Mengelkamp, & Pieger, 2015; Kizilcec, Pérez-Sanagustín, & Maldonado, 2017; Lai & Hwang, 2016). Metacognitive prompts were established as in- structional support to help students better self-regulate their learning and thereby achieve higher learning outcomes (e.g., Bannert, 2009; Hsu et al., 2017). Metacognitive prompts are hints, clues, or questions that target the learners’ metacognition. Evidence suggests that helping students engage in self-regulated learning successfully fosters their learning outcomes. For example, in a relevant meta-analysis, Zheng (2016) found a medium effect size of d = 0.44 of self-regulated learning scaffolds in CBLEs on academic performance compared with learning without support. Thus, the overall helpful effects of metacognitive prompts appear to be undisputed. Nevertheless, the extent of the usefulness of the metacognitive prompts within studies on metacognitive prompting seems to vary (Daumiller & Dresel, 2018;Zheng, 2016). A pos- sible explanation may be that the effects of the metacognitive prompts on learning depend on how the students utilize the offered prompts. To date, the utilization of prompts has scarcely been investigated in the common prompting research. To the best of our know- ledge, only Daumiller and Dresel (2018) thematized prompt use, whereby the authors conceptualized it in terms of the subjectively assessed frequency of generally following the “incitement” of prompts. Accordingly, the focus of the mentioned study was instead on the handling of prompts in terms of “whether” rather than “how.” We sought to close the identified research gap by investigating how students’ prompt utilization, which we define in this study as the “way to deal with each specific prompt”, affect their learning outcome measured directly after learning and, moreover, in a subsequent learning session 3 weeks later. Moreover, we advance the common prompting approach with so-called self-created metacognitive prompts. By helping students to create their own prompts for a learning session in a CBLE, we transfer one of the most interesting opportunities that CBLEs pro- vide to instructional support: the flexible use of learning time and place and a wider range of options in deciding how to learn. Supporting self-regulated learning and performance in computer-based learning environments with metacognitive prompts One approach to facilitating self-regulated learning in CBLEs is metacognitive support, for example, in the form of prompts that induce students’ metacognitive activities during learn- ing. Such activities include but are not limited to orientation, goal specification, planning, monitoring and control, and evaluation strategies (Bannert, 2007; Veenman, 1993). 
Accord- ing to available meta-analyses, prompts show a significant, moderate positive effect on aca- demic performance (Belland, Walker, Olsen, & Leary, 2015;Zheng, 2016). However, there are only a few empirical studies to date that also examine student be- havior, such as the number of page visits in an online learning environment, in addition Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 3 of 21 to more descriptive attributes (e.g., Bannert et al., 2015; Daumiller & Dresel, 2018; Lallé, Taub, Mudrick, Conati, & Azevedo, 2017; Wong, Khalil, Baars, de Koning, & Paas, 2019). Bannert et al. (2015), for example, investigated the navigation behavior of students in regulating their learning to gain insight into whether and how students se- lect pages with relevant content from learning websites. In their study, they found non- linear page selections by students as indicators of strategic navigation behavior (Astleitner, 1997). Moreover, understanding how students use prompts provides deeper insight into the learning effectiveness of prompts (e.g., Bannert et al., 2015; Bannert & Mengelkamp, 2013; Bannert & Reimann, 2011; Moser, Zumbach, & Deibl, 2017). Ban- nert et al. (2015) found, for example, that within an experimental group that received metacognitive prompts, just under half of the students complied with the prompts as intended, and the others were less or not at all compliant. Their analysis showed an im- proved learning outcome regarding transfer performance for compliant students. The relevant research mainly focuses on the effectiveness of prompts with regards to short-term effects in learning outcomes, which means that the learning outcomes are measured at the end of the learning session (e.g., Delen, Liew, & Willson, 2014; Kim & Pedersen, 2011; Moser et al., 2017; Müller & Seufert, 2018; Renner, Prilla, Cress, & Kimmerle, 2016; van Alten, Phielix, Janssen, & Kester, 2020). Only a few studies have analyzed the long-term effects of metacognitive prompts in a second learning session following some days or weeks later. A few studies have found sustainable effects for metacognitive prompts (Stark & Krause, 2009; Stark, Tyroller, Krause, & Mandl, 2008); however, in another study, the effect of metacognitive prompts could not be retained in a follow-up session (Hilbert et al., 2008). Thus far, the long-term effects of prompts have been researched less and the few studies have been conducted in quite varied ways. The classical approach is character- ized by presenting another delayed knowledge test on the same learning topic (s. Table 1). For example, Daumiller and Dresel (2018) supported the students of three experi- mental groups with prompts (only metacognitive prompts vs. only motivational prompts vs. metacognitive and motivational prompts) whereas students in the control group learned without prompts about a topic on psychological research methods. Im- mediately after learning, there was a knowledge test, and the results showed better learning outcomes for all groups of prompted students compared to the nonprompted students. In addition, there was one delayed knowledge test 1 week later and another delayed knowledge text (exam) 10 weeks later on the same learning topic. At least the first of these delayed tests showed better test scores for the prompted groups whereas the second showed better test scores only for the motivational prompts. 
The significant Table 1 Types of long-term effects investigated in the literature Learning session 1 Test 1 Learning session 2 Test 2 Classical approach Experimental Immediate and manipulation delayed knowledge of the prompts test(s) Repeated approach Experimental Immediate Experimental Immediate manipulation knowledge test manipulation knowledge test of the prompts of the prompts Stability approach Experimental Immediate Learning without Immediate manipulation knowledge test prompts knowledge test of the prompts Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 4 of 21 results of the delayed knowledge tests were interpreted as long-term effects (see also the study by Stark & Krause, 2009). In general, this classical approach verifies that the knowledge gained by prompting still exists after a few days and thus measures the long-term effect with regard to knowledge of the content. Another approach uses a repeated design.Müller and Seufert(2018)prompted students of the experimental group in a learning session about empirical research methods and conducted a knowledge test immediately after learning. One week later, they “repeated” the same design by prompting the students of the experimental group again. The learning topic was different; however, the session also addressed the basics in empirical research methods. They found a short-term effect on transfer performance but no long-term ef- fects in the second learning session. With this repeated design, it can be investigated whether the instructional prompts work better with regard to higher learning outcomes in a second learning session using another learning topic under the same learning condi- tions. In general, this repeated design does not investigate the preservation of the prompts after a period of time during which the participants were not exposed to prompts. Instead, the repeated design tests the degree to which prompts affect learning at a particular learn- ing session, in more than one consecutive learning situation. The design by Bannert et al. (2015) was similar with regard to the two learning ses- sions; however, they did not prompt the students in the second learning session to in- vestigate whether the strategies induced by self-directed metacognitive prompts (presented only in the first learning session) would be accessed in another follow-up learning session. They found better navigation behavior and transfer performance not only in the first learning session but also in the second (unsupported) learning session. This stability approach investigates not only the stability of knowledge gain in a similar learning topic but also the stability of the strategy use induced by prompts. In general, this stability approach investigates the long-term effects with regard to self-regulation strategies cued by the metacognitive prompts. For the purpose of our study, we focused on the stability approach to investigate the long-term effects of self-created metacognitive prompts. By doing so, we analyzed whether the positive effects of self-created metacognitive prompts measured directly after learning could be perpetuated in a follow-up learning session without any add- itional instructional support, thus affecting learning beyond the learning session in which the prompt appeared in future learning sessions. 
In summary, the evidence in terms of both understanding the effect of self-created metacognitive prompts on the learning process and on learning outcomes as well as the long-term effects of self- created metacognitive prompts will be investigated in this study. Modifying the design of metacognitive prompts The individual studies mentioned (e.g., Bannert et al., 2015; Müller & Seufert, 2018; Stark & Krause, 2009) as well as the review and meta-analysis (Belland et al., 2015; Zheng, 2016) paint an overall positive picture of the effect of metacognitive prompts; however, the results are not consistent. The inconsistency could be due to the function of prompts—such as differentiating between prompts supporting students’ reflection of learning behavior and prompts supporting students’ cognitive processes—as well as the timing of the support. Both aspects have gained attention in research as potentially Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 5 of 21 influencing the success of prompts (e.g., Azevedo, Cromley, Moos, Greene, & Winters, 2011; Berthold, Nückles, & Renkl, 2007; Kauffman, Ge, Xie, & Chen, 2008; Molenaar, Roda, van Boxtel, & Sleegers, 2012). In most of the studies reported thus far, one type of prompt was implemented in fixed time intervals. Some studies have developed adap- tive prompts that enable adjustments to the content and/or timing of the prompts (e.g., Bouchet, Harley, & Azevedo, 2016; Kramarski & Friedman, 2014; Schwonke, Hauser, Nückles, & Renkl, 2006; Thillmann, Künsting, Wirth, & Leutner, 2009). To summarize this research area, the extent to which the prompts should be adapted to gain the best learning results is still an open question; in particular, the content of the prompts re- quires more research since the time intervals of prompts have been investigated previ- ously. For example, in the experimental study by Bannert et al. (2015), shortly before the learning session, students could personalize the time intervals at which metacogni- tive prompts would appear during the learning phase. Similarly, the prompts in the study by Bouchet et al. (2016) were faded out in different ways, meaning that the time interval between the presentations of prompts was adjusted by the students. However, they found no significant effect of fading on learning gains, as expected. The idea of giving students the ability to create and configure their own prompts is not only driven by the transfer of the flexibility of modern CBLEs directly to their inte- grated support, as mentioned in the introduction, but also by the phenomenon of stu- dents’ poor compliance with instructional tools (e.g., Clarebout & Elen, 2006; Lallé et al., 2017; Schworm & Gruber, 2012), particularly with metacognitive prompts (e.g., Bannert & Mengelkamp, 2013). Students in past studies have complained that the prompts restricted their learning or indicated that they (partly) did not use the prompt in the intended manner (e.g., Bannert et al., 2015). Furthermore, the research shows that giving students the opportunity to influence their own learning is positively corre- lated with students’ development of self-regulated learning (Randi & Corno, 2000). With regard to instructional prompts, however, there is hardly any research to date that addresses adaptable prompts and their effects on learning processes and learning out- comes (Bannert et al., 2015; Kramarski & Friedman, 2014). 
Based on the scarce re- search available, adaptable prompts have been found to support learning (e.g., Bannert et al., 2015). In the study by Bannert et al. (2015), students were able to personalize their metacognitive prompts not only with regard to presentation time as mentioned above but also with regard to the content of the prompts. This means that the sequence in which the reasons for the learning activities had to be carried out for prompting could be chosen freely. As expected, learning with such self-directed metacognitive prompts significantly increased students’ navigation behavior and transfer performance when compared with another group of students learning without prompts. The goal of this study is to further increase the freedom in personalizing one’s own learning sup- port and to investigate the effects of such a greater learner control. Similar to self-directed (personalized) metacognitive prompts, self-created (personal- ized) metacognitive prompts are characterized students’ ability to determine the timing of the prompts’ occurrence themselves before beginning the learning process. Nonethe- less, self-created prompts differ from self-directed prompts in that they give students more freedom with regard to the personalization of their prompts: As implied by their name, self-created metacognitive prompts are written by the students themselves in ad- vance to the learning process, based on one example of a metacognitive prompt (e.g., Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 6 of 21 Müller & Seufert, 2018). In addition, the learners themselves determine the number of learning activities in which they will engage when self-creating their prompts (within the restrictions of the learning setting). By asking students to create their own prompts before the learning session and to re- ceive those prompts during the learning session, we expect them to feel supported without being restricted by prompts that have been imposed on them. In this case, prompts could achieve their aim to support the SRL process and performance (as shown in the research summarized by Zheng, 2016) without being hindered by stu- dents’ reluctance. According to the SRL research, this prompting approach appears to help students overcome production deficits (Azevedo & Cromley, 2004; Marschner, Thillmann, Wirth, & Leutner, 2012; Nückles, Schwonke, Berthold, & Renkl, 2004; Veenman, 2007; Veenman, Van Hout-Wolters, & Afflerbach, 2006). Prompts will in- duce the students’ metacognitive activities that they usually do not execute spontan- eously during learning situations. The prior research in which medium effect sizes were obtained for different types of metacognitive prompts on transfer performance was compared with a control group learning environment without prompts (0.42 < d <0.59, Bannert & Mengelkamp, 2013). The extant research leads us to develop self-created metacognitive prompts: Self- created metacognitive prompts are prompts that students create themselves before they begin to study learning materials. Such prompts allow students to gain even more con- trol over their learning process. Our hypothesis is that self-created prompts are more adapted to the (perceived) needs of individual students and will lead to higher compli- ance with metacognitive prompts during the learning phase. 
Research questions and hypotheses In general, prompting self-regulated learning in CBLEs is successful in supporting short-term learning outcomes (Belland et al., 2015; Winters, Greene, & Costich, 2008; Zheng, 2016). The adaption of prompts has been investigated as an approach for ad- dressing the suboptimal utilization of prompts during learning processes (e.g., Bannert et al., 2015). Thus far, the adaption of prompts has mainly been based on the timing (e.g., Bouchet, Harley, & Azevedo, 2018) or content of the prompts (e.g., Schwonke et al., 2006). The current research investigates prompts created by students themselves, whereby, in addition to the timing of the prompts, the students are able to determine the content of the prompts themselves. Thus, this paper adds to the research in meta- cognitive prompts by investigating prompts that give students more freedom in creat- ing their prompts than any other comparable study has ever done. This study investigates the effect of these self-created prompts on the learning process and learn- ing outcomes as well as on the stability of the potential effects. Moreover, we explore the actual utilization of prompts during the learning process to better understand self- created prompts. Thus, this research poses three research questions: (1) To what extent do self-created prompts affect the learning process, and can the potential effect be maintained long-term? (2) To what extent do self-created prompts affect learning performance, and can the potential effect be maintained long-term? Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 7 of 21 (3) To what extent does prompt utilization influence short- and long-term effects? The first and second research questions are based on prior research showing, on average, the positive effects of prompts on self-regulated learning processes and learn- ing outcomes (e.g., Bannert et al., 2015; Bannert & Reimann, 2011; Winters et al., 2008; Zheng, 2016). Thus, with regard to the first two research questions, we hypothesize that positive effects can also be found when learning with self-created prompts. Accordingly, the hypotheses for the first two research questions are as follows: (1) Self-created prompts facilitate the learning process: Students learning with self- created prompts will visit relevant webpages more frequently, spend more time on relevant webpages, and navigate the learning environment less linearly than students learning without prompts. Moreover, these effects will be maintained long-term. (2) Self-created prompts improve learning performance: Students learning with self- created prompts will increase their recall, comprehension, and transfer perform- ance compared to students learning without prompts. Furthermore, these effects will be maintained long-term. The third research question is also based on research findings suggesting that stu- dents utilize prompts differently, and not always to their advantage, which affects their learning outcome (e.g., Bannert et al., 2015; Bannert & Mengelkamp, 2013; Clarebout & Elen, 2006; Randi & Corno, 2000; Schworm & Gruber, 2012). While the meta-analysis by Zheng (2016) differentiated between different functions of prompts, there has been no prior analysis of prompt utilization with regard to how the function of the prompt was interpreted by the students. 
Hence, prior studies have analyzed either the function of prompts from a different perspective or different aspects of prompt utilization; there- fore, the analysis in this paper is more exploratory and thus nondirected. Thus, this paper is closely related to the current research investigating the short- and log-term effects of different types of prompts (e.g., Bannert et al., 2015; Müller & Seu- fert, 2018). It goes beyond the existing research by implementing prompts that are in- fluenced by the learners themselves to a higher degree than in prior studies, and it investigates how these prompts are utilized by students. Method Sample and design Sixty-six German-speaking undergraduate university students participated in the study. The final sample comprised n = 57 participants (M =19.9 years, SD =1.58; 72%fe- age age male) because we had to eliminate 9 participants due to (a) very poor compliance in par- ticipating in the study and (b) reducing the number of extreme values in prior knowledge by removing two participants with very low prior knowledge (below 4 on a scale from 1 to 25) and three participants with very high prior knowledge (above 15 on a scale from 1 to 25). The experiment was conducted in a between-subject design comprising three ses- sions. As with Bannert et al. (2015), the first session was used to measure learner charac- teristics. Additionally, based on Bannert et al. (2015), the other two sessions (learning session 1 and learning session 2) took place approximately 1 week and 4 weeks later. The Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 8 of 21 Fig. 1 Procedure of the experimental study over three sessions manipulation of the independent variable self-created prompts was undertaken only in the first learning session. The manipulation was implemented with two conditions: the experi- mental condition of learning with self-created prompts (n = 28) and the control condition of learning without prompts (n = 29). The students were randomly assigned to one condi- tion by following a randomized list of the two conditions. Procedure The experiment took place in a laboratory over three sessions, separating the pretests, the first, and the second learning session. The total time of the experiment was ap- proximately 5 h. It resulted from the duration of the tests and further from around 40 min of learning time per learning session, a time that was chosen to represent a typical teaching time in class. Figure 1 presents an overview of the procedure. In the first session, the learner characteristics were measured. Since we found no dif- ference in the learner characteristics between both groups, we did not consider the learning characteristics further. The first learning session was structured into three phases that, in total, lasted ap- proximately 2 h. In the first phase, the participants were introduced to the hypermedia learning environment and randomly assigned to the experimental or control condition as described above. Afterwards, the participants received training that lasted approxi- mately 15 min. 
The content of the training was dependent on the condition: the partic- ipants in the experimental condition received a short introduction to metacognitive prompts and a model of what a metacognitive prompt looks like, and they subsequently created metacognitive prompts for themselves; the participants in the control condition received alternative training (on ergonomics) to ensure an equal workload for the par- ticipants in both conditions. In a second phase, the participants engaged in self- regulated learning in the hypermedia learning environment for 40 min with or without self-created prompts depending on the condition. Afterwards, the participants were trained to think aloud. The participants learned about the basic concepts of operant conditioning and were free to navigate through the hypermedia learning environment Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 9 of 21 as they wished. All page visits as well as their time and duration were recorded in a log file. Additionally, the participants were asked to think aloud during the entire learning task. The think-aloud protocols and the computer screen were recorded in a video file. In a third phase, directly after the learning phase, the recall, comprehension, and trans- fer performance of the participants were measured. The second learning session (stability test) was similar in structure to the first learning session but without the instruction phase, and it lasted approximately 1.5 h. In this ses- sion, the participants learned about the basic concepts of motivation psychology. Unlike the first learning session, during the second learning session, there was no manipula- tion of the independent variable. Every other aspect of the learning phase was similar to the procedure of the learning phase in the first learning session for the control con- dition. Hence, all participants learned in the hypermedia learning environment without prompts for 40 min and were free to navigate as they wished. Again, all page visits as well as their time and duration were recorded in a log file, and the participants were asked to think aloud during the entire learning task. The think-aloud protocols and the computer screen were recorded in a video file. Immediately after learning, the recall, comprehension, and transfer performance of the participants were measured. Learning environment The study included two learning phases: one phase in the first learning session on the topic of operant conditioning and one phase in the second learning session (stability test) on the topic of the psychology of motivation. An analysis of text readability (Michalke, 2015) showed similar levels of difficulty of both learning topics (i.e., the Flesch-Kincaid grade-level score for “learning theories” was 19.01, and that for “psych- ology of motivation” was 19.14). Both topics were presented in similar computer-based hypermedia learning environments. Each of the two learning environments included a page specifying the learning goal, 10 pages of information relevant to the topic specified in the learning goals (including approximately 2300 words as well as 5 pictures and ta- bles); and approximately 40 pages including overviews, summaries, and pages not rele- vant to the learning goals. The hypermedia learning environment was designed to provide external validity to the learning process, which usually takes place in a hyper- media learning context with additional, irrelevant information. 
Moreover, self- regulation and metacognition are important as they help students negotiate learning situations that contain relevant and irrelevant information (among other hurdles). Thus, the students in this study were provided with an externally valid learning experi- ence that made it necessary for them to self-regulate their learning by detecting the relevant learning pages according to their current learning goals. The participants could navigate the learning environment by using a menu bar on the left side of the computer screen, one of 300 hyperlinks, a next-page button and a previous-page button on the top of each page, and the browser buttons (back and forward). Manipulation of the independent variable The manipulation of the independent variable was implemented in two steps during the first learning session. In the first step, the training (before the learning phase) intro- duced the participants of the experimental condition to the prompts and showed them Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 10 of 21 how to create their own prompts to be used during the upcoming learning phase. Add- itionally, the participants could familiarize themselves with the structure of the hyper- media learning environment (with different content) to anticipate which prompts they might need and at what point they would like to be supported by a prompt. At the end of the training phase, directly before the start of the learning phase, the participants created their own prompts. To do so, they were introduced to an example prompt list- ing major metacognitive learning activities. They were completely free in creating their prompts; i.e., they could choose to design prompts that were similar to the example prompt, loosely based on the example prompt, or completely different from the pre- sented example prompt. However, they were asked to base the prompts on the training and thus mainly created metacognitive prompts. Additionally, the participants set time- stamps to determine at which points within the 40-min learning phase they wished to receive the self-created prompt. The participants needed to set a minimum of 3 time- stamps. By designing the self-created prompts in this way, the students could influence how they interacted with the prompts during the learning phase, the prompt itself, and the time of appearance. In the second step, the participants received a think-aloud training and started with the learning in the environment. The prompts were shown to the participants of the experimental condition during the learning phase at the times determined by the par- ticipants. The prompts were presented in a pop-up window displaying the self-created list of metacognitive learning activities; see Fig. 2 for an example. Students were ex- pected to select one or more of the prompted learning activities that they wanted to en- act next, submit their list of selected activity/ies, and then continue learning in the hypermedia learning environment while performing the activity/ies (one after the other) during time following the prompt. The participants in the control condition received a different training that was irrele- vant to the content of the study to ensure a similar work time and workload under both conditions. Specifically, they learned about the major criteria of an ergonomic work- place and how to design their own ergonomic workplace. 
They could also familiarize themselves with the structure of the hypermedia learning environment before the learn- ing phases started by using the same content as that presented to the students of the experimental group. Then, they received a think-aloud training and learned in the same hypermedia learning environment as the participants in the experimental conditions without any support by prompts. Instruments and dependent variables Learning process: navigation behavior To investigate the learning process, we analyzed the recorded log files with regard to the students’ navigation behavior. The log files collected in this study recorded the pages and times of all webpages visited by the participants as well as the duration of the visits. Moreover, the webpages of the learning environment were categorized into relevant and irrelevant pages (see the description of the learning environment). Mean- ingful insights could be derived from our analysis of the systematic navigation behavior of the participants based on three parameters that could be drawn from the informa- tion given in the log files: (1) the relative frequency of relevant page visits (the number Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 11 of 21 Fig. 2 Screenshot of the learning environment including a pop-up window displaying the self-created prompt by a learner. Note: In this example (which is translated from German for this paper), the learner created the prompt by listing four different learning activities before starting the learning session in the learning environment. In the specific prompting situation, s/he chose “I reflect on previous content” as the learning activity to enact next of relevant pages visited divided by the total number of pages visited), (2) the relative time spent on relevant pages (time spent on relevant pages divided by the total learning time), and (3) the frequency of linear navigation steps divided by the total number of navigation steps. The frequency and time spent on relevant webpages provide insight into navigation behavior because they indicate the degree to which the students used self-regulation strategies to reach their learning goals, selected relevant webpages to study, and did not simply follow the progression suggested by the chapters in the hypermedia learning environment. The last parameter is based on the operationaliza- tion of strategic navigation behavior as the number of nonlinear node selections (Astleitner, 1997). Learning performance The learning outcomes were measured on three dimensions with three knowledge tests. The dimensions were based on Bloom’s taxonomy of cognitive learning objectives (Bloom, 1956) and concentrated on the three least complex components: recall, com- prehension, and transfer. Recall was measured by asking the participants to write down the terms and concepts of the topic they studied during the learning phases (operant conditions in learning session 1 and the psychology of motivation in learning session 2). The performance score for recall was determined by counting the correct number of terms and concepts mentioned by the participants. Comprehension was measured with a multiple-choice test consisting of 22 items for the topic of operant condition (learning session 1, α = .73) and 19 items for the topic of psychology of motivation Engelmann et al. Research and Practice in Technology Enhanced Learning (2021) 16:3 Page 12 of 21 (learning session 2, α = .40), each with 1 correct and 3 false response options. 
Learning performance
The learning outcomes were measured on three dimensions with three knowledge tests. The dimensions were based on Bloom's taxonomy of cognitive learning objectives (Bloom, 1956) and concentrated on the three least complex components: recall, comprehension, and transfer. Recall was measured by asking the participants to write down the terms and concepts of the topic they studied during the learning phases (operant conditioning in learning session 1 and the psychology of motivation in learning session 2). The performance score for recall was determined by counting the number of correct terms and concepts mentioned by the participants. Comprehension was measured with a multiple-choice test consisting of 22 items for the topic of operant conditioning (learning session 1, α = .73) and 19 items for the topic of the psychology of motivation (learning session 2, α = .40), each with 1 correct and 3 false response options. Transfer was measured with open questions in which the participants were asked to solve 8 prototypical problems in educational settings that were unknown to the participants and to apply the knowledge of operant conditioning (learning session 1, α = .61) or the psychology of motivation (learning session 2, α = .65). Two research assistants rated each answer independently on a scale from 0 (no answer) to 5 (correct answer) for all data. The interrater reliability was good (learning session 1: Cohen's Kappa = .84; learning session 2: Cohen's Kappa = .74). In case of disagreement between the two raters, an expert of the research team determined the final score.

Prompt utilization
Prompt utilization was measured by coding the think-aloud protocols of the participants in the experimental condition in the first learning session. Because both conditions involved learning without prompts in the second learning session, prompt utilization could be coded only in the first learning session. Coding was completed for each prompt appearance for every participant and then aggregated to yield one score for each participant. The interpretation of each prompt was coded in one of two categories: either (a) a cue to reflect on current metacognitive learning activities or (b) a cue to enact a (potentially new) metacognitive learning activity presented in the prompt. We also included a residuum category that was applied in case the prompt utilization could not be coded or the learner did not react to the prompt at all. This categorization was developed inductively after observing that the self-created prompts were used quite differently: (a) some participants seemed to use the prompts as a reflection tool. In the think-aloud protocols, these participants mentioned which (meta-)cognitive strategies they were currently using and compared them with the metacognitive learning activities listed in their prompt. For example, one participant gave this statement while hovering with her cursor over an activity ("I check whether I am reaching my goal") in the prompt window: "I am checking whether I am reaching my goal… Yeah, I actually did that already. So, yes [ticking that box]." (b) Other participants were reminded by their prompt to enact some of the (meta-)cognitive activities mentioned in the prompt. In the think-aloud protocols, these participants did not mention their current learning strategies, often abandoned what they were doing when the prompt appeared, and verbalized their plan to enact one or more of the activities suggested in the prompt. For example, one participant gave the following statement at the time the prompt window opened: "[reading out loud:] An individual sees… [prompt window opens] Okay, learning activities. I'll reflect on the content learned so far [ticking that box] and afterwards I will get an overview of the material [ticking that box]. So, reflecting… I can say that the learning material is about…" This difference in the interpretation of the prompts (including a third category for unclear cases that were excluded from the analysis) was coded by two independent coders (Cohen's Kappa = .78). Because each participant worked with more than one prompt, a score was aggregated for each participant by calculating the mode of each participant's codes. If the mode for a participant was the residuum category or if both categories were coded the same number of times, then the participant was not included in the analysis regarding prompt utilization.
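The aggregation rule described above can be stated compactly. The following sketch is illustrative only (the category labels and function name are hypothetical); it returns a participant's utilization score or None if the participant must be excluded:

```python
from collections import Counter

REFLECTION, ACTION, RESIDUUM = "reflection_request", "call_to_action", "residuum"

def aggregate_utilization(codes):
    """Aggregate one participant's per-prompt codes into a single score
    by taking the mode; return None (exclude the participant) if the
    mode is the residuum category or if the two substantive categories
    were coded the same number of times."""
    counts = Counter(codes)
    mode_code = counts.most_common(1)[0][0]
    if mode_code == RESIDUUM:
        return None
    if counts[REFLECTION] == counts[ACTION]:
        return None
    return mode_code

print(aggregate_utilization([REFLECTION, ACTION, REFLECTION]))  # reflection_request
print(aggregate_utilization([REFLECTION, ACTION, RESIDUUM]))    # None (tie)
```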
Statistical analysis
The alpha level was set to 5% for all analyses. As the hypotheses for research question 1 and research question 2 were directional hypotheses, we used one-tailed hypothesis testing for the analyses regarding these research questions. Research question 3 was open; therefore, we used two-tailed hypothesis testing for the analyses regarding the third research question. To avoid a multiple comparison problem in interpreting the individual comparisons, we applied a Bonferroni correction. For the first and second research questions, we tested six individual comparisons: three learning process parameters and three performance tests. Thus, we corrected the significance level for the individual comparisons to 0.0083. For the third research question, we applied three comparisons per dataset for the individual measures of performance. Thus, we set the significance level to 0.017 for the individual comparisons.

Results
Effects of self-created prompts on the learning process
To analyze the potential effect of self-created prompts on the learning process, we compared the navigation behavior of the students in the experimental condition to the navigation behavior of the students in the control condition. We hypothesized that self-created prompts would increase the frequency and time the students devote to relevant pages and decrease the linearity with which the hypermedia learning environment is navigated compared with learning without prompts. Furthermore, we hypothesized that these effects would persist in a second learning session (stability test) in which the students under both conditions learned without prompts. Table 2 displays the descriptive values for all navigation parameters in the first and second learning sessions. A multivariate analysis of variance showed that the prompts facilitated navigation behavior in the first learning session with a large effect (F(3, 53) = 6.63, p < .001; Wilks' Λ = 0.73, partial η² = .27). The analysis of the individual parameters showed that the self-created prompts affected only the relative time spent on relevant webpages (see Table 2). We found no overall effect of prompts on the navigation behavior in the second learning session (F(3, 53) = 2.60, p = .062; Wilks' Λ = 0.87, partial η² = .13). In summary, the results partly support our hypotheses regarding our first research question: The self-created metacognitive prompts supported the learning process in the first learning session; however, the effect could not be maintained in the second learning session. The positive effect could be found only for one navigation parameter, i.e., the relative time spent on relevant webpages, an individual effect that was also shown in the second learning session.
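For reference, the corrected significance levels reported above follow the standard Bonferroni rule, and, because each effect in this two-group design has a single degree of freedom, the reported multivariate effect sizes correspond to partial η² = 1 − Λ (a standard identity stated here for clarity, not taken from the original article):

```latex
\alpha_{\text{corrected}} = \frac{\alpha}{m}:\qquad
\frac{0.05}{6} \approx 0.0083 \;(\text{RQ1, RQ2}),\qquad
\frac{0.05}{3} \approx 0.017 \;(\text{RQ3});
\qquad
\eta^2_{\text{partial}} = 1 - \Lambda^{1/s},\; s = 1
\;\Rightarrow\; \eta^2_{\text{partial}} = 1 - \Lambda,
\;\text{e.g., } 1 - 0.73 = 0.27.
```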
Effects of self-created prompts on learning performance
To analyze the potential effect of self-created prompts on learning performance, we analyzed students' recall, comprehension, and transfer performance in the experimental condition and compared their learning outcomes with those of the students in the control condition. We hypothesized that self-created prompts would foster learning performance compared with learning without prompts. Furthermore, we hypothesized that these effects would persist in a second learning session in which the students in both conditions learned without prompts. Table 2 displays the descriptive values for all learning performance parameters in the first and second learning sessions.

Table 2 Descriptive values and inferential statistics for navigation behavior and learning outcomes in learning session 1 and learning session 2

                                                    With prompts (n = 28)   Without prompts (n = 29)
                                                    M (SD)                  M (SD)                     t       p (one-tailed)   d
Learning session 1
  Relative frequency of relevant webpages visited   .41 (.12)               .38 (.13)                  –1.02   .157             0.24
  Relative time spent on relevant webpages*         .62 (.13)               .49 (.16)                  –3.48   <.001            0.92
  Linearity of navigation steps                     .39 (.18)               .47 (.23)                  1.59    .117             0.39
  Recall                                            14.61 (4.95)            12.24 (5.70)               –1.67   .051             0.44
  Comprehension                                     16.93 (2.58)            15.38 (3.68)               –1.85   .036             0.49
  Transfer                                          20.27 (3.74)            21.24 (4.54)               0.88    .191             0.23
Learning session 2 (stability test)
  Relative frequency of relevant webpages visited   .32 (.11)               .32 (.11)                  –0.23   .410             0.00
  Relative time spent on relevant webpages          .58 (.17)               .50 (.16)                  –1.92   .030             0.51
  Linearity of navigation steps                     .38 (.18)               .41 (.17)                  0.76    .226             0.17
  Recall                                            11.00 (3.43)            10.28 (4.07)               –0.73   .236             0.19
  Comprehension                                     13.00 (2.19)            11.70 (2.06)               –1.84   .036             0.49
  Transfer                                          20.66 (5.05)            20.88 (5.39)               0.16    .438             0.04

Note. "With prompts" refers to learning with self-created prompts; "Without prompts" refers to learning without prompts. Students of the experimental group were supported by self-created prompts only in learning session 1; in learning session 2, both groups learned without prompts. The Bonferroni-corrected significance level was 0.0083. Comparisons that reached significance are marked with an asterisk (*).

A multivariate analysis of variance showed a large, significant overall effect of the self-created metacognitive prompts on performance in the first learning session (F(3, 53) = 3.78, p = .016; Wilks' Λ = 0.82, partial η² = .18). However, an analysis of the individual parameters showed no differences in recall, comprehension, or transfer (see Table 2). In the second learning session, the self-created metacognitive prompts did not show an overall effect on performance measures (F(3, 53) = 1.61, p = .198; Wilks' Λ = 0.92, partial η² = .08). In summary, the results do not support our hypotheses regarding our second research question: The self-created metacognitive prompts positively affected the learning performance in the first learning session; however, the effect could not be maintained in the second learning session, and the individual comparisons do not reveal a positive effect of the self-created metacognitive prompts on recall, comprehension, or transfer.
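The individual comparisons in Table 2 are independent-samples t-tests with one-tailed p values evaluated against the Bonferroni-corrected level of .0083. The following is a minimal sketch of such a comparison (our own illustration using scipy; the raw data are not available, and the sign of t and d depends on the group order):

```python
import numpy as np
from scipy import stats

def compare_conditions(with_prompts, without_prompts, alpha_corrected=0.0083):
    """Independent-samples t-test (one-tailed, directional hypothesis that the
    prompt group scores higher) plus Cohen's d based on the pooled SD."""
    a = np.asarray(with_prompts, dtype=float)
    b = np.asarray(without_prompts, dtype=float)
    t, p_two_tailed = stats.ttest_ind(a, b, equal_var=True)
    # Halve the two-tailed p if the effect is in the hypothesized direction.
    p_one_tailed = p_two_tailed / 2 if a.mean() > b.mean() else 1 - p_two_tailed / 2
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) +
                         (len(b) - 1) * b.var(ddof=1)) / (len(a) + len(b) - 2))
    d = (a.mean() - b.mean()) / pooled_sd
    return t, p_one_tailed, d, p_one_tailed < alpha_corrected
```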
Effects of prompt utilization on short- and long-term learning performance
For a deeper understanding of how the self-created prompts could affect or fail to affect the students' learning performance, we analyzed the think-aloud protocols to determine how the prompts were interpreted by the participants during learning. Furthermore, we analyzed whether possible effects could persist in a second learning session in which the participants learned without prompts. Table 3 contains the results of this exploratory analysis. Half of the participants used their prompts to reflect on their current metacognitive learning activities (n = 11), i.e., as a reflection request. The other half of the students used their prompts mainly as a cue to enact one or more prompted metacognitive learning activities (n = 12), i.e., as a call to action without deeper reflection on current learning activities.

Table 3 Descriptive values and inferential statistics for the analysis of prompt utilization regarding learning outcomes in learning session 1 and learning session 2

                   Reflection request (n = 11)   Call to action (n = 12)
                   M (SD)                        M (SD)                    t       p (two-tailed)   d
Learning session 1
  Recall           14.09 (3.78)                  15.17 (6.07)              .50     .619             –0.21
  Comprehension    17.00 (1.79)                  16.00 (3.10)              –.94    .361             0.40
  Transfer         21.95 (1.97)                  18.92 (4.68)              –2.06   .059             0.84
Learning session 2 (stability test)
  Recall           10.09 (3.59)                  11.00 (3.74)              .59     .559             –0.25
  Comprehension    14.09 (1.22)                  12.08 (2.57)              –2.42   .029             1.00
  Transfer*        23.36 (4.21)                  17.79 (3.71)              –3.37   .003             1.50

Note: This analysis includes only students of the experimental condition (learning with self-created prompts). Students of the experimental group were supported by self-created prompts only in learning session 1; in learning session 2, both groups learned without prompts. The Bonferroni-corrected significance level was 0.017. All comparisons that reached significance are marked with an asterisk (*).

Table 3 displays the descriptive results and individual differences in the parameters. There were no clear descriptive differences between the groups in terms of recall and comprehension. The descriptive difference in transfer performance in the first learning session was in favor of the students who interpreted the prompts as a reflection tool. However, a multivariate analysis of variance showed no main effect of the interpretation of the self-created prompts on performance (F(3, 19) = 2.87, p = .064; Wilks' Λ = 0.69, partial η² = .31). The analysis of the second learning session showed a long-term effect of the interpretation of the self-created prompts on learning performance (F(3, 19) = 6.45, p = .003; Wilks' Λ = 0.50, partial η² = .50). In the individual comparisons, a significant main effect with a large effect size was found for transfer performance in favor of the students who interpreted the self-created prompts as a reflection request. In summary, the results regarding the third research question lead us to the assumption that interpreting self-created prompts as a reflection request is more beneficial for learning performance than interpreting them as a cue for action, particularly in fostering long-term transfer performance.

Discussion
Do self-created metacognitive prompts promote short- and long-term effects in computer-based learning environments? The prior evidence supports the claim that metacognitive prompts can generally facilitate learning in CBLEs (e.g., Bannert et al., 2015; Belland et al., 2015; Bouchet et al., 2016; Delen et al., 2014; Kim & Pedersen, 2011; Winters et al., 2008; Zheng, 2016). We closely examined how a specific type of metacognitive prompt, self-created metacognitive prompts, may help students. Moreover, we investigated the effects of self-created prompts not only on students' short-term learning performance but also on students' learning processes, i.e., how students navigate CBLEs and utilize the self-created metacognitive prompts, and how sustainable the possible effects are in a second learning session without prompts. The results of this experimental study allow several conclusions to be drawn regarding the support of students in CBLEs with self-created metacognitive prompts.
As shown by the results, there are mixed beneficial effects of self-created metacognitive prompts on the learning process and learning performance. In addition to the significant effects in this regard, we found medium effect sizes for the relative time spent on relevant webpages in learning session 1 as well as descriptive advantages for comprehension in learning session 1 and learning session 2 (see Table 2). Especially when the prompt use of the students was characterized by deep reflection (reflection request), we found even larger effect sizes for transfer in learning session 2 (see Table 3). In addition, the results are interesting in comparison to studies testing different types of (metacognitive) prompts. In a next step, it would be necessary to test the effect of self-created prompts in empirical investigations compared with other metacognitive prompts. The result pattern of this study differs from that of earlier studies with comparable settings (e.g., Bannert et al., 2015; Bannert & Mengelkamp, 2013; Müller & Seufert, 2018), suggesting that the learning mechanism with self-created metacognitive prompts is perhaps slightly different from that of prompts that students could not adapt (e.g., Müller & Seufert, 2018) or that they could adapt to a lesser degree by setting the timing of the prompts (e.g., Bannert et al., 2015). In the prior studies, setting the timing of the prompts caused students to navigate to relevant webpages more often and for longer periods and increased the students' transfer performance. In this study, the self-created prompts similarly caused students to stay longer on relevant webpages. However, in contrast to similar work (e.g., Bannert, 2007; Bannert et al., 2015; Bannert & Mengelkamp, 2013), the self-created prompts in general were not shown to facilitate transfer performance. Thus, the freedom to manipulate the content of the prompts themselves leads to a difference in the pattern of the results: the students in this study regulated their learning to a lesser degree based on the learning goals, and their performance suffered compared to that of the participants in studies that gave the students less freedom in designing their learning process.

We based the design of this study on the general assumption that metacognitive prompts enhance the process of self-regulation and thus induce metacognitive activities and initiate deeper processing of information that is important for performance and mainly relevant for transfer performance. The results of this study allow us to hypothesize that the self-created metacognitive prompts might not have been sufficiently targeted to facilitate this process. The self-created prompts were intended to improve poor-to-mixed compliance (Bannert et al., 2015; Bannert & Mengelkamp, 2013; Clarebout & Elen, 2006; Randi & Corno, 2000; Schworm & Gruber, 2012). Our results could suggest that the students might not have been sufficiently knowledgeable to create prompts that would best fit their learning process. While the students were introduced to the creation of prompts, they could not possibly have the same prior knowledge as the researchers who created the prompts in most other studies. Thus, the self-created prompts may not have allowed the students to reach their full potential. In future studies, a more in-depth training session should be given so that students better understand how they are going to study and which prompts would be helpful.
Another approach for further investigation would be to allow the students to create or manipulate their metacognitive prompts while they engage with the learning materials. An alternative explanation for the inconclusive results could lie in the content of the prompts. While the metacognitive prompts created by experts and the self-directed prompts target metacognitive processes, the prompts created by the students might (also) target other learning processes such as cognitive activities. The advantage of purely metacognitive prompts over alternative or mixed prompts is consistent with meta-analytic results showing that only metacognitive prompts significantly foster performance compared with conceptual, strategic, or multiple prompts (Zheng, 2016). While we aimed to investigate self-created prompts by giving more freedom to the students, this approach also allows less control over the content of the prompts in this experiment, which is a major limitation of the study and causes our conclusions to be limited to the intended effect of the self-created prompts and, to a lesser degree, to the actual behavior of the students (except for the coded interpretation of the prompts, i.e., students' prompt utilization). The research indicates that some students do not utilize prompts as intended (e.g., Clarebout & Elen, 2006; Clarebout, Elen, Collazo, Lust, & Jiang, 2013; Moser et al., 2017; Schworm & Gruber, 2012), which could influence their learning outcomes (e.g., Bannert et al., 2015). Our rather explorative analysis investigated the utilization of prompts regarding the interpretation of the self-created prompts either as a cue to reflect on current cognitive and metacognitive activities or as a cue to enact a learning activity suggested by the self-created prompt. The results show a trend suggesting the benefit of interpreting the prompts as a cue to reflect on one's own learning compared with using the prompts mainly as a starting cue or as a cue for new learning activities. Here, such reflection seems to lead to higher transfer performance and a strong effect in the second learning session. This result could support our assumption that self-created prompts indeed address a production deficit (Marschner et al., 2012; Veenman, 2007; Veenman et al., 2006), thus helping students to reflect upon their learning activities and subsequently improve the self-regulated learning process; however, this finding applies only to some students. The students who interpreted the prompts as a call to action might not understand their own learning process well enough to reflect on their current cognitive and metacognitive activities. If this was the case, there was no production deficit (i.e., students possess the strategies but do not apply them spontaneously) to be addressed by the self-created prompts but rather a mediation deficit (i.e., students do not have the strategies) that cannot be addressed by this kind of prompt alone (Bannert, 2007; Veenman, 1993). This analysis might have detected that the underlying assumption of using metacognitive prompts to facilitate the self-regulation process applies to only a subgroup of students: those who are affected by a production deficit (Marschner et al., 2012; Veenman, 2007; Veenman et al., 2006).
The benefit of interpreting the self-created prompts as a reflection aid seems to contradict prior studies and meta-analyses (Mäeots et al., 2016; van den Boom, Paas, van Merriënboer, & van Gog, 2004; Zheng, 2016) suggesting that reflection prompts are inferior to metacognitive prompts. However, these studies investigated only the intended design of the researchers and not the interpretation of the prompts by the students. The self-created prompts investigated in this study would probably be categorized as metacognitive prompts and not as reflection prompts. However, as discussed before, the students did not always interpret the prompts as such. As assumed above, this interpretation is probably dependent on prior knowledge of self-regulation strategies. However, this interpretation is limited because of the exploratory nature of this work. The pattern found here should be tested directly in future confirmatory studies.

One important limitation of this study is the design, which compares learning with self-created prompts to learning without prompts. This design was employed to take a first step in investigating whether self-created prompts would be at all beneficial. For a closer examination of the mechanism behind different types of prompts, the design would need to include at least one more condition in which the prompts were not self-created by the students. Furthermore, the study is limited by two methodological constraints. The training before the learning phase and the learning phase itself took only approximately 10 min and 40 min, respectively. Thus, the time was comparable to one lesson in school or university but did not include a longer period in which SRL processes could develop further; we must therefore consider that the SRL process might change over time and that this development could not be shown in this paper. In addition, the Cronbach's Alpha level for the comprehension test in the second learning session was rather low (0.40), indicating a low internal consistency of this knowledge test. All other performance tests show higher values for the internal consistencies of the specific scales, which we consider valid for testing the effect of the intervention.

Conclusion
This study expands our understanding of the support prompts can give to students in CBLEs by showing the partial effects of self-created prompts on the learning process. Furthermore, the study went beyond prior efforts to analyze the impact of different prompts on learning (Zheng, 2016) by analyzing how differently the prompts are utilized and the effect of this different utilization on learning. Based on the results of this study, we can recommend paying closer attention to students' utilization of self-created prompts: what are the students doing with the prompts, and are they utilizing the prompts as the researchers intended? While the self-creation of prompts might help students to design prompts for their anticipated needs, this practice also introduces new difficulties that might hinder learning, such as students' limited knowledge of the support they will need in a subsequent learning session. Moreover, we must further investigate the interrelationship between increasingly involving students in creating their own prompts and the differences in the utilization of prompts, as well as the effect of this factor on the learning process and performance outcomes.
The results warrant careful recommendations for the design and implementation of prompts in a learning environment. The study cannot directly support the prior recommendation (e.g., Azevedo, 2005; Bannert et al., 2015; Belland et al., 2015; Zheng, 2016) to design prompts to target metacognitive learning processes. However, it might be beneficial for students to be asked to reflect upon their current learning activities, as our exploratory results suggest that including reflection in the utilization of self-created prompts affects transfer performance, even more so in long-term performance. In conclusion, we were able to show that self-created prompts partly facilitate the learning process in CBLEs. However, due to the different outcome patterns compared to similar studies in which the students did not create the prompts themselves (e.g., Bannert et al., 2015; Bouchet et al., 2018; Daumiller & Dresel, 2018), we conclude that the involvement of students in creating prompts may influence the way in which the prompts support the learning process. Similarly, our results also show that the utilization of prompts affects the performance of students. While prompts that aim to support reflection did not yield a positive effect from a meta-analytic perspective (Zheng, 2016), self-created metacognitive prompts that are utilized to reflect upon current learning activities led to distinctly better transfer learning in the students of this study. Thus, this study contributes to the body of literature investigating how students can be supported to enhance the learning process and performance outcomes when learning in CBLEs.

Acknowledgements
We would like to thank Dr. Christoph Sonnenberg for his invaluable contributions in collecting, analyzing, and discussing the data as well as the student assistants Anna Horrer, Stefanie Beck, and Veronika Danner for their help in collecting the data, coding the data, and preparing the manuscript. This research was funded by the German Research Foundation (BA 2044/7-2).

Authors' contributions
Katharina Engelmann: Analysis of the literature, analysis of the data, and writing the manuscript. Maria Bannert: Analysis of the literature, design, preparation and collection of the data, analysis of the data, and writing the manuscript. Nadine Melzner: Analysis of the literature and writing the manuscript. The authors read and approved the final manuscript.

Funding
This research was funded by the German Research Foundation (BA 2044/7-2). After agreeing to fund the studies (before any data collection), the German Research Foundation did not influence any aspect of the design of the study, the collection, analysis, and interpretation of data, or the writing of the manuscript.

Availability of data and materials
Since we are still publishing original articles based on the data from this study, we are not making the data and materials available at the moment.

Ethics approval and consent to participate
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Informed consent was obtained from all individual participants included in the study.

Consent for publication
Not applicable.

Competing interests
The authors declare that they have no conflict of interest.
Received: 15 January 2020 Accepted: 28 January 2021

References
Anthonysamy, L., Koo, A. C., & Hew, S. H. (2020). Self-regulated learning strategies in higher education: Fostering digital literacy for sustainable lifelong learning. Education and Information Technologies, 25(4), 2393–2414. https://doi.org/10.1007/s10639-020-10201-8
Astleitner, H. (1997). Lernen in Informationsnetzen: Theoretische Aspekte und empirische Analysen des Umgangs mit neuen Informationstechnologien aus erziehungswissenschaftlicher Perspektive. Frankfurt/M.: Lang.
Azevedo, R. (2005). Using hypermedia as a metacognitive tool for enhancing student learning? The role of self-regulated learning. Educational Psychologist, 40(4), 199–209. https://doi.org/10.1207/s15326985ep4004_2
Azevedo, R., & Aleven, V. (Eds.) (2013). International handbook of metacognition and learning technologies (Springer international handbooks of education). New York: Springer. https://doi.org/10.1007/978-1-4419-5546-3
Azevedo, R., & Cromley, J. G. (2004). Does training on self-regulated learning facilitate students' learning with hypermedia? Journal of Educational Psychology, 96(3), 523–535. https://doi.org/10.1037/0022-0663.96.3.523
Azevedo, R., Cromley, J. G., Moos, D. C., Greene, J. A., & Winters, F. I. (2011). Adaptive content and process scaffolding: A key to facilitating students' self-regulated learning with hypermedia. Psychological Test and Assessment Modeling, 53(1), 106–140.
Azevedo, R., Moos, D. C., Johnson, A. M. M. Y., & Chauncey, A. D. (2010). Measuring cognitive and metacognitive regulatory processes during hypermedia learning: Issues and challenges. Educational Psychologist, 45(4), 210–223. https://doi.org/10.1080/00461520.2010.515934
Azevedo, R., Taub, M., & Mudrick, N. (2018). Understanding and reasoning about real-time cognitive, affective, and metacognitive processes to foster self-regulation with advanced learning technologies. In D. H. Schunk, & J. A. Greene (Eds.), Handbook of self-regulation of learning and performance (2nd ed., pp. 254–270). Routledge.
Bannert, M. (2007). Metakognition beim Lernen mit Hypermedien: Erfassung, Beschreibung und Vermittlung wirksamer metakognitiver Strategien und Regulationsaktivitäten. Zugl.: Koblenz, Univ., Habil.-Schr., 2004. Pädagogische Psychologie und Entwicklungspsychologie: Vol. 61. Waxmann. http://deposit.d-nb.de/cgi-bin/dokserv?id=2993278&prov=M&dok_var=1&dok_ext=htm
Bannert, M. (2009). Promoting self-regulated learning through prompts. Zeitschrift für Pädagogische Psychologie, 23(2), 139–145. https://doi.org/10.1024/1010-0652.23.2.139
Bannert, M., & Mengelkamp, C. (2013). Scaffolding hypermedia learning through metacognitive prompts. In R. Azevedo, & V. Aleven (Eds.), International handbook of metacognition and learning technologies (Springer international handbooks of education, vol. 28, pp. 171–186). New York: Springer. https://doi.org/10.1007/978-1-4419-5546-3_12
Bannert, M., & Reimann, P. (2011). Supporting self-regulated hypermedia learning through prompts. Instructional Science, 40(1), 193–211. https://doi.org/10.1007/s11251-011-9167-4
Bannert, M., Sonnenberg, C., Mengelkamp, C., & Pieger, E. (2015). Short- and long-term effects of students' self-directed metacognitive prompts on navigation behavior and learning performance. Computers in Human Behavior, 52, 293–306. https://doi.org/10.1016/j.chb.2015.05.038
Belland, B. R., Walker, A. E., Olsen, M. W., & Leary, H. (2015). A pilot meta-analysis of computer-based scaffolding in STEM education. Journal of Educational Technology & Society, 18(1), 183–197.
Berthold, K., Nückles, M., & Renkl, A. (2007). Do learning protocols support learning strategies and outcomes? The role of cognitive and metacognitive prompts. Learning and Instruction, 17(5), 564–577. https://doi.org/10.1016/j.learninstruc.2007.09.007
Bloom, B. S. (1956). Taxonomy of educational objectives. In B. S. Bloom, M. D. Engelhart, E. J. Furst, W. H. Hill, & D. R. Krathwohl (Eds.), Handbook I: Cognitive domain (pp. 20–24). New York, NY: Longmans.
Bouchet, F., Harley, J. M., & Azevedo, R. (2016). Can adaptive pedagogical agents' prompting strategies improve students' learning and self-regulation? In A. Micarelli, J. Stamper, & K. Panourgia (Eds.), Intelligent tutoring systems: 13th international conference, ITS 2016, Zagreb, Croatia, June 7-10, 2016, Proceedings (Lecture notes in computer science, vol. 9684, pp. 368–374). Springer International Publishing. https://doi.org/10.1007/978-3-319-39583-8_43
Bouchet, F., Harley, J. M., & Azevedo, R. (2018). Evaluating adaptive pedagogical agents' prompting strategies effect on students' emotions. In R. Nkambou, R. Azevedo, & J. Vassileva (Eds.), Intelligent tutoring systems (Lecture notes in computer science, vol. 10858, pp. 33–43). Springer International Publishing. https://doi.org/10.1007/978-3-319-91464-0_4
Clarebout, G., & Elen, J. (2006). Tool use in computer-based learning environments: Towards a research framework. Computers in Human Behavior, 22(3), 389–411. https://doi.org/10.1016/j.chb.2004.09.007
Clarebout, G., Elen, J., Collazo, N. A. J., Lust, G., & Jiang, L. (2013). Metacognition and the use of tools. In R. Azevedo & V. Aleven (Eds.), International handbook of metacognition and learning technologies (Vol. 28, pp. 187–195). New York, NY: Springer.
Daumiller, M., & Dresel, M. (2018). Supporting self-regulated learning with digital media using motivational regulation and metacognitive prompts. The Journal of Experimental Education, 87(1), 161–176. https://doi.org/10.1080/00220973.2018.1448744
Delen, E., Liew, J., & Willson, V. (2014). Effects of interactivity and instructional scaffolding on learning: Self-regulation in online video-based environments. Computers & Education, 78, 312–320. https://doi.org/10.1016/j.compedu.2014.06.018
Hilbert, T. S., Nückles, M., Renkl, A., Minarik, C., Reich, A., & Ruhe, K. (2008). Concept mapping zum Lernen aus Texten. Zeitschrift für Pädagogische Psychologie, 22(2), 119–125. https://doi.org/10.1024/1010-0652.22.2.119
Hsu, Y.-S., Wang, C.-Y., & Zhang, W.-X. (2017). Supporting technology-enhanced inquiry through metacognitive and cognitive prompts: Sequential analysis of metacognitive actions in response to mixed prompts. Computers in Human Behavior, 72, 701–712. https://doi.org/10.1016/j.chb.2016.10.004
Jansen, R. S., van Leeuwen, A., Janssen, J., Conijn, R., & Kester, L. (2020). Supporting learners' self-regulated learning in massive open online courses. Computers & Education, 146, 103771. https://doi.org/10.1016/j.compedu.2019.103771
Kauffman, D. F., Ge, X., Xie, K., & Chen, C.-H. (2008). Prompting in web-based environments: Supporting self-monitoring and problem solving skills in college students. Journal of Educational Computing Research, 38(2), 115–137. https://doi.org/10.2190/EC.38.2.a
Kim, H. J., & Pedersen, S. (2011). Advancing young adolescents' hypothesis-development performance in a computer-supported and problem-based learning environment. Computers & Education, 57(2), 1780–1789.
Kizilcec, R. F., Pérez-Sanagustín, M., & Maldonado, J. J. (2017). Self-regulated learning strategies predict learner behavior and goal attainment in massive open online courses. Computers & Education, 104, 18–33. https://doi.org/10.1016/j.compedu.2016.10.001
Kramarski, B., & Friedman, S. (2014). Solicited versus unsolicited metacognitive prompts for fostering mathematical problem solving using multimedia. Journal of Educational Computing Research, 50(3), 285–314. https://doi.org/10.2190/EC.50.3.a
Lai, C.-L., & Hwang, G.-J. (2016). A self-regulated flipped classroom approach to improving students' learning performance in a mathematics course. Computers & Education, 100, 126–140. https://doi.org/10.1016/j.compedu.2016.05.006
Lallé, S., Taub, M., Mudrick, N. V., Conati, C., & Azevedo, R. (2017). The impact of student individual differences and visual attention to pedagogical agents during learning with MetaTutor. In E. André, R. Baker, X. Hu, M. M. T. Rodrigo, & B. Du Boulay (Eds.), Artificial intelligence in education (Lecture notes in computer science, vol. 10331, pp. 149–161). Springer International Publishing. https://doi.org/10.1007/978-3-319-61425-0_13
Mäeots, M., Siiman, L., Kori, K., Eelmets, M., Pedaste, M., & Anjewierden, A. (2016). The role of a reflection tool in enhancing students' reflection. In L. Gómez Chova, A. López Martínez, & I. Candel Torres (Eds.), INTED2016 Proceedings (pp. 1892–1900). IATED. https://doi.org/10.21125/inted.2016.1394
Marschner, J., Thillmann, H., Wirth, J., & Leutner, D. (2012). Wie lässt sich die Experimentierstrategie-Nutzung fördern? Zeitschrift für Erziehungswissenschaft, 15(1), 77–93. https://doi.org/10.1007/s11618-012-0260-5
Michalke, M. (2015). koRpus (version 0.06-3) [computer software]. http://reaktanz.de/?c=hacking&s=koRpus
Molenaar, I., Roda, C., van Boxtel, C., & Sleegers, P. (2012). Dynamic scaffolding of socially regulated learning in a computer-based learning environment. Computers & Education, 59(2), 515–523. https://doi.org/10.1016/j.compedu.2011.12.006
Moser, S., Zumbach, J., & Deibl, I. (2017). The effect of metacognitive training and prompting on learning success in simulation-based physics learning. Science Education, 101(6), 944–967. https://doi.org/10.1002/sce.21295
Müller, N. M., & Seufert, T. (2018). Effects of self-regulation prompts in hypermedia learning on learning performance and self-efficacy. Learning and Instruction, 58, 1–11. https://doi.org/10.1016/j.learninstruc.2018.04.011
Nückles, M., Schwonke, R., Berthold, K., & Renkl, A. (2004). The use of public learning diaries in blended learning. Journal of Educational Media, 29(1), 49–66. https://doi.org/10.1080/1358165042000186271
Pieger, E., & Bannert, M. (2018). Differential effects of students' self-directed metacognitive prompts. Computers in Human Behavior, 86, 165–173. https://doi.org/10.1016/j.chb.2018.04.022
Randi, J., & Corno, L. (2000). Teacher innovations in self-regulated learning. In M. Boekaerts, M. Zeidner, & P. R. Pintrich (Eds.), Handbook of self-regulation (pp. 651–685). Academic Press. https://doi.org/10.1016/B978-012109890-2/50049-4
Renner, B., Prilla, M., Cress, U., & Kimmerle, J. (2016). Effects of prompting in reflective learning tools: Findings from experimental field, lab, and online studies. Frontiers in Psychology, 7, 820. https://doi.org/10.3389/fpsyg.2016.00820
Sambe, G., Bouchet, F., & Labat, J.-M. (2017). Towards a conceptual framework to scaffold self-regulation in a MOOC. In C. M. F. Kebe, A. Gueye, & A. Ndiaye (Eds.), Innovation and interdisciplinary solutions for underserved areas (Lecture notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol. 204, pp. 245–256). Springer International Publishing. https://doi.org/10.1007/978-3-319-72965-7_23
Schunk, D. H., & Zimmerman, B. J. (1998). Self-regulated learning: From teaching to self-reflective practice. Guilford Press. http://www.loc.gov/catdir/bios/guilford051/97046438.html
Schwonke, R., Hauser, S., Nückles, M., & Renkl, A. (2006). Enhancing computer-supported writing of learning protocols by adaptive prompts. Computers in Human Behavior, 22(1), 77–92. https://doi.org/10.1016/j.chb.2005.01.002
Schworm, S., & Gruber, H. (2012). E-learning in universities: Supporting help-seeking processes by instructional prompts. British Journal of Educational Technology, 43(2), 272–281. https://doi.org/10.1111/j.1467-8535.2011.01176.x
Stark, R., & Krause, U.-M. (2009). Effects of reflection prompts on learning outcomes and learning behaviour in statistics education. Learning Environments Research, 12(3), 209–223. https://doi.org/10.1007/s10984-009-9063-x
Stark, R., Tyroller, M., Krause, U.-M., & Mandl, H. (2008). Effekte einer metakognitiven Promptingmaßnahme beim situierten, beispielbasierten Lernen im Bereich Korrelationsrechnung. Zeitschrift für Pädagogische Psychologie, 22(1), 59–71. https://doi.org/10.1024/1010-0652.22.1.59
Thillmann, H., Künsting, J., Wirth, J., & Leutner, D. (2009). Is it merely a question of "what" to prompt or also "when" to prompt? Zeitschrift für Pädagogische Psychologie, 23(2), 105–115. https://doi.org/10.1024/1010-0652.23.2.105
van Alten, D. C. D., Phielix, C., Janssen, J., & Kester, L. (2020). Effects of self-regulated learning prompts in a flipped history classroom. Computers in Human Behavior, 108, 106318. https://doi.org/10.1016/j.chb.2020.106318
van den Boom, G., Paas, F., van Merriënboer, J. J. G., & van Gog, T. (2004). Reflection prompts and tutor feedback in a web-based learning environment: Effects on students' self-regulated learning competence. Computers in Human Behavior, 20(4), 551–567. https://doi.org/10.1016/j.chb.2003.10.001
Veenman, M. V. (1993). Metacognitive ability and metacognitive skill: Determinants of discovery learning in computerized learning environments. European Journal of Psychology of Education, 29(1), 117–137.
Veenman, M. V. J. (2007). The assessment and instruction of self-regulation in computer-based environments: A discussion. Metacognition and Learning, 2(2-3), 177–183. https://doi.org/10.1007/s11409-007-9017-6
Veenman, M. V. J., van Hout-Wolters, B. H. A. M., & Afflerbach, P. (2006). Metacognition and learning: Conceptual and methodological considerations. Metacognition and Learning, 1(1), 3–14. https://doi.org/10.1007/S11409-006-6893-0
Winters, F. I., Greene, J. A., & Costich, C. M. (2008). Self-regulation of learning within computer-based learning environments: A critical analysis. Educational Psychology Review, 20(4), 429–444. https://doi.org/10.1007/s10648-008-9080-9
Wong, J., Khalil, M., Baars, M., de Koning, B. B., & Paas, F. (2019). Exploring sequences of learner activities in relation to self-regulated learning in a massive open online course. Computers & Education, 140, 103595. https://doi.org/10.1016/j.compedu.2019.103595
Zheng, L. (2016). The effectiveness of self-regulated learning scaffolds on academic performance in computer-based learning environments: A meta-analysis. Asia Pacific Education Review, 17(2), 187–202. https://doi.org/10.1007/s12564-016-9426-9
Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. American Educational Research Journal, 45(1), 166–183. https://doi.org/10.3102/0002831207312909

Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
