Abstract Collaborative problem solving (CPS) is an essential soft skill that should be fostered from a young age. Research shows that a good way of teaching such skills is through video games; however, the success and viability of this method may be affected by the technological platform used. In this work we propose a gameful approach to train CPS skills in the form of the CPSbot framework and describe a study involving 80 primary school children on user experience and acceptance of a game, Quizbot, using three different technological platforms: two purely digital (tabletop and handheld tablets) and another based on tangible interfaces and physical spaces. The results show that physical spaces proved to be more effective than the screen-based platforms in several ways, as well as being considered more fun and easier to use by the children. Finally, we propose a set of design considerations for future gameful CPS systems based on the observations made during this study. RESEARCH HIGHLIGHTS Collaborative problem solving (CPS) is a valuable skill that should be fostered from a young age. Games are a successful way of training CPS, but the platform used may affect their effectiveness. We present a framework and a game implemented to foster CPS, and compare the game's acceptance and user experience with 80 primary school students in three different implementations: tabletops, tablets and physical spaces. Physical spaces are perceived as easier and more fun than screen-based sedentary activities, and they are reported as the most desirable to use again both inside and outside the educational context. They may also provide additional benefits for CPS enhancement in comparison with purely digital platforms, especially where planning and organization are concerned. 1. 
INTRODUCTION Modern educational approaches consider the development of communication, teamwork, adaptability and problem solving key elements to include in current curricula (Greenberg and Nilssen, 2015; Pachauri and Yadav, 2014; UNESCO, 2017). Collaborative problem solving (CPS) emerges as the combination of such skills. The OECD in the PISA 2015 report (OECD, 2013), after revising and discussing more than 150 works in the field, defines CPS as ‘the capacity of an individual to effectively engage in a process whereby two or more agents attempt to solve a problem by sharing the understanding and effort required to come to a solution and pooling their knowledge, skills and efforts to reach that solution’. According to this report, CPS involves four cognitive processes or skills that have to be trained: exploring and understanding, representing and formulating, planning and executing, and monitoring and reflecting. In addition, Vygotsky’s Social Development Theory (1978) implies that a person’s potential can only be achieved through interaction with and support from other, ideally more capable, people and various tools. This is based on the idea that when trying to solve a problem, the exchange of ideas could lead to a shared understanding that an individual cannot achieve alone. Gokhale (1995) also highlights the fact that ‘the active exchange of ideas within small groups not only increases interest among the participants but also promotes critical thinking’. This leads to the conclusion that focusing on developing a person’s individual problem solving skills is not enough and it is now essential to have a certain level of proficiency in CPS. Adding an element of play to the learning process has been proven to be a natural and successful way of improving the effectiveness of learning seeing as human culture is generated at least partially through play (Huizinga, 1949). 
With the aid of technology, educational games (or serious games) can be created to help develop skills like CPS through play and offer instant feedback and interactivity in a game-based learning environment. Educational games are designed to teach people about certain subjects, expand concepts, reinforce development, or help them learn or improve a skill (Dempsey et al., 1996) and they have been shown to have many cognitive, motivational, emotional and social benefits (Granic et al., 2014; Wouters et al., 2013). Many technological games designed to foster CPS rely on digital tabletops, often considered adequate for collaborative learning activities because of their public display, which enhances workspace awareness (Gutwin and Greenberg, 1998, 2002) and in turn improves collaboration. These devices, however, are seldom used in actual educational settings, mostly due to their high cost or their form factor, which hinders their mobility. Handheld devices, on the other hand, are becoming more and more popular in these settings. ‘Once seen as a distraction in the classroom, mobiles are now a powerful tool for advancing learning’ (The New Media Consortium (NMC) and Consortium for School Networking (CoSN), 2017). Also, according to the latest NMC Horizon Report (2017), ‘the global market for mobile learning is predicted to grow by 36% annually, increasing from $7.98 billion in 2015 to $37.6 billion by 2020’. However, interaction with these devices is limited mostly to touch contacts on the small screen area (Garcia-Sanjuan et al., 2016a). In contrast, several authors (Antle et al., 2009; Schneider et al., 2012; Xie et al., 2008) propose mediations via tangible objects, which have been identified as suitable and interesting for designing learning activities for children (Strawhacker and Bers, 2014). 
Despite all of their different advantages, to our knowledge, no comparative studies have been made on which of these platforms is best experienced by children in the context of CPS learning. Quality of experience (Alben, 1996; Hassenzahl and Tractinsky, 2006) is a key factor because, as has been shown in other technological and learning contexts (Bargshady et al., 2015; Pindeh et al., 2016; Tan et al., 2016), the usefulness, ease of use and fun perceived by children influence their attitude towards using the learning application and the effectiveness of the learning process. In this context, this work explores the potential of a gamified approach based on multi-surface environments (Garcia-Sanjuan et al., 2015, 2016c, 2016d) and tangible interactions in physical spaces to stimulate and foster skills such as communication, negotiation and teamwork which, as indicated above, facilitate the collaborative resolution of problems. To our knowledge, no other previous work has compared traditional technological approaches for CPS with those based on multi-surfaces and tangible interactions in physical spaces to evaluate whether the quality of experience (Alben, 1996) is enhanced by these new technological forms. In this respect, the contributions of this work are manifold. First, the design of the CPSbot framework, a gameful approach to provide the students with an environment that stimulates and fosters skills such as communication, negotiation and teamwork. Secondly, a study of a specific game (Quizbot) with 80 primary school students on three different technological platforms (a tabletop, tablets and physical spaces). This study reveals that our approach based on physical spaces is perceived to be easier and more fun than the screen-based sedentary ones, and that it is the platform the participants most want to use again, in both in-class and out-of-class scenarios. 
Additionally, we consider how the framework can support CPS backed by observations made during the study, suggesting that physical spaces may provide more benefits for CPS enhancement than purely digital platforms, especially where planning and organization are concerned. Finally, we provide a list of considerations for designers of future gameful CPS systems. 2. RELATED WORKS Problem solving skills are highly valued and therefore there are many studies on nurturing and enhancing these skills. In this section, we review some of these works, grouping them by the technology they employ and the quality of experience it enables. 2.1. Single display multi-touch environments The traditional desktop computer is a known and reliable medium very often used for educational games (Brayshaw and Gordon, 2016; Hatzilygeroudis et al., 2012; Liao and Shen, 2012; Raman et al., 2014; Siang and Rao, 2003). However, several studies show that primary-school children between 6 and 13 years of age find it difficult to use a mouse and keyboard (Berkovitz, 1994; Donker and Reitsma, 2007; Strommen, 1994), whereas others reveal newer multi-touch technologies to be more intuitive and usable, even for children in kindergarten (Nacher et al., 2016, 2015; Romeo et al., 2003). Many studies highlight the benefits of using digital tabletops in primary education. These benefits include a low interference with the teaching/learning process, increasing motivation (Salvador et al., 2012), fostering creativity (Giannakos et al., 2012), knowledge acquisition (Jackson et al., 2013) and assimilation (Salvador et al., 2012), and, most importantly in this case, favoring hands-on problem solving activities through collaboration (Dillenbourg and Evans, 2011). Works like (Falloon and Khoo, 2014; Mercier et al., 2015; Rick and Rogers, 2008) are examples of studies where single multi-touch displays were used to enhance CPS skills, among others. 
Rick and Rogers (2008) present a game to learn relationships between mathematics and art on a multi-touch tabletop, and report on it being successful at promoting reflective dialogue in children aged 10–12. Mercier et al. (2015) test the effectiveness of the multi-touch display with respect to the usage of paper by comparing the problem solving process of children aged 10–11 on both platforms. The results of the work show higher levels of collaboration taking place when using the display. Falloon and Khoo (2014) present a more focused study, with 5-year-olds, of the type of communication that takes place when an Apple iPad is used as a public workspace for a CPS class activity. The results show that indeed a lot of on-task talk took place, but the young age of the students made it necessary to include a teacher in order to help them achieve the appropriate talk quality. Unfortunately, tabletops are a rare commodity in real educational settings, mostly due to their high cost, as well as because of their form factor, which prevents their usage in scenarios that require mobility. Other limiting factors associated with tabletops include the fact that the workspace is always public, making it difficult to perform any kind of private task, as well as the fact that the actual workspace dimensions are very limited and can only accommodate a certain number of participants (Garcia-Sanjuan et al., 2016b). 2.2. Multi-display multi-touch environments One way of dealing with the disadvantages of tabletops while maintaining their positive aspects, such as awareness (Gutwin and Greenberg, 2002; Hornecker et al., 2008), parallelism (Rick et al., 2011) and fluidity of the interaction (Hornecker et al., 2008), is to use handheld tablets instead. Handheld tablets easily solve the public versus private space issue by having a different tablet assigned to each person. 
Mobility is also increased with these devices due to their small size and light weight, and the workspace dimensions become virtually unlimited if the application is so designed. Furthermore, handhelds are now very common and can be found in any regular household due to their low cost, making it possible to follow a ‘Bring Your Own Device’ (Ballagas et al., 2004) strategy if necessary. Several works use tablets as multi-touch, multi-display platforms to either facilitate or enhance CPS in an educational environment. Araujo et al. (2014), for example, use tablet PCs in a high school setting to encourage 15-to-16-year-old students to work collaboratively to solve mathematical problems. The results show a general improvement in the students’ grades after a semester of using the tablets in class. Similarly, Lohani et al. (2007) use tablets in individual and group problem solving activities in a freshman-year (ages 18–19) engineering course. Results show that the students liked using the tablets for taking notes and setting up collaborative sessions. The work by Sutterer and Sexton (2008) is another similar setup in a civil engineering course where the students (also aged 18–19) used tablet PCs for collaborative note taking as well as CPS. The study concludes that the students believed that both in-class and out-of-class learning were improved; however, the final test scores showed no significant changes in performance. Mayumi (2015) introduces two systems developed by Fujitsu and meant to be used with tablets by secondary students. The first, called ‘Shu-Chu-Train’, improves students’ ability to concentrate and retain information. The other, called ‘Manavication’, speeds up communication between the teacher and students while supporting the development of thinking power, judgment and expressive power. The two solutions can be used to support the development of collaborative problem-solving abilities. Finally, Hung et al. (2014) and Hung et al. 
(2012) present a collaborative educational game consisting of a jigsaw puzzle that can be played on a Microsoft Surface. After performing a pre-game test and a post-game test on 20 elementary-school participants (Hung et al., 2012) and 240 participants aged 9–10 (Hung et al., 2014), the study concludes that the game did indeed help in raising the mean score in the tests. Mann et al. (2016) introduce several iPads into a classroom with 10-to-12-year-olds, and observe their use in a collaborative task to research a news story for later presentation. The setting enables different collaborative behaviors, including discussion among peers and designating different dedicated uses for each device in the multi-display environment. However, collaboration is not enforced by the activity, resulting in many children working individually and then gathering the information at the end. UniPad (Kreitmayer et al., 2013) does enforce collaboration by constraining students to share tablets in small groups, and a study with adolescents aged 16–17 showed the system to be a successful facilitator of verbal participation in the classroom. Most of the works presented in this section include older participants of high school or college age. Furthermore, they do not consider any gameful approach, instead opting for less engaging tool designs. Hung et al. (2014) and Hung et al. (2012) are the exception in that regard, but they fall into the same pattern as the rest by assuming that multi-touch tablets are the go-to solution and do not make any type of comparison with other platforms to test the effectiveness of these devices. As a counterexample, the work by Chipman et al. (2011) presents a game for children aged 5–6 to collaboratively learn about patterns, and compares a tablet version against an analog (paper-based) one, finding that the former increases awareness, provides more shared experiences, and keeps the students engaged longer. 2.3. 
Tangible user interfaces and physical spaces When dealing with younger children (such as primary school students or even kids in kindergarten), tangible user interfaces (TUIs) might be an even more interesting platform than purely digital ones like tabletops and tablets. Works like (Strawhacker and Bers, 2014) suggest that TUIs have an added value in early childhood education ‘as they resonate with traditional learning manipulatives’. Studies such as (Schneider et al., 2011) have showcased the advantages of TUIs, and others have made a direct comparison between the traditional desktop-based setup and tangible interfaces, showing that the latter enable more exploratory actions in children aged 7–10, which in turn provide faster and easier ways of interaction (Antle et al., 2009), and that they can increase 4-to-6-year-old students’ interest, engagement and understanding of the activity (Fails et al., 2005). TUIs offer the possibility of creating imaginative and original CPS activities like the ones presented by Schneider et al. (2012). Combinatorix combines tangible objects with an interactive tabletop to help students explore, solve and understand probability problems, which in turn allows them to develop an intuitive grasp of abstract concepts. However, the tool was only tested with five participants and lacks a formal evaluation. Several works that include a TUI platform focus on making a comparison with traditional methods and/or purely digital platforms, instead of presenting a tool on the TUI platform only. One example of such work is by Pan et al. (2015), who investigated the affordances and constraints of physical and virtual models integrated into a dynamics course. The students in this study were separated into three groups and received either traditional instruction, traditional plus physical manipulatives or traditional plus virtual manipulatives. The results of the study suggest that adding physical and virtual manipulatives may be helpful. 
Schneider et al. (2011) also compare tangible and multi-touch interfaces for collaborative learning and interaction, and conclude that tangibility helped perform the given problem solving task better and achieve a higher learning gain. Robots can be examples of TUIs if they can be interacted with via direct touch and manipulation, hence benefiting from the advantages of tangible manipulation described above. As reported by different studies and reviews (Ali Yousuf, 2009; Li et al., 2009; Miller et al., 2008; Mubin et al., 2013), the usage of robots in education has been steadily increasing. Possible causes of this include the fact that robots ‘capture the imagination’ of children (Li et al., 2009) and that they provide both the ability to add social interaction to the learning context and a tangible and physical representation of learning outcomes (Mubin et al., 2013). According to Mubin et al. (2013), there is a trend of using robots in education under the theory of constructionism (Papert, 1980), which consists of acquiring knowledge through building a physical artifact (in this case, a robot) and reflecting on one’s problem solving experience based on the motivation to build it. A popular platform used in this context is Lego Mindstorms,1 although the learning benefits of building robots with it are not yet clear (McNally et al., 2006), and it presents some drawbacks that could prevent its adoption in actual schools. For example, Martin et al. (2000) introduced this platform in a primary school for a whole year and found that, even though the students were able to build creative designs successfully, the teachers struggled with its learning curve. Other approaches rely on robots as companions to facilitate learning. Chang et al. 
(2010) introduced a robot in a language-learning course with 11-year-olds and explored five different roles the robot could play: storytelling, leading the students to read aloud, encouraging and cheering the children, commanding some tasks as well as responding to the students’ commands, and having a simple Q&A conversation. They observed high motivation levels in the children, showing the approach to be promising and engaging. Similarly, Saerbeck et al. (2010) observed a positive impact on learning performance with children aged 10–11 of different social supportive behaviors implemented on the iCat robot (Breemen, 2004). Another example is by Wei et al. (2011), who introduce a Lego Mindstorms robot as a companion for 8-year-old students to learn mathematics. The authors report that the platform is able to increase motivation and to offer a more joyful learning experience, as well as to support educators by providing them with feedback on the students’ progress. With respect to physical spaces, physical body movements have been shown to be essential for the enjoyment of life (Bowlby, 1969) and several works highlight the benefits of games which favor physical activity and make use of tangible objects (Cheok et al., 2005; Xie et al., 2008). As Soute et al. (2010) showcase, creating games in which interaction transcends the boundaries of a display by making the children interact with the physical world can enable fun experiences and stimulate social interaction. In their work, the authors present what they call ‘Head Up Games’, which are intended to be reminiscent of traditional games such as tag or hide-and-seek. The games they present, however, do not have an educational motivation underneath. In this respect, Stanton et al. (2001) design a TUI for collaborative storytelling. Multiple children aged 6–7 interact with the system by walking on a ‘magic carpet’ with pressure sensors underneath and by showing physical props to a camera. 
The authors suggest that collaborative work can be encouraged by using big-sized TUIs and physical props, because these slow down the pace of interaction and increase the effort required to make manipulations, which entails more communication and discussion among the students. Another example is by Georgiadi et al. (2016), with a mobile game to collaboratively learn about archeological fieldwork. Each group of four students explores a physical space in search of special objects (Bluetooth beacons) that, when brought near a tablet, trigger specific mini-games and activities on it. Even though the children can explore the environment conjointly, each group is given only one tablet, which restricts multi-user interactions and limits collaboration. While these works do touch on CPS enhancement in some ways, they do not explore all the dimensions associated with the skill. For example, Pan et al. (2015) mention aiding communication, which we have identified as a CPS sub-skill, but completely overlook the planning process. Similarly, Schneider et al. (2012), Stanton et al. (2001) or Georgiadi et al. (2016) focus on the collaborative aspect in general but not on the individual processes that make up CPS. 3. A ROBOT BOARD-BASED GAMIFICATION APPROACH TO SUPPORT CPS 3.1. Designing a gameful framework to support CPS For this work, we consider the PISA 2015 definition of CPS by the OECD (OECD, 2013), which is endorsed by more than 70 economies worldwide and by multiple researchers (Liu et al., 2016; Nouri et al., 2017; Webb and Gibson, 2015). The PISA 2015 report discusses more than 150 works related to CPS and, as a result, defines the CPS competency as ‘the capacity of an individual to effectively engage in a process whereby two or more agents attempt to solve a problem by sharing the understanding and effort required to come to a solution and pooling their knowledge, skills and efforts to reach that solution’. 
From this definition, we can extract three core competencies, which are: establishing and maintaining shared understanding; taking appropriate action to solve the problem; and establishing and maintaining team organization. These competences require an adequate interrelation of four cognitive processes: exploring and understanding, representing and formulating, planning and executing, and monitoring and reflecting (OECD, 2010): Exploring and understanding implies understanding the situation by deciphering the initial information provided about the problem and any further information that appears during the exploration of and interaction with the problem. During the representing and formulating process, the information gathered previously is selected, organized and integrated with previous knowledge. This is achieved by representing the information in the most convenient way, whether using graphs, tables, symbols, or words, and then formulating hypotheses by extracting the relevant factors and evaluating the information. Planning and executing includes clarifying the goal of the problem, setting sub-goals and developing a plan to reach the main goal. The plan created in the first half of this process is then executed in the second part. Finally, monitoring and reflecting implies monitoring the steps in the plan to reach the main goal and reflecting on any possible solutions and assumptions. Problem-solving tasks can be categorized by one or several of the following properties: large, complex, spatially distributed, and in need of extensive communication and a large degree of functional specialization between the agents (Obeid and Moubaiddin, 2009). If a problem satisfies one or more of these properties, it is considered unsolvable by a single agent and therefore the collaboration of several agents is required. We have developed a framework called CPSbot to support the previous CPS processes and competencies. 
More specifically, its design revolves around the following four sub-skills associated with CPS (OECD, 2013): Negotiation: wherein the agents involved in the CPS task are expected to share their knowledge, express their ideas and come to a shared understanding leading to an agreement over the solution of the problem or the course of action to take in order to reach a solution. In some cases, an actor is expected to learn to become more flexible in the negotiations, while in other cases an actor may need to learn to be more assertive. Planning: this includes the ability to divide a given problem into smaller tasks and formulating as efficient a plan as possible in order to reach the final solution. Communication: this skill makes the enhancement of the other skills possible. Negotiation, planning and organization can only be achieved through communication; therefore, it is essential to develop the right type of communication in order to ensure the correct transmission of information and the effective interaction between the actors. Organization: wherein the agents are expected to take on the necessary roles in the team in order to structure and coordinate their efforts and therefore reach a solution in the least chaotic way possible. CPSbot has been developed for multi-touch tabletops, handheld tablets and physical spaces. It is, in essence, a framework for creating board games with a robot as the main actor that the players can move. Board games, particularly cooperative ones, are known to promote communication and socialization between the players due to their co-located nature promoting face-to-face communication (Eisenack, 2013; Zagal et al., 2006). The framework allows the instantiation of different types of robot-based board games supporting any arbitrary problem domain. CPSbot allows the definition of new problem solving scenarios in an extensible way, i.e. 
game designers may include new types of behaviors for interactive elements on the board, and education practitioners may define educational content tailored to their needs in the form of different types of quizzes to be solved. CPSbot aims to foster CPS by compelling the users to collaborate in order to solve the given problems. The platform enables the design of interactive exploration spaces in which, throughout the game, the players continuously make the decisions that drive the acquisition of CPS skills: coordinating the actions the robot must carry out to follow a given path, choosing which interactive elements to consult, dividing the work or assigning roles to each participant, and agreeing on the communication strategies to use. Three main design aspects of CPSbot would make it suitable to support CPS with respect to the previous list of skills: the distribution of game elements on a publicly visible and accessible board, the distribution of the robot’s movement commands among the players, and the slow pace at which the game is played. With respect to the first, the fact that all items are spatially distributed on the board and made available to every player would enable exploring the possible solutions, planning the proper path for the robot to take, maintaining a shared understanding of the game state and the resolution process, and, once a solution is executed (i.e. bringing the robot to a specific cell), monitoring the decision adopted. In turn, the distribution of the movement commands would enable the functional specialization of each participant, making team organization through communication necessary not only to move the robot, but also to solve the game problems correctly. The choice of a slowly paced action is also important, since it would allow the users to take their time to understand the problem statement presented, negotiate and plan a strategy, and finally, in case of failure, reflect and propose another one. 
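The effect of distributing the movement commands can be made concrete with a minimal sketch. This is our illustration, not the authors' implementation; the class and player names are hypothetical. Each command only succeeds when issued by the player who holds it, so no single player can steer the robot alone:

```python
# Hypothetical sketch: distributing the robot's four movement commands
# among the players forces coordination (functional specialization).

COMMANDS = ("forward", "left", "right", "stop")

class RobotController:
    def __init__(self, players):
        # Deal the four commands out among the players round-robin.
        self.owner = {cmd: players[i % len(players)]
                      for i, cmd in enumerate(COMMANDS)}
        self.log = []  # commands actually executed by the robot

    def issue(self, player, command):
        """Execute `command` only if `player` holds it."""
        if self.owner.get(command) != player:
            return False  # another teammate must issue this command
        self.log.append(command)
        return True
```

With two players, one holds "forward" and "right" while the other holds "left" and "stop", so following any non-trivial path requires them to verbalize and sequence their moves together.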
Of course, the educational contents in the form of problems being defined by teachers would be crucial to fully and successfully develop CPS skills. Therefore, teachers are provided with a tool to specify this educational content (Fig. 2). 3.2. Quizbot: a CPSbot game Gamification, or gameful design, is defined by Deterding et al. (2011) as ‘the use of design elements characteristic for games in non-game contexts.’ Therefore, when designing a specific game with CPSbot, we took the five game dynamics identified by Bartel et al. (2015) in accordance with Deterding et al.’s definition into consideration. These dynamics are constraints, emotions, narrative, progression and relationships. Among the different game approaches that could be implemented with CPSbot, a quiz-style board game was selected because, as pointed out by Harris (2009), in this type of game students ‘participate and collaborate as members of a social and intellectual network of learners and… the learning takes place as a natural and authentic part of playing these board games’. This is also confirmed by Westergaard (2009) who points out that quiz-style games ‘can encourage participation and foster an informal, positive and energetic learning environment’. Finally, this is an effective learning strategy because it supports retrieval practice which is, as pointed out in (Blunt and Karpicke, 2014) ‘a powerful way to enhance long-term meaningful learning of educationally relevant content’. Following this design strategy, the CPSbot framework was used to implement Quizbot, a robot-based board, quiz-style game (Fig. 1). In Quizbot, the players are presented with a board split up into an undetermined number of cells. At this point, the game is considered to be in its normal mode (versus the quiz mode described further below). The board cells may be empty, or they may contain one of the following items: Key: this is the most important item in the game. 
Keys are used to activate the game’s quiz mode, which presents the players with a question that must then be answered. Block: this mostly harmless item simply serves as a blockade. The robot that the player controls cannot pass through these cells on the board. Bomb: this could be considered the game’s main antagonist. Colliding with a bomb while the game is in quiz mode undoes any previously correct answers and the quiz is restarted from the beginning. Figure 1. Quizbot in normal mode (left) and quiz mode (right). The bombs in this case work both as the main constraint in the game when considering the previously mentioned game dynamics, as well as for interaction precision measurement. They are also meant to be the main cause of emotional outbursts in players (whether negative due to collision or positive due to evasion). The board itself also contains a robot, which acts as the player’s agent. Four movement commands are associated with the robot and players may have any number of these commands available to them. The commands are go forward, turn left, turn right and stop. The reason behind this setup is that, in a multiplayer game, different players control different commands and must coordinate with each other in order to move the robot efficiently, thus fulfilling the relationships dynamic in the gameful design. In the normal mode, the goal is to move the robot to a key-containing cell while avoiding blocks and bombs in order to activate the next quiz. Once that is done, the game enters quiz mode. In quiz mode, the blocks and bombs remain in place but all the key-containing cells, minus the one that the robot reached in order to activate the quiz, are turned into answer cells (Fig. 1, right). The reached cell is turned into a question cell instead. A question cell, as the name suggests, contains a question that the player(s) must answer. 
A question, or quiz, is answered by guiding the robot to the correct answer cells. The game contains three types of questions, each answered in a different way:
- Choice questions: a multiple-choice question in which the players are presented with several answers and must choose the correct one(s), visiting the cells containing them in any order of the players’ choice.
- Ordering questions: all the answers are correct, but the cells containing them must be visited in a specific order dictated by the question itself.
- Accumulation questions: these questions give the players greater freedom of choice; they simply have to choose any number of answers whose sum equals the value given in the question.
Once a question is answered correctly, the quiz is considered ended and, if more questions are available, the game returns to normal mode, with the previous keys (or answer cells), bombs and blocks removed from the board and replaced with new ones scattered over different cells. If there are no further questions available, the game is considered finished. The number of questions in the game and the distribution of items per question on the game board can be modified using the external application shown in Fig. 2, which creates and stores configuration files that Quizbot reads on startup, making it possible to follow any desired game narrative. Progression can also be achieved by increasing the difficulty of the questions or the number of bombs (or constraints). The number at the top left corner of the board serves as an indicator of this progress.
Figure 2. Quizbot configuration application.
3.2.1.
Quizbot for tabletops and handheld tablets
Quizbot is based on a client–server architecture in which the tabletop or the handhelds act as clients, making it possible to show the same game view on more than one device at a time. This way, each user can have their own private space while still seeing the game board with the results of the actions taken by everyone playing. While this is not particularly interesting for the tabletop platform, shown in Fig. 3, it is for the handheld tablets, shown in Fig. 4 (where, in this particular case, each tablet has one of the four possible movement commands).
Figure 3. Instance of Quizbot running on a Windows tabletop.
Figure 4. Four connected instances of Quizbot running on Android tablets.
3.2.2. Quizbot for physical spaces
We also created a version of Quizbot using a mixture of physical and digital spaces for a TUI experience. Several objects and devices were used for this platform (Fig. 5). The non-technological objects included interlocking foam mats forming a 7 m × 4 m board, where each mat represented a cell on the board. Foam tubes were used to represent ‘block’ items, and inflatable rubber balls represented the ‘bombs’. As for the technological side of the game, several Android handheld tablets were used as ‘key cells’, placed on their corresponding mats. Furthermore, a Lego Mindstorms robot (Fig. 6) was used as the actual robot to be controlled on the board. Finally, in order to allow communication between the board and the robot, RFID tags were placed around the ‘key’ and ‘bomb’ cells on the underside of the mats, and an Android phone connected to an RFID reader was mounted on the robot.
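The RFID setup lends itself to a simple mapping from tag reads to game events. The following sketch is illustrative only: the tag UIDs, cell IDs and JSON message format are assumptions, as the actual protocol is not specified here.

```python
import json

# Hypothetical tag registry: RFID tag UID -> (item kind, cell ID).
# UIDs and cell IDs are illustrative placeholders.
TAG_REGISTRY = {
    "04:A2:19": ("key", "cell-12"),
    "04:7F:C3": ("bomb", None),
}

def tag_to_message(tag_uid):
    """Build the message the robot-mounted phone might send to the game
    server when its RFID reader detects a tag under the mat."""
    kind, cell_id = TAG_REGISTRY[tag_uid]
    msg = {"event": kind}
    if kind == "key":
        msg["cell"] = cell_id  # the cell ID is only needed for key cells
    return json.dumps(msg)
```

The server would then react to the decoded event, e.g. activating the quiz for the reported key cell or applying the bomb penalty.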
This communication goes through the game server: once a tag is read, the smartphone sends a message to the server indicating whether it was a key or a bomb cell (in the case of the former, the ID of the cell is included), and the server then behaves according to the message received.
Figure 5. Quizbot in a physical space.
Figure 6. Tangible Lego Mindstorms robot setup.
4. EVALUATION
The overall goal of our study was to analyze the experience of primary school children with a game-oriented approach based on physical spaces for the enhancement of CPS skills, and to compare the proposed gamification approach with more traditional technologies based on tabletops and multi-touch tablets.
4.1. Participants
Overall, 80 primary school students between the ages of 9 and 10 took part in the study, of whom 36 were girls and 44 were boys. The study was carried out at the Universitat Politècnica de València’s Summer School, with the additional benefit that the children came from different schools with different curricula.
4.2. Apparatus
Two implementations of the game were made. The version for the tabletop and handheld tablets was implemented using the LibGDX framework and Node.js. The tabletop device ran Windows and included a 42-inch multi-touch screen; the tablets for the handheld version were BQ tablets running Android. The tangible version of the game was developed in native Android. RFID tags and an RFID reader were used to identify ‘bomb’ and ‘key’ item cells. An Android smartphone connected to the RFID reader was mounted on a Lego Mindstorms robot, allowing it to also read movement commands from RFID-tagged paddles. BQ tablets running Android were used to simulate ‘key’ item cells showing the quiz questions and answers.
4.3.
Procedure
The children were randomly separated into 10 groups of 8 and tested the three platforms in different rotations. For example, one group would start with the tabletop, then move on to the handheld tablets and then to the TUI, while another group would start with the TUI, then move on to the tabletop and then to the handheld tablets. This helped reduce order effects on factors such as enjoyment or learning. A fully counterbalanced design could not be conducted for logistical reasons, since the time the children could spend on the activities was limited. The questions posed on each platform were randomized to ensure that any variability in problem difficulty would not affect the children’s impression of the platform. The questions themselves were taken from third- and fourth-grade school textbooks. For each group on each platform, four children played at any given moment while the other four observed from the sidelines. They would switch after 3 min of gameplay, and then back again after another three, and so on, for a total of 18 min of gameplay. This does not include the time it took them to complete a trial question at the beginning of each platform’s session. Each participant was given control over one of the robot’s commands (turn left, turn right, move forward, stop) and they were left to their own devices as far as everything else was concerned. Throughout all the activities, two observers took notes on the children’s behavior, as no video recording was allowed. These observations consisted of events the observers found relevant to the children’s collaboration, problem resolution or impressions, including both their actions and comments.
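The platform rotation described at the start of this procedure can be sketched as a simple cyclic schedule. This is an illustrative reconstruction only; the exact assignment of orders to the 10 groups is not specified beyond the examples given.

```python
# Each group receives one of the three cyclic orders of the platforms
# (a full counterbalance over all six permutations was not possible).
PLATFORMS = ["tabletop", "tablets", "TUI"]

def rotation_for_group(group_index):
    """Cyclic rotation of the platform order, keyed by group number
    (group numbering is an assumption for illustration)."""
    shift = group_index % len(PLATFORMS)
    return PLATFORMS[shift:] + PLATFORMS[:shift]

# Schedule for the 10 groups: every group plays all three platforms.
schedule = {g: rotation_for_group(g) for g in range(10)}
```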
Furthermore, at the end of each group session, after a group had tried Quizbot on all three platforms, a questionnaire was administered to each child to get their feedback on the experience. The questionnaire is a Fun Toolkit (Read, 2008; Read and MacFarlane, 2006) questionnaire adapted to this study; Table 1 shows the questions that were asked. Questions 1–6 use a Smileyometer to measure how much fun the children had on each platform and how easy they found it to control the robot on each platform. Questions 7–10 use a Fun Sorter to measure on which platform the children thought they performed best and worst, and on which platform they had the most and the least fun. Questions 11 and 13 use an Again-Again table in which the children report how likely they would be to play the game on each platform inside and outside a classroom. This is a way of indirectly measuring the intrinsic motivation elicited by the platform, although participants did not provide details about the external contexts in which they would prefer to play. Finally, questions 12, 14 and 15 deal with additional considerations: the school subjects the children would prefer for the quiz questions, whether they prefer playing in collaboration with friends or alone, and changes they would make to the game. The last question is intended for future reference, in order to make Quizbot more appealing and therefore possibly more effective.
Table 1. Post-game session questionnaire.
1. How much fun did you have with the game on the floor?
2. How much fun did you have with the game on the tablets?
3. How much fun did you have with the game on the big table?
4. How easy was it to handle the robot on the floor?
5. How easy was it to handle the robot on the tablets?
6. How easy was it to handle the robot on the big table?
7. With which version do you think you did best?
8. With which version do you think you did worst?
9. With which version did you have the most fun?
10. With which version did you have the least fun?
11. Would you like to play again in class?
12. In what subjects would you play?
13. Would you like to play again outside class?
14. Would you prefer to play alone or with friends?
15. What would you change in the game to like it better?
4.4. Results
This section describes the three types of results obtained from the tests: performance results from the game logs, user impressions from the questionnaires that all the participants filled out, and a summary of the observations made during the sessions.
4.4.1. Performance
The three platforms included a logging system, which logged events such as movement commands given, bomb contacts, answers reached, and quiz starts and ends. Table 2 shows a summary of the per-platform averages obtained from these logs, as well as the significance level obtained from running a Friedman test on them. With α = 0.05, the only significant differences found were for the time between answers and the number of wrong answers. A Wilcoxon test was then used to check for significant differences between pairs of platforms for these two variables. With a Bonferroni adjustment (α = 0.05/3 = 0.017), which accounts for the three pairwise comparisons being made, the significant differences are in the comparisons between the tangible platform and each of the other two platforms, for both the average number of wrong answers and the average time between answers (Table 3).
Table 2.
Log summary (*P < 0.05).

                                     Tabletop   Tablets    TUI       P-value
Commands given                       344.571    375.333    271.583   0.205
Bomb contacts                        2.429      2.333      3.333     0.341
Wrong answers                        8.571      9.167      1.583     0.000*
Quizzes completed                    1.571      2.333      2.083     0.201
Time between answers (s)             46.091     40.648     83.223    0.001*
Time between correct answers (s)     72.386     86.828     80.894    0.368
Time to finish quiz (s)              212.667    238.158    224.536   0.197

Table 3. Wilcoxon test results for platform pairs (*P < 0.017).
Variable               Platform 1   Platform 2   P-value
Wrong answers          Tabletop     Tablets      0.637
Wrong answers          Tabletop     TUI          0.002*
Wrong answers          Tablets      TUI          0.004*
Time between answers   Tabletop     Tablets      0.182
Time between answers   Tabletop     TUI          0.005*
Time between answers   Tablets      TUI          0.002*

4.4.2. Impressions
The results obtained from the Fun Toolkit questionnaire are reported in this section. The questions were split into groups in which the same factor was measured for the different platforms, in order to see how the children perceived each platform. A Wilcoxon test was used on the Smileyometer results, with the questions paired by platform (tabletop, tablets, tangible) for each measured factor (fun, ease of use). The results of these tests are summarized in Table 4, where it can be seen that the only statistically significant difference (P < 0.017, due to the Bonferroni adjustment) was between the tablets and the TUI for the ease of use factor.
Table 4. Smileyometer result comparison; questions 1–6 (*P < 0.017).
Variable             Platform 1   Platform 2   P-value
Fun (Q1–3)           Tabletop     Tablets      0.461
Fun (Q1–3)           Tabletop     TUI          0.060
Fun (Q1–3)           Tablets      TUI          0.019
Ease of use (Q4–6)   Tabletop     Tablets      0.020
Ease of use (Q4–6)   Tabletop     TUI          0.438
Ease of use (Q4–6)   Tablets      TUI          0.005*

The results from the Fun Sorters, in which the children were explicitly asked to rank the platforms on the fun and ease of use factors (questions 7–10), are shown in Table 5, which gives the average score for each platform. This score was established by assigning three points to the platform chosen as best, two points to the platform chosen as second best, and one point to the platform chosen as worst; the closer the score is to 3, the better. Table 6 shows the results of the Wilcoxon test applied to the Fun Sorter results.
Table 5.
Fun sorter results (the mean score for each platform is shown in parentheses).

Variable      Best         Intermediate      Worst
Easy to use   TUI (2.45)   Tabletop (1.86)   Tablets (1.74)
Fun           TUI (2.64)   Tabletop (1.96)   Tablets (1.47)

Table 6. Fun sorter results comparison; questions 7–10 (*P < 0.017).

Variable      Platform 1   Platform 2   P-value
Ease of use   Tabletop     Tablets      0.445
Ease of use   Tabletop     TUI          0.000*
Ease of use   Tablets      TUI          0.000*
Fun           Tabletop     Tablets      0.000*
Fun           Tabletop     TUI          0.000*
Fun           Tablets      TUI          0.000*

Figure 7 shows the results of the Again-Again tables, in which the children state their intention to play again on each platform, in class and outside (questions 11 and 13). The general response in both cases can be seen as positive. Table 7 displays the results of the Wilcoxon test applied to the Again-Again tables and shows that, while all three platforms got a generally positive reply, the tangible platform got a significantly more positive reaction in comparison. Figure 8 shows which school subjects the children prefer for the quiz questions on each platform (question 12).
Figure 9 shows the ratio of children who prefer playing alone versus with friends on each platform (question 14). The majority stated that they would rather play with friends on all three platforms. Finally, Fig. 10 shows some of the changes that the children suggested for Quizbot (question 15). Most of these changes appear to be related to the game visuals.
Figure 7. Again-Again table results, stating desire to play again in class and outside.
Table 7. Again-Again tables results comparison; questions 11 and 13 (*P < 0.017).

Variable    Platform 1   Platform 2   P-value
Classroom   Tabletop     Tablets      0.148
Classroom   Tabletop     TUI          0.004*
Classroom   Tablets      TUI          0.000*
Outside     Tabletop     Tablets      0.648
Outside     Tabletop     TUI          0.088
Outside     Tablets      TUI          0.001*

Figure 8. Results for which school subjects are preferred for questions on the three platforms.
Figure 9.
Company preference for the three platforms.
Figure 10. Changes suggested for Quizbot.
4.4.3. Observations
Throughout the game sessions, two observers took notes on the children’s general behavior with respect to CPS and the game. Afterwards, they discussed their notes and extracted patterns from the behaviors they had both observed. These observations are not quantified, since the sessions could not be recorded and precise measures therefore cannot be reported. The most frequently observed action on all three platforms was planning. Whether at the beginning of each quiz or after a correct (and sometimes incorrect) answer, all 10 groups would stop and discuss which path to take to get to the next question. Some of the discussion revolved around whether the robot would be able to pass between two items on the board. Sometimes, the children would plan ahead for several answers. However, there were also cases in which no plans were made and some children in the group would take charge and try different answers at random. It was not only the children playing at that moment who planned; the four children watching from the sidelines were also observed planning in hushed voices for when it was their turn to play. Another frequently observed action was exploration. Whenever a new quiz started, the children would visit all the possible answers before starting the planning process. This was observed most frequently on the TUI platform, especially among the children watching from the sidelines. During the exploration and planning processes, a lot of knowledge sharing also took place, especially if a child was sure of an answer or if someone asked a question. A lot of negotiation, in different forms, took place on all three platforms.
For example, sometimes the children would discuss whether a set of answers was correct and would then agree to visit one answer and then another. Negotiations related to path planning also took place, in which they would evaluate whether it was worth risking a shorter path containing bombs or better to play it safe and take a longer path. Some subgroups would also negotiate which movement command each person would have when it was their turn to play. This last type of negotiation was observed most frequently on the TUI platform and sometimes on the tablet platform, but rarely on the tabletop. In most groups, one of the children would eventually take on a leadership role, constantly ordering movements. Most of the children would shout for the robot to be stopped, especially when it was about to collide with a bomb, making some children either avoid having that movement command or purposely ask for it. The group leaders, however, would shout out all the movements, telling the others when to go forward or when to turn. In some groups, children would get fed up with waiting for someone to perform a movement command and would either invade the other’s workspace (in the tabletop and tablets case) or grasp the other child’s hand to force them to perform the wanted command. In some groups, the children waiting on the sidelines would collaborate with those currently playing by telling them the answer or warning them about a bomb. This occurred most frequently on the handhelds platform, but also sometimes on the other two. However, the children on the sidelines were more frequently found trying to annoy those playing: counting down the time until their turn ended, taunting them, asking them to collide with a bomb or choose a wrong answer, giving them wrong answers, or actually sabotaging them by invading their interface.
There were also cases in which one of the players would sabotage the rest by constantly turning the robot or stopping it as soon as it started moving. In these cases, the other children would either tell them off or, in a few cases, physically stop them by grabbing their hand. Overall, there were several groups with good coordination and several with bad coordination. Sometimes a child would know and say the correct answer but be ignored by the others, causing them to sulk and ignore the game. In some cases, after answering wrongly, part of the group would sulk and momentarily stop playing. There were also cases in which someone would try to cheer up the rest of the group and encourage them to try another answer. As far as individual platform observations go, a couple of children complained about the warm air given off by the tabletop, as well as about having to read the question and the answers upside down (for those standing in the north position). In the latter case, the child standing in the south position would help by reading the text aloud. While playing on the tablets platform, the children would sometimes stand up when excited (such as when they answered something correctly or, in the case of the children on the sidelines, when a wrong answer was chosen). The children on the sidelines would also sometimes stand up to get a better view of all the tablets, even though they could easily see one or two tablets from their position. A lack of coordination was also observed between the two children with the turning commands; they would often turn the robot left and right at the same time, causing it to stay in the same position. They would also often call out an answer to go to by saying ‘This one!’ while pointing at their own tablet, causing the others to ask ‘Which one?’ in return. Finally, when faced with the TUI platform, several children would make satisfied exclamations such as ‘That’s so cool!’ or ‘This is great!’.
In a few cases, the children would make the robot collide with the blocks on purpose. There were also cases in which the robot came apart because of the children’s rough handling (whether through collisions or because they moved the robot manually). On some occasions, the children who were supposed to be on the sidelines would stay on the board to observe the actions of those playing, while on others they would move around the board to play with the foam blocks or the rubber balls.
4.5. Discussion
4.5.1. Performance
The performance results show an overall lack of significant differences between the three platforms, which is interesting in certain cases, such as the number of quizzes completed. We expected fewer questions to be completed on the physical-spaces platform, since the bigger board makes checking the different answers more time consuming, but the groups divided that task efficiently enough among their members for this not to be the case. Instead, the bigger board size could be the reason behind the two significant differences obtained from the logs. We observed the children accidentally colliding with an unwanted (and usually incorrect) answer several times on both digital platforms (usually when trying to make a right or left turn), making the average time between answers generally shorter than on the tangible platform, even though the average time between correct answers was mostly similar. This could be due to the perceived distances on the board: the bigger physical board amplifies the otherwise small distances seen on a screen. The children would shrug off the accidental collisions with wrong answers the same way they would shrug off a collision with a ‘block’ item, which is probably why these collisions had no effect on the total time it took them to complete a quiz.
In sum, these results seem to indicate that the platform using physical spaces is the best of the three for reducing children’s undesired mistakes. The two variables with significant differences (number of wrong answers and time between answers) are both affected by movement precision. Moreover, unlike screen-based technologies, whose size is either limited or hard to extend, physical spaces such as the one described in this work make it very easy to expand the game world, since the RFID tags, mats and other props used are cheap and easy to install. Nevertheless, they do occupy space that might be unavailable in some contexts. As far as the rest of the measured variables are concerned, the three platforms showed no significant differences, meaning that no single platform carries any particular disadvantage, while physical spaces do provide a major advantage.
4.5.2. Impressions
The main purpose of the Fun Toolkit questionnaires was to compare the three platforms in order to see whether one would stand out from the rest. Overall, the children’s preference seemed to be the TUI platform using physical spaces. The Smileyometer results (questions 1–6) showed that the tangible interface was easier to use than the tabletop and tablets, which agrees with the Fun Sorter results shown in Tables 5 and 6. This could be due to a combination of the smaller public workspace in the latter, which makes knowledge sharing harder, and the generally higher difficulty observed with the entirely digital version of the game. Tables 5 and 6 show that the tangible platform was both the most fun and the easiest to use, while the tablets were both the least fun and the least easy to use, which suggests a correlation between the two variables. The reason behind these results could be that the TUI was more intuitive for the children, as some previous studies have revealed (Schneider et al., 2011; Strawhacker and Bers, 2014).
The tangible game being a generally rarer type of activity might also have affected the fun factor in this case. The results of the Again-Again table (Fig. 7) show a mostly positive reaction to all three platforms, which could be related to the children’s age and their eagerness to play most of the time. This can be considered a positive result, since the intention is to make CPS skill enhancement fun so that the activity is repeated willingly, thus helping to further enhance the children’s CPS skills. This is reinforced by the fact that they reported being willing to play outside the classroom, i.e. during their free time and without any enforcement by the teachers. Figure 8, which displays what subjects the children would like to study using the three platforms, does not show much variation in the subjects the children chose based on platform, although there is a somewhat wider variety of subjects on the TUI platform. This could be due to the wider range of options this platform provides; for example, Physical Education-related activities would be harder on the digital-only platforms. When asked whether they would rather play Quizbot alone or with friends, an overwhelming number responded that they would prefer to play with friends. This is a positive result considering that the purpose of Quizbot is to enhance CPS, which requires the participation of more than one agent. The handheld tablets might have received the highest number of replies indicating a preference to play alone because children perceive tablets as generally private devices. Regarding the last question in the questionnaire, where the children were asked about any changes they would make to Quizbot, it can be noted that most of the suggested changes are aesthetic, suggesting that visually pleasant items are more appealing, which is important to take into consideration when creating something intended to be used repeatedly.
Some children also wanted higher participation from the other children in their group, possibly indicating a difference in motivation levels. This would probably be avoided in cases where friends were playing together at a time they chose themselves. Finally, an interesting suggested change is one related to receiving rewards, which is a common extrinsic motivator in games. While an interesting addition to consider, studies suggest that it is more rewarding for the learning process to rely on intrinsic motivation instead (Deci et al., 1991; Werbach and Hunter, 2012).

4.5.3. Observations

As for the observations made during the study, many of them involved seeing communication, negotiation and planning taking place, which is in accordance with the processes needed for CPS to be fostered (OECD, 2013). Organization varied between the different teams, mostly depending on whether or not there were one or two children sabotaging the activity, which could be attributed to children simply acting their age. Sometimes better organization simply took longer, emerging only once a group leader appeared. Other roles identified by Fan (2010) as usually formed during a CPS activity were also present to different degrees in each group: Brainstormer, Critic, Supporter and Team Wrangler. The three main CPS competencies discussed in Section 3.1 were clearly observed taking place during the study. The children would share their knowledge when required, take action to solve the given questions and maintain some level of organization. The fact that improvement could already be observed in some of these aspects shows that Quizbot fulfills its intended purpose of encouraging the practice of the CPS sub-skills and CPS skills in general. On a platform-specific level, the reason more exploring took place on the TUI platform could be that the children had to move around to explore, and that is precisely what the children wanted.
It would also explain the constant standing up on the other platforms. More negotiation was observed on the TUI platform as well, at least when it came to negotiating which movement command (which could be considered a tool) each child would have. Since this was also observed on the handhelds, albeit to a lesser extent, it could be related to the fact that it is easier to pass the movement commands around on these two platforms. The only drawback we found on the TUI platform was that it was somewhat distracting for the children, diverting them from the game’s main objective while they sometimes walked around the board aimlessly. The tabletop platform’s main flaw was having to read text upside down in some positions, which can be attributed to its limited workspace dimensions. As a possible solution, 360° controls could be used to give all users the same view regardless of their position (Catala et al., 2012). Finally, the handheld tablets provided a mixed bag of results. On the one hand, the private space seemed to make coordination more difficult for the children because they would point at their own tablet and say ‘here’ or ‘there’ when referring to a point on the board to go to. However, this can be seen as an opportunity to improve the children’s communication skills by encouraging them to be more specific and descriptive with their language.

4.5.4. Design considerations for future game-based CPS systems

As a result of this work, we present a series of recommendations for designers of future game-based CPS systems. These lessons are based mostly on our observational results, but also take into consideration the results on performance and user impressions.

Engender equitable face-to-face discussions

Since communication is one of the main CPS sub-skills, it is important to design the system so that it supports discussion through face-to-face communication.
Several works reported by Drago (2015) suggest that the decrease in the amount of time children spend interacting face to face may eventually have ‘significant consequences for their development of social skills and their presentation of self’ (Brignall and Van Valey, 2005). Physical spaces such as the one presented in this work enable this type of communication within the game space, facilitating direct reference to physical game elements that everyone in the space can see and point to. The use of multi-surface environments to implement a purely digital game environment may be problematic because users have local copies of the game elements, and it may not be clear which game element a participant is referring to during a collective discussion. This problem could be overcome by using digital shared pointers that everyone can see. On the other hand, a positive aspect of purely digital distributed interactive surfaces over physical spaces is that the former enable teachers to implement scenarios for children to understand the differences between face-to-face and online communication. Having multiple surfaces located in the same physical space enables face-to-face communication, whereas separating them across distributed physical spaces would force children to use online communication. These two modes of communication can be practiced and discussed with children so that they understand the positive effects of the face-to-face modality (Przybylski and Weinstein, 2013). Finally, another interesting aspect to consider when engendering equitable discussions is to foster the public oral expression of all the participants. In this respect, the strategies could range from trivial turn-based ones, implemented in a multi-surface system by using visual cues on the devices to communicate which person is allowed to speak during a group discussion, to more advanced orchestration strategies such as those described by O’Connor and Michaels (1996).
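A trivial turn-based strategy of this kind could be sketched as follows. The sketch is illustrative only, assuming a hypothetical round-robin manager rather than any actual CPSbot component; each tablet would render the cue returned for its owner.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TurnOrchestrator:
    """Hypothetical round-robin speaking-turn manager for a multi-surface
    CPS game: only the device whose owner holds the turn shows the
    'you may speak' cue, so every child gets a chance to contribute."""
    participants: List[str]
    current: int = 0

    def active_speaker(self) -> str:
        return self.participants[self.current]

    def cue_for(self, participant: str) -> str:
        # Each device renders its own cue; here cues are plain strings.
        return "SPEAK" if participant == self.active_speaker() else "LISTEN"

    def next_turn(self) -> str:
        # Advance round-robin so quieter children are guaranteed a slot.
        self.current = (self.current + 1) % len(self.participants)
        return self.active_speaker()
```

For example, with `TurnOrchestrator(["Ana", "Ben", "Cleo"])` the cue on Ben’s device stays `LISTEN` until `next_turn()` passes the floor to him.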
This would give shy children or children with communication problems the opportunity to express their opinions.

Encourage group-based negotiation at multiple levels

CPS involves negotiating different aspects of the problem from different perspectives. In this respect, it is critical to design discussion spaces where different approaches can be negotiated. The use of a physical space may also naturally support the creation of subgroups around different physical artifacts to engage in different aspects of negotiation. This could also be promoted in purely digital multi-surface environments by encouraging group members to form subgroups that negotiate over a subset of the multi-surface space. This situation, in which different children focus on different elements of the problem, facilitates the process of learning to construct a shared interest. This is a key element of negotiation strategies (Fisher et al., 2011), where each child needs to understand the other child’s side to find a solution. Another aspect of negotiation learning supported by the platforms evaluated in this work is the task of brainstorming options. It takes children time and practice to get used to finding options, but learning to invent and create options for mutual gain is an important aspect of CPS that has to be properly addressed. Although our proposal supports brainstorming the paths to be followed by the shared robot and considering the alternative interactive elements to be visited when solving a quiz, it does not support the storage and visualization of the choices expressed by each child. In our opinion, having an explicit mechanism for expressing alternatives on the board would facilitate the reconsideration of previous discussions when a chosen alternative fails.
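The alternative-recording mechanism argued for above could be as simple as the following sketch. It is a minimal illustration under our own assumptions, not part of the CPSbot implementation: proposals are stored per child, failed choices are remembered, and the remaining options can be revisited when a chosen alternative fails.

```python
from collections import defaultdict

class AlternativeBoard:
    """Hypothetical store for the route/answer alternatives each child
    proposes during negotiation, so that rejected or untried options
    can be reconsidered when the chosen one fails."""

    def __init__(self):
        self.proposals = defaultdict(list)  # child -> proposed alternatives
        self.discarded = []                 # alternatives tried and failed

    def propose(self, child, alternative):
        # Record who suggested what, avoiding duplicates per child.
        if alternative not in self.proposals[child]:
            self.proposals[child].append(alternative)

    def mark_failed(self, alternative):
        # Remember the failure so the group does not retry it blindly.
        self.discarded.append(alternative)

    def remaining(self):
        # All proposals not yet tried and failed, keyed by proposer.
        return {child: [a for a in alts if a not in self.discarded]
                for child, alts in self.proposals.items()}
```

Keeping authorship attached to each alternative also supports the shared-interest construction discussed above, since the group can see whose idea is being revisited.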
Promote the acquisition and expression of different social and personality roles

Designers of gameful CPS systems should consider designs in which typical CPS roles (Brainstormer, Critic, Supporter and Team Wrangler) can emerge naturally, since these roles help organization develop within the group. Multi-tablet environments may be a good approach for this purpose because personalized indications can be provided on each participant’s device. These indications may include the role to play and the distinctive features that define the role, which can promote the training of different socio-cognitive skills at different moments during the game. This aspect is important in the design of future gameful CPS systems so that children develop the regulation and expression of emotions, empathy, the identity of self in relation to others, and social understanding (Dunn, 1988). Another opportunity that emerges with properly designed gameful CPS environments is their potential to implement group-play therapies and interview therapies for children with very distinct personality traits. As pointed out by Ginott (1961), ‘most children between the ages of nine and thirteen have genuine difficulty in communicating emotional conflicts either verbally or through miniature toys’, and this is particularly the case with two opposite personality categories in children: the over-inhibited and the acting-out. The former prefer quiet activities, and the goal of the group activity is usually to lead them to more energetic forms of expression; the latter engage in uproarious and destructive activities, and the goal of the group activity is to lead them to more focused forms of collaboration. In this respect, CPS systems based both on physical spaces and on purely digital multi-surfaces present an opportunity to accommodate both types of goals.
Physical CPS spaces could include, as pointed out by Ginott (1961), tangible elements that allow for safe and respectable expression of aggression (e.g. group-operated boxing or penny-arcade machines, or physical game elements to be destroyed). Purely digital CPS environments could have similar digital interactive elements where energetic children could find ways of acting more vigorously and then be ready to engage in more focused group activities. These interactive elements could also be an opportunity ‘for children who cannot sustain close contact to become part of the group without having to go beyond their depth in personal relationships’ (Ginott, 1961).

Design to support private versus public spaces

Private and public spaces each have their own advantages and disadvantages, but a public-only space is not usually representative of a real workspace, while a private-only space makes discussion and knowledge sharing harder. This separation can be naturally supported by multi-surface environments of handheld devices, which may be used either privately or as a collective, shared, larger surface where collaboration arises. This aspect is important if divergent thinking needs to be supported as part of a CPS experience based on creativity (Sternberg and O’Hara, 1999). A pitfall in the design of our physical space for CPS is that all surfaces were used as public displays of interactive content during gameplay but were not available to the children as personal spaces to record notes, strategy plans, etc. It remains to be studied whether the inclusion of personal devices in CPS environments based on physical spaces has a positive effect on the cognitive processes discussed above related to negotiation, communication and planning.

5. CONCLUSIONS

This work focuses on the many soft skills that are required of today’s students, and on the consolidation of those skills into what is referred to as CPS.
These skills can be nurtured and enhanced in many ways, but one way that has proven effective for learning in general is through video games. However, the effectiveness of this approach may depend on the platform used. Reviewing related work revealed that, while it is generally agreed that CPS skills need to be developed in all students, very few works apply a gameful approach to the enhancement process, and comparisons between platforms to test the differences they could introduce beyond the tool itself are also rare. A CPS skills enhancement framework called CPSbot and a quiz-style game based on this framework, Quizbot, were therefore developed on three platforms in order to compare the user experience and acceptance of an approach using physical spaces with screen-based sedentary platforms. Quizbot is a mixture of a board game and a quiz-solving game in which the users control a robot, moving it on a board whose cells contain different game items. Some game items trigger quizzes that the players must answer, also by guiding the robot to the correct answer(s). The game presents a CPS scenario by urging the players to coordinate their actions to make the robot move, plan the robot’s route and share their knowledge to answer the quiz questions. The first of the three platforms Quizbot was developed for is a multi-touch tabletop, which provides a public space where players can share their knowledge more easily. The second is a handheld platform on which the board can be viewed on several multi-touch tablets, making it possible to give each player their own private space. The third and last platform is based on a TUI using physical spaces, where the robot, the game board and even the robot movement commands are physical objects. A study was performed with 80 summer school students, who were split into groups of eight to try out the three platforms in turn.
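The board-and-quiz mechanics summarized above can be condensed into a minimal model. This is a sketch under our own assumptions, with illustrative names and an invented sample quiz, not the actual CPSbot API: the robot moves over cells, blocks stop it, a quiz cell poses a question and the quiz is answered by driving the robot onto the cell holding the right answer.

```python
# Illustrative board: grid positions mapped to game items (hypothetical data).
BOARD = {
    (0, 0): "empty", (0, 1): "block",
    (1, 0): "quiz",  (1, 1): "answer:Paris",
}

class Robot:
    """Toy model of the shared Quizbot robot moving over board cells."""

    def __init__(self, pos=(0, 0)):
        self.pos = pos

    def move_to(self, pos):
        if BOARD.get(pos) == "block":
            return "collision"  # blocks stop the robot in place
        self.pos = pos
        item = BOARD.get(pos, "empty")
        if item == "quiz":
            # A quiz cell poses a question (sample question is invented).
            return "quiz:capital of France?"
        if item.startswith("answer:"):
            # Reaching an answer cell resolves the pending quiz.
            return "answered:" + item.split(":", 1)[1]
        return "moved"
```

In the real game the players jointly issue the movement commands, so every `move_to` call is itself the outcome of the group’s coordination.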
The children were observed without interference while they played, and at the end of each group session a questionnaire was handed out. A summary of the logs taken by the previously implemented logging system shows that the only significant gameplay differences between the platforms were in the number of wrong answers and the time between answers, which can probably be attributed to the perceived distances due to the board size. The questionnaire itself showed that the TUI platform was both the most fun and the easiest to use, and that it instilled a general eagerness to play again both in class and in out-of-class environments. The observational results of the study provided feedback on concrete differences between the three platforms, as well as verifying that Quizbot serves its intended purpose and encourages the use of the skills associated with CPS. Finally, this study provides first evidence indicating that, despite the currently widespread individual tablet-based learning strategies, educational technology for CPS skill acquisition should concentrate on collaborative games based on physical spaces, in which robot-based technology is perceived by children as a natural and motivating game element.

ACKNOWLEDGMENTS

We would like to thank the Universitat Politècnica de València’s Summer School for their collaboration in this study.

FUNDING

Spanish Ministry of Economy and Competitiveness and the European Regional Development Fund (project TIN2014-60077-R); Spanish Ministry of Education, Culture and Sport (fellowship FPU14/00136); and Conselleria d'Educació, Cultura i Esport (Generalitat Valenciana, Spain) (grant ACIF/2014/214).

Footnotes

1 https://www.lego.com/en-us/mindstorms

REFERENCES

Alben, L. (1996) Quality of experience: defining the criteria for effective interaction design. Magazine Interact., 3, 11–15. http://doi.org/10.1145/235008.235010.
Ali Yousuf, M. (2009) Robots in education.
In Encyclopedia of Artificial Intelligence, pp. 1383–1388. IGI Global, Hershey, PA, USA. http://doi.org/10.4018/978-1-59904-849-9.ch203.
Antle, A.N., Droumeva, M. and Ha, D. (2009) Hands on what?: comparing children’s mouse-based and tangible-based interaction. Proceedings of the 8th International Conference on Interaction Design and Children, pp. 80–88. New York, NY, USA: ACM. http://doi.org/10.1145/1551788.1551803.
Araujo, C.F., Dias, E.J. and Ota, M.A. (2014) The tablet motivating mathematics learning in high school. Proceedings of the International Conference on Mobile and Contextual Learning, pp. 42–51. Springer. http://doi.org/10.1007/978-3-319-13416-1_5.
Ballagas, R., Rohs, M., Sheridan, J. and Borchers, J. (2004) BYOD: bring your own device. Proceedings of the Workshop on Ubiquitous Display Environments. Nottingham, UK.
Bargshady, G., Pourmahdi, K., Khodakarami, P., Khodadadi, T. and Alipanah, F. (2015) The effective factors on user acceptance in mobile business intelligence. J. Teknol., 72. http://doi.org/10.11113/jt.v72.3913.
Bartel, A., Figas, P. and Hagel, G. (2015) Towards a competency-based education with gamification design elements. Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play, pp. 457–462. New York, NY, USA: ACM. http://doi.org/10.1145/2793107.2810325.
Berkovitz, J. (1994) Graphical interfaces for young children in a software-based math curriculum. Proceedings of the CHI’94 Conference Companion on Human Factors in Computing Systems, pp. 247–248. New York, NY, USA: ACM Press. http://doi.org/10.1145/259963.260466.
Blunt, J.R. and Karpicke, J.D. (2014) Learning with retrieval-based concept mapping. J. Educ. Psychol., 106, 849–858. http://doi.org/10.1037/a0035934.
Bowlby, J. (1969) Attachment and Loss. Basic Books, Ann Arbor, MI, USA.
Brayshaw, M. and Gordon, N.
(2016) Using motivation derived from computer gaming in the context of computer based instruction. Proceedings of the 2016 SAI Computing Conference, pp. 828–832. IEEE. http://doi.org/10.1109/SAI.2016.7556074.
Brignall, T.W. and Van Valey, T. (2005) The impact of Internet communications on social interaction. Sociol. Spectr., 25, 335–348. http://doi.org/10.1080/02732170590925882.
Catala, A., Garcia-Sanjuan, F., Jaen, J. and Mocholi, J.A. (2012) TangiWheel: a widget for manipulating collections on tabletop displays supporting hybrid input modality. J. Comput. Sci. Technol., 27, 811–829.
Chang, C.-W., Lee, J.-H., Chao, P.-Y., Wang, C.-Y. and Chen, G.-D. (2010) Exploring the possibility of using humanoid robots as instructional tools for teaching a second language in primary school. J. Educ. Technol. Soc., 13, 13–24.
Cheok, A.D., ShangPing, L., Kodagoda, S., Tat, K.E. and Thang, L.N. (2005) A social and physical inter-generational computer game for the elderly and children: age invaders. Proceedings of the Ninth IEEE International Symposium on Wearable Computers, pp. 202–203. IEEE. http://doi.org/10.1109/ISWC.2005.6.
Chipman, G., Fails, J.A., Druin, A. and Guha, M.L. (2011) Paper vs. tablet computers: a comparative study using tangible flags. Proceedings of the 10th International Conference on Interaction Design and Children, pp. 29–36. New York, NY, USA: ACM. http://doi.org/10.1145/1999030.1999034.
Deci, E., Vallerand, R., Pelletier, L. and Ryan, R. (1991) Motivation and education: the self-determination perspective. Educ. Psychol., 26, 325–346. http://doi.org/10.1207/s15326985ep2603&4_6.
Dempsey, J., Rasmussen, K. and Lucassen, B. (1996) The Instructional Gaming Literature: Implication and 99 Sources. Technical Report 96-1.
Deterding, S., Dixon, D., Khaled, R. and Nacke, L.
(2011) From game design elements to gamefulness: defining ‘gamification’. Proceedings of the 15th International Academic MindTrek Conference on Envisioning Future Media Environments, pp. 9–15. New York, NY, USA: ACM. http://doi.org/10.1145/2181037.2181040.
Dillenbourg, P. and Evans, M. (2011) Interactive tabletops in education. Int. J. Comput.-Support. Collab. Learn., 6, 491–514. http://doi.org/10.1007/s11412-011-9127-7.
Donker, A. and Reitsma, P. (2007) Young children’s ability to use a computer mouse. Comput. Educ., 48, 602–617. http://doi.org/10.1016/j.compedu.2005.05.001.
Drago, E. (2015) The effect of technology on face-to-face communication. Elon J. Undergrad. Res. Commun., 6, 13–19.
Dunn, J. (1988) The Beginnings of Social Understanding. Harvard University Press, Cambridge, MA, USA.
Eisenack, K. (2013) A climate change board game for interdisciplinary communication and education. Simul. Gaming, 44, 328–348. http://doi.org/10.1177/1046878112452639.
Fails, J.A., Druin, A., Guha, M.L., Chipman, G., Simms, S. and Churaman, W. (2005) Child’s play: a comparison of desktop and physical interactive environments. Proceedings of the 2005 Conference on Interaction Design and Children, pp. 48–55. New York, NY, USA: ACM. http://doi.org/10.1145/1109540.1109547.
Falloon, G. and Khoo, E. (2014) Exploring young students’ talk in iPad-supported collaborative learning environments. Comput. Educ., 77, 13–28. http://doi.org/10.1016/j.compedu.2014.04.008.
Fan, S.B. (2010) Roles in online collaborative problem solving. Proceedings of the 2010 IEEE Symposium on Visual Languages and Human-Centric Computing, pp. 265–266. IEEE. http://doi.org/10.1109/VLHCC.2010.51.
Fisher, R., Ury, W.L. and Patton, B. (2011) Getting to Yes: Negotiating Agreement Without Giving In (3rd edn).
Penguin Group, London, UK.
Garcia-Sanjuan, F., Jaen, J., Catala, A. and Fitzpatrick, G. (2015) Airsteroids: re-designing the arcade game using MarkAirs. Proceedings of the 2015 International Conference on Interactive Tabletops & Surfaces, pp. 413–416. New York, NY, USA: ACM. http://doi.org/10.1145/2817721.2823480.
Garcia-Sanjuan, F., Jaen, J., Fitzpatrick, G. and Catala, A. (2016a) MarkAirs: around-device interactions with tablets using fiducial markers—an evaluation of precision tasks. Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 2474–2481. New York, NY, USA: ACM. http://doi.org/10.1145/2851581.2892486.
Garcia-Sanjuan, F., Jaen, J. and Nacher, V. (2016b) From tabletops to multi-tablet environments in educational scenarios: a lightweight and inexpensive alternative. Proceedings of the 16th International Conference on Advanced Learning Technologies, pp. 100–101. IEEE.
Garcia-Sanjuan, F., Jaen, J. and Nacher, V. (2016c) Toward a general conceptualization of multi-display environments. Front. ICT, 3, 20:1–20:15. http://doi.org/10.3389/fict.2016.00020.
Garcia-Sanjuan, F., Nacher, V. and Jaen, J. (2016d) MarkAirs: are children ready for marker-based mid-air manipulations? Proceedings of the 9th Nordic Conference on Human-Computer Interaction, pp. 2.1–2.8. New York, NY, USA: ACM. http://doi.org/10.1145/2971485.2971517.
Georgiadi, N., Kokkoli-Papadopoulou, E., Kordatos, G., Partheniadis, K., Sparakis, M., Koutsabasis, P. and Stavrakis, M. (2016) A pervasive role-playing game for introducing elementary school students to archaeology. Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, pp. 1016–1020. New York, NY, USA: ACM. http://doi.org/10.1145/2957265.2963117.
Giannakos, M.N., Jaccheri, L. and Leftheriotis, I. (2012) Learning and creativity through tabletops: a learning analytics approach. Bull.
IEEE Tech. Committee Learn. Technol., 14, 11–13.
Ginott, H.G. (1961) Group Psychotherapy with Children: The Theory and Practice of Play-Therapy.
Gokhale, A.A. (1995) Collaborative learning enhances critical thinking. J. Technol. Educ., 7, 22–30. http://doi.org/10.21061/jte.v7i1.a.2.
Granic, I., Lobel, A. and Engels, R.C.M.E. (2014) The benefits of playing video games. Am. Psychol., 69, 66–78. http://doi.org/10.1037/a0034857.
Greenberg, A.D. and Nilssen, A.H. (2015) The role of education in building soft skills.
Gutwin, C. and Greenberg, S. (1998) Effects of awareness support on groupware usability. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 511–518. New York, NY, USA: ACM. http://doi.org/10.1145/274644.274713.
Gutwin, C. and Greenberg, S. (2002) A descriptive framework of workspace awareness for real-time groupware. Comput. Support. Coop. Work, 11, 411–446. http://doi.org/10.1023/A:1021271517844.
Harris, C. (2009) Meet the new school board: board games are back—and they’re exactly what your curriculum needs. Sch. Libr. J., 55, 24–26. http://www.slj.com/2009/05/collection-development/meet-the-new-school-board-board-games-are-back-and-theyre-exactly-what-your-curriculum-needs/.
Hassenzahl, M. and Tractinsky, N. (2006) User experience—a research agenda. Behav. Inf. Technol., 25, 91–97. http://doi.org/10.1080/01449290500330331.
Hatzilygeroudis, I., Grivokostopoulou, F. and Perikos, I. (2012) Using game-based learning in teaching CS algorithms. Proceedings of the IEEE International Conference on Teaching, Assessment, and Learning for Engineering, pp. 9–12. IEEE. http://doi.org/10.1109/TALE.2012.6360338.
Hornecker, E., Marshall, P., Dalton, N.S. and Rogers, Y. (2008) Collaboration and interference: awareness with mice or touch input.
Proceedings of the 2008 ACM Conference on Computer Supported Cooperative Work, pp. 167–176. New York, NY, USA: ACM. http://doi.org/10.1145/1460563.1460589.
Huizinga, J. (1949) Homo Ludens: A Study of the Play-Element in Culture. Routledge & Kegan Paul, London, UK.
Hung, C., Chang, T.-W., Yu, P.-T. and Cheng, P.-J. (2012) The problem solving skills and learning performance in learning multi-touch interactive jigsaw game using digital scaffolds. Proceedings of the 2012 IEEE Fourth International Conference on Digital Game and Intelligent Toy Enhanced Learning, pp. 33–38. IEEE, Piscataway, NJ, USA. http://doi.org/10.1109/DIGITEL.2012.13.
Hung, C.Y., Kuo, F.-O., Sun, J.C.-Y. and Yu, P.-T. (2014) An interactive game approach for improving students’ learning performance in multi-touch game-based learning. IEEE Trans. Learn. Technol., 7, 31–37. http://doi.org/10.1109/TLT.2013.2294806.
Jackson, A.T., Brummel, B.J., Pollet, C.L. and Greer, D.D. (2013) An evaluation of interactive tabletops in elementary mathematics education. Educ. Technol. Res. Dev., 61, 311–332. http://doi.org/10.1007/s11423-013-9287-4.
Kreitmayer, S., Rogers, Y., Laney, R. and Peake, S. (2013) UniPad: orchestrating collaborative activities through shared tablets and an integrated wall display. Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing, pp. 801–810. ACM Press. http://doi.org/10.1145/2493432.2493506.
Li, L.-Y., Chang, C.-W. and Chen, G.-D. (2009) Researches on using robots in education. Proceedings of the 4th International Conference on E-Learning and Games, pp. 479–482. Springer. http://doi.org/10.1007/978-3-642-03364-3_57.
Liao, Y.H. and Shen, C.-Y. (2012) Heuristic evaluation of digital game based learning: a case study. Proceedings of the 2012 IEEE Fourth International Conference on Digital Game and Intelligent Toy Enhanced Learning, pp. 192–196. IEEE.
http://doi.org/10.1109/DIGITEL.2012.54.
Liu, L., Hao, J., von Davier, A.A., Kyllonen, P. and Zapata-Rivera, J.-D. (2016) A tough nut to crack: measuring collaborative problem solving. In Handbook of Research on Technology Tools for Real-World Skill Development, pp. 344–359. IGI Global, Hershey, PA, USA. http://doi.org/10.4018/978-1-4666-9441-5.ch013.
Lohani, V., Castles, R., Lo, J. and Griffin, O. (2007) Tablet PC applications in a large engineering program. Proceedings of the 114th Annual ASEE Conference and Exposition, pp. 12.1341.1–12.1341.13.
Mann, A.-M., Hinrichs, U., Read, J.C. and Quigley, A. (2016) Facilitator, functionary, friend or foe?: studying the role of iPads within learning activities across a school year. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 1833–1845. ACM. http://doi.org/10.1145/2858036.2858251.
Martin, F.G., Butler, D. and Gleason, W.M. (2000) Design, story-telling, and robots in Irish primary education. Proceedings of the 2000 IEEE International Conference on Systems, Man and Cybernetics, Vol. 1, pp. 730–735. IEEE. http://doi.org/10.1109/ICSMC.2000.885082.
Mayumi, H. (2015) Next-generation digital educational solutions for use in the classroom. Fujitsu Sci. Tech. J., 51, 22–27.
McNally, M., Goldweber, M., Fagin, B. and Klassner, F. (2006) Do lego mindstorms robots have a future in CS education? Proceedings of the 37th SIGCSE Technical Symposium on Computer Science Education, pp. 61–62. New York, NY, USA: ACM Press. http://doi.org/10.1145/1121341.1121362.
Mercier, E., Vourloumi, G. and Higgins, S. (2015) Student interactions and the development of ideas in multi-touch and paper-based collaborative mathematical problem solving. Br. J. Educ. Technol., 48, 162–175. http://doi.org/10.1111/bjet.12351.
Miller, D.P., Nourbakhsh, I.R. and Siegwart, R. (2008) Robots for education. In Springer Handbook of Robotics, pp.
1283–1301. Springer, Berlin, Heidelberg. http://doi.org/10.1007/978-3-540-30301-5_56.
Mubin, O., Stevens, C.J., Shahid, S., Mahmud, A. Al and Dong, J.-J. (2013) A review of the applicability of robots in education. Technol. Educ. Learn., 1, 1–7. http://doi.org/10.2316/Journal.209.2013.1.209-0015.
Nacher, V., Ferreira, A., Jaen, J. and Garcia-Sanjuan, F. (2016) Are kindergarten children ready for indirect drag interactions? Proceedings of the 2016 ACM on Interactive Surfaces and Spaces, pp. 95–101. New York, NY, USA: ACM. http://doi.org/10.1145/2992154.2992186.
Nacher, V., Jaen, J., Navarro, E., Catala, A. and González, P. (2015) Multi-touch gestures for pre-kindergarten children. Int. J. Hum. Comput. Stud., 73, 37–51. http://doi.org/10.1016/j.ijhcs.2014.08.004.
Nouri, J., Åkerfeldt, A., Fors, U. and Selander, S. (2017) Assessing collaborative problem solving skills in technology-enhanced learning environments—the PISA framework and modes of communication. Int. J. Emerg. Technol. Learn., 12, 163–174. http://doi.org/10.3991/ijet.v12i04.6737.
Obeid, N. and Moubaiddin, A. (2009) On the role of dialogue and argumentation in collaborative problem solving. Proceedings of the 9th International Conference on Intelligent Systems Design and Applications, pp. 1202–1208. http://doi.org/10.1109/ISDA.2009.60.
OECD (2010) PISA 2012 field trial problem solving framework.
OECD (2013) PISA 2015 draft collaborative problem solving framework. https://www.oecd.org/pisa/pisaproducts/Draft PISA 2015 Collaborative Problem Solving Framework.pdf.
O’Connor, M.C. and Michaels, S. (1996) Shifting participant frameworks: orchestrating thinking practices in group discussion. In Hicks, D. (ed.), Discourse, Learning, and Schooling, pp. 63–103. Cambridge University Press, Cambridge, UK. http://doi.org/10.1017/CBO9780511720390.003.
Google Scholar CrossRef Search ADS Pachauri, D. and Yadav, A. ( 2014) Importance of soft skills in teacher education programme. Int. J. Educ. Res. Technol. , 5, 22– 25. Google Scholar CrossRef Search ADS Pan, E., Chiu, J.L., Inkelas, K., Garner, G., Russell, S. and Berger, E. ( 2015) Affordances and constraints of physical and virtual manipulatives for learning dynamics. Int. J. Eng. Educ. , 31, 1629– 1644. Papert, S. ( 1980) Mindstorms: Children, Computers, and Powerful Ideas . Harvester Press, Hemel Hempstead, Hertfordshire, UK. Pindeh, N., Suki, N.M. and Suki, N.M. ( 2016) User acceptance on mobile apps as an effective medium to learn kadazandusun language. Procedia Econ. Financ. , 37, 372– 378. http://doi.org/10.1016/S2212-5671(16)30139-3. Google Scholar CrossRef Search ADS Przybylski, A.K. and Weinstein, N. ( 2013) Can you connect with me now? How the presence of mobile communication technology influences face-to-face conversation quality. J. Soc. Pers. Relat. , 30, 237– 246. http://doi.org/10.1177/0265407512453827. Google Scholar CrossRef Search ADS Raman, R., Lal, A. and Achuthan, K. ( 2014) Serious games based approach to cyber security concept learning: Indian context. Proceedings of the 2014 International Conference on Green Computing Communication and Electrical Engineering, pp. 1– 5. IEEE. http://doi.org/10.1109/ICGCCEE.2014.6921392. Read, J.C. ( 2008) Validating the Fun Toolkit: an instrument for measuring children’s opinions of technology. Cogn. Technol. Work , 10, 119– 128. http://doi.org/10.1007/s10111-007-0069-9. Google Scholar CrossRef Search ADS Read, J.C. and MacFarlane, S. ( 2006) Using the fun toolkit and other survey methods to gather opinions in child computer interaction. Proceedings of the 2006 Conference on Interaction Design and Children, pp. 81– 88. New York, NY, USA: ACM. http://doi.org/10.1145/1139073.1139096. Rick, J., Marshall, P. and Yuill, N. 
( 2011) Beyond one-size-fits-all: how interactive tabletops support collaborative learning. Proceedings of the 10th International Conference on Interaction Design and Children, pp. 109– 117. New York, NY, USA: ACM. http://doi.org/10.1145/1999030.1999043. Rick, J. and Rogers, Y. ( 2008) From DigiQuilt to DigiTile: adapting educational technology to a multi-touch table. Proceedings of 2008 3rd IEEE International Workshop on Horizontal Interactive Human Computer Systems, pp. 73– 80. IEEE. http://doi.org/10.1109/TABLETOP.2008.4660186. Romeo, G., Edwards, S., McNamara, S., Walker, I. and Ziguras, C. ( 2003) Touching the screen: issues related to the use of touchscreen technology in early childhood education. Br. J. Educ. Technol. , 34, 329– 339. http://doi.org/10.1111/1467-8535.00330. Google Scholar CrossRef Search ADS Saerbeck, M., Schut, T., Bartneck, C. and Janse, M.D. ( 2010) Expressive robots in education: varying the degree of social supportive behavior of a robotic tutor. Proceedings of the 28th international conference on Human factors in computing systems, pp. 1613–1622. New York, NY, USA: ACM Press. http://doi.org/10.1145/1753326.1753567. Salvador, G., Pérez, D., Ortega, M., Soto, E., Alcañiz, M. and Contero, M. ( 2012) Evaluation of an augmented reality enhanced tabletop system as a collaborative learning tool: a case study on mathematics at the primary school. Proceedings of the Eurographics 2012—Education Papers. The Eurographics Association. http://doi.org/10.2312/conf/EG2012/education/009-016. Schneider, B., Blikstein, P. and Mackay, W. ( 2012) Combinatorix: a tangible user interface that supports collaborative learning of probabilities. Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces, pp. 129–132. ACM. http://doi.org/10.1145/2396636.2396656. Schneider, B., Jermann, P., Zufferey, G. and Dillenbourg, P. ( 2011) Benefits of a tangible interface for collaborative learning and interaction. IEEE Trans. Learn. Technol. 
, 4, 222– 232. http://doi.org/10.1109/TLT.2010.36. Google Scholar CrossRef Search ADS Siang, A.C. and Rao, R. K.. ( 2003) Theories of learning: a computer game perspective. Proceedings of the 5th International Symposium on Multimedia Software Engineering, pp. 239–245. IEEE. http://doi.org/10.1109/MMSE.2003.1254447. Soute, I., Markopoulos, P. and Magielse, R. ( 2010) Head Up Games: combining the best of both worlds by merging traditional and digital play. Pers. Ubiquitous Comput. , 14, 435– 444. http://doi.org/10.1007/s00779-009-0265-0. Google Scholar CrossRef Search ADS Stanton, D., Pridmore, T., Bayon, V., Neale, H., Ghali, A., Benford, S. and Wilson, J. ( 2001) Classroom collaboration in the design of tangible interfaces for storytelling. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 482–489. New York, NY, USA: ACM. http://doi.org/10.1145/365024.365322. Sternberg, R.J. and O’Hara, L.A. ( 1999) Creativity and intelligence. In Handbook of Creativity . pp. 251– 272. Cambridge University Press Cambridge, UK. Strawhacker, A. and Bers, M.U. ( 2014) ‘I want my robot to look for food’: comparing Kindergartner’s programming comprehension using tangible, graphic, and hybrid user interfaces. Int. J. Technol. Des. Educ. , 25, 293– 319. http://doi.org/10.1007/s10798-014-9287-7. Google Scholar CrossRef Search ADS Strommen, E. ( 1994) Children’s use of mouse-based interfaces to control virtual travel. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems Celebrating Interdependence, pp. 405–410. New York, NY, USA: ACM Press. http://doi.org/10.1145/191666.191803. Sutterer, K. and Sexton, S. ( 2008) Interactive learning using a tablet PC in civil engineering soil mechanics. Proceedings of the 2008 Annual ASEE Conference and Exposition, pp. 13.783.1– 13.783.15. Tan, J.L., Goh, D.H.-L., Ang, R.P. and Huan, V.S. ( 2016) Learning efficacy and user acceptance of a game-based social skills learning environment. Int. J. 
Child-Comput. Interact. , 9–10, 1– 19. http://doi.org/10.1016/j.ijcci.2016.09.001. Google Scholar CrossRef Search ADS The New Media Consortium (NMC). ( 2017) NMC Horizon Report—2017 Higher Education Edition. The New Media Consortium (NMC), & Consortium for School Networking (CoSN). ( 2017) NMC/CoSN Horizon Report—2017 K-12 Edition. UNESCO. ( 2017) Schools in action, global citizens for sustainable development: a guide for students. van Breemen, A. J. N. ( 2004) Bringing robots to life: applying principles of animation to robots. Proceedings of the Workshop on Shaping Human-Robot Interaction—Understanding the Social Aspects of Intelligent Robotic Products. In Cooperation with the CHI 2004 Conference. ACM. Vygotsky, L.S. ( 1978) Mind in Society: The Development of Higher Psychological Processes . President and Fellows of Harvard College, Cambridge, MA, USA. Webb, M. and Gibson, D. ( 2015) Technology enhanced assessment in complex collaborative settings. Educ. Inf. Technol. , 20, 675– 695. http://doi.org/10.1007/s10639-015-9413-5. Google Scholar CrossRef Search ADS Wei, C.-W., Hung, I.-C., Lee, L. and Chen, N.-S. ( 2011) A joyful classroom learning system with robot learning companion for children to learn mathematics multiplication. Turkish Online J. Educ. Technol. , 10, 11– 23. Werbach, K. and Hunter, D. ( 2012) For The Win: How Game Thinking Can Revolutionize Your Business . Wharton Digital Press, Philadelphia. Westergaard, J. ( 2009) Effective Group Work With Young People . Open University Press, Berkshire, UK. Wouters, P., van Nimwegen, C., van Oostendorp, H. and van der Spek, E.D. ( 2013) A meta-analysis of the cognitive and motivational effects of serious games. J. Educ. Psychol. , 105, 249– –265. http://doi.org/10.1037/a0031311. Google Scholar CrossRef Search ADS Xie, L., Antle, A.N. and Motamedi, N. ( 2008) Are tangibles more fun?: comparing children’s enjoyment and engagement using physical, graphical and tangible user interfaces. 
Proceedings of the 2nd International Conference on Tangible and Embedded Interaction, pp. 191–198. New York, NY, USA: ACM. http://doi.org/10.1145/1347390.1347433. Zagal, J.P., Rick, J. and Hsi, I. ( 2006) Collaborative games: lessons learned from board games. Simul. Gaming , 37, 24– 40. http://doi.org/10.1177/1046878105282279. Google Scholar CrossRef Search ADS Author notes Editorial Board Member: Dr Effie Law © The Author(s) 2018. Published by Oxford University Press on behalf of The British Computer Society. All rights reserved. For Permissions, please email: firstname.lastname@example.org This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/about_us/legal/notices)
Interacting with Computers – Oxford University Press
Published: Mar 7, 2018