Playground for practising design thinking: Training activities and toolkits for teaching problem framing.


Xue Pei
https://orcid.org/0000-0002-5172-264X
Leandro Sgro
https://orcid.org/0009-0001-0935-6578

Abstract

This study adopts the metaphor of a “Playground” to explore how to create effective learning environments and situated guides and tools that enable non-design students to learn and practice the early phases of the design process: design research and problem framing. Grounded in both theoretical frameworks and the authors’ practical experience in applying and teaching design thinking, the research introduces a structured yet adaptable visual toolkit aimed at supporting educators in facilitating design-based learning for students with non-design backgrounds. The toolkit was developed and refined through an iterative action research approach across three higher education courses. Emphasising empathy, ambiguity navigation, and creative exploration, it seeks to cultivate key designer mindsets and problem-framing capabilities. The findings present both practical toolkits, including the process and specific tools, and pedagogical insights for educators and researchers who apply design thinking to interdisciplinary educational contexts. This research contributes to the ongoing discourse on design pedagogy by examining the opportunities and challenges of transferring design thinking methodologies and practices across disciplines. It underscores the value of situated and experiential learning strategies in fostering a designerly approach to complex problem-framing in non-design domains.


1. Introduction

This study is situated within the field of design education, with a specific focus on how design thinking can be introduced and taught to non-design students and in non-design learning contexts. In recent years, design thinking has gained increasing attention as a pedagogical framework capable of fostering creativity, user-centred exploration, and problem-solving (Foster, 2019). While widely adopted in design schools, its implementation in other disciplines, such as management and innovation studies, remains challenging (Dunne & Martin, 2006).

This paper explores how the early stages of the design process - particularly the Discover and Define phases - can be supported through the use of structured visual tools. These phases are essential for developing skills such as empathy, problem-framing, and iterative reasoning. However, for learners unfamiliar with design culture, engaging in open-ended and research-oriented activities can be difficult without clear scaffolding.

Playground is a visual and digital toolkit created to help educators introduce students to these phases through accessible, guided activities. It was developed through an exploratory, action-research process and tested in various higher education contexts with students from different academic backgrounds.

This study aims to advance the discourse on design education by offering practical tools and insights to support problem-framing and creative thinking in interdisciplinary learning environments. By translating abstract design principles and processes into accessible, actionable formats, the research equips educators with concrete means to integrate design practices into a wide range of pedagogical contexts.

2. Theoretical background

2.1. Why teaching design thinking is important

In recent decades, design thinking has expanded beyond its disciplinary origins, becoming a framework adopted in diverse fields of education, from business to engineering. Its potential to foster creativity, empathy, and iterative problem-solving has been widely acknowledged (Brown, 2009; Martin, 2009). More recently, Cross (2023) argued that simplified processes and the application of standard models and patterns block the most valuable features of design thinking. In education, design thinking has been increasingly adopted for its potential to foster creativity, collaboration, and innovation (Liedtka, 2015; Henriksen et al., 2017), especially in preparing university students to face today’s challenges (Calavia et al., 2023). However, its implementation in non-design learning environments remains problematic. Students who are unfamiliar with design culture often struggle to engage with its open-ended and iterative nature, particularly during the early stages of the process, where ambiguity, empathy, and problem-framing are essential. Design problems are inherently “wicked” and do not lend themselves to linear problem-solving methods. This makes the initial phases of the design process - Discover and Define (Design Council, 2003) - especially challenging for learners accustomed to solution-oriented, disciplinary thinking. For many educators, facilitating these phases requires scaffolding strategies that go beyond lectures or case-based learning.

One key challenge in design education is helping students navigate the early stages of the process, where uncertainty dominates and research and problem-framing activities play a crucial role in the search for promising and innovative solutions (Dorst, 2015). This stage, also called the Fuzzy Front End phase of the new product development process (Frishammar et al., 2010), is often underrepresented in classroom settings, which tend to prioritise solution generation over problem exploration (Kolko, 2010). The lack of structured tools and methods for guiding students through these phases limits their ability to develop critical and reflective thinking.

2.2. Project-based learning for teaching design thinking

Project-Based Learning (PBL) has emerged as a fundamental element for integrating design methods into management education. The literature emphasises that PBL not only makes learning more engaging but also allows students to experience design thinking first-hand by applying it to real-world problems (Thomas, 2000). PBL encourages students to tackle authentic challenges, interact with stakeholders, and iterate on their solutions - elements that are also central to the design process. Moreover, research in experiential learning (Kolb, 1984; Schön, 2013) highlights the importance of learning by doing, reflection in action, and iterative feedback. These pedagogical principles align closely with the goals of design-based education, particularly in interdisciplinary contexts where students need tangible entry points into abstract processes. Visual tools and design canvases, such as the Business Model Canvas (Osterwalder & Pigneur, 2010) and the Design Thinking Toolbox (Lewrick et al., 2020), can support learning and facilitate ideation and strategic thinking (Bower, 2011). However, few tools are explicitly designed to support the early stages of the design thinking process, where the emphasis is on gathering knowledge, identifying insights, and reframing challenges.

3. Methodology

In response to the challenges outlined above, this research seeks to enrich the existing discourse on teaching design thinking in non-design contexts by introducing a guided process and a set of supportive tools, with a particular emphasis on the early stages of the design process. While various studies offer differing interpretations of the phases within the design thinking framework (Beckman & Barry, 2007; Meinel et al., 2011), this paper adopts the terminology of the Double Diamond model (Design Council, 2003), referring specifically to the initial two stages as the Discover and Define phases. This research addresses the following questions:

  • How can students with non-design backgrounds be supported in learning to frame problems through design research?
  • What kind of processes, tools, and learning environments help educators introduce design thinking effectively in non-design academic settings?

3.1. Research setting

The authors use “Playground” as both the metaphor for and the name of the developed process and toolkit: a set of visual, modular structures aimed at helping educators and learners engage with design research without prior experience, making the design process more accessible, inclusive, and reflective. Drawing from design-based learning (Beckman & Barry, 2007; Rauth et al., 2010) and constructivist pedagogies (Kolb, 1984), the toolkit aims to provide educators with accessible instruments to support interdisciplinary and reflective learning. This research follows an exploratory and qualitative methodology based on principles of action research and iterative development. The aim was to design, implement, and refine the Playground toolkit to support learning in the early stages of the design process, particularly in educational contexts involving students with no prior design experience.

The toolkit was tested in three courses that served as samples in this research.

Course name | Higher Education Institution | Number of non-design students | Course duration
Business Design and Transformation lab (Sample 1) | POLIMI Design System | 86 (96 in total) | 4 months, 8h per week
Intensive Design Workshop (Sample 2) | POLIMI Design System | 9 (34 in total) | 11 days, 8h per day
Research Methods Course (Sample 3) | Istituto Europeo di Design | 10 (20 in total) | 3 months, 3h per week

Table 1. Information on the courses used for testing the teaching process and toolkits.

3.2. Research process

Across these settings, the Playground toolkit was designed as a set of seven structured tools, each supporting a key activity of the Discover and Define phases.

3.2.1 Development of the “Playground” toolkit

In the initial phase of the project, the foundational components of the Playground toolkit were developed with the aim of equipping educators with a set of practical, accessible tools to guide non-design students through the early stages of the design process. Grounded in the logic of how designers generate knowledge and make sense of complexity in an innovation process - moving from data collection to the framing of problems and opportunities through synthesis (Kolko, 2010) - the toolkit was designed to support students in developing a designerly mindset. The process and tools were designed to equip students with the problem-framing capability of designers, who “translate” collected information and data into promising insights and potential strategic directions for developing innovative products, services, and experiences (Dorst, 2015). The development process followed an iterative and user-centred approach, prioritising simplicity, usability, and alignment with pedagogical goals. Leveraging Miro, a digital collaborative platform, enabled rapid prototyping: tools were continuously shared, tested, and refined based on feedback, ensuring their responsiveness to real educational needs. Each tool within the toolkit was crafted to support specific aspects of the design research process, with a particular emphasis on the Discover and Define phases. These stages are essential for cultivating empathy, uncovering user insights, and framing actionable design opportunities.

3.2.2 Testing and measurement

This phase focused on assessing the effectiveness of the toolkit and on gathering data to inform its subsequent refinement. Measurement was carried out through observation, interviews with students, and conversations with educators who participated in the three courses. The effectiveness of the toolkit was evaluated across three key dimensions: ease of use, student engagement, and learning outcomes. Ease of use referred to the extent to which students were able to navigate the toolkit’s components and comprehend its intended purpose. Engagement focused on the toolkit’s ability to foster active participation in the design research process. Learning outcomes were assessed by examining how effectively the toolkit supported students in understanding and applying design research methods, with particular attention to the Discover and Define phases of the design process. The evaluation was designed for two groups: “non-design students” and “educators”.

The evaluation criteria applied to “non-design students” first aimed to understand how intuitively the toolkit could be navigated. Students were asked to reflect on the ease with which they could access its components and understand its overall purpose. Specific sections were examined for levels of difficulty, and respondents were asked which parts of the toolkit posed the greatest challenge. In parallel, the clarity and layout of the toolkit’s instructions were assessed, with participants asked to rate how clear the guidance was for each section. Judgments were based on the clarity of instructions and the intuitive nature of the toolkit’s layout. The evaluation also sought to understand the extent to which the toolkit encouraged active participation in the design research process: were students actively engaged in research tasks, and did the toolkit influence their level of involvement with team members? Judgments were based on observed levels of student participation and qualitative feedback about the toolkit’s influence on engagement. To assess the impact of the toolkit on learning outcomes, the research looked at how effectively it helped students understand and apply design research methods. Students were asked which methods were most or least understood, whether they were able to apply them independently, and how comfortable they felt with the toolkit. Evaluation questions also addressed overall student satisfaction and perceptions of the toolkit’s usefulness.

In parallel, the evaluation of educators was structured around four main criteria: facilitation support, adaptability, monitoring, and assessment. To understand support for facilitation, educators were asked to reflect on which sections of the toolkit were easiest or hardest to facilitate. In addition, educators were asked whether they found it easy to guide students through the toolkit’s components and how much additional support was needed to explain the toolkit. The adaptability of the toolkit was examined in relation to its suitability for different projects and student groups. Educators were asked what adjustments they made to the toolkit for specific student contexts and how easily they could customise it to fit different learning objectives or course requirements. Responses were evaluated for indications of flexibility and ease of customisation of the toolkit components. This part of the evaluation relied on free-text responses and was primarily supported through document review. To understand how effectively the toolkit supported the monitoring of student progress, educators were asked how they tracked student development through the toolkit and whether they encountered any challenges in assessing student engagement and outcomes. Questions focused on both the ease of tracking progress and the efficiency of assessment procedures. This evaluation was based on free-form responses and interpreted through observations and informal conversations with facilitators. Finally, educators were invited to reflect on how the toolkit affected students’ understanding and application of design research, as well as whether it fostered creativity and collaboration within student teams. These reflections were gathered via open-ended responses and informal feedback, with an emphasis on overall learning outcomes.
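For readers who wish to reuse this evaluation scheme, the dimensions and questions described above can be summarised as a lightweight rubric. The sketch below (in Python, purely for illustration) restates them in a structured form; the grouping follows this section, but the wording is a paraphrase of the text rather than an instrument administered in the study.

```python
# Illustrative restatement of the evaluation dimensions described in
# Section 3.2.2. The wording paraphrases the text; this is not a
# validated instrument used in the study.

EVALUATION_RUBRIC = {
    "non-design students": {
        "ease of use": [
            "How intuitively could the toolkit be navigated?",
            "Which sections posed the greatest challenge?",
            "How clear were the instructions and layout for each section?",
        ],
        "engagement": [
            "Were students actively engaged in research tasks?",
            "Did the toolkit influence involvement with team members?",
        ],
        "learning outcomes": [
            "Which design research methods were most or least understood?",
            "Could students apply the methods independently?",
            "How satisfied were students, and how useful did they find the toolkit?",
        ],
    },
    "educators": {
        "facilitation support": [
            "Which sections were easiest or hardest to facilitate?",
            "How much additional support was needed to explain the toolkit?",
        ],
        "adaptability": [
            "What adjustments were made for specific student contexts?",
            "How easily could the toolkit be customised to course objectives?",
        ],
        "monitoring": [
            "How was student progress tracked through the toolkit?",
            "Were there challenges in assessing engagement and outcomes?",
        ],
        "assessment": [
            "Did the toolkit improve understanding and application of design research?",
            "Did it foster creativity and collaboration within teams?",
        ],
    },
}

def questions_for(audience: str, dimension: str) -> list[str]:
    """Return the sample questions for a given audience and dimension."""
    return EVALUATION_RUBRIC[audience][dimension]

if __name__ == "__main__":
    for question in questions_for("educators", "adaptability"):
        print("-", question)
```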

3.2.3 Analysis

The final phase focused on analysing and synthesising the data to identify actionable insights that could be applied to refine the toolkit. For instance, one of the key learnings was that students benefited most when they had clear examples of how to use each component of the toolkit. This led to the addition of new instructional content and use-case scenarios within the toolkit, which made it easier for students to relate the abstract concepts of design research to concrete applications. Additionally, feedback indicated that the toolkit’s digital format needed to be more intuitive, prompting a redesign of the user interface within the Miro platform to improve navigation and accessibility. Learning in this phase was not just about improving the toolkit; it also involved reflecting on the process itself. The analysis of feedback enabled the toolkit to be improved on the basis of real educational experiences, ultimately making it a more effective resource for students, especially those without a design background. The research process demonstrated the value of user-centred design in educational contexts.

4. Results

This section presents the Playground toolkit as the primary outcome, which is followed by the evaluation results derived from the implementation and testing in three courses.

4.1. The Playground toolkit

The Playground toolkit consists of seven tools that guide non-design students step by step through the Discover and Define phases: Company Brief, Research Plan, Knowledge Repository, Sharing Findings, Mapping Opportunities, Prioritisation Session, and Project Directions (Fig. 1). It begins with the Company Brief, which sets the project context and outlines objectives and expectations. The Research Plan helps structure both field and desk research. A Knowledge Repository organises collected data for easy access and collaboration. Sharing Findings encourages teams to synthesise insights together, while Mapping Opportunities clusters these findings to identify key needs. The Prioritisation Session helps students evaluate opportunities based on impact and feasibility. Finally, Project Directions (Fig. 2) supports the creation of strategic scenarios that lead to innovative solutions. All tools were developed digitally on the Miro platform.

Figure 1. Overview of the seven tools that compose the Playground toolkit. Designed by the authors.
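To make the sequence of the seven tools easier to reuse outside Miro, the sketch below restates it as a simple data structure. This is an illustrative representation only: the phase assignment of each tool is inferred from the description above rather than stated tool by tool in the paper, and the scoring heuristic shown for the Prioritisation Session is one common way to operationalise an impact/feasibility evaluation, not the authors’ prescribed procedure.

```python
# Illustrative sketch of the Playground toolkit sequence. The real toolkit
# is a set of visual canvases on Miro; this code only restates the order of
# the tools (following the narrative in Section 4.1) and a plausible,
# inferred mapping to the Discover and Define phases.

from dataclasses import dataclass

@dataclass
class Tool:
    name: str
    phase: str   # "Discover" or "Define" (inferred, not stated per tool in the paper)
    purpose: str

PLAYGROUND_SEQUENCE = [
    Tool("Company Brief", "Discover", "Set the project context, objectives, and expectations"),
    Tool("Research Plan", "Discover", "Structure field and desk research"),
    Tool("Knowledge Repository", "Discover", "Organise collected data for access and collaboration"),
    Tool("Sharing Findings", "Define", "Synthesise insights as a team"),
    Tool("Mapping Opportunities", "Define", "Cluster findings to identify key needs"),
    Tool("Prioritisation Session", "Define", "Evaluate opportunities by impact and feasibility"),
    Tool("Project Directions", "Define", "Frame strategic scenarios that lead to solutions"),
]

def prioritise(opportunities: dict[str, tuple[int, int]]) -> list[str]:
    """Rank opportunities by the sum of (impact, feasibility) scores, 1-5 each.
    This is an illustrative heuristic, not the procedure prescribed by the toolkit."""
    return sorted(opportunities, key=lambda name: sum(opportunities[name]), reverse=True)

if __name__ == "__main__":
    for tool in PLAYGROUND_SEQUENCE:
        print(f"[{tool.phase}] {tool.name}: {tool.purpose}")
    print(prioritise({"Opportunity A": (4, 2), "Opportunity B": (3, 5)}))
```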

The iterative and reflexive nature of the action research process allowed the toolkit to evolve over time. Each application led to improvements in layout, wording, and usability. For example, instructions were simplified, visual hierarchy was reinforced, and terminology was adapted to better suit the audience. Through these cycles, the Playground toolkit emerged not only as a set of tools but as a pedagogical space for fostering creative confidence and critical design inquiry among non-design learners. The implementation of the Playground toolkit across three distinct educational settings revealed several patterns in how non-design students and educators interacted with the toolkit and how it supported learning in early-stage design processes.

Figure 2. One example of the seven tools: Future Project Directions. Designed by the authors.

4.2. Evaluation results of the Playground toolkit

The evaluation of the Playground toolkit across three learning environments (three samples) revealed a mix of shared strengths, context-specific challenges, and opportunities for refinement. A common strength identified across all cases was the effectiveness of the Research Plan and Knowledge Repository tools. These tools were consistently well-received by both students and educators for their clarity, structure, and ease of facilitation. Their familiarity, particularly in courses with a foundation in Design Thinking, contributed to smoother implementation and active use during the early stages of a design process.

In both Sample 1 and Sample 2, the toolkit proved valuable at the very beginning of the process, where it enabled structured and guided learning. Educators in Sample 1 appreciated the toolkit’s flexibility, its editable format, and the ease of tracking students’ progress, both in person and remotely. Similarly, Sample 2 highlighted the toolkit’s support for educators managing fast-paced, high-intensity sessions, where the tool served as an anchor for coordination. However, both contexts also reported common challenges in the later stages of the design process, particularly during the use of the Mapping Opportunities tool. Students struggled to move from research findings to actionable insights and project directions, revealing a gap in their ability to analyse and synthesise data from design research. Furthermore, both settings noted the need for improved educator involvement, especially in guiding evaluation and bridging the gap between research and ideation.

The Prioritisation Session and Mapping Opportunities tools encouraged group discussion and synthesis of findings. Students considered the toolkit intuitive and helpful in making sense of data, even if some required early guidance to understand each tool’s purpose.

“I wasn’t sure what ‘mapping opportunities’ meant at first, but once we got into it, it helped us see patterns that we hadn’t noticed.” (student from Sample 2).

The Project Directions tool was particularly useful in helping students articulate project goals and define speculative scenarios. While not all students fully internalised the methods, qualitative feedback indicated that the Playground toolkit supported the development of creative confidence (Kelley & Kelley, 2012), especially in learners with no prior exposure to design research. Educators noted that the toolkit encouraged reflective thinking and a more strategic understanding of design processes.

Sample 3 revealed some limitations of the toolkit. While the Research Plan and Knowledge Repository remained the most used tools, the overall adoption of the toolkit was minimal. Students working within a more academically structured design research course found the toolkit’s static format less compatible with their workflow and preferred alternative platforms. This resistance led to lower engagement, minimal data input, and reduced progress tracking. Unlike in Samples 1 and 2, participants in Sample 3 did not report enhanced collaboration or creativity through the toolkit. Instead, they found its structure too rigid for the depth and autonomy expected in an academic research setting.

Across all three courses, there was a shared recognition of the toolkit’s value in early-stage research planning, but also a common call for enhancements in later-stage tools such as Mapping Opportunities, Scenario Definition, and Project Directions. Opportunities for improvement include adapting the vocabulary for accessibility, integrating ideation and creativity-tracking tools, enabling individual contribution assessment, and making the toolkit more dynamic and customisable for diverse educational formats. While the toolkit served as a solid foundation for guiding students through design research, its future development will benefit from tailoring to specific pedagogical needs, especially in settings with higher academic or creative demands.

Overall, the iterative cycles of testing and feedback demonstrated that the Playground toolkit effectively supported both individual and collaborative learning processes, while maintaining the flexibility to be adapted across diverse course structures, timeframes, and levels of design literacy. Insights gathered from the three educational settings were synthesised to provide a comparative perspective on the toolkit’s performance in varied pedagogical contexts. An illustrative example of its application is shown in Fig. 3.

Figure 3. An example of how students used the Project Directions tool. Students’ work from Sample 2.

Educators reported that the toolkit provided a valuable scaffold to support the Discover and Define phases. Tools such as Research Plan, Knowledge Repository, and Sharing Findings were perceived as especially useful in helping students document, reflect on, and present their field research. The evaluation results highlight that most instructors found the toolkit components easy to facilitate and adaptable to different learning contexts.

“The toolkit gave students something to hold on to — a structure to navigate ambiguity.” (educator from Sample 3)

5. Conclusion

This research has explored the development and implementation of Playground, a visual and digital toolkit designed to support educators and non-design students during the early stages of the design process. Applied across three higher education courses with varying durations, objectives, and student backgrounds, the toolkit demonstrated its potential to support design research, facilitate collaboration, and promote critical engagement with problem framing.

The qualitative feedback gathered through observation, informal interviews, and conversations with both students and educators highlighted several recurring benefits: students gained a structured entry point into design methods, educators appreciated the adaptability of the tools, and the canvases served as a shared visual language that supported team alignment and reflection. While challenges emerged in terms of initial onboarding and digital fluency, these were often mitigated through facilitation and iteration.

Importantly, the Playground toolkit offered a pedagogical contribution not only in terms of content but also in terms of format - providing a modular, visual, and collaborative environment for learning-by-doing. Its use in interdisciplinary contexts confirmed that well-designed toolkits can make abstract phases of design more accessible and actionable, particularly for students without prior exposure to design thinking.

Further research may expand on this foundation by assessing long-term learning outcomes, integrating the toolkit into hybrid or distance-learning settings, or co-developing new canvases with educators across different disciplines.

Acknowledgements

This work was developed independently and did not receive specific funding. The authors would like to thank the students and educators who contributed to the testing and refinement of the Playground toolkit across the three academic courses mentioned in this paper.

Author Biographies

Xue Pei, Department of Design, Politecnico di Milano

PhD in Design and Assistant Professor at the Department of Design, Politecnico di Milano, where she is a member of the Design+Strategies research group. She previously served as part of the Department’s Research Delegation and has been a visiting researcher at both Delft University of Technology and Chalmers University of Technology. Her research explores the strategic role of design and design thinking in driving innovation within organisations and complex socio-technical systems. Her primary interests include strategic design, design thinking, design for organizational change, and design for systemic transitions. Her current work focuses on enabling sustainable and circular transformation in the furniture sector through servitization and design-driven innovation.

Dr. Pei has extensive experience leading and contributing to interdisciplinary and intercultural research initiatives. She has coordinated several EU-funded projects (Horizon Europe, Horizon 2020, Interreg, Erasmus+) as well as national research programmes (PNRR MICS, Fondazione Cariplo). She also collaborates with industry partners on strategic and service design initiatives. She is an active lecturer in Europe and Asia, teaching courses in strategic design, design thinking, and design research. Her academic contributions are regularly published in international peer-reviewed journals and conference proceedings.

Leandro Sgro, Independent designer and educator

Independent designer and educator, based in Milan, with a background in Strategic Design and Management Engineering. He collaborates as a teaching assistant at Politecnico di Milano and as a tutor and coach at Poli.Design, supporting master’s students in Strategic Design during workshops and research activities. He also teaches at Istituto Europeo di Design (IED), where he leads courses and workshops focused on research methods, design thinking, and creative facilitation for interdisciplinary student teams. His approach combines visual tools, speculative scenarios, and collaborative learning environments to foster problem framing and innovation literacy across diverse educational settings.

Leandro has contributed to several academic and applied design projects, including international programs, industry collaborations, and European-funded initiatives. His research and practice explore how design methods can be translated into tools that support non-design learners and educators. He developed Playground, a toolkit for teaching early-stage design research through action research, and tested it in higher education contexts. His work bridges pedagogy, strategy, and design culture with a particular focus on inclusive and adaptive learning formats.

References

Beckman, S. L., & Barry, M. (2007). Innovation as a learning process: Embedding design thinking. California Management Review, 50(1), 25–56.

Bower, M. (2011). Redesigning a web-conferencing environment to scaffold computing students’ creative design processes. Educational Technology & Society, 14(1), 27–42.

Brown, T. (2009). Change by Design: How design thinking transforms organisations and inspires innovation. New York NY, USA: Harper Business Press.

Calavia, M. B., Blanco, T., Casas, R., & Dieste, B. (2023). Making design thinking for education sustainable: Training preservice teachers to address practice challenges. Thinking Skills and Creativity, 47, 101199.

Cross, N. (2023). Design thinking: What just happened? Design Studies, 86.

Design Council. (2003). “The Double Diamond.” Available at: https://www.designcouncil.org.uk/our-resources/the-double-diamond/ [accessed 31 March 2025]

Dorst, K. (2015). Frame Innovation: Create New Thinking by Design. The MIT Press. https://doi.org/10.7551/mitpress/10096.001.0001

Dunne, D., & Martin, R. (2006). Design Thinking and How It Will Change Management Education: An Interview and Discussion. Academy of Management Learning & Education, 5(4), 512–523. https://doi.org/10.5465/amle.2006.23473212

Foster, M. K. (2019). Design Thinking: A Creative Approach to Problem Solving. Management Teaching Review, 6(2), 123–140. https://doi.org/10.1177/2379298119871468

Frishammar, J., Florén, H., & Wincent, J. (2010). Beyond managing uncertainty: Insights from studying equivocality in the fuzzy front end of product and process innovation projects. IEEE Transactions on Engineering Management, 58(3), 551–563.

Henriksen, D., Richardson, C., & Mehta, R. (2017). Design thinking: A creative approach to educational problems of practice. Thinking Skills and Creativity, 26, 140–153. https://doi.org/10.1016/j.tsc.2017.10.001

Kelley, T., & Kelley, D. (2012). Reclaim your creative confidence. Harvard Business Review, 90(12), 115–118.

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Prentice-Hall.

Kolko, J. (2010). Abductive Thinking and Sensemaking: The Drivers of Design Synthesis. Design Issues, 26(1), 15–28. https://doi.org/10.1162/desi.2010.26.1.15

Lewrick, M., Link, P., & Leifer, L. (2020). The design thinking toolbox: A guide to mastering the most popular and valuable innovation methods. Wiley.

Liedtka, J. (2015). Perspective: Linking Design Thinking with Innovation Outcomes through Cognitive Bias Reduction. Journal of Product Innovation Management, 32(6), 925–938. https://doi.org/10.1111/jpim.12163

Martin, R. (2009). The Design of Business: Why design thinking is the next competitive advantage. Boston MA, USA: Harvard Business Review Press.

Meinel, C., Leifer, L., & Plattner, H. (2011). Design thinking: Understand-improve-apply. Berlin, Heidelberg: Springer.

Osterwalder, A., & Pigneur, Y. (2010). Business model generation: A handbook for visionaries, game changers, and challengers. Wiley.

Rauth, I., Köppen, E., Jobst, B., & Meinel, C. (2010). Design thinking: An educational model towards creative confidence. In Proceedings of the 1st International Conference on Design Creativity ICDC, 1(1), 1–7.

Schön, D. A. (2013). The reflective practitioner: How professionals think in action. Ashgate.

Thomas, J. W. (2000). A review of research on project-based learning. San Rafael, CA: Autodesk Foundation.
