## Key Ideas
> [!abstract] Core Concepts
>
> - **All students can access, all can be extended**: Everyone can get started (floor) and everyone can get stuck (ceiling) regardless of ability level
> - **Differentiate by pace and depth**: Students work at different speeds through the same question sequence rather than being given different tasks
> - **Procedural knowledge to reasoning progression**: The floor focuses on basic skills; the ceiling demands complex problem-solving and connections
## Definition
**Low-Floor High-Ceiling Tasks**: Activities designed to be mathematically accessible to all students whilst containing built-in extension opportunities that challenge even the most capable learners.
## Connected To
[[Differentiation]] | [[Scaffolding]] | [[Problem-Solving]] | [[Surface and Deep Structure]] | [[Worked Examples]] | [[Completion Problem Effect]] | [[Fluency Practice]]
---
## Increasingly difficult progression of questions
When students work independently, questions should be sequenced from easy to difficult on a continuum (Rosenshine, 2012). This progression allows all students to access initial questions whilst ensuring even the strongest students encounter genuine challenge (Wilson et al., 2019).
The progression begins with questions near-identical to worked examples, using [[Completion Problem Effect|backwards fading]] to support initial practice (van Merriënboer, 1990). As scaffolding is gradually removed, students develop [[Fluency|fluency]] through [[Fluency Practice|drilling]] (Rosenshine & Stevens, 1986). [[Minimally Different Questions]] help students recognise deep structure beneath surface variations (Paas & van Merriënboer, 1994).
Further along the sequence, [[Interleaving Effect|mixed topics]] appear, such as area problems with fractional dimensions (Rohrer & Taylor, 2007). Error analysis tasks require students to [[Explain the Mistake|identify and explain mistakes]] (Chi et al., 1989), whilst [[Solution Comparison]] activities push them to evaluate the most efficient methods (Rittle-Johnson & Star, 2007). The sequence culminates in [[Problem-Solving|unfamiliar questions]] requiring generalisation and connections (Sweller, 1988), followed by [[Exam Practice|exam-style questions]] for assessment preparation.
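As a hypothetical illustration (the specific questions are mine, not from the source), a sequence on rectangle area might progress like this:

```latex
% Illustrative question sequence, easy to difficult (hypothetical):
% 1. Near-identical to the worked example:
A = 5 \times 3 \quad \text{(area of a 5 cm by 3 cm rectangle)}
% 2. Minimally different, same deep structure:
A = 5 \times 30
% 3. Interleaved with fractions:
A = \tfrac{1}{2} \times \tfrac{3}{4}
% 4. Error analysis: a student claims a 6 cm by 4 cm rectangle has
%    area 20 cm^2 (they found the perimeter); explain the mistake.
% 5. Unfamiliar: a rectangle has area 24 cm^2 and integer side
%    lengths; list all possible perimeters (50, 28, 22, 20).
```

The early items let every student start; the final item demands generalisation, matching the low-floor high-ceiling structure described above.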
## Open-middle problems
Open-middle problems provide natural differentiation through depth of thinking rather than difficulty of content (Boaler, 2016). These problems share the same initial prompt (closed beginning) and reach the same final answer (closed end), but allow multiple solution approaches in between (open middle).
This structure offers several advantages over traditional problems. Students can pursue multiple solution methods rather than following a prescribed approach, and can focus on optimisation: finding not just any answer, but the best one. Open-middle problems avoid the complex performance task contexts that create extraneous cognitive load (Sweller et al., 2019), allowing students to focus on [[Surface and Deep Structure|deep structure]] rather than surface features (Chi, Feltovich, & Glaser, 1981).
![[OpenMiddle.png|800]]
A distributive law problem exemplifies this approach. The same problem can support both a procedural trial-and-error approach and strategic pattern recognition, allowing natural differentiation through depth of mathematical reasoning (Rittle-Johnson, Siegler, & Alibali, 2001). Weaker students engage with the basic procedure whilst stronger students recognise underlying patterns and optimise their solutions. The problem provides multiple solution pathways, creates optimisation challenges, and focuses attention on underlying mathematical principles without adding surface complexity.
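The embedded image is not reproduced here; as a hypothetical example with the same structure (the digits and target are my own, not from the source), consider:

```latex
% Hypothetical open-middle problem on the distributive law:
% Using the digits 1--9 at most once each, fill the boxes so that
\square \, (\square + \square) = 42
% Trial-and-error works: 6(3 + 4) = 42.
% Pattern recognition is more efficient: the first digit must be a
% factor of 42, suggesting 6 \times 7, 7 \times 6, or 3 \times 14,
% which also yields 7(2 + 4) = 42 and 3(5 + 9) = 42.
% Closed beginning, closed answer (42), open middle.
```

The same prompt supports procedural guessing at the floor and factor-based reasoning at the ceiling, which is the natural differentiation the paragraph above describes.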
## Implementation strategy
Effective low-floor high-ceiling tasks start with an accessible entry point for all students (Rosenshine, 2012) and build in natural progression points that allow learners to advance at different rates (van Merriënboer et al., 2003). Tasks should include multiple solution pathways (Rittle-Johnson & Star, 2007) and provide clear success criteria at each level (Wilson et al., 2019). Extension opportunities must be available for students who reach mastery quickly, preventing boredom whilst others consolidate understanding (Kalyuga et al., 2003).
---
## References
Boaler, J. (2016). *Mathematical mindsets: Unleashing students' potential through creative math, inspiring messages and innovative teaching*. Jossey-Bass.
Chi, M. T. H., Bassok, M., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. *Cognitive Science*, 13(2), 145-182. https://doi.org/10.1207/s15516709cog1302_1
Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. *Cognitive Science*, 5(2), 121-152. https://doi.org/10.1207/s15516709cog0502_2
Kalyuga, S., Ayres, P., Chandler, P., & Sweller, J. (2003). The expertise reversal effect. *Educational Psychologist*, 38(1), 23-31. https://doi.org/10.1207/S15326985EP3801_4
Paas, F., & van Merriënboer, J. J. G. (1994). Variability of worked examples and transfer of geometrical problem-solving skills: A cognitive-load approach. *Journal of Educational Psychology*, 86(1), 122-133. https://doi.org/10.1037/0022-0663.86.1.122
Rittle-Johnson, B., Siegler, R. S., & Alibali, M. W. (2001). Developing conceptual understanding and procedural skill in mathematics: An iterative process. *Journal of Educational Psychology*, 93(2), 346-362. https://doi.org/10.1037/0022-0663.93.2.346
Rittle-Johnson, B., & Star, J. R. (2007). Does comparing solution methods facilitate conceptual and procedural knowledge? An experimental study on learning to solve equations. *Journal of Educational Psychology*, 99(3), 561-574. https://doi.org/10.1037/0022-0663.99.3.561
Rohrer, D., & Taylor, K. (2007). The shuffling of mathematics problems improves learning. *Instructional Science*, 35(6), 481-498. https://doi.org/10.1007/s11251-007-9015-8
Rosenshine, B. (2012). Principles of instruction: Research-based strategies that all teachers should know. *American Educator*, 36(1), 12-19, 39.
Rosenshine, B., & Stevens, R. (1986). Teaching functions. In M. C. Wittrock (Ed.), *Handbook of research on teaching* (3rd ed., pp. 376-391). Macmillan.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. *Cognitive Science*, 12(2), 257-285. https://doi.org/10.1207/s15516709cog1202_4
Sweller, J., van Merriënboer, J. J. G., & Paas, F. (2019). Cognitive architecture and instructional design: 20 years later. *Educational Psychology Review*, 31(2), 261-292. https://doi.org/10.1007/s10648-019-09465-5
van Merriënboer, J. J. G. (1990). Strategies for programming instruction in high school: Program completion vs. program generation. *Journal of Educational Computing Research*, 6(3), 265-285. https://doi.org/10.2190/4NK5-17L7-TWQV-1EHH
van Merriënboer, J. J. G., Kirschner, P. A., & Kester, L. (2003). Taking the load off a learner's mind: Instructional design for complex learning. *Educational Psychologist*, 38(1), 5-13. https://doi.org/10.1207/S15326985EP3801_2
Wilson, R. C., Shenhav, A., Straccia, M., & Cohen, J. D. (2019). The eighty five percent rule for optimal learning. *Nature Communications*, 10(1), 4646. https://doi.org/10.1038/s41467-019-12552-4