Instructional Practices Guide
© 2018 The Mathematical Association of America, Inc.
Electronic ISBN 978-1-61444-325-4
Print ISBN 978-0-88385-198-2
Printed in the United States of America
The MAA Instructional Practices Guide is an open access publication distributed in accordance with the Creative
Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt,
build upon this work non-commercially, and license their derivative works on different terms, provided the original
work is properly cited and the use is non-commercial. See http://creativecommons.org/licenses/by-nc/4.0.
Project Leadership Team
Martha L. Abell, Georgia Southern University
Linda Braddy, Tarrant County College
Doug Ensley, Mathematical Association of America
Lewis Ludwig, Denison University
Hortensia Soto, University of Northern Colorado
Instructional Practices Guide
Published and Distributed by
The Mathematical Association of America
The MAA Notes Series, started in 1982, addresses a broad range of topics and themes of interest to all who are involved
with undergraduate mathematics. The volumes in this series are readable, informative, and useful, and help the math-
ematical community keep up with developments of importance to mathematics.
Council on Publications and Communications
Jennifer Quinn, Chair
Notes Editorial Board
Michael C. Axtell, Editor
Crista L. Arangala
Suzanne Hamon
Hugh Howards
David R. Mazur
Elizabeth W. McMahon
Dan Sloughter
Joe Yanik
John M. Zobitz
14. Mathematical Writing, by Donald E. Knuth, Tracy Larrabee, and Paul M. Roberts.
16. Using Writing to Teach Mathematics, Andrew Sterrett, Editor.
17. Priming the Calculus Pump: Innovations and Resources, Committee on Calculus Reform and the First Two Years,
a subcommittee of the Committee on the Undergraduate Program in Mathematics, Thomas W. Tucker, Editor.
18. Models for Undergraduate Research in Mathematics, Lester Senechal, Editor.
19. Visualization in Teaching and Learning Mathematics, Committee on Computers in Mathematics Education, Steve
Cunningham and Walter S. Zimmermann, Editors.
20. The Laboratory Approach to Teaching Calculus, L. Carl Leinbach et al., Editors.
21. Perspectives on Contemporary Statistics, David C. Hoaglin and David S. Moore, Editors.
22. Heeding the Call for Change: Suggestions for Curricular Action, Lynn A. Steen, Editor.
24. Symbolic Computation in Undergraduate Mathematics Education, Zaven A. Karian, Editor.
25. The Concept of Function: Aspects of Epistemology and Pedagogy, Guershon Harel and Ed Dubinsky, Editors.
26. Statistics for the Twenty-First Century, Florence and Sheldon Gordon, Editors.
27. Resources for Calculus Collection, Volume 1: Learning by Discovery: A Lab Manual for Calculus, Anita E. Solow,
Editor.
28. Resources for Calculus Collection, Volume 2: Calculus Problems for a New Century, Robert Fraga, Editor.
29. Resources for Calculus Collection, Volume 3: Applications of Calculus, Philip Straffin, Editor.
30. Resources for Calculus Collection, Volume 4: Problems for Student Investigation, Michael B. Jackson and John R.
Ramsay, Editors.
31. Resources for Calculus Collection, Volume 5: Readings for Calculus, Underwood Dudley, Editor.
32. Essays in Humanistic Mathematics, Alvin White, Editor.
33. Research Issues in Undergraduate Mathematics Learning: Preliminary Analyses and Results, James J. Kaput and Ed
Dubinsky, Editors.
34. In Eves' Circles, Joby Milo Anthony, Editor.
35. You're the Professor, What Next? Ideas and Resources for Preparing College Teachers, The Committee on Preparation
for College Teaching, Bettye Anne Case, Editor.
36. Preparing for a New Calculus: Conference Proceedings, Anita E. Solow, Editor.
37. A Practical Guide to Cooperative Learning in Collegiate Mathematics, Nancy L. Hagelgans, Barbara E. Reynolds,
SDS, Keith Schwingendorf, Draga Vidakovic, Ed Dubinsky, Mazen Shahin, G. Joseph Wimbish, Jr.
38. Models at Work: Case Studies in Eective Undergraduate Mathematics Programs, Alan C. Tucker, Editor.
39. Calculus: e Dynamics of Change, CUPM Subcommittee on Calculus Reform and the First Two Years, A. Wayne
Roberts, Editor.
40. Vita Mathematica: Historical Research and Integration with Teaching, Ronald Calinger, Editor.
41. Geometry Turned On: Dynamic Software in Learning, Teaching, and Research, James R. King and Doris Schattschnei-
der, Editors.
42. Resources for Teaching Linear Algebra, David Carlson, Charles R. Johnson, David C. Lay, A. Duane Porter, Ann E.
Watkins, William Watkins, Editors.
43. Student Assessment in Calculus: A Report of the NSF Working Group on Assessment in Calculus, Alan Schoenfeld,
Editor.
44. Readings in Cooperative Learning for Undergraduate Mathematics, Ed Dubinsky, David Mathews, and Barbara E.
Reynolds, Editors.
45. Confronting the Core Curriculum: Considering Change in the Undergraduate Mathematics Major, John A. Dossey,
Editor.
46. Women in Mathematics: Scaling the Heights, Deborah Nolan, Editor.
47. Exemplary Programs in Introductory College Mathematics: Innovative Programs Using Technology, Susan Lenker,
Editor.
48. Writing in the Teaching and Learning of Mathematics, John Meier and Thomas Rishel.
49. Assessment Practices in Undergraduate Mathematics, Bonnie Gold, Sandra Z. Keith and William A. Marion, Editors.
50. Revolutions in Differential Equations: Exploring ODEs with Modern Technology, Michael J. Kallaher, Editor.
51. Using History to Teach Mathematics: An International Perspective, Victor J. Katz, Editor.
52. Teaching Statistics: Resources for Undergraduate Instructors, Thomas L. Moore, Editor.
53. Geometry at Work: Papers in Applied Geometry, Catherine A. Gorini, Editor.
54. Teaching First: A Guide for New Mathematicians, Thomas W. Rishel.
55. Cooperative Learning in Undergraduate Mathematics: Issues That Matter and Strategies That Work, Elizabeth C.
Rogers, Barbara E. Reynolds, Neil A. Davidson, and Anthony D. Thomas, Editors.
56. Changing Calculus: A Report on Evaluation Efforts and National Impact from 1988 to 1998, Susan L. Ganter.
57. Learning to Teach and Teaching to Learn Mathematics: Resources for Professional Development, Matthew DeLong
and Dale Winter.
58. Fractals, Graphics, and Mathematics Education, Benoit Mandelbrot and Michael Frame, Editors.
59. Linear Algebra Gems: Assets for Undergraduate Mathematics, David Carlson, Charles R. Johnson, David C. Lay,
and A. Duane Porter, Editors.
60. Innovations in Teaching Abstract Algebra, Allen C. Hibbard and Ellen J. Maycock, Editors.
61. Changing Core Mathematics, Chris Arney and Donald Small, Editors.
62. Achieving Quantitative Literacy: An Urgent Challenge for Higher Education, Lynn Arthur Steen.
64. Leading the Mathematical Sciences Department: A Resource for Chairs, Tina H. Straley, Marcia P. Sward, and Jon
W. Scott, Editors.
65. Innovations in Teaching Statistics, Joan B. Garfield, Editor.
66. Mathematics in Service to the Community: Concepts and models for service-learning in the mathematical sciences,
Charles R. Hadlock, Editor.
67. Innovative Approaches to Undergraduate Mathematics Courses Beyond Calculus, Richard J. Maher, Editor.
68. From Calculus to Computers: Using the last 200 years of mathematics history in the classroom, Amy Shell-Gellasch
and Dick Jardine, Editors.
69. A Fresh Start for Collegiate Mathematics: Rethinking the Courses below Calculus, Nancy Baxter Hastings, Editor.
70. Current Practices in Quantitative Literacy, Rick Gillman, Editor.
71. War Stories from Applied Math: Undergraduate Consultancy Projects, Robert Fraga, Editor.
72. Hands On History: A Resource for Teaching Mathematics, Amy Shell-Gellasch, Editor.
73. Making the Connection: Research and Teaching in Undergraduate Mathematics Education, Marilyn P. Carlson and
Chris Rasmussen, Editors.
74. Resources for Teaching Discrete Mathematics: Classroom Projects, History Modules, and Articles, Brian Hopkins,
Editor.
75. The Moore Method: A Pathway to Learner-Centered Instruction, Charles A. Coppin, W. Ted Mahavier, E. Lee May,
and G. Edgar Parker.
76. The Beauty of Fractals: Six Different Views, Denny Gulick and Jon Scott, Editors.
77. Mathematical Time Capsules: Historical Modules for the Mathematics Classroom, Dick Jardine and Amy Shell-Gel-
lasch, Editors.
78. Recent Developments on Introducing a Historical Dimension in Mathematics Education, Victor J. Katz and Costas
Tzanakis, Editors.
79. Teaching Mathematics with Classroom Voting: With and Without Clickers, Kelly Cline and Holly Zullo, Editors.
80. Resources for Preparing Middle School Mathematics Teachers, Cheryl Beaver, Laurie Burton, Maria Fung, and Klay
Kruczek, Editors.
81. Undergraduate Mathematics for the Life Sciences: Models, Processes, and Directions, Glenn Ledder, Jenna P. Car-
penter, and Timothy D. Comar, Editors.
82. Applications of Mathematics in Economics, Warren Page, Editor.
83. Doing the Scholarship of Teaching and Learning in Mathematics, Jacqueline M. Dewar and Curtis D. Bennett, Ed-
itors.
84. Insights and Recommendations from the MAA National Study of College Calculus, David Bressoud, Vilma Mesa,
and Chris Rasmussen, Editors.
85. Beyond Lecture: Resources and Pedagogical Techniques for Enhancing the Teaching of Proof-Writing Across the
Curriculum, Rachel Schwell, Aliza Steurer and Jennifer F. Vasquez, Editors.
86. Using the Philosophy of Mathematics in Teaching Undergraduate Mathematics, Bonnie Gold, Carl E. Behrens, and
Roger A. Simons, Editors.
87. The Courses of History: Ideas for Developing a History of Mathematics Course, Amy Shell-Gellasch and Dick
Jardine, Editors.
88. Shiing Contexts, Stable Core: Advancing Quantitative Literacy in Higher Education, Luke Tunstall, Gizem Karaali,
and Victor Piercey, Editors.
89. Instructional Practices Guide, Martha L. Abell, Linda Braddy, Doug Ensley, Lewis Ludwig, and Hortensia Soto, Proj-
ect Leadership Team.
Manifesto: A declaration of values
Success in mathematics opens opportunities for students. A wealth of research literature exists on how
mathematics instructors can facilitate rich, meaningful learning experiences and on what instructors can do
to improve teaching and learning at the undergraduate level: Effective teaching and deep learning require
student engagement with content both inside and outside the classroom. This Instructional Practices Guide
aims to share effective, evidence-based practices instructors can use to facilitate meaningful learning for stu-
dents of mathematics. Professional associations in the mathematical sciences along with state and national
funding agencies are supporting efforts to radically transform the undergraduate education experience; it is
truly an exciting time to be a mathematics instructor!
With that big picture in mind, this guide is written from the perspective that teaching and learning are
forces for social change. Beyond the confines of individual instructors' classrooms, beyond their decisions
about what mathematics to teach and how to teach it, there are societal forces that call upon all mathematics
instructors to advocate for increased student access to the discipline of mathematics. Inequity exists in many
facets of our society, including within the teaching and learning of mathematics. Because access to success in
mathematics is not distributed fairly, the opportunities that accompany success in mathematics are also not
distributed fairly. We in the mathematical sciences community should not affirm this inequitable situation
as an acceptable status quo. We owe it to our discipline, to ourselves, and to society to disseminate mathe-
matical knowledge in ways that increase individuals’ access to the opportunities that come with mathemat-
ical understanding.
Some of us have become reective instructors over the course of our careers, and our classrooms have
changed and improved as a result. But if we truly want to eect change, then we are compelled to extend the
reach of our eorts beyond our own students in our own classrooms. It is our responsibility to examine the
system within which we educate students and nd ways to improve that system. It is our responsibility to
help our colleagues improve and to collectively succeed at teaching mathematics to all students so that our
discipline realizes its full potential as a subject of beauty, of truth, and of empowerment for all.
Such a sea change will require transforming how mathematics is taught and facing our own individual
and collective roles in a system that does not serve all students well. Societal norms tend toward a belief that
only a certain kind of individual can do mathematics and other kinds of people need not even try. We in the
profession of teaching mathematics must look inward to determine if we are doing our part to dispel this
myth.
All instructors can facilitate student success in mathematics, and we should not underestimate the power
of the environment in our classrooms, departments, and institutions to positively impact student learning.
Changing teaching practice is hard. But those of us who do mathematics recognize the hard work required
to learn and understand it and we choose to do that hard work. We can likewise choose to do the hard work
required to teach our beloved subject.
Mathematics instructors stand at a crossroads. We must gather the courage to take the difficult path of
change. We must gather the courage to venture down the path of uncertainty and try new evidence-based
strategies that actively engage students in the learning experience. We must gather the courage to advocate
beyond our own classroom for student-centered instructional strategies that promote equitable access to
mathematics for all students. We stand at a crossroads, and we must choose the path of transformation in
order to fulll our professional responsibility to our students. is Instructional Practices Guide can serve as
a catalyst for community-wide transformation toward improved learning experiences and equitable access
to mathematics for all students. Society deserves nothing less.
Introduction to this guide
The most recent MAA documents, the Committee on the Undergraduate Program in Mathematics (CUPM)
Curriculum Guide to Majors in the Mathematical Sciences (Zorn, 2015) and A Common Vision for Undergraduate
Mathematical Sciences Programs in 2025 (Saxe and Braddy, 2016), serve as an impetus for this Instructional
Practices Guide. The CUPM Curriculum Guide provides course recommendations along with sample syllabi
for mathematical sciences courses, but does not provide specific teaching strategies faculty have found to be
effective with their students, and Common Vision calls for the use of evidence-based instructional strategies
by reiterating the call from the INGenIOuS project report (Zorn, et al., 2014):
We acknowledge that changing established practices can be difficult and painful. Changing cultures of
departments, institutions, and organizations can be even harder. But there is reason for optimism. In
mathematical sciences research, we are always willing, even eager, to replace mediocre or “somewhat
successful” strategies with better ones. In that open-minded spirit, we invite the mathematical sciences
community to view this call to action as a promising opportunity to live up to our professional respon-
sibilities by improving workforce preparation (p. 25).
With this in mind, this Instructional Practices Guide is designed as a “how to” guide focused on mathe-
matics instruction at the undergraduate level. It is based on the concept that effective teaching is supported
by three foundational types of practices: classroom practices, assessment practices, and course design
practices, all informed by empirical research as well as the literature on technology and equity. In this intro-
duction, we describe the intended audience, provide a brief overview of each practice, and offer suggestions
on how to navigate this guide.
The Instructional Practices Guide is founded on the belief that every student should have the opportunity
to engage in deep mathematics learning, guided and mentored by their instructor. It is intended for all in-
structors of mathematics, from the new graduate teaching assistant to the most experienced senior instruc-
tor; from the contingent faculty member at a two-year institution to the new faculty member at a doctor-
al-granting institution; from the instructor who wants to transform her own teaching to the mathematician
or mathematics educator facilitating professional development for graduate students or collegiate faculty.
It is also intended for administrators who are in positions to work with their faculty to initiate systemic
change in their departments and across their institutions. Administrators will recognize that many of our
suggestions are applicable to other disciplines; in fact, some of the suggestions are borrowed from research
in science education.
The Classroom Practices chapter provides examples of teaching practices, both inside and outside the
classroom, that foster student engagement as well as a section on selecting appropriate mathematical tasks
that contribute to building a sense of community within the classroom. The Assessment Practices chapter
builds on policy assessment documents from various associations including the National Council of Teach-
ers of Mathematics, the American Statistical Association, and, of course, the MAA. This chapter centers
on the interplay between formative and summative assessment to examine the teaching and learning of
mathematics with a strong focus on learning outcomes. The Design Practices chapter provides the reader a
brief introduction to instructional designs that help achieve desired learning outcomes, based on theories of
design, along with potential challenges and opportunities associated with instructional design.
We acknowledge the suggestions in this guide are not exhaustive, but we aim to include something of
interest for any reader to adapt for their own classroom. Each of the practices informs the others, and de-
pending on readers’ experiences, they might choose to read the guide in an order other than the one pre-
sented. We purposefully begin with the Classroom Practices chapter in an effort to engage readers who are
just beginning to transform their teaching. As readers gain more experience with student-centered teaching
practices, they can navigate back and forth among the chapters as needed. For example, a reader more expe-
rienced with student-centered teaching might begin by reading the Design Practices chapter to prepare for
designing a new course, then read the Classroom Practices chapter to prepare specic lessons and activities,
then read the Assessment Practices chapter to garner formative assessment strategies, redesign a lesson or
classroom activities based on the results of the formative assessment, and then learn about novel summative
assessments. e model shown below indicates the uid way in which readers might utilize the guide.
Design
Practices
Assessment
Practices
Classroom
Practices
We also acknowledge transforming one's classroom practices takes time, and we firmly believe specific
examples are helpful in facilitating such a transformation. Throughout the guide we offer vignettes that are
both easy to follow and informed by the substantial body of research regarding effective teaching and deep
student learning.
e crucial nding from the research upon which this guide is founded is that eective teaching and deep
learning require student engagement with mathematics both inside and outside the classroom. Bringing
student ideas, beliefs, and practices into the direct view of peers and instructors enriches teaching and
learning and promotes community in remarkable ways. e vast body of evidence strongly supports the
transformational power of these practices in prompting changes in instructors and students at all levels from
all demographic backgrounds.
Indeed, such transformation can promote diversity, inclusion, cultural responsiveness, and social justice
within the mathematical sciences community. Our task as a community is to create these meaningful and
inspiring mathematical experiences for all our instructors and students. As such, we conclude the document
with a brief discussion on cross-cutting themes regarding technology and equity, two important topics that
are intertwined in each of the other chapters. We strongly encourage our readers to reflect on how they
integrate technology into each of the practices and how their practices promote equity in the mathematics
classroom.
In summary, this Instructional Practices Guide is a call to the mathematical sciences community to scale
up the use of evidence-based instructional strategies and to collectively and individually hold ourselves
accountable as professional educators for improving the learning experiences of all undergraduate mathe-
matics students.
Acknowledgements
This large project could not have been completed without the hard work of many people within the mathe-
matics community, including MAA staff members, and the project team is very grateful to all who helped.
Of particular importance, the project activities were supported in large part by the National Science Foun-
dation Division of Undergraduate Education (NSF-1544324). Any opinions, findings, and conclusions or
recommendations expressed in this material are those of the authors and do not necessarily reflect the views
of the National Science Foundation.
The following people made direct contributions to the funded project, and many more participated from
the initial discussion to reviewing the final draft.
Project Leadership Team
Martha L. Abell, Georgia Southern University
Linda Braddy, Tarrant County College
Doug Ensley, Mathematical Association of America
Lewis Ludwig, Denison University
Hortensia Soto, University of Northern Colorado
Project Steering Committee and Lead Writers
James Alvarez, University of Texas, Arlington
Benjamin Braun, University of Kentucky
Elizabeth Burroughs, Montana State University
Rick Cleary, Babson College
Karen Keene, North Carolina State University
Gavin LaRose, University of Michigan
Julie Phelps, Valencia College
April Strom, Scottsdale Community College
Project Advisory Board
Matt Ando, University of Illinois
David Bressoud, Macalester College
Marilyn Carlson, Arizona State University
Annalisa Crannell, Franklin and Marshall College
Tara Holm, Cornell University
Dave Kung, St. Mary’s College of Maryland
Rachel Levy, Harvey Mudd College
Francis Su, Harvey Mudd College
Uri Treisman, University of Texas, Austin
Paul Zorn, St. Olaf College
Contributing Writers
Scott Adamson, Chandler-Gilbert Community College
Aditya Adiredja, University of Arizona
Spencer Bagley, University of Northern Colorado
Randy Boucher, US Military Academy
Derek Bru, Vanderbilt University
Joe Champion, Boise State University
Beth Cory, Sam Houston State University
Jessica Deshler, West Virginia University
Jackie Dewar, Loyola Marymount University
Jess Ellis Hagman, Colorado State University
Angie Hodge, Northern Arizona University
Brian Katz, Augustana College
Elizabeth Kelly, Berea College
Klay Kruczek, Southern Connecticut State University
Brigitte Lahme, Sonoma State University
Luis Leyva, Vanderbilt University
Rachel Levy, Harvey Mudd College
Guadalupe Lozano, University of Arizona
Bill Martin, North Dakota State University
John Meier, Lafayette College
Victor Piercey, Ferris State University
Mike Pinter, Belmont University
Chris Rasmussen, San Diego State University
Jack Rotman, Lansing Community College
Behnaz Rouhani, Perimeter College
Ayşe Şahin, Wright State University
Milos Savic, University of Oklahoma
Kimberly Seashore, San Francisco State University
Mary Shepherd, Northwest Missouri State University
Robert Talbert, Grand Valley State University
Diana omas, US Military Academy
Christine von Renesse, Westeld State University
Laura Watkins, Glendale Community College
Claire Wladis, Borough of Manhattan Community College
Phil Yates, St. Michael's College
Maria Del Rosario Zavala, San Francisco State University
Contents
Manifesto: A declaration of values
Introduction to this guide
Acknowledgements
Classroom Practices
CP.1. Fostering student engagement
CP.1.1. Building a classroom community
CP.1.2. Wait time
CP.1.3. Responding to student contributions in the classroom
CP.1.4. One-minute paper or exit tickets
CP.1.5. Collaborative learning strategies
CP.1.6. Just-in-time teaching (JiTT)
CP.1.7. Developing persistence in problem solving
CP.1.8. Inquiry-based teaching and learning strategies
CP.1.9. Peer instruction and technology
CP.2. Selecting appropriate mathematical tasks
CP.2.1. Intrinsic appropriateness: What makes a mathematical task appropriate?
CP.2.2. Extrinsic appropriateness
CP.2.3. Theoretical frameworks for understanding appropriateness
CP.2.4. How to select an appropriate mathematical task
CP.2.5. Choosing meaningful group-worthy tasks
CP.2.6. Communication: Reading, writing, presenting, visualizing
CP.2.7. Error analysis of student work
CP.2.8. Flipped classrooms
CP.2.9. Procedural fluency emerges from conceptual understanding
CP Conclusion
CP References
Assessment Practices
AP.1. Basics about assessment
AP.1.1. Assessment frameworks
AP.1.2. Clearly specify learning outcomes
AP.1.3. Formative and summative assessment
AP.2. Formative assessment creates an assessment cycle
AP.2.1. Implementing formative assessment
AP.2.2. Formative assessments to improve mathematical practices
AP.2.3. Formative assessment to influence students' beliefs and motivations
AP.3. Summative assessment
AP.3.1. Assigning course grades
AP.3.2. Exemplary summative assessments
AP.3.3. Creating and selecting problems for summative assessment
AP.4. Assessments that promote student communication
AP.4.1. Writing assignments
AP.4.2. Oral presentations
AP.4.3. Group projects
AP.5. Conceptual understanding: What do my students really know?
AP.5.1. What is conceptual understanding?
AP.5.2. What are concept inventories?
AP.5.3. Using items from concept inventories
AP.6. Assessment in large-enrollment classes
AP.6.1. Online homework systems
AP.6.2. Classroom polling systems
AP.7. Assessment in non-traditional classrooms
AP.7.1. Assessment in online courses
AP.7.2. Assessing via technology
AP.7.3. Assessment in non-traditional course settings
AP References
Design Practices
DP.1. Introduction to design practices
DP.1.1. Questions for design
DP.1.2. Considerations for design
DP.1.3. Designing for equity
DP.2. Student learning outcomes and instructional design
DP.2.1. Designing the learning environment
DP.2.2. Designing mathematical activities and interactive discussions
DP.2.3. Designing homework
DP.2.4. Designing a flipped classroom
DP.2.5. Using formative and summative assessment in design
DP.2.6. Reflective instruction
DP.2.7. Students needing accommodations
DP.3. Challenges and opportunities
DP.3.1. Big-picture challenges and opportunities
DP.3.2. Other challenges
DP.3.3. Embracing opportunities
DP.4. Theories of instructional design
DP.4.1. Backward design
DP.4.2. Realistic mathematics education
DP.4.3. Universal design for learning
DP References
Cross-cutting Themes
Technology and instructional practice
XT.1. Introduction
XT.2. Uses of technology
XT.3. Effectiveness of technology
XT.4. Technology incorporated into instructional practice
XT.4.1. Technology and exploratory activities
XT.4.2. Technology and formative assessment
XT.4.3. Technology as a tool
XT.5. Practical implications
XT References
Equity in Practice
XE.1. Introduction
XE.2. Definitions
XE.2.1. Four Dimensions of Equity
XE.2.2. Equity, Inclusion, and Systemic Barriers
XE.3. Higher-order equity-oriented principles
XE.3.1. Social discourses and narratives impact teaching and learning
XE.3.2. All students are capable of learning mathematics
XE.3.3. The importance of fostering a sense of classroom community
XE.4. Attending to equity
XE.4.1. An illustration: Students with disabilities
XE.4.2. Critical need to attend to developmental mathematics
XE.4.3. Conclusion: Anti-deficit perspective and focus on excellence
XE References
Classroom Practices
CP.1. Fostering student engagement
The purpose of this chapter is to provide college and university instructors with accessible starting points
for implementing practices that foster student engagement. Classroom practices aimed at fostering student
engagement attend to the research-based idea that students learn best when they are engaged in their learn-
ing (e.g., Freeman, et al., 2014). Consistent use of active learning strategies in the classroom also provides
a pathway to more equitable learning outcomes for students from demographic groups that have
been historically underrepresented in science, technology, engineering, and mathematics (STEM) fields
(e.g., Laursen, Hassi, Kogan, and Weston, 2014). In this section we focus on classroom practices that enable
students to be actively engaged in their own learning. We illustrate what it means to be actively engaged in
learning and we oer suggestions to foster student engagement.
Student engagement can be enhanced by activities that require sense-making, analysis, or synthesis of
ideas during class. ese strategies may be “anything course-related that all students in a class session are
called upon to do other than simply watching, listening and taking notes” (Felder and Brent, 2009, p. 2). Such
practices may be new to students; thus, we caution instructors to be mindful of students' interactions and
responsiveness to these new teaching techniques. Historically, students' learning experiences in mathematics
have generally involved memorization and rote repetition. As such, in transforming one's teaching, instructors
may need to provide structures for peer-to-peer communication and display a genuine interest in student
contributions in the classroom that moves beyond questioning solely dependent on memorization and rote
applications. Instructors will need to create a classroom environment where students feel accountable both
as individuals and as members of the classroom community of learners.
Thus, we begin by providing suggestions on how to build a classroom community and then turn to
quick-to-implement strategies (e.g., wait time after questioning and one-minute papers), followed by more
elaborate strategies that may require more preparation (e.g., collaborative learning strategies, flipped class-
room, just-in-time teaching). While many of our suggestions focus on in-class activities, out-of-class ac-
tivities are also important and can impact the classroom dynamics, discourse, and community (Pengelley,
2017).
CP.1.1. Building a classroom community
The connection between student success and student engagement is supported by several studies (e.g., Free-
man, et al., 2014; Hake, 1998; Kuh, 2007). In fact, the Community College Survey of Student Engagement's
(CCSSE) "use of student engagement as a proxy for student academic achievement and persistence" has been
validated by results from three recent studies (McClenney, Marti, and Adkins, 2007). However, national
CCSSE data underscore the need for a more deliberate and immediate approach to establishing a classroom
community that supports engagement. In 2016, the CCSSE showed low rates of student connections with
instructors, other students, and institutional resources. Of the students surveyed in the 2016 CCSSE, only
about 50% report discussing grades or assignments with their instructor or working with other students
during class. e National Survey of Student Engagement (NSSE) shows similar results.
Whereas community and sense of belonging are more likely to flourish in classrooms where the instruc-
tor incorporates student-centered learning approaches (see Slavin, 1996; Rendon, 1994), establishing norms
for active engagement or taking steps to increase a student's sense of belonging to the classroom community
also impacts the quality of student engagement in the classroom. Thus, setting the classroom norms and
engagement expectations to incorporate student-centered learning on the first day of class is important.
Such norms can relate to both in-class and out-of-class settings. For example, taking time to build a classroom
community allows students to form and broaden their support networks outside of class, which can include
other students, instructors, or other campus resources. Another benefit of such networks is that they play
an important role in the success of members of marginalized communities (Leyva, 2016; Treisman, 1992).
Classroom vignette: First day
To establish a personal connection with her students and to create a classroom culture of participation, Dr.
Garcia makes it a priority to learn students' names and something unique about them on the first day. This
helps Dr. Garcia build community, establish rapport with her students, and give them a sense of the class
structure.
Aer briey reviewing the contents of the course syllabus, she passes out large note cards where students
write the name they prefer to be called on both sides of the card in print large enough to be read from any-
where in the room. Sometimes she takes pictures of students with their note cards to review for the next
class. In an eort to get to know her students better and to encourage class interaction, students write their
name, most recent mathematics course, intended major, and three interesting facts about themselves on 3 by
5 cards. Students then pair up and introduce each other to the class using one of these facts. It is important
that her students also get to know her and thus Dr. Garcia also shares three interesting facts about herself.
She may say, "You wouldn't know it by looking at me but…." She often finds that students feel more com-
fortable participating because they are able to perceive her as their instructor, as a person, and as a learner
like them.
Aer the initial work of learning names and sharing interesting facts, Dr. Garcias students work on a
carefully selected mathematical task (see section CP.2). She monitors the discussions and follows this with a
class discussion about prior learning experiences where each group shares one or two techniques that they
think would help with their learning. She uses students’ work on the task to ground the discussion and keeps
the students’ techniques in mind for future teaching strategies. Students leave the rst class with a home-
work syllabus quiz and a campus resource task, which encourages students to read the syllabus and explore
campus resources. A recurring question on the syllabus quiz asks students to report the name and contact
information for at least three other students in the class.
On the second day of class, Dr. Garcia requests that everyone have their name cards out. She reminds
them that it is very important for her to learn as many names as possible, and that her goal is to know all names
within the rst few weeks. Next, she introduces proposed norms for the class, based on ideas she heard from
students in the rst class session. Dr. Garcia explicitly attributes norms to what she heard and thus sends the
message that students’ ideas are important.
Discussion. From the rst moment of a class, Dr. Garcia intentionally connects everyone in the classroom.
To enforce the notion that Dr. Garcia is approachable, she chooses to share personal anecdotes or interesting
facts about herself. Having students share with one another on the rst day of class also helps students better
connect with others in the classroom and emphasizes the type of interactions that will be used in the class-
room. Having students spend time to establish connections with each other and understand the resources
available to them helps build their learning community both inside and outside of the classroom.
Practical tips
When requiring more interaction in the classroom, it is helpful to establish behavioral norms and guide-
lines for productive exchanges. Many colleges have established principles on developing community among
students (e.g., Valencia College publishes a set of principles on how members of their college community
treat one another, http://valenciacollege.edu/PJI/principles.cfm) and promoting cooperative behavior in the
classroom. To this end, it is important to have a conversation about the expected behaviors. For example,
late arrivals to class impact all group members and unnecessary cell phone use unfairly distracts from group
interactions and attentiveness in class, but the willingness to listen intently and communicate ideas about
the mathematics promotes learning and engagement. The following are more ideas for creating a classroom
community.
1. Promote a friendly atmosphere while attending to students who are members of marginalized commu-
nities.
2. Establish the importance of arriving prepared for class by completing assigned readings, videos, or
homework before coming to class.
3. Model reaching consensus to arrive at resolutions to questions posed to the class.
4. Remind students that when working in groups it is important to listen carefully and with respect. This
includes listening with the intent to understand others’ strategies and questions and not dominating
group conversations or classroom interactions.
5. Focus on the mathematics and critique ideas, but do not criticize people.
6. Help students take responsibility for their own learning by asking them to share strategies and questions
with the goal of communicating their reasoning instead of using unhelpful phrases such as "I just don't
get…."
CP.1.2. Wait time
Questioning techniques cannot meaningfully foster student engagement if students are not given enough
time to think about the questions posed and to respond. The time instructors allow for this is called wait
time. Specifically, wait time refers to (1) how long the instructor waits for students to consider and respond
to a question or (2) how long the instructor waits for a student who pauses during or after their response.
Research shows that on average instructors wait less than 1.5 seconds before they either answer their own
question or ask a follow-up question. This practice can result in lowering the cognitive demand of tasks (To-
bin, 1987) and discouraging students’ deep engagement in mathematics. It also communicates to students
that a response is not actually necessary or that they are not expected to answer questions. To encourage stu-
dent participation, research indicates that instructors should wait at least seven seconds and that an aver-
age wait time greater than three seconds is a threshold for changing instructor-student discourse (Fuller, et
al., 1985; Tobin, 1987). Although seven seconds may not seem like a long time, when enacted it can feel like
an eternity, especially when students are not used to the instructor genuinely expecting them to think about
and respond to their questions. Benets of allowing students enough time to think and respond include: a
decrease in the number of “I don’t know” responses; an increase in the number of students that respond to
questions; more correct or more sophisticated responses; and greater conceptual depth of student responses.
Classroom vignette: Wait time
In this example from a calculus course, Dr. Smith's goal is to develop the integral formula for work done by a
non-constant force,

W = ∫ₐᵇ F(x) dx.

In particular, Dr. Smith discusses the work done to compress a spring
by 0.2 meters using Hooke's law, which says F(x) = kx.
Dr. Smith: Why can't we just use W = F · d with d = 0.2 meters? [waits 1.5 seconds]
Dr. Smith: Because the force is non-constant, right? So why does that matter? [waits less than 1 second]
Dr. Smith: Because W = F · d only works when both quantities are constant. So then what should we do
instead? [waits 1.5 seconds]
Dr. Smith: We should slice the total compression distance into small pieces of length Δx. Why does that
help? [waits 2 seconds]
Dr. Smith: Because on those slices, the force is approximately constant. So then how can we compute
work? [waits 1 second]
Dr. Smith: Well, then we can use W = F · d. But what's d for these slices? [waits less than 1 second]
Dr. Smith: Wouldn't it be d = Δx? Why? [waits 1 second]
Dr. Smith: Because the slices are of length Δx. So, how much work is done on each slice? [waits 1.7 sec-
onds] Donna?
Student: I don't know.
Dr. Smith: That's just F(x) · Δx, so kxΔx.
Discussion. is vignette illustrates how this instructor’s use of wait time likely did not allow students to
reason about and engage in the ideas. We invite the reader to imagine a situation in which a longer wait time
occurs. In such a situation, an increase in wait time not only benefits student engagement in mathematics,
but it can also result in positive changes in instructor questioning. For example, as instructors increase their
wait time, they tend to decrease the number of questions that require only a quick, factual, or procedural
response and increase the quality and variety of questions asked. It is important to acknowledge that better
questioning and more wait time for student responses will increase the time needed per topic or lesson.
However, as the quality and variety of questions asked increases, so does students’ higher order thinking and
engagement. More on instructor-questioning strategies can be found in section CP.1.3 and section CP.1.5.
Practical Tips
• Remember that wait time is important in creating a sense of community in the classroom and students
might not be accustomed to long wait times. Remind students that the wait time is important so they
have time to think and respond.
• Ask questions such as, “Do you need me to repeat the question?” or “Do you want me to rephrase the
question?”
• Aer asking a question, actually keep track of how many seconds you wait. Some instructors count to
seven on their ngers behind their back.
• Tell students why you are waiting and the benefits for them when you wait.
• If you reach 10 or more seconds with no response, consider making use of the Think-Pair-Share strat-
egy discussed in section CP.1.5 of this chapter.
• For particularly challenging questions, consider revisiting this question as a one-minute paper prompt.
See section CP.1.4 of this chapter.
For more tips on integrating wait time effectively in the classroom see www.ericdigests.org/1995-1/think.htm.
CP.1.3. Responding to student contributions in the classroom
Instructors oen greet students’ excellent ideas with enthusiasm, praise, and a positive facial expression.
When students see a less-than-enthusiastic look on an instructors face, they may start to backpedal with
their response. is is especially true when students are appropriately challenged, where they experience
struggle and sometimes failure. On the other hand, research shows that productive struggle can help stu-
dents develop persistence and condence (Edwards and Beattie, 2016) which can lead to successful learning.
In this section we discuss how to respond to incorrect answers, incomplete explanations, faulty arguments,
and students’ struggles.
From an equity stance, one of the most powerful ways an instructor can build community and student
condence is to reframe errors. Instead of simply listening for and identifying student errors, learn to listen
for aspects of correct reasoning in students’ responses. Instead of viewing struggle as a problem, learn to say
to the class, "I love that we are struggling with this; it is an opportunity to learn!" This communicates to the
students that the instructor values the students’ reasoning and that she supports students as they move from
their current understanding towards coherent understanding.
To help facilitate a classroom environment where questioning and justification are normative, instructors
need strategies for responding to student errors in classwork and class discussions. The following vignettes
contain ideas on how instructors may respond to students that will facilitate the development of a commu-
nity in the classroom and equip instructors with strategies to manage student responses (e.g., Battey and
Stark, 2009; Hughes, 1973).
Classroom vignettes: Responding to students
For each vignette below, there are several possible responses. You may want to consider your own response
before reading the options provided.
Vignette 1: A group of students are working a problem together at the board and Dr. Bird hears one of them
make a strong (correct or incorrect) assertion. None of the other students question the assertion. Dr. Bird
usually opts to use one of these three interventions.
1. She works with just that group and asks the student to revisit that assertion and explain their thoughts
out loud. If the assertion is incorrect, she waits to see if the student self-corrects, if another student makes
a suggestion, or if she can ask a follow-up question that helps the student recognize the incorrect asser-
tion. If the assertion is correct, she waits to see if students in the group can defend the assertion, answer
probing questions related to the assertion, and are not just accepting the assertion due to the
dominance of the group member or for other reasons.
2. She regroups the class and shares the assertion. She then uses some form of think-pair-share (see section
CP.1.5) to help students examine the assertion for what makes sense and what needs further reasoning.
3. She introduces the assertion on a homework problem or assignment as an example of student reasoning
and asks the students to explain why the assertion is correct or incorrect (and perhaps also what related
idea is correct). Alternatively, she presents the assertion on the homework and asks students to decide
whether the assertion is true and to justify their answer.
Vignette 2: In the middle of a lecture Dr. Brown poses a question to the class. A student who rarely speaks
up volunteers an incorrect answer.
1. Dr. Brown responds by asking the student to explain their reasoning. After the student has explained
their reasoning, Dr. Brown poses a question to help clarify, offers a counterexample, or validates any
correct reasoning while offering "another way to reason about the original question" that helps bridge the
student response with the expected one.
2. If Dr. Brown is familiar with the misconception, he may say "yes, it's really common or tempting to think
about it that way, but here's an example that doesn't fit the pattern." He then offers, "It may be more help-
ful to think about the concept…." Many times, he validates the student's response by thanking them for
bringing up the idea and noting that he intended to mention the pitfall or tempting misconception. See also the
vignette following section CP.1.9.
3. Dr. Brown may pose the answer as a “conjecture” and ask students to work in pairs (e.g., section CP.1.5)
and reason about the conjecture.
Discussion. Appropriately handling student responses, whether correct, incorrect, or unexpected, and em-
bracing student contributions in the classroom directly affects the learning environment and can encourage
student-centered learning in the classroom. To encourage student responses and participation, it is im-
portant to recognize the value of students offering both correct and incorrect responses. If Dr. Brown only
engages groups who make incorrect assertions, this implicitly communicates that there may be something
incorrect with their mathematical reasoning. Instead, Dr. Brown established classroom norms under which
following up with individual students and groups is not interpreted as a sign of being incorrect, but rather
part of the classroom participation structures.
As instructors choose approaches, it is important to be intentional about the purpose of the questions as
well as the questioning techniques. Determine if the purpose of the question is to do a quick check for learn-
ing and retention, to assess prior knowledge, or to elicit discussion. e purpose of the question can inform
the way in which you handle any responses, correct or incorrect.
Practical tips
These tips first appeared in the MAA Teaching Tidbits Blog, maateachingtidbits.blogspot.com.
1. Create a safe space for incorrect answers. This takes time and care. For example, you can say "I'm so glad
you raised that point. We often think [incorrect idea] because [some kind of reason], but actually if you
take into account [key idea] it leads to this other way of reasoning, which is correct." This emphasizes that
reasonable attempts at solving a problem can sometimes lead to incorrect solutions.
2. Keep a poker face. No matter what the student says, make sure you ask the student to justify the rea-
soning behind the answer. Try not to give away whether the answer is correct. Another option is to have
a different student discuss whether the answer is correct or incorrect and explain why.
3. Focus on the reasoning. The poker face is also important to encourage students to share their reasoning,
without fear of discouragement from negative reactions. It also prevents them from changing their an-
swer (based on the look on your face) without diagnosing the cause of their error.
4. Distinguish between types of errors. You may or may not want to give a lot of time to discussing a typo,
versus a common misconception or confusion. Sometimes it is important just to correct and move on.
5. Identify correct aspects of a solution. Even though a solution may be incorrect, the student may have
done some good work to get there. In some cases you can say, "That would be the correct answer if [xxx],
but actually we are thinking about [yyy]."
6. Keep in mind that speaking in front of peers and the instructor is risky. A way to lessen the pressure is to
give students the opportunity to come back to their idea. This could be as simple as asking a student to
rethink an assertion, and say "Gloria, shall we come back to you? Does that sound good? I think we all
want to know what you are thinking, so let’s hear from someone else and come back to see what you are
thinking."
CP.1.4. One-minute paper or exit tickets
Instructors use one-minute papers or exit tickets to quickly assess what students learned from a class session
or their general thoughts about the course. The use of one-minute papers and exit tickets may enhance stu-
dent engagement because students are required to reflect upon the learning taking place, to demonstrate a
skill, or to communicate a concept at the close of a topic or class session.
Not surprisingly, a one-minute paper takes about a minute to complete and is usually integrated at the
end of class, although it can also be used at the end of a section or topic. The instructor poses a question that
prompts students to reflect upon significant concepts they learned that day or on concepts that they still
feel uneasy about or did not understand. Students address the question in writing and hand it in, but
they can also complete their one-minute paper via an online learning management system and submit it by
the end of the day. Both mechanisms allow instructors to review submissions quickly and obtain formative
feedback about student learning.
Similarly, exit tickets function like one-minute papers except that they may consist of a short-answer
question or a multiple-choice question that students must answer by the end of a class session and submit
as they exit the classroom. Again, this can serve to determine how well students understand new material.
Classroom vignette: One-minute exit ticket
Dr. Kessler introduced Taylor series during a class meeting of his calculus course. He wants to assess what
students learned from the class session as well as what they felt they did not fully grasp in order to address
these points in the next class session. Thus, at the end of the class period he asks students to take one minute
to explain in concise, complete sentences:
1. What are the three most significant things you learned today about Taylor series?
2. What are you left wondering about Taylor series?
3. Is there anything that still is unclear about Taylor series for you?
4. Why are you studying Taylor series?
He collects the one-minute papers and reviews them before the next class. He uses responses to the first
question to determine how much students said they learned, responses to the second question to connect
student “wonderings” to the next class session, and responses to the third question to construct follow-up in-class
or homework tasks. While it is not always necessary to request student names on one-minute papers,
Dr. Kessler requests that students write their names on the paper so he can plan individualized follow-up
with students, if necessary.
Practical tips
One-minute papers and exit tickets do not take a significant amount of time to review, and they provide
important information on the status of student learning in the classroom. The one-minute papers can also
be used to obtain general feedback about the course by having students reflect, once or twice during the
semester, on “What is going well in this class?”, “What needs to be modified?”, and “What should be maintained?”
An exit ticket may be used to assess how well student groups are functioning by asking students to
respond to questions such as “Does working in this group enhance my learning?” or “Does working in this
group hinder my learning?” A one-minute paper could be used for the same purpose by asking students to
“Explain how working in this group enhances your learning” or “Explain how working in this group does
not enhance your learning.” Note that in this case the exit ticket questions are short questions that cannot
be answered simply “yes” or “no.” Exit tickets can also incorporate a mathematical task or question to formatively
assess students’ knowledge of a particular mathematical concept.
It is important that one-minute papers and exit tickets
• take less than one minute to address. However, some implementations of one-minute papers allow
students more than one minute but less than about five minutes to complete the task.
• contain clear directions. The questions can be standard, such as, “What is the most significant concept
you learned today?”, “What do you still wonder about?”, and “What do you still not understand?” If
several questions are posed, it helps to have them written on the board or projected on a screen.
CP.1.5. Collaborative learning strategies
Collaborative learning and cooperative learning are terms often used interchangeably, but the meanings
of the terms differ. Collaborative learning typically refers to learning that takes place as small groups of
students focus on open-ended, complex tasks, whereas cooperative learning typically refers to more structured,
small-group learning that focuses on foundational or traditional knowledge with group roles (e.g.,
facilitator, summarizer, recorder, presenter) that may also serve to help students learn to work in groups (see
Cooper and Robinson, 1997; Smith and MacGregor, 1992).
Johnson and Johnson (1999) indicate five basic elements essential for successful cooperative learning:
• Positive interdependence: Group interaction is necessary for successful resolution of the question or
task and for linking individual success and the success of the group. For example, a task can be broken
into parts to be completed by individuals, but the individual work is needed for a group resolution. The
relationship between individual and group success is exemplified by the fact that students work on a
task together but submit one group response, orally or in writing.
• Face-to-face interaction: Group interactions include discussing solution paths, important concepts,
and connections to prior knowledge, as well as offering words of encouragement and help when needed. For
example, when a student asks the instructor, “Is this right?” the instructor can redirect the question to
the group and ask for input from others in an effort to help the students answer their own questions.
• Individual accountability: Students are held accountable for their share of the work in the group. For
example, a portion of a student’s grade for group work may depend on an individual quiz given at the
end of the activity, or there may be questions in a task that must be answered individually.
• Social skills: Group interaction requires interpersonal, social, and collaborative skills. Instructors
must provide students with guidance on how to effectively interact in a small group. For example, a
class discussion of appropriate group behaviors and expected norms of communication is an essential
precursor to implementing successful cooperative learning. Providing a handout listing these behaviors
and reminding students of these expectations throughout the term is also important.
• Group processing: Group members discuss effectiveness in reaching their goals and in working together.
For example, students should be given time to reflect on prompts such as, “What I liked most
about this group was…” or “Our effectiveness as a group could be improved by…,” and then the students
should discuss their responses in the group. It also may be helpful for the instructor to collect
these reflections.
Implementing collaborative or cooperative learning strategies successfully relies heavily on assigning
groups of an appropriate size, with both the students and the task in mind. For some strategies (e.g., Think-Pair-Share),
pairing students in groups of two by proximity is appropriate. For other strategies (e.g., Small Group
Work), a group size of no more than three students is ideal because, by nature of the size of the group, all
students have more opportunities to contribute to group discussions than they would in a larger group in
which less assertive students may not have ample opportunities to contribute. Sometimes it is appropriate
to randomly assign groups, and other times it is appropriate to assign groups by thinking carefully about the
task to be completed and the skills each student brings to the task. For example, in a class of 40 students
facilitated by an instructor and a graduate student, providing students with a diagram of the classroom set-up
(i.e., location of groups 1–13) and pre-assigning groups so that students know where to go at the beginning
of class is preferable to an ad hoc approach of having students count off from 1 to 13 and then “find” their
groups. The latter also does not allow for placing students in groups based upon performance on homework
or other factors that may influence group dynamics.
In the following sections, we discuss specific cooperative learning strategies, such as think-pair-share,
paired board work, and small group learning.
Think-pair-share
Think-pair-share is a cooperative learning strategy that requires students to think about a question, discuss
their thinking with a partner, and then verbally share their ideas in class or submit their ideas for review.
Brame and Biel (2015) describe think-pair-share as follows: “The instructor asks a discussion question.
Students are instructed to think or write about an answer to the question before turning to a peer to discuss
their responses. Groups then share their responses with the class.” Novices to collaborative learning may find
that the think-pair-share cooperative learning strategy is a good first step toward implementing cooperative
learning in the classroom.
Classroom vignette: Think-pair-share
Dr. Adams attempts to make his lectures interactive and to motivate concepts with as much student
participation as he can garner in a lecture course attended by 90 students. Dr. Adams motivates the Extreme
Value Theorem in calculus using think-pair-share several times via a cycle of questioning, individual time
to think, sharing with a partner, and reporting out; then he repeats the process with follow-up questions.
Dr. Adams asks each student to take out a sheet of paper and graph a function over the interval [a, b]
and to pass their paper to someone in front of or behind them. He then asks them to pair with a student
beside them, and he begins to pose questions about the graphs that they have at hand. Given that the task is
relatively open-ended, the expectation is that there will be many different graphs generated by the students.
After each question he gives students time to think about their answer and to discuss with their partner. He
poses questions such as, “Does your function have a maximum value over the interval [a, b]?” “Does your
function have a minimum value over the interval [a, b]?” and “Does your function have both a maximum
and minimum value over the interval [a, b]?” He then polls the class by asking students to “Raise your hand
if the function you have has a maximum value over the interval [a, b],” “Raise your hand if the function
you have has a minimum value over the interval [a, b],” and “Raise your hand if your function has both a
maximum and minimum value over the interval [a, b].” After each question and poll, he asks students to
discuss their answer with their partner. Students who raise their hand to the question, “Raise your hand if
your function has both a maximum and minimum value over the interval [a, b]” share their functions by
sketching them on the board or portraying them via the document camera. Recall that these are graphs of
functions drawn by classmates.
Aer this Dr. Adams asks students to think (individually) about any features that the displayed graphs
have in common and to discuss their thoughts with their partner. He asks students to share their ndings
by calling on student pairs. He uses the ideas presented by the students regarding the common features
present to pose additional questions such as, “Is this condition really necessary?” and “What restrictions or
guidelines would we need in order to guarantee that a function has a maximum and minimum value on the
interval?” Dr. Adams has witnessed how students’ engagement in creating their own functions and examples
results in them feeling a sense of satisfaction when they establish that requiring a function to be continuous
on [a, b] will guarantee that the function attains an absolute maximum and absolute minimum on the
interval. In the process, he also exposes the logical implications (via their generated examples) that a function can
attain an absolute maximum and absolute minimum on the interval [a, b] and not be continuous, but that
continuity guarantees that maximum and minimum values exist. The polling of the class also introduces an
empirical element to the process of establishing the theorem.
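For reference, the theorem toward which the activity is building can be stated as follows (a standard formulation; the exact wording varies by textbook):

\[
\text{If } f \text{ is continuous on } [a,b], \text{ then there exist } c, d \in [a,b] \text{ with } f(c) \le f(x) \le f(d) \text{ for all } x \in [a,b].
\]

That is, f attains an absolute minimum value f(c) and an absolute maximum value f(d) on the interval.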
Discussion. Some practitioners may argue that a pure form of think-pair-share involves students sharing
their thinking with the class or writing down their thoughts and handing in their written response. However,
in a lecture-based setting, especially one with over 60 students, as in the vignette, it is not practical to call
on every student. In this case, the instructor may solicit answers that are qualitatively different from those
already mentioned by other students.
The think-pair-share strategy gives students the opportunity to polish their mathematical reasoning and
communication prior to presenting their ideas to the whole class. It is also particularly helpful for English
language learners and students with learning disabilities to prepare their contributions for whole-class dis-
cussion with a partner. This allows time to process their reasoning and to practice their communication
skills in a less high-stakes context.
In addition, the incorporation of think-pair-share strategies advances participation structures in under-
graduate mathematics classrooms that depart from strictly teacher-student forms of communication. En-
couraging students to share their mathematical reasoning with a partner is another mechanism to build a
stronger sense of classroom community, as discussed earlier in the chapter.
Practical tips
It is helpful to think through the goal of your think-pair-share activity.
• Do you plan to use student reasoning to resolve the question in a meaningful way? If so, can you antic-
ipate some of the responses? What follow-up questions might be needed?
• Does your question or task require students to speak to one another, to generate new ideas, or to stimulate
diverse strategies or examples? How will you sequence the interaction? How long will you give
students to think, to pair, and then to share?
• Do you believe that incorporating the activity will enhance student reasoning? If so, in what way?
Paired board work
Another collaborative learning strategy that is effective in promoting classroom engagement is called paired
board work. To help facilitate learning, students engage in mathematics through problem solving, making
conjectures, discovering patterns, exploring informally and formally, and formalizing ideas, while having
the opportunity to learn from their peers. Paired board work necessitates that all students demonstrate their
knowledge—in pairs or even in triples—at the blackboards or whiteboards in a classroom.
The logistics of using paired board work during class are rather simple. First, it is important to have access
to stationary (wall-mounted) whiteboards or blackboards. Ideally, all student pairs should have space at the
boards, but if this is not possible, it is appropriate to have half of the students at the board and half at their
desks. To implement the strategy, assign students to work in pairs at the board with one student tasked as
the scribe and the other student tasked as the quality controller. The scribe is responsible for writing the
mathematics on the board. The quality controller is responsible for assisting the scribe and monitoring the
quality of the mathematics displayed, attending to precision of notation, correctness, and accuracy. After
each problem, students rotate roles so that each student has multiple opportunities to serve as both scribe
and quality controller.
This method allows students to share their reasoning publicly, while also allowing instructors to formatively
assess students’ knowledge and skills. Often, students are asked to describe and present their reasoning
of the work they have displayed on the boards, and other students are asked to critique the presented reason-
ing. It is important to remind students that we are critiquing the mathematics, not the person. Paired board
work is a powerful tool for demonstrating knowledge, critiquing reasoning, analyzing multiple solution
pathways, and assessing students’ reasoning. Students enjoy having the opportunity to gain other students’
perspectives on problems and discover other methods for completing various mathematical tasks. Figure 1
and Figure 2 below show examples of paired board work in action.
There are many ways to assign tasks for students to complete at the boards when implementing paired
board work. One example is to have all students work on the same problem at the same time. An advantage
to this approach is that the instructor can assess students’ thinking and abilities on this one problem in re-
al-time, thus allowing the instructor to tailor their guidance for each pair as the instructor roams the room
and monitors student progress. is approach also allows for multiple solution pathways to emerge, which
can be leveraged by the instructor during the whole-class debriefing stage before moving on to the next
problem. An example of this is shown below in Figure 3, where different student pairs created different, and
mathematically sound, solution pathways.
Maximum Rectangle Task: A rectangle is to be inscribed within a right triangle with a base of 3 and a
height of 4. What is the largest rectangle that can be created?
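One solution pathway of the kind student pairs might display is sketched below, under the common assumption that two sides of the rectangle lie along the legs of the triangle:

\[
\text{Hypotenuse: } y = 4 - \tfrac{4}{3}x, \qquad A(x) = x\left(4 - \tfrac{4}{3}x\right) = 4x - \tfrac{4}{3}x^2,
\]
\[
A'(x) = 4 - \tfrac{8}{3}x = 0 \;\Longrightarrow\; x = \tfrac{3}{2}, \quad y = 2, \quad A\!\left(\tfrac{3}{2}\right) = 3.
\]

Here the right angle is placed at the origin with the legs along the coordinate axes, and (x, y) is the corner of the rectangle on the hypotenuse; the maximum area is 3, half the area of the triangle. Other pairs may position the rectangle differently, which is exactly the kind of variation Figure 3 illustrates.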
Another approach when implementing paired board work is to have students work on different problems.
Figure 1. Paired board work with calculus students.
Figure 2. Paired board work with college algebra students.
Figure 3. Multiple solution pathways of the Maximum Rectangle Task during paired-board work.
This approach can be accomplished by posting printouts of the problems at various places on the boards,
and asking students to choose a problem to solve at the board. Once the task is completed, students can be
called upon to present their solutions to the class and solicit feedback from their peers for improvement of
their solutions. Overall, paired board work allows students to engage in mathematics through collaboration,
to increase high-quality mathematical discourse, to critique the reasoning of others, to justify their solu-
tions, and to display their reasoning publicly through written and verbal descriptions.
Small group work
Cooperative or collaborative learning must involve small groups, and there are many ways to incorpo-
rate small group work into the classroom. If grouping students based on performance, it is important to
place low-performing students with medium-performing students and medium-performing students with
high-performing students. is practice provides the best opportunity for students to work together and
grow in their learning. It is best to avoid placing low-performing students with high-performing ones. The
following is a list of some common strategies instructors use for grouping students for classroom work:
• Balance student personalities so that more vocal students are grouped with less vocal students.
• Regroup students oen so that they work with a variety of students from class.
• Use dierent grouping strategies, such as using random generators, drawing from a deck of cards (all
aces together, etc.), assigning groups based on the order in which students entered class, etc.
• Use more strategic approaches, such as grouping based on declared majors or interests, class perfor-
mance, or other knowledge of the students.
• Avoid allowing students to remain in groups when the dynamic of the group impedes student learning.
This is a good time to regroup!
Eective small group learning should incorporate tasks or questions that involve the ve critical elements
mentioned in section CP.1.5: positive interdependence, face-to-face interactions, individual accountability,
social skills, and group processing.
In summary, collaborative learning entails students working on tasks in small groups of 3-4 students.
In a more structured cooperative setting, students may be assigned roles in their groups such as facilitator,
summarizer, recorder, and presenter. During this work, the instructor typically listens to the discussions
and engages with dierent groups in a variety of ways, such as asking guiding questions or redirecting stu-
dents’ inquiries. Oen a period of group work is followed by group presentations to the entire class, and the
instructor may use the presented work to draw out important topics, to make connections, or to lead into
subsequent material. e following vignette illustrates these processes.
Classroom vignette: Group work with rings
In his abstract algebra course, Professor Morales wants to help students generalize ideas that have been
familiar to them since grade school. For example, a ring is a set of objects together with two operations, ad-
dition and multiplication, such that a certain collection of rules is satisfied. To illustrate the general behavior
of elements in a ring, one can make analogies to the behavior of the integers under addition and multiplica-
tion. But how does one recognize when a student truly understands definitions such as “unity element” and
“additive inverse”?
As is customary, the textbook uses the generic notation “0” and “1” to denote a ring’s additive identity
and unity, and the notation “−a” to denote the additive inverse of the ring element a. The first surprise comes
when students work on the following task in small groups:
Task 1. Calculate 0, 1, and −1 in the ring Z_4, the set {0, 1, 2, 3} under addition and multiplication modulo 4.
Misconceptions are evident. Students correctly identify the additive identity as the integer 0 and the unity
as the integer 1, but many students insist that there is no additive inverse of the unity element since the integer
−1 is not present in the set {0, 1, 2, 3}. This brings up an opportunity for a whole-class discussion of what
the symbol “−1” means in an abstract setting. Professor Morales writes the equation x + 1 = 0 on the board
and asks the students to discuss in their groups whether this equation has any solutions in Z_4. In this context,
most students are comfortable with the fact that x = 3 is a solution, and so writing −1 = 3 makes sense
in this particular ring. Groups who see this right away are encouraged to generalize to the additive inverse
of other elements in any ring Z_n.
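The generalization being encouraged here can be summarized in one line (a sketch, using the notation of the task):

\[
\text{In } \mathbb{Z}_n:\quad a + (n - a) \equiv 0 \pmod{n}, \quad\text{so}\quad -a = n - a \text{ for any nonzero element } a.
\]

In particular, −1 = 4 − 1 = 3 in Z_4, matching the solution of x + 1 = 0 found above.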
Professor Morales presents the next task for group discussion:
Task 2. Calculate 0, 1, and −1 in the ring M_2(R), the set of 2×2 matrices over the real numbers.
Many groups will be wrestling with the use of the symbols 0, 1, and −1 to represent matrices. For groups stuck
on the values of 0 and 1, Professor Morales points out that a unity element satisfies
(1) in Z, a1 = 1a = a for all a ∈ Z, and
(2) in M_2(R), A1 = 1A = A for all A ∈ M_2(R).
Reminding students that in (2) the ring multiplication may only be between 2×2 matrices leads to the
correct characterization of I as the unity 1 in M_2(R), and then the correct interpretation of −1 follows in the
same way as in the previous task.
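Written out explicitly, the three elements of Task 2 are the following matrices (standard notation):

\[
0 = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}, \qquad
1 = I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad
-1 = -I = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}.
\]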
Aer building this facility to work within , Professor Morales next introduces a new task aer a brief
presentation to explain that the notation represents the repeated addition of copies of the ring element ,
provided is a positive integer.
Task 3. In the ring M_2(R), compute the element 3 · 1.
Professor Morales asks students to explain their reasoning very carefully within their groups. Most groups
produce a correct answer very quickly, but upon listening to their explanations within their groups, the pro-
fessor discovers that most of the students adopted scalar multiplication rather than repeated addition. Rath-
er than lecture on improving the explanation, Professor Morales presents a final task for group discussion
that renders the incorrect explanation impossible.
Task 4. Let R = {v, w, x, y, z} be a ring where v, w, x, y, and z are all distinct elements under the operations +
and · for which the Cayley tables are given in Figure 4.
Figure 4. Cayley tables for Task 4.
Use the given information to complete the following.
1. Calculate 0 and 1 in this ring.
2. Calculate −1 in this ring.
3. Calculate 3 · 1 in this ring.
Within this context there is no way to obtain a correct answer for 3 · 1 without interpreting the expression as
repeated addition. Moreover, one is forced to recognize the critical properties of the additive identity and the
unity element in order to correctly identify them among all the ring elements.
Aer completing Task 4, students are asked to revisit their explanations for their answers to Task 3.
Discussion. In this vignette, we see how an instructor uses an activity to learn about his students’ under-
standing of a specic mathematical concept. We will describe some of the aspects of a typical group work
activity by considering the role of the instructor and the role of the student in this example.
An important aspect of the work of the instructor in group work activities is to select a task for students
to work on.
• To elicit positive interdependence, select a sufficiently subtle or complex, yet accessible, task that generates
discussion. Since the instructor cannot talk to all groups simultaneously, students need to have an en-
try point to productively engage with the material. In this vignette, the problem starts with a familiar
example and then increases in abstraction.
• To attend to the mathematics, select an activity with a clear mathematical purpose in mind. In this case
students generalize the idea of additive and multiplicative identity from the familiar ring of integers to
more abstract rings.
• To expose student thinking, select a task that provides opportunities for formative assessment. Instructors
can learn about their students and make informed decisions by listening to them and by
asking for explanations, which may reveal flawed reasoning even when the answer is correct. Such
inquiry can lead to task extensions that require a different strategy, such as the addition of Task 4 in
the vignette.
During the activity, the instructor actively monitors and observes the group work in progress.
• Based on the student reasoning made visible during the group work, the instructor can decide where to
go next. By observing his students work in groups on each task, he gains valuable insights into where
his students have conceptual difficulties.
• The instructor selects approaches and ideas to highlight during whole-class discussions and determines
a sequence in which the approaches will be mentioned. Note that some parts of the activity may not
need any attention (students sufficiently covered the content already), and other parts may offer opportunities
for deeper learning or show gaps in students’ understanding.
• During a whole class discussion, the instructor can make connections between ideas that have come
up. In this example the instructor points out the parallels between how 1 (integer unity) and I (matrix
unity) are solutions to similar equations, generalizing the idea of unity from specic examples to the
general denition.
Managing the student engagement and interaction is also an important role for the instructor.
• Students work on problems in small groups and discuss solutions with each other and with the instructor.
They encounter hurdles, questions, and misconceptions. All students actively engage with each other and
with the content; there is no place to hide and just watch. In the vignette, many groups have trouble
with the notation used for the unities when applied to matrices. This is addressed directly within the
group discussion before it becomes a major stumbling block.
• Students engage with the instructor by asking questions or giving explanations for their work. In the
vignette, dierent groups justify their answers to the instructor in similar, incorrect ways. is points
out widely held misconceptions that can be addressed during the activity and with a subsequent new
activity.
• Students share solutions, and often different groups contribute different approaches, leading to
deeper learning for all students.
• Making connections to classroom norms and practices for mathematical problem solving and com-
munication can enhance the equity-oriented considerations for the small-group work classroom prac-
tice. For example, how are members of small groups engaged in collaborative learning in ways that
foster equitable, meaningful engagement with the mathematics?
CP.1.6. Just-in-time teaching (JiTT)
Just-in-time teaching (JiTT) is a formative assessment practice employing pre-class readings and online
quizzes to shorten the feedback loop between out-of-class and in-class experiences. It was developed by
Gregor Novak and Andrew Gavrin, physics professors at Indiana University–Purdue University Indianapolis
(IUPUI), and Evelyn Patterson, physics professor at the Air Force Academy, in 1996. Their objective was
to help students structure their out-of-class study time in order to get more out of the limited face-to-face
class time. To accomplish this objective, they designed an online system that allows them to ask students
questions and receive student responses in a short time frame. They used student responses to modify their
classroom instruction “just in time,” i.e., right before class started.
JiTT creates a short feedback loop between out-of-class and in-class experiences. Students prepare for
class by answering questions on pre-class assignments, instructors modify their plans for in-class activities
to address gaps in understanding and capitalize on student strengths, and instructors reflect on in-class
activities to develop the next set of pre-class questions. Additionally, instructors can provide individual students
with feedback on their responses. Specifically, many JiTT platforms and learning management systems
allow instructors to send feedback to students directly from the response-viewing screen.
The questions on pre-class assignments, often referred to as “JiTTs,” should be “short, thought-provoking
questions that, when fully discussed, often have complex answers” (Novak and Patterson, 2010, p. 6). They
should have a low floor and a high ceiling. That is, they should be simple enough to allow students to engage
with them despite not yet having formal instruction on the concept, but complex enough to spark interest
and require meaningful thought to answer. The GoodQuestions Project, run by researchers at Cornell University,
developed many multiple-choice questions for use as JiTTs in calculus. For example, the question,
“Were you ever π feet tall?” could be posed in a calculus class to make students grapple with continuity, the
intermediate value theorem, and irrational and transcendental numbers. To help students develop their
metacognitive skills, instructors often end their JiTT assignments with a question such as, “After completing
this exercise what concepts or ideas are still unclear and why?” (Simkins and Maier, 2010, p. xvii).
JiTT diers from online homework because it is pre-class formative assessment, designed to get students
thinking about an idea before class. On the other hand, homework (whether online or paper-based) is
post-class assessment, usually summative, designed to help students cement and demonstrate their under-
standing of concepts discussed earlier in class. Given that JiTT is a formative assessment practice and JiTT
exercises are “non-judgmental diagnostic tools” (Novak and Patterson, 2010, p. 11), student responses to
JiTTs should be graded solely on the basis of effort and completion, rather than on the basis of correctness.
This allows students to feel safe expressing ideas that are still tentative, inchoate, or under development. As
illustrated in the second vignette below, incorrect or partially correct responses are often extremely valuable
as seeds for classroom discussion. Additionally, incorrect responses can help illuminate gaps in students’
understanding of a particular topic. That is, if a large number of students all make the same mistake, then
instructors know to focus their in-class activities in an effort to address that mistake.
Research (see Formica et al., 2010; Marrs and Novak, 2003; Riskowski, 2014) suggests that students in
JiTT classes exhibit increased preparation for and participation in the in-class activities due to having spent
some time before class thinking about the material. They feel more ownership of important ideas and concepts,
because their own words are used to start the in-class conversation, and they have frequent opportunities
to practice discipline-appropriate reasoning, communicating, and metacognition in a low-stakes
environment. Instructors develop a more complete picture of what topics students do and do not fully grasp,
are able to modify their instruction to target more directly the concepts students are having the most difficulty
with, and have another opportunity to give students quick feedback on their learning. Furthermore,
empirical results show that students in JiTT classes outperform their peers in similar classes without JiTT.
We illustrate JiTT with two vignettes and follow-up discussions, and then provide more information on
implementing JiTT in the classroom.
Classroom vignette: An all-too-familiar story
Dr. Gomez taught her calculus class the formal definition of continuity (i.e., that f is continuous at x = c if
lim_{x→c} f(x) = f(c)) and discussed the use of this definition to compute limits: if we know that f is continuous at
some point c (for example, if f is a rational function for which the denominator is nonzero at c), then we can
compute lim_{x→c} f(x) by simply evaluating the function. However, she continued, if we are going to compute
limits in this way, then we must justify our steps by saying why we know f is continuous at c.
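A small example of the kind of justified computation being asked for (an illustrative instance, not taken from the vignette):

\[
\lim_{x \to 2} \frac{x^2 + 1}{x - 3} = \frac{2^2 + 1}{2 - 3} = -5,
\]

with the justification that the function is rational and its denominator is nonzero at x = 2, so it is continuous there and the limit may be computed by evaluation.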
Her students appeared to be engaged during her lecture, nodded along through several examples, and
even worked together in small groups through an example problem of their own. Dr. Gomez, consequently,
believed that her class understood this idea. However, as Dr. Gomez graded the test she gave her students a
week after this class period, she quickly discovered that very few people justified their steps in limit calculations.
She was both surprised and dismayed by this finding, because she thought students understood the
idea so well during the class session and because the students had high scores on the online homework. As
a result, she spent a fair amount of class time discussing this common mistake even though she felt that her
students were confused or disengaged because it had been so long since their original exposure to the formal
limit denition of continuity.
Discussion. e problem that manifests itself in the above vignette is that the feedback cycle took too long.
Students learned the limit denition of continuity, a week elapsed before the test, several days of grading
time elapsed further, and Dr. Gomez was thus unable to identify and address the misunderstanding until
perhaps two weeks aer students’ initial introduction to the idea. By that time, her students had learned
so many other things that the formal denition of continuity was no longer fresh in their minds, and thus
they were not receptive to the re-discussion of the idea.
Classroom vignette: Another story, with a shorter feedback cycle
On Friday, Dr. Gomez planned to cover global optimization of continuous functions. On the preceding
Wednesday, she sent her students a link to a pre-class assignment due one hour before class on Friday, in
which she asked students to look at the following graph of a function f(x) shown in Figure 5.
Figure 5. Graph of f(x).
On this pre-class assignment, Dr. Gomez asked students to estimate the critical points and critical
values of f(x), and to identify the very smallest and very largest values of f(x). The students were also asked to
identify the very smallest and very largest values of f if its domain was restricted to [−3, 5] and if its domain
was restricted to [0, 4]. Finally, the students were asked to make a conjecture, based on their work, about the
possible locations of global extrema of a function defined on a closed interval.
In class on Friday, Dr. Gomez highlighted two student conjectures: Student A’s conjecture was, “Global
extrema occur at critical points,” and Student B’s conjecture was, “The global extrema are the endpoint values.”
She asked her students to decide whether Student A or Student B was correct. After exploring several
examples in small groups, her students decided that both Student A and Student B were partially correct.
Dr. Gomez next asked, “How could we change their statements to make them completely correct?” and she
gave the students several minutes to discuss this question in small groups. Eventually, the class decided that
Student A should have said, “Global extrema can occur at critical points,” and that Student B should have
said, “The global extrema can be the endpoint values.”
Dr. Gomez continued with the rest of her plan for the class session, which included students working together
in small groups to find the global extrema of f(x) = x³ − x² − 48x + 52 on the domain [0, 10] and on
the domain [−5, 15]. In both instances, students automatically checked the values at the endpoints, saying
to each other that they had to do that because of Student B’s pre-class assignment response.
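A sketch of the closed-interval computation the groups would carry out on [0, 10] (numerical values rounded):

\[
f'(x) = 3x^2 - 2x - 48 = 0 \;\Longrightarrow\; x = \frac{1 \pm \sqrt{145}}{3},
\]

and only x = (1 + √145)/3 ≈ 4.35 lies in [0, 10]. Comparing the candidates,

\[
f(0) = 52, \qquad f(10) = 472, \qquad f(4.35) \approx -93.4,
\]

so on [0, 10] the global maximum 472 occurs at the endpoint x = 10 and the global minimum occurs at the interior critical point, confirming that both endpoints and critical points must be checked.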
Discussion. Compare the length of the feedback cycle in this example to the length of the feedback cycle in
the rst example. In the second story, the assignment was only out to students for a period of several days;
grading the assignment only took a few minutes, since it was on the basis of completion and eort; and
students received feedback in an anonymous and holistic way during the very next class session, when the
concept was still fresh in their minds. e second story is an example of Just in Time Teaching, JiTT.
Practical tips
As with any of the other student engagement strategies detailed in this guide, JiTT can be implemented well
and have good results, or it can be implemented poorly and have poor results. In particular, if pre-class ac-
tivities and questions are not carefully integrated into the in-class activities, students may come to resent the
extra work, which can lead to negative effects on their motivation and attitudes toward the class, and even to
the possibility of a reversal of learning gains (Camp et al., 2010).
When implementing JiTT, the following are needed: (1) JiTT questions that are thought-provoking for
students, (2) a platform where students can respond to JiTTs and where the instructor can grade their re-
sponses, and (3) a plan to incorporate JiTT responses into in-class activities.
• Question banks: The GoodQuestions Project (www.math.cornell.edu/~GoodQuestions/) is a source of JiTT
questions for calculus, and the JiTT Digital Library (jittdl.physics.iupui.edu/) provided by the original
inventors of JiTT focuses primarily on physics concepts. Additionally, ConcepTest questions designed
for use in peer instruction can often be modified to be good JiTT questions, and many textbook publishers
provide extensive banks of ConcepTest questions carefully aligned with the textbook. Finally,
the website mathquest.carroll.edu/resources.html contains links to resource collections primarily focused
on classroom voting systems.
• Platforms: Most learning management systems (such as Canvas, Blackboard, D2L, or Moodle) and on-
line homework systems have integrated assignment features that are excellent for collecting and grad-
ing student responses to JiTT questions. Additionally, many instructors use online survey tools (such
as Google Forms, Qualtrics, or SurveyMonkey) to accept and grade student responses. Of particular
interest is a tool purpose-built by the JiTT developers; a guide to this tool is available at bit.ly/jittdlguide.
• Planning: ere are two places where instructors will need to do some careful planning. e rst is
the beginning-of-semester dialogue conveying class expectations. Because JiTT is a formative assess-
ment technique and, thus, quite dierent from the summative assessments most students are used to,
18 MAA Instructional Practices Guide
instructors will need to spend a fair amount of time explaining it at the beginning of the semester. In-
structors should plan to explain to students the benets of the system, give them tips on how to make
sure they are keeping up with the daily JiTT activities, and clarify expectations for their answers. Sec-
ond, instructors will need to plan how to integrate JiTT responses into daily class activities. Instructors
oen schedule ve or ten minutes of JiTT response discussion into the beginning of each class session
and look for points in each class session that can be illustrated nicely by a student comment. Avoid
positioning individual students as being more or less mathematically able when orchestrating whole-
class discussions based upon JiTT responses (see Ray, 2013, pp. 42–55).
• A caveat: As with any of the student-centered strategies outlined in this guide, it is best to start small.
Instructors who attempt to make a pre-class assignment for every class the first time they try JiTT may
become overwhelmed with this technique. It is usually more successful to develop and improve the
list of pre-class assignments iteratively each term by perhaps integrating a JiTT prompt once every
two weeks and adding and refining questions each term until the set of pre-class assignments becomes
manageable and stable.
CP.1.7. Developing persistence in problem solving
A possible obstacle to student-centered classrooms relates to students’ learned or innate ideas about mathe-
matics and what it means to do mathematics. Many tasks meant to actively engage students in the classroom
work best if students understand that persistence in working mathematics is valued and is integral to doing
mathematics. Persistence can be dened as “student actions that include students concentrating, applying
themselves, believing they can succeed, and making eort to learn” (Clarke et al., 2014, p. 67). e good
news is that according to research, perseverance can be improved (Duckworth, 2016).
Classroom vignette: Persistence
Dr. Smith has been working with his analysis students on the basics of mathematical sequences. His goal is
for students to understand the intricacies of the formal definition for the limit of a sequence: “A sequence
{a_n} has a limit L if for every ε > 0, there exists a natural number N such that for every n > N, |a_n − L| < ε.” To
this end, he wants his students to re-invent this definition on their own, a task that is quite challenging and
requires a good deal of persistence. As class begins he enthusiastically explains the task and acknowledges
that it will require persistence. “After all,” he says, “inventing the formal limit definition took mathematicians
over 100 years!” However, he also tells the students he believes in their ability to succeed and promises
to guide them away from any dead ends so that the project will take only a few weeks rather than a century. He
builds their motivation by explaining why the mathematics is useful and by connecting it to their personal
interests and goals (Schechtman et al., 2013). Since this is an analysis class for mathematics majors, he takes
a different motivational route than he does in his college algebra class. He explains that they will be working
like mathematicians, and that this task will greatly enhance their ability to understand and to do upper-level
mathematics.
Dr. Smith has learned that how he sets up the task is crucial and that an overview of how the task will
play out can make a significant difference in helping students persist. As such, he begins by asking students
to create a set of sequence graphs, some of which converge to 5 and some of which do not. He explains that
students will write an initial draft definition that describes the sequences that converge to 5 and “kicks out”
the sequences that do not. The students test their definition against their sequence graphs and refine their
definitions to take care of any problems they notice. At this point, the cyclic process of “testing and refining”
will go through many iterations until the students develop a complete definition.
Dr. Smith has the students work in groups of four to provide the peer support needed to persist. As the
groups begin to work he wanders around the room, observing and listening carefully. He listens for areas of
diculty while resisting the temptation to tell them what to do so that students will feel their denitions are
their own. He notices that most groups create denitions such as, “A sequence converges to a limit of 5 if it
approaches 5.” Aer about 10 minutes he brings the class back together to briey discuss the vagueness of the
word “approaches” and the need to avoid circular denitions. He makes sure the discussion is brief because
he knows that giving the students plenty of time to work on the task will encourage persistence. As the stu-
dents continue working, Dr. Smith makes sure to talk, stand, and look in ways that communicate that it mat-
ters to him what each student is saying and doing even if he does not comment (Lampert, 2001). Aer a few
minutes, he notices that Brian, Je, Kayla, and Misty have developed the following denition, “A sequence
converges to a limit of 5 if the next term always moves closer to 5.” Resisting the urge to immediately point
out the graph of the damped oscillating sequence or the fact that a sequence like a
n
=4- 1/n satises their
denition, he instead asks a guiding question, “Does your denition work for all of our sequence graphs that
converge to 5?” or “Does your denition also work for sequence graphs that do not actually converge to 5?”
Aer testing the denition against each graph, the group realizes on their own that the damped oscillating
sequence is a problem or that the sequence a
n
=4- 1/n satises their denition, but converges to 4.
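The trouble with the group’s definition can be made precise using their own example:

\[
|5 - a_n| = \left|5 - \left(4 - \tfrac{1}{n}\right)\right| = 1 + \tfrac{1}{n},
\]

which strictly decreases as n grows, so every term does move closer to 5; yet a_n → 4, not 5.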
Although watching the students struggle can be difficult and time-consuming, Dr. Smith knows that
giving challenging tasks at the right level can facilitate persistence. After a while, he notices that one group
struggles more than the others and one outspoken student from this group is extremely frustrated. He talks
to this group about having a growth mindset (Dweck, 2008; also see the equity section in the chapter on
Cross-Cutting Themes), explaining that while this is a difficult task, they can actually grow their intelligence
by persisting even if they do not make as much progress as they would like. He modifies their task for now,
suggesting that they focus on writing a definition only for the monotonic sequences and then include the
more difficult sequences once the monotonic ones are tackled. Towards the end of the class period, each
group shares their initial definitions, discusses their ideas, and describes problems they encountered. The
other class members have a chance to ask questions and give feedback. This gives everyone a chance to glean
ideas from the other groups’ definitions and to celebrate their persistence.
Practical tips
A number of strategies exist for motivating students to attempt complex tasks and to persist until they solve
them.
• Choose challenging tasks that are beyond students’ current abilities because they can motivate students
(Seeley, 2009). These tasks should require students to struggle constructively. In contrast to frustration,
constructive struggle involves expending effort to make sense of the mathematics and to figure
out something that is not readily apparent (Hiebert and Grouws, 2007).
• Set up the lesson carefully (Clarke et al., 2014). Be enthusiastic about the tasks. Acknowledge that the
tasks require persistence. Give students a brief overview of how the lesson will play out. Make your
expectations clear. Believe in each student’s ability to succeed. According to the Carnegie Foundation
for the Advancement of Teaching, a student’s belief that they are capable of learning mathematics is an
important psychological driver of persistence (see the equity section of the chapter on Cross-Cutting
Themes).
• Provide opportunities for intrinsic rewards and appropriate extrinsic rewards. Extrinsic motivators
such as grades can damage persistence, but intrinsic motivation based on individual interests and a
desire to grow and learn can strengthen perseverance (Clarke et al., 2014). Build intrinsic motivation
by focusing on the value of the task. Connect the task to students’ everyday lives, interests, and goals
(Schechtman et al., 2013). Use extrinsic rewards only if these rewards are unexpected and encourage
identiable behaviors rather than outcomes (McKay, 2015). 
• Encourage students to develop and value a growth mindset instead of a fixed mindset (Dweck, 2008).
Specific to mathematics, a fixed mindset implies that students believe that their talent in mathematics
is fixed. That is, one is either good at mathematics or one is not. A growth mindset in mathematics
implies that students view challenge and struggle as avenues to shape and expand their mathematical
understanding and that mathematical talent is malleable.
• Give students plenty of time (Clarke et al., 2014). To encourage persistence, the lesson must be struc-
tured so that students have plenty of time to struggle. Some educators have suggested giving students
a few minutes to plan with a partner how they will solve the problem. Keep any mini-lessons brief and
to the point.
• Have students work in groups. Feeling socially tied to peers is another important psychological factor
that drives persistence (Carnegie Foundation for the Advancement of Teaching). Actively monitor
status issues related to gender or race that may impede full participation in the group or possibly create
a situation where stereotype threat impedes developing persistence in a social setting.
• Resist the urge to tell (Clarke et al., 2014) and allow students to struggle. Lampert (2001) suggests using
about two-thirds of the problem-solving time to move around the room, simply watching and listen-
ing. Look and stand in ways that show that it matters to you what they are saying and doing even if you
do not comment. Instructors may need to develop questions on the spot that guide students but “stop
short of telling [them] what they need to know to solve the problem” (Seeley, 2009, p. 90). Any teaching
that is done will be “on the run” in response to what you see or hear (Lampert, 2001). Think about how
to support student engagement without removing the struggle (Baldinger and Louie, 2014). As you
listen, push students to justify their thinking and strongly encourage them to write down their justifications
(Clarke et al., 2014). As you walk about the room, select certain students to present their work
during the closing stage of the lesson and plan the sequence in which their work should be displayed.
• As needed, pull the class back together for brief discussions or mini-lessons (Clarke et al., 2014). A
good time to bring the class back together for a short discussion might be after about 10 minutes of
problem-solving. If students have been given a series of problems to solve, ask for their solutions to the
first few, having the students collectively decide on the correct answers. If students are tackling just one
problem during the class period, ask them to describe and discuss their initial thoughts and attempts.
If you observe common difficulties or notice that some students are struggling more than others, bring
the class together for a short mini-lesson. If the lesson is brief, the students who have demonstrated a
strong mathematical understanding and successfully completed the problem-solving task will be happy
to see they are on the right track.
• Prepare appropriate prompts ahead of time to support persistence for students at various stages in their
mathematical understanding. Prepare “enabling prompts” to help students who are struggling with
the entry point into the mathematical task (Sullivan, 2011). These can be slightly different initial tasks
that can get students on track before returning to the main problem. Prepare “extending prompts” for
students who complete the task quickly to keep them appropriately challenged, and pace group interactions
(Sullivan, 2011). After the lesson, reflect on how future instruction could create opportunities
for more students to engage deeply with the mathematics (Baldinger and Louie, 2014).
• As appropriate, talk to the students about the metacognitive strategies involved in mathematical prob-
lem-solving. In order to persist, students need to be able to step back and monitor their progress. Help
them develop strategies for getting “unstuck.” When feeling frustrated they may want to stop and take a
look at their progress, consider if they are on the right track, determine if they need to switch strategies,
or perhaps get some pointers from a peer or from the instructor.
• As appropriate, close the lesson with a summarizing discussion (Clarke et al., 2014). Have selected
students present their solutions. Help the class make connections between different students’ responses
and summarize their work. Keep in mind that summarizing discussions could occur at various
points during the lesson or could occur at the beginning of the next class period. Remember that even
incomplete solutions or incorrect responses may reveal important points. Remind the students of the
important aspects of a growth mindset, such as the fact that errors are part of learning, and learning
occurs even if the task is not completed.
• Celebrate students’ learning and be proud of their efforts to persist!
CP.1.8. Inquiry-based teaching and learning strategies
Inquiry-based teaching provides a rich way in which to actively engage students in the classroom. In the
spirit of inclusiveness of a range of particular teaching strategies, we offer three guiding principles that
promote the success of inquiry-based teaching: students’ deep engagement in mathematics, peer to peer
interaction, and instructor interest in and use of student thinking (Hayward and Laursen, 2014; Rasmussen
and Kwon, 2007). Specic strategies for enacting these principles are within the purview of the individual
instructor. Below, we elaborate on each of these three guiding principles.
Deep engagement in mathematics
In an inquiry-based learning classroom, students are engaged in doing mathematics. They are not just listening
and taking in information. Rather, they are engaged in deep mathematical thinking. This does not mean
that lecture is absent from every IBL classroom, for there is a time and place for well-crafted exposition
that brings ideas together and makes connections to more formal or conventional mathematics. Lecture,
however, does not mean doing problem after problem for students. Instead, the instructor may introduce a
new topic and have the students try problems themselves. The students try unfamiliar or familiar problems,
but are engaged in productive struggle until they figure out the answers themselves or with guidance from
the professor or other students. We also refer readers to the section on Selecting Appropriate Mathematical
Tasks (see section CP.2).
Peer to peer interaction
Given that students are working on problems that are designed to be engaging, it often means that these
problems are also more difficult than standard problems and require collaborating with peers. Working with
others can appear in different forms (in or outside of class) such as group work, think-pair-share, collaborative
board work, whole-class discussion, etc. In all of these settings students communicate mathematics
with each other via mathematical discussions, which can include arguing about the mathematics. These
collaborations facilitate learning to form logical arguments, and as a result students are able to tackle more
difficult problems.
Instructor’s interest in and use of student reasoning
In a classroom where an instructor adopts inquiry-based teaching, students are deeply engaged in doing
mathematics and collaborating with their peers. Such work on the part of the students provides the instruc-
tor with ample occasions to inquire into students’ reasoning, to help students listen to and orient to other
students’ reasoning, and to use students’ reasoning to advance the mathematics lesson. Such work on the
part of the instructor necessitates interest in how students reason, and it creates a classroom environment
where students explain their own reasoning and attend to the reasoning of others. Fortunately, the research
literature provides some helpful and concrete suggestions so that instructors can achieve these goals (e.g.,
Michaels and O’Connor, 2013; Rasmussen, Yackel, and King, 2003, pp. 143–154). For example, for a student to
participate in the discussion, the student has to be able to share thoughts and responses out loud, regardless
of how tentative or unsure they are of their ideas. If only one or two students can do this, it is not a discussion;
it is a monologue or, at best, a dialogue between the instructor and a student. Prompts that facilitate a
classroom environment where students routinely share their reasoning include:
• “Say more about that.”
• “Dave, I know you haven’t finished, but tell us your initial thinking.”
• “Take your time, we’re not in a rush.”
• “Oh, that’s interesting. Did everyone hear what Julie just said? She said that …”
• “That’s an important point. Keisha, can you say that again?”
Once students start sharing their reasoning, they need to consider others’ ideas and reasoning and respond
to them. is is the impetus of a real discussion that supports robust learning. Given students are used to
instructors evaluating and commenting on their reasoning, it takes explicit attention on the part of the
instructor to help students attend to and make sense of other students’ reasoning. Prompts that can help
instructors achieve this goal include:
• “Who can repeat in their own words what Juan just said?”
• “But what about Damen’s point that …?”
• “Oscar, can you say in your own words what I am asking you all to do?”
• “So Debbie, is Jorge saying that …?”
• “Jesika, what do you think about what Ding just said?”
Note that some of these prompts are ones that a person typically does not use in everyday conversation but
are extremely useful for instructor talk.
Inquiry-Based Learning (IBL) is used as a student-centered teaching strategy throughout all levels of
instruction in mathematics and in other STEM fields. IBL engages students in sense-making activities.
Students work on tasks that require them to solve problems, conjecture, experiment, explore, create, and
communicate, all critical skills and habits of mind in which mathematicians and scientists engage regularly.
Rather than showing facts or a clear, smooth path to a solution, the instructor solicits student reasoning and
guides students via well-crafted problems and questions through an adventure in mathematical discovery
while using student reasoning to advance the mathematical agenda. Students are thus engaged in the creation
of mathematics, allowing them to see mathematics as a part of human activity, not apart from it (Ernst,
Hodge, and Yoshinobu, 2017; Freudenthal, 1991).
CP.1.9. Peer instruction and technology
The term peer instruction, coined by Harvard physics professor Eric Mazur, refers to a particular sequence
of in-class activities modeled on the think-pair-share sequence and supported by the use of a classroom
response system. Some of these systems use handheld devices (referred to as “clickers”) that let students
submit their answers to a receiver plugged into the instructor’s computer. For more information about clickers, see cft.vanderbilt.edu/guides-sub-pages/clickers/. Other systems
use a “bring your own device” approach that requires students to respond via wifi or text messaging using
their own devices such as phones, tablets, or laptops. Instructors can even simply supply colored or num-
bered cards for students to hold up in order to accomplish a no-tech version of this idea.
Peer instruction typically involves the following steps:
1. An instructor poses a question, often a multiple-choice question, aimed at differentiating student conceptions about a topic.
2. All students in the class are invited to answer the question using the classroom response system.
3. If there is a technological system involved, the instructor views the aggregated responses of the students,
perhaps as a bar graph generated by the system.
4. Based on the student responses, one of three things happens next.
a) If most students answer the question correctly, then the instructor can move on fairly quickly to the
next topic. Some brief discussion of the question is warranted for the sake of students who answered
incorrectly or guessed their way to the correct answer, but if 80% or more of the students got the
question correct, it may be a better use of class time to move on to other topics.
b) If students are split among two or more answers, then the instructor asks students to turn to their
neighbors, discuss the question, and try to reason their way to the correct answer. After some time
for this paired discussion, the instructor asks the students to answer the question a second time
using the classroom response system, perhaps sticking with their original answer, perhaps changing
their response based on their paired discussion. After this, the instructor displays the aggregated
responses to the students and leads a classwide discussion of the question, inviting students to share
their reasons for the answers they chose and guiding the conversation toward the correct answer—
and the correct reasoning.
c) If most students answer the question incorrectly, having students pair up and discuss the question is
not likely to be useful. In this case, the instructor may ask a question to probe student understanding and provide clarification or pose some scaffolding questions or examples to help students bridge ideas and reach a consensus.
Most instructors who practice peer instruction aim for that middle outcome by crafting questions designed to be hard, but not too hard. It is in this “Goldilocks zone” of “just right” questions, neither too hard nor too easy, that peer instruction shines.
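Any commercial response system performs this aggregation automatically, but the logic of steps 3 and 4 is simple enough to sketch. The following Python sketch is our own illustration, not part of any particular clicker product; the vote list, the 80% cutoff from step 4a), and the 30% cutoff for step 4c) are assumptions chosen for the example.

from collections import Counter

def summarize(responses, correct, high=0.80, low=0.30):
    """Tally responses and suggest a peer-instruction branch:
    mostly correct -> move on (4a); split vote -> pair discussion (4b);
    mostly incorrect -> probe and scaffold (4c)."""
    counts = Counter(responses)
    n = len(responses)
    for choice in sorted(counts):  # crude text bar chart
        print(f"{choice}: {'#' * counts[choice]} ({counts[choice] / n:.0%})")
    frac_correct = counts[correct] / n
    if frac_correct >= high:
        return "mostly correct: discuss briefly, then move on (4a)"
    if frac_correct <= low:
        return "mostly incorrect: probe understanding and scaffold (4c)"
    return "split vote: pair up, discuss, and revote (4b)"

# Hypothetical class of 30 votes on a question whose correct answer is C.
votes = list("DDCCDCADCBDDCCDDCCDDCCDDCDCDDC")
print(summarize(votes, correct="C"))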
Classroom vignette: Peer instruction and classroom response systems
At the beginning of a unit on probability in an undergraduate statistics course, Dr. Sun poses the following
question to her students: Your sister calls to say she’s having twins. Which of the following is more likely? (Assume she’s not having identical twins.)
A. Twin boys
B. Twin girls
C. One boy and one girl
D. All three are equally likely.
Dr. Sun asks her students to get out their phones, to log on to the classroom response system, and to
respond individually to this question. “No talking right now,” she says, encouraging the students to think
for themselves and commit to an answer. After about 30 seconds, the response system says that 27 of the 30
students in the class have submitted their answers, so Dr. Sun says, “Last call!” e remaining three students
respond quickly, and she looks over the bar chart generated by the system. Fourteen of the students (47%)
have answered D (“All three are equally likely”), while 12 of the students (40%) have answered C (“One boy
and one girl”), the correct answer. e remaining students are split between choices A and B.
She then displays the bar chart to the students on the classroom projector. “There is not a consensus here,”
she says and asks the students to pair up and discuss the question. “Talk about the reasoning you used in
arriving at your answer. Even if you both agree on the answer, you may have different approaches that provide insight into the problem.” The room buzzes as students start arguing about the answer, some sharing
their personal experiences with twins, some drawing genetic diagrams they saw in their biology courses,
others arguing over whether boy babies are more common than girl babies. After a couple of minutes, Dr. Sun asks students to answer the question again using their phones. “You can stick with your original choice,” she says, “or you can change your choice based on your discussion.” The student responses come in quickly, and she displays an updated bar graph. This time, 22 of the students (73%) have the right answer: C, one boy
and one girl. Only seven students (23%) still answered D. One student stuck with “twin boys,” in spite of the
popularity of answers C and D.
Given the results, Dr. Sun asks for a volunteer who changed their mind from D to C to explain their
reasoning to the class. In an effort to encourage more students to engage, she calls on a student who is usually quiet in class. That student explains that she thought that all three outcomes were equally likely, but
her partner convinced her that there were actually four equally likely outcomes: boy-boy, boy-girl, girl-boy,
and girl-girl. And that meant that there was a 50% chance of one boy and one girl. Dr. Sun knows this ex-
planation is a correct one, but she does not want to confirm the right answer for the students too soon. She wants to make sure they all have a chance to reason it out themselves. Thus, she asks for another volunteer,
this time someone willing to explain their reasoning for “all three are equally likely.” Another student speaks
up, saying that having a boy or a girl is 50/50, so all three combos are equally likely. “If we had identical twins
in the mix, then a boy-boy or girl-girl combo would be more likely,” the student says, “but we don’t, so the right answer is D.”
At this point, Dr. Sun is confident that several students are still unsure, so she decides to run a simulation. She asks each student to get out a coin (or a student ID card), assign one side to “boy” and one side to “girl,” then flip the coin (or ID card) twice, to simulate having twins. Again the classroom starts buzzing, as students flip coins and other items at their seats. Then she asks the students to report their results using the classroom response system: A for twin boys, B for twin girls, C for one of each. This time, she lets the bar chart update itself on the big screen in real time as the data come in. After a few seconds of moving bars, the chart settles down on the final distribution: 23% twin boys, 30% twin girls, and 47% one of each. “Given this admittedly small sample size,” she says, “would you believe that not all three are equally likely?”
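Dr. Sun’s two-flip experiment scales easily from 30 students to many thousands of trials. The following Python sketch is our own illustration (the trial count is an arbitrary choice), not part of the vignette:

import random

def simulate_twins(trials=100_000):
    """Simulate 'twins' as two independent fair coin flips per trial
    and tally twin boys, twin girls, and one of each."""
    counts = {"twin boys": 0, "twin girls": 0, "one of each": 0}
    for _ in range(trials):
        pair = random.choice("BG") + random.choice("BG")
        if pair == "BB":
            counts["twin boys"] += 1
        elif pair == "GG":
            counts["twin girls"] += 1
        else:  # "BG" or "GB"
            counts["one of each"] += 1
    for outcome, tally in counts.items():
        print(f"{outcome}: {tally / trials:.1%}")

simulate_twins()
# Typical output: twin boys ~25%, twin girls ~25%, one of each ~50%.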
To nish the discussion Dr. Sun draws a quick tree diagram on the chalkboard, noting the two simplify-
ing assumptions she is making with this question, that there are only two sexes (boy and girl) and that they
occur with equal probability. Twin #1 is either a boy or a girl, with equal odds, and Twin #2 is either a boy or
a girl, again with equal odds. The tree diagram illustrates the four equally likely outcomes mentioned by the first student volunteer, and Dr. Sun confirms that boy-girl will occur 50% of the time, on average.
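In symbols (our notation), the tree diagram encodes the following computation over the four equally likely birth orders BB, BG, GB, GG:

\[
P(\text{one boy and one girl}) = P(BG) + P(GB) = \tfrac{1}{4} + \tfrac{1}{4} = \tfrac{1}{2},
\qquad
P(BB) = P(GG) = \tfrac{1}{4}.
\]

The event “one of each” groups two outcomes, which is exactly why it is twice as likely as either single outcome.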
“Why did I have you spend 20 minutes of class time working through this example?” she asks, rhetori-
cally. “Because a lot of people have the misconception that with probability, all outcomes are equally likely.
That’s not always the case, especially when we group multiple outcomes into what we call events, as we did here. We needed to confront this misconception directly if you’re going to make sense of this probability unit we’ve begun. We humans tend to have a lot of misconceptions about probability, so we’ll have to model the problems in this unit very carefully as we go.”
Discussion. ere are a number of reasons discussed elsewhere in this guide that indicate why collabora-
tive learning activities, such as peer instruction, are eective at promoting student learning. Below we oer
reasons as to why and how technology such as classroom response systems can support peer instruction.
Formative Assessment. In contrast to the summative assessment performed at the end of a course for eval-
uative purposes, formative assessment involves making student learning visible during the learning process,
so that students can receive feedback on their learning. A classroom response system is a tool for formative
assessment that does not depend on the class size. All students are invited to respond to questions and to
commit to their answers, and thus all students have the chance to find out if their answers are correct. This type of feedback is critical to the learning process, and a classroom response system helps it occur regularly
during class.
Agile Teaching. Formative assessment is not only useful to the student; it is also useful to the instructor. Bar
chart summaries of student responses allow instructors to determine if students have mastered a particular
concept, whether they need more help with a topic, and even which misconceptions are most common in a
particular class of students. All of this information, gathered “on the fly” during class, allows an instructor to practice what is sometimes called agile teaching—making instructional choices in the moment that respond
directly to student learning needs.
Times for Telling. Students are better able to understand and remember an explanation if they are ready to
hear it. By asking all students to commit to their answers to a question and by showing students a distri-
bution of answers that makes clear the question is a hard one, an instructor can use a classroom response
system to create a “time for telling” (Schwartz and Bransford, 1998). Cognitively, students are ready to un-
derstand an explanation because they have already brought to mind their prior knowledge and experiences
as well as conjectured about the question. Students want to know the answer because they have committed
to their answers, and the bar graph shows them their peers disagree about the question. This can be a pow-
erful process for surfacing and confronting student misconceptions.
All of this is possible without technology, but a classroom response system makes certain aspects of peer
instruction easier, such as asking all students to respond individually before seeing their peers’ responses.
Furthermore, it allows students to discern whether there is agreement or disagreement in the responses.
Practical Tips
The classroom vignette above featured a question designed to surface a misconception about a topic. Mis-
conception questions work very well with peer instruction, but other types of questions are also possible, in-
cluding application questions that require students to apply a computational algorithm to a given example or
to choose the correct expression to set up a computation given a word problem. Also useful in some courses
are ratio reasoning questions that ask students to determine if the value of a variable will increase, decrease,
or stay the same if some other variable changes. See the MathVote website, mathquest.carroll.edu/, for question
banks for commonly taught mathematics courses.
Some instructors follow the peer-instruction sequence, but skip the individual vote phase, preferring to
have students move directly to the paired discussion. is can be useful for hard questions, where students
are likely to need some discussion before they can formulate an answer, or when time is short. But it is usu-
ally better to have students answer individually before the peer instruction phase, so that every student has
something to share during paired discussions.
When the vote is split among two or three answer choices, showing students the results can help motivate
discussion, as the students see that the question is a hard one. However, if one answer is more popular than
other answers (for instance, a 10/60/10/10/10 distribution), showing students such a bar graph can inhibit
discussion, as students assume the popular answer is the correct one. In these cases, it usually is better not
to show students the results, but peer discussion can still be useful, given how many students are unsure of
the correct answer.
The examples above all assume the use of multiple-choice questions. Historically, “clickers” only allowed
for the submission of answers to multiple-choice questions, so instructors practicing peer instruction were
limited to such questions. In today’s classrooms, BYOD (bring your own device) systems get around this
limitation, making free-response questions more practical. However, aggregating results from free-response
questions is still challenging. A simple bar graph works well for multiple-choice questions, but it does not work for short-answer questions. Sometimes a word cloud can help instructors see patterns in student responses, but word clouds are not as useful for numeric response questions. Scrolling through a complete
list of student responses is sometimes useful, especially in smaller classes, but in general, free-response
questions are not as easy to use with peer instruction in a mathematics class as multiple-choice questions.
CP.2. Selecting appropriate mathematical tasks
Selecting appropriate mathematical tasks is critical for fostering student engagement. The tasks chosen provide the conduit for meaningful discussion and mathematical reasoning. But how does an instructor know when a mathematical task is appropriate? There does not appear to be a single notion of what constitutes appropriateness in the research literature or in practice. Rather, appropriateness appears to be determined by a combination of factors. The successful selection of an appropriate mathematical task seems to involve two related ideas.
1. The intrinsic appropriateness of the task, by which we mean aspects of the task itself that lend themselves to effective learning.
2. The extrinsic appropriateness of the task, by which we mean external factors involving the learning environment that affect how well students will learn from the task.
We now look at each of these ideas in turn.
CP.2.1. Intrinsic appropriateness: What makes a mathematical task appropriate?
By intrinsic appropriateness, we mean aspects inherent in the task itself that affect how well suited it is for the moment. These include:
The degree to which the task is aligned with the learning objectives of the lesson and the course: A task that is in clear harmony with the learning objectives for the lesson and/or the course is more appropriate than one that is not. For example, if a stated outcome of the lesson is the ability to find the roots of a second-degree polynomial equation, then a task requiring students to do this is an appropriate one; a task that requires students to find the roots of a linear or cubic equation is not. On a larger scale, if a high-level learning objective of the course is to use technology effectively to solve mathematical problems, then it would be appropriate to include technology as a means to solving a task. If the use of technology is not one of the course objectives, then having students perform a task with technology may not be
appropriate.
The mathematical expertise of the learners: If an instructor selects a task for which students have not had adequate preparation, then clearly the task is not appropriate. For example, in a calculus class a task that involves the chain rule is not appropriate if the students have not yet learned the chain rule or if students have demonstrated issues with the concept of a composite function. Conversely, if a task is significantly below the expertise level of students, then it may not be appropriate because students may not be motivated enough
to engage with it.
Student readiness coming into the task: This is a different idea than mathematical expertise, although the two are related. Whereas expertise refers to the knowledge that a student has on a subject, readiness refers to
whether that knowledge is activated and ready for use. For example, a student might have expertise in
solving second-degree polynomial equations using the quadratic formula, but may not be able to apply the
quadratic formula in novel situations without guidance from an expert, such as the instructor. Instructors
need to be knowledgeable about students’ prior knowledge.
e degree to which the task satises students’ basic cognitive needs and provokes intrinsic motivation:
is is similar to the rst point. A task that oers students no reason to be interested is likely not an appro-
priate choice. A more appropriate choice might be a task that has similar actions to perform (e.g., factoring
a polynomial) but which is somehow connected to students’ basic cognitive needs for competence, autono-
my, and relatedness. Self-determination theory (discussed in section CP.2.3.) states that students’ intrinsic
motivation to complete a task requires a conuence of meeting these three needs.
The level and nature of cognitive load that the task places on the students: A task that is needlessly complicated will strain students’ working memory and could lead to a failure to engage with it. On the other hand, a task that addresses a learning objective with a minimum of extraneous load (extra work) and that adds germane load to the basic concept will be better suited for student learning. We discuss cognitive load theory more in section CP.2.3.
CP.2.2. Extrinsic appropriateness
Whether a given task is appropriate for a given situation also depends on factors not inherent in the task
itself, such as the following:
Student motivation coming into the task: A mathematical task may be at an appropriate level of difficulty and in alignment with instructional goals, and yet students may fail to engage with it in a meaningful way
because of their levels of motivation. Apart from whether a task is inherently interesting or meaningful to
the student, the deployment of a task will be more successful—more appropriate for the moment—if stu-
dents recognize why the task is interesting.
The degree to which the physical space and makeup of the learning environment is suited to the task: For example, a task that involves students self-selecting into small groups of three or four may be entirely appropriate for a classroom in which the furniture can be rearranged easily, and yet not as appropriate for a class with fixed stadium-style seating. A task that involves students putting work on a chalkboard or white-
board may not be appropriate for a class meeting in a space in which there are no such implements. A task
that involves peer instruction, such as using classroom response devices to probe student misconceptions
and provoke discussion, may be more appropriate for a large lecture course than for a tutorial with fewer
than ten students.
The degree to which the mode of instructional delivery is suited to the task: Here we are mainly thinking of the distinction between face-to-face courses, online courses, and hybrid courses. While definitions of these terms vary, we informally refer to face-to-face courses as those in which the students in the class meet together at fixed times in a common, fixed location for all the main class activities. Online courses are those in which the main class activities do not occur in a fixed space but rather in a common online locale such as a course management system. We can make a further distinction between synchronous online courses, in which students meet online but at fixed times, and asynchronous online courses, in which there are neither fixed times nor fixed physical locations for course meetings. Hybrid courses are those that combine face-to-face and online components. Some tasks that might be appropriate for face-to-face courses would be difficult or impossible to accomplish in an online course. For example, exit tickets do not make sense for
asynchronous online courses in which there are no meetings at a common time. Conversely, some tasks that
leverage the online environment, such as having students work out solutions on a discussion board, might
be less appropriate for a face-to-face course, where, for example, having students work out solutions in class
might be more appropriate.
CP.2.3. Theoretical frameworks for understanding appropriateness
The above considerations are rooted in at least three major theoretical perspectives on human learning: Vygotsky’s zone of proximal development, cognitive load theory, and self-determination theory. Although there is much literature on each of these topics, we briefly describe each perspective; the reader can find more in the Design Practices chapter.
Zone of proximal development
Vygotsky (1978) developed the notion of the zone of proximal development (ZPD), which can be described
roughly as the space between what a learner can do without help and what the learner cannot do even with
help. at is, the ZPD refers to the space of tasks that the learner can do with guidance. ZPD is “the domain
of transitions that are accessible to the [learner]” (p. 211). Vygotsky claimed that instruction need not wait
for full readiness, but rather instruction can provide a motivation for extending a learner’s intellectual reach:
Instruction is not limited to trailing aer development or moving stride for stride along with it. It can
move ahead of development, pushing it further and eliciting new formations (p. 198).
Vygotsky sees classroom instruction in its ideal state as something that “forces [the learner] to rise above
himself” (p. 213). By leveraging Vygotsky’s idea, we can begin to think about how this relates to appropri-
ateness of a mathematical task. ZPD concepts suggest that a task is maximally appropriate for a particular
learner when it is located in that learner’s ZPD—that is, when it is a task that the student can do but only
with guidance. Tasks that a student can do without guidance may be less appropriate, because there is no
purpose for the task other than to practice or build confidence. Similarly, a task that the student cannot do
with any amount of guidance has no purpose. Note that this is a local criterion in the sense that it applies to
one student at a time. When selecting a task for an entire class, the instructor must maintain a sense of each
learner’s ZPD at the time, and judge whether a task is likely to hit the “center of mass” of all those ZPDs.
The concept of the ZPD informs the aspects of appropriateness listed above that involve mathematical
expertise and student readiness, among others.
Cognitive load theory
Sweller’s (1998) cognitive load theory (CLT) proposes that cognitive tasks carry with them three different kinds of load on the learner’s working memory:
• Intrinsic load, which refers to the irreducible difficulty that the task itself carries.
• Extrinsic load, which refers to extra difficulty that is placed on a task due to the way it is posed or delivered.
• Germane load, which refers to difficulty that helps learning by leading to the production of schemas,
or organized patterns of knowledge.
According to CLT, learning is most effective when the task is best aligned with “human cognitive architecture” as described by Sweller (1998). Thus CLT gives a framework for judging, at least in part, the appropriateness of a mathematical task. Namely, a mathematical task that minimizes extrinsic load while including germane load would be more appropriate for learning than would the same task with heightened extrinsic load.
Self-determination theory
Ryan (2000) describes self-determination theory (SDT) as pertaining to concepts of human motivation.
While motivation to perform a mathematical task can be affected by one’s expertise and readiness, motivation in return influences one’s focus and the level of effort expended on those tasks. SDT proposes that all learners possess three basic cognitive needs:
• The need for competence, which refers to expertise in performing tasks in a given context.
• The need for autonomy, which refers to the ability to locate the source of one’s competence within oneself.
• The need for relatedness, which refers to a sense of belonging or association with a social group in a given context.
SDT also makes a distinction between intrinsic motivation and extrinsic motivation. Intrinsic motivation
for a task refers to an internal, natural, inherent interest in the task, whereas extrinsic motivation refers to interest that is driven by forces outside the learner. For our purposes, the relevant implications of SDT for the selection or construction of appropriate mathematical tasks are:
• A task that increases intrinsic motivation is more appropriate for students’ long-term intellectual de-
velopment and short-term learning goals in a course than one that relies on extrinsic motivation.
• Intrinsic motivation is heightened when a task provides a feeling of competence, but only when ac-
companied by a sense of autonomy.
• At the same time, intrinsic motivation is more likely to flourish in social contexts that enhance a feeling of security and relatedness (Niemiec and Ryan, 2009).
Hence, for the purposes of selecting a mathematical task, an instructor should attend to whether a task pro-
motes both competence and autonomy, as well as to whether the social environment of the class promotes
relatedness.
CP.2.4. How to select an appropriate mathematical task
There is no formula for selecting an appropriate mathematical task, but the above considerations can guide
instructors’ choices. For example, an instructor could ask the following questions:
• Do I have clearly stated and concrete learning objectives defined for the lesson in which the task is going to appear, and do students have access to those objectives?
• Does the task align with my learning objectives?
• Do I have actionable information, based on formative assessment or surveys, about my students’ mo-
tivations, attitudes, and mathematical readiness for the task?
• Based on that information, does the task meet students at their level of expertise (not too easy, not too
hard) and at their level of readiness (they are prepared to do the task apart from having the right level
of expertise) and motivation (students have a reason to perform the task apart from extrinsic rewards
and punishments)?
• Is the task well-constructed in terms of building students’ intellectual development, competence, and
autonomy? Does it leverage the social context of the class to promote relatedness?
• Is the task suitable for the physical environment of the class meeting?
• Is the task suitable for the mode of instruction (face-to-face vs. online)?
It seems unlikely that the answer to all of these questions will be “yes” for any single task, but at least these guidelines can help facilitate good design choices. The following section provides insights into choosing meaningful group-worthy tasks.
CP.2.5. Choosing meaningful group-worthy tasks
Stein, Grover, and Henningsen (1996) define a mathematical task as a set of problems or a single complex
problem that focuses student attention on a particular mathematical idea. In this section we elaborate on
group-worthy tasks, which provide opportunities for students to develop deeper mathematical meaning
for ideas, model and apply their knowledge to new situations, make connections across representations and
ideas, and engage in higher-level reasoning where students discuss assumptions, general reasoning strate-
gies, and conclusions. In her chapter on “Crafting Group-worthy Learning Tasks,” Lotan (2014, pp. 85–97) outlines a number of characteristics of academically challenging, intellectual, and rigorous tasks:
• They are open-ended, productively uncertain, and require complex problem solving.
• They provide opportunities for students to use multiple intellectual abilities to access the task and to demonstrate intellectual competence.
• They address discipline-based, intellectually important content.
• They require positive interdependence and individual accountability.
Another way to think about characteristics of group-worthy tasks is in terms of the cognitive demand re-
quired. Stein et al. (1996) distinguish between low-level cognitive demand and high-level cognitive demand
as shown in Table 1. Group-worthy tasks are characterized by high-level cognitive demand. One thing to notice in Table 1 is that high-level cognitive demand tasks can encompass both deep understanding of procedures and what may be thought of as relational or conceptual understanding. We find this useful because it moves beyond superficial arguments that often pit procedural knowledge against conceptual knowledge. The important thing, as Lotan (2014, pp. 85–97) points out, is that tasks require complex problem-solving skills, use varied intellectual abilities, address significant mathematics, and require peer-to-peer interaction.
Before providing examples of group-worthy tasks, we quote Hsu, Kysh, and Resek’s (2007, pp. 7–8) depiction of classroom settings where such tasks are used.
• Students are placed into groups that are purposefully not homogenous. We commonly group stu-
dents using some public random process, such as counting off, so it is clear there is no pre-grouping
by perceived strength. For instructors who know the strengths and weaknesses of their students, we
encourage purposeful grouping where students work collaboratively with students who have similar
mathematical abilities.
• All students are given the same initial problem to work on in their non-homogenous groups. Groups
who think they have nished early are asked to consider alternative methods of solution, further gen-
eralization, or other extension questions.
• Groups are expected to be responsible for the respectful learning of all members.
• Students need to communicate their reasoning so others can understand and build on it. Usually stu-
dents are expected to share their work with the class, either in informal whole class discussion or as
formal presentations, which sometimes include prepared overheads or posters.
• Students are expected to struggle with the problem and to negotiate and argue their different ideas.
• Good mathematical argument and explanation are emphasized as goals and are necessary to the re-
porting process.
• Creative approaches are encouraged and analyzed, even if they don’t lead directly to a solution.
We next analyze a group-worthy, cognitively demanding task and discuss how the task can provoke math-
ematical curiosity, struggle, discussion about mathematics, and insight into important ideas. This task is part of a research-based inquiry-oriented curriculum developed by Rasmussen and colleagues.
Classroom vignette: The sh.net task
is task can be used as a capstone for students’ study of rst order autonomous dierential equations (DEs).
e task presents an opportunity for students to critique given mathematial models, use and connect multi-
Table 1. Low- and high-level cognitive demands (Stein et al., 1996).

Low-level cognitive demands

Memorization tasks
• Involve either producing previously learned facts, rules, formulae, or definitions, or committing facts, rules, formulae, or definitions to memory.
• Cannot be solved using procedures because a procedure does not exist or because the time frame in which the task is being completed is too short to use a procedure.
• Are not ambiguous—such tasks involve exact reproduction of previously seen content, and what is to be reproduced is clearly and directly stated.
• Have no connection to the concepts or meaning that underlie the facts, rules, formulae, or definitions being learned or reproduced.

Procedures without connections tasks
• Are algorithmic. Use of the procedure is either specifically called for or its use is evident based on prior instruction, experience, or placement of the task.
• Require limited cognitive demand for successful completion. There is little ambiguity about what needs to be done and how to do it.
• Have no connection to the concepts or meaning that underlie the procedure being used.
• Are focused on producing correct answers rather than developing mathematical understanding.
• Require no explanations, or require only explanations that focus solely on describing the procedure that was used.

High-level cognitive demands

Procedures with connections tasks
• Focus students’ attention on the use of procedures for the purpose of developing deeper levels of understanding of mathematical concepts and ideas.
• Suggest pathways to follow (explicitly or implicitly) that are broad general procedures that have close connections to underlying conceptual ideas, as opposed to narrow algorithms that are opaque with respect to underlying concepts.
• Usually are represented in multiple ways (e.g., visual diagrams, manipulatives, symbols, problem situations). Making connections among multiple representations helps to develop meaning.
• Require some degree of cognitive effort. Although general procedures may be followed, they cannot be followed mindlessly. Students need to engage with the conceptual ideas that underlie the procedures in order to successfully complete the task and develop understanding.

Doing mathematics tasks
• Require complex and non-algorithmic thinking (i.e., there is not a predictable, well-rehearsed approach or pathway explicitly suggested by the task instructions or a worked-out example).
• Require students to explore and to understand the nature of mathematical concepts, processes, or relationships.
• Demand self-monitoring or self-regulation of one’s own cognitive processes.
• Require students to access relevant knowledge in working through the task.
• Require students to analyze the task and actively examine task constraints that may limit possible solution strategies and solutions.
• Require considerable cognitive effort and may involve some level of anxiety for the student due to the unpredictable nature of the solution process required.
It also provides an opportunity to reinvent a bifurcation diagram, although doing so is not a necessary component of a complete solution. This task is intended for classrooms that allow students to work together in small groups and present their solutions to others in the class. The task, referred to as the “fish.net task,” has three main parts. The first two parts are intended to be completed in class. The third part is intended to be started in class, but then significant out-of-class work is expected. Prior to implementing this task, students should have studied various graphical, analytical, and numerical techniques for first order DEs in general and for autonomous DEs in particular. There is no expectation that students are familiar with bifurcation diagrams. In fact, it is best if they have never seen such diagrams.
The task: A mathematician at a fish hatchery has been using the differential equation dP/dt = 2P(1 − P/25) as a model for predicting the number of fish that a hatchery can expect to find in its pond. Use a graph of dP/dt vs. P, a phase line, and a slope field to analyze what this differential equation predicts for future fish populations for a range of initial conditions. Present all three of these representations and describe in a few sentences how to interpret them.
Recently, the hatchery was bought out by fish.net, and the new owners are planning to allow the public to catch fish at the hatchery (for a fee, of course). This means that the previous differential equation used to predict future fish populations needs to be modified to reflect this new plan. For the sake of simplicity, assume that this new plan can be taken into consideration by including a constant annual harvesting rate k in the previous differential equation. Which of the three modified differential equations makes the most sense to you, and why?
(a) dP/dt = 2P(1 − P/25) − kP
(b) dP/dt = 2(P − k)(1 − P/25)
(c) dP/dt = 2P(1 − P/25) − k
Using the modied dierential equation agreed upon from the previous problem, prepare a one page
report for the new owners that illustrates the implications that various choices of will have on future sh
populations. Your report may include one or more graphical representations but must synthesize your
analysis of the eect of dierent values in a concise way.
e rst part of the sh.net task provides an opportunity for students to use and connect multiple repre-
sentations and to interpret how these various representations tell the story of how the dierential equation
predicts future population values for dierent initial conditions. Making connections is a characteristic of
cognitively demanding, group-worthy tasks. Together the three parts of the task have a “low oor” and a
“high ceiling”, meaning that all students should be able to make signicant and meaningful progress on the
rst task, while the third task provides an opportunity for students to excel and essentially reinvent a bifur-
cation diagram. e reinvention of new mathematics is another characteristic of a cognitively demanding,
group-worthy task.
In the second part of the task, students productively struggle to determine which modification to the DE makes the most sense and present their reasoning to the class. Typically, the class agrees that option (c) makes the most sense, but getting to that point requires struggle, negotiation, and sharing of reasoning. Having settled on option (c) for the modified differential equation, the students move on to the third part of the task, where they analyze how a change in the parameter k affects the space of solutions that predict how the fish population will unfold over time for different initial conditions. Students work in small groups during class and work with their group outside of class to finish the analysis and prepare a one-page summary report. This combination of in-class and out-of-class group work promotes positive interdependence and individual accountability, which is another important characteristic of group-worthy tasks.
The rationale for constraining the final report to one page is to encourage students to be creative in finding ways to represent their analysis of multiple k values in a concise manner. Figure 6 shows two examples of summary reports from different groups of students.
In the rst report students summarize their analysis using a table and paragraph detailing how to interpret
the table. e second groups report made use of multiple, linked graphical representations to illustrate the
eect of changing the parameter on solutions to the dierential equations. Having dierent types of reports
provides students an opportunity to make connections across reports. For example, students might discuss
how the columns in the table presented by the rst group relate to the graphs presented by the second group.
The next example illustrates another creative way in which students synthesized their analysis, this time resulting in a bifurcation diagram. In this presentation the students leveraged an additional representation in their analysis by actually finding an equation for the equilibrium solutions as a function of the parameter k. Determining the usefulness of the equation, carrying out the algebraic steps, and graphing the result can be non-trivial for students and represents important mathematical connections and skills. Such mathematical work reflects aspects of cognitively demanding tasks that highlight “procedures with connections” (Stein et al., 1996).
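For reference, the algebra behind such an equation is short. Assuming the class settled on option (c), setting dP/dt = 0 and solving for the equilibria as a function of k gives (our reconstruction of the computation, not the students’ own write-up):

\[
2P\left(1 - \frac{P}{25}\right) - k = 0
\;\Longrightarrow\;
P^2 - 25P + \frac{25k}{2} = 0
\;\Longrightarrow\;
P = \frac{25 \pm 5\sqrt{25 - 2k}}{2}.
\]

The two equilibria merge at k = 12.5 and disappear for larger harvesting rates, which is precisely the qualitative change that a graph of P vs. k, the bifurcation diagram, makes visible.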
The reader might be surprised to learn that the first three group reports all came from the same class. In the midst of the third group presenting their analysis, students figured out that they could “drop a phase line” on the graph of P vs. k, thereby making further connections across group reports. A screenshot of a student having done this is shown in Figure 7(b). This third report provides additional opportunities for students to make connections across the different small group reports. Moreover, the student-generated graph of P vs. k provides an opportunity for the instructor to label student work with the terms bifurcation and bifurcation diagram, not only formalizing the mathematical work of students but also providing students with a sense of mathematical ownership and significant accomplishment.
Finally, in the fourth group report (from a different class), students made creative use of Excel, in which the rows are P values, the columns are k values, and the cells contain dP/dt values. Here again we see students rising to the challenge of producing creative and original analyses that address significant and deep mathematical ideas with the aid of technology.
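The same table is easy to rebuild outside Excel. The numpy sketch below is our own illustration (the particular grids of P and k values are arbitrary choices):

import numpy as np

P = np.arange(0.0, 30.0, 5.0)   # rows: population values 0, 5, ..., 25
k = np.arange(0.0, 15.0, 2.5)   # columns: harvesting rates 0, 2.5, ..., 12.5
# Each cell holds dP/dt = 2P(1 - P/25) - k via numpy broadcasting.
grid = 2 * P[:, None] * (1 - P[:, None] / 25) - k[None, :]
print(grid.round(2))
# A sign change down a column locates an equilibrium for that value of k.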
Reecting on the nature of the sh.net task, we revisit Lotans (2014, pp. 85–97) characteristics of
group-worthy tasks and nd that this task satises all four characteristics.
Figure 6. Group 1 and Group 2 fish.net summary reports.
Figure 7. (a) fish.net report from group 3. (b) Result of a student “dropping” a phase line onto the P vs. k graph.
Figure 8. Use of Excel to represent the analysis of the fish.net task.
• Part 3 is open-ended and requires complex problem solving.
• All three parts provide opportunities for students to use multiple intellectual abilities to access the task.
• The task as a whole addresses discipline-based, intellectually important content such as bifurcation theory and understanding model parameters.
• The group reports require positive interdependence and individual accountability.
To conclude, we note that once the ideas and terminology of bifurcation and bifurcation diagram have
been appropriately connected to student work and defined, students are in a position to follow up on the fish.net task with problems that have no application context and that serve to illustrate different types of bifurcations. Below are some suitable follow-up tasks. The possibility of such follow-up tasks is another indicator that the fish.net task contains important mathematics and has a high ceiling.
1. For each of the following, develop a report that illustrates (with a suitable graph or graphs) and describes (in words) the way in which the solutions change as the value of r changes. Identify the precise value(s) of r for which there is either a change in the number of equilibrium solution(s) or a change in the type of equilibrium solution(s).
a) dy/dt = (y − 3)² + r
b) dy/dt = y² − ry + 1
c) dy/dt = ry + y³
d) dy/dt = 6y − 4y² + r
2. For part a) in problem 1, sketch a graph of the equilibrium solutions as r varies. Such a graph is referred to as a “bifurcation diagram.”
3. For part b) in problem 1, sketch a graph of the equilibrium solutions as r varies. Such a graph is referred to as a “bifurcation diagram.”
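As a check on problem 1a), the equilibria satisfy (y − 3)² + r = 0, so real equilibria y = 3 ± √(−r) exist only for r ≤ 0. The short sketch below (our own illustration, assuming numpy and matplotlib) plots those branches, which is the bifurcation diagram requested in problem 2:

import numpy as np
import matplotlib.pyplot as plt

r = np.linspace(-4, 0, 200)   # equilibria exist only for r <= 0
plt.plot(r, 3 + np.sqrt(-r), label="y = 3 + sqrt(-r)")
plt.plot(r, 3 - np.sqrt(-r), label="y = 3 - sqrt(-r)")
plt.xlabel("r")
plt.ylabel("equilibrium y")
plt.title("Bifurcation diagram for dy/dt = (y - 3)^2 + r")
plt.legend()
plt.show()
# The branches meet at (r, y) = (0, 3): a saddle-node bifurcation.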
CP.2.6. Communication: Reading, writing, presenting, visualizing
In 2009, state leaders from around the United States began development of what we now know as the Com-
mon Core State Standards (CCSS; NGA and CCSSO, 2010), which include both mathematics standards and
English Language Arts (ELA) standards. These state leaders had a common interest in creating, as stated in the current CCSS documents, a rigorous set of standards that prepare all students for college and career: “State school chiefs and governors recognized the value of consistent, real-world learning goals and launched this effort to ensure all students, regardless of where they live, are graduating high school prepared for college, career, and life” (www.corestandards.org/about-the-standards/development-process/).
In their current form, the CCSS (NGA and CCSSO, 2010) for mathematics have two major parts: Content
Standards and Standards for Mathematical Practices. While the content standards might look more familiar
to most people, they were designed with learning progressions in mind, with a focus on student understand-
ing of mathematical ideas, and with attention to applications of mathematical ideas in real-world contexts.
The Standards for Mathematical Practices, however, “describe varieties of expertise that mathematics educators at all levels should seek to develop in their students.” The goal is to hold students accountable for
1. Making sense of problems and persevering in solving them.
2. Reasoning abstractly and quantitatively.
3. Constructing viable arguments and critiquing others’ reasoning.
4. Modeling with mathematics.
5. Using appropriate tools strategically.
6. Attending to precision.
7. Looking for and making use of structure.
8. Looking for and expressing regularity in repeated reasoning.
Math Practice #3 provides an excellent vision of what should be happening in college mathematics class-
rooms where profound student learning is the goal.
Construct viable arguments and critique the reasoning of others
There has long been a focus on asking students to follow procedures and to master algorithms, often without conceptual understanding. Research suggests that students learn mathematics by doing mathematics, which includes a balance between conceptual understanding, procedural fluency, and applications (Shellard and Moyer, 2002). Additionally, when procedural fluency is necessary, it is developed through and emerges
from conceptual understanding and, perhaps, applications. In other words, the teaching of mathematical
concepts and skills should be centered around problems to be solved (Checkly, 1997; Wood and Sellars,
1996; Wood and Sellars, 1997). It is important for students to develop mathematical meaning to address
the challenges we currently observe in mathematics education in the United States. If we do not focus on
mathematical meaning, “the result is teachers’ inability to teach for understanding and students’ inability to
develop personal mathematical meanings that support interest, curiosity, and future learning” (Thompson, 2013, p. 57). A recent Programme for International Student Assessment report (Piacentini and Monticone, 2016) from the Organisation for Economic Cooperation and Development (OECD) confirms the importance of students developing foundational understanding of mathematical ideas. Andreas Schleicher, OECD Director of Education and Skills, summarized the recent results:
Our analysis is [that] when students have really understood the foundations, they can extrapolate.
They can apply that knowledge in another context. However, if they only teach students tips and tricks,
how to solve small everyday problems, they know how to solve those problems, but they’re not good at
transferring that knowledge to another context (Barshay, 2016).
As in previous discussions, constructing viable arguments entails students engaging in problem-solving tasks (a mathematical problem or a real-world context) and being asked to articulate their reasoning as they demonstrate their solution to the problem. Students should come to interpret the word “viable” as “possible,” so that during presentations students recognize that they are considering a possible solution that requires analysis in order to determine its mathematical worthiness. Students may work independently or collaboratively to create their viable argument, and the argument can be delivered in written or spoken form. To
demonstrate the idea of making a viable argument, consider the following problem.
The Playground Problem: Students at a school are told that the principal plans to double the length of each side of a square playground. The students are pleasantly surprised to discover the impact this has on the area of the playground.
Solution #1—Algebraic response
In the solution shown in Figure 9, the students’ focus was on making algebraic representations of the necessary quantities. Through this symbolic representation of the diagram of the playground, a student can make a viable argument about the size of the playground after doubling the length of the sides. The class can then critique the reasoning of this argument by pointing out aspects of the solution that make sense, aspects that one might question, or aspects that one might improve or edit with appropriate discussion.
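In symbols (our notation, with s the original side length and A = s² the original area), the algebraic argument amounts to one line:

\[
A_{\text{new}} = (2s)^2 = 4s^2 = 4A,
\]

so doubling each side quadruples the area.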
Solution #2—Visual response
In the solution shown in Figure 10, we see evidence of student reasoning about the area of the playground after doubling the length of the sides. Again, imagine a student presenting their reasoning with this solution on display. Certainly, there are ways to improve the argument (more labels), but there are also very powerful
images that capture the essence of the solution, as we can see the four original squares in the new playground.
The second component of the third mathematical practice is critiquing other students’ work, which entails listening to, evaluating, and providing feedback on the reasoning of others. This can be observed in many ways, some of which we describe below.
Sometimes, students work in groups of 2–4 to solve a problem, develop a mathematical idea, or create and defend a mathematical conjecture. In this small group context students often are afforded the opportunity to listen to one another and critique the reasoning of others. This happens naturally as students
share ideas, articulate solution paths, or back up any claims made. Critiquing the reasoning of others can
also be observed in a whole group setting. A skilled instructor may ask any number of students to present
their work and reasoning to the entire class. Students can be selected to share based on the fact that their
work
• Oers a unique approach.
• Provides context to discuss an important idea.
• Showcases outstanding work.
• Demonstrates solutions focusing on dierent mathematical representations.
Oen, people view the idea of a critique as something meant to expose an error, but in the mathematics
classroom it
• Conrms the work of another student: Rather than simply saying, “at makes sense, students
should articulate clearly what specically makes sense to them.
• Questions the work of another student: It can be the case that a member of the audience is unclear
about some portion of the presentation and should inquire for the purpose of gaining clarity.
Figure 9. Algebraic response to the Playground Problem.
Figure 10. Visual response to the Playground Problem.
• Disagrees with the work of another student: If a student thinks that the presentation is in error,
they should respectfully discuss this with the presenting student for the purpose of unpacking the
truth.
Stein and Smith (2011) provide a framework for instructors to improve classroom discourse by focus-
ing on student development of understanding mathematical ideas. Prior to implementing a well-designed
classroom task, instructors spend time anticipating student responses by working through the task inde-
pendently and thinking about multiple solution strategies. For each strategy instructors should consider
how they will respond to what students produce during the lesson. During the lesson instructors monitor
students’ work on, and engagement with, the task. Instructors tour the classroom, listen to student thinking,
ask probing questions, and encourage student thinking and sharing. As instructors monitor, they watch for
opportunities to select student work (individual or group) to be presented to the entire class. Student work
is selected based on the instructional and mathematical goals of the lesson. Interesting work is selected to
show a variety of solution strategies, common mistakes, or unique ways of thinking. Selected students com-
municate their thinking (viable argument) and allow others to critique their reasoning. Prior to presenting,
the instructor must sequence students’ responses. The sequence is well thought out and intentional, designed to tell a story or perhaps to move from less sophisticated to more sophisticated reasoning. Sometimes a sequence is designed to show common mistakes first and mathematically sound solutions later (or vice versa), depending
on the learning goals at the moment. Finally, instructors work to help students make connections between
the dierent student responses as appropriate. Sometimes, solution strategies are mathematically equivalent
and students should be able to recognize and articulate these equivalencies. Sometimes, solution strategies
demonstrate a variety of mathematical representations and students should make the connections between
these representations.
Additionally, instructors must think about the nature of the selected tasks that will promote positive class-
room discourse. If the mathematical focus of the task is following a prescribed procedure for the purpose
of producing right answers, the classroom discourse could be limited and shallow. On the other hand, if the
mathematical focus of the task is appropriately challenging and interesting to students, there can be great
potential for rich discourse and powerful learning.
CP.2.7. Error analysis of student work
Engaging in mathematical tasks often results in errors by students. Recent research has shown that our brains grow when we make mistakes (Moser et al., 2011). That is, making mistakes is one of the most important things to do for mathematical growth. Anecdotally, we know that the more we attempt to reach new goals, the more likely we are to fail in the short term but the more likely we are to succeed in the longer term. In mathematics, this means we need to find ways to embrace mistakes and recast them as learning opportunities, both for us as instructors and for our students.
Error analysis in mathematics, that is, the close examination of incorrect mathematical student work, has been a focus of instructors and researchers for years, across mathematical contexts and in a variety of coun-
tries (e.g., Ganesan and Dindyal, 2014, pp. 231–238; Kingsdorf and Krawec, 2014; Luneta and Makonye,
2010). Closely examining students’ incorrect work can help us as instructors by providing a real context in
which we can explore students’ thought processes and use that work to diagnose misconceptions, usually
with a goal of remediation. By examining actual student work we can differentiate between what Olivier
(1989) refers to as slips, which are results of carelessness; errors, which are systematic reflections of student
misconceptions; and misconceptions, the underlying incorrect beliefs students hold with respect to the
mathematical processes.
In addition to instructor examination of errors, a more recent trend involves having students examine
Table 2. Questions generated by some common types of errors (Borasi, 1987, p. 6).
Incorrect denition
Math content:
What properties can be derived from this denition?
Which ones t our image of the concept? Which ones don’t?
What other mathematical objects could be described by this
denition?
What instances of the concept are not described by this de-
nition?
Are all the properties stated essential? Could any be elim-
inated?
Could we modify the denition and turn it into a correct
one?
What if this were the correct denition for the concept?
What would the concept itself be?
How would it compare with the standard one?
What would be the consequences of accepting this deni-
tion in mathematics?
How could this denition be further modied?
What other alternative notions could be created?
Nature of mathematics:
What characteristics do we want a mathematical denition
not to have? What properties should a mathematical de-
nition have?
How can we evaluate and choose among alternative deni-
tions for a given concept?
What should a denition accomplish? What do we use de-
nitions for?
How do mathematical denitions dier from those in other
elds?
Approximate results
Math content:
Can you evaluate how “big” an error you are making by us-
ing the approximate result instead of the exact one?
What would be the consequences of your “error”, once you
use the approximate result in other applications?
Are other approximate results available?
How do they compare with yours?
Could you further improve your result. and obtain a “closer”
approximate result?
Would such an activity be worth it?
Nature of mathematics:
Can we always get exact results in mathematical problems?
If not why?
What could the role and value of approximate results be
when exact results are available?
What if the exact results are not available?
How can alternative approximate results be evaluated?
Wrong results
Math content:
In what sense is the result wrong?
Where did the procedure fail?
Could it be xed up and thus lead to dierent results?
What were our assumptions and are they justied?
In what cases?
What are the consequences of accepting this alternative
result?
In what circumstances could such a result be considered
right?
Nature of mathematics:
How can we test whether we used a mathematical procedure
correctly?
How can we decide whether it is appropriate to apply a cer-
tain procedure in a given situation?
How can we determine the domain of application of a given
procedure?
Right results reached by an unsatisfactory
procedure
Math content:
Why do we get right results in this case?
Could the procedure be slightly modied and be made
more rigorous?
Does the procedure work in this specic case because of
specic properties pertaining to it?
In such case what are these properties?
In what cases would it work?
In what cases would it fail?
What assumptions are necessary to be sure it will work?
Nature of mathematics:
Is the dierence between being rigorous or not rigorous a
dierence in degree?
Who decides whether a procedure is suciently rigorous?
On what basis?
Were the criteria used the same throughout the history of
mathematics?
Unsatisfactory models
Math content:
In what sense does the model work and in what sense does
it not?
How does the model compare with another “good” model of
the same· concept if there are any such models available?
Why does the model fail to represent some aspects of the
concept?
How could we try to modify the model so that it “ts” the
concept better?
Is the real problem a limitation in the specic model or in
the concept itself?
Nature of mathematics:
How can we determine the aspects for which a model “ts
the original object and the aspects for which it does not
t?
How “dierent” from the actual object could an acceptable
model be?
What is the value of alternative yet impartial models?
How could we evaluate which one is better?
incorrect student work (real or instructor-created) to explore mathematics more deeply and refine their
own thinking. Borasi (1987, 1994) calls errors "springboards for inquiry" and puts forward a strategy for
using them to stimulate worthwhile mathematical inquiries. Errors can be used to investigate the nature of
fundamental mathematical notions such as "proof," "algorithm," and "definition." For example, instructors
might provide students with slightly incomplete or incorrect proofs (or algorithms or definitions) and have
them find the point of error. From these processes, students can begin to formalize what constitutes a "good"
proof. Table 2 illustrates potential errors that occur during inquiry and provides questions that instructors
can ask to help the student decipher the error or to facilitate a classroom discussion. The nature-of-mathe-
matics questions can be helpful in redesigning lessons or activities that promote classroom discourse.
However, deciding to use student errors in our teaching requires some thought. The choice of tasks
through which our students do and learn mathematics is critical in ensuring we provide a growth opportu-
nity through some sort of error analysis. How do we choose and structure mathematical tasks that elicit the
student thinking we want to see? Boaler (2015) gives six suggestions for developing or adapting mathemat-
ical tasks to increase their potential for providing open learning spaces. They include
• Open the task up to multiple methods, pathways and representations.
• Include inquiry opportunities.
• Ask the problem before teaching the method.
• Add a visual component and ask students how they see the mathematics.
• Extend the task to make it lower floor and higher ceiling.
• Ask students to convince and reason—be skeptical.
When we choose or create tasks we must first decide who will use them. Are they for individual instructors,
a group of instructors, individual students, students working in groups, etc.? Below are some examples of
ways to use student errors to improve our assessment of students’ reasoning as well as to improve their un-
derstanding of mathematical constructs.
Carlson, Oehrtman, and Engelke (2010) designed the assessment shown in Figure 11 with purposeful
distractors that were created based on student misconceptions as documented in research. It might surprise
readers to learn that only 25% of 672 precalculus students chose the correct answer.
Figure 11. Precalculus item from Carlson, Oehrtman, and Engelke, 2010.
Exploring the sources of incorrect answers can inform instructors where we might need to intervene,
clarify, and remediate. This exploration can be accomplished with open-ended questions, group-discussion
tasks, or reflective writing assignments centered on a particular, common incorrect answer.
CP.2.8. Flipped classrooms
So far our discussion of the selection of appropriate mathematical tasks has made tacit assumptions about
the structure of the course for which those tasks are selected, namely:
• Students are working in one of two contexts: individually or collaboratively. We refer to the contexts
in which students work individually as their individual space and the context in which students work
in structured, managed groups as the group space for the course. For a typical face-to-face course, the
group space is the class meeting and the individual space is everything in between.
• Students meet in the group space for the purpose of gaining first contact with new ideas and engaging
in some preliminary explorations of the analysis and applications of those new ideas.
• Students work in their individual spaces to explore higher-order cognitive tasks such as advanced
applications, synthesis of new ideas with old ones, and creative tasks such as the writing of proofs or
construction of models.
We refer to a course design that uses individual and group space in these ways as a traditionally designed
course. e appropriateness of a mathematical task depends on the choice of design and the context in
which it is intended to be used. For example, a task that asks students to write a proof of a conjecture might
not be appropriate for the individual space if the students have only just encountered the concept for the first
time. Instead, an activity having students explore the concept that leads to a formulation of the conjecture
seems more appropriate, followed by assigning the proof for the individual space.
However, an increasing number of instructors are choosing to employ a flipped learning design in their
courses. Flipped learning is a pedagogical model in which first contact with new ideas takes place in the
individual space rather than in the group space, and the group space is repurposed to focus on active learn-
ing and creative applications of those ideas. The above task in which students explore an idea and make a
conjecture might be more appropriate for the individual space in a flipped learning environment, and then
students come to the group space with their conjectures and work together to construct and critique proofs
of that conjecture. The appropriateness of a given mathematical task may change in a flipped course model.
What kinds of tasks are appropriate for flipped learning environments? To address this question, consider
that the flipped learning environment proceeds as a cycle through several phases:
1. The individual space prior to the group meeting.
2. The first few minutes of the group meeting.
3. The main portion of the group meeting.
4. The last few minutes of the group meeting.
5. The ongoing individual space following the group meeting.
Each phase has certain tasks that are highly appropriate for that particular context.
Prior to group meetings, student work in a flipped learning environment focuses on gaining first contact
with new concepts through the use of structured activities. Ideally, students should gain basic fluency with
new ideas so that in the group space they can work together on advanced, creative work with those ideas
without extensive review.
Therefore the most appropriate tasks to select for pre-group meeting work should focus on the lowest
levels of Bloom's taxonomy (Anderson et al., 2001): the storage and retrieval of basic facts and concepts
("Remembering"), the explanation of those ideas ("Understanding"), and using those ideas in simple new
situations ("Applying" on a basic level). Tasks at this level are appropriate because this phase of the learning
process is the student's first contact with new material. To keep student motivation high, anything beyond
these levels should be saved for later. Moreover, the ways in which these tasks are presented must be carefully
selected. Students are asked to pick up new concepts—some of which may be quite complex—through their
own eorts. Flipped learning makes no apologies for this, but at the same time recognizes that scaolding
and guidance is needed in order to maximize the likelihood of success and to maintain reasonable levels of
motivation.
CP.2.9. Procedural fluency emerges from conceptual understanding
Procedural fluency and how it is developed is of central importance to post-secondary mathematics and has
been discussed for many years in the K–12 mathematics education community. Many teachers at all levels of
K–12 education lament the lack of basic skills—especially multiplication facts—exhibited by many students.
In 2014 the National Council of Teachers of Mathematics (NCTM) released a position paper on the issue
and defined procedural fluency as follows:
Procedural uency is a critical component of mathematical prociency. Procedural uency is the abil-
ity to apply procedures accurately, eciently, and exibly; to transfer procedures to dierent problems
and contexts; to build or modify procedures from other procedures; and to recognize when one strat-
egy or procedure is more appropriate to apply than another. To develop procedural uency, students
need experience in integrating concepts and procedures and building on familiar procedures as they
create their own informal strategies and procedures. Students need opportunities to justify both infor-
mal strategies and commonly used procedures mathematically, to support and justify their choices of
appropriate procedures, and to strengthen their understanding and skill through distributed practice.
What are the areas of procedural fluency necessary for collegiate mathematics? Does the availability and
accessibility of technology influence this discussion? If procedural fluency is desired, how is it developed?
The first question is beyond the scope of this document and is best left for a discussion at the local level
where colleagues can work collaboratively to make decisions that are best for their students and community.
While having these discussions, instructors can also consider the second question—the role of technology in
developing procedural fluency. The role of technology in the collegiate mathematics classroom is discussed
elsewhere in this document. However, once the specific areas of desired procedural fluency are determined
(e.g., solving a variety of equation types, use of function notation, differentiation and integration skills),
there are general strategies to help students to develop procedural fluency.
Build procedural uency from conceptual understanding
Conceptual understanding and procedural fluency can work hand in hand to help students to make sense
of important mathematical ideas and to develop effective and productive problem-solving skills. Expressed
simplistically, conceptual understanding involves knowing what to do and why it works, while procedural
fluency involves deciding and knowing how to do it. While conceptual understanding and procedural flu-
ency can work together as students engage in mathematical activity or in solving problems, we will focus on
how procedural fluency develops from conceptual understanding.
Brain research oen reports the importance of building strong cognitive connections. When students
learn procedures in such a way that they are connected to conceptual foundations, they will have more
success in using these procedures, will recall them for a longer period of time, and will be able to use these
procedures exibly and eectively in a problem solving situation (NRC, 2005). Consider fraction division as
a case in point. Oen students are told to “keep the rst fraction, change the division sign to multiplication,
and ip the second fraction.” at is, they “keep-change-ip,” and multiply the fractions to get the desired
result. is technique can, in the short term, help students get the right answer to a computational problem.
However, since this technique is not connected to the meaning of division or to an understanding of what
a fraction is, students oen forget the procedure or apply an incorrect procedure, also void of conceptual
understanding, when faced with a fraction division situation. Furthermore, this “keep-change-ip” teaching
strategy does not prepare students to understand situations where the division of fractions is even necessary
or required. We see the impact of this lack of conceptual understanding in the collegiate classroom when
students employ a remembered procedure at the wrong time. Without a foundation in conceptual under-
standing, students will grab at the procedures that they remember in hopes that they will produce the correct
result (NRC, 2005).
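By contrast, a concept-first treatment can connect the procedure to the meaning of division. The following derivation is one possible way to make that connection explicit—a sketch for illustration, not a prescription from this guide:

```latex
% Why "keep-change-flip" works: dividing by c/d is the same as
% multiplying by its reciprocal, because (c/d)(d/c) = 1.
\frac{a}{b} \div \frac{c}{d}
  = \frac{a/b}{c/d}
  = \frac{(a/b) \cdot (d/c)}{(c/d) \cdot (d/c)}
  = \frac{(a/b) \cdot (d/c)}{1}
  = \frac{a}{b} \cdot \frac{d}{c}
```

Pairing such a derivation with a contextual question (e.g., how many 3/4-cup servings are in 6 cups?) also addresses the situational understanding described above: students see when fraction division is required, not just how to execute it.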
Students can develop conceptual understanding, procedural fluency, and problem-solving strategies
when they learn mathematics with a focus on the Standards for Mathematical Practice (CCSS) and when
teachers effectively implement the Mathematics Teaching Practices (NCTM, 2014). Bullmaster-Day (n.d.)
summarizes the benets of learning mathematics when the Mathematical Practices and Mathematics Teach-
ing Practices are a major focus. She claims
A consistent instructional cycle that incorporates all of these elements enables students to organize,
store, and retrieve new knowledge, while strengthening interconnections between the pieces of in-
formation in their mental “maps” so that the information will be available to them for recall, transfer,
and future use. When students have opportunity to practice skills to the point of automaticity their
working memory is freed for new tasks and they are able to see patterns, relationships, and discrep-
ancies in problems that they would have missed without such practice (Anderson, Greeno, Reder, and
Simon, 2000; Bransford, Brown, and Cocking, 2000; Collins, Brown, and Newman, 1989; Ellis and
Worthington, 1994; Good and Brophy, 2003; Marzano, Gaddy, and Dean, 2000; Means and Knapp,
1991; Pressley, et al., 1995; Rosenshine, 2002; Rosenshine and Meister, 1995; Stevenson and Stigler,
1992; Wenglinsky, 2002, 2004).
Classroom vignette: Average rate of change
The idea of average rate of change is one that is developed in collegiate algebra and precalculus courses and
used later in courses such as calculus and differential equations. Ideally students become procedurally fluent
in calculating the average rate of change in context, and conceptually make sense of what average rate of
change means. In fact, knowing the meaning of the average rate of change can help students to compute it
eectively and accurately.
A College Algebra class considers the situation from the 2016 Rio de Janeiro Olympics where Jamaican
sprinter Usain Bolt won the gold medal in the 100-meter sprint with a time of 9.81 seconds. Students are
asked to consider the speed at which Usain Bolt ran. It is correct to say that Usain Bolt ran the race at an
average speed of 100 meters per 9.81 seconds, though no one would suggest this non-conventional way to
express the idea of speed. When asked to describe the average speed, students might try to remember a for-
mula. Even if they do so accurately, they often cannot articulate mathematical reasons for why division is required.
Thinking Stage 1 (illustrated by a segment labeled "9.81 seconds"): Imagine that the segment represents 9.81 seconds and is cut up into a total of 9.81 one-second segments. Each segment represents 1/9.81 of the total time.
In one approach students are encouraged to think that the goal is to report Usain Bolt's speed as a con-
stant speed per one second (a unit rate). There are 9.81 one-second intervals contained in the total time of
9.81 seconds. Each of these one-second intervals represents 1/9.81 of the total time of 9.81 seconds.
In order to maintain the proportional relationship between distance and time, the 100-meter distance
must also be cut up into 9.81 segments so that we can claim that each of these distance segments corre-
sponds to one of the 1-second time segments.
Thinking Stage 2 (illustrated by a segment labeled "100 meters"): Imagine that the segment represents 100 meters and is cut up into a total of 9.81 equal segments. Each segment represents 1/9.81 of the total distance.
Each of these 9.81 distance segments is 1/9.81 of 100 meters, or (1/9.81) · 100 meters, or about 10.19 meters. Each
of these 10.19-meter segments corresponds to 1 second, so we can say that Usain Bolt traveled, on average,
10.19 meters each second, or 10.19 meters per second.
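The computation the two stages lead to can be recorded compactly; this is simply a restatement of the reasoning above in symbols:

```latex
% Unit-rate view of Bolt's average speed.
\text{average speed}
  = \frac{100 \text{ meters}}{9.81 \text{ seconds}}
  = \frac{\tfrac{1}{9.81} \cdot 100 \text{ meters}}{1 \text{ second}}
  \approx 10.19 \text{ meters per second}
```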
This reasoning creates the well-connected network of understanding that can lead to procedural fluency
where students come to recognize the need to divide when computing an average rate of change. Further-
more, they can connect procedural and conceptual understanding while making sense of the average rate
of change.
What is described above provides students with greater opportunity to develop procedural fluency as
envisioned by the definition provided by NCTM at the beginning of this section.
In a second approach, students employ a more conventional routine where they compute Bolt's average
speed by using the two points (0,0) and (9.81, 100). Using a formula which may only be memorized, students
can compute as shown.
m = (y₂ − y₁)/(x₂ − x₁) = (100 − 0)/(9.81 − 0) ≈ 10.19
The formula alone does not allow students the procedural flexibility and conceptual connections to make
sense of the underlying mathematics and meaning of this calculation. However, once a conceptual foun-
dation is established, students may be able to make sense of this traditional slope formula. Note that in the
Usain Bolt context, y₂ − y₁ represents the total distance traveled and x₂ − x₁ represents the elapsed time. Students
could reason about the slope formula as follows:
• In 1/(x₂ − x₁) of the time, Bolt will run 1/(x₂ − x₁) of the distance.
• That is, 1/(x₂ − x₁) of the elapsed time of x₂ − x₁ seconds represents 1 second.
• To preserve the proportional correspondence, Bolt will also travel 1/(x₂ − x₁) of the total distance, or
(1/(x₂ − x₁))(y₂ − y₁), or (y₂ − y₁)/(x₂ − x₁).
Furthermore, students who have learned mathematics with the intended balance between conceptual
understanding, procedural fluency, and problem solving can extend their understanding of average rate of
change to make sense of instantaneous rate of change, leading to the beginning development of the differ-
ence quotient and the limit definition of derivative.
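In standard notation, the progression described here runs from the average rate of change over an interval to its limiting value:

```latex
% Average rate of change of f over [x, x+h] (the difference quotient),
% and its limit as h -> 0 (the derivative), when the limit exists.
\frac{\Delta y}{\Delta x} = \frac{f(x+h) - f(x)}{h},
\qquad
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
```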
CP. Conclusion
The learning of mathematical ideas is much more profound, learned information much more useful, and
problem solving more deeply developed when ideas make sense, when students are active participants in the
learning process, and when students have the opportunity for repeated reasoning. Much research demon-
strates that when students simply memorize rules and procedures and perhaps try to remember using mne-
monics, songs, and gimmicks, they struggle to recall the right procedure at the right time. When these
procedures are not part of a well-connected web of understanding, they are not useful or not remembered
correctly.
Students need to actively engage in the process of learning mathematical ideas, developing strong con-
ceptual understanding, and using these ideas to develop procedural fluency. The traditional lecture format,
where the instructor is the one engaged in mathematical thinking, will not accomplish the procedural fluency goals
we desire. Consider the following from Freeman et al. (2014):
In addition to providing evidence that active learning can improve undergraduate STEM education,
the results reported here have important implications for future research. e studies we meta-ana-
lyzed represent the first generation of work on undergraduate STEM education, where researchers
contrasted a diverse array of active learning approaches and intensities with traditional lecturing. Giv-
en our results, it is reasonable to raise concerns about the continued use of traditional lecturing as a con-
trol in future experiments (emphasis added) (p. 8413).
CP. References
Anderson, L.W., Krathwohl, D.R., Airasian, P., Cruikshank, K., Mayer, R., Pintrich, P., and Wittrock, M. (2001). A
taxonomy for learning, teaching and assessing: A revision of Bloom's taxonomy. New York: Longman Publishing.
Artz, A.F. and Armour-omas, E. (1992). Development of a cognitive-metacognitive framework for protocol analysis
of mathematical problem solving in small groups. Cognition and Instruction, 9(2), 137–175.
Baldinger, E. and Louie, N. (2014). TRU Math conversation guide: A tool for teacher learning and growth. Berkeley, CA
and E. Lansing MI: Graduate School of Education, University of California, Berkeley and College of Education,
Michigan State University. Retrieved from map.mathshell.org/materials/pd.php.
Barshay, J. (2016, July 27). Is it better to teach pure math instead of applied math? Retrieved from hechingerreport.org/
pure-math-better-applied-math/.
Battey, D. and Stark, M. (2009). Inequitable classroom practices: Diagnosing misconceptions as inability in mathemat-
ics. In D. White et al. (eds), Mathematics for Every Student, Responding to Diversity, Grades PreK–5. Reston, VA:
National Council of Teachers of Mathematics.
Borasi, R. (1987). Exploring mathematics through the analysis of errors. For the Learning of Mathematics, 7(3), 2–8.
Borasi, R. (1994). Capitalizing on errors as “springboards for inquiry”: A teaching experiment. Journal for Research in
Mathematics Education, 25(2), 166–208.
Brame, C.J. and Biel, R. (2015). Setting up and facilitating group work: Using cooperative learning groups effective-
ly. Retrieved February 10, 2018 from cft.vanderbilt.edu/guides-sub-pages/setting-up-and-facilitating-group-work-using-
cooperative-learning-groups-effectively/
Carnegie Foundation for the Advancement of Teaching. Productive persistence. Retrieved November 1, 2016 from
www.carnegiefoundation.org/in-action/carnegie-math-pathways/productive-persistence/
Checkley, K. (1997, Summer). Problem-based learning: The search for solutions to life's messy problems. ASCD Cur-
riculum Update, pp. 1–8.
Clarke, D., Roche, A., Sullivan, P., and Cheeseman, J. (2014). Creating a classroom culture that encourages students to
persist on cognitively demanding tasks. Annual perspectives in mathematics education using research to improve
instruction. Reston, VA: NCTM.
Cooper, J. and Robinson, P. (1998). Small group instruction in science, mathematics, engineering, and technology. Jour-
nal of College Science Teaching, 27(6), 383–388.
Cornell University Center for Teaching and Learning (n.d.). Active learning. Retrieved from www.cte.cornell.edu/
teaching-ideas/engaging-students/active-learning.html.
Duckworth, A. (2016). Grit: The power of passion and perseverance. New York: Scribner.
Dweck, C.S. (2008). Mindset: The new psychology of success. New York: Ballantine Books.
Edwards, A.R. and Beattie, R.L. (2016). Promoting student learning and productive persistence in developmental
mathematics: Research frameworks informing the Carnegie pathways. NADE Digest, 9(1), 30–39.
Ethnography and evaluation research. University of Colorado Boulder. Retrieved from www.colorado.edu/eer/research/
profdev.html.
Felder, R.M. and Brent, R. (2009). Active learning: An introduction. ASQ Higher Education Brief, 2(4), 1–5.
Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H., and Wenderoth, M.P. (2014). Active
learning increases student performance in science, engineering, and mathematics. Proceedings of the National
Academy of Sciences, 111(23), 8410–8415.
Freudenthal, H. (1991). Revisiting mathematics education. Dordrecht, The Netherlands: Kluwer Academic.
Fuller, R., Agruso, S., Mallow, J., Nichols, D., Sapp, R., Strassenburg, A., and Allen, G. (1985). Developing student con-
fidence in physics. American Association of Physics Teachers, College Park, MD.
Ganesan, R. and Dindyal, J. (2014). An investigation of students' errors in logarithms. In J. Anderson, M. Cavanagh,
and A. Prescott (eds). Curriculum in Focus: Research Guided Practice. Proceedings of the 37th Annual Conference
of the Mathematics Education Research Group of Australasia. Sydney: MERGA.
Gredler, M.E. and Shields, C.C. (2008). Vygotsky's legacy: A foundation for research and practice. New York: Guilford
Press.
Hayward, C. and Laursen, S. (2014). Collaborative research: Research, dissemination, and faculty development of in-
quiry-based learning (IBL) methods in the teaching and learning of mathematics. Cumulative evaluation report:
2010–2013. (Report to the National Science Foundation)
Hiebert, J. and Grouws, D. A. (2007). The effects of classroom mathematics teaching on students' learning. In F. K.
Lester (ed), Second handbook of research on mathematics teaching and learning, pp. 371–404.
Hsu, E., Kysh, J., and Resek, D. (2007). Using rich problems for differentiated instruction. New England Mathematics
Journal, 39, 6–13.
Hughes, D.C. (1973). An experimental investigation of the effects of pupil responding and teacher reacting on pupil
achievement. American Educational Research Journal 10(1), 21–37.
Johnson, D.W. and Johnson, R.T. (1999). Making cooperative learning work. Theory into Practice, 38(2), 67–73.
Kingsdorf, S. and Krawec, J. (2014). Error analysis of mathematical word problem solving across students with and
without learning disabilities. Learning Disabilities Research and Practice, 29(2), 66–74.
Kuh, G.D. (2007). What student engagement data tell us about college readiness. Peer Review, 9(1), 4–8.
Lampert, M. (2001). Teaching problems and the problems in teaching. New Haven, CT: Yale University Press.
Laursen, S.L., Hassi, M.L., Kogan, M., and Weston, T.J. (2014). Benefits for women and men of inquiry-based learning
in college mathematics: A multi-institution study. Journal for Research in Mathematics Education, 45(4), 406–418.
Leyva, L. A. (2016). An intersectional analysis of Latin@ college women's counter-stories in mathematics. Journal of
Urban Mathematics Education, 9(2), 81–121.
Lotan, R.A. (2014). Crafting groupworthy learning tasks. In Cohen, E.G. and Lotan, R.A. (eds). Designing groupwork:
Strategies for the heterogeneous classroom. New York: Teachers College Press.
Luneta, K. and Makonye, P.J. (2010). Learner errors and misconceptions in elementary analysis: A case study of a grade
12 class in South Africa. Acta Didactica Napocensia, 3(3), 35–46.
McClenney, K., Marti, C., and Adkins, C. (2007). Student engagement and student outcomes: Key findings from
CCSSE validation research. Community College Survey of Student Engagement. Retrieved from www.ccsse.org/
aboutsurvey/docs/CCSSE%20Validation%20Summary.pdf
McKay, S. (September 25, 2015). Using new research to improve student motivation. Carnegie Foundation for the
Advancement of Teaching. Retrieved from www.carnegiefoundation.org/blog/using-new-research-to-improve-student-
motivation/
Michaels, S. and O'Connor, C. (2013). Conceptualizing talk moves as tools: Professional development approaches for
academically productive discussion. In Resnick, L.B., Asterhan, C. and Clarke, S.N. (eds). Socializing Intelligence
through Talk and Dialogue. Washington DC: American Educational Research Association.
Moser, J.S., Schroder, H.S., Heeter, C., Moran, T.P., and Lee, Y.H. (2011). Mind your errors: Evidence for a neural mech-
anism linking growth mindset to adaptive posterror adjustments. Psychological Science, 22(12), 1484–1489.
National Governors Association Center for Best Practices, and Council of Chief State School Officers (2010). Com-
mon Core State Standards for Mathematics. Washington, DC: Author. Retrieved from corestandards.org/assets/
CCSSI_Math%20Standards.pdf
National Research Council (2005). How students learn: History, mathematics, and science in the classroom. Washing-
ton, DC: The National Academies Press. https://doi.org/10.17226/10126.
Niemiec, C.P. and Ryan, R.M. (2009). Autonomy, competence, and relatedness in the classroom: Applying self-deter-
mination theory to educational practice. School Field, 7(2), 133–144.
Olivier, A. (1989). Handling students’ misconceptions. Pythagoras, 21, 10–19.
Pengelley, D. (2017). Beating the lecture-textbook trap with active learning and rewards for all. Notices of the AMS,
64(8), 903–905.
Piacentini, M. and Monticone, C. (2016). Equations and Inequalities: Making Mathematics Accessible to All. PISA.
OECD Publishing.
Radatz, H. (1980). Students’ errors in the mathematical learning process: A survey. For the Learning of Mathematics,
1(1), 16–20.
Rasmussen, C. and Kwon, O. (2007). An inquiry oriented approach to undergraduate mathematics. Journal of Mathe-
matical Behavior, 26, 189–194.
Rasmussen, C., Yackel, E., and King, K. (2003). Social and sociomathematical norms in the mathematics classroom.
In Schoen, H. and Charles, R. (eds). Teaching mathematics through problem solving: Grades 6–12. Reston, VA:
National Council of Teachers of Mathematics.
Ray, M. (2013). Noticing and wondering. In Ray, M. (ed). Powerful problem solving: Activities for sense making with the
mathematical practices. Portsmouth, NH: Heinemann.
Rendon, L.I. (1994). Validating culturally diverse students: Toward a new model of learning and student development.
Innovative Higher Education, 19(1), 33.
Robson, E. (2009). Mathematics education in an old Babylonian scribal school. In Robson and Stedall (eds). The Ox-
ford Handbook of the History of Mathematics (Oxford, 2009).
Ryan, R.M. and Deci, E.L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contem-
porary Educational Psychology, 25(1), 54–67.
Shechtman, N., DeBarger, A.H., Dornsife, C., Rosier, S., and Yarnall, L. (2013). Promoting grit, tenacity, and persever-
ance: Critical factors for success in the 21st century. Washington, DC: U.S. Department of Education.
Seeley, C. (2009). Faster Isn't Smarter: Messages About Math, Teaching, and Learning in the 21st Century. Sausalito, CA:
Math Solutions.
Shellard, E. and Moyer, P.S. (2002). What principals need to know about teaching mathematics. Alexandria, VA: Na-
tional Association of Elementary School Principals/Educational Research Services.
Slavin, R.E. (1996). Research on cooperative learning and achievement: What we know, what we need to know. Con-
temporary Educational Psychology, 21(1), 43–69. dx.doi.org/10.1006/ceps.1996.0004.
Smith, B.L. and MacGregor, J.T. (1992). What is collaborative learning? In Goodsell, A.S., Maher, M.R., and Tinto, V.
(eds). Collaborative Learning: A Sourcebook for Higher Education. National Center on Postsecondary Teaching,
Learning, and Assessment, University Park, PA.
Stein, M.K., Grover, B.W., and Henningsen, M. (1996). Building student capacity for mathematical thinking and rea-
soning: An analysis of mathematical tasks used in reform classrooms. American Educational Research Journal,
33(2), 455–488.
Stein, M.K. and Smith, M.S. (2011). Five Practices for Orchestrating Productive Mathematics Discussions. Reston, VA:
National Council of Teachers of Mathematics.
Sullivan, P. (2011). Teaching mathematics: Using research-informed strategies. Melbourne Vic, Australia: Australian
Council for Educational Research.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.
Thompson, P.W. (2013). In the absence of meaning. In K. Leatham (ed), Vital Directions for Research in Mathematics
Education. New York: Springer.
Tobin, K. (1987). The role of wait time in higher cognitive level learning. Review of Educational Research, 57(1), 69–95.
Treisman, U. (1992). Studying students studying calculus: A look at the lives of minority mathematics students in col-
lege. e College Mathematics Journal, 23(5), 362–372.
Wood, T. and Sellers, P. (1996). Assessment of a problem-centered mathematics program: Third grade. Journal for Re-
search in Mathematics Education, 27(3), 337–353.
Wood, T. and Sellers, P. (1997). Deepening the analysis: longitudinal assessment of a problem-centered mathematics
program. Journal for Research in Mathematics Education, 28(2), 163–186.
Vygotsky, L.S. (1978). Mind in Society. Cambridge, MA: Harvard University Press.
Assessment Practices
AP.1. Basics about assessment
The purpose of this chapter is to provide guiding principles for assessing students' learning of mathematics
through summative and formative assessments. Instructors are likely aware of summative assessment, whose
purpose is to evaluate student proficiency with regard to one or more learning outcomes, such as exams,
quizzes, and homework (when graded after only one attempt). Black and Wiliam (2009) define a formative
assessment practice as one in which instructors elicit, interpret, and use evidence about student achievement
to make decisions about the next steps in instruction with the goal of improving instruction and student
learning. We elaborate on these assessments in section AP.1.3.
Assessment is regarded as an essential element for learning in terms of finding and recording increased
knowledge or skills (Hanson and Mohn, 2011). Recently there has been renewed attention to this original
purpose of assessment due to the rise of the accountability paradigm, including heightened inquiry into all
aspects of the educational process by various entities, internal or external (Hanson and Mohn, 2011). We
attempt to address both of these needs by providing various vignettes where summative or formative
assessments are implemented. This introductory section gives an overview to allow independent reading of other sections
based on what is most relevant to the reader.
Throughout the chapter we emphasize components of effective assessment which include
1. stating high-quality goals for student learning,
2. providing students frequent informal feedback about their progress toward these goals, and
3. evaluating student growth and proficiency based on these goals.
AP.1.1 Assessment frameworks
Several of the K–12 and collegiate professional organizations have developed frameworks for assessment.
For example, in 1995, the National Council of Teachers of Mathematics (NCTM) published their third vol-
ume on standards, Assessment Standards for School Mathematics, which includes a useful framework for
identifying the broad purposes and phases of assessment (NCTM, 1995, p. 27). The framework begins when
mathematics instructors identify what they hope to accomplish in terms of student learning, instructional
practices, student engagement, and motivation, all in the context of the broader mathematics program in
which they live. This reflective assessment practice is cyclic, occurring in four phases as the instructor seeks
to meet each specific assessment purpose: planning the assessment, gathering evidence, interpreting the
evidence, and using the results to modify future lessons.
At the collegiate level, the Mathematical Association of America (MAA) has published two volumes about
assessment: Assessment Practices in Undergraduate Mathematics (Gold et al., 1999) and Supporting Assess-
ment in Undergraduate Mathematics (Steen, 2006). Both of these volumes range in their scope, including
assessment of students, teachers, programs, and majors. e American Statistical Association has published
Guidelines for Assessment and Instruction in Statistics Education, College Report (GAISE, 2016), focused on
assessment at the college level. e Society for Industrial and Applied Mathematics has published Guidelines
for Assessment and Instruction in Mathematical Modeling Education (GAIMME, 2016), which is broadly ap-
plicable across courses and across student levels. All instructors in the mathematical and statistical sciences
can benet from consulting these resources as they develop assessment techniques for their courses.
Historically, the MAA has been a leader in encouraging thoughtful and ongoing assessment of student
learning in undergraduate mathematics. In the introduction to the MAA's Notes volume Assessment Practic-
es in Undergraduate Mathematics (Gold, Keith, and Marion, 1999), Steen outlines six principles of assessment
that remain relevant today. These principles have since been supported by additional research in mathemat-
ics education and they serve as useful background when integrating assessment information into practice
(Soto-Johnson and Fuller, 2012; Soto-Johnson, Yestness, and Dalton, 2009). Steen recommends that assess-
ments (1) be part of a continuous cycle, (2) serve as an open process, (3) promote valid inferences, (4) employ
multiple measures of performance, (5) measure what is worth assessing, and (6) support every student's
opportunity to learn important mathematics. These principles mirror the NCTM's (1995) recommendations
regarding assessment. The NCTM stresses how assessments should reflect the mathematics that students
should know, enhance mathematics learning, promote equity and valid inferences, and be an open and
coherent process. These recommendations suggest that assessments should inform instructors about their
students' learning as well as about their teaching. Below we elaborate on each of Steen's principles of assess-
ment and briefly discuss how we connect to them in this chapter.
Principle 1. Assessment is not a single event but a continuous cycle.
Many of the vignettes in this chapter focus on a particular problem facing a teacher, but the instructor, the
department, and indeed the whole institution should develop a culture in which assessment is ongoing.
Principle 2. Assessment must be an open process.
Approaches to assessment should inform both teaching and learning—that is, they inform both instructors
and students. Open communication about learning goals as described in section AP.1.2 is important. In
practice for instructors, this means that we use assessment to inform students of their progress and not
just as a grading tool.
Principle 3. Assessment must promote valid inferences.
Results of good assessments promote responses that directly impact classroom instruction. We try to
frame these approaches based on the goals of particular courses.
Principle 4. Assessment that matters should always employ multiple measures of performance.
Historically the mathematics community has based summative assessment on high-stakes tests in a few
standard formats. We present numerous ways to encourage a wider set of measures. In practice this
means that instructors use assessment to inform students of their progress.
Principle 5. Assessment should measure what is worth learning, not just what is easy to measure.
The balance between the depth of what we can measure for each student and the practical limitations
imposed by real world constraints is highlighted in our sections on large classes and online learning.
Principle 6. Assessment should support every student's opportunity to learn important mathematics.
Good assessment practices are vital in providing opportunities for a larger and more diverse set of stu-
dents to participate in mathematics.
AP.1.2 Clearly specify learning outcomes
Mathematicians oen use the phrase “mathematical maturity” to capture the idea that a student is math-
ematically well-rounded, not only in mathematical content knowledge but in other ways as well. A use-
ful oversimplication frames the human psyche as a three-domain
model. e content domain or intellectual domain regards knowl-
edge and understanding of concepts. e cognitive domain or be-
havioral domain regards the practices and actions with which we
apply or develop that knowledge. e aective domain or emotional
domain regards how we feel about our knowledge and our actions.
All three domains play key roles in student learning and contrib-
ute to developing students’ mathematical maturity. As such, in our
mathematics courses it is important to create learning outcomes that
include goals representing these three domains.
Course learning outcomes will vary across courses and institu-
tions, but they should have in common this broad representation of
areas of learning that extend beyond a list of content topics. Learn-
ing outcomes for students provide the definition of student success in a course, and all decisions made by
instructors regarding assessment, in conjunction with classroom practices, should be directly aligned with
these goals.
Careful, objective wording is often encouraged when phrasing goals for the purpose of assessment. For
example, "understand" is often considered too vague to capture an assessable objective. Consider the state-
ment, "Students will understand the fundamental theorem of calculus." This student learning goal is about
a specific concept within a calculus course, but does not provide a measurable outcome: it is not clear how
"understand" is to be assessed, or what particular instructional design may be best for developing such un-
derstanding. By contrast, consider the statement, "Students will explain the significance of the fundamental
theorem of calculus and use the theorem appropriately." This second statement provides two measurable
outcomes: Students can explain the significance of the fundamental theorem of calculus and students can
apply the fundamental theorem of calculus appropriately. The phrase "explain the significance" demon-
strates knowledge of the concept and why it is important within the context of calculus, while the phrase
"can apply" demonstrates skill in recognizing the applicability of the theorem as well as ability to use the
theorem.
Below we provide a vignette that exemplifies the three domains.
Vignette: Professor Johnson is teaching a number theory class and a college algebra class. In the number
theory class, Professor Johnson wants to make sure that students not only learn definitions, theorems, and
proof-writing, but also that students learn how to use computer algebra systems to conduct experiments
with modular arithmetic. In the college algebra class, Professor Johnson wants students to learn mathe-
matical procedures and develop an understanding of how they work. After reviewing the Common Core
Standards for Mathematical Practice (National Governors Association Center for Best Practices, 2010), the
CUPM Guide to Majors in the Mathematical Sciences overview (MAA, 2015), and the MAA CRAFTY College
Algebra Guidelines (MAA, 2007), Professor Johnson created sets of learning outcomes to include in the two
syllabi, a few of which are listed below.
Number eory
Students will
Explain the connections between divisibility, the division algorithm, and the Euclidean algorithm.
Use Fermat's Little Theorem and Wilson's Theorem to solve problems involving linear congruences.
Create proofs using a variety of techniques, including direct proof, proof by contradiction, and mathe-
matical induction.
Experiment with various computational approaches.
Students will have opportunities to engage in the following general behaviors and mindsets—i.e., mathe-
matical practices:
Persist, work through perceived failure, and engage in strategic self-questioning.
Collaborate productively with others and ask good questions.
Construct examples and counterexamples to investigate and understand new definitions and theorems.
Read and understand existing proofs and recognize incorrect proofs.
Create and communicate original proofs.
College Algebra
Students will
Use multiple perspectives (symbolic, numeric, graphic, and verbal) to explore elementary functions.
Algebraically solve linear, quadratic, exponential, logarithmic, and power equations.
Sketch polynomial and rational functions using a graphing calculator.
Identify and algebraically nd important characteristics of these graphs such as intercepts, vertical
asymptotes, and horizontal asymptotes.
Recognize and use standard transformations with graphs of elementary functions.
Use and solve systems of equations to model real world situations.
Students will have opportunities to engage in the following mathematical practices:
Persist and work through perceived failure.
Collaborate productively with a team.
Develop a personal framework of problem solving techniques (e.g., to make sense of problems, sketch
and label diagrams, restate and clarify questions, identify variables and parameters, and use analytical,
numerical, and graphical solution methods).
Create, interpret, and revise real-world models and solutions of problems.
Discussion: Professor Johnson divided the learning outcomes for each class into mathematical content top-
ics, which primarily represent the intellectual domain, and mathematical practices, which represent the
cognitive and aective domains. By relying on existing recommendations from professional societies and
other reports, Professor Johnson was able to identify examples from which to obtain inspiration and to
adopt language.
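As a concrete illustration of the number theory outcome "Experiment with various computational approaches," an assignment in the spirit of Professor Johnson's course might have students test Fermat's Little Theorem numerically. The sketch below is hypothetical—Python is one possible choice of computational tool; the guide does not prescribe a particular system:

```python
# Hypothetical experiment: Fermat's Little Theorem says that for a
# prime p and any a with 1 <= a < p, a^(p-1) is congruent to 1 (mod p).
def fermat_holds(a: int, p: int) -> bool:
    """Check a^(p-1) ≡ 1 (mod p) via built-in modular exponentiation."""
    return pow(a, p - 1, p) == 1

for n in [5, 7, 11, 13, 15]:  # four primes and one composite
    print(n, all(fermat_holds(a, n) for a in range(1, n)))
# Students can observe that the test fails for n = 15 and then
# investigate which bases a witness that 15 is composite.
```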
Practical tips
Create learning outcomes based on course, departmental, and personal objectives.
Generate ideas of learning outcomes that fit with the behavioral and emotional domains. See the MAA
Curriculum Guide and the Common Core Standards along with the "five-strand" model discussed in
the report Adding It Up (National Research Council, 2001).
Document learning outcomes in writing to develop your course or to include in your syllabus for stu-
dents.
When creating learning goals there is a natural tension between Steen's third principle, which recommends
creating valid inferences and measurable learning goals, and the fifth principle, which suggests that some
key elements of understanding are inherently difficult to measure. Professor Johnson's goals for number
theory seem more in line with the fifth principle, while it is easier to imagine measurable results for the
college algebra course. Finding the balance between these principles is a challenge but is also a key to good
assessment. The Design Practices chapter contains an extensive discussion on creating student learning
outcomes.
AP.1.3. Formative and summative assessment
Assessment can be roughly divided into two types, formative and summative, but these need not occur in
isolation. In fact many standard assessment techniques have elements of both. As discussed in the introduc-
tion, formative assessments are generally incorporated to elicit, interpret, and make decisions about the next
steps in instruction. When we want to provide feedback to students, before we actually evaluate their work,
we need to conduct some informal assessment—this is always formative in nature. While techniques such as
observing students work in groups or having students answer verbal questions during class are well-known
formative assessments, there are other, perhaps surprising, ways in which formative assessment can arise. For
example, online homework systems can be set to allow students multiple attempts on each problem without
penalty. is can provide students with feedback on the correctness of their work and provide instructors
with information about which problems are most difficult for students, and thus affect an instructor's sub-
sequent teaching.
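A minimal sketch of how such attempt data might be used follows; the data structure and names here are hypothetical and do not reflect any particular homework platform's format:

```python
# Hypothetical log: problem id -> number of attempts each student needed.
attempts = {
    "limit_definition_1": [1, 2, 1, 1, 2],
    "chain_rule_3":       [4, 5, 3, 6, 4],
    "u_substitution_2":   [2, 1, 2, 3, 1],
}

# Rank problems by mean attempts; high means flag topics to revisit in class.
by_difficulty = sorted(
    attempts, key=lambda p: sum(attempts[p]) / len(attempts[p]), reverse=True
)
print(by_difficulty)  # ['chain_rule_3', 'u_substitution_2', 'limit_definition_1']
```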
While formative assessments are primarily used to inform the direction in which instructors might mod-
ify their lessons, summative assessments are conducted with the purpose of evaluating student proficiency
with regard to one or more learning outcomes. However, looking over the learning outcomes from our ear-
lier vignette with Professor Johnson, it is apparent there are many learning outcomes for which exams, quiz-
zes, and homework are not effective summative evaluations. Sometimes these outcomes are simply difficult
to evaluate. In such cases assessments such as writing assignments, group projects, and oral presentations
are appropriate summative evaluations, but they can also have a formative component when coupled with
appropriate feedback. Many assessment tools have both summative and formative components.
In discussions regarding the use of formative and summative assessment, a distinction is often made
between assessing growth versus assessing proficiency. In measuring student growth we evaluate students'
progress compared to their starting point. In measuring student proficiency we evaluate students against a
fixed target outcome, regardless of their starting point. In K–12 educational policy, decisions about the rela-
tive importance of assessing growth versus proficiency have major impacts in terms of how schools are eval-
uated. At the postsecondary level, it is important to determine whether growth or proficiency is the primary
goal when setting learning goals and selecting formative and summative assessments. In STEM courses such
as calculus, linear algebra, and differential equations, it is reasonable for proficiency to be the primary goal.
In non-STEM courses, such as those that satisfy general education or quantitative literacy requirements,
it is equally reasonable for growth to be used as the primary focus of assessment. In other courses, growth
and proficiency might be given equal weight when assessment methods are chosen. Regardless of how this
distinction is handled in a given course, the important point is that one should clearly decide and articulate
how to balance the assessment of growth and proficiency.
Following are independent sections on formative and summative assessment, but because these assess-
ments inform one another there is some overlap in the sections. Feedback is an essential part of both forma-
tive and summative assessment as reflected in Steen's principles #1 (continuous cycle), #2 (openness), and
#6 (enabling students to connect to mathematics). Instructors need to be aware that feedback is constant
through classroom behaviors as well as through graded assignments or classroom discussions.
AP.2. Formative assessment creates an assessment cycle
How do students obtain informative feedback about their understanding of a given topic or about the ef-
fectiveness of their approach to solving problems? How do instructors determine how students “are really
reasoning” about mathematics? How do we ensure students and instructors have the same understanding of
the course goals? e answers to these and similar questions fall within the scope of techniques of formative
assessment.
AP.2.1. Implementing formative assessment
Black and Wiliam (1998, 2009) identify five strategies for implementing formative assessments.
1. Clarify and share learning intentions and criteria for success.
2. Engineer eective classroom discussions and other learning tasks that elicit evidence of student under-
standing.
3. Provide feedback that moves learners forward.
4. Activate students as instructional resources for one another.
5. Activate students as the owners of their own learning.
Each strategy is an action for an instructor to implement. However, the degree to which one implements
each strategy, and when, are questions to consider.
Does the instructor tweak a problem in a subsequent assignment, or adjust activities and actions for
the entire next class period?
Is an instructor always eliciting feedback and adjusting their teaching on the spot, or is the instructor
adjusting subsequent class periods using feedback?
How does the instructor utilize summative assessments in conjunction with formative assessments?
Does the instructor adjust their course after the semester or during the course?
Although these are rich suggestions and questions to ponder as one implements formative assessments,
these are also challenges to consider. Heritage et al. (2008, p. 24) find that "teachers do better at drawing
reasonable inferences of student levels of understanding from assessment evidence, while having difficulties
in deciding next instructional steps." Millar (2013, p. 55) states that "formative assessment is more effective
when it avoids comparing one learner with another, or with a group norm, but instead focuses on providing
task-related feedback that helps the learner to better understand the relationship between an aspect of their
current performance and the desired performance." Therefore, the quality of formative feedback is essential:
"The premise underlying most of the research conducted … is that good feedback can significantly improve
learning processes and outcomes, if delivered correctly” (Shute, 2008, p. 154).
Formative assessment can occur even after a formal quiz or exam. For example, instructors might write
questions for subsequent quizzes or homework that will adjust for the lack of student understanding of
particular content on a previous exam. Flexibility with a variety of formative assessment techniques, and
allowing students to nd their own paths towards understanding the material, seems both to increase per-
formance by students marginalized by lecture-based teaching (Laursen, Hassi, Kogan, and Weston, 2014)
and to cultivate success in subsequent courses because the students are empowered to learn (Hassi and
Laursen, 2015).
There has, of course, been much research conducted on formative assessment in undergraduate
mathematics education (e.g., Reinholz, 2016), and guides about undergraduate mathematics assessment
have been published (e.g., Gold, Keith, and Marion, 1999). In recent years, technology-enabled formative
assessment has become an integral part of many undergraduate mathematics departments. Adaptive com-
puter systems such as Assessment and Learning in Knowledge Spaces (ALEKS, McGraw-Hill Education),
which is used as a placement mechanism in some higher education institutions, utilize “adaptive question-
ing” by periodically reassessing what content a student knows about a particular mathematical topic (Sullins
et al., 2013). In other words, ALEKS utilizes student answers to pose subsequent questions, thereby using
previous answers in a formative way.
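To make "adaptive questioning" concrete, here is a toy sketch in Python (our illustration, not ALEKS's actual algorithm; the topic names and the simplification that every subset of topics is a feasible knowledge state are assumptions we introduce). Each answer eliminates the candidate knowledge states inconsistent with it, and the next question targets the topic that best splits the remaining candidates.

    from itertools import combinations

    # Toy model: a "knowledge state" is the set of topics a student has
    # mastered.  Here every subset of topics is a candidate state; real
    # systems restrict attention to pedagogically feasible states.
    TOPICS = ["fractions", "linear_equations", "factoring", "quadratics"]
    candidate_states = [frozenset(c) for r in range(len(TOPICS) + 1)
                        for c in combinations(TOPICS, r)]

    def pick_topic(states):
        """Choose the topic whose question most evenly splits the
        remaining candidate states, so each answer is informative."""
        def balance(topic):
            known = sum(1 for s in states if topic in s)
            return abs(2 * known - len(states))  # 0 = perfect 50/50 split
        return min(TOPICS, key=balance)

    def update(states, topic, answered_correctly):
        """Keep only the states consistent with the student's answer."""
        if answered_correctly:
            return [s for s in states if topic in s]
        return [s for s in states if topic not in s]

    # Example: the student answers the chosen question correctly.
    topic = pick_topic(candidate_states)
    candidate_states = update(candidate_states, topic, answered_correctly=True)
    print(topic, len(candidate_states))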
Discussion on the delivery of formative assessment and feedback accompanies each of the three vignettes
below.
Vignette 1
An instructor, Dr. Doe, is not sure her students understand concepts already covered in her precalculus and trigonometry course. During the next class period she gives a quiz that includes questions related to those concepts. After grading the quiz, she concludes that students did not perform well on questions related to trigonometric identities. Thus, she spends the first twenty minutes of her subsequent class engaging students with identities before moving on to a new concept.
Discussion: e assessment, a quiz, was formative in that Dr. Doe adjusted her next class period to
address student diculties by focusing more time on identities. Dr. Doe also performed this adjustment
for students’ growth, knowing that perhaps she could inuence their understanding of the topic. Dr. Doe
exhibited two of the ve key strategies of Black and Wiliam (2009), eliciting evidence of student understand-
ing from a learning task using the quiz and providing feedback that moves learners forward by explicitly
addressing the diculty. However, one could question the depth of both the evidence of student under-
standing and the action taken to provide feedback. In this vignette no detail is provided regarding the rest of
the class periods. Dr. Doe could have purposefully assigned more trigonometric identity problems in future
tasks or exams. If so, this would be an example of interleaving, where tasks are spaced and repeated during
the course with the goal of retaining the concept longer (Roediger and Pyc, 2012).
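As one concrete instance, hypothetical rather than drawn from the vignette, an identity task that could be interleaved into later quizzes is:

    Verify that
    \[
      \frac{\sin\theta}{1-\cos\theta} = \frac{1+\cos\theta}{\sin\theta}
    \]
    wherever both sides are defined, by multiplying the left-hand side by
    \((1+\cos\theta)/(1+\cos\theta)\) and applying \(\sin^2\theta + \cos^2\theta = 1\).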
Vignette 2
Dr. Smith grades the rst test in her course. She decides that the students need to reect on their own perfor-
mance on the exam. erefore, the students are given an optional assignment in which they redo their test,
correcting any mistakes, and write a one-page paper about their problem-solving process, including what
they must do to improve their performance for the next test. Dr. Smith attaches a rubric detailing how she
will grade the one-page paper.
Discussion: is example shows how a traditionally summative assessment, an exam, can be used as a
formative assessment. Dr. Smith took an exam and parlayed it into a potential learning experience for the
students. She achieved three key strategies by sharing criteria for success (by providing a rubric), eliciting
evidence of student learning (by grading the assignment), and activating students as owners of their learning
(by asking them to reect on their mistakes and how to improve). e last strategy of the assignment is a
meta-cognitive approach—that is, instead of focusing on the problems and solutions, the students are asked
to improve their own problem-solving process.
Vignette 3
Dr. Blue teaches calculus using student-centered pedagogical techniques. On this particular day, students come into class, form into pre-determined groups of three, and work on definite integral problems for fifteen minutes. After the fifteen minutes, one group demonstrates a solution to the rest of the class. Dr. Blue asks the class if they have any questions, and one student poses a general conjecture (unknown to the students) about integrals to infinity. At a pedagogical crossroads, Dr. Blue chooses to field the question by allowing the groups to discuss their answers to the student's question for five minutes. After this, another group shares its answers, and Dr. Blue ends the class with her thoughts about improper integrals.
Discussion: Improvisational teaching may occur when students pose questions or thoughts that are unanticipated by the instructor. This scenario has a student who is mathematically curious about a certain topic, which can be beneficial to both the student and the class (Knuth, 2002). In a student-centered classroom where the instructor is regularly soliciting student reasoning, formative assessment is used almost constantly. Dr. Blue used two key strategies. First, she engineered classroom discussions to elicit evidence of student understanding by giving them time to work on the definite integrals. Second, she activated students as instructional resources for one another by asking students to present their solutions and also by giving them time to think about and answer the student's question. The instructor could have answered the question by either deferring it to the respective section in the book or answering it herself. However, Dr. Blue chose to take class time to allow the students in groups to conjecture answers, thus passing the ownership of knowledge to them. The pedagogical choice she made after the assessment task could be considered the key strategy: activating students as the owners of their own learning.
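For concreteness, the vignette does not record the student's conjecture, but a standard contrast of the kind such a group discussion often surfaces is

    \[
      \int_{1}^{\infty} \frac{dx}{x^{2}} = \lim_{b\to\infty}\Bigl(1 - \frac{1}{b}\Bigr) = 1,
      \qquad\text{whereas}\qquad
      \int_{1}^{\infty} \frac{dx}{x} = \lim_{b\to\infty} \ln b \quad\text{diverges.}
    \]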
Summary discussion
Each of the three vignettes utilizes formative assessment and feedback at different depths and with different foci. In the first vignette, Dr. Doe's formative feedback refocuses the class on trigonometric identities. While this may indicate that the instructor is adjusting her teaching and valuing students' understanding, another discussion or lecture may not deepen student learning as much as having students work trigonometric identity problems in class or create their own trigonometric identity problems to present in class. The second vignette focuses on problem solving instead of topics, and the third focuses on ownership of generating mathematics. In all cases, one item is universal: the instructor adjusts her teaching in an effort to improve her students' understanding of mathematics.
One aspect to consider when thinking about formative assessment is its negation: what does a classroom without formative assessment look like? The instructor would have a predetermined plan, for both daily activities and assessments, and would never adjust that plan. The instructor would not alter future courses with the knowledge and feedback from the previous course. While this extreme case may not be common, there is no doubt that the amount of adjustment, the depth of assessment, and the level of feedback all vary greatly between instructors.
Therefore, a prevailing philosophical notion seems to underlie formative assessment. The teacher's belief about what mathematics is, and how it should be presented or taught, is visible in each action in the classroom. Formative assessment may reflect the view that mathematics can be approached with a growth mindset (Dweck, 2008) instead of a fixed one—that is, students can be arbiters of their own study of mathematics instead of "never being able" to learn it. In vignette 2, Dr. Smith values the problem-solving process, perhaps because she philosophically believes that mathematics is more about process than product. In vignette 3, Dr. Blue allows conjecturing to take place in the classroom, valuing unknown situations over known quantities.
Finally, formative assessment will be different depending on the environment in the classroom. However, some of the key strategies of Black and Wiliam (2009) may not work as well in a traditional lecture-based course. For example, "engineering effective classroom discussion to gain evidence of student understanding" and "activating students as instructional resources for one another" may happen more naturally in a student-centered course, where most of the time students are drivers of content during class. Formative assessment can only be as effective as the instructor values it and as the students embrace it.
Practical tips
Formative assessment is about soliciting evidence to adjust teaching with the intention that student learning improves.
Summative assessments (as discussed in the next section) can be made formative and may help students learn to value reflection.
Think about the formative feedback that you communicate. This reflects your beliefs about what is important in mathematics.
Prompt, specific feedback on students' strengths and weaknesses is crucial to helping students understand how they can improve.
Create formative assessments that align with Steen's principles.
AP.2.2. Formative assessments to improve mathematical practices
How can formative assessment be used when trying to understand the emotional and behavioral components of student learning? The general behaviors that students bring to mathematical work are often referred to as their mathematical practices (CCSSM, 2014). We next discuss an example of how formative assessment can be used to influence and develop these practices.
Vignette
To assist students in developing the habit of making sense of problems, i.e., of "fiddling" with a problem to gain insight, students are introduced to a game from Vandervelde (2010) called Last One Standing. Here are the rules, as played with four people:
1. The players are all sitting in a row, oriented so that one person is in the back and another is in the front. The person in the rear can see the others, while the person in front cannot see anyone else.
2. All individuals should be seated initially.
3. The game ends when the last person in line is standing while all others are seated.
4. The person in front can stand or sit at will.
5. The other three participants can move according to the following rule: a person may change state (by standing up or sitting down) if the person immediately in front of them is standing, while all others in front of them are seated. Otherwise they are locked in position and may not move. The status of people behind them does not matter.
Students are asked to gather in groups and play this game several times to ensure they understand the
rules. Two volunteer groups play the game to ensure the whole class agrees on the fewest steps it takes to
complete the game with four people.
The class is then told (possibly during the following class period) the fewest steps for a game with 20 people playing and asked to discuss the fewest steps for a game with 21 people playing. The students are posed the question, "What questions need to be asked and answered in order to tackle this problem?" This creates strategic confusion among the students, leading to productive discussion and an opportunity for formative feedback.
Discussion: In practice, students are generally engaged in trying to understand the growth pattern of
the number of moves. Some groups become convinced it should be linear and are puzzled when it doesn’t
work with the small numbers that they try, going back to two people, then three, then five. Some groups see
growth patterns they are not able to articulate algebraically, while other groups attempt to write equations
in recursive or closed forms.
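Instructors who want to check the move counts that groups report can compute them by brute force. The following sketch in Python (our illustration; the state encoding and function name are ours) finds the fewest moves by breadth-first search over the legal moves as stated in the rules:

    from collections import deque

    def min_moves(n):
        """Fewest moves in Last One Standing with n players, found by
        breadth-first search.  A state is a tuple of booleans; index 0
        is the front of the line, and True means standing.  Everyone
        starts seated; the goal is the rear player standing alone."""
        start = (False,) * n
        goal = (False,) * (n - 1) + (True,)
        dist = {start: 0}
        queue = deque([start])
        while queue:
            state = queue.popleft()
            if state == goal:
                return dist[state]
            for i in range(n):
                # The front player may always move; player i > 0 may move
                # only if player i-1 stands and everyone before i-1 sits.
                if i == 0 or (state[i - 1] and not any(state[:i - 1])):
                    nxt = state[:i] + (not state[i],) + state[i + 1:]
                    if nxt not in dist:
                        dist[nxt] = dist[state] + 1
                        queue.append(nxt)

    print([min_moves(n) for n in range(2, 8)])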
The goal of this task is not necessarily to solve the problem, but rather to help students develop good mathematical habits of making sense of problems and being persistent. For students who have previously engaged in activities only superficially, the task provides a new opportunity for them to dive in and try to identify mathematical patterns, allowing greater freedom and creativity in the problem-solving process.
An important caveat is that the feedback loop from this sort of assignment must take into account that some students, particularly those facing stereotype threat or language barriers, may feel intimidated by this non-traditional activity. As Steen suggests, ensuring assessment is cyclic is vital to evaluating the effectiveness of these activities.
The activity situates the class to appreciate the value of playing with examples in an effort to solve a problem. It also helps students move from recognizing a numerical pattern to expressing it algebraically, and finally to discussing the value of different representations.
Practical tips
When providing feedback to students, remain process-oriented and question-based:
º What is your motivation for trying that?
º Have you tried working through the simplest cases?
º What does the rest of your group think about this idea?
Focusing feedback on the mathematical process keeps the emphasis on mathematical behaviors and
away from students’ emotional responses to challenging problems.
Interactions with students regarding mathematical practices do not need to be as formal as the previous vignette suggests. Students share many unplanned thoughts during class time, some of which lead to explicit demonstrations of better mathematical practices. For example, when discussing in first-year calculus whether a series converges or diverges, a student might ask the question, "Can you have a series of entries that are themselves series?" At that moment instructors have several options. They may choose to directly answer the question with a mathematically correct response, or they may tell the student that this is not covered in the material for this course, but that they will see it if they take more courses. A third option is to change the focus of the class from the topic of series to a holistic discussion of mathematical practice, pointing out how this student's question is a great example of mathematical thought. This illustrates Black and Wiliam's (2009) first strategy.
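For concreteness, one affirmative answer to the student's question (our illustrative example, not part of the vignette) is a convergent series whose terms are themselves sums of convergent series:

    \[
      \sum_{n=1}^{\infty} a_n
      \quad\text{with}\quad
      a_n = \sum_{k=1}^{\infty} \frac{1}{n^{2}k^{2}} = \frac{\pi^{2}}{6n^{2}},
      \qquad\text{so that}\qquad
      \sum_{n=1}^{\infty} a_n = \Bigl(\frac{\pi^{2}}{6}\Bigr)^{2}.
    \]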
AP.2.3 Formative assessment to influence students' beliefs and motivations
Formative assessment can also be used to influence student beliefs and motivation, as the following vignette suggests.
Vignette
On the rst day of a mathematical content course for elementary education majors, Professor Somerville
asks the following questions: What is this class about? What more is there to learn? Dont you already know
all the mathematics that you will be teaching in elementary school? Arent you going to take a methods
course anyway? ese questions appear to suggest that not much mathematics is required to teach ele-
mentary students. ey also imply that computational prociency and general teaching skills are sucient
for good mathematics instruction in elementary school. Professor Somerville then denes mathematical
knowledge for teaching (Ball, Hill, and Bass, 2005) and launches an activity.
The students are asked to compute 35 × 25 by themselves and to compare answers with their neighbor. They are asked to compare how they solved the problem. Generally everyone uses the standard algorithm, and they obtain the correct answer.
The instructor explains that the ability to carry out this computation is an example of a basic skill that they possess. Then they are shown the slide reproduced in Figure 1.
They are asked to identify the method each student used, to determine why the method works, and to determine if each student's method will always work. Most students, after intense group work, can identify the first two students' reasoning. Groups are asked to volunteer their explanations and the class discusses their comments. The last student's work eludes all but one or two students, who still have a difficult time articulating the actual thought process and explaining it to their peers.
In the wrap-up aer the class discussion, Professor Somerville explains to the students that possessing
basic skills allows them to determine if a nal answer is correct, but mathematical content knowledge allows
them to answer the type of questions posed with this activity.
Figure 1. Computations using three methods (Hill, Rowan, and Ball, 2005).
Discussion: e purpose of this task is to
Change the students’ mindsets about the usefulness of the course, which is oen regarded as covering
content that is too in-depth.
Explain to students the dierence between knowing mathematics and knowing the particular mathe-
matical content knowledge necessary for teaching.
In general, this activity unsettles the students, who often express concern about their ability to perform this analysis in real time. The final discussion allows the instructor to frame the purpose of the course and to establish a reference point adopted throughout the course: the distinction between "How to …" and "Why …."
AP.3. Summative assessment
As discussed in the rst section of this chapter, summative assessment is conducted with the purpose of
evaluating student prociency with regard to one or more measurable learning outcomes. In this section
we discuss various ways to structure course assessments and eective methods for designing summative
assessments.
AP.3.1 Assigning course grades
Traditionally, mathematics grades are determined by using a points-based system—that is, by assigning points to each assessment item and computing an average to produce a course grade. This is a reasonable approach as long as the average is created in a thoughtful manner that reflects the student learning outcomes for the course.
Vignette
Professor Jackson identied nine student learning outcomes for a rst-semester course primarily designed
to serve future engineers. Six of these learning outcomes were content-based outcomes and three related to
student mathematical practices. In order to emphasize the seriousness of the practice outcomes, Professor
Jackson realized that assessments needed to extend beyond problem sets and exams. To align the overall
course grade scheme with the student learning outcomes, Professor Jackson structured the weights for each
assessment item as follows:
Three Exams: 45% of course grade
Quizzes: 15% of course grade
Written Assignments: 5% of course grade
Online Homework (repeated attempts allowed on problems): 15% of course grade
Three Reflective Essays: 15% of course grade
Participation: 5% of course grade
Discussion: Professor Jackson decided to allocate 65% of the course grade to summative assessment via exams, quizzes, and written assignments; 15% of the course grade to formative assessment via online homework; and 20% of the course grade to participation and essays that reflect the practice outcomes. Because the first semester of calculus establishes a foundation for multiple semesters of study, Professor Jackson felt it was important to have summative assessments encompass a significant portion of the course grade. Professor Jackson also felt it was important to have a non-trivial percentage of the grade reflect persistence, effort, and engagement in the course. By distributing grades across a range of assignments that were aligned with the course learning outcomes, Professor Jackson was able to communicate clearly to students through the grading scheme that all of these outcomes are critical for mathematical learning.
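A points-based scheme of this kind amounts to a weighted average, as the following sketch in Python makes explicit (our illustration; the category scores are hypothetical, while the weights are those listed above):

    # Weighted-average course grade for the scheme in the vignette.
    # Category scores (0-100) are hypothetical; weights must sum to 1.
    WEIGHTS = {
        "exams": 0.45,
        "quizzes": 0.15,
        "written_assignments": 0.05,
        "online_homework": 0.15,
        "reflective_essays": 0.15,
        "participation": 0.05,
    }

    def course_grade(scores):
        """Return the weighted course average for per-category scores."""
        assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
        return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

    scores = {"exams": 78, "quizzes": 85, "written_assignments": 90,
              "online_homework": 95, "reflective_essays": 88,
              "participation": 100}
    print(round(course_grade(scores), 1))  # -> 84.8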
While points-based systems are reasonable, using other course grading schemes can positively impact
student attitudes and perceptions about their learning. One example of this is the student course portfolio,
which is a common means of assessment in humanities and art courses. Portfolios can serve as the single
source of a course grade, given that they represent the work of a student throughout a course, but they can
also serve as one component of a course grade. While there are some documented examples of the use of
portfolios for undergraduate mathematics courses (Burks, 2010; MAA, 1999), the use of portfolio grading
has been more common in K–12 mathematics courses than at the postsecondary level.
Vignette
In a junior-level course on mathematical problem solving that primarily serves preservice teachers, Professor Klein assigns the final course grade with 10% based on attendance and participation and 90% on student portfolios. The professor collects the portfolios four times during the semester and offers opportunities for students to revise and resubmit quizzes and problem sets. The grading scheme for the portfolio is described in the syllabus as follows.
1. Journal of strategies and methods: Students are expected to maintain a list of strategies and mathematical methods (e.g., induction, contradiction, etc.) that are discussed during class. This is meant to be a typed encyclopedia, without any reflective component. The journal portion of the portfolio is worth 5% of the portfolio grade, and is assessed based on completeness.
2. Quizzes: Quizzes take place approximately once per week in this course and are announced in advance. Each quiz consists of one exercise or problem and lasts 25 minutes. After Professor Klein grades and returns all quizzes, the students add them to their course portfolios. There are a limited number of revision opportunities for quizzes. The total quiz score is worth 35% of the portfolio grade.
3. Problem Solutions: The problems section of the portfolio has two subsections, one for solutions to problems discussed during in-class small group work and one for solutions to homework problems. Homework problems are graded after initial submission. Select homework problems are eligible for revision and re-grading as a part of the portfolio. Each problem solution is contained on a separate page in the portfolio. The problem score is worth 40% of the portfolio grade.
4. Reections or cover letters: Each time the portfolio is collected, students include a cover letter detailing
their reections on their work to date, their progress toward the student learning outcomes, and specic
items in the portfolio that illustrate the key points of their reection. e reective essay score is worth
20% of the portfolio grade.
Discussion: e two key dierences between Professor Kleins portfolio-based grading system and many
traditional points-based systems are (a) students have opportunities to revise and resubmit selected quizzes
and problem sets, and (b) students are expected to collect and review their work periodically throughout
Assessment Practices 61
the course and write cover letters reecting on their learning. By focusing students’ attention on their work
through the lens of the course portfolio, Professor Klein can inspire a more holistic perspective on growth
and achievement compared to other assessments.
Other alternatives to traditional points-based systems and portfolios can be found in variants of mastery-based grading, such as standards-based grading and specifications grading. In a generic mastery-based grading system, an extensive itemized list of learning outcomes for the course is provided. Students are then expected to demonstrate mastery of each individual learning outcome, and the number of learning outcomes for which mastery has been achieved determines the course grade. There are subtle and influential differences across various implementations of this type of grading system. A comprehensive guide to implementing mastery-based systems is beyond the scope of this guide, but there are many resources that serve this purpose. For example, Pengelley (2017) describes how one can incorporate entirely qualitative rubric-based grading, resulting in only letter grades rather than point-based grades for all assignments. Nilson (2014) provides a comprehensive guide for instructors regarding specifications grading, though it is not specific to mathematics. The use of standards-based grading in postsecondary mathematics courses has been discussed in a variety of articles and blog posts (e.g., Brilleslyper et al., 2011; Owens, 2015).
Practical tips
When constructing a course grading system, be mindful of constraints such as available instructor
time, number of students, and availability of teaching assistants. It is usually better to implement a
lower-impact grading system with high quality than to poorly implement a more ambitious grading
system.
It is critical that students understand how the grading system for their course works. For courses that incorporate specifications or standards-based grading, a significant amount of time and energy must be spent early on getting students to understand and buy into the system.
For courses that use a portfolio as a large component of the overall grade, it is critical that the portfolio be collected multiple times during the semester so that students understand how the portfolio is assessed. This also supports Steen's principle of assessment as a continuous cycle.
AP.3.2. Exemplary summative assessments
As previously mentioned, summative assessments are evaluations of student mastery of topics directly related to student learning outcomes, occurring at the end of an instructional unit. These can include evaluations of procedural fluency, of conceptual understanding (a topic of discussion in a separate section of this chapter as well as in the Classroom Practices chapter), of written or oral communication in mathematics, of problem-solving strategies, or of other elements of mathematical proficiency. Familiar examples of summative assessments include exams, performance tasks, projects, and portfolios. There are several aspects of effective summative assessments that are not immediately apparent, as we shall discuss in this section.
A key element of summative assessment often overlooked is the effect of prior mathematical experiences on student learning. To account for this effect, it is helpful to conduct summative assessments prior to teaching. This pre-assessment of student knowledge should not be counted toward student course grades, making these pre-assessments formative in some sense. However, the purpose of these early assessments is specifically to evaluate student learning, since at the beginning of any unit of study certain students are likely to have already learned some of the skills that the instructor is about to introduce, others may already understand key concepts, and others still may be deficient in prerequisite skills or have misconceptions. Equipped with diagnostic information from pre-assessments, an instructor gains greater insight into what to teach by knowing what skill gaps to fill, by initiating activities based on preferred learning styles, and by connecting the content to students' interests. Teachers can use a variety of practical pre-assessment strategies, including pre-tests of content knowledge, skills checks, and concept maps. In addition, powerful pre-assessments have the potential to address a worrisome phenomenon reported in a growing body of literature (Bransford, Brown, and Cocking, 1999; Gardner, 1991): a sizeable number of students come into mathematics courses with misconceptions about both the subject matter and themselves as learners. If teachers do not identify and confront these misconceptions, they will persist even in the face of good teaching.
Assuming that eective pre-assessments and formative assessments have been implemented, our next
goal is to describe characteristics of exemplary summative assessments. Exemplary assessments, whether
classied as formative or summative, are meaningful, motivational, engaging, and should guide the student
in the learning process (Huba and Freed, 1999; Walvoord and Anderson, 1998), but most importantly they
should be in line with the course learning outcomes. Huba and Freed identied eight characteristics of ex-
emplary assessments, and many of these reect Steens principles of assessment. e characteristics are
Authentic — reect real life experiences
Challenging — stimulates the learner to apply knowledge
Coherent — serves as a guide for the student to achieve the learning goal
Engaging — attracts the learner’s interest
Respectful — sensitive to the individual learner’s beliefs and values
Responsive — includes a feedback mechanism to assist the student in the learning process
Rigorous — requires applied understanding of learning to achieve a successful outcome
Valid — provides information that is useful to meet the intended learning outcomes
e vignette below exemplies the use of exemplary assessments, but the reader should recognize that an
assessment can be exemplary without satisfying all of the above criteria.
Vignette
In an online college algebra course, Professor Agnesi has exclusively used multiple-choice questions on computer-based exams taken in multiple proctored environments physically separated from each other. The professor is concerned that while the exam problems are rigorous and challenging, the exams are not in overall alignment with the learning outcomes for the students. With Steen's fourth principle (multiple forms of assessment) in mind, and in an effort to increase the coherence of the exams with the learning outcomes, to increase the level of engagement of the problems, and to increase the level of feedback that students can attain, Professor Agnesi decides to implement the following guidelines for all exams in the course.
1. Create questions in multiple formats so that guessing is minimized.
2. Use an online homework system or dedicated exam software to administer exams, with the capability of automated answer checking for problems that are not multiple choice.
3. Randomize the numerical values in the problems given to students.
4. Allow two attempts on each problem where students enter either a function or a number as their answer.
5. Insert at least one higher-order thinking question in short answer or essay form.
Discussion: By considering each aspect of exemplary assessment, Professor Agnesi was able to implement changes that increased the quality of summative assessments given to the students. While some of the qualities of exemplary assessments were not addressed, these can be returned to at a later time, after the current changes have been streamlined and effectively implemented. Professor Agnesi made a conscious choice to focus on only a small number of these items for each round of modifications to summative assessment structures, with a long-term plan for making additional changes.
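Guideline 3 is straightforward to automate. The following sketch in Python (our illustration; the problem template, parameter ranges, and seeding convention are hypothetical) generates a per-student variant of a problem along with the answer key needed for automated checking:

    import random

    def make_variant(seed):
        """Generate one randomized variant of a slope problem together
        with its answer key.  The template and parameter ranges are
        illustrative, not taken from the guide."""
        rng = random.Random(seed)  # e.g., seed by student ID for reproducibility
        x1, y1 = rng.randint(-5, 5), rng.randint(-5, 5)
        dx, dy = rng.randint(1, 5), rng.randint(-6, 6)
        prompt = (f"Find the slope of the line through "
                  f"({x1}, {y1}) and ({x1 + dx}, {y1 + dy}).")
        return prompt, dy / dx

    prompt, answer = make_variant(seed=20180001)  # hypothetical student ID
    print(prompt)
    print("answer key:", answer)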
AP.3.3. Creating and selecting problems for summative assessment
The creation and selection of problems effective for summative assessment is a challenge that shares many qualities with designing and selecting appropriate mathematical tasks for students in the classroom. It is therefore recommended that this section be read in parallel with the section in the Classroom Practices chapter on selecting appropriate mathematical tasks.
The key to creating and selecting problems for summative assessment is to have a clear sense of what the problem is intended to assess and what it actually assesses. While it is not possible to know with certainty what a problem truly assesses, it is possible for teachers to evaluate tasks informally using review frameworks to check that the intended assessments are reasonably aligned with the assessment items. A well-known framework for analyzing problems is Bloom's taxonomy and its variants (Anderson et al., 2001; Bloom et al., 1956). Bloom's original work outlines multiple levels of skills in the cognitive domain of learning, increasing from simple to complex. These are described by six skill levels: knowledge, comprehension, application, analysis, synthesis, and evaluation. This work has been extended by researchers in educational psychology to more robust frameworks. For example, Anderson et al. (2001) introduce a two-dimensional extension of Bloom's taxonomy, pictured in the table below. The first dimension is a cognitive process dimension (remember, understand, apply, analyze, evaluate, create) similar to Bloom's taxonomy, while the second dimension is a knowledge dimension (factual knowledge, conceptual knowledge, procedural knowledge, and metacognitive knowledge). When evaluating a task using this taxonomy, the cognitive process is represented by the verb used in specifying the task (what the student is doing) and the knowledge dimension corresponds to the noun (what kind of knowledge the student is working with). Examples of this extended taxonomy can be found in a special issue of the journal Theory Into Practice (Anderson, 2002), where specific applications to assessment issues are discussed (Airasian and Miranda, 2002).
The taxonomy table is a 4 × 6 grid. Its columns give the cognitive process dimension (Remember, Understand, Apply, Analyze, Evaluate, Create) and its rows give the knowledge dimension (Factual, Conceptual, Procedural, Metacognitive); each assessment task is placed in the cell matching the kind of knowledge it draws on and the cognitive process it demands.
Knowledge Dimension
1. Factual knowledge: The basic elements that students must know to be acquainted with a discipline or solve problems in it
a) Knowledge of terminology
b) Knowledge of specic details and elements
2. Conceptual knowledge: The interrelationships among the basic elements within a larger structure that enable them to function together
a) Knowledge of classications and categories
b) Knowledge of principles and generalizations
c) Knowledge of theories, models, and structures
3. Procedural knowledge: How to do something; methods of inquiry; and criteria for using skills, algorithms, techniques, and methods
a) Knowledge of subject-specic skills and algorithms
b) Knowledge of subject-specic techniques and methods
c) Knowledge of criteria for determining when to use appropriate procedures
4. Metacognitive knowledge: Knowledge of cognition in general as well as awareness and knowledge of one's own cognition
a) Strategic knowledge
b) Knowledge about cognitive tasks, including appropriate contextual and conditional knowledge
c) Self-knowledge
Cognitive Process Dimension
1. Remember: Retrieving relevant knowledge from long-term memory
a) Recognizing
b) Recalling
2. Understand: Determining the meaning of instructional messages, including oral, written, and graphic
communication
a) Interpreting
b) Exemplifying
c) Classifying
d) Summarizing
e) Inferring
f) Comparing
g) Explaining
3. Apply: Carrying out or using a procedure in a given situation
a) Executing
b) Implementing
4. Analyze: Breaking material into its constituent parts and detecting how the parts relate to one another
and to an overall structure or purpose
a) Dierentiating
b) Organizing
c) Attributing
5. Evaluate: Making judgments based on criteria and standards
a) Checking
b) Critiquing
6. Create: Putting elements together to form a novel, coherent whole or make an original product
a) Generating
b) Planning
c) Producing
One productive way to envision robust summative assessment is as a collection of tasks that evaluate students across multiple levels of both the cognitive process and knowledge dimensions of this taxonomy. When creating exams, quizzes, or other assessments, instructors should intentionally create or select tasks that cover a broad spectrum of this framework. Alternatively, after instructors have created an assessment, it can be reviewed to check that it is broadly evaluating students across multiple dimensions. Guided by the descriptions of the framework components, the table above can be used as a primary ingredient of assessment design and evaluation,¹ as demonstrated in the following vignettes.
¹ It is critical to clarify that the evaluation described here is not intended to be a formal psychometric evaluation of items such as those used in educational research. Rather, this taxonomy provides one framework by which teachers can conduct informal evaluations of summative assessments.
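One lightweight way to conduct such a review is to tag each exam item with a (knowledge, process) pair and list the cells of the table that no item touches. The sketch below, in Python, is our illustration; the item tags are assumptions that echo the vignettes that follow:

    from itertools import product

    KNOWLEDGE = ["factual", "conceptual", "procedural", "metacognitive"]
    PROCESS = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

    def uncovered_cells(items):
        """Given exam items tagged with (knowledge, process) pairs,
        return the taxonomy-table cells that no item touches."""
        covered = {(k, p) for _, k, p in items}
        return [cell for cell in product(KNOWLEDGE, PROCESS)
                if cell not in covered]

    exam = [
        ("solve linear system by row reduction", "procedural", "apply"),
        ("dimension of a span",                  "conceptual", "analyze"),
        ("state the definition of eigenvalue",   "factual",    "remember"),
        ("construct a rank-3 matrix",            "procedural", "create"),
    ]
    for cell in uncovered_cells(exam):
        print("no item covers:", cell)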
Vignette 1
Professor Ramirez begins to create a final exam for a linear algebra course using the following two problems:
1. Find all solutions to x + y + z = 2, x - y - z = 3, and 2x - z = 0 by encoding this system in matrix form and row-reducing the associated augmented matrix.
2. Given four vectors x, y, z, and w, suppose that […] forms a linearly dependent set, […] forms a linearly dependent set, and that […] forms a linearly independent set. Is it possible to determine the dimension of the span of {x, y, z, w}? If so, what is the dimension? If not, why not?
Professor Ramirez identies the knowledge dimension of the rst task as procedural and the cognitive pro-
cess dimension as apply because the task requires students to execute a known algorithm in a specically
given situation. e second problem is assigned a type of conceptual/analyze, due to the need for students
to conceptually understand the ideas of independence, span, and dimension, and to analyze the possible cas-
es that can arise in this situation. Professor Ramirez decides to have the next two problems on the exam be
of type factual/remember (asking students to state the denition of an eigenvalue) and procedural/create
(asking students to create an example of a non-row-reduced 5×5 matrix that has rank equal to 3, and explain
why their answer is correct), in order to achieve a broad spectrum of assessment.
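For reference, the procedural/apply task in problem 1 can be verified with a computer algebra system. The following sketch (our illustration, using the SymPy library) row-reduces the augmented matrix and confirms the unique solution:

    from sympy import Matrix, Rational

    # Augmented matrix for x + y + z = 2, x - y - z = 3, 2x - z = 0.
    A = Matrix([
        [1,  1,  1, 2],
        [1, -1, -1, 3],
        [2,  0, -1, 0],
    ])
    rref, pivots = A.rref()
    print(rref)  # reduced row echelon form: identity augmented by the solution
    # The system has the unique solution x = 5/2, y = -11/2, z = 5.
    assert rref[:, 3] == Matrix([Rational(5, 2), Rational(-11, 2), 5])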
Vignette 2
In a graduate-level complex analysis course, Professor Granger has a learning outcome for students to develop an understanding of the fundamental theorem of algebra. The final exam consists of a set of problems to be completed during class and a take-home component. Professor Granger observes that none of the in-class problems are of cognitive dimension evaluate, and none of the in-class problems are of knowledge dimension metacognitive. Thus Professor Granger assigns the following question as the take-home component of the final exam in order to include these two missing dimensional components:
In this course we have seen three proofs of the fundamental theorem of algebra (FTA), which use (a) the maximum modulus principle, (b) Liouville's theorem, and (c) an argument using winding numbers. Sketch the main ideas for each of these three proofs, and for each proof list the topics from this course that are required to complete the proof. Choose the proof that you believe should be the first proof of the FTA that graduate students are exposed to, and write a two-page essay providing an argument supporting your choice. Your argument should explicitly include a consideration of the mathematics involved in the proofs.
Discussion: Professor Ramirez uses the taxonomy table when writing the exam to ensure that students are evaluated on a broad range of knowledge types and cognitive processes. Rather than plan out the entire exam by "filling in the checkboxes," Professor Ramirez begins by writing a pair of problems that are of different flavors and evaluates them using the table. Following this, clusters of gaps in the table provide
inspiration for further problems. On the other hand, Professor Granger writes the complete in-class exam for the graduate students before evaluating the problems. After identifying that one large gap exists on the exam for each dimension, Professor Granger creates a take-home component to address these gaps rather than rewriting the in-class problems. If there had not been a take-home component for the exam, Professor Granger could have decided whether or not to include metacognitive and evaluative items on this assessment. The decision that Professor Granger ultimately made was motivated both by the gaps in the assessment items and by the presence of a student learning outcome that would be well served by a metacognitive and evaluative assessment item.
Practical tips
Summative assessments should be based on student learning outcomes for a course. Use these learning outcomes as a guide to create summative evaluations of students.
When using the revised taxonomy to evaluate assessments, use the keywords in the lists above to identify the knowledge and cognitive process level of each problem.
The two-dimensional framework should not be viewed as prescriptive, but rather as a means by which instructors can evaluate the breadth of their summative assessments.
The two-dimensional framework is one of many methods for evaluating summative assessment items. If other taxonomies or evaluation methods are more effective in a given situation, then they should be freely used instead.
In the next section we present various assessments that further support Steen's principles.
AP.4. Assessments that promote student communication
Communication and teamwork skills are important for mathematics students. Many aspects of verbal communication and collaboration arise as components of classroom practices, and these also fit easily into both formative and summative assessment. Writing assignments can be used for their own sake, to develop students' written communication skills, and to assess student learning outcomes that reflect student behaviors, practices, and beliefs.
In this section we describe three types of assessments that support student communication, namely writing assignments, oral presentations, and group projects. These are all assessment tools drawn from the large collection of activities that constitute "homework." The relationship between homework and assessment is complex because a primary concern of instructors at almost every level is getting students to do the homework. No assessment approach can overcome a lack of effort, but appropriate use of feedback and assessment can raise participation rates in homework and as a result increase student learning.
AP.4.1. Writing assignments
There are many types of writing assignments that can be used in mathematics courses for a wide range of purposes (Bean, 2011; Braun, 2014; Crannell, LaRose, and Ratliff, 2004; Meier and Rishel, 1998; Montgomery and Stufflebeam, 2014). Assignments such as reflective essays, expository essays, critical analyses of texts, biographical essays, and large-scale course projects have all been used successfully by many mathematicians. Via vignettes, we provide two examples of the use of writing assignments: a reflective essay designed for formative assessment of student mathematical practices and a critical analysis of a reading assignment designed to evaluate student writing skills.
Vignette 1
Professor Germain wanted students to spend more time reflecting on their performance in the course, and specifically wanted students to reflect on their progress toward the student learning outcomes for the class.
When Professor Germain asked the students during class how they felt about their progress, the professor learned that only one student was aware of the learning outcomes as stated in the syllabus. For their next homework assignment, Professor Germain assigned the following reflective essay:
Read the learning outcomes in the syllabus. Discuss the progress you have made toward these learning outcomes. How have you been successful in reaching these outcomes? How have you not been successful in reaching these outcomes? Has your development toward these learning outcomes had any impact on your work in other contexts, e.g., other classes, jobs, etc.? Discuss each of these student learning outcomes with mathematical content examples. This assignment should be typed, three full pages, 12-point Times New Roman, double-spaced, with 1-inch margins.
After the students turned in their essays, one of them asked how the essays would be graded. Professor Germain had not thought about this, and decided that any student who had written a complete and thoughtful response to the prompt, free from grammatical and typographical errors, would receive full credit. While the students accepted this, Professor Germain realized that in the future he needed to create and share a grading rubric.
Vignette 2
Professor Robinson is teaching a real analysis class for mathematics majors. The course is structured using a combination of lecture and small group work. For the class days with small group work, students are required to read a textbook section prior to class. Though the book was chosen based on its accessibility, many students have complained that the book is difficult to read. Despite in-class discussions about reading strategies, it is clear that the students struggle to make sense of what they read. In order to dig deeper into the situation, Professor Robinson assigned the following two writing assignments:
Prompt #1: Write a critical review of [specific sections of the textbook]. Imagine that you are writing your review for a journal for undergraduates in mathematics and the sciences. Respond positively to some things and negatively to others; justify your opinions and provide detailed explanations for your claims. Keep in mind that this is a review for a mathematical publication, so you will be graded on both the quality of your writing and on the mathematical depth and mathematical style of the chapter. This essay should be typed, 5–6 pages, double-spaced, in 12-point Times New Roman font. We will complete an in-class peer review of your essay on the day it is due.
Prompt #2: Revise your critical review of the textbook sections based on the feedback you received during peer editing. This essay should be typed, 5–6 pages, double-spaced, in 12-point Times New Roman font. You must turn in both your original version (with comments) and your revised version of this essay.
Discussion: Reective essays are oen most eective when a specic, directive prompt is provided to
students. For example, Professor Germains prompt ensured that (a) students read the learning outcomes,
and (b) students had plenty of questions they could start to answer in response. In receiving a set of specic
questions, the students’ attention is more focused in the direction that Professor Germain intends. Other-
wise, the student responses might have gone in unintended directions.
On the other hand, when asking students to develop a critical analysis of a textbook, website, video, etc.,
it is oen most eective to provide students a context for the review rather than direction. For example,
Professor Robinsons prompt informed students that the audience for their review is other undergraduates
in the sciences and that students must have mathematical content in their essay. However, Robinson did not
direct students’ attention to specic features of the textbooks. is provides students freedom to identify, in
their rst essay, features that were notable to them and then take into account the opinions and feedback of
other students for their revised version. With critical analysis assignments of this type, it is not uncommon
for students to change their opinion as they revise their initial essay and develop their nal dra.
Practical tips
Grading rubrics for written work can be found at most campus teaching and learning centers, but these are not always well suited for mathematical writing. Sample grading rubrics/checklists for mathematical writing, and discussion of related issues, have been developed by various mathematicians (Braun, 2014; Crannell, LaRose, and Ratliff, 2004; Meier and Rishel, 1998).
For long papers with substantial mathematical content, it can be useful for students to learn LaTeX, a typesetting system. However, for many writing assignments in mathematics courses, standard document preparation software will suffice.
As shown in the writing prompts above, many questions from students can be avoided by providing
clear instructions.
Many students do not like peer reviews because they feel they do not receive rich feedback. By mentioning this common problem at the beginning of the peer review time, and by requiring students to use the grading rubric when doing their reviews, the quality of the reviews can often be improved.
When revising work, students should be encouraged to go beyond mere copyediting. Students often find the experience of going through a substantial revision process to be both difficult and rewarding. Revision of particular assignments is a good example of Steen's first principle, that assessment is a cycle. In addition to the local example of a cycle within the particular assignment, students should be encouraged to see how practice in writing creates long-term alignment with course learning goals.
AP.4.2. Oral presentations
Student presentations can take many forms. In the simplest situation, students can be asked to come to the board or document camera to provide a short explanation of a solution to a problem. Often in courses that use inquiry-based learning (as discussed in the Classroom Practices chapter), students work in small groups and give presentations about their progress, challenges they encountered and resolved, or issues that remain unresolved. In these situations where students present somewhat informally, the presentations serve as formative assessment. For example, in these cases students may be expected to present for participation grades, but not to have the quality of their presentation evaluated. At the other end of the spectrum, students may prepare a presentation that is 10–15 minutes in length on a topic in the course, subject to summative assessment using a rubric. These ideas are highlighted in the following vignette.
Vignette
Professor Bassi is teaching a course in Euclidean and hyperbolic geometry. The students in this course are mathematics majors or minors, and half of them are prospective 7–12 mathematics teachers. Professor Bassi decides that after the class has collectively covered some of the fundamental content in the course, students will take turns giving 10-minute presentations about topics of their choice. Students will be graded based on five criteria:
1. Clarity of verbal communication
2. Presentation structure
3. Effective use of chalkboard/whiteboard, slides, or instructional technology
4. Mathematical depth
5. Mathematical style
Professor Bassi provides students with a list of suggested topics along with a rubric.
Discussion: Students asked Professor Bassi for clarification regarding "depth" and "style" in their mathematics. This led to a class discussion regarding the difference between deep mathematics presented poorly and simple mathematics presented clearly. Students had not considered the idea that a clear presentation could be viewed as inadequate. Several class discussions were needed to clarify the balance between the depth of the mathematics at hand and the quality of the style of presentation, e.g., the use of sufficiently complicated motivating examples, reasonable "sketches" of proofs rather than line-by-line details, etc.
Practical tips
The rubric should include components for the presentation and for the quality of the mathematics under discussion. Because students are often not familiar with how presentations in a mathematics course will be graded, it is important to provide a grading rubric in advance.
Students often ask for feedback on their materials prior to the presentation. Thus, it is useful to provide a deadline for when such feedback requests must be submitted.
It is important that students begin preparing their presentations early, so that they can ask for help if they have difficulty with unfamiliar content.
AP.4.3. Group projects
Group projects are excellent assessments when instructors want to monitor student performance on tasks not suited for a timed exam, to promote active student interaction with classmates, and to provide students with experience working with non-routine problems. Good problems for group work often have the characteristic that they encompass content from recent class meetings as well as knowledge of other concepts addressed earlier or in previous courses. Such problems typically are not appropriate for a timed exam or quiz, because they frequently involve significant conceptual problem-solving skills that draw more broadly on the students' mathematical backgrounds. This type of assignment can have a positive impact on students, because it requires a more authentic engagement with mathematics than straightforward computational exercises. It also requires students to persist through the process of making sense of the problem and attempting multiple solution strategies, as suggested by Black and Wiliam (2009). We illustrate these notions in the following vignette.
Classroom vignette: Group projects
Professor Herschel wants students in calculus to gain experience working together to solve problems requiring more than the application of a specific technique illustrated by an example in the textbook. Herschel assigns four group projects using problems such as, "Show how to find the minimum value of the area of the region under the curve y = x + 1/x from x = a to x = a + 1.5 for all a > 0."
Professor Herschel assigns students to groups of four and provides the following directions and grading structure for the group projects during the course.
This problem is to be completed in your group; each group member will receive the same grade. There will be four problems during the semester. Each group member is responsible for writing the final report for at least one of the four assignments. The report will have three parts, defined and graded as described below. If a part is missing it will receive zero points; based on this rubric, a correct solution alone is worth at most four out of ten points!
Part 1. (3 points) Describe the problem, including challenges you encountered. Do not just restate the problem. Instead, show that you understand what you are trying to do, but do not show how you solved or attempted to solve the problem. That comes later. This will require a paragraph—that is, more than a sentence and less than a page.
Part 2. (3 points) Describe how your group worked on the problem. Describe what you tried, including both things that worked and false starts. This is a description of your strategies, not a point-by-point listing of every thought that occurred to you. This will also take more than a sentence or two, but less than a page.
Part 3. (4 points) Provide your solution to the problem. This should be accurate, carefully written, and concise.
In each of your responses, follow these guidelines:
Use complete sentences, even if this is something you have not done in the past when writing a
solution to a mathematics problem.
Work on these problems only with members of your group. Everyone is expected to contribute by
working hard and discussing their thoughts. You must include this statement at the end of your
report along with signatures:
We have neither requested nor received any help from individuals outside our group on this
problem. Each of us has contributed to the work on this problem, and we will not allow anyone
to sign this paper who has not contributed in some way.
Discuss any problems about your group work with your instructor.
Provide complete explanations of how you found your answers. An answer without support will
receive little or no credit.
Discussion: Instructors should be cautious about leading students too much on such tasks. One can gain considerable insight into students' mathematical reasoning by judiciously providing hints and indications about whether or not students are on the right track before assignments are due. In class, different groups can present their partial or complete solutions so that the entire class has a chance to practice mathematical discourse in a situation where there is no external confirmation of the correctness of any solutions. The instructor will also gain insight into student reasoning by listening and questioning during this student-centered classroom discussion. Students will learn how to problem-solve in the absence of a solution template or final answer, a valuable insight into the nature of mathematics.
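Instructors previewing Herschel's sample problem can verify the answer symbolically. The sketch below (our illustration, using the SymPy library and the integrand y = x + 1/x from the problem statement) minimizes the area function A(a) for a > 0:

    from sympy import symbols, integrate, diff, solve, Rational, simplify

    a, x = symbols("a x", positive=True)

    # Area under y = x + 1/x from x = a to x = a + 3/2, as a function of a.
    A = integrate(x + 1/x, (x, a, a + Rational(3, 2)))

    critical = solve(diff(A, a), a)          # solve A'(a) = 0 with a > 0
    print(critical)                          # -> [1/2]
    assert diff(A, a, 2).subs(a, critical[0]) > 0   # second derivative test
    print(simplify(A.subs(a, critical[0])))  # minimum area: 15/8 + 2*log(2)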
Practical tips
When designing group projects it is important to emphasize to students the reason for requiring them to work collaboratively. Many students are resistant to having their grade depend on others. An instructor may increase student buy-in and enthusiasm by explaining that group work reflects the reality of many job expectations in business, industry, and government, and also by explicitly connecting group work with student learning outcomes listed in the course syllabus.
If a particular group appears dysfunctional, be willing to revise the group assignments mid-course. However, make every effort to avoid doing this, so as to maintain coherence in how the groups are handled for all students.
Some instructors allow students to choose their own groups, while other instructors assign groups. Either approach can be effective, so instructors should make this decision based on their own preference and their understanding of their students.
If groups are allowed to switch members throughout the course, the responsibility of serving as the "lead writer" for each group should be evenly distributed among the students throughout the course. The absence of a policy for distributing the workload can cause discord and tension among members of a group.
The number of group projects can be adjusted for various courses. As with most changes to teaching,
it is a good idea to start small. You can always incorporate more group projects as you gain experience
and confidence.
AP.5. Conceptual understanding: What do my students really know?
It is common for students at all levels of mathematics to be able to correctly complete an arithmetic or sym-
bolic procedure, and yet be unable to explain the mathematical concepts underlying the procedure or the vi-
ability of the computation. This distinction can be examined by identifying the components of mathematical
understanding referred to as conceptual understanding and procedural fluency. The typical mathematics
assessment consists of a large number of procedural problems, in large part because it is more challenging
to construct appropriate and meaningful problems that evaluate conceptual understanding. It is also more
time-consuming to grade conceptual questions. In this section we discuss conceptual understanding and
concept inventories, and then turn our attention to how the idea of a concept inventory can serve as a
springboard for creating summative assessments of student conceptual understanding.
AP.5.1. What is conceptual understanding?
In practice, the term “conceptual understanding” is often used to refer to skills involving the ability to ex-
plain why something happens mathematically, using logical reasoning as opposed to empirical evidence.
The definition of conceptual understanding and its relationship with other dimensions of mathematical
knowledge, particularly procedural fluency, has been debated and discussed in the mathematical sciences
community (see e.g., Baroody et al., 2007; Star, 2005). Certainly conceptual understanding and procedural
fluency as well as other mathematical skills are strongly interrelated, and there may be many specific skills or
types of knowledge that integrate both procedural fluency and conceptual understanding (see e.g., Hiebert
and Lefevre, 1986; NRC, 2001).
In fact, conceptual understanding has been identified as one of the critical components of learning math-
ematics (NCTM, 2000; NRC, 2001), and as such, mathematics educators, mathematicians, and curriculum
developers have made substantial efforts over the past two decades to develop high-quality assessments of
conceptual understanding. Different mathematical tasks engage students in different types of reasoning and
thereby result in different types of learning. For example, some tasks engage students in memorizing or
practicing specific procedures, while other tasks engage students in complex reasoning and sense making.
The latter can be assessed via concept inventories.
AP.5.2. What are concept inventories?
As conceptual understanding has become more commonly recognized as a critical component of mathe-
matical understanding, mathematics educators have increasingly developed concept inventories as a way to
assess students’ conceptual understanding in different content domains. Concept inventories are examina-
tions that test basic but fundamental concepts within a particular subject. They do not test computational
skills, nor do they aim to address everything that a student might learn in a course, but rather they aim to
test concepts that are necessary (but not necessarily sufficient) for mastery of course material. Traditionally,
concept inventories were developed for research purposes, in order to compare the gains in conceptual
understanding in a particular content domain that occur under different teaching strategies. For exam-
ple, an instructor might administer an exam at the beginning and at the end of the semester, and compare
the effect of different teaching strategies on conceptual understanding.
While concept inventories were originally developed with research purposes in mind, this type of assess-
ment can also be used by classroom teachers and by departments to assess the extent to which students have
gained conceptual understanding in a particular course. Concept inventories seek to identify which con-
cepts students understand and which misconceptions are a barrier for student understanding. However, not
all concept inventories have been validated, and just as with other standardized exams, departments should
be cautious about making high-stakes decisions without first rigorously testing the validity and reliability of
these instruments with their student population (Bagley, 2016; Gleason et al., 2015).
e rst well-studied concept inventory was the Force Concept Inventory (FCI) in physics (Hake, 1998;
Halloun, Hake, Mosca, and Hestenes, 1995; Hestenes, Wells, and Swackhamer, 1992). Other concept in-
ventories exist in physics (e.g., Halloun and Hestenes, 1985a; Halloun and Hestenes, 1985b), chemistry
(e.g., Mulford and Robinson, 2002), and biology (e.g., Klymkowsky, Underwood, and Garvin-Doxas, 2010;
Smith, Wood, and Knight, 2008). In mathematics the Calculus Concept Inventory (CCI) (Epstein, 2013) is
the rst concept inventory widely tested on a large student population. Other concept inventories exist for
precalculus (Carlson, Oehrtman, and Engelke, 2010), elementary algebra (Wladis et al., 2017a, Wladis et al.,
2017b), dierential equations (Hall, Keene, and Fortune, 2016) and group theory (Melhuish, 2015).
Here are two examples of assessment items that measure conceptual understanding.
Example 1: Figure 2 shows a problem from the Precalculus Concept Assessment (PCA).
Assume that water is poured into a spherical bottle at a constant rate. Which of the following graphs
best represents the height of water in the bottle as a function of the amount of water in the bottle?
Figure 2. From the Precalculus Concept Assessment (Carlson et al., 2010).
This question tests the extent to which students can correctly employ covariational reasoning—that is,
the extent to which students can explain how one variable will change as the result of a change in another
variable. This type of reasoning is central to algebra and calculus, but research suggests that students struggle
with this concept (Carlson, Jacobs, Coe, Larsen, and Hsu, 2002). While instructors who teach college-level
mathematics may not use the term covariational reasoning to describe this type of student reasoning, they
will almost certainly be familiar with many of the misconceptions that students employ when solving such
a problem.
This type of concept inventory item provides instructors a tool to determine which students can employ
covariational reasoning and to identify potential misconceptions for those students who struggle with this
type of reasoning. For example, both choices a) and d) are typically chosen by students who fail to recognize
that the rate at which the height changes in relation to the volume varies over time. Another reason students
often choose option a) is that they perceive that the graph’s upward shape indicates the height of the wa-
ter increases more and more. While these students recognize the volume increases as height increases, they
are unable to distinguish between an increasing graph and an increasing slope on that graph. Both types of
students may need additional tasks that force them to confront these two different features of a graph and to
make sense of the different information the features convey.
Students who chose option b) often reason that this graph represents the shape of the left side of the bottle
because they do not understand that the graph should depict the rate at which the height increases as the
volume increases. ese students perceive the graph as a picture of the static object modeled. Students who
chose option e) oen perceive that the water rises slowly at rst, then more quickly, and then slowly again.
It may be that these students reach this conclusion because they confuse the rate at which the height of the
water increases with the width of the bottle at a given volume. ese two types of students would benet
from tasks that challenge them to explore and generalize how two co-varying quantities change over time
and to relate these patterns to graphs that depict the co-varying relationship. For example, students could
be presented with a task involving various composite objects such as a cylinder, a downward pointing cone,
or an upward pointing cone, and asked to calculate the height at various specic volumes. Furthermore, the
students could compare how the height changes as the volume increases, then construct a corresponding
graph of volume versus height for each shape. An interactive applet or a computer algebra system may be the
most ecient way to do this. Students could then be asked to compare the rate of change of the height with
respect to the volume as depicted in the various graphs.
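As an illustration of the kind of computation such a task involves, the short Python sketch below tabulates the water height at several volumes for a cylinder and for a cone with its apex at the bottom; the radius R and height H are arbitrary illustrative dimensions, not values taken from the task above.

    import math

    R, H = 1.0, 2.0  # illustrative container dimensions

    def cylinder_height(V):
        # V = pi * R^2 * h, so the height grows linearly in the volume.
        return V / (math.pi * R**2)

    def cone_height(V):
        # Apex-down cone: r(h) = (R/H) * h, so V = (pi/3) * (R/H)^2 * h^3
        # and the height grows like the cube root of the volume,
        # quickly at first and then more slowly.
        return (3 * V * H**2 / (math.pi * R**2)) ** (1 / 3)

    for V in [0.5, 1.0, 1.5, 2.0]:
        print(f"V = {V:.1f}: cylinder h = {cylinder_height(V):.3f}, "
              f"cone h = {cone_height(V):.3f}")

Plotting the resulting (volume, height) pairs for each shape makes the contrast between constant and varying rates of change visible.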
Example 2: Consider this question that is similar in format to those on the Calculus Concept Inventory
(CCI) (Epstein, 2013):
The derivative of a function is negative everywhere on the interval x = 2 to x = 3. Where on this
interval does the function have its maximum value?
a) At x = 2.
b) We cannot tell if it has a maximum because we do not know where the second derivative is
negative.
c) Somewhere between x = 2 and x = 3.
d) At x = 3.
e) It does not have a maximum because the derivative is never zero.
This question assesses the extent to which students can connect characteristics of the derivative to the
actual behavior of the underlying function.
Students who choose options b) and e) are likely relying on procedural rules rather than reasoning about
characteristics of the derivative and the shape of the underlying function. These students would benefit from
classroom tasks that require them to connect features of a function to its derivative, including working back-
wards from information about a derivative to determine features of the underlying function. For example,
students could be asked to draw the derivative of a function based on the function’s graph, or they could be
asked to draw a few possible graphs of an underlying function based on a graph of its derivative.
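For instance, to see why option a) is correct and option e) is merely a distractor, consider f(x) = 10 - x on the interval from x = 2 to x = 3: the derivative f′(x) = -1 is negative everywhere there, so f is strictly decreasing and attains its maximum value f(2) = 8 at the left endpoint x = 2, even though f′ is never zero on the interval.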
AP.5.3. Using items from concept inventories
One way to use concept inventories is to select a few tasks to use during an in-class activity. As the students
work collaboratively on the tasks, the instructor might walk around the room and ask students to explain
their reasoning. e instructor might also facilitate a class discussion during which students explain their
reasoning. Such practices could help identify and correct student misconceptions that might impact suc-
cessful course completion, success in subsequent mathematics courses, and students’ ability to apply the
concepts to “real life” situations.
The following vignette exemplifies how this process might be used in an elementary algebra class.
Vignette
Professor Jackson notices that students make certain common errors on procedural problems. For example,
when students are asked to substitute -3 in for x in the expression 2 - x², many students write 2 - 3² instead
of 2 - (-3)². In order to better understand why students do this, Professor Jackson gives students the fol-
lowing problem:
Consider the expression 1/a² - a.
Perform the following substitutions (no need to simplify afterwards!):
a) Substitute -2 for a.
b) Substitute x² for a.
c) Substitute 1/y for a.
Professor Jackson then engages students in both small group and whole class discussions about their
strategies for solving the problem.
Discussion: Students frequently make the following mistakes on the three substitutions in this problem:
a) 1/2² - 2
b) 1/x² - x²
c) 1/y² - 1/y
Students make the first mistake above because they forget to write the negative sign in front of the 2 or they
mistake the subtraction sign for the negative sign and think they do not need to include the negative sign
for a. This is an opportunity for the instructor to articulate an important algebraic concept related to substi-
tution: substitution is the process of replacing a variable with a numerical value or an expression, and all the
algebraic structure outside the variable remains unchanged by the substitution.
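For instructors who want to demonstrate this concept concretely, a computer algebra system can show that substitution replaces the variable wholesale while leaving the surrounding structure intact. Below is a minimal Python sketch using the SymPy library; note that SymPy simplifies as it substitutes, whereas the task above deliberately asks students not to simplify.

    from sympy import symbols

    a, x, y = symbols('a x y')
    expr = 1/a**2 - a  # the expression 1/a² - a from the vignette

    # Substitution replaces the variable wholesale; the reciprocal, the
    # square, and the subtraction sign are all untouched.
    print(expr.subs(a, -2))    # 9/4, i.e., 1/(-2)² - (-2)
    print(expr.subs(a, x**2))  # 1/(x²)² - x², printed as -x**2 + x**(-4)
    print(expr.subs(a, 1/y))   # 1/(1/y)² - 1/y, printed as y**2 - 1/y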
The following items could be used as follow-up assessments to determine whether or not a student under-
stands the concept of substitution.
Item 1: Which of the following expressions is the result of substituting -2 for a in the expression 1/a² - a?
a) 1/2² - 2
b) 1/(-2)² - (-2)
c) 1/(-2)² - 2
d) 1/2² + 2
Item 2: What happens to the negative sign in the expression 1/a² - a if we substitute -2 for a?
a) Nothing—when we substitute something for a in this expression, the rest of the expression remains
the same.
b) It is replaced by the negative sign that belongs to the -2.
c) It is canceled out by the negative sign that belongs to the -2.
d) Nothing—but we drop the negative sign from the -2 because there is already a negative sign in the
expression.
These types of questions can be used as both formative and summative assessments. As with any classroom
assessment, instructors can create their own concept inventory questions tailored to their own students’ mis-
conceptions. The fundamental procedure followed by researchers who have developed concept inventories
mimics what instructors often do as a typical part of classroom practice. They create an open-ended ques-
tion they believe pinpoints important underlying concepts, and they use it for class discussion or as a short
written assignment to elicit common student answers. Based on these results a multiple-choice question
could be developed and used in a subsequent semester or on a subsequent assignment in the same semester.
For example, an instructor could present one of the two items above as an open-ended question during
class, solicit student answers and write them on the board, and ask students to vote on which answer they
think is correct before proceeding with a discussion about the relevant substitution concept. The most com-
mon incorrect student responses could be used as a basis for the multiple-choice question on a follow-up
exam.
This example demonstrates how conceptual understanding might be assessed both formally via exam
questions as well as informally via in-class activities, illustrates how formal and informal activities inform
one another, and highlights the relationship between conceptual understanding and procedural fluency.
AP.6. Assessment in large-enrollment classes
Many higher education institutions have large-enrollment classes. Institutions with graduate programs in
the mathematical sciences often connect small recitation sections led by graduate teaching assistants to such
classes. Two assessment techniques that are particularly helpful in large-enrollment classes are online home-
work systems and classroom polling techniques.
AP.6.1. Online homework systems
In the spring of 2009, the American Mathematical Society (AMS) surveyed 1,230 U.S. mathematics and
statistics departments about their experiences with online homework systems (AMS, 2009; Kehoe, 2010;
Lewis and Tucker, 2009). The survey respondents indicated the most important benefits of online homework
systems were (a) the immediate feedback provided to students, (b) the opportunity for students to attempt
an exercise multiple times, and (c) the reduction in grading duties. On the other hand, survey respondents
indicated disadvantages to online homework were (a) the inability for students to show their work, (b) the
limited types of questions that can be effectively evaluated online, and (c) students’ frustrations with the
systems. Reducing the amount of time an instructor spends grading homework allows the instructor to in-
crease the time spent on more meaningful instructional activities and assessment that can better illuminate
conceptual understanding and misconceptions. Students’ frustrations might be alleviated by engaging them
in frequent conversations about the value of online homework and helping them discover support mecha-
nisms such as links to various online tutorials that provide further explanations.
Various researchers (Bonham et al., 2001; Doorn et al., 2010; Hauk and Segalla, 2005; Malevich, 2011)
synthesize the pros and cons of online homework systems. Additional advantages include the randomiza-
tion of variables and parameters that can mitigate cheating and save departments money on hiring graders.
Other disadvantages include (a) the inability to provide students with reasons why responses are incorrect,
(b) the inability to prevent cheating, and (c) the additional costs incurred by students if they must purchase
access to these systems.
As of 2017, the most commonly used online homework systems, in alphabetical order, are ALEKS, MAA
WeBWorK, MyMathLab/MyStatLab, MyOpenMath, and WebAssign. These systems are typically showcased
at the national conferences of the MAA, AMATYC, and AMS, which are good places to explore each system
and compare their features. Below is a brief description of each of these online homework systems.
ALEKS (McGraw-Hill Education): ALEKS (Assessment and Learning in Knowledge Spaces) is based
on knowledge space theory (Doignon and Falmagne, 1999, 2011; Falmagne et al., 1990), a field fueled
by funding from the National Science Foundation in the 1990s. It uses artificial intelligence to assess
a student’s knowledge of concepts and procedures. When a student takes an ALEKS assessment, they
typically complete 20 to 30 problems to determine their current knowledge of course content. Each
question ALEKS chooses is based on the student’s answers to previous questions. The system displays
a pie chart illustrating which topics the student has mastered and which topics they are ready to learn.
The student chooses from a list of topics they have not yet mastered but are ready to learn. ALEKS
provides practice problems on that topic and the student may request a detailed solution to each prob-
lem. Once the student answers a sufficient number of practice problems correctly without access to the
solution, ALEKS determines the student has mastered that topic, and the student then chooses another
topic. ALEKS is available for use in a wide variety of K–12 and undergraduate mathematics courses.
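The underlying idea of knowledge space theory can be sketched in a few lines of Python. The toy version below is a deterministic simplification for illustration only, not ALEKS's actual (probabilistic and proprietary) algorithm: a knowledge structure is a family of feasible knowledge states, and each observed answer eliminates the candidate states inconsistent with it.

    # A tiny knowledge structure over three items; each state is a set of
    # items a student could feasibly have mastered.
    states = [set(), {"a"}, {"a", "b"}, {"a", "c"}, {"a", "b", "c"}]

    def update(candidates, item, correct):
        # Keep exactly the states that contain the item if the answer was
        # correct, or that lack it if the answer was incorrect.
        return [s for s in candidates if (item in s) == correct]

    candidates = update(states, "a", True)       # item a answered correctly
    candidates = update(candidates, "b", False)  # item b missed
    print(candidates)  # [{'a'}, {'a', 'c'}]: mastery of c is unresolved

In this sketch, a good next question would target item c, the item on which the remaining candidate states disagree.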
MAA WeBWorK (MAA): WeBWorK is an open-source online homework system developed via fund-
ing from the NSF and includes an Open Problem Library (OPL). Problems are available for a wide
variety of mathematics courses ranging from college algebra and calculus to differential equations,
linear algebra, and complex analysis. Instructors can select problems from the OPL or write new prob-
lems themselves. Like other online homework systems, WeBWorK provides students with immediate
feedback and can be linked to learning management systems.
MyMathLab/MyStatLab (Pearson): MyMathLab and its equivalent in statistics, MyStatLab, offer vid-
eos, quizzes, and homework assignments. MyMathLab can be linked to various learning management
systems such as Blackboard, Canvas, and Moodle, and assignments can be made from the selected
course textbook or chosen from other MyMathLab courses. MyMathLab offers adaptive learning skill
building exercises in select courses. Each question in a skill building exercise is based on the stu-
dent’s previous answers. MyMathLab also offers “workspace assignments” that allow a student to work
through an exercise step-by-step and receive immediate feedback at each step. Students physically write
out their answers and the system uses handwriting recognition software to evaluate them. MyMathLab
and MyStatLab are available for use in a wide variety of K–12 and undergraduate mathematics courses.
MyOpenMath: MyOpenMath (www.myopenmath.com/) is designed for self-study and online courses in
developmental mathematics. It takes advantage of open source materials to provide access to students
who cannot afford traditional texts or software licenses.
WebAssign (Cengage): WebAssign was developed in 1997 at North Carolina State University and
purchased by Cengage in 2016. However, the textbook titles that utilize WebAssign come
from a wide range of publishers. It offers videos, interactive content, and tutorials to aid students on
assignments. Instructors can build assignments by selecting exercises from the textbook or by creating
their own questions. An analytics tool allows instructors to identify problems and topics with which
students had the most difficulty. Instructors can collaborate with colleagues by sharing homework
questions and assignments. As with MyMathLab, WebAssign can be linked to various learning man-
agement systems like Blackboard, Canvas, and Moodle. The types of mathematics courses for which
WebAssign can be used range from basic algebra to calculus, discrete mathematics, and ordinary dif-
ferential equations.
The most recent version of the Guidelines for Assessment and Instruction in Statistics Education (GAISE,
2016) from the American Statistical Association notes the increasing trend of gaming as an entertainment
source for college students. Gamification in educational settings is the use of game design elements in non-
game situations with the goal of increasing student engagement (Attali and Arieli-Attali, 2015; Sandusky,
2015). An exciting new enhancement to WeBWorK for use in calculus courses is a gaming mechanism
developed by Goehle (2013) that utilizes the common video game system feature of “levels.” In Goehle’s
system students earn “experience points” for every correct homework problem, and they are able to move
up to the next level when they have accumulated a sufficient number of points. The lowest level is “Calculus
initiate” and the highest is “Calculus professor.” As the levels increase, so do the number of points required
to advance to the next level.
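The mechanics of such a level system are simple to implement. The Python sketch below captures the idea; the point thresholds and the intermediate titles are hypothetical placeholders, since only the lowest and highest titles are specified above.

    # Hypothetical thresholds and intermediate titles; only "Calculus
    # initiate" and "Calculus professor" come from Goehle's system as
    # described above.
    LEVELS = [
        (0, "Calculus initiate"),
        (100, "Calculus apprentice"),
        (250, "Calculus adept"),
        (450, "Calculus expert"),
        (700, "Calculus professor"),
    ]

    def level_for(points):
        # Return the highest level whose threshold has been reached; the
        # gaps between thresholds widen as the levels increase.
        current = LEVELS[0][1]
        for threshold, name in LEVELS:
            if points >= threshold:
                current = name
        return current

    print(level_for(120))  # Calculus apprentice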
The following vignette illustrates the benefits of online homework systems.
Vignette
Professor Phillips teaches at a private, comprehensive university. Admission is competitive, and the school is
well regarded, but the endowment is not large and most revenues come from undergraduate tuition dollars.
Competition for good students requires the university to offer more and more merit-based scholarships, so
the budget is tight and the administration is looking for economies of scale in all aspects of the operation.
For the Department of Mathematics and Statistics, this means their service courses such as business calculus
and introductory statistics, traditionally taught in classes of about 25 students, will now be taught in classes
of 75–90. There are no graduate students in quantitative fields to serve as teaching assistants, and there is
limited support for undergraduate graders.
Professor Phillips is discouraged, as teaching small classes was what attracted him to the university in
the first place. The issue at hand is how to make the best of the move to larger class sizes. Professor Phillips
has traditionally collected homework in hard copy, graded it carefully, and quickly returned it to students in
order to provide them timely, detailed feedback for improvement. Recognizing this will no longer be possi-
ble, Professor Phillips decides to investigate how to take advantage of online homework systems so he can
continue to best serve his students.
The professor is pleasantly surprised to find the developers of these systems seem to have thought seri-
ously about using evidence-based practices in assessment. Because the reduction of direct connection from
instructor to student is a clear cost of the change to online systems, Professor Phillips is eager to find a sys-
tem that makes it possible for him to carefully monitor student progress and intervene as needed with the
whole class as well as with individuals.
Aer signicant discussions with colleagues in the department, they agree that one instructor will hold
oce hours devoted to technical issues with the chosen system. ey also agree that a student who scores
at least 85% on an online assessment will be credited with 100% in order to proactively address potential
student complaints about round-o errors and technical issues. During conversations with colleagues from
other institutions, Phillips discovers that gaming aspects available with some online systems also help to
mitigate student frustrations. He decides to integrate online games into an introductory statistics course
where students collect and analyze data from virtual reality environments to determine factors that impact
winning in a game. Incorporating the game promotes a higher level of student engagement and results in
students mastering concepts outside of class. In-class discussions then serve as formative assessments he
uses to tailor subsequent classes to enhance student learning.
AP.6.2. Classroom polling systems
As discussed in section CP.1.9 of the Classroom Practices chapter, classroom polling systems can be incor-
porated into mathematics courses in a variety of ways, ranging from using electronic “clicker” systems to
voting by show of hands. Here is one example of how this might be used, and Cline and Zullo (2011) offer
other suggestions.
Classroom vignette: Introducing a classroom polling system
Professor Ordinal teaches at a public state university that draws most of its students from the nearby urban
area but has a growing number of students from other parts of the U.S. and other countries. The school has
a college of engineering, and Professor Ordinal teaches one large section of an engineering mathematics
course each semester. This is essentially a fourth semester course for STEM students, following a three se-
mester calculus sequence, and it is meant to introduce key concepts in linear algebra and differential equa-
tions needed in upper-level coursework. Each semester about 150–200 students enroll in Professor Ordinal’s
section, in which instruction consists of large lecture meetings and weekly recitation sections led by teaching
assistants.
Students’ perceptions of the course are strongly negative, and they consistently give poor evaluation rat-
ings. They have a sense that the course is designed to serve as a last effort to weed out students who are per-
ceived to not be strong enough to major in STEM fields. It has reached a point where the engineering college
is threatening to develop their own alternative course and no longer require the mathematics department’s
course. Professor Ordinal is aware of this perception and would like to increase student engagement while
maintaining high standards. A full flipped classroom approach seems impractical given the enrollment.
Professor Ordinal decides to utilize a classroom polling system. After some initial trial and error, Ordinal
develops some carefully crafted questions tied to the key learning goals and intended to assess mastery of
concepts and encourage class discussion. Professor Ordinal ensures students have sufficient time to formu-
late answers but not enough time to become distracted.
Discussion: While a primary purpose of such systems may be to increase student engagement, the system
can be an effective tool for formative assessment in real time and can play a role in summative assessment as
well. This is particularly true in large sections like Professor Ordinal’s.
Classroom polling is hardly novel and was used well before technology made compiling responses rel-
atively quick and easy. “How many think the conjecture is true? If so, raise your hand!” This is the sort
of throw-away line that instructors have inserted into lectures since the beginning of time. Experienced
instructors know that this old-school approach, while perhaps better than nothing for encouraging engage-
ment, makes it difficult to record student responses in a timely manner, can result in high non-participation
rates, and amplifies the peer-pressure factor that may cause students to respond in the same way as their
peers. Perhaps most important, this technique encourages rapid responses rather than thoughtful reactions.
Practical Tips
• Consult your campus academic technology group before choosing a polling system. They typically
have the expertise to answer specific questions like whether polling participation can be incorporated
into a classroom management system. Many institutions have an expert who will ensure you do not
reinvent the wheel and are able to successfully launch your system.
• Consider online options such as Poll Everywhere, if the student demographic leads to a reasonable as-
sumption that most or all students in the class have a smartphone.
• Determine what other systems might already be in place at your institution. There is also considerable
advantage in using systems with which students are already familiar.
• If polling is to be incorporated into the course grade, learning goals for the course must include partic-
ipation and the rubric for grading must be clear.
AP.7. Assessment in non-traditional classrooms
AP.7.1. Assessment in online courses
Given that the higher education enterprise must respond to the demand for online learning, instructors
find themselves teaching online courses and rethinking assessment of both teaching and learning (Hewson,
2012). In order to ensure online students achieve at a level comparable to students in face-to-face settings,
it is vital for educators to reexamine basic notions of both formative and summative assessment in online
courses (Stewart, Waight, Norwood, and Ezell, 2004).
Online learning is a space where the principles of constructivist, learner-centered, and authenticity-based
education can be realized (Lesnick, Cesaitis, Jagtiani, and Miller, 2004). The assessments that worked per-
fectly in a face-to-face setting may need to be re-conceptualized, tweaked, or even replaced in an online en-
vironment. The issues of validity and dishonesty related to assessment in online courses should be examined
carefully in the design of courses. Online education technology allows a number of assessment tools, such
as discussion boards, surveys, and online discussion groups, all of which can be modified into formative or
summative assessments to document student learning based on the course objectives. Creation of authentic
and effective assessment, both formative and summative, is possible with the use of online education tools.
In the online environment, the lack of physical contact between instructors and students leads to differ-
ent assessment techniques. In the online setting, instructors cannot tell whether a student is in attendance
unless he or she is actively contributing something to the class. It is for this very reason that in typical on-
line classes, 10–25% of the course grade is for discussion participation (Anderson and Elloumi, 2004). To
prevent cheating and to create a learner-centered environment, assessment of students is typically based on
a variety of assignments, quizzes, papers, tests, group projects, and discussions (Jarmon, 1999, pp. 55–63).
Students are kept abreast of their grades throughout the class, rather than at some specific junctures during
a term. This increased emphasis on continual and alternative assessment methods has great potential to
increase the transparency of the learning process and improve learning.
Arend (2006) conducted a study on how course assessment practices relate to learning strategies for
students taking online courses in two-year colleges. Learning strategies are defined as specific techniques
students use when studying for a class. In this study, Arend finds that most formative assessment variables
and general summative assessment variables do not show significant relationships to the learning strategy.
Instead, assessment methods such as discussions or papers are significantly related to learning strategies. It
is evident that the more a course uses discussions, writing assignments, and papers, the more students use
critical reasoning strategies. Conversely, the more a course relies on finals and midterms for assessment, the
less time students spend on critical reasoning. The courses in the study used online methods of exams, dis-
cussions, written assignments, problem assignments, and experiential activities. Additionally, many larger
assignments were broken down into smaller pieces focusing on different aspects and were graded over time.
The study concluded that assessment techniques that evaluate multiple dimensions of learning are most
effective since they provide opportunities for students to demonstrate and extend their learning. Such con-
stant assessment techniques allow educators to provide feedback and provide students with an opportunity
to learn from their mistakes (Kerka and Wonacott, 2000).
Arend (2006) notes that summative assessment and the number of assignments used in a course can be
one area of concern. Although multiple, shorter assignments are deemed better than using only a small
number of high-stakes assignments, there is some indication that the total number of assignments can be
too high. Some courses in the study used between 50 and 90 assignments in a 15-week period. In a course
with too many short assignments, students are at risk of focusing their attention on completion of the
assignments, rather than understanding of course material. The literature in general supports the use of
exams to document performance of students in an online environment (Hewson, 2012).
AP.7.2. Assessing via technology
Innovative initiatives are beginning to demonstrate the potential of technology-enhanced assessment for
integrating formative and summative assessments. The online assessment tools developed to assess students’
achievement and progress allow educators to set flexible tests at the required level as well as to measure and
record student progress over time. Such systems provide rich feedback for instructors on specific aspects
of student performance and support both assessment for learning and assessment of learning. Technology
also supports the use of summative assessments for formative purposes, enabling traditional testing methods
to be used in more meaningful ways. The use of multiple-choice questions, for example, is most commonly
associated with testing the recall of facts, with no associated elements of useful feedback or learning
interaction. When combined with digital communication tools, however, multiple-choice questioning can
prompt new ways of activating assessment for learning. Carefully chosen questions answered by learners via
mobile devices or electronic systems can be used by educators to identify alternative or comparative
understandings and to provide students with real-time feedback.
For eective learning, students need to be actively involved in feedback processes rather than simply serv-
ing as passive receivers of information about their progress. Self-assessment has shown to improve learning
outcomes through students’ reflection on and revision of their own work. Technology-enhanced assessments
can support students’ active participation in integrated systems of formative and summative assessments.
Technology alone cannot transform assessment practices, and the role of the instructor remains of vital
importance in all educational fields. This is particularly important in connecting technology to make assess-
ment more relevant and related to learners’ achievements and progress. Digital tools used in online courses
that support integrated assessment practices relevant and appropriate to the context and to the learners
open up new possibilities for more personalized, immediate, and engaging assessment experiences.
The use of digital technologies for assessment must support improved assessment practices and preferred
educational outcomes. We must acknowledge the complexity of the task and the significant ethical questions
raised by the use of digital technologies in assessment.
The online environment offers some unique challenges for assessment, but it also offers opportunities
for positive ongoing assessment. For example, one could ask what online techniques can be used to make
student reasoning visible. Or, how can rubrics be used effectively to inform the evaluation process in an on-
line class? Are there tools that help create and execute assessment instruments? And, how could academic
honesty and ethics be promoted when assessments are taken online?
Self-assessment, goal setting, and motivation: The most effective students generally set personal learning
goals, use proven learning strategies, and self-assess their work. Teachers can help cultivate such habits by
teaching students to self-assess and to set goals, and by expecting students to apply these habits regularly.
Teachers who provide regular opportunities for students to self-assess and set goals often report a change in
the class culture from students asking, “What did I get?” or “What are you going to give me?” to becoming
capable of knowing how they are doing and what they need to do to improve. This is particularly important
in online courses, due to inherent challenges to instructors in providing direct feedback on student practices.
Naturally coupled with this challenge is that summative assessment strategies must influence students to
become motivated to learn. Students are more likely to put forth the required effort when they clearly
understand the learning goals and standards, know how teachers will evaluate their learning, think the
learning goals and assessments are meaningful and worthwhile, and believe they can successfully learn and
meet the evaluative expectations.
Quizzes and exams: Quizzes and exams are generally used to measure academic achievement of students.
Such tools can consist of multiple choice, matching, and free-response items. Different learning manage-
ment systems (LMS) make it possible for instructors to adapt the design and deployment of the assessment.
Mainstream LMS coupled with a product like Respondus enable administration of quizzes 24/7 with safe-
guards against student collaboration. Also, many LMS support random question assignment, timed tests,
password protection, and adaptive release of quizzes.
For non-proctored assessments certain security options can be deployed. The instructor can release the
test or quiz one question at a time and not allow students to go back. To reduce the possibility of students
getting answers from one another, the instructor can create a pool of questions and make different versions
of the same test/quiz. Most LMSs allow instructors to make parallel forms of a test. In addition, it is possible
to randomize the presentation of questions on a test as well as the answer options. Using randomization,
parallel versions of tests, and synchronous test periods makes it difficult for students to consult with one
another on test items.
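The randomization idea itself is straightforward; learning management systems implement it internally, but a minimal Python sketch of drawing a parallel form from a question pool and shuffling each question's answer options might look like the following (the pool contents are placeholders).

    import random

    def make_version(pool, n_questions, seed):
        # Draw a random subset of the pool and shuffle each question's
        # answer options to produce one parallel form of the test.
        rng = random.Random(seed)
        version = []
        for stem, options in rng.sample(pool, n_questions):
            shuffled = options[:]
            rng.shuffle(shuffled)
            version.append((stem, shuffled))
        return version

    pool = [
        ("Stem 1", ["A", "B", "C", "D"]),
        ("Stem 2", ["A", "B", "C", "D"]),
        ("Stem 3", ["A", "B", "C", "D"]),
    ]
    # Different seeds yield different parallel forms of the same test.
    print(make_version(pool, 2, seed=1))
    print(make_version(pool, 2, seed=2))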
However, with the presence of computer-assisted systems like Wolfram Alpha, Symbolab, Cymath, etc.,
it is very easy for students to obtain answers to procedural questions quickly, creating potential academic
integrity issues. As another option to promote academic integrity with non-proctored quizzes or tests, in-
structors can use items that require higher-order reasoning, where the answers to the questions go beyond
computations that a computer program can do. Overall, it is best to assign a low percentage of the course
grade to these non-proctored quizzes or tests. In addition, inclusion of an academic honesty statement at
the beginning of tests or quizzes will serve to remind students of the penalty for dishonesty. For example, a
statement like the following keeps expectations clear: “I completed this quiz myself, worked independently,
and did not consult anyone except the instructor. I have neither given nor received help on this quiz. I un-
derstand that academic dishonesty results in consequences as described in the university catalog.”
When midterm or final exams account for a significant proportion of a course grade and are completed
in the privacy of a student’s home, ethical questions about the validity of the results and the legitimacy of the
educational program naturally arise. In fact, a growing number of higher education institutions only transfer
credit for an online course if a high percentage of the exams are proctored. Awarding meaningful grades
requires that instructors take steps to ensure what is known as “academic authenticity,” meaning simply that
students do not cheat. Authentication of identity is a common concern, and various methods have been
developed in response, including palm vein recognition, keystroke recognition, and questions about public
record information that only the authentic individual might know in detail (Sandeen, 2013). Because of cost,
human proctoring is currently the most popular approach. This is done in various ways. One method is for
the learner to be supervised by a proctor such as an instructor or administrator selected by the learner and
approved by the home institution. An alternative is for the student to travel to a regional site sponsored by
the home institution. A third option is to use companies that specialize in test delivery. These companies
provide services such as checking multiple forms of identification, photographing candidates, and videotap-
ing test sessions. Therefore, to give credibility to courses taught at a distance, to facilitate transferability, and
to satisfy accrediting agencies on the soundness of programs, real effort must be placed in implementing
assessments that preserve the integrity of courses taught online.
In an online class it is easy to add multiple-choice tests to one’s learning management system, but assess-
ment is much more than that. Assessment involves identifying clear, valid, and appropriate student learning
outcomes, collecting evidence that those outcomes are addressed, setting the stage for a dialogue to attain
a collective interpretation of the data, and using data to improve both teaching and learning. Assessment
can certainly be a tool for accountability, but it can also be an ongoing process for learning. The concept of
student-centered teaching involves effective use of both formative and summative assessment regardless of
the mechanism for delivering course content.
AP.7.3. Assessment in non-traditional course settings
While assessment is typically discussed in the context of classroom settings, it also plays an important role
in non-traditional course settings. Courses focused on service learning, independent study, undergraduate
research, industrial projects, and internships all benefit from the thoughtful incorporation of assessment.
Because these types of courses vary dramatically from one institution to another, we will focus our discus-
sion on three key themes that should be considered by instructors teaching such courses.
Create student learning outcomes: Even in less formal course situations, students and instructors benefit
when there is a clear vision for the purpose and scope of the course. For example, if a student participates
in an independent study it is best to establish a target set of topics to be covered with a tentative reading
schedule. Similarly, if one of the goals for the independent study is that the students will increase their abil-
ity to independently read mathematical texts, this must be clearly communicated to the students so they
are attentive to their work in this regard. By creating written student learning outcomes and sharing these
with the students, instructors can establish clear goals for students and ensure that students are aware of the
desired outcomes for the course.
Formative assessment through written reflections: Non-traditional courses often do not incorporate tra-
ditional homework and exam structures. For example, in a service learning course, the content of the course
is oen engagement in the mathematical community (through tutoring, outreach, etc.) rather than specic
topics of mathematics. Similarly, during an internship for which students receive credit, the learning out-
comes might focus on experiencing the ways in which mathematics is used in a specic business, industry,
or government setting. In these situations, one eective way to conduct formative assessment is through
the use of written reective essays, as discussed previously in this chapter. By providing students with spe-
cic prompts, or even by directly requesting that students write about their progress toward meeting the
student learning outcomes, instructors can support deep reection and meta-cognitive growth in students
throughout their course experience. In courses that have a larger mathematical content component, such as
independent study or undergraduate research courses, it is natural to have students include written updates
regarding their mathematical investigations in addition to any verbal updates provided during meetings.
Summative assessment through written portfolios: A natural way to build on the use of written reflections
as formative assessment is to have students compile these into an end-of-course portfolio serving as the
summative assessment instrument. Instructors might require that students write a 3–5 page essay describing
their growth and development in the course and their self-assessment with regard to the student learning
outcomes for the course. In independent study and undergraduate research courses, it is appropriate to also
include a final written report on the results of the mathematical investigations completed during the course.
AP. References
Airasian, P. W. and Miranda, H. (2002) The role of assessment in the revised taxonomy. Theory Into Practice, 41(4),
249–254.
American Mathematical Society. (2009) AMS Homework Software Survey. Retrieved from www.ams.org/profession/
leaders/webassess.
Anderson, L.W. (2002) This Issue. Theory Into Practice. 41(4), 210–211.
Anderson, L.W., Krathwohl, D.R., Airasian, P.W., Cruikshank, K.A., Mayer, R.E., Pintrich, P.R., Raths, J., and Wittrock,
M.C. (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational
Objectives, Complete Edition. New York: Pearson.
Anderson, T. and Elloumi, F. (2004). Theory and Practice of Online Learning. Athabasca, Canada: Athabasca University.
Arend, B. (2007) Course Assessment Practices and Student Learning Strategies in Online Courses. Journal of Asynchro-
nous Learning Networks, 11(4), 3–13.
Asiala, M., Brown, A., DeVries, D., Dubinsky, E., Mathews, D., and Thomas, K. (1996). A framework for research and
curriculum development in undergraduate mathematics education. Research in Collegiate Mathematics Educa-
tion II. In Kaput, J., Schoenfeld, A. H., and Dubinsky, E. (eds). CBMS Issues in Mathematics Education, 6, 1–32.
Attali, Y. and Arieli-Attali, M. (2015) Gamification in assessment: Do points affect test performance? Computers and
Education, 83, 57–63.
Babaali, P. and Gonzalez, L. (2015) A quantitative analysis of the relationship between an online homework system and
student achievement in precalculus. International Journal of Mathematical Education in Science and Technology,
46(5), 687–699.
Baker, B., Cooley, L., and Trigueros, M. (2000). The schema triad—a calculus example. Journal for Research in Mathematics
Education, 31(5), 557–578.
Bagley, S., Gleason, J., Rice, L., Thomas, M., and White, D. (2016). Does the Calculus Concept Inventory really measure
conceptual understanding of calculus? Retrieved from blogs.ams.org/matheducation/2016/07/25/does-the-calculus-
concept-inventory-really-measure-conceptual-understanding-of-calculus/.
Ball, D.L., Hill, H.C., and Bass, H. (2005). Knowing mathematics for teaching: Who knows mathematics well enough
to teach third grade, and how can we decide? American Educator, 29(1), pp. 14–17, 20–22, 43–46.
hdl.handle.net/2027.42/65072.
Baroody, A.J., Feil, Y., and Johnson, A.R. (2007). An alternative reconceptualization of procedural and conceptual
knowledge. Journal for Research in Mathematics Education, 38(2), 115–131.
Barr, M.L. (2014). Encouraging college student active engagement in learning: The influence of response methods.
Innovative Higher Education, 39(4), 307–319.
Bean, J. (2011) Engaging Ideas: The Professor’s Guide to Integrating Writing, Critical Thinking, and Active Learning in the
Classroom, Second Edition. San Francisco: Jossey-Bass Publishers.
Beilock, S.L. (2008). Math performance in stressful situations. Current Directions in Psychological Science, 17(5), 339–
343.
Black, P., Harrison, C., and Lee, C. (2003). Assessment for Learning: Putting It into Practice. McGraw-Hill Education
(UK).
Black, P. and Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy and
Practice,5(1), 7–74.
— (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability
(formerly: Journal of Personnel Evaluation in Education), 21(1), 5–31.
Blair, R. (2006). Beyond Crossroads: Implementing Mathematics Standards in the First Two Years of College. Memphis,
TN: American Mathematical Association of Two-Year Colleges.
Bloom, B. et al., (eds) (1956) Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I:
Cognitive Domain. New York: Longmans, Green.
Bonham, S., Beichner, R., and Deardorff, D. (2001). Online homework: Does it make a difference? The Physics Teacher, 39,
293–296.
Bransford, J.D., Brown, A.L., and Cocking, R.R. (eds). (1999). How people learn: Brain, mind, experience, and school.
Washington, DC: National Research Council.
Braun, B. (2014). Personal, expository, critical, and creative: Using writing in mathematics courses. PRIMUS (Problems,
Resources, and Issues in Undergraduate Mathematics Studies), 24(6), 447–464.
Brilleslyper, M., Ghrist, M., Holcomb, T., Schaubroeck, B., Warner, B., and Williams, S. (2011). What’s the point? Ben-
ets of grading without points. PRIMUS (Problems, Resources, and Issues in Undergraduate Mathematics Studies),
22(5), 411–427.
Bull, J. and McKenna, C. (2004). Blueprint for Computer-Assisted Assessment. London: Routledge Falmer.
Burch, K. and Kuo, Y. (2010). Traditional vs. online homework in college algebra. Mathematics and Computer Educa-
tion, 44(1), 53–63.
Burks, R. (2010). The student mathematics portfolio: Value added to student preparation? PRIMUS (Problems, Resourc-
es, and Issues in Undergraduate Mathematics Studies), 20(5), 453–472.
Caldwell, J.E. (2007). Clickers in the large classroom: Current research and best practice tips. CBE—Life Sciences Edu-
cation, 6(1), 9–20.
Callahan, J.T. (2016) Assessing online homework in first-semester calculus. PRIMUS: Problems, Resources, and Issues
in Mathematics Undergraduate Studies, 26(6), 545–556.
Carl Wieman Science Education Initiative, University of British Columbia www.cwsei.ubc.ca/resources/clickers.htm.
Carlson, M., Jacobs, S., Coe, E., Larsen, S., and Hsu, E. (2002). Applying covariational reasoning while modeling dy-
namic events: A framework and a study. Journal for Research in Mathematics Education, 33(5), 352–378.
Carlson, M., Oehrtman, M., and Engelke, N. (2010). The Precalculus Concept Assessment: A tool for assessing stu-
dents’ reasoning abilities and understandings. Cognition and Instruction, 28(2), 113–145.
Chow, A.F. (2015) Online homework impact in undergraduate mathematics and business statistics courses. Education-
al Studies, 41(3), 244–248.
Cline, K. and Zullo, H. (2011) Teaching Mathematics With Classroom Voting: With and Without Clickers. MAA Notes
79, Mathematical Association of America.
Crouch, C.H., Watkins, J., Fagen, A.P., and Mazur, E. (2007). Peer instruction: Engaging students one-on-one, all at
once. Research-Based Reform of University Physics, 1(1).
Chen, H. (2011). Practical Program Evaluation. New York: Sage Publications.
Common Core State Standards in Mathematics. (2013). Retrieved from www.corestandards.org/Math/.
Crannell, A., LaRose, G., and Ratliff, T. (2004). Writing Projects for Mathematics Courses: Crushed Clowns, Cars, and
Coffee to Go. Washington, DC: Mathematical Association of America.
Crowley, M. L. and Dunn, K. (1995) The mathematics portfolio. The American Mathematical Monthly, 102(1), 19–22.
DeVries, D. (2004) RUMEC APOS Glossary (Internal Working Group Notes).
Doignon, J.P. and Falmagne, J.C. (1999). Knowledge Spaces. Springer-Verlag.
— (2011). Learning Spaces. Springer-Verlag.
Doorn, D., Janssen, S., and O’Brien, M. (2010). Student attitudes and approaches to online homework. International
Journal for the Scholarship of Teaching and Learning, 4(1), Article 5. doi.org/10.20429/ijsotl.2010.040105.
Dweck, C. S. (2008). Mindset: The new psychology of success. Random House Digital, Inc.
Entwistle, N. (1996). Recent research on student learning. In J. Tait and P. Knight (Eds.), The Management of Indepen-
dent Learning, 97–112. London: Kogan Page.
Epstein, J. (2013). The Calculus Concept Inventory—Measurement of the effect of teaching methodology in mathemat-
ics. Notices of the American Mathematical Society, 60(8), 1018–1027.
Falmagne, J.-C., Koppen, M., Villano, M., Doignon, J.-P. and Johannesen, L. (1990). Introduction to knowledge spaces:
How to build, test, and search them. Psychological Review, 97, 201–224.
GAIMME, Guidelines for Assessment and Instruction in Mathematical Modeling Education, SIAM, 2016. www.siam.
org/reports/gaimme.php.
GAISE College Report ASA Revision Committee, “Guidelines for Assessment and Instruction in Statistics Education
College Report 2016,” www.amstat.org/education/gaise.
Gardner, H. (1991). The unschooled mind. New York: BasicBooks.
Gikandi, J. W., Morrow, D., and Davis, N. E. (2011). Online formative assessment in higher education: A review of the
literature. Computers and Education, 57(4), 2333–2351.
Glasersfeld, E. (1995). Radical Constructivism: A Way of Knowing and Learning. New York: Routledge-Falmer.
Gleason, J., omas, M., Bagley, S., Rice, L., White, D., and Clements, N. (2015). Analyzing the Calculus Concept
Inventory: Content Validity, Internal Structure Validity, and Reliability Analysis.Proceedings of the 37th Interna-
tional Conference of the North American Chapter of the Psychology of Mathematics Education, East Lansing, MI.
1291–1297.
Gleason, J., White, D., Thomas, M., Bagley, S., and Rice, L. (2015). The Calculus Concept Inventory: A psychometric
analysis and framework for a new instrument. Proceedings of the 18th Annual Conference on Research in Under-
graduate Mathematics Education, 135–149.
Goehle, G. (2013) Gamification and web-based homework. PRIMUS: Problems, Resources, and Issues in Mathematics
Undergraduate Studies, 23(3), 234–246.
Gold, B., Keith, D., and Marion, W. (eds). (1999). Assessment Practices in Undergraduate Mathematics. MAA Notes #49.
Washington, DC: Mathematical Association of America. Retrieved from www.maa.org/sites/default/files/pdf/ebooks/
pdf/NTE49.pdf.
Gredler, M. E. and Shields, C. C. (2008). Vygotsky’s Legacy: A Foundation for Research and Practice. Guilford Press.
Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics
test data for introductory physics courses. American Journal of Physics, 66(1), 64–74.
Hall, W., Keene, K., and Fortune, N. (2016). Measuring student conceptual understanding: The case of Euler’s meth-
od. Proceedings of the 19th Annual Conference on Research in Undergraduate Mathematics Education, Pittsburgh,
PA, 19(50).
Halloun, I., Hake, R., Mosca, E., and Hestenes, D. (1995). Force Concept Inventory (Revised, 1995); online (password
protected) at modeling.asu.edu/RandE/Research.html.
Halloun, I.A. and Hestenes, D. (1985a). Common sense concepts about motion. American Journal of Physics, 53(11),
1056–1065.
— (1985b). The initial knowledge state of college physics students. American Journal of Physics, 53(11), 1043–1055.
Hanson, J. M. and Mohn, L. (2011). Assessment trends: A ten-year perspective on the uses of a general education as-
sessment. Assessment Update: Progress, Trends, and Practices in Higher Education, 23(5), 1–15.
Hauk, S., Powers, R.A., and Segalla, A. (2015) A comparison of web-based and paper-and-pencil homework on student
performance in college algebra. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies,
25(1), 61–79.
Hauk, S. and Segalla, A. (2005). Students’ perceptions of the web-based homework program WeBWorK in moderate
enrollment college algebra classes. Journal of Computers in Mathematics and Science Teaching, 24(3), 229–253.
Heritage, M., Kim, J., Vendlinski, T., and Herman, J. (2009). From evidence to action: A seamless process in formative
assessment?Educational Measurement: Issues and Practice,28(3), 24–31.
Hestenes, D., Wells, M., and Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30(3), 141–158.
Hewson, C. (2012). Can online course-based assessment methods be fair and equitable? Relationships between stu-
dents’ preferences and performance within online and oine assessments. Journal of Computer Assisted Learn-
ing, 28(5), 488–498.
Hiebert, J. and Lefevre, P. (1986). In J. Hiebert (ed), Conceptual and Procedural Knowledge: The Case of Mathematics.
Hillsdale, NJ: Erlbaum.
Hinton, K. (2012). A practical guide to strategic planning in higher education. Society for College and University Plan-
ning.
Huba, M.E. and Freed, J. E. (1999). Learner-centered Assessment on College Campuses: Shifting the Focus from Teaching
to Learning. Needham Heights, MA: Allyn and Bacon.
Jarmon, C. (1999). Testing and assessment at a distance. In Boaz, M., Elliott, B., Forshee, D., Hardy, D., Jarmon, C. and
Olcott, D. (eds), Teaching at a Distance: A Handbook for Instructors. Laguna Hills, CA: League for Innovation in
the Community College and Archipelago Productions.
Kehoe, E. (2010). AMS homework software survey. Notices of the AMS, 57(6), 753–757.
Kerka, S. and Wonacott, M.E. (2000). Assessing learners online: Practitioner file. Washington, D.C.: Office of Educational Research and Improvement.
Klymkowsky, M.W., Underwood, S.M., and Garvin-Doxas, R.K. (2010). Biological Concepts Instrument (BCI): A di-
agnostic tool for revealing student thinking. arXiv Preprint arXiv:1012.4501.
Krathwohl, D.R. (2002) A revision of Bloom's taxonomy: An overview. Theory Into Practice, 41(4), 212–218.
Lesnick, A., Cesaitis, A., Jagtiani, U., and Miller, R. (2004). Curriculum design as re-writing: Online chat as a resource
for radicalizing the teaching of a canonical text. Curriculum and Teaching Dialogue, 6(1), 35–47.
Lewis, J. and Tucker, A. (2009). Report of the AMS first-year task force. Notices of the AMS, 56(6), 754–760.
Malevich, K. (2011) The Accuracy and Validity of Online Homework Systems, M.S. Thesis, Department of Mathematics
and Statistics, University of Minnesota Duluth, 29 pp.
Mathematical Association of America (2015). 2015 CUPM Guide to Majors in the Mathematical Sciences. Washington,
DC: Mathematical Association of America. Retrieved from www.maa.org/programs/faculty-and-departments/
curriculum-department-guidelines-recommendations/cupm/.
— (2007). College Algebra Guidelines from the MAA CUPM subcommittee, Curriculum Renewal Across the First Two
Years. Retrieved from www.maa.org/sites/default/files/pdf/CUPM/crafty/CRAFTY-Coll-Alg-Guidelines.pdf.
Meier, J. and Rishel, T. (1998) Writing in the Teaching and Learning of Mathematics. MAA Notes Number 48. Washing-
ton, DC: Mathematical Association of America.
Melhuish, K.M. (2015). The Design and Validation of a Group Theory Concept Inventory. Portland State University Dissertations and Theses. Retrieved from pdxscholar.library.pdx.edu/open_access_etds/2490.
Melhuish, K. M. and Fasteen, K. (2016). Results from the Group Concept Inventory: Exploring the role of binary operation in introductory group theory task performance. Proceedings of the 19th Annual Conference on Research in Undergraduate Mathematics Education, Pittsburgh, PA.
Millar, R. (2013). Improving science education: Why assessment matters. In Corrigan, D., Gunstone, R., Jones, A. (eds),
Valuing Assessment in Science Education: Pedagogy, Curriculum, Policy. Netherlands: Springer.
Montgomery, M. and Stufflebeam, R. (2014) Editorial for special issue on writing and editing in the mathematics cur-
riculum: Part I. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies, 24(6), 443–446.
Mulford, D. and Robinson, W. (2002). An inventory for alternate conceptions among first-semester general chemistry
students. Journal of Chemical Education, 79(6), 739. doi.org/10.1021/ed079p739.
National Council of Teachers of Mathematics. (2000). Principles and Standards for School Mathematics. Reston, VA:
NCTM.
National Council of Teachers of Mathematics. (1995). Assessment Standards for School Mathematics. Reston, VA:
NCTM.
National Research Council (Mathematics Learning Study: Center for Education, Division of Behavioral and Social Sci-
ences and Education), Adding it up: Helping children learn mathematics, edited by J. Kilpatrick et al., Washington,
DC: National Academy Press, 2001.
Niemiec, C. and Ryan, R. (2009). Autonomy, competence, and relatedness in the classroom: Applying self-determination theory to educational practice. Theory and Research in Education, 7(2), 133–144.
Nilson, L. (2014) Specifications Grading: Restoring Rigor, Motivating Students, and Saving Faculty Time. Stylus Publish-
ing.
Oswald, D. L. and Harvey, R. D. (2000). Hostile environments, stereotype threat, and math performance among under-
graduate women. Current Psychology, 19(4), 338–356.
Owens, K. (2015) A beginner’s guide to standards-based grading. American Mathematical Society blog On Teach-
ing and Learning Mathematics. Retrieved on 8 May 2017 from blogs.ams.org/matheducation/2015/11/20/a-begin-
ners-guide-to-standards-based-grading/.
Pengelley, D. (2017). Beating the lecture-textbook trap with active learning and rewards for all. Notices of the AMS,
64(8), 903–905.
Perera-Diltz, D. M. (2009). Assessment purposes. In E. Bradford (Ed.), ACA Encyclopedia of Counseling (pp. 38–39).
Alexandria, VA: American Counseling Association.
Piaget, J. (1954). The Construction of Reality in the Child. New York: Basic Books Publishing.
Ramsden, P. (1992). Learning to teach in higher education. London: Routledge.
Reinholz, D. (2016). The assessment cycle: A model for learning through peer assessment. Assessment and Evaluation in Higher Education, 41(2), 301–315.
Rishel, T. (2000). Teaching First: A Guide for New Mathematicians. Washington, DC: Mathematical Association of
America.
Roediger, H. L. and Pyc, M. A. (2012). Inexpensive techniques to improve education: Applying cognitive psychology to
enhance educational practice. Journal of Applied Research in Memory and Cognition, 1(4), 242–248.
Rovai, A. P., Ponton, M. K., Derrick, M. G., and Davis, J. M. (2006). Student evaluation of teaching in the virtual and
traditional classroom: a comparative analysis. Internet and Higher Education, 9(1), 23–35.
Ryan, R. and Deci, E. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54–67. doi.org/10.1006/ceps.1999.1020.
Sandeen, C. (2013). Assessment's place in the new MOOC world. Research and Practice in Assessment, 8(summer),
5–12.
Sandusky, S. (2015) Gamification in Education, The University of Arizona, Educational Technology Program. Retrieved
from hdl.handle.net/10150/556222.
Schoenfeld, A. H. (2015). Summative and formative assessments in mathematics supporting the goals of the Common
Core Standards. Theory Into Practice, 54(3), 183–194.
Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189.
Smith, M. K., Wood, W. B., and Knight, J. K. (2008). The Genetics Concept Assessment: A new concept inventory for
gauging student understanding of genetics. CBE Life Sciences Education, 7(4), 422–430. doi.org/10.1187/cbe.08-08-
0045.
Soto-Johnson, H. and Fuller, E. (2012). Assessing proofs via oral interviews. Investigations in Mathematics Learning,
4(3), 1–14.
Soto-Johnson, H., Yestness, N., and Dalton, C. (2009). Assessing multiple abstract algebra assessments. Investigations
in Mathematics Learning, 1(3), 1–26.
Stahl, R.J. (1994). Using “think-time” and “wait-time” skillfully in the classroom. Educational Resources Information
Center (ERIC) Digest. Retrieved from files.eric.ed.gov/fulltext/ED370885.pdf.
Star, J. R. (2005). Reconceptualizing procedural knowledge. Journal for Research in Mathematics Education, 36(5),
404–411.
Steen, L. (ed). (2006). Supporting Assessment in Undergraduate Mathematics. Washington, DC: Mathematical Associ-
ation of America.
Stee, L. P. and ompson, P. W. (2000). Radical Constructivism in Action: Building on the Pioneering Work of Ernst von
Glasersfeld. New York: Routledge.
Stewart, B. L., Waight, C. L., Norwood, M. M., and Ezell, S. D. (2004). Formative and summative evaluation of online
courses. e Quarterly Review of Distance Education, 5(2), 101–109.
Stewart, J. (2012). Single Variable Calculus: Early Transcendentals. Boston: Cengage.
Su, Francis (2015) Mathematical microaggressions. MAA Focus, October/November, 36–38. Retrieved from digitaleditions.walsworthprintgroup.com/publication/?i=278032&p=36.
Sullins, J., Meister, R., Craig, S. D., Wilson, W. M., Bargagliotti, A., and Hu, X. (2013). The impact of a mathematical intelligent tutoring system on students' performance on standardized high-stake tests. In Falmagne, J.-C., Albert, D., Doble, C., Eppstein, D., Hu, X. (eds), Knowledge Spaces: Applications to Education, Berlin, Heidelberg: Spring-
er-Verlag, 69–78.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.
doi.org/10.1016/0364-0213(88)90023-7.
Tall, D. (2013). How Humans Learn to Think Mathematically: Exploring the Three Worlds of Mathematics. Cambridge:
Cambridge University Press.
Vandervelde, S. (2013). Bridge to Higher Mathematics (2nd Ed). Sam Vandervelde, Lulu.com.
Vygotsky, L.S. (1978). Mind in Society. Cambridge, MA: Harvard University Press.
Walvoord, B. E., and Anderson, V. J. (1998). Effective grading: A tool for learning and assessment. San Francisco: Jossey-
Bass.
Wiggins, G. (1998). Educative assessment: Designing assessments to inform and improve student performance. San Fran-
cisco: Jossey-Bass.
Wiliam, D. (2007). Keeping learning on track: Classroom assessment and the regulation of learning. In F. K. Lester
(Ed.), Second Handbook of Research on Mathematics Teaching and Learning: A Project of the National Council of
Teachers of Mathematics (pp. 1051–1098). Charlotte: IAP-Information Age Publishing, Inc.
Wiliam, D. (2000). Formative assessment in mathematics part 3: The learner's role. Equals: Mathematics and Special
Educational Needs, 6(1), 19–22.
Wiliam, D. (2007). Content then process: Teacher learning communities in the service of formative assessment. Ahead
of the curve: The power of assessment to transform teaching and learning, 183–204.
Wladis, C., Oenholley, K., Lee, J., Dawes, D., and Licwinko, S. (2017). An instructor-generated concept framework
for elementary algebra in the tertiary context.Congress of European Research in Mathematics Education, 2017
Conference,Dublin, Ireland. Retrived from keynote.conference-services.net/resources/444/5118/pdf/CERME10_0353.pdf.
Wladis, C., Oenholley, K., Licwinko, S., Dawes, D., and Lee, J. (2017, in press). Instructor-generated concepts frame-
work for elementary algebra in the college context. Proceedings of the 19th Annual Conference on Research in
Undergraduate Mathematics Education. Washington, DC: Mathematial Association of America. Retrieved from
sigmaa.maa.org/rume/crume2017/Abstracts_Files/Papers/132.pdf.
Zerr, R. (2007) A quantitative and qualitative analysis of the effectiveness of online homework in first-semester calcu-
lus. Journal of Computers in Mathematics and Science Teaching, 26(1), 55–73.
Design Practices
Mathematics instructors today have access to an expansive body of research literature that addresses how
students learn, effective teaching methods, and how course context affects students' ability to learn. This research is threaded through the Classroom Practices and Assessment Practices chapters of this document and provides a foundation for this Design Practices chapter. We define design practices to be the plans and choices instructors make before they teach and what they do after they teach to modify and revise for the
future. Design practices inform the construction of the learning environment and curriculum and support
instructors in implementing pedagogies that maximize student learning.
Key aspects of course and lesson design are identifying goals for student learning, selecting instructional
strategies to achieve those goals, and choosing methods to assess student learning. This chapter makes these
considerations explicit while acknowledging that great teachers have long engaged in these activities in the
construction of their courses.
The chapter is divided into four sections: (1) an introduction to design practices, including questions that
can guide design; (2) research-based components of design; (3) opportunities and challenges in designing
for student-centered engagement; and (4) a brief introduction to theories of design. Integrated throughout
this chapter are seven in-depth classroom vignettes, examples of how design practices might appear from
the concrete point of view of a mathematics instructor. While each example is placed to help illustrate a
particular point, most examples are much broader and reinforce other points throughout the chapter. As
a collection they exemplify design at granularity ranging from a single task to an entire course. Given that
the MAA Committee on the Undergraduate Program in Mathematics Guide (2015) provides examples of
designing programs in the mathematical sciences, we do not provide that level of granularity here.
DP.1. Introduction to design practices
Design practices take many forms and can be generally conceptualized as “thoughtful planning.” An in-
structor might attend a workshop at a conference and adapt the workshop materials to design a new course.
Two or three instructors at one or more institutions might work together to redesign a course or to design
one “unit” in a course that they are all going to teach. And then again, an entire department could design a
sequence of courses (e.g., the calculus sequence) that are designed to facilitate more student engagement.
The primary focus of this guide is designing for student-centered learning. This does not mean planning a lecture requires no design; rather, designing a lecture has traditionally focused on writing notes and selecting examples to use during the lecture. Most instructors are comfortable with these practices of design. The purpose of this chapter is to define and illustrate the design work that can result in more active student engagement, with full acknowledgement that this can be much harder and more time-consuming work. It requires more effort than writing notes for a lecture, but research shows such work makes a difference in student learning (CBMS, 2016; Freeman et al., 2014; NRC, 2015). Successful implementation of active engagement classroom components necessitates a shift in the instructor's role relative to traditional lecture that can result in students investing more time and effort in their learning. This extra effort has the potential
to help students be more successful in their courses (Lee, Y., Rosenberg, J., Robinson, K., et al., 2016). The
grounding principles of this design practices chapter are as follows.
Instruction should be designed for all, not just some, students. It is a common belief that individuals are either smart and able to do mathematics, or they are not. This can lead to an approach where teaching simply offers students an opportunity to see if they are in the "smart" category. There is also a tendency for instructors to believe that what worked for them as students will work in teaching others. Some might even claim that such teaching should continue in order to sustain the field of mathematics by producing mathematicians. The authors of this document do not subscribe to any of these beliefs. Instead, we consider mathematics to be a discipline to which all students should and can have access. The authors view teaching as a means to effect positive change in individual students and not as a means for sorting them.
Designing instruction is more than just planning content. Mathematics instruction cannot proceed un-
der the assumption that the only important thing is to have students learn the mathematical content of a
course. When designing instruction, it is important to consider developing mathematical practices, increas-
ing access to the discipline, and encouraging positive dispositions toward mathematics.
Instructional design should aim to effect meaningful change. Mathematics instructors should be intentional in design with an aim toward instructional improvement. This intentionality includes articulating goals; envisioning how to help students achieve those goals through activities both in and out of the classroom; reflecting on the activities in terms of what content students learned, which mathematical practices students engaged in, and what dispositions toward mathematics students developed; and revising for future use or revisiting lessons via additional activities that address the goals not yet achieved.
Design for student-centered learning must be in sync with evidence-based practices. Just as mathematics research forms the foundation of work in the field, teaching also has a foundation of research evidence. This chapter relies on that research and provides citations for further reading. Mathematicians need not become experts in the mathematics education research field to be successful teachers, but all instructors should be aware of evidence-based practices that can enhance student learning.
DP.1.1. Questions for design
The following is a list of questions we recommend instructors ask themselves at the beginning of a design
process as a productive way to focus on intentional design.
1. Who are the students in this course? What knowledge and skills will students bring to this course?
Who am I in relation to my students, and how might that influence how they perceive me?
What kinds of variation will likely exist in my students’ backgrounds?
What mathematical practices are my students in the habit of using?
What dispositions do my students have toward mathematics?
What do my students believe about who can learn mathematics?
2. What are the course learning goals?
What concepts and procedures do I want my students to master during this course? What does “un-
derstanding” mean in this course?
What will my students believe about mathematics as a discipline and as a way of knowing after this course? How will my students think about their relationship with mathematics after this course?
What are the central questions in this course, how will the course engage them, and why are these
questions of value to my students?
What goals do my students bring to the course? How will my students understand the course learning
goals, and how will I help them understand the goals productively? What will I do to support my stu-
dents in developing appropriate goals?
How are the goals I articulate for this course related to those required or expected by the department
or the institution? How can this course contribute to the larger educational goals for these students at
this institution?
What will convince me that a student has met the course goals?
How will students know they are meeting course goals?
3. What does learning look like in the context of this course?
What experiences will allow my students to make progress toward the course learning goals, keeping
in mind their current mathematical expertise and readiness?
How can my students experience competence, autonomy, or relatedness (as defined in section CP.2.3)
through this course?
How can tasks provide an appropriate level of cognitive demand? (See section CP.2.5 of the Classroom
Practices chapter for a discussion of cognitive demand.)
Is the learning in this course supported with the kinds of reflection and metacognition needed to en-
sure the learning is deep and long-lasting?
4. What promotes student participation in the course?
To what extent are my students motivated? Are they motivated to engage deeply in course tasks or only
supercially?
What is the impact of my students’ motivations on their attitudes, beliefs, or behaviors?
Are my students participating at an appropriate level in authentic mathematical practices?
5. How is this course inclusive?
Does this course create barriers that disproportionately impact certain groups of my students? (E.g.,
will students who live off-campus have a harder time using support resources?) How will my students
access and use available support resources?
How does this course manage the variation in student preparation levels, particularly at the beginning
of the course or new units? Are there ways to use tasks in which my students can engage with “lower
thresholds and higher ceilings” as described in the Classroom Practices chapter?
What is the plan for building an inclusive and equitable classroom as described in the following section
of this chapter?
Is the course design flexible enough to adapt to specific learning needs? For example, how could a
student with limited sight succeed in the course? Are there design choices that would naturally address
needed accommodations? It is not necessary that every course be designed to address every possibility.
Rather, the instructor should question foundational assumptions within the course design that may be
false for some students.
6. How will I provide my students with feedback in this course?
How will I provide my students with formative feedback intended to help them make changes in the
future?
How will I provide my students with summative feedback intended to help them assess how their work
relates to my expectations?
How will my students understand and leverage this feedback?
7. How will I gather information to improve the course?
How will I use formative assessment to determine whether my students have met the learning goals?
How will I use summative assessment to improve the course?
Will I use mid-term course feedback forms or other tools? In what ways should this feedback be anon-
ymous?
What questions will I ask on the end-of-term student feedback forms?
What unanswered questions do I currently have about the course design, and what data could inform
changes to the course in this and future iterations?
The Classroom Practices chapter includes examples of paired board work, where it is easy for one student to do all the work without the other student(s) contributing. In an effort to ensure that all students are engaged in the task at hand, it is useful for instructors to have different groups work on different tasks that are possibly more relevant to the students. For example, in an introductory statistics course it is helpful to have examples from the health sciences, the business world, the sports industry, and the environment because students from these various majors enroll in the course. Such personalized tasks can motivate students, engage them in meaningful mathematics, and help them see the value of the tasks at hand.
Before moving forward we want to emphasize that the course syllabus is the primary venue for addressing
some of the above questions, specically topics centered on course structure, classroom protocols, behav-
ioral expectations, learning outcomes, homework assignments, exams, and other assessments. The syllabus
can help set expectations for an inclusive classroom climate and provide motivation and encouragement for
students to engage deeply in the course tasks. It should also include information for students needing ac-
commodations as detailed in section DP.2.7. Some institutions provide instructors with a syllabus template
or checklist (e.g., academics.lmu.edu/media/lmuacademics/centerforteachingexcellence/ctedocuments/Syllabus%20
Checklist.pdf), and books such as Grunert, Millis, and Cohen (2009) provide guidance.
DP.1.2. Considerations for design
Instructors need to be explicitly aware of many issues when designing instruction. This section comprises
practical recommendations for evidence-based practices in design as documented in the research literature
related to the following:
1. Equity
2. Learning goals for students
3. Research on supporting learning for all
4. Situational factors
5. Learning environments
6. Tasks and activities
7. Homework
8. Formative and summative assessment
9. Reflective instruction
DP.1.3. Designing for equity
Equity research plays a prominent role in the mathematics classroom. Instructor-focused pedagogies tend to ignore that students learn differently and to treat all students the same, which can be inequitable, while student-centered pedagogies have the flexibility to support students in more individualized and more equitable ways.
Gutiérrez’ (2009) Four Dimensions of Equity details four key aspects of the educational process that re-
quire attention:
Access: is refers to the ability to gain intellectual and physical access to mathematical ideas and
mathematical teaching and learning spaces (e.g., classrooms, tutoring centers, oce hours, and infor-
mal interactions).
Achievement: is refers to students’ success in mathematics as traditionally measured (e.g., perform-
ing well on homework and exams, succeeding in courses, and majoring in elds requiring mathemat-
ical knowledge).
Identity: is refers to who our students are, including the resources and ways of knowing they bring
to the learning environment, and to who they become through their participation in mathematics.
Power: is refers to attending to the distribution of power between instructor and student, between
students, and between students and mathematics (e.g., constructor of knowledge versus passive receiv-
er of knowledge, mathematics as an empowering force versus mathematics as a barrier).
We will examine each of these aspects more thoroughly through the lens of course design, and we will
pose questions related to equity upon which a course instructor or developer may reflect. See section XE.2.1 in the Equity section of the Cross-cutting Themes chapter for more information.
Access
Does the nature of this course set up barriers that disproportionately impact certain groups of students (e.g.,
multilingual students, historically marginalized students of color, women, students with disabilities)?
How can I group my students in equitable ways? For example, how does the nature of a given mathe-
matical task lend itself toward certain forms of student grouping (e.g., heterogeneous, mixed-ability
pairs versus homogeneous groups of three or four)? How might I structure group problem-solving
opportunities in the classroom so that the mathematics is accessible to all my students?
How can I approach course design in ways that promote mathematics as a discipline in which all my stu-
dents feel they belong and can grow? For example, are my course activities designed with multiple entry
points that can accommodate students with diverse learning needs (e.g., students with disabilities,
multilingual students)? Do my students with different life circumstances (e.g., live off-campus, have a
full-time job) have equitable access to course resources?
Achievement
How do the participation structures and assessments in this course allow all students to demonstrate their
understanding and inform all students how to advance their learning? Do assessment results provide mean-
ingful data about disparities in learning outcomes?
How will my students participate meaningfully in mathematical work in this course? For example, tasks
with many entry points ("lower floors") and with potential for extensions and connection to more
complex mathematics (“higher ceilings”) can be used to engage all students in the mathematical work.
Participation structures such as think-pair-share, small group explorations, and student presentations
can also be used to create more opportunities for students to participate, contribute meaningfully in
class, and assess their own understanding.
How will assessments provide meaningful feedback to my students about their learning and provide in-
sight for the instructor into ways the course can address all students’ needs? Results of assessments pro-
vide critical information to students about what they have learned and what additional work they must
do to be successful in the course and beyond. Instructors can empower students by helping them learn
how to prepare for assessments, anticipate what will be assessed, and interpret and respond to assess-
ment results. Assessment can also reveal for whom this course is working and for whom it is not. For
example, poor performance by English language learners may indicate that either the course activities
or the assessments should be redesigned.
Identity
In what ways does this course design recognize students’ membership and positioning in society and work
toward the development of positive social and mathematical identities?
How does my course design acknowledge and affirm my students' social identities in learning mathematics? For example, in what ways are affective aspects of mathematical problem solving (e.g., perseverance, learning from constructive critiques of reasoning) valued and taken into account in the course learning outcomes and assessment of those learning outcomes? How is classroom participation in the course structured in ways that mitigate traditional stereotype threat and implicit biases related to gender, race/ethnicity, or other types of status related to mathematical ability? For example, group work opportunities and sharing samples of student work can disrupt preconceived ideas of peers' competence and skills.
To what extent do I value knowledge and experiences that my students bring to the course as resources
for mathematics learning? For example, how can my course activities and assessments leverage my
students’ social backgrounds while they learn mathematics? In what ways does the course design allow
my students to demonstrate and engage in different forms of agency as mathematics learners? Examples of this include using mathematics to challenge the status quo and providing space for different
types of contributions in mathematical problem solving and reasoning.
Power
How does this course support students in constructing mathematical knowledge and empower students
through mathematics?
How will I enable my students to access and take ownership of mathematical ideas? Research on learning
suggests that in order to construct knowledge, people must engage in activities that promote intellec-
tual stimulation and growth such as asking questions, posing problems, and making mistakes. How do
key course activities support my students in the intellectual "heavy lifting" required for learning? How
does the course design promote a culture of positivity, encourage mistake making, and support risk
taking?
How will I balance my students’ ways of knowing with traditional mathematics content and practices?
Instructors should create a safe environment in which students feel comfortable bringing their ways
of thinking about, engaging with, and making sense of mathematics to the classroom. The instructor must empower students to view mathematical ideas as having validity based on the structure of mathematics itself rather than an instructor's decision about validity. At the same time, the instructor
is responsible for conveying to students the traditions, usage, and conventions of mathematics as a
discipline and thereby empowering students as members of the mathematical sciences community.
How does my course empower my students through mathematics? How does the course help my students
recognize the presence and utility of mathematics in their lives? For example, students in calculus
could be asked to discuss with other professors in their major how a problem showcases the utility of
the content they are learning in their mathematics class. How can my course help my students make
sense of critical issues in the world around them through mathematics?
The following illustration typifies an approach to the design of an activity that uses a low-floor, high-ceiling task to establish a classroom community with mathematical power shared by students and the instructor. This illustration is written from the perspective of an instructor three weeks into a liberal arts mathematics course planning a lesson for the following week.
Classroom vignette: Designing a lesson for a liberal arts mathematics class
At the beginning of the course Professor Evans provides each of the thirty freshman students with a Rubik's cube, and they work hard on solving the cube for the first three weeks of the semester. Giving each student a cube alleviates issues for students who either cannot afford a cube or cannot check one out at the reference desk of the library because they live off-campus.
The students work on the cube in class on Thursdays, and the professor covers other course topics on Tuesdays. After most students can solve the first layer of the cube, Professor Evans considers the course goals and the following issues:
1. Some students are still making sense of how the cube works and do not yet see how corner cubies are
dierent from edge cubies.
2. There are two sets of students: those who have completed the first layer and those who have not. The instructor uses ideas discussed in the Classroom Practices chapter to determine how to assign students in each set to groups.
3. Students who have completed the first layer are grouped to work on the next task: making sense of a powerful Rubik's cube technique called "M1". They will use representations other than verbal descriptions and Singmaster notation (used in Rubik's Cube solution guides) to communicate sequences of moves. The goal is for students to become more flexible in reasoning about longer sequences of moves frequently used to effect position changes of corner cubies.
4. Students who are still working to solve the first layer begin working together in groups as well. Students who solve the first layer will move to one of the groups working on the M1 task. If several students solve the first layer at about the same time, they form an M1 group of their own.
An important part of this activity is the way that students communicate about sequences of moves, since
ecient representation of changes to the cube can aid in the solution process. In order to allow students to
engage in productive struggle, the professor resists the urge to help too much and instead encourages them
to talk with other groups and compare representations. The professor asks groups to explain their repre-
sentations to the class early in the solution process so the class recognizes the power of notation to aid in a
solution.
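
To make the role of notation concrete, here is a minimal sketch (in Python) of one way to represent a move sequence in Singmaster-style notation and compute its inverse by reversing the order and inverting each move; the sample sequence is purely illustrative and is not the "M1" technique mentioned above.

    def invert_move(move):
        # Singmaster-style inverses: R -> R', R' -> R, and R2 -> R2
        # (a half turn undoes itself).
        if move.endswith("2"):
            return move
        return move[:-1] if move.endswith("'") else move + "'"

    def invert_sequence(moves):
        # Undoing a sequence requires reversing the order and inverting each move.
        return [invert_move(m) for m in reversed(moves)]

    sequence = ["R", "U", "R'", "U'"]   # an illustrative four-move sequence
    print(invert_sequence(sequence))    # ['U', 'R', "U'", "R'"]

Working out why the inverse reverses the order (the last move must be undone first) is exactly the kind of reasoning about sequences of moves this task is designed to elicit.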
The assessment is complete when all the students have demonstrated their solution to the first layer. All students will earn 100% because this learning opportunity is designed to be accessible to all students. The instructor acknowledges that it is impossible to plan what will happen in every minute of the class because so much depends on what the students do, so instead a rough outline is developed.
DP.2. Student learning outcomes and instructional design
There are a variety of terms (e.g., learning outcomes, learning goals, learning objectives, competencies, student benchmarks) used to describe what students should know upon exiting a lesson or course. Student learning outcomes can include content, cognitive, and affective goals (e.g., MAA, 2015; CCSSM, 2010) and are often tied to pedagogical goals or teaching practices (e.g., Blair, 2006; NCTM, 2015). Content goals are explicit skills and understandings often central to the design of a course. Cognitive goals are less course-specific and include understandings about the practice of mathematics. Affective goals pertain to elements of students' learning that relate to their dispositions toward and emotions about mathematics.
Content, cognitive, and affective goals are important features of successful instructional design and go
well beyond traditional assessment measures. Instructors should identify student learning outcomes that are
robust and practical enough to guide instructional design, and instructors should design instruction that
eectively supports specic student learning outcomes.
Content goals
The CUPM guide (MAA, 2015) lists four cognitive recommendations, nine core content recommendations
for programs in the mathematical sciences, and content goals for a variety of specic courses as well.
CUPM Guide: Content recommendations (MAA, 2015)
1. Mathematical sciences major programs should include concepts and methods from calculus and linear
algebra.
2. Students majoring in the mathematical sciences should learn to read, understand, analyze, and produce
proofs at increasing depth as they progress through a major.
3. Mathematical sciences major programs should include concepts and methods from data analysis, com-
puting, and mathematical modeling.
4. Mathematical sciences major programs should present key ideas and concepts from a variety of per-
spectives to demonstrate the breadth of mathematics.
5. Students majoring in the mathematical sciences should experience mathematics from the perspective of
another discipline.
6. Mathematical sciences major programs should present key ideas from complementary points of view:
continuous and discrete; algebraic and geometric; deterministic and stochastic; exact and approximate.
7. Mathematical sciences major programs should require the study of at least one mathematical area in
depth, with a sequence of upper-level courses.
8. Students majoring in the mathematical sciences should work, independently or in a small group, on a
substantial mathematical project that involves techniques and concepts beyond the typical content of a
single course.
9. Mathematical sciences major programs should offer their students an orientation to careers in mathe-
matics.
In addition, the Assessment Practices chapter of this guide provides advice about specifying measurable content goals and specific learning outcomes to enable effective assessment. For the purposes of the Design Practices chapter, designing activities to meet content goals through student engagement involves deliberate selection of tasks that focus on the mathematical content specified in the goal. Content goals then serve as an important framework for the design of the rest of the course.
We oer two illustrations of designing content goals. e rst demonstrates designing a class activity to
meet a student learning outcome and the second demonstrates using content goals at the course level.
Classroom vignette: Designing an activity for a college algebra class
Student learning outcome (content goal): Describe the meaning of an algebraic formula.
Assessment: Write a paragraph describing the connections between an algebraic formula and a formula
intended to describe a computation.
Activity: Fast Food Efficiency. Each evening, shift managers at a fast food restaurant have to enter data into a form and perform a calculation. This "French fries efficiency" calculation indicates what percentage of fries cooked in a given day were actually sold as opposed to wasted, e.g., dropped on the ground, thrown out, or eaten by employees.
Students simulate the manager's task by taking four samples of data on four separate days and completing the form. Students then encode variables for each non-constant quantity, connect each step using an appropriate operation, write the formula, and program the formula into a spreadsheet to perform calculations for each day of an entire month.
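
As a sketch of where the activity leads, the following Python snippet plays the role of the students' spreadsheet, assuming the efficiency formula is (amount sold)/(amount cooked) x 100; the vignette does not specify the restaurant's actual form, so the variable names and sample data below are illustrative.

    def fries_efficiency(cooked, sold):
        # Assumed formula: percentage of cooked fries actually sold
        # rather than wasted.
        return 100 * sold / cooked

    # Spreadsheet-style calculation for four sample days (invented data,
    # measured in pounds of fries).
    daily_data = [(55.0, 48.5), (60.0, 51.0), (52.5, 50.0), (58.0, 49.0)]
    for day, (cooked, sold) in enumerate(daily_data, start=1):
        print(f"Day {day}: {fries_efficiency(cooked, sold):.1f}% efficient")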
Classroom vignette: Designing a hybrid quantitative reasoning course
The college of business contacted us requesting that we design a sequence of courses for their majors. At that time students were required to complete an intermediate algebra course to fulfill the university mathematics requirement. Over the course of several meetings, the business faculty identified necessary student proficiencies in quantitative reasoning, problem solving, and algebra content. A mathematics instructor offered to take the lead in developing two courses: Quantitative Reasoning for Professionals 1 and 2. This instructor
determined appropriate student learning outcomes for each course and broke each outcome into several
objectives which formed the basis for individual lessons and assessments.
Quantitative Reasoning for Professionals 1: Student learning outcomes
Students should:
1. Apply prior knowledge and mathematical concepts to solve novel problems.
2. Use proportional reasoning to solve problems.
3. Use data to make and defend decisions.
4. Construct algebraic formulas to model real-world quantitative relationships.
5. Manipulate formulas involving a variety of mathematical operations.
Quantitative Reasoning for Professionals 2: Student learning outcomes
Students should:
1. Apply prior knowledge and mathematical concepts to solve complicated, novel problems in context.
2. Identify and create models of linear functions involving verbal, numerical, algebraic, and graphical
representations.
3. Identify and create models of exponential functions involving verbal, numerical, algebraic, and
graphical representations.
4. Solve problems requiring the use of logarithms.
5. Use linear systems of equations and inequalities as well as linear programming to solve problems.
For each course, the first outcome focuses on problem solving rather than content, and the remaining outcomes are content based. The second course increases the expectations for problem solving beyond those in the first course. In order to support problem solving, we often schedule these classes in rooms that support group work, and we utilize a form of inquiry-based learning to provide a meaningful experience for the
students. Some classes have been linked with English composition classes in order to promote communica-
tion skills development in connection with problem solving.
Cognitive goals
The CUPM guide (MAA, 2015) and Beyond Crossroads (Blair, 2006) provide starting points for instructors to identify or create cognitive goals that are robust and practical for guiding instructional design. These reports focus on cognitive goals that support the development of mathematical habits of mind (MAA, 2015, p. 10). More specifically, among the professional associations focusing on undergraduate mathematics education, there is consensus that students should learn how to communicate mathematics effectively, develop
independence in problem solving, and learn how to work with technology (Saxe and Braddy, 2016).
Once instructors establish cognitive goals they can design instruction to support those goals. For example, if an instructor sets a cognitive goal of developing effective mathematical communication skills, the course activities should engage students in mathematical communication. The instructor should consider a variety of ways that students should be able to communicate mathematical ideas, and then choose instructional strategies that require students to speak, write, read, compare, critique, and compose mathematical explanations.
The CUPM guide (MAA, 2015, pp. 10–13) delineates four cognitive goals for programs in the mathematical
sciences.
CUPM Guide: Examples of cognitive goals (MAA, 2015)
1. Students should develop effective thinking and communication skills.
2. Students should learn to link applications and theory.
3. Students should learn to use technological tools.
4. Students should develop mathematical independence and experience open-ended inquiry.
Affective goals
Most instructors have the least experience with student learning outcomes that fall into the category of aec-
tive goals. “Aect is a disposition or tendency or an emotion or feeling attached to an idea or object. Aect is
[composed] of emotions, attitudes, and beliefs” (Phillip, 2007, p. 259). Aective goals complement content
and cognitive goals. Goals that address student access to learning, power, identity, condence, enjoyment,
creativity, curiosity, ability to work with others on mathematical tasks, and ability to seek help and accept
and respond to feedback are all examples of aective factors.
Setting aective goals for a course is especially important in light of increasing research evidence that
the way in which instruction is designed and delivered can directly aect the motivation, condence, en-
gagement, curiosity, persistence, and other aective factors for a growing number of underachieving or
underrepresented students (Ellis, Fosdick, and Rasmussen, 2016). In turn, “positive self-perceptions such
as increased perseverance, risk-taking, and the use of improved cognitive and self-regulatory strategies are
connected to ecient and deep learning” (Hassi and Laursen, 2015, p. 319 ). Furthermore, designing and
implementing student-centered instruction has been shown to improve certain aective factors for students
of color and rst-generation college students (e.g., Kelly and Hogan, 2014).
Aective factors are not typically measured or graded via traditional assessment methods. In a stu-
dent-centered classroom, instructors can monitor many of these factors via formative assessment and make
instructional design decisions based on student progress. Instructors should be purposeful in selecting
classroom activities that provide opportunities to achieve these aective goals and assessments that provide
insight into student progress.
Here is an example of an affective goal used for design:
Affective goal: Students should perceive mathematics as a discipline in which they can participate and to which they can contribute.
Assessment: On the final exam the instructor asks students to reflect on the following prompt: "Describe one way in which you have participated in the discipline of mathematics through this class." The instructor reads these reflections to understand how the topics and structure of the class are interpreted by the students. In future iterations the instructor modifies the course, if necessary, to alleviate some of the logistical and structural barriers.
The next illustration demonstrates how an instructor uses student learning outcomes to redesign a course.
Classroom vignette: Redesigning a capstone course for mathematics majors
Professor Baker was given the task of redesigning a senior capstone course for mathematics majors. The course is offered annually to about 20 graduating mathematics majors, more than half of whom are pursuing teaching careers. Prior instructors and students commented that the projects and course content needed to be revised. Professor Baker's course redesign involved working with a group of instructors for a week during the summer as a course design community of practice and using evidence-based design practices to redesign the course. Their goals were to rethink the student learning outcomes and evaluation criteria and develop a new project assignment focused on students building and demonstrating problem solving and critical thinking skills. Edwards and Hamson (2007) and resources developed by COMAP (www.comap.com/) helped to inform the selection of the content.
Step 1: Rethinking student learning outcomes
Rewriting the student learning outcomes helped clarify the most important aspects of the course. The comparison below shows the original and redesigned learning outcomes. Changes included adding verbs from Bloom's Taxonomy (www.bloomstaxonomy.org/) to indicate the level of student reasoning expected. The team of instructors also added new learning outcomes to better represent the goals for the course within the department's mathematics program.
Original learning outcomes:
1. Understand the basic mathematical modeling process.
2. Solve typical mathematical modeling problems using known models of two types: Monte Carlo simulations and differential equations models.
3. Analyze the solution of mathematical modeling problems.
4. Write reports and make oral presentations on the results of modeling projects.

Redesigned learning outcomes:
1. Apply basic mathematical modeling strategies to solve typical application problems in the physical, social, life, information, and engineering sciences.
2. Effectively analyze and evaluate the quality of mathematical models and model-based interpretations.
3. Find and synthesize connections within and across secondary and postsecondary mathematics content.
4. Use modern computing software as a tool for visualization, simulation, and analysis of mathematical models.
5. Effectively communicate mathematical modeling processes and outcomes in both written and oral forms.
Step 2: Linking assessment to student learning outcomes
The team mapped the major course assessments, learning outcomes, and relative weights in the grading
scheme in the syllabus so that the students would see a clear connection between the student learning goals
and their earned grades.
Assessments                               Learning outcomes   Weight
Homework                                  1, 2, 3, 4, 5       10%
Project #1: Pose a modeling problem       3, 5                15%
Project #2: Solve a modeling problem      1, 2, 4, 5          30%
Major field test                          3                   10%
Skills mastery quizzes                    1, 3, 4             25%
Final presentation                        2, 5                10%
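
To illustrate how the mapped weights combine into a course grade, here is a minimal sketch of the weighted-average computation in Python; the component scores are invented for illustration and are not part of the vignette.

    # Weights from the assessment map above; the scores are hypothetical.
    weights = {"Homework": 0.10, "Project #1": 0.15, "Project #2": 0.30,
               "Major field test": 0.10, "Skills mastery quizzes": 0.25,
               "Final presentation": 0.10}
    scores = {"Homework": 92, "Project #1": 88, "Project #2": 85,
              "Major field test": 78, "Skills mastery quizzes": 95,
              "Final presentation": 90}
    final_grade = sum(weights[k] * scores[k] for k in weights)
    print(f"Weighted course grade: {final_grade:.1f}")  # prints the weighted total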
Step 3: Creating guidelines for student projects
Professor Baker structured the semester-long modeling projects to be completed by teams of three to four
students using the following four-step process: (1) researching a modeling application in a small group, (2)
developing a problem scenario that might be reasonably solved by another group in the class, (3) solving
another group's problem scenario from the prior step, and (4) writing and presenting a technical report of findings.
Project reports required background, problem description, solutions, implications, and limitations sec-
tions and drew on examples from the course textbook. The norms for group collaboration were already
strong among students in the class, but Professor Baker clearly articulated the expectations for teamwork as
an added layer of accountability.
To assess student work the team developed rubrics for the projects that included guidelines for peer review of presentations and instructor evaluation of presentations, and they used the rubrics to inform students of the expectations up front. The rubrics followed the simplified step-down analytic format (Bean, 2011) and utilized American Association of Colleges and Universities frameworks to clarify the meaning of critical reasoning and problem solving (www.aacu.org/summerinstitutes/igea/curriculum).
DP.2.1. Designing the learning environment
There are factors specific to each class that influence the learning environment and an instructor's day-to-day work. Examples include the differences in students' mathematical backgrounds, the number and length of class meetings (e.g., 50-minute versus 75-minute periods), and the purpose of the course (e.g., it serves as a prerequisite for another course). An instructor cannot assume that a student who just finished high school with AP credit will engage in a calculus course the same way as a student taking the course after completing precalculus as an undergraduate.
The way students engage in a course is very different depending on whether the course is online, meets face-to-face, or has been flipped (see the "Designing a flipped classroom" section below). The number of students in a class influences activity design choices. For example, in a small class the instructor might require students to make presentations, but this might not be as feasible in a large class. If a class is flipped and all lectures are delivered via online videos, then the instructor must plan more activities for class time.
A factor over which instructors may have little control is the physical environment. Instructors must work within the constraints of institutional resources. This includes the size of the classroom, the configuration of the room, and the availability of technology. On the other hand, instructors have significant control over some "soft" aspects of a learning environment regardless of the physical classroom setting. For example, an instructor might have the students move their desks together in order to facilitate small group work or might assign seats to ensure students are not sitting only with students who are similar to them.
Note that research on learning environments can be useful to lobby for institutional investment in smaller
class sizes, classrooms conducive to active student engagement, and up-to-date technology. An example of
a physical classroom environment developed to facilitate increased student engagement exists in the field of physics. SCALE-UP classrooms offer students the opportunity to work in small groups, large groups, and as a class, using computers and whiteboards for groups as needed (scaleup.ncsu.edu/FAQs.html).
The learning environment can support or constrain the intellectual space. For example, it is important
to create a place where students feel comfortable communicating with their instructor and peers, feel com-
fortable making mistakes, and feel they are making valuable contributions to the class. An environment that
encourages students to take responsibility for their own learning might require the instructor not always be
in the front of the classroom. It might involve students working at the board or in small groups using indi-
vidual whiteboards. It might involve students presenting ideas in small groups, to the entire class, or to the
instructor one-on-one. These ideas, along with supporting evidence, are discussed further in the Classroom
Practices chapter.
DP.2.2. Designing mathematical activities and interactive discussions
For most instructors designing the actual tasks and activities for a course is the most important part of the
design process, and this aspect of the process is addressed more extensively in the Classroom Practices
chapter. In the present chapter we briefly discuss the development of mathematical tasks that allow students to engage with the mathematics in meaningful ways. The tasks should be engaging and interesting to stu-
dents and have high-level cognitive demand. (See section CP.2.5 of the Classroom Practices chapter for a
discussion of cognitive demand.) Instructors must carefully consider how to launch complex tasks in order
to ensure they are accessible to all students.
We design instruction not only to introduce new content but to review previous content as well. Some-
times this involves brief mini-lectures followed by interactive discussions. At other times an instructor will
design for the introduction of formal mathematical language such as the definition of a mathematical object
or the statement of a theorem. Two schools of thought exist based on how mathematicians operate, and
the design choice depends on the pedagogical goals of the lesson. Sometimes it is helpful to include formal
language with the introduction of new content in order for students to begin working through ideas using
appropriate language. On the other hand, it may be advantageous to allow students to develop new mathe-
matical ideas first, then introduce the language to describe the mathematics. When the goal is for students
to reinvent the mathematics to better understand the ideas, instructors must determine how long to wait
before summarizing students' reinventions and guiding them to formal definitions and theorem statements.
On the other hand, if the goal is to demonstrate the progression of mathematical idea development, using
formal language from the beginning may be helpful.
The following vignette focuses on designing an interactive activity and discussion.
Classroom vignette: Designing a content unit for a differential equations course
One of the ten units in a differential equations course focuses on introducing solutions to systems of differential equations in a setting in which students work in groups. The affective goals focus on empowering students to develop the mathematics themselves and take responsibility for their own learning. The cognitive goals include students learning to work collaboratively, to communicate effectively both verbally and in writing, and to visualize as a way to understand mathematics. The learning outcomes focus on developing student notions of a three-dimensional solution to a system of differential equations, guiding students to recreate Euler's method for systems, and developing student understanding of a phase plane representation of a system.
The instructor begins by creating an activity where students visualize a solution.
Three-dimensional visualizations. A crop duster plane with a two-blade propeller is rolling down a runway. On the end of one of the propeller blades, which are rotating clockwise at a slow constant speed, is a noticeable red paint mark. Imagine that for the first several rotations of the propeller blades the red mark leaves a "trace" in the air as the plane makes its way down the runway.
Simulate this scenario over time with your finger or create the trace with a pipe cleaner. Sketch the ideal perspective (what you would see) for persons at alpha, beta, gamma, and delta. What view do you think is the best and why?
[Figure: diagram of the runway showing the positions of persons Alpha, Beta, and Gamma around the airplane.]
(Person Delta is located in a hot air balloon directly above the airplane.)
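Instructors who want to preview the curve students are asked to imagine can plot it directly. The following is a minimal sketch in Python; the taxi speed, propeller radius, hub height, and rotation rate are illustrative assumptions, not values given in the task.

import numpy as np
import matplotlib.pyplot as plt

# The red mark moves down the runway with the plane while tracing
# a circle in the vertical plane of the propeller.
v, r, h = 5.0, 1.0, 2.0        # taxi speed, propeller radius, hub height (assumed)
omega = 2 * np.pi              # one slow, constant rotation per time unit (assumed)

t = np.linspace(0, 4, 400)     # the first four rotations
x = v * t                      # distance down the runway
y = r * np.sin(omega * t)      # across-runway position of the mark
z = h + r * np.cos(omega * t)  # height of the mark above the ground

ax = plt.figure().add_subplot(projection="3d")
ax.plot(x, y, z)
ax.set_xlabel("down the runway")
ax.set_ylabel("across the runway")
ax.set_zlabel("height")
plt.show()

Rotating such a plot to look down the runway, from the side, or from directly above reproduces the perspectives of the four observers in the task.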
The content goal for this task is straightforward. Students visualize a three-dimensional curve and reason about its properties. The unit continues with tasks whose content goals are that students learn to use Euler’s method in three dimensions and to develop a phase plane. This unit was designed to help students reason at a big-picture level about solutions to systems of differential equations and not just to memorize procedures to solve them. In a later unit students develop procedures to solve linear systems of differential equations using the following activity:
A group of scientists wants to graphically display the predictions for many different nonnegative initial conditions to the rabbit-fox system of differential equations, and they want to do so using only one set of axes. What single set of axes would you recommend they use (R-F-t axes, t-R axes, t-F axes, or R-F axes)? Explain.
Assessment of student learning after the completion of these tasks involves students writing about their understanding in homework, quizzes, or exams. The early development of this unit included reflection on the results and multiple revisions.
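To illustrate the kind of procedure students are guided to reinvent, the following is a minimal Python sketch of Euler’s method for a system. The rabbit-fox rate equations and all parameter values here are illustrative assumptions, not the model from the vignette.

def euler_system(f, g, R0, F0, t_end, dt):
    # Advance both equations together, each step using the
    # current values of R and F (Euler's method for a system).
    R, F, t = R0, F0, 0.0
    trajectory = [(t, R, F)]
    while t < t_end:
        R, F, t = R + f(R, F) * dt, F + g(R, F) * dt, t + dt
        trajectory.append((t, R, F))
    return trajectory

# Assumed rabbit-fox rate equations (illustrative, not from the vignette):
dR = lambda R, F: 2.0 * R - 1.2 * R * F    # rabbits grow, foxes eat them
dF = lambda R, F: -1.0 * F + 0.9 * R * F   # foxes die off unless fed

path = euler_system(dR, dF, R0=1.0, F0=0.5, t_end=10.0, dt=0.01)

Plotting the R and F values from such a trajectory against each other, ignoring t, yields precisely the phase plane (R-F axes) representation the activity above asks students to reason about.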
DP.2.3. Designing homework
ere are dierences between “assigning homework” and “designing homework.” When designing home-
work, the intentionality of the learning outcomes must be considered. It may not always be appropriate to
simply assign a list of problems out of a textbook. Instructors should be cognizant of the purpose for each
assignment and be deliberate in their choices.
Sometimes homework is about skill building. In such cases assigning a set of textbook problems may be a sound choice. However, assigning homework may be related to other goals such as building students’ understanding and conceptualizations of mathematical ideas or helping students reflect on what happened in class. In these cases homework might consist of writing a reflection, watching a video and responding to a prompt, emailing the instructor an answer to a conceptual question, or emailing the instructor a list of questions regarding a particular concept.
What is important and feasible in one course may not be the same in other courses. For example, an instructor’s goal for the homework may be for students to think ahead about new material, as in the case of Dr. Gomez in the second classroom vignette of section CP.1.6 of the Classroom Practices chapter. This goal might be important in a course where one of the instructor’s goals for the students is to learn to read mathematics for understanding, but it may be impractical in another course.
Instructors may design homework that encourages student collaboration and mutual support for their peers. For example, it might take as little as five minutes during a designed class activity for students to connect with peers with whom they will study outside of class. Homework assignments may be created when a unit is designed or after a class has occurred. For example, in a student-centered classroom it can be difficult to know in advance what students will discuss and which concepts they might struggle to understand. If the homework assignments are completely detailed in the syllabus, there is no room for flexibility.
DP.2.4. Designing a flipped classroom
The notion of a flipped classroom is discussed in the Classroom Practices chapter. In this chapter we discuss design practices that need to be considered when flipping a classroom. In instructional designing it can be challenging to manage the allotted class time to give students the best opportunity for success. One strategy that provides a great deal of flexibility in how class time is used is known as “flipping the classroom.” In this pedagogical method the instructor gives students the responsibility to complete readings or watch short video lectures prior to class—that is, the “instruction” is done outside of class. Students then come to class having already been introduced to content, which allows the instructor flexibility in engaging students during class time. For example, an instructor can ask students to bring questions that arise from their reading or from watching the video and spend some class time discussing those questions. If class time is freed of content delivery, time can be spent with students engaged in more collaborative activities.
An instructor who decides to implement the flipped classroom model must make informed decisions about how students will be engaged outside of class and how these outside activities will relate to those in class. For example, an instructor may decide either to find a video or to create a video explaining a mathematical concept and then provide examples of how to work with this concept. During class time the instructor can have students work in small groups to solve a few carefully chosen problems that push students to reason critically about applying this concept in different situations. For additional information and examples, see the Classroom Practices chapter.
DP.2.5. Using formative and summative assessment in design
Formative and summative assessment are both defined in the Assessment Practices chapter, but they also need to be considered as part of instructional design. Formative assessment can be student activities that allow students to better understand what they are learning and to assess their own progress in learning. It can also be an activity that instructors use to inform their own instruction. In every lesson the instructor should include opportunities to gauge student progress as students proceed through the content. This allows the instructor to modify instructional plans to better accommodate students. For example, in student-centered classrooms the instructor receives a continuous stream of formative assessment data as students grapple with their assigned tasks.
Another example of formative assessment is the use of brief closure activities. The closure activity could be an exit ticket (as described in section CP.1.4 of the Classroom Practices chapter), such as an index card on which students respond to questions such as, “What is the clearest and what is the muddiest point from class today?” Another closure activity (also described in section CP.1.4 of the Classroom Practices chapter) is the “one-minute paper,” in which students spend one minute summarizing the class period in writing. The instructor can use responses to such prompts to modify the following day’s plans in order to better address students’ questions, concerns, and misunderstandings.
Summative assessment is used to evaluate student learning and can occur in myriad ways, often with the idea that the class is moving on to a new topic. Examples include an end-of-unit exam, an end-of-course exam, and a portfolio project. More information about summative assessment can be found in the Assessment Practices chapter.
DP.2.6. Reflective instruction
Explicit reflection on the implementation of a lesson is an important component of design. Good design practices include reflecting on how things went during and after individual lessons and after the course as a whole. Some questions that can guide reflection are: Did the students participate as I hoped? Were the learning outcomes met? Did the environment support conceptual learning? What went well and why? What needs improvement and how might it be improved? Are there student comments I want to remember to help recreate the same positive outcome in the future?
Instructors have all had days where they think, “Wow! I hope I can do that again,” or “I need to do that differently next time.” Such real-time reflection may get lost if not recorded. For example, if an activity was not accessible to all students, the instructor should note it and make revisions for the next time. Reflection is also an opportunity to maintain alignment of outcomes, learning activities, and assessments. Reflection time at the end of instruction is essential in order to ensure learning goals are met, but this can be challenging for instructors who teach back-to-back classes or have other impediments to thorough reflection.
DP.2.7. Students needing accommodations
As mathematicians work toward designing courses that incorporate active engagement strategies and provide inclusive learning environments, they must attend to the needs of students with disabilities. In 2011–12, 11% of undergraduate students self-identified as having a disability (National Center for Education Statistics, 2016). These students reported that they had one or more of the following conditions: a specific learning disability (e.g., ADHD, dyslexia, dysgraphia, mathematics anxiety), or a visual, hearing, speech, orthopedic, or health impairment. Students may also have a diagnosis of autism spectrum disorder, which encompasses four separate disorders: autistic disorder, Asperger’s disorder, childhood disintegrative disorder, and pervasive developmental disorder not otherwise specified.
According to the Americans with Disabilities Act (ADA) of 1990, higher education institutions are responsible for making reasonable accommodations when a student provides documentation of a disability (www.apa.org/pi/disability/dart/legal/ada-basics.aspx). The purpose of any accommodation is to give the student an equal opportunity to participate in an academic program. Depending on the need this may involve a note taker, extra time on exams, a quiet testing environment, large-print materials, image-enhancing technology, audio-recorded materials, flexible due dates (e.g., for a student with a temporary disability like pregnancy), or wheelchair-friendly furniture. See www.apa.org/pi/disability/dart/toolkit-three.aspx for a more comprehensive list.
To request an accommodation a student will typically apply to a designated campus office that provides support for students with disabilities. After the office determines whether the student is eligible for accommodations, it will communicate and coordinate with instructors, student housing staff, and other departments on behalf of the student. Having students submit requests for accommodations to a designated office ensures consistency across the institution and removes from the instructor the burden of determining whether an accommodation is appropriate. All course syllabi should detail the process students with special needs are to follow to request reasonable modifications, special assistance, or accommodations in a course.
Students need not disclose their specific disability to instructors, and it is a privacy infringement to request such information. The campus disability services office can be a resource for instructors teaching students with disabilities. For examples of the types of resources and information supplied by disability services offices, see Texas A&M University (disability.tamu.edu/facultyguide/teaching), Towson University (www.towson.edu/dss/dss-faculty-guide-2015.pdf), and Vanderbilt University (cft.vanderbilt.edu/guides-sub-pages/disabilities).
While the MAA Instructional Practices Guide does not discuss teaching mathematics to students with disabilities in detail, the following may be of interest. For information on teaching students with Asperger’s syndrome see Langford-Von Glahn, Zakrajsek, and Pletcher-Rood (2008). Sullivan (2005) wrote about her experience teaching three students with learning disabilities in a general education mathematics course that emphasized making sense of mathematics and engaging in mathematical discourse. Jackson (2002) revealed the falsity of many stereotypes about blind mathematicians. As instructors include more active engagement techniques in their teaching, they should be mindful of both how activities may be inaccessible for students with disabilities and what instructors can do to provide all students with an equal opportunity to participate in the activities. For example, how can the instructor support a note taker or sign language interpreter during group work? How will manipulatives need to be modified for a blind student? What supports might benefit a student with Asperger’s syndrome during social interactions such as group work, think-pair-share, and class discussions? The answers to such questions are highly dependent on individual students and should be navigated on a case-by-case basis.
DP.3. Challenges and opportunities
Successful implementation of engaging classroom components (a) necessitates a shift in the instructor’s role relative to traditional lecture, and (b) tends to alter the time and effort cost for students (Lee et al., 2016). In this section we discuss challenges and opportunities related to designing an active engagement environment.
DP.3.1. Big-picture challenges and opportunities
There are two broad challenges with active-engagement design, and with these challenges come associated opportunities. The first challenge is designing instruction and activities that depend on prerequisite knowledge and skills. According to Megginson (n.d.), lessons designed requiring advance preparation by students (e.g., reading the textbook, watching videos) stand in stark contrast to lessons designed with no preparation expected from the students. The associated opportunity is designing a lesson that extends, clarifies, and enriches students’ knowledge rather than a lesson introducing content for the first time or reintroducing content that students already have experienced. From a design perspective, a lesson that meets this challenge and capitalizes on this opportunity should center on questions related to the advance preparation activities as well as have entry points to the in-class portion for students who have not appropriately prepared.
The second challenge arises from the fact that designing lessons with active engagement does not guarantee that learning will take place. An active environment without the proper tasks, quality student interactions, or timely feedback may actually undermine, not amplify, the learning experience. This provides the associated opportunity to design a lesson that leverages real-time, student-driven problem solving toward increased conceptual engagement in and ownership of the mathematics. For example, clickers and whiteboard peer collaborations are evidence-based pedagogies with the potential to have a positive impact on student learning (Mulnix, Vandegrift, and Chaudhury, 2016; National Research Council, 2015), but it is ultimately the instructor’s management of the active engagement tasks that realizes these opportunities. Thus, from a design perspective, in order to maximize learning, instructors must utilize a variety of strategies to alleviate struggles and misconceptions that influence real-time student work toward a deeper understanding of content and connections (e.g., CCSSM, 2010; McCallum, 2015; Schmidt, McKnight, and Raizen, 1997).
DP.3.2. Other challenges
Each of the following challenges in course design is explored in more depth below.
1. Time to prepare. It initially takes longer to prepare student engagement activities than to prepare lectures.
2. Judging effectiveness. Instructors should engage in a continuous cycle of reflection and revision rather than reusing the same materials without modifying them.
3. Achievement. Instructors may need to recalibrate assessments to reflect updated expectations of students’ work.
4. Content coverage. Instructors may have to reduce the content covered, which will require judicious decisions about what to remove.
5. Buy-in. Instructors may need to manage expectations that students and colleagues have based on their prior experiences and beliefs.
Preparation time
When incorporating changes to fundamental pedagogy, an instructor can expect an increased time commitment in designing or redesigning a course. For example, if an instructor designs a flipped classroom, they may invest considerable time creating videos for students to watch prior to class, corresponding activities, and in-class activities. Of the 1089 instructors who responded to a survey of Faculty Focus readers, the majority found time limitations to be a significant hurdle to flipping a classroom. “Although lack of support was clearly a limiting factor, the biggest barrier of all was time. Approximately 38% of survey participants indicated that time was ‘always a challenge’ and another 31.61% said time was ‘often a challenge’” (www.facultyfocus.com/articles/blended-flipped-learning/flipped-classroom-survey-highlights-benefits-and-challenges/). If a course is taught repeatedly using a similar approach and an instructor continues to refine their methods, the additional time commitment will naturally decrease and stabilize.
Judging effectiveness
Instructors engaged in design practices should expect periods of adjustment and should withhold judgment about the effectiveness of a new approach until they have tried and modified the approach multiple times over a period of weeks or successive terms. During the initial implementation of a particular design, frequent written feedback from students can inform improvements that might be made. Negative student feedback might tempt an instructor to revert to a lecture-based class, but the instructor should remember that no single approach works for all classes or all lessons and should allow time to adjust the design each successive term.
While instructors should make adjustments at the end of each term, adjustments might also be required in real time. An instructor must understand that there will be moments when lessons move in unanticipated directions due to novel student ideas or questions and must evaluate the potential cost and benefit of leveraging the new learning opportunity (Herbst, 2008). The instructor can facilitate such an opportunity by posing key questions that move the discussion in a productive direction.
Departments should also withhold early judgment and support instructors as they employ new approaches. Student evaluations may suffer during early implementation of a student-centered design. For example, students may complain that an instructor just stands there and makes them do all the work. Untenured and contingent instructors particularly require support from colleagues and chairs as they work toward becoming more effective instructors. In the end, students, instructors, and administrators should all be most concerned with the knowledge and understanding attained by students, regardless of which instructional approach is used.
Achievement
Most instructors have learned to calibrate assessments to their student learning outcomes. Using novel instructional approaches can result in a shift in student achievement. For example, students who are accustomed to performing well on procedural tasks may not do as well on conceptual tasks, and as a result their grades might suffer. On the other hand, students who may struggle with procedural skills may be pleasantly surprised to find they are more successful with conceptual tasks. Such shifts can create tension in a class or a department. The design process should build in appropriate student supports to mitigate the effects of novel instructional approaches and the corresponding changes in some students’ grades.
Content coverage
One of the biggest concerns when redesigning a course is ensuring all the necessary content is covered. Instructors may need to reconsider how class time is used in order to achieve the student learning outcomes for the course, especially when the course serves as a prerequisite or covers standardized content for various stakeholders. An instructor should expect that content may not be covered at the same pace as before the redesign, as students may learn some topics faster with a new approach while others take longer. Knowing that students arrive with different levels of preparation, an instructor should design the course with both underprepared and well-prepared students in mind. For example, the instructor might plan to assign additional work outside of class for underprepared students. To assist with content coverage an instructor may choose to incorporate instructional technologies that facilitate learning both in and out of the classroom. See the technology section in the Cross-cutting Themes chapter for more information.
Buy-in
In order to successfully implement a course design, instructors must have the support of both administrators and students when employing a new instructional approach. Instructors should set realistic expectations and create an atmosphere where students can be successful.
Administrators should support instructors in identifying resources such as computers, tablets, manipula-
tives, video equipment, and even suitable classroom space. An appropriately sized classroom with furniture
conducive to student engagement can be instrumental in the success of a student-centered course design.
Most students will expect and be comfortable in a traditional mathematics classroom environment. They understand the traditional implicit contract between students and instructor: the instructor delivers clear lectures with appropriate examples, and the students listen to the lecture and do the assigned problems. Students may not hold up their end of the contract, but they understand the rules. An instructor implementing new instructional approaches with different underlying agreements must make the new expectations explicit. Students will need time to come to terms with the intentional coherence of the methods and messages in the course. This process will require struggle and might elicit rebellion and frustration before students are comfortable with the revised contract. In order to facilitate this process an instructor can explain to students on the first day of class that this will not be a lecture-based class and ask them to read Dana Ernst’s blog post danaernst.com/setting-the-stage/. Using new instructional approaches may require new tools to help students succeed in the class such as offering additional office hours, meeting with struggling students early in the term, and coordinating tutoring opportunities.
The next illustration details how an instructor modified a course design to meet his student learning outcomes related to communication.
Classroom vignette: Designing a revision to an established geometry course
Professor Hamilton revised an upper-division modern geometry course that has a proof-based prerequisite, is the final proof-based course required of preservice teachers, serves as an elective for other mathematics majors, and typically enrolls about 15 students. Historically, instructors teaching such a course adopt (a) an abstract approach to neutral geometry that bypasses high school content, (b) an exploratory approach that uses high school content as common knowledge, or (c) an axiomatic approach that reconsiders the common background. The last time he taught the course, Professor Hamilton identified a number of challenges:
• Some students required more writing experiences in order to build their written communication skills.
• Some students were more successful with work completed outside of class.
• Some students requested a reference text.
• Finally, despite his emphasis on epistemological issues related to formal, axiomatic systems in mathematics, some students were overwhelmed by the discussions of these themes.
108 MAA Instructional Practices Guide
For the redesigned course, Professor Hamilton adopted an axiomatic approach using David Clark’s Euclidean Geometry: A Guided Inquiry Approach (Clark, 2012). Throughout the redesign process, he kept in mind that most of the content of a geometry course is familiar to undergraduates and that this course is primarily intended for preservice high school teachers. He also remained cognizant of the position of the course in the overall program curriculum. In order to emphasize oral communication as well as skill development, the classroom design incorporated inquiry-based learning methods. The course centered on a semester-long activity intended to address the challenges listed above while simultaneously elevating the related learning outcomes. This activity required students to collaborate in writing a geometry textbook. The student-written text consisted of students’ proofs, responses to questions from Clark’s text, and discussions of philosophical issues related to formal, axiomatic systems. The course incorporated a wiki to support asynchronous student collaborations outside of class.
Important aspects of the activity design included:
1. Each student published at least one polished proof to the wiki after each class meeting. The polished proofs were graded for mathematical accuracy as well as for the quality of communication, and this encouraged students to process class discussions about proof presentations and translate them into written form. Publishing their proofs allowed students who did not always shine during in-class discussions to shine outside class via their written work.
2. The student-authored text served as a reference textbook for students to use during and after the course. This aspect of the course helped students learn to read reference texts more critically and gain an appreciation for the choices textbook authors must make when organizing content.
3. The wiki served as a model for an axiomatic system, with hyperlinks acting as logical dependence. Moreover, the habit of only quoting published proofs helped students distinguish the notions of “true” and “proven” as an expert would.
In the course redesign the professor reflected on the overall design of the course and considered the interactions among the course components rather than only considering each component in isolation. He also reflected on students’ perceptions about the course design choices and how he might help students understand the choices he had made. For example, students might initially be overwhelmed by the idea of creating a textbook in this course. The task might seem tedious, pointless, and distracting to students. They might perceive it as an additional, meaningless component of an already demanding course. Thus, the instructor must be transparent about the rationale for various course components and proactively address such anticipated issues.
In later iterations of the course, Professor Hamilton added the feature of concept maps (i.e., networks used to help represent and organize subject knowledge) built by each student. These maps illuminate aspects of the course that have been unclear to students and inform revisions in future iterations of the course.
DP.3.3. Embracing opportunities
Although design practices can present challenges, they also present numerous opportunities for improving outcomes of a course, including the following areas:
• Collaboration. Working relationships can be strengthened and expanded to include additional colleagues.
• Engagement. New instructional approaches can increase student-to-student and student-to-instructor interactions.
• Flexibility. New instructional approaches and the associated tools and skills instructors develop can be employed in other courses.
• Differentiated instruction. New instructional approaches can accommodate students with diverse learning styles.
Collaboration
One of the greatest opportunities associated with engaging in design practices is the possibility of professional collaborations with colleagues both inside and outside the institution. Whether it involves an entire department redesigning a course or just a single instructor considering a new teaching approach, the design process offers an opportunity to build new working relationships. Furthermore, instructors can share their experiences with colleagues at conferences or workshops and can learn from others as well. Such venues provide an opportunity to meet colleagues with similar interests who might become future collaborators.
Engagement
One of the main reasons mathematics instructors try new teaching approaches is to engage and motivate students. However, instructors may find that teaching strategies that engage students also reinvigorate their own interest in a course. After teaching the same course for years, redesigning a course can revitalize an instructor’s teaching, especially if they employ a new delivery method.
Flexibility
Another benefit of engaging in design practices is the flexibility to adopt and adapt the practices in other courses. Though it is often prudent to implement new approaches on a small scale in one course at a time, many of the tools an instructor develops will be effective in other courses. In general, design practices are broadly applicable, the processes require less time and effort with increased experience, and many instructors find themselves gradually using the new practices in all of their classes.
Differentiated instruction
Differentiated instruction is a teaching approach that requires instructors to intentionally plan for student differences to facilitate learning for all students. In a differentiated classroom, instructors divide their time, resources, and efforts to reach students with various backgrounds, readiness and skill levels, and interests more effectively. According to Tomlinson (1999), principles for differentiated classrooms include attending to student differences, respecting students by honoring their differences and commonalities, not treating all students the same, ensuring assessment is ongoing and diagnostic, and modifying content, processes, and products as the instructor proceeds through the class.
The design practices discussed throughout this guide lend themselves to differentiated instruction. If formative assessment provides an instructor with a true sense of the differences and commonalities among the students in the class, instructors can differentiate activities in ways that allow each student to reach their potential. Although instructors tend to use differentiated instruction to focus on helping struggling students, they should also be mindful of students who are excelling in the class. These students also deserve attention, and appropriate challenges can serve to heighten their interest in the content.
DP.4. Theories of instructional design
Design practices are generally informed by theories relevant to the field of mathematics education. We present a brief introduction to some of the theories in this section. We do not assume instructors will employ them exactly as described. We simply provide some background on the tenets of the theories that may help inform the process for interested parties. In the brief summaries that follow, we omit some nuances and details of the development of the theories, but we provide references for readers interested in learning more about each particular theory.
DP.4.1. Backward design
The backward design process requires that instructors first articulate both short- and long-term goals for their students. Instructors then identify evidence that would be useful in determining whether or not students have achieved each goal and create activities that will produce such evidence. Instructors must ensure the goals, activities, and assessments are in alignment. As such, formative and summative assessments should be considered in conjunction with course design rather than as a separate component.
The backward design process is based on Fink’s taxonomy of significant learning, which consists of six dimensions: foundational knowledge, application, integration, human dimension, caring, and learning how to learn. Similar to Bloom’s taxonomy, these dimensions do not exist in isolation, and they manifest differently in each classroom or learning environment. This theory embraces the notion that learning extends beyond memorizing facts and procedures, and instructors should design activities to include as many dimensions as possible in order to fully support students’ learning. A free online resource for this course design process can be found at www.deefinkandassociates.com/GuidetoCourseDesignAug05.pdf.
DP.4.2. Realistic mathematics education
Realistic mathematics education (RME) is an instructional design theory whose central tenets are guid-
ed reinvention and a realistic starting point. Guided reinvention encompasses tasks that guide students to
reinvent mathematical concepts and procedures with assistance from an instructor or someone else more
knowledgeable than the students. The belief is that if students reinvent the mathematics, then they own the
mathematics and are less likely to believe the bearer of knowledge is solely the instructor. Such a philosophy
supports both content and aective goals for student learning.
The second tenet of RME is that the mathematics must be grounded in a realistic starting point that is experientially relevant to the students. In other words, even if the setting is not real, it should be meaningful to the students. For example, in an RME-based differential equations course, students reason about solutions to differential equations using the idea of an “ideal” fish population that grows continuously. Even though the goal in many mathematics courses is to teach abstraction, tying the concepts to something real has the potential to strengthen students’ knowledge and ground their abstracted ideas. Realistic starting points should be part of the instructional design practices, and instructors should look for ways to craft starting points that can be carried over to other lessons.
Classroom vignette: Designing a task for a linear algebra class
As detailed in Wawro, Rasmussen, Zandieh, and Larson (2013), the Inquiry Oriented Linear Algebra materials’ development followed an iterative cycle of task design, implementation, and refinement. In the stage of “creating the initial task sequence,” the designers draw on various sources, including student learning outcomes and mathematical ideas that fit into the broader scope of the course. The task designers aim to create tasks that have the potential to facilitate those desired outcomes, drawing on the RME guideline that tasks should have the potential to elicit students’ intuitive ways of reasoning about mathematical ideas. These ways of reasoning can in turn be leveraged toward more formal mathematical reasoning. Although the designers cannot be certain how students will engage in the task sequence, they draw upon their knowledge of student reasoning, both as linear algebra instructors and as mathematics education researchers, to design a new task sequence.
Next the designers pilot the task sequence with a subset of students outside the actual class. They video record this set of students attempting the task, and one of the researchers interacts with the students as an instructor might. They review the video data to gain information about how the students engaged with the task. What ways of reasoning were elicited? Which ways of reasoning were productive and which were problematic? Did the task facilitate the development of more formal ways of reasoning about the concept(s) involved? The developers use the information gleaned from analyzing the task’s first implementation to inform refinements of the task, and then they use the refined task sequence in a classroom environment, beginning the second iteration of the design research cycle. They continue this cycle of refining the task and implementing the new version until some balanced state is achieved, although the task and its implementation are never completely stable because an inquiry-oriented classroom requires continual responsiveness and adaptation to student thinking.
DP.4.3. Universal design for learning
Universal design for learning (UDL) is a framework based in brain science and designed to improve and optimize learning for all students based on scientific insights into how humans learn (CAST, 2011). There are three primary principles that guide UDL. First, designers should provide multiple means of representation, requiring options for perception, for comprehension, and for language, mathematical expressions, and symbols. The second principle calls for designers to provide multiple means of communication, physical action, expression, and executive functions. Executive functions include setting long-term goals, planning effective strategies for reaching those goals, monitoring student progress, and modifying strategies as needed. The third principle calls for designers to provide multiple means of student engagement. This can consist of creating ways to heighten students’ interest in the work, providing options for sustaining effort and persistence, and offering options for self-regulation.
There are several other learning theories that are important for design and worthy of further reading. These include action, process, object, and schema (APOS) theory (Dubinsky and McDonald, 2001); models and modeling (Lesh and Doerr, 2003); Piagetian constructivism (Piaget, 1967); and sociocultural theory, including the notion of zone of proximal development (Vygotsky, 1978).
DP References
Anderson, L. W., Krathwohl, D. R., Airasian, P., Cruikshank, K., Mayer, R., Pintrich, P., and Wittrock, M. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy. New York: Longman Publishing.
Blair, R. (2006). Beyond Crossroads: Implementing Mathematics Standards in the First Two Years of College. Memphis, TN: American Mathematical Association of Two-Year Colleges.
Boaler, J. (2015). Mathematical Mindsets: Unleashing Students’ Potential through Creative Math, Inspiring Messages and
Innovative Teaching. Hoboken, NJ: John Wiley and Sons.
CAST (2011). Universal Design for Learning Guidelines, version 2.0. Wakefield, MA.
Clark, D. (2012). Euclidean Geometry: A Guided Inquiry Approach. Providence, RI: American Mathematical Society.
Common Core State Standards Initiative. (2010). Common Core State Standards for Mathematics. Washington, DC: National Governors Association Center for Best Practices and the Council of Chief State School Officers.
Conference Board of the Mathematical Sciences. (2016). Active Learning in Post-Secondary Mathematics Education.
Retrieved from www.cbmsweb.org/archive/Statements/Active_Learning_Statement.pdf.
Doyle, T., and Zakrajsek, T. (2013). The New Science of Learning: How to Learn in Harmony with your Brain. Sterling, VA: Stylus Publishing, LLC.
Dubinsky, E., and McDonald, M.A. (2001). APOS: A constructivist theory of learning in undergraduate mathematics education research. In Holton, D. (ed), The Teaching and Learning of Mathematics at University Level. Springer Netherlands. pp. 275–282.
Dweck, C.S. (2008). Mindset: The New Psychology of Success. Random House Digital, Inc.
Edwards, D., and Hamson, M. (2007). Guide to Mathematical Modelling. South Norwalk: Industrial Press.
Ellis, J., Fosdick, B.K., and Rasmussen, C. (2016). Women 1.5 times more likely to leave STEM pipeline after calculus compared to men: Lack of mathematical confidence a potential culprit. PloS One, 11(7), e0157447.
Epstein, J. (2007). Development and validation of the Calculus Concept Inventory. In Pugalee, D.K., Rogerson, A., and
Schinck, A. (eds). Proceedings of the Ninth International Conference on Mathematics Education in a Global Com-
munity. Charlotte, NC. Retrieved from math.unipa.it/~grim/21_project/21_charlotte_EpsteinPaperEdit.pdf
Epstein, J. (2013). The Calculus Concept Inventory—measurement of the effect of teaching methodology in mathematics. Notices of the American Mathematical Society, 60(8), 1018–1026.
Fink, L. D. (2013). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses (2nd ed.). San Francisco, CA: Jossey-Bass.
Grunert, J., Millis, B. and Cohen, M. (2009). The Course Syllabus: A Learning-Centered Approach (2nd ed). San Francisco, CA: Jossey-Bass.
Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand student survey of mechanics
test data for introductory physics courses. American Journal of Physics, 66(1), 64–74. doi.org/10.1119/1.18809.
Hassi, M. and Laursen, S.L. (2015). Transformative learning: Personal empowerment in learning mathematics. Journal
of Transformative Education, 13(4), 316–340.
Herbst, P. (2008). e teacher and the task. In Figueras, O., Cortina, J. L., Alatorre, S., Rojano, T., and Sepulveda, A.
(eds), Proceedings of the 32nd Annual Conference of the International Group for the Psychology of Mathematics
Education (Vol. 1, pp. 125–133). Morelia, Mexico: PME.
Jackson, A. (2002). The world of blind mathematicians. Notices of the AMS, 49(10), 1246–1251.
Langford-Von Glahn, S.J., Zakrajsek, T., and Pletcher-Rood, S. (2008). Teaching students with Asperger Syndrome
(and other disabilities) in the college classroom: Creating an inclusive learning environment. Journal on Excel-
lence in College Teaching, 19 (2 and 3), 107–133.
Laursen, S. L., Hassi, M. L., Kogan, M., and Weston, T. J. (2014). Benefits for women and men of inquiry-based learning in college mathematics: A multi-institution study. Journal for Research in Mathematics Education, 45(4), 406–418.
Lee, Y., Rosenberg, J.M., Robinson, K.A., Klautke, H., Seals, C., Ranellucci, J., Saltarelli, W.A., Linnenbrink-Garcia, L., and Roseth, C.J. (2016). Comparing motivation and achievement in flipped and traditional classroom contexts: The role of self-regulated learning. American Educational Research Association 2016 Annual Meeting. Washington, DC.
Lesh, R. and Doerr, H.M. (2003). Foundations of a models and modeling perspective on mathematics teaching, learn-
ing, and problem solving. In Lesh, R. and Doerr, H.M. (eds), Beyond Constructivism: Models and Modeling Per-
spectives on Mathematics Teaching, Learning, and Problem Solving (pp. 3–33). Mahwah, NJ: Lawrence Erlbaum
Associates, Inc.
McCallum, W. (2015). The Common Core State Standards in Mathematics. In Cho, S. (ed), Selected Regular Lectures from the 12th International Congress on Mathematical Education. Springer.
Mulnix, A.B., Vandegrift, E.V., and Chaudhury, S.R. (2016). How important is achieving equity in undergraduate STEM education to you? Journal of College Science Teaching, 45(4), 8–11.
National Center for Education Statistics. (2016). Digest of Education Statistics, 2014 (2016-006), Table 311-10. Re-
trieved from nces.ed.gov/programs/digest/d14/tables/dt14_311.10.asp?referrer=report.
National Research Council. (2015). Reaching Students: What Research Says About Effective Instruction in Undergraduate Science and Engineering. Washington, DC: The National Academies Press.
Rasmussen, C., Zandieh, M., King, K., and Teppo, A. (2005). Advancing mathematical activity: A practice-oriented view of advanced mathematical thinking. Mathematical Thinking and Learning, 7(1), 51–73.
Rhea, K. (n.d.). The Calculus Concept Inventory at a large research university. Unpublished manuscript.
Ross, H.J. (2014). Everyday Bias: Identifying and Navigating Unconscious Judgments in Our Daily Lives. Lanham, MD: Rowman and Littlefield.
Saxe, K. and Braddy, L. (2015). A Common Vision for Undergraduate Mathematical Sciences Programs in 2025. Wash-
ington, DC: Mathematical Association of America.
Schmidt, W.H., McKnight, C.C., and Raizen, S.A. (1997). A splintered vision: An investigation of US science and mathematics education. Executive summary. U.S. National Research Center for the Third International Mathematics and Science Study, Michigan State University.
Steele, C.M. (2011). Whistling Vivaldi: And Other Clues to How Stereotypes Affect Us. New York, NY: W. W. Norton and Company.
Sullivan, M.M. (2005). Teaching mathematics to college students with mathematics-related learning disabilities: Report from the classroom. Learning Disability Quarterly, 28(3), 205–220.
Tomlinson, C.A. (1999). The Differentiated Classroom: Responding to the Needs of All Learners. Alexandria, VA: Association for Supervision and Curriculum Development.
Vygotsky, L.S. (1978). Mind in Society. Cambridge, MA: Harvard University Press.
Wiggins, G.P., and McTighe, J. (2005). Understanding by Design. Alexandria, VA: Association for Supervision and Curriculum Development.
Cross-cutting Themes
The role of technology and the role of equity in teaching and learning mathematics are inherent in each of the three practices detailed in this guide: classroom, assessment, and design. Instead of discussing technology and equity at length within each of the practice chapters above, we have chosen to address them here as themes that cut across all three practices. We encourage instructors to reflect on how these two themes play a role in classroom, assessment, and design practices beyond the examples included in this guide.
Technology and instructional practice
XT.1. Introduction
In today’s world, technology is ubiquitous and applicable to many aspects of instructional practice. As such, instructors should continually examine how and where technology fits into their work. Classes may incorporate audience response systems or computer-based explorations, assessments may include online homework and examinations, and course design may depend on specialized classroom technologies or even the absence of a physical classroom. The technological landscape is vast and potentially intimidating but immensely powerful in facilitating student learning.
The central theme of this guide, intentional instructional practice, is especially important in the context of technology. The latest instructional technology fad is not guaranteed to enhance student learning, particularly if it is used in a less than intentional way or in a way that does not align with learning goals, teaching style, or the instructor’s comfort level with the technology itself. This chapter details ways in which technology may be used for instruction, synthesizes research related to the effectiveness of technology in improving student learning, and includes suggestions for integrating technology into the classroom, design, and assessment practices described in this guide.
XT.2. Uses of technology
The CUPM Guide (MAA, 2015) identifies five broad areas in which technology can be used to enhance teaching and learning: exploration, computation, assessment, communication, and motivation. This Instructional Practices Guide provides a lens through which to view these five areas interwoven with classroom, assessment, and design practices in an effort to promote student engagement and learning.
For example, exploring a mathematical object or concept with technology may facilitate student engagement while simultaneously facilitating formative assessment. The computational power of technology can facilitate work on projects set in real-world contexts that in turn may motivate increased student interest and offer more sustained opportunities for collaboration and communication. Such projects can also serve as capstone, i.e., summative, assessments of student knowledge and understanding. Technology can serve as a venue for communication among students and between students and instructors via online chats or phone apps that both engage students and provide formative feedback. Furthermore, online homework or examinations may serve as formative or summative assessments and encourage a higher level of student engagement, particularly with systems that provide immediate feedback to students on their work. Technology can serve as both a learning tool and an assessment instrument depending on how and when it is used. Accordingly, instructors should consider how available technologies support their instructional goals and intentionally design courses to utilize available technology to increase student engagement and improve learning. The variety of technology available can further complicate these goals, and the use of technology will vary greatly from instructor to instructor. Thus, this guide is not intended as prescriptive; rather, it outlines research on the effectiveness of technology in improving student learning and provides information to assist instructors in making informed choices on the use of technology in their classrooms.
XT.3. Effectiveness of technology
The use of modern technology in mathematics instruction might be dated to the advent of the electronic calculator in the early 1970s. The breadth of research on the use of these calculators in the K–12 curriculum is informative for instructors at the postsecondary level as well. The use of calculators is necessarily focused on improving student learning (rather than, for example, assessing student knowledge), and the research on their effectiveness is unambiguous. The National Council of Teachers of Mathematics (NCTM, 2011) endorses their use, based on the finding that “the use of calculators in the teaching and learning of mathematics does not contribute to any negative outcomes for skill development or procedural proficiency, but instead enhances the understanding of mathematical concepts and student orientation toward mathematics” (p. 1). This strong statement is based on several meta-analyses of numerous research studies on the effectiveness of calculator use in improving student learning (e.g., Ellington, 2003, 2006; Hembree and Dessart, 1986).
The strength of this conclusion merits additional discussion. The calculator technology referenced in the statement is, by today’s standards, almost trivial, having a very limited set of capabilities. Yet, the existing body of research is significant in breadth, spanning 35 years and hundreds of research articles. The exciting news is that there now exists a definitive canon of literature that pinpoints a central theme underlying effective instructional practice, dating back at least to the electronic calculator:
Students must actively engage with the concepts they are learning (CBMS, 2016; Freeman et al., 2014; Kogan and Laursen, 2014; Laursen et al., 2014).
Thus, when instructors intentionally select mathematical tasks along with appropriate technology to promote student engagement with the material, they have the greatest chance of improving student learning. It is difficult to overstate the importance of this conclusion.
Many other studies have shown enhanced student outcomes based on the use of various technologies such as graphing calculators (Ellington, 2006), audience response technology (i.e., “clickers”; see e.g., Cline and Zullo, 2012; Simelana and Skhosana, 2012; Stowell and Nelson, 2007), and online homework and examinations (e.g., Hirsh and Weibel, 2003; LaRose and Megginson, 2003). If instructors use technology in ways consistent with the central themes of student engagement and intentional task selection, they can expect to see improved student learning outcomes. We argue that questions such as “Does this technology fit into this learning environment?” and “How should I use this technology in that class?” are the wrong questions with which to begin. Instead, instructors should begin by considering their learning goals and their own comfort level with various technologies, then ask “Which technologies can help me accomplish my goals and create an engaging learning environment?” Fundamentally, this question motivates the remainder of this discussion.
XT.4. Technology incorporated into instructional practice
Given the breadth of the applications of technology that are available and the speed with which they change, this guide does not attempt to provide an exhaustive list of current applications. Rather, it offers illustrative examples of the use of various technologies that are easily adaptable to different environments and other technologies with comparable capabilities.
XT.4.1. Technology and exploratory activities
Many technology applications are appropriate tools for engaging students in exploring mathematical concepts. This type of technology can experientially provide students a view of mathematical structures and relationships, similar to the way they observe the manifestation of physical laws in a lab science course. In other cases, students may use the technology to consider problems that are intrinsically interesting, which in turn may increase student motivation and engagement with the material.
Example 1. The first vignette in section DP.2 provides an example of using technology to promote student exploration of mathematical concepts. Many courses include a student learning objective to demonstrate understanding of the relationship between a mathematical expression (e.g., a formula or equation) and the quantities appearing in the expression. Technology is well-suited for actively exploring this type of relationship. The vignette describes how students use a spreadsheet application to explore ways in which different terms in a formula affect the output.
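Although the vignette uses a spreadsheet, the same exploration is easy to script. The following minimal Python sketch tabulates how the output of a formula changes as one term is varied while the others are held fixed; the loan-payment formula and the parameter values are illustrative assumptions, not the formula from the vignette.

# Vary one quantity in a formula while holding the others fixed,
# the scripted analogue of dragging a value down a spreadsheet column.
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

for rate in (0.03, 0.04, 0.05, 0.06, 0.07):
    print(f"annual rate {rate:.2f}: payment {monthly_payment(100000, rate, 30):.2f}")

Students can then be asked why the output responds the way it does, connecting the numerical pattern back to the structure of the expression.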
Example 2. A similar example from a differential equations course is shown below. A learning objective for the course is that students will demonstrate understanding of the relationship between solutions to a nonlinear system and to its linearization at a critical point. Because solving a nonlinear system is, in general, not analytically possible, this is an opportunity for students to utilize technology such as Maple, Mathematica, Matlab, or Sage to investigate the relationship. The following set of exercises can be completed with such technologies.
Differential Equations lab
A. Model
A van der Pol oscillator is a model of an active RLC circuit with a nonlinear resistor that dissipates energy when the amplitude of the current is high, and pumps energy into the system whenever the amplitude of the current is too low. An equation modeling such a circuit is

(a)  x″ + μ(x² − 1)x′ + x = 0,

where μ is a positive constant.
B. Prelab activities
Before coming to lab, complete the following activities:
1. Write the van der Pol oscillator (a) above as a first-order system in x and y = x′.
2. Show that the only critical point (i.e., values of x and y for which both x′ and y′ are zero) for the system from activity 1 is the point (0, 0).
3. Near (0, 0), we can linearize the system by assuming x and y are very small. If this is the case, then terms like x² and x²y are very, very small—so small that it is reasonable to drop them to obtain a linear system. Find a linear system approximating your system from activity 1, and write it in matrix form.
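For the instructor’s reference, prelab activity 1 yields the first-order system

x′ = y,  y′ = μ(1 − x²)y − x,

and dropping the x²y term near (0, 0), as in prelab activity 3, gives the linear system

x′ = y,  y′ = μy − x,

whose coefficient matrix has first row (0, 1) and second row (−1, μ).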
C. Lab activities
Ignoring, for the moment, the fact that part A above indicates μ must be positive, consider the case μ = −1.
4. Find numerical solutions to the nonlinear and linear systems you found in the prelab activities for several initial conditions close to the origin. For each initial condition, plot a component plot of x as a function of time with the solutions to the linear and nonlinear system on the same graph.
5. Find numerical solutions to the nonlinear and linear systems for larger initial conditions. (Note that for this value of μ, you must carefully choose the range of t values for the nonlinear system when the initial conditions are large enough.)
6. Finally, graph two phase portraits, with x on the horizontal axis and y on the vertical. In the first, include all the solutions to the linear system you found in activities 4 and 5, and in the second, include all the solutions to the nonlinear system.
Repeat activities 4, 5, and 6 for μ = 1. Note how your results are similar and different.
D. Lab writeup
Write a 1–2 page summary that explains how the solution to the linearized system you obtained in prelab activity 3 is similar to and different from the solution to the nonlinear system from prelab activity 1 and where the linearization allows us to say something about the solution to the nonlinear system. Include graphs of solutions to the systems for different initial conditions and an explanation of why the results make sense given how you obtained the linear approximation to the nonlinear system.
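As one concrete realization of the lab, the following minimal Python sketch solves and compares the nonlinear system and its linearization; any of the systems named above would serve equally well, and the initial condition and time range are illustrative assumptions.

import numpy as np
from scipy.integrate import solve_ivp
import matplotlib.pyplot as plt

mu = -1.0  # the first case considered in the lab

def nonlinear(t, u):
    x, y = u
    return [y, mu * (1 - x**2) * y - x]   # van der Pol as a first-order system

def linearized(t, u):
    x, y = u
    return [y, mu * y - x]                # the x**2 terms dropped near (0, 0)

u0 = [0.1, 0.0]                           # an initial condition near the origin (assumed)
t_eval = np.linspace(0, 20, 1000)

for rhs, label in [(nonlinear, "nonlinear"), (linearized, "linear")]:
    sol = solve_ivp(rhs, (0, 20), u0, t_eval=t_eval)
    plt.plot(sol.t, sol.y[0], label=label)  # component plot of x(t)

plt.xlabel("t"); plt.ylabel("x"); plt.legend(); plt.show()

Running the same script for initial conditions farther from the origin, and again with mu = 1.0, produces the comparisons the lab activities call for.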
For students to productively engage in tasks using technology, the learning environment must promote
the value of this type of student engagement, and instructors must clearly articulate their expectations of
students. In addition, instructors must structure tasks in ways that promote productive collaboration among
students. Finally, the task design must reflect the learning goals the task is intended to accomplish.
XT.4.2. Technology and formative assessment
There are many ways in which technology can be used for formative assessment. Following are two examples that build on vignettes from the Assessment Practices chapter.
Example 3. In Vignette 1 in section AP.2.1, Dr. Doe gives an in-class quiz over content from the previous
class period and uses the results to adjust what she covers in the subsequent class period to adequately
address students’ difficulties with trigonometric identities. The following week, she decides to implement
a similar feedback loop using her online homework system (e.g., ALEKS, MAA WeBWorK, MyMathLab/
MyStatLab, WebAssign) rather than an in-class quiz to provide feedback on students’ understanding prior
to class. (She might instead have used her course management system, e.g., Blackboard, Canvas, Moodle,
Sakai.) Dr. Doe creates a writing assignment, due a few hours before class, that requires students to answer
simple questions over material they were assigned to read prior to class. Reviewing students’ responses be-
fore class will provide her a good sense of the concepts she need not review and those she should cover more
in-depth.
Example 4. A similar application of technology is illustrated in the vignette in section AP.6.2. Professor Or-
dinal utilizes a classroom polling system to assess student learning in real time. This both increases student engagement in class and provides formative feedback to Professor Ordinal on what students understand and what their misconceptions are. In this respect, the polling system serves the same role as Dr. Doe’s quiz and
writing assignment (Example 3 above). Instructors who understand their students’ current knowledge states
are better positioned to facilitate increased student learning.
Example 5. Instructors can use formative assessment to encourage student ownership of their own devel-
opment. A common example is a skills or “gateway” test in a first-semester calculus course with a learning goal of students mastering various fundamental techniques of differentiation. Following is an example of a
technology-based test:
Calculus differentiation gateway test
1. Set-up
e dierentiation gateway test is administered through an online homework system and exists in two
identical versions. Both versions allow students as many attempts as they wish, consist of problems drawn
from the same problem bank, have a xed time limit, and allow a single submission. One version, called
the “Practice Gateway Test,” allows student access from any location and requires no proctor authoriza-
tion. e other version, the “Proctored Gateway Test,” allows student access only in a specic computer lab
where proctors are available to verify the students’ identities and provide a required password.
2. Syllabus description
is is a rst course in calculus, and its primary goal is for you to obtain a rich, conceptual understanding
of the fundamental ideas of calculus. Much of the work you will do in this course focuses on conceptual
ideas and will require you to reason about concepts on many levels. However, there are some basic skills
involving dierentiating functions that you will also learn in this course, skills that every student taking
rst-semester calculus can and should master. Toward this end, you are required to take a dierentiation
gateway test on which you must correctly complete at least six of the seven problems within 30 minutes in
a computer lab with a proctor present. You have two weeks to complete this task. You may take the test in
the lab up to twice per day, but you must review your rst test with one of the tutors in the Math Learning
Center before returning to the testing lab to take the test the second time that day. You may practice the
test as many times as you like, from wherever you like, by logging into the online homework system for this
course and clicking the “Practice Gateway Test” assignment. I recommend you complete the practice test
as many times as necessary until you can reliably pass it before you attempt it in the proctored computer
lab. Failure to complete the gateway test successfully by the deadline will lower your grade in this course a
full letter grade.
3. Technical details
The test consists of seven questions, each of which is drawn from a test bank associated with a specific type of differentiation problem. One requires using the product rule, another requires implementing the quotient rule, another requires differentiating an expression with symbolic parameters, etc. If a student takes the test twice in the lab and does not pass it, they are required to take a break of at least one full day, increasing the likelihood that they will use the practice test to hone skills they have not yet mastered.
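The testing logistics above reduce to a small eligibility rule. As an illustration only (the function and parameter names below are hypothetical, not drawn from any real homework system, and “a break of at least one full day” is given one possible reading), the rule might be encoded as follows:

from datetime import date, timedelta
from typing import Optional

def may_attempt_proctored_test(attempts_today: int,
                               reviewed_with_tutor_today: bool,
                               second_failure_date: Optional[date],
                               today: date) -> bool:
    # Mandatory break: after two failed proctored attempts, the student takes
    # at least one full day off (read here as: eligible again two calendar
    # days after the second failure -- one interpretation of the policy).
    if second_failure_date is not None and today < second_failure_date + timedelta(days=2):
        return False
    # At most two proctored attempts per day.
    if attempts_today >= 2:
        return False
    # A tutor review is required between the first and second attempt of a day.
    if attempts_today == 1 and not reviewed_with_tutor_today:
        return False
    return True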
In this case, the technology serves two purposes: it minimizes the human effort required to manage the logistics of the testing process, and it provides students immediate formative feedback when they complete the test as well as correct solutions demonstrating how each problem could have been solved. It also frees up instructor time from grading, allowing more time for engagement with students outside of class (e.g., during office hours).
XT.4.3. Technology as a tool
There are fundamental tasks instructors must complete in every course they teach: distributing the syllabus and other resources, assigning homework, communicating with students outside of class, and so on, all of which can be facilitated by technology. Instructors may distribute hard copies of materials to students or post them electronically using a course management system (CMS) such as Blackboard, Canvas, Moodle, or Sakai. Instructors may assign written homework, i.e., “pencil-and-paper” assignments, or may make assignments to be completed online via systems such as ALEKS, MAA WeBWorK, MyMathLab/MyStatLab, and WebAssign. They communicate with students in class and during office hours and may also use an online chat, forum, or discussion board application as a stand-alone tool or perhaps embedded in a CMS.
In all cases, instructors should choose technology tools based on the functional capabilities of the tool and how the tool impacts student learning. Students are often inclined to first check their CMS for course information and resources, so distributing materials via a CMS can be a highly effective way to provide access and encourage student use of the materials. A chat application provides the means for an instructor to engage in real-time discussions with students at times when the instructor is not available on campus. A forum or discussion board application can promote communication between the instructor and students as well as among the students themselves, creating a classroom community in which all participants are working to advance learning for all.
Example 6. Section 6.1 of the Assessment Practices chapter includes a discussion of the advantages and
disadvantages of online homework systems in some detail. Below is an example of how an instructor might
include online homework in a course.
Math 314 course components
Online homework: Homework administered through our online homework system will cover most of the
course content and will be due most Wednesdays as indicated in your day-by-day syllabus schedule. On
each problem of each homework assignment, you are allowed up to six attempts, and the system provides
immediate feedback on the correctness of each attempt. Once the assignment has closed, you can see the
correct answers to the problems along with detailed solutions.
Reading homework: Brief written responses to reading questions will be due approximately daily. These short assignments comprise three questions over the material you are to read before class. The questions are chosen to highlight important formulas and ideas and are due slightly in advance of each class period so I can use them to determine which topics need more or less coverage in class.
Written homework: Approximately weekly, on Fridays, a written homework assignment will be due. These require solutions written out with full explanations. The problems are intentionally chosen to be more in-
volved and conceptual than the online homework problems and are designed to give you the opportunity
to explore the course content more deeply. Your lowest written homework score will be dropped from your
course grade calculation.
This example illustrates how the online homework system is used to complement other assessment methods and promote student engagement with course content. The strength of the technology—providing immediate feedback to students and aggregated results to the instructor—is exploited and frees up “grading time” for shorter, traditional assignments that focus on deep, meaningful exploration of concepts as well.
We acknowledge the challenges to academic integrity that may be exacerbated by the use of technology for instruction. For example, a student may use their smartphone to send a copy of the exam to a friend who can provide solutions to them during the exam. In a multi-section course taught by the same instructor, a student may use their phone to share a copy of the exam with a student in the subsequent section to allow that student time to prepare answers prior to the exam. The internet provides free, instantaneous access to tools such as Wolfram|Alpha as well as the ability to communicate with anyone else anywhere in the world who has internet access. There are discussion boards where students can obtain solutions by simply posting even the hardest proof that might be assigned in an advanced graduate mathematics course. There are online tutors who provide solutions to any homework assignment an instructor might make. Obtaining answers to many problems from undergraduate mathematics courses requires nothing more than a simple internet search for the text of the problem. The reality is that the ubiquity and power of technology means any assignment completed in an unproctored environment can be completed without a student doing any of the expected work. A positive consequence of this new reality is that student access to answers to skill-based, procedural tasks opens the door for more sophisticated and diverse assessments that promote deeper conceptual engagement and understanding. And those, after all, should be our primary goals: basic skill development and procedural fluency along with deep understanding of fundamental mathematical concepts.
XT.5. Practical implications
This chapter echoes the themes of intentionality and appropriate task selection that run throughout this guide. In utilizing technology in learning environments, instructors must be intentional in their design to ensure the technology helps create classroom environments conducive to student learning and serves as a powerful formative and summative assessment tool. The Classroom Practices chapter includes a detailed
discussion on selecting appropriate tasks to promote student learning, and the Design Practices chapter
poses appropriate questions to guide the design of learning environments, all of which directly apply to
selection and implementation of technology in the classroom.
Instructors must also consider issues of equity and inclusion in the context of intentionally using technology for instruction. For example, using technology to show three-dimensional graphs in a multivariate calculus course raises the question of accessibility for students with vision impairments. A possible solution would be to use a 3-D printer to generate an object the student can hold in their hands while it is discussed in class. Using an online homework system or discussion board application might cause accessibility issues for students with limited or no internet access at home. Ensuring students have access to the internet on campus will likely solve the problem for some students but might not be sufficient for students who have family or work commitments that prevent them from spending additional time on campus outside of class. Instructors must be intentional in their use of technology and work to create inclusive, non-threatening environments for all their students. Further discussion of equity and inclusion issues is included in the equity section of this chapter and in the Classroom Practices chapter of this guide.
XT References
Cline, K. and Zullo, H. (eds). (2012). Teaching Mathematics with Classroom Voting: With and Without Clickers. Wash-
ington D.C.: Mathematical Association of America.
Conference Board of the Mathematical Sciences. (2016). Active Learning in Post-Secondary Mathematics Education.
Retrieved from www.cbmsweb.org/archive/Statements/Active_Learning_Statement.pdf.
Ellington, A. (2003). A meta-analysis of the effects of calculators on students in precollege mathematics classes. Jour-
nal for Research in Mathematics Education, 34(5), 433–463.
Ellington, A. (2006). The effects of non-CAS graphing calculators on student achievement and attitude levels in
mathematics: A meta-analysis. International Journal of Instructional Media, 106(1), 16–26.
Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H., and Wenderoth, M.P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415.
Hembree, R. and Dessart, D. (1986). Effects of hand-held calculators in precollege mathematics education: A me-
ta-analysis. Journal for Research in Mathematics Education, 17(2), 83–99.
Hirsh, L., and Wiebel, C. (2003). Statistical evidence that web-based homework helps. MAA FOCUS, 23(2), 14.
Kogan, M. and Laursen, S.L. (2014). Assessing long-term effects of inquiry-based learning: A case study from college
mathematics. Innovative Higher Education, 39(3), 183–199.
LaRose, P.G., and Megginson, R. (2003). Implementation and assessment of online gateway testing. PRIMUS: Problems,
Resources, and Issues in Mathematics Undergraduate Studies, 13(4), 289–307.
Laursen, S.L., Hassi, M.L., Kogan, M., and Weston, T.J. (2014). Benefits for women and men of inquiry-based learning
in college mathematics: A multi-institution study. Journal for Research in Mathematics Education, 45(4), 406–418.
Mathematical Association of America. (2015). 2015 CUPM Guide to Majors in the Mathematical Sciences. Washington,
DC: Mathematical Association of America.
National Council of Teachers of Mathematics. (2011). Using calculators for teaching and learning mathematics.
NCTM Research Brief. Retrieved from www.nctm.org/uploadedFiles/Research_and_Advocacy/research_brief_and_clips/
2011-Research_brief_18-calculator.pdf.
Simelana, S. and Skhosana, P. (2012). Impact of clicker technology in a mathematics course. Knowledge Management
and E-Learning: An International Journal, 4(3), 279–292.
Stowell, J. and Nelson, J. (2007). Benets of electronic audience response systems on student participation, learning and
emotion. Teaching of Psychology, 34(4), 253–258.
Equity in Practice
XE.1. Introduction
The number of mathematics degrees awarded at the undergraduate and graduate levels provides insight into the impact of institutional cultures and instructional practices on women and historically underrepresented groups in science, technology, engineering, and mathematics (STEM). In 2012, only 20% of bachelor’s, 18% of master’s, and 8% of doctoral degrees in mathematics were awarded to black, Latinx, native American, native Alaskan, and Hawaiian students combined (National Science Board, 2014) despite the fact that these racial groups composed approximately 30% of the U.S. population at that time. Further, the 2010 survey of mathematics departments conducted every five years by the Conference Board of the Mathematical Sciences (CBMS) indicated members of these underrepresented groups composed only 9% of the full-time mathematics instructors (CBMS, 2013); while women made up 29% of these full-time instructors, only 3% were women of color.
Research has revealed additional and sometimes hidden stressors placed on women and students of color as they navigate undergraduate and graduate mathematics (Herzig, 2004; McGee and Martin, 2011a; 2011b). McGee and Martin (2011b) detailed how academically successful black undergraduates pursuing mathematics and engineering majors faced racial stereotypes of low ability and underachievement. Experiences in undergraduate mathematics classes have also been shown to contribute to women’s decisions to leave STEM fields despite the fact that they are well-prepared and fully capable of succeeding in these fields (Ellis, Fosdick, and Rasmussen, 2016; Kogan and Laursen, 2013). Such research suggests our community needs to critically examine factors well beyond students’ academic preparation and achievements in our quest to increase students’ success in STEM. Such factors include implicit messages our course design and teaching practices send to students regarding what mathematics is and who “belongs” in mathematics. Adiredja and Andrews-Larson (2017) provide a more detailed review of research in postsecondary mathematics education related to equity issues at the institutional level.
Fixation in higher education on low achievement rates among women and students of color in mathematics, coupled with erroneous notions that mathematical ability is innate and fixed, contributes to the prevalent deficit perspective of these underrepresented groups, especially among a predominantly white teaching force (Battey and Leyva, 2016; Harper, 2010; Valencia, 2010). Such deficit perspectives, which focus on what students cannot do, often result in instructors reducing the rigor of mathematical tasks and assessments, avoiding instructional strategies that engage students in higher-level reasoning, and failing to build positive relationships with students from these groups (Battey, Neal, Leyva, and Adams-Wiggins, 2016; Ladson-Billings, 1997; Lubienski, 2002). It is incumbent upon us to consider classroom, assessment, and design practices that affirm our students and provide equitable access to rich mathematical learning opportunities for all. We must challenge the deficit perspective among the broader mathematical sciences community and help our colleagues broaden their notions of mathematical competence and success while still maintaining high levels of rigor and standards of performance.
XE.2. Denitions
XE.2.1. Four Dimensions of Equity
Gutiérrez (2009) oers a framework to dene and conceptualize equity in mathematics education. Her
model involves four key factors: access, achievement, identity, and power (see Figure 1). Access and achieve-
ment occupy the “dominant axis” as these dimensions of equity focus on supporting students to participate
in the existing dominant culture and practice of mathematics. Addressing issues of access and achievement supports students in learning the rules of mathematics and successfully “play(ing) the game” (Gutiérrez,
2009, p. 6). Attending to access means ensuring all students have access to physical and intellectual resourc-
es to learn mathematics (e.g., good instructors, rigorous curricula, opportunities to think critically about
mathematics). Achievement focuses on student learning outcomes as traditionally measured (e.g., scores on
exams, persistence in mathematics, majoring in STEM).
[Figure 1. Dimensions of Equity: access, achievement, identity, and power. Diagram adapted from Gutiérrez (2009).]
Identity and power occupy the “critical axis” as these dimensions of equity focus on supporting students to become critical participants who have the potential to “change the game” of mathematics (Gutiérrez, 2009, p. 6). These two are the most transformative of the four dimensions in terms of their potential to effect monumental change in mathematics education. Attending to identity means recognizing ways in which the constellation of social identities students bring (e.g., race, gender, social class) can be a resource in learning. We must educate ourselves and remain ever cognizant of the ways students’ social identities impact their participation in the classroom. We must acknowledge ways in which these identities serve to include or exclude students based on the prevailing view of various identities in the context of learning mathematics. For example, the stereotypical view that all Asians are good at mathematics affirms that Asians “belong” in mathematics but excludes other racial identities (Martin, 2009) and can lead to exclusion of students from groups that have been historically marginalized (e.g., black students, see Nasir and Shah, 2011). Further, this stereotype can lead to the erasure of the needs of particular Asian groups that have had limited access to educational opportunities (e.g., 38% of Hmong-Americans have less than a high school degree compared to the 13.4% national average; Center for American Progress, 2015). Stereotypical hierarchies of intelligence are damaging for all students.
Attending to issues of power means examining the degree to which learning disrupts or challenges the existing distribution of resources and influence in the classroom as well as in society. This distribution is often unequal in terms of race, gender, and social class (Gutiérrez, 2009). Thus, attending to power means asking questions such as, “Who benefits from the teaching of mathematics and to what end?” or “Is this mathematics empowering students or does it maintain the status quo?”
Challenging existing power dynamics can be achieved by exploring the use of mathematics to critique social and political issues (Gutstein, 2003). For example, Tufts University hosted a workshop for mathematicians on the “Geometry of Redistricting” to analyze the legality of gerrymandering.
Gutiérrez notes that the two axes of equity are often in tension with each other. For example, supporting students to successfully participate in the current practice of mathematics might inadvertently ignore aspects that exclude some students from participating. Exploring a non-traditional use of mathematics or challenging an existing power distribution might lead to exclusion of some students in the current culture of
mathematics. Gutiérrez’s framework can help guide us in thinking critically about ways to broaden access to
mathematics and in designing inclusive and equitable mathematics classrooms where all students can thrive.
XE.2.2. Equity, Inclusion, and Systemic Barriers
A primary tension that comes into play in the process of addressing issues of equity in undergraduate math-
ematics education is distinguishing equity from equality: equity focuses on social justice whereas equality
focuses on sameness (Gutiérrez, 2002). Sameness refers to a response in which all students are treated the
same regardless of their backgrounds and skills. This type of context-free¹ approach offers the illusion of
fairness but ignores the critical roles that students’ experiences and identities play in their education. Guti-
érrez asserts, “To redress past injustices and account for different home resources, student identities, social biases, and other contextual factors, students, in fact, need different (not same) resources and treatment to
reach fairness” (2002, p. 152). Context-free approaches ignore these factors and continue to privilege stu-
dents from the dominant groups.
Equality versus Equity. In the first image, it is assumed that everyone will benefit from the same supports; they are being treated equally. In the second image, individuals are given different supports to make it possible for them to have equal access to the game; they are being treated equitably. In the third image, all three can see the game without any supports or accommodations because the cause of the inequity was addressed; the systemic barrier has been removed.
Figure 2. Equity v. Equality (image source: culturalorganizing.org/the-problem-with-that-equity-vs-equality-graphic/).
e rst two images of Figure 2 highlight the critical need to attend to students’ dierent contextual fac-
tors (here, their heights). e third image illustrates the removal of the barrier (the wooden fence), thereby
removing the need for accommodations, which results in equity.
Removing barriers is the real key to equity and inclusion. Achieving equity in undergraduate mathematics education is a formidable task that will require philosophical shifts in the way our community views the accessibility of mathematics, particularly as a social justice issue. We must first identify the systemic barriers inherent in higher education in general, and in mathematics education specifically, and then devise strategies for removing these barriers for our students. All our students deserve access to mathematics.
¹ We use the term “context-free” instead of “color-blind” or “gender-blind” to describe the lack of attention to individuals’ backgrounds. The term “color-blindness” has been useful in describing beliefs about freedom from racial bias and led to powerful critiques about such beliefs in a racialized society (Bonilla-Silva, 2003). However, the terms discriminate against people with visual disabilities by erasing or delegitimizing their existence and experiences (Colorblind, 2011).
We must utilize eective methods for supporting students in becoming better learners as we work to
change departmental and institutional processes, policies, and cultures that act as barriers to student suc-
cess. We must ensure all students have the opportunity to experience the rigor, practicality, elegance, and
beauty of mathematics (dominant axis). Perhaps more critically, we must examine mathematics as an in-
stitution with its own set of norms, values, and practices and identify ways to provide a more inclusive,
arming environments for students, particularly students from underrepresented groups (critical axis).
For example, how can we problematize acceptable expression of mathematical ideas when students are still
learning the formal mathematical language? How do we conceptualize rigor in dierent stages of learning
for our students? Certainly, there is no implication here that lowering our expectations and level of rigor is
in any way acceptable. Rather, the onus is on our community to maintain high academic standards as we
consider systemic barriers in learning mathematics.
e “growth versus xed mindset” theory of intelligence (Dweck, 2006) can serve as an instructive exam-
ple in this context. is theory is appealing to the education community because of its explanatory power,
but it has limits of which instructors must remain mindful. In utilizing the theory to eect positive change
toward increased student learning and success, instructors must remain cognizant of the potential to inad-
vertently limit access to mathematics for students. How might that occur?
The theory posits that individuals with a “growth mindset” are more likely to persist and succeed in the face of challenging tasks compared to individuals with a “fixed mindset.” That is, those who view intelligence and ability not as inherent qualities but rather as malleable qualities are more likely to improve their skills and understandings over time. Those with a fixed mindset are less likely to persist on a challenging task. A recent publication by Boaler (2015) details specific applications of the growth/fixed mindset model in mathematics, such as the role of struggle in expanding students’ knowledge and abilities. Helpful questions arise for reflection as we consider mindsets in mathematics students: What messages do we send students about the field of mathematics? To what extent do we view mathematical ability as innate in students? To what extent do successful learners of mathematics experience struggle and need time to make sense of mathematics?
The danger can arise when we inadvertently treat students’ adoption of a growth mindset as the only means to address inequities. It is counterproductive when an instructor views students as “change-worthy” and focuses on changing the students while ignoring the systemic barriers that perhaps prompted the fixed mindset view students have of themselves. Solely focusing on students’ mindset ignores the impact of systemic oppression (e.g., racism) on students’ lives and educational experiences. McGee and Stovall (2015) have extensively discussed a similar fixation around the notion of “grit” in education and its failure in accounting for the impact of racism on the mental health of black students. The growth mindset model is a useful concept but should not be viewed as a singular quick fix to the very complex issue of equity and inclusion in mathematics. Such an approach needs to be coupled with continued work to remove systemic and institutional barriers for all students to be successful. We now offer some principles that can assist with the implementation of the specific suggestions from earlier chapters and begin the process of addressing equity in the classroom.
XE.3. Higher-order equity-oriented principles
XE.3.1. Social discourses and narratives impact teaching and learning
Established social discourses and narratives around social identities (e.g., race, gender) and intelligence im-
pact students’ sense of belonging and their opportunities to participate in the classroom (Leyva, 2016; Nasir
and Shah, 2011). Deficit narratives about students, particularly narratives framing black and Latinx students as academically and intellectually inferior, limit access to educational opportunities (e.g., who is called on in class, who is
advised into STEM majors). These narratives can also place unnecessary cognitive burdens on students in learning environments, particularly for students operating under “stereotype threat.” Steele and Aronson (1995) identified and defined stereotype threat as a situational predicament in which individuals are at risk of confirming negative stereotypes about their group because they will be judged based on negative stereotypes about their group rather than their own merits. These researchers investigated the effects of stereotype threat on students when performance was linked to intelligence.
The researchers found that black college freshmen and sophomores performed worse on verbal tests in an academic environment than white students when their race was emphasized. The typical race gap in achievement emerged when stereotype threat was activated via a reminder of a negative stereotype about their group’s intelligence. White students performed at the same level under both conditions, but black students performed as well as or better than their white peers in the absence of stereotype threat. They found similar patterns in test performance between women and men. Follow-up studies suggest that in situations where their ability is being evaluated, stereotyped students carry an extra weight on their minds related to the stereotypes about their group.
XE.3.2. All students are capable of learning mathematics
There is no special “mathematics gene,” only social valuation of skills that align better with the traditional methods of instruction in mathematics (e.g., passive lecturing). Ease in understanding mathematics is not an inherent personal quality but a product of prior opportunities and social positioning. Similarly, students’ and instructors’ behaviors and dispositions are in part a product of socialization. Their knowledge is influenced by their environment and the distribution of resources. Categorizations of students as “mathematics students” versus “non-mathematics students” or “slow” versus “fast” are artificial, limiting, and not conducive to learning. Research has shown that the way teachers label and talk about students impacts how they respond to students’ difficulties in the classroom (Horn, 2008).
Instructors must deliberately adopt an anti-deficit perspective on students and their knowledge in order to recognize that all students have the ability to contribute in the classroom. Misconceptions and errors in student thinking are a natural part of learning. The fixation on remediation is deficit-oriented, undermines student progress, and hinders the development of mathematical identity. The value of students’ ideas should not be solely based on proximity to the norm.
XE.3.3. The importance of fostering a sense of classroom community
A critical aspect of learning mathematics is participating in mathematical discourse in an environment that supports students sharing and critiquing their own and each other’s work. The work of teaching is not an activity solely between a student and a teacher. Student participation in the classroom is influenced by the distribution of authority, status, and power among all participants in the classroom. Authority, status, and power are all influenced by students’ social identities. Experiencing other students as resources in learning fosters students’ connections to the classroom community. This requires a safe environment for students to share partial understanding, communicate freely with other students, and build on each other’s knowledge.
XE.4. Attending to equity
XE.4.1. An illustration: Students with disabilities
Most mathematics instructors have had experience attending to equity issues in the classroom related to
providing accommodations for students with disabilities. Instructors recognize the importance of providing
accommodations to facilitate the learning process and ensure students with disabilities are not further marginalized in their learning experiences. These students have to navigate learning environments differently from other students. We recognize that we are not experts on the particular needs of a student. For example, we cannot treat all students in a wheelchair the same way because they will likely have different needs. We rely on assistance from both the student and the office for disability services on campus to understand the student’s particular needs. We as instructors work in collaboration with students to create the most supportive and inclusive learning environment. We understand that an inclusive classroom environment would benefit all students in the class. For example, speaking more slowly in the classroom would help accommodate an interpreter for a student with a hearing impairment as well as provide other students more time to process information. For additional information on students with disabilities, see section 2.7 in the Design Practices chapter.
Some of these ideas are helpful as we consider an equity-oriented approach to teaching for other marginalized students. We support students by focusing on the needs of individual students and recognizing their histories and positioning in society. We do not treat all marginalized students the same way. The students are an important resource in learning about their needs. We need to work in collaboration with them in providing the most supportive learning environment. We can also draw on resources outside of our own departments (e.g., the Office for Diversity and Inclusion) to best serve students. For example, these offices in Student Services are typically equipped to assist in issues related to microaggressions—everyday communicative actions or verbal expressions that may or may not intentionally slight targeted or marginalized individuals such as students of color (Sue, 2010)—or other challenging conversations in the classroom. Ultimately, our students are the best resource in our effort to create a more inclusive classroom environment that serves all of our students.
XE.4.2. Critical need to attend to developmental mathematics
As we consider instructional practices in the context of different topics in mathematics and types of institutions, one particular issue that requires careful consideration is developmental mathematics. The national pass rates in developmental mathematics courses in both two- and four-year institutions are disconcertingly low. This has prompted scholars to investigate factors associated with student success in such courses (e.g., Fong, Melguizo, and Prather, 2015) and to recognize the value of curricula focused on quantitative reasoning and statistics more than algebra (Hoang et al., 2017). Furthermore, poor performance in developmental mathematics courses is correlated with dropout rates and low transfer rates (Bonsangue, 1999; Fong et al., 2015). Multiple failed attempts by students to pass these courses place undue financial burdens on both students and states (Fong et al., 2015). Even more disconcerting, non-traditional students and various underserved populations are overrepresented in these courses. For example, Larnell (2016) cited studies that confirm the disproportionate number of black students in these courses (e.g., Attewell, Lavin, Domina, and Levey, 2006; Bahr, 2008).
The principles and practices outlined in this guide are particularly relevant in the context of developmental mathematics courses. Many of the students in these courses are there precisely because our traditional teaching practices (e.g., passive lectures) have failed these students. Yet the CBMS 2010 survey reports that these courses are dominated by traditional lectures. While a minimal amount of traditional lecturing can have a place in an active-engagement environment, the evidence-based practices detailed in this guide offer benefits and support for students in developmental mathematics courses. A documented barrier to instructors adopting innovative, evidence-based teaching practices is the perception that students in lower level courses are unable to engage in deep mathematical reasoning, which brings us back to the notion of anti-deficit perspectives on students and their knowledge.
XE.4.3. Conclusion: Anti-deficit perspective and focus on excellence
Research has consistently shown the positive correlation between instructors’ high expectations of students and student success in mathematics (e.g., Asera, 2001; Delpit, 2012; Gutiérrez and Dixon-Román, 2011; National Collaborative on Diversity in the Teaching Force, 2004). Course design as well as instructional and assessment practices framed by high expectations and anti-deficit perspectives have a positive effect on how students see themselves in relation to mathematics. In her study of instructors supportive of black students, Ladson-Billings (1995) found that the common factor across all instructors was their anti-deficit perspective.
Some of the documented curricula, programs, and pedagogical approaches shown by research to successfully support underrepresented populations in mathematics are strongly driven by anti-deficit perspectives about students. For example, the Treisman Math Workshop program, which originated at the University of California, Berkeley, dismissed the narrative that black and Latinx students lack resources and motivation to do well in mathematics (Treisman, 1992). The workshop was designed as an honors program to provide students with rich learning opportunities to engage critically with mathematics (Asera, 2001). The Meyerhoff Scholars Program at the University of Maryland, Baltimore County is one of the few programs that focuses on underrepresented students’ success in STEM (Miller, Ozturk, and Chavez, 2005). Similarly, inquiry-based learning (IBL) efforts have been shown to “level the playing field” between male and female students by building on the premise that all students are capable of engaging in higher-level mathematical practices such as conjecturing and generalizing (Laursen, Hassi, Kogan, and Weston, 2014).
These innovative programs and practices also impacted the development of students’ mathematical identities and redistributed power in students’ experiences with mathematics. For example, Oppland-Cordell and Martin (2015) found that in an Emerging Scholars Program calculus workshop, students sharing their mathematical work publicly recalibrated peers’ perceptions of intelligence related to race, gender, and other social identities. Through observing strong mathematical work by fellow students of color, Latinx students recognized their own excellence in mathematics and challenged existing narratives about the perceived superiority of their white and Asian peers in mathematics. Hassi and Laursen (2015) documented how the implementation of IBL instruction in calculus courses resulted in empowerment of female students who then perceived themselves as mathematically competent and expressed interest in future IBL mathematics courses at higher rates than female peers in non-IBL courses. Anti-deficit perspectives shape socially-affirming forms of course design and instruction that position historically marginalized students as constructors of mathematical knowledge, thus promoting their development of positive social and mathematical identities.
ese ndings further contextualize exemplary practices detailed in this guide. Teaching practices have a
signicant impact on students’ learning experiences and outcomes but are only part of the story. Awareness
of the impact on students’ identities and broader institutional issues can prompt instructors to adhere to the
core principles of evidence-based practices and the inequities they aim to correct. Equity is a process, not
an end goal.
XE References
Adiredja, A.P. and Andrews-Larson, C. (2017). Taking the sociopolitical turn in postsecondary mathematics education
research. International Journal for Research in Undergraduate Mathematics Education, 3(3), 444–465.
doi.org/10.1007/s40753-017-0054-5.
Asera, R. (2001). Calculus and Community: A History of the Emerging Scholars Program. New York: National Task Force
on Minority High Achievement, College Board.
Attewell, P., Lavin, D., Domina, T., and Levey, T. (2006). New evidence on college remediation. The Journal of Higher
Education, 77(5), 886–924.
Bahr, P.R. (2008). Does mathematics remediation work?: A comparative analysis of academic attainment among com-
munity college students. Research in Higher Education, 49(5), 420–450. doi.org/10.1007/s11162-008-9089-4.
Battey, D. and Leyva, L.A. (2016). A framework for understanding whiteness in mathematics education. Journal of
Urban Mathematics Education, 9(2), 49–80.
Battey, D., Neal, R., Leyva, L.A., and Adams-Wiggins, K. (2016). The interconnectedness of relational and content
dimensions of quality instruction: Supportive teacher-student relationships in urban elementary mathematics
classrooms. e Journal of Mathematical Behavior, 42, 1–19.
Boaler, J. (2015). Mathematical Mindsets: Unleashing Students’ Potential through Creative Math, Inspiring Messages and
Innovative Teaching. San Francisco, CA: Jossey-Bass.
Bonilla-Silva, E. (2003). Racism without Racists: Color-Blind Racism and the Persistence of Racial Inequality in the United States. Lanham, MD: Rowman and Littlefield.
Bonsangue, M.V. (1999). Factors affecting the completion of undergraduate degrees in science, engineering, and mathematics for underrepresented minority students: The senior bulge study. In Gold, B., Keith, Z. K., and Marion, W. A. (eds), Assessment Practices in Undergraduate Mathematics, MAA Notes Number 49, 216–218.
Center for American Progress. (2015). Who are Hmong Americans? Retrieved from:
cdn.americanprogress.org/wp-content/uploads/2015/04/AAPI-Hmong-factsheet.pdf.
Conference Board of the Mathematical Sciences (2013). Statistical Abstract of Undergraduate Programs in the Mathe-
matical Sciences in the United States: Fall 2010 CBMS Survey. Providence, RI: American Mathematical Society.
Delpit, L. (2012). Multiplication is for White People: Raising Expectations for Other People’s Children. New York, NY: The
New Press.
Dweck, C. S. (2006). Mindset: The New Psychology of Success. New York, NY: Random House.
Ellis, J., Fosdick, B.K., and Rasmussen, C. (2016). Women 1.5 times more likely to leave STEM pipeline after calculus compared to men: Lack of mathematical confidence a potential culprit. PLoS ONE, 11(7). doi.org/10.1371/journal.
pone.0157447.
Esmonde, I., and Langer-Osuna, J. M. (2013). Power in numbers: Student participation in mathematical discussions in
heterogeneous spaces. Journal for Research in Mathematics Education, 44(1), 288–315.
Fong, K. E., Melguizo, T., and Prather, G. (2015). Increasing success rates in developmental math: e complementary
role of individual and institutional characteristics. Research in Higher Education, 56(7), 719–749.
Gutiérrez, R. (2002). Enabling the practice of mathematics teachers in context: Towards a new equity research agenda.
Mathematical inking and Learning, 4(2 and 3), 145–187.
Gutiérrez, R. (2009). Framing equity: Helping students “play the game” and “change the game.” Teaching for Excellence
and Equity in Mathematics, 1(1), 5–7.
Gutiérrez, R. and Dixon-Román, E. (2011). Beyond gap gazing: How can thinking about education comprehensively
help us (re)envision mathematics education? In B. Atweh, M. Graven, W. Secada, and P. Valero (eds), Mapping
Equity and Quality in Mathematics Education (pp. 21–34). Springer Netherlands.
Gutstein, E. (2003). Teaching and learning mathematics for social justice in an urban, Latino school. Journal for Re-
search in Mathematics Education, 34(1), 37–73.
Harper, S.R. (2010). An anti-deficit achievement framework for research on students of color in STEM. In S. R. Harper
and C. B. Newman (Eds.), Students of color in STEM: Engineering a new research agenda. New Directions for In-
stitutional Research (pp. 63–74). San Francisco: Jossey-Bass.
Hassi, M.L., and Laursen, S.L. (2015). Transformative learning: Personal empowerment in learning mathematics. Jour-
nal of Transformative Education, 13(4), 316–340.
Herzig, A.H. (2004). ‘Slaughtering this beautiful math’: Graduate women choosing and leaving mathematics. Gender
and Education, 16(3), 379–395.
Hoang, H., Huang, M., Sulcer, B., and Yesilyurt, S. (2017). Carnegie Math Pathways 2015–2016 Impact Report: A Five-Year Review. Carnegie Foundation for the Advancement of Teaching. Stanford, CA. Retrieved from: www.carnegiefoundation.org/resources/publications/carnegie-math-pathways-2015-2016-impact-report-a-five-year-review/.
Horn, I.S. (2008). Fast kids, slow kids, lazy kids: Framing the mismatch problem in mathematics teachers’ conversa-
tions. Journal of the Learning Sciences, 16(1), 37–79.
Kogan, M., and Laursen, S.L. (2013). Assessing long-term effects of inquiry-based learning: A case study from college
mathematics. Innovative Higher Education, 39(3), 183–199.
Kona, G., Hussar W., McFarland J., deBrey C., Musu-Gillette, L., Wang X., Zhang, J., Rathbun, A., Wilkinson-Flicker, S.,
Diliberti M., Barmer, A., Bullock Mann E., and Dunlop Velez, E. (2016). The Condition of Education 2016 (NCES
2016-144). U.S. Department of Education, National Center for Education Statistics. Washington, DC. Retrieved
from nces.ed.gov/pubs2016/2016144.pdf.
Ladson-Billings, G. (1995). Toward a theory of culturally relevant pedagogy. American Educational Research Journal,
32(3), 465–91.
Ladson-Billings, G. (1997). It doesn’t add up: African American students’ mathematics achievement. Journal for Re-
search in Mathematics Education, 25(6), 697–708.
Larnell, G.V. (2016). More than just skill: Examining mathematics identities, racialized narratives, and remediation
among black undergraduates. Journal for Research in Mathematics Education, 47(3), 233–269.
Laursen, S.L., Hassi, M.L., Kogan, M., and Weston, T.J. (2014). Benefits for women and men of inquiry-based learning
in college mathematics: A multi-institution study. Journal for Research in Mathematics Education, 45(4), 406–418.
Leyva, L.A. (2016). An intersectional analysis of Latin@ college women’s counter-stories in mathematics. Journal of
Urban Mathematics Education, 9(2), 81–121.
Lubienski, S.T. (2002). A closer look at black-white mathematics gaps: Intersections of race and SES in NAEP achieve-
ment and instructional practices data. The Journal of Negro Education, 71(4), 269–287.
Oppland-Cordell, S., and Martin, D.B. (2015). Identity, power, and shifting participation in a mathematics workshop:
Latin@ students’ negotiation of self and success. Mathematics Education Research Journal, 27(1), 21–49.
Martin, D.B. (2009). Researching race in mathematics education. Teachers College Record, 111(2), 295–338.
McGee, E., and Martin, D.B. (2011a). From the hood to being hooded: A case study of a black male PhD. Journal of
African American Males in Education, 2(1), 46–65.
McGee, E.O., and Martin, D.B. (2011b). “You would not believe what I have to go through to prove my intellectual
value!”: Stereotype management among academically successful black mathematics and engineering students.
American Educational Research Journal, 48(6), 1347–1389.
McGee, E. and Stovall, D. (2015). Reimagining critical race theory in education: Mental health, healing, and the path-
way to liberatory praxis. Educational Theory, 65(5), 491–511.
Miller, L.S., Ozturk, M.D., and Chavez, L. (2005). Increasing African American, Latino, and native American repre-
sentation among high achieving undergraduates at selective colleges and universities. University of California, Berkeley, Institute for the Study of Societal Issues (ISSI) Project Reports and Working Papers. Retrieved from:
escholarship.org/uc/item/10s3p1xt.
Nasir, N.S., Hand, V., and Taylor, E. (2008). Culture and mathematics in school: Boundaries between “cultural” and
“domain” knowledge in the mathematics classroom and beyond. Review of Research in Education, 32, 187–240.
Nasir, N.S., and Shah, N. (2011). On defense: African American males making sense of racialized narratives in mathe-
matics education. Journal of African American Males in Education, 2(1), 24–45.
National Collaborative on Diversity in the Teaching Force. (2004). Assessment of Diversity in America’s Teaching Force:
A Call to Action. Washington, DC.
National Science Board. (2014). Science and Engineering Indicators 2014. Arlington, VA: National Science Foundation
(NSB 14-01).
Steele, C.M., and Aronson, J. (1995). Stereotype threat and the intellectual test performance of African Americans.
Journal of Personality and Social Psychology. 69(5), 797–811.
Sue, D.W. (2010). Microaggressions in Everyday Life: Race, Gender, and Sexual Orientation. Hoboken, NJ: John Wiley
and Sons.
Treisman, U. (1992). Studying students studying calculus: A look at the lives of minority mathematics students in
college. College Mathematics Journal, 23(5), 362–372.
Valencia, R.R. (2010). Dismantling Contemporary Deficit Thinking: Educational Thought and Practice. New York, NY:
Routledge.