Springer Texts in Education
Jennifer Quigley
Mindy J. Cassano
Julie A. Ackerlund Brandt   Editors
Incorporating Applied 
Behavior Analysis 
into the General 
Education Classroom
Springer Texts in Education
Springer Texts in Education delivers high-quality instructional content for graduates and advanced graduates in all areas of Education and Educational Research. The textbook series is comprised of self-contained books with a broad and comprehensive coverage that are suitable for class as well as for individual self-study. All texts are authored by established experts in their fields and offer a solid methodological background, accompanied by pedagogical materials to serve students such as practical examples, exercises, case studies etc. Textbooks published in the Springer Texts in Education series are addressed to graduate and advanced graduate students, but also to researchers as important resources for their education, knowledge and teaching. Please contact Yoka Janssen at Yoka.Janssen@springer.com or your regular editorial contact person for queries or to submit your book proposal.
Jennifer Quigley · Mindy J. Cassano · 
Julie A. Ackerlund Brandt 
Editors 
Incorporating Applied 
Behavior Analysis 
into the General 
Education Classroom
Editors 
Jennifer Quigley 
The Chicago School 
Chicago, IL, USA 
Julie A. Ackerlund Brandt 
The Chicago School 
Chicago, IL, USA 
Yellow Brick Academy 
Eau Claire, Wisconsin, USA 
Mindy J. Cassano 
The Chicago School 
Chicago, IL, USA 
ISSN 2366-7672 ISSN 2366-7680 (electronic) 
Springer Texts in Education 
ISBN 978-3-031-35824-1 ISBN 978-3-031-35825-8 (eBook) 
https://doi.org/10.1007/978-3-031-35825-8 
© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature 
Switzerland AG 2023 
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether 
the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse 
of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and 
transmission or information storage and retrieval, electronic adaptation, computer software, or by similar 
or dissimilar methodology now known or hereafter developed. 
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication 
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant 
protective laws and regulations and therefore free for general use. 
The publisher, the authors, and the editors are safe to assume that the advice and information in this book 
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or 
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any 
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional 
claims in published maps and institutional affiliations. 
This Springer imprint is published by the registered company Springer Nature Switzerland AG 
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Introduction 
Teaching is hard! I found this out firsthand when I was 22 and substitute teaching 
while finishing up my undergraduate degree. Of course, at the time, I thought I 
knew how to teach, and that I was good with kids. I thought I would walk into 
that room and impart all the knowledge and their little faces would light up. But 
when I was on my own, without the safety net of the regular classroom teacher, I 
realized after 30 min (at most) how truly hard it is to teach that many children. I 
realized then that I needed to put my “behavior analyst” hat on instead of just my 
“teacher” hat because I wasn’t just there to teach. I was there to manage the variety 
of behaviors that come from more than 20 5- and 6-year-old children, and that is 
what behavior analysts are trained to do. Thinking back on those days, I don’t think 
I would have been able to return to a classroom (or maybe wouldn’t have been 
asked!) if I hadn’t had that training. The experience has made me more empathetic 
to teachers I’ve worked with over the years, both as a parent and professional. 
Oftentimes, they are sent into classrooms without the oh-so-important knowledge 
of behavior management. As a professional, I’ve been able to help teachers to set 
up group-based reinforcement plans in their classrooms to help increase some of 
the overarching behaviors they want to see. As a parent, I’ve been able to work 
with my son’s teacher to find ways to make the monitoring and managing of one 
child’s specific plan work within the larger group plan. It is all possible, and that 
is what this book is meant to do. We want to give teachers the tools they need and 
as much help as possible in that area of behavior management. 
This project grew out of a series of papers and assignments my doctoral students in applied behavior analysis completed in their psychology and education course. The papers and assignments were centered around Project Follow Through (PFT). PFT began in the 1960s and is the most extensive educational experiment ever conducted. It was a government-funded project that compared 22 models of instruction to see which would be the most successful at teaching at-risk children in grades K-3. After ten years of study, which included over 200,000 children in over 175 different communities, the results were compelling: direct instruction (DI) was the most effective at increasing basic academic skills, problem-solving skills, and self-esteem. Unfortunately, DI is rarely used in general education classrooms today, despite the compelling data that were disseminated almost 45 years ago.
Direct instruction (DI) will be described in Chap. 10, but the key information, for now, is that many of the procedures in DI are based on the basic behavior-analytic principles presented in this book. As my students and I read the original report written by Cathy Watkins (1997), we discussed how DI was not the only procedure that could be that effective. There are many procedures that, if teachers could use them, we knew would be helpful. This started the idea to develop this book: a series of chapters showing how ABA can be used in general education classrooms to improve behavior and ease the stress of trying to manage so many children all at once. Our hope is that teachers will find it useful and continue to learn about the ways ABA can help across many settings and populations!
Julie A. Ackerlund Brandt 
Jennifer Quigley 
Mindy J. Cassano 
Reference 
Watkins, C. L. (1997). Project Follow Through: A case study of contingencies influencing instructional practices of the educational establishment. Cambridge, MA: Cambridge Center for Behavioral Studies.
Contents 
Part I Introduction and Basic Concepts 
1 Introduction to Foundations of Behavior Change Strategies 
Applicable in the General Education Classroom . . . . . . . . . . . . . . . . . . . . 3 
Tyler C. Ré, Rebecca Gonzales, Jennifer Quigley, 
and Tanya Hough 
2 Determining Behavior Change . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11 
Tyler C. Ré, Shannon Hoey, Jennifer Quigley, 
and Tricia R. Clement 
3 Functional Behavior Assessment in General Education 
Classrooms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27 
Tyler C. Ré, Jair Yepez Torres, Jennifer Quigley, 
and Tricia R. Clement 
4 Antecedent Interventions: Proactive Strategies for Changing 
Behavior . . . .. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37 
Mindy J. Cassano, Holly Bruski, and Sarah Bendekovits 
5 Getting the Behavior You Want Using Reinforcement . . . . . . . . . . . . . . 51 
Julie A. Ackerlund Brandt, Mindy J. Cassano, and Tanya Hough 
Part II Group Systems 
6 Positive Behavior Intervention System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61 
Gena Pacitto 
7 Group Contingencies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73 
Tyler C. Ré and Brittany N. Beaver 
8 Fluency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85 
Lacy M. Knutson 
9 Equivalence Based Instruction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97 
Timothy D. Caldwell and Laura A. Kruse 
10 Direct Instruction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109 
Kozue Matsuda
11 Precision Teaching . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121 
Jared Van and Jennifer Quigley 
12 TAGteach . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137 
Robin Arnall 
13 Personalized System of Instruction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145 
Jennifer Quigley 
Part III Successful Integration of ABA into School Systems 
14 The Early Start Denver Model (ESDM) . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163 
Laura A. Kruse 
15 Morningside Model of Generative Instruction . . . . . . . . . . . . . . . . . . . . . . 175 
Holly Barszcz 
16 Comprehensive Application of Behavior Analysis to Schooling 
(CABAS®) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193 
Tricia Clement
Part I 
Introduction and Basic Concepts 
Julie A. Ackerlund Brandt 
There has been a recent push for behavior analysts to have a larger presence within general education classrooms (Hirsch et al., 2021). Given that, this text was developed to assist teachers in learning about applied behavior analysis (ABA) and hopefully facilitate more extensive collaboration between all teachers, administrators, paraprofessionals, and behavior analysts or specialists. In developing the text, this first part focuses on providing additional information on ABA as a field and on some of the key principles that are most applicable to all classrooms.
First, you will read a brief history of ABA with an emphasis on how it has been used within special education. This is one area in which there is a wealth of research and a successful history of collaboration, due to the breadth of ABA procedures being applied to children with autism spectrum disorders (ASD) or other developmental disabilities and delays; however, it is our hope that more of these principles will be extended to general education classrooms. There are always opportunities for behavior change, whether it is a behavioral deficit or excess, and there are a plethora of ABA-based procedures that may be applied. Behavioral excesses often require some sort of functional behavioral assessment (FBA), which is likely one of those terms with which you are already familiar (Bloom et al., 2011). FBAs have been conducted in schools for many years (Blood & Neel, 2007; Conroy et al., 2000), but Chap. 3 will provide a thorough description of the specific procedures and why those procedures are implemented the way they are.
After determining what behavior change you want to see, and determining a behavioral function, there are two chapters that provide information on two general groupings of behavioral procedures. First, you will read about antecedent interventions: descriptions of various procedures that can be implemented before the target behavior occurs (Kern et al., 2002). Many antecedent interventions are simple environmental changes that can benefit not only the specific student's behavior but also the behavior of multiple students. For example, providing students choices of which book to read first or which of two similar worksheets to complete can decrease the likelihood of problem behavior and increase the student's feelings of autonomy and control.
The last chapter in this part will describe differential reinforcement, which can be characterized as catching your students "being good." The primary focus of Chap. 5 will be to pay attention to the behavior you want to see because, all too often, we focus more on the behaviors we don't want to see. By increasing the delivery of behavior-specific praise for any behaviors that we want to see repeated, we can increase not only those behaviors but also our overall outlook and mood during the day. By the end of Chap. 5, and this initial part of the book, you will have all the tools you need to successfully implement many of the group systems detailed in Part II.
References 
Blood, E., & Neel, R. S. (2007). From FBA to implementation: A look at what is actually being 
delivered. Education and Treatment of Children, 30(4), 67–80. https://doi.org/10.1353/etc.2007. 
0021 
Bloom, S. E., Iwata, B. A., Fritz, J. N., Roscoe, E. M., & Carreau, A. (2011). Classroom application 
of a trial-based functional analysis. Journal of Applied Behavior Analysis, 44, 19–31. https:// 
doi.org/10.1901/jaba.2011.44-19 
Conroy, M. A., Clark, D., Fox, J. J., & Gable, R. A. (2000). Building competence in FBA: Are 
we headed in the right direction? Preventing School Failure, 44(4), 169–173. https://doi.org/ 
10.1080/10459880009599802 
Hirsch, S. E., Randall, K., Bradshaw, C., & Lloyd, J. W. (2021). Professional learning and development in classroom management for novice teachers: A systematic review. Education and Treatment of Children, 44(2), 291–307. https://doi.org/10.1007/s43494-021-00042-6
Kern, L., Choutka, C. M., & Sokol, N. G. (2002). Assessment-based antecedent interventions used 
in natural settings to reduce challenging behavior: An analysis of the literature. Education and 
Treatment of Children, 25(1), 113–130.
1 Introduction to Foundations of Behavior Change Strategies Applicable in the General Education Classroom
Tyler C. Ré, Rebecca Gonzales, Jennifer Quigley, and Tanya Hough
1.1 Overview 
Many behavior change strategies used in general education classrooms were developed based on learning theory and are further rooted in behaviorism. Applied behavior analysis (ABA) suggests that any behavior a person engages in can be studied, evaluated, and intervened upon if necessary. For instance, if a student is having difficulty staying in their seat, the teacher may choose to thank or praise the student any time they are sitting in their seat. Alternatively, the teacher may reprimand the student for not following the rules by stepping away from their seat. In both scenarios, the teacher is trying to change the behavior of a student. By gaining a greater understanding of the principles of behavior, you can further support your students across skills and goals.
One major misconception of ABA is that it focuses exclusively on decreasing problem behavior; however, many of the evidence-based practices you implement have been studied by behaviorists. For example, you may have learned how important it is to set classroom rules. Behavior analysts have evaluated different ways to increase the effectiveness of using rules to support students in being motivated to meet your expectations. To highlight the behavior-analytic underpinnings of many of your classroom evidence-based practices, this initial chapter will introduce you to the basic foundations of ABA, called the seven dimensions of behavior. Furthermore, the authors will discuss foundational components found in almost every strategy covered in this book. This chapter will provide you with the relevant context to support implementation of the strategies discussed in this book and how they can be used across different environments.
T. C. Ré (B) · R. Gonzales · J. Quigley · T. Hough 
The Chicago School of Professional Psychology, Chicago, IL, USA 
e-mail: tre1@thechicagoschool.edu 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_1 
1.2 Introduction to Applied Behavior Analysis 
The evidence-based science of behavior analysis can be used to change socially 
significant behavior (Heward et al., 2022). Behavior analysis, as a science, aims to 
identify and categorize knowledge regarding the deterministic nature of the natural 
world, which encompasses three levels of scientific understanding: description, 
prediction, and control (Skinner, 1953, p. 19). Description is defined as quantifiable and categorizable systematic observation (Normand, 2008). Prediction refers to two events that frequently co-occur, even though one event may not always cause the other to occur (Bouton & Balleine, 2019). The highest level of scientific understanding is reached when a functional relation is demonstrated, that is, when manipulating one event results in the second event occurring (Bouton & Balleine, 2019).
As an educator, you have undoubtedly demonstrated control, or a functional relation, in the classroom setting. Suppose you implement a contingency stating that if each student sits in their chair for 25 min during on-task work, they will receive an additional 10 min of recess. All the students in the classroom continue to meet the criteria for remaining in their seats during on-task work. The instructor continues to repeat the intervention each day for the rest of the week, yielding the same results. In that situation, applying the intervention causes changes in sitting/on-task behavior, demonstrating control or a functional relation.
Behavior analysis is both a science and a philosophy. As a science, it aligns with science's six attitudes: determinism, empiricism, experimentation, replication, parsimony, and philosophical doubt (Cooper et al., 2020). Determinism assumes a cause-and-effect relationship, holding that behavior occurs based on specific events happening in the environment (Skinner, 1953). Teachers can improve their students' behavior by holding the perspective that all behaviors can be changed by identifying the cause or function of the behavior to modify future occurrences.
Empiricism is taking a data-based approach requiring observable and measurable descriptions of the events occurring in the environment (Deitz, 1982). Suppose a parent tells you their child engages in attention-seeking behaviors and that you, as their teacher, should implement planned ignoring when they engage in those identified behaviors. Instead of implementing the intervention suggested by the parents without further assessment, you may take an empirical approach by first identifying the maintaining variables and then implementing a function-based intervention.
Repeating the intervention to assess its reliability is known as replication (Normand, 2008). As mentioned in the example above, if the teacher continued to implement the contingency of delivering additional recess contingent on students remaining in their chairs for X number of minutes, this would determine the intervention's reliability. Parsimony, the fifth attitude, is based on ruling out the simplest explanation before considering a more complex one. Instead of jumping to the conclusion that a student requires a full change to their behavior intervention plan (BIP) due to increased problem behaviors during classroom instruction, the instructor should first rule out the simplest explanations.
Philosophical doubt is the final attitude of science, identified as having healthy skepticism regarding our outcomes (Deitz, 1982). Suppose you've implemented a token board for each student in your classroom, and each student begins to engage in the desired behaviors. Before immediately assuming your token economy is effective, you may evaluate other potential reasons for the change in behavior with a critical eye. These philosophical assumptions of behavior analysis help practitioners take a scientific approach to identifying and modifying environmental variables influencing socially significant behaviors, in ways that are practical and applicable across environments, other behaviors, and time.
1.3 Seven Dimensions of ABA 
Applied behavior analysis is based on a set of guiding principles to ensure proper services are provided to students (Baer et al., 1968). These seven dimensions of ABA are generality, effectiveness, technological, applied, conceptually systematic, analytic, and behavioral (Baer et al., 1968). These dimensions guide practitioners in creating and implementing behavior change treatments that are research based, monitored for effectiveness, and socially significant to the individual. In the classroom setting, these socially significant behaviors may include increasing the rate of reading words or increasing the rate of responding by raising a hand. Behavior analysis, as an evidence-based approach, can provide an easily accessible technology for changing socially significant behaviors across time, environments, and other individuals.
First is the applied dimension. For strategies to be considered applied, they must target socially significant behaviors (Baer et al., 1968). Socially significant behaviors are meaningful and important to the student. In your classroom, these may be specific to your classroom rules, but may also include how students are learning math, grammar, history, etc. Sometimes, these behaviors may be determined to be important by a team (e.g., other teachers, administrators, or even the parents of your students), but ultimately these targets should aim to increase the student's quality of life in some way.
Second is the behavioral dimension. The behavioral dimension states that the goal must be observable and measurable (Baer et al., 1968). For instance, a goal that meets the behavioral dimension may be "a student will solve multiple-digit addition using the carryover procedure." A goal that does not meet the behavioral dimension would be "the student thinks about their actions." At face value, it would be very difficult to determine if a student is weighing the pros and cons of engaging in a specific behavior before engaging in said behavior. To reword this goal to meet the behavioral dimension, you would create an operational definition. Operational definitions allow anyone to clearly observe the target behavior occurring or not occurring. For instance, a student thinking about their actions may be operationally defined as "a student writing on a piece of paper the pros and cons of engaging in a behavior." With this definition, you could walk by a student's desk and see a pros and cons list related to a dilemma you presented. This behavioral target may be more accurately titled problem solving. Operational definitions can be created for both desirable and undesirable behaviors.
The third dimension is the analytic dimension. For you to know if your student is making progress, data must be collected; this meets the analytic dimension. As a teacher, you will be, or already are, frequently collecting data! The data you may most frequently collect is the percentage of accuracy on assignments. However, data can and should be collected on a myriad of targets in the classroom. Administrators may collect data on the number of opportunities students have to respond in your classroom during classroom observations. You may collect data by putting a jellybean in a jar anytime the students follow your classroom rules and then counting the number of jellybeans earned at the end of each week. Once collected, these data must then be compared to previous data to evaluate or analyze the effects of different strategies. Therefore, the purpose of data is to measure the amount of change in a targeted response. Data help to make objective decisions and should be used to determine the direction of change toward the specified goal.
The technological dimension states that any strategy recommended for use must be clearly and completely described (Baer et al., 1968). The goal of this dimension is that anyone with minimal training can read the description of the strategies used and be able to implement them. For instance, in a scenario in which you are not physically in the classroom, the person filling in for you needs to know exactly how you run your classroom (e.g., classroom schedule, classroom rules, rewards, punishments, etc.) to maintain consistency. By providing a detailed description of the strategies used in your classroom, you may meet the technological dimension. For example, a technological teaching plan will include steps to introduce the lesson, how to prompt students to engage in the skill, how to fade those prompts, how to correct an incorrect response, and how to provide rewards for correct responses.
The conceptually systematic dimension states that the strategies used to alter the occurrence of the target behavior are founded in research or are evidence-based practices (Baer et al., 1968). By supporting your teaching strategies with research, you are more likely to see greater outcomes across students. For example, if you implement the Good Behavior Game (discussed in Chap. 7: Group Contingencies), you could support the use of this strategy with related research articles. By holding yourself to only evidence-based strategies, you are increasing the quality of your academic teaching and classroom management.
Effectiveness is another dimension of ABA. This dimension is very closely tied to the behavioral and analytic dimensions, and it does exactly what it states: you will ensure that the strategies you are implementing in your classroom are successful at teaching what you say you are teaching. For instance, suppose you are teaching in an elementary school classroom that incorporates spelling tests. You provide students with a pre-test to determine which level of spelling words they should be focused on. Then you provide a specific strategy (e.g., repeated practice) to teach the students how to spell the words for the week. At the end of the week, you give a post-test to see how much the students have improved on their spelling words. If students are increasing the number of words they spell correctly, then you can demonstrate that the strategy is effective.
The last dimension of ABA is generality. Generality states that the change in behavior occurs across multiple people, environments, and behaviors (Baer et al., 1968). As a teacher, you want your students to be able to demonstrate their new skills to their parents when they are at home, so you may create assignments that require the student to practice their spelling words at home (i.e., generalization across environments). You may incorporate key words from the math curriculum as spelling words so there is practice multiple times per school day (i.e., generalization across behaviors), or you may have another teacher conduct the test with the students (i.e., generalization across people). All these strategies aim to show that the student has learned the skill and can demonstrate it regardless of who the student is with or where they are.
By becoming more familiar with these dimensions, you may feel more confident 
in creating a classroom environment that is evidence-based and geared towards a 
better quality of life for each of your students. Additionally, this knowledge will 
help you understand what behavior analytic strategies are based on and why they 
are recommended. You may be surprised that much of what you are already doing 
is based in behavior analysis. 
1.4 Applied Behavior Analysis in the Classroom 
ABA interventions in the classroom setting have been demonstrated to be effective tools within the general education population for increasing desired target behaviors. These interventions can be divided into antecedent- and consequence-based interventions. For instance, in the classroom you've probably implemented interventions before the behavior occurs, which may include noncontingent reinforcement, behavioral momentum, or functional communication training, to name a few. Each of these interventions is delivered before the behavior (antecedent), whereas consequence-based interventions are delivered after the targeted behavior occurs. For example, have you delivered reinforcement to a student for raising their hand to respond to a question in place of yelling out the answer? That is an example of differential reinforcement of alternative behavior (DRA). The reinforcement was presented contingent on the student emitting the targeted alternative behavior to replace the problem behavior of yelling, making this a consequence-based intervention.
1.5 Operant Learning in Applied Behavior Analysis (ABA) 
Operant learning occurs through contact with the consequences of behavior: a stimulus is presented or removed immediately following a response (the consequence), resulting in an increase or decrease in the future occurrence of that response (reinforcement or punishment). The stimulus delivered after the response is also known as the consequence, which in ABA is described as anything that happens following the occurrence of a behavior (Skinner, 1953). In ABA, we use the term contingency when discussing this phenomenon. Much of what we learn throughout our development is learned through operant conditioning. In addition, we don't need to be aware of operant learning or the consequences of our behavior for behavior change to occur; reinforcing and punishing contingencies are occurring all around us.
In the classroom, you may see an operant contingency occur in the form of a teacher posing a question, a student raising their hand, and the teacher calling on them to respond. In this example, hand raising is a desired behavior the teacher wants to increase in the future. In addition to the response and consequence, the full description also includes the antecedent, which creates the three-term contingency. The antecedent in the three-term contingency is the condition or stimulus occurring prior to the targeted behavior. Applying the same example above: the teacher provides a question to the students (antecedent), the student raises their hand (targeted behavior), and the teacher provides their attention/response (consequence) for engaging in the targeted behavior.
This may be different than what you have previouslythought of as a conse-
quence. This is an important distinction because anything that follows a behavior 
is a consequence (in ABA terms); we then assess the outcomes of the consequence 
to determine if it functioned as a reinforcer or punisher (Chap. 2 provides a more 
thorough description of the antecedent-behavior-consequence relationship). Once 
the consequence occurs, we continue to observe the targeted behavior to determine its effect on the behavior. Consequences affect future behavior only; they operate whether or not we are aware of them and may select any behavior they follow. It is the temporal relationship between behavior and consequence that determines the future likelihood of the behavior occurring.
1.5.1 Reinforcement and Punishment 
Reinforcement is the process and procedure that increases the rate of a behav-
ior and punishment is the process and procedure that decreases the rate of a 
behavior (Skinner, 1953). Think about how you have viewed reinforcement or 
punishment in the past. Have you ever told your child “No” after they did some-
thing, and they kept doing it? If so, by definition, that interaction would be viewed 
as reinforcement of the undesirable behavior. The behavior increased following the 
consequence of being told ‘no.’ Likewise, have you ever told your child “Great 
Job!” and then not seen the behavior again? That would be an example of punish-
ment. The wanted behavior decreased after receiving the consequence of praise. 
Being able to distinguish between reinforcement and punishment is highly important for the success of any program that you may encounter for your child (see Table 1.1). If a consequence increases the future frequency of the behavior, it is reinforcement; if it decreases the future frequency, it is punishment. Whether the stimulus was added (positive) or removed (negative) determines which variety of reinforcement or punishment has occurred.
1 Introduction to Foundations of Behavior Change Strategies Applicable … 9
Table 1.1 Visual depicting reinforcement and punishment

Behavior               Addition of stimulus     Removal of stimulus
Increase in behavior   Positive reinforcement   Negative reinforcement
Decrease in behavior   Positive punishment      Negative punishment
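The two-by-two logic of Table 1.1 can be sketched as a small decision function. This is an illustrative sketch only; the function name and string labels are our own, not part of the chapter.

```python
def classify_contingency(stimulus_change, behavior_change):
    """Label a consequence using the 2 x 2 layout of Table 1.1.

    stimulus_change: "added" or "removed"     (positive vs. negative)
    behavior_change: "increase" or "decrease" (reinforcement vs. punishment)
    """
    sign = {"added": "Positive", "removed": "Negative"}[stimulus_change]
    process = {"increase": "reinforcement", "decrease": "punishment"}[behavior_change]
    return f"{sign} {process}"

print(classify_contingency("added", "increase"))    # Positive reinforcement
print(classify_contingency("removed", "increase"))  # Negative reinforcement
print(classify_contingency("added", "decrease"))    # Positive punishment
print(classify_contingency("removed", "decrease"))  # Negative punishment
```

Note that the reinforcement/punishment label depends only on the observed change in future behavior, and the positive/negative label only on whether a stimulus was added or removed.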
1.6 Summary 
This chapter aims to provide you with the foundations of applied behavior analy-
sis. Throughout the rest of the book, you will encounter interventions that seek to 
increase desirable behavior (e.g., skill acquisition goals) and to reduce undesirable 
behavior (e.g., behavioral excesses or challenging and disruptive behavior). Some 
of the interventions you learn about will be considered to be antecedent interven-
tions, which means the strategy comes before the behavior occurs (e.g., prompting 
and prompt fading procedures). Other interventions come after the behavior which 
are called consequences (e.g., reinforcement and punishment). 
The remainder of this book will provide educators with the foundational princi-
ples of evidence-based strategies for the general education classroom. Each chapter 
will provide you with an overview of the information to be presented, a more 
detailed description of the strategy, examples of implementation, applications in 
the classroom and materials or templates available for your use. 
References 
Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 1(1), 91–97. https://doi.org/10.1901/jaba.1968.1-91
Bouton, M. E., & Balleine, B. W. (2019). Prediction and control of operant behavior: What you see is not all there is. Behavior Analysis: Research and Practice, 19(2), 202–212. https://doi.org/10.1037/bar0000108
Cooper, J. O., Heron, T. E., & Heward, W. L. (2020). Applied behavior analysis (3rd ed.). Pearson Education.
Deitz, S. M. (1982). Defining applied behavior analysis: An historical analogy. The Behavior Analyst, 5(1), 53–64. https://doi.org/10.1007/BF03393140
Heward, W. L., Critchfield, T. S., Reed, D. D., et al. (2022). ABA from A to Z: Behavior science applied to 350 domains of socially significant behavior. Perspectives on Behavior Science, 45, 327–359. https://doi.org/10.1007/s40614-022-00336-z
Normand, M. P. (2008). Science, skepticism, and applied behavior analysis. Behavior Analysis in Practice, 1(2), 42–49. https://doi.org/10.1007/BF03391727
Skinner, B. F. (1953). Science and human behavior. Macmillan.
144-05
2 Determining Behavior Change
Tyler C. Ré, Shannon Hoey, Jennifer Quigley, 
and Tricia R. Clement 
2.1 Overview 
This chapter offers a comprehensive review of the considerations and options educators have when selecting a data collection system. The most important takeaways from this chapter are that data should be tracked with intention and that collecting data, on its own, is not an intervention. The responsibilities of edu-
cators make their time and effort precious resources. Inaccurate or unnecessary 
data collection efforts deplete an already burdened system. If wielded correctly, 
data collection is one of the most important tools in an educator’s toolbox. There-
fore, the purpose of this chapter is to provide educators with support in setting up 
effective and accurate data collection systems. 
2.2 Operational Definitions 
As outlined in Chap. 1 of this text, the behavioral dimension of applied behavior 
analysis means that a behavior is defined in a measurable and observable way 
(Baer et al., 1968, 1987; Miltenberger & Weil, 2013). In other words, when a 
definition meets the behavioral dimension of applied behavior analysis, one has 
an operational definition for that behavior (Cooper et al., 2020). When behaviors 
are operationally defined, anyone who has that definition—a substitute teacher, an 
administrator, an art teacher—can determine when a behavior is happening and 
when it is not happening (Altman, 1974; Cooper et al., 2020; Kahng et al., 2021). 
Education is very much a team sport with many different team members. When
T. C. Ré (B) · S. Hoey · J. Quigley · T. R. Clement 
The Chicago School, Chicago, IL, USA 
e-mail: tre1@thechicagoschool.edu 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_2 
all those team members are on the same page about identifying occurrences and non-occurrences of behaviors, with help from those operational definitions, there will be a much clearer picture of how significant an impact the behavior may be having on a student achieving their maximum potential as a learner. Once there is an operational definition and, with it, the ability to differentiate the occurrence and non-occurrence of the behavior, the discussion of setting objective goals based on those operational definitions can begin.
When creating these operational definitions, only include information that per-
tains to what can be seen (Cooper et al., 2020; Kahng et al., 2021; Miltenberger & 
Weil, 2013). People have a tendency in their regular lives to describe events by 
simultaneously assigning a reason. This tendency can sometimes be a barrier when 
creating operational definitions. Here is an example to demonstrate the difference 
between an operational definition and a non-operational definition. An example 
of an operational definition may be the following: property destruction is when the student rips papers during an assigned classroom activity (e.g., worksheets,
peers’ artwork, own artwork). A non-example of an operational definition may 
be the following: property destruction happens when the student is upset, doesn’t 
want to do his work, and throws the work on the ground or at the teacher. When 
there is room for interpretation, like there is in this second definition, there will be 
inconsistencies in the data collected (Repp et al., 1988). 
Words such as intend should be avoided when creating operational definitions (Miltenberger & Weil, 2013). For example, one would want to avoid the following language within a definition: aggression is when the student is intentionally trying to hurt others. While this may not be technically wrong about the student's intent, it leaves room for messy data collection because two people may interpret intent differently, since intent is not something that can be seen (Kahng et al., 2021).
An operational way to define this behavior could be, aggression is when a part 
of the student’s body (e.g., hand, foot) makes contact with another person’s body 
with enough force to leave a mark on the person’s skin. 
To help provide clarity with definitions, it can be helpful to provide examples 
and non-examples along with the operational definition. These examples and non-
examples do not need to be an exhaustive list of every situation that may qualify 
as an instance of the target behavior, but it can provide some guidelines for the 
staff on what to look for when recording their data. For example, the definition of 
aggression can be expanded upon as follows (Table 2.1).
The combination of the operational definition with the examples and non-examples of the behavior provides a very clear picture of what would be counted as an occurrence of the behavior and what would be counted as a non-occurrence.
If there are a number of individuals who know the student well or there are mul-
tiple staff who will be collecting data based on this definition, it will be beneficial 
to get their input on the definition you have crafted. Once the operational definition 
is finalized, the team can differentiate the occurrence and non-occurrence of that 
behavior. At that point, teaching teams can then approach setting objective goals 
based on those behaviors of interest.
Table 2.1 Examples and non-examples of an operationally defined behavior

Operational definition: Aggression is when a part of the student's body (e.g., hand, foot) makes contact with another person's body with enough force to leave a mark on the person's skin
Example: The student hits a peer's arm with a closed fist
Non-example: The student gives a peer a high five, or the student and a peer accidentally collide during recess
2.3 Objective Goals 
The concept of objective goal creation and implementation is repeated throughout the book and deserves considerable attention. When goals are derived from operational definitions and are objectively written, they will be viewed and understood the same way by all teachers and the student (Kahng et al., 2021). Since an
objective goal is targeting an operationally defined behavior, information about the 
occurrences or non-occurrences of the behavior can be recorded. Just as with the 
importance of communicating operational definitions of behaviors to other team 
members, goals must also be clearly communicated to all members of the team. 
Everyone involved in the student’s success should be aware of the goals and what 
it takes to achieve them. This clarity helps to ensure everyone is on the same page. 
Like the way you have seen IEP goals written, your objective goals must be 
observable, measurable, and state a mastery criterion. An example may be the 
following: during circle time, the student will stay in their assigned spot for 90% of 
recorded intervals for 10 consecutively recorded school days. A non-example may 
be the following: the student will stay seated during circle time. One thing to be 
mindful of when setting mastery criteria for the objective goal is that being 100% 
perfect for all goals is often not realistic. There are certainly some behaviors— 
such as behaviors related to the safety of the student or others—that would be 
understandable to aim for perfect or near-perfect performances. For example, if 
a student’s target behavior is aggression, the team would likely target a mastery 
criterion of zero instances of aggression. Conversely, if a student’s target behavior 
is off-task behavior for a kindergartener, there would be a less stringent mastery 
criterion. Being off-task is not a danger to the student or others and, to a given 
extent, needing reminders to stay on task may be more in-line with the typical 
behaviors of a kindergartener. Creating objective goals that are meaningful and achievable for the student requires that those goals also be individualized.
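As a sketch of how a team might check a mastery criterion like the circle-time goal above (90% of recorded intervals for 10 consecutively recorded school days) against daily data, consider the following; the function name and sample numbers are hypothetical:

```python
def mastery_met(daily_percents, threshold=90.0, consecutive_days=10):
    """True once `threshold` is met or exceeded on `consecutive_days`
    consecutively recorded school days."""
    streak = 0
    for pct in daily_percents:
        streak = streak + 1 if pct >= threshold else 0
        if streak >= consecutive_days:
            return True
    return False

# Day 1 falls short; days 2-11 all meet the 90% criterion.
data = [88, 92, 95, 91, 90, 93, 96, 90, 94, 92, 97]
print(mastery_met(data))  # True
```

Because the criterion requires consecutive days, a single day below threshold resets the count, which mirrors how such goals are typically tracked on a data sheet.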
Now that there is an operational definition of a target behavior and an objective 
goal created about that operationally defined behavior, the next step is to assess 
progress in attaining that goal. Several tools and strategies may be employed to 
determine where a student is currently performing in relation to an established 
goal. There are benefits and drawbacks to every tool in relation to the specific
goal and the context of the learner. Therefore, it is important to weigh these tradeoffs when selecting a measurement approach (see Chap. 3 for information about the assessment of problem behavior).
If through individual assessment and observation of the student’s progress, it is 
determined that a student is not learning and acquiring adequate skills, the teaching 
methodology needs to be evaluated. If a student isn’t learning, the onus is on us 
as educators. If sufficient student progress is not being made, the teaching strategy 
should be changed. Data can assist in identifying a new teaching strategy by iden-
tifying contexts or behavioral patterns that have shown more or less progress. This 
should be viewed as an opportunity to promote more effective learning opportu-
nities for the student and to meet that student where they are at with their current 
skillset. 
When assessing the effectiveness of the teaching methods, educators will put on 
their scientist hats and approach assessment from a place of curiosity and a desire 
for discovery. As a place to start assessing goals and progress, let’s discuss ranking 
systems. Each student’s progress should be measured against themselves rather 
than solely against their peers. Some instructional programs reference normative-
based assessments but comparing our student’s performance to the “norm” should 
never be the sole indicator for progress. Normative-based assessments may be 
used in conjunction with individual and group measures as exemplified in the 
Morningside Model (see Chap. 15). 
Behavior analytic-based educational programs highlight the importance of students proceeding at their own pace. In typical general education classrooms, the same or similar assignments and materials are presented to all students. This
progress for some students. To remove this ceiling effect, we can instead com-
pare the student’s current performance to their previous performance. Therefore, 
when we scaffold instruction to promote growth for the individual student, we may 
include bonus assignments to expand learning for those who have accomplished 
everything in the class or offer different levels/modifications of assignments for 
students based on their current performance level. Behavior analytic-based instruc-
tional programs enable students to progress more quickly or slowly than their 
peers, dependent on the targeted skill. Because programming is individualized, 
students receive the most effective instruction for them.This way of grouping 
learners to deliver differentiated instruction to meet the needs of individual stu-
dents may sound familiar since many general education classrooms have adopted 
similar approaches for instruction delivery for math and reading. 
2.4 Data Collection 
Teachers will know the interventions in place, such as bonuses and differentiated 
activities, are causing the desired change by collecting data. Data collection is one 
of the most important aspects of any classroom and is a concept that educators are 
likely quite familiar with. Teachers check the accuracy on a timed math problem 
worksheet (i.e., rate), provide a percentage correct on a homework assignment (i.e.,
percentage), track time from direction to initiation of an assignment (i.e., latency), and take attendance (i.e., frequency) data often. Teachers also may track how long a student is out of the classroom (i.e., duration). Perhaps sometimes a teacher
may make a mental note of this information or make a note on a sheet of paper 
or in a digital document. The key to collecting data is to know how to use it. 
Data collection itself is not an intervention. It is a tool that can evaluate different 
strategies of teaching and ways of interacting with students. 
Without data collection, it is impossible to determine the effect a teaching strategy is having on the student's learning. From a behavior analytic lens, data collection encompasses four of the seven dimensions (i.e., behavioral, analytic, effective, and generality; Baer et al., 1968). To collect data, one must first create
an operational definition with clear objectives. This is so that multiple people can 
all agree that a behavior has occurred or not occurred. For example, out-of-seat 
behavior may be perceived to have occurred by one person if the child walks away 
from their desk, while another person may consider it to have occurred if the stu-
dent stands up and sits back down in their seat during instruction. Data will be 
collected specifically using the operational definition. 
Collecting data also meets the criteria for the analytic and effective dimen-
sions. Data collection allows educators to gather more information on the impact 
of the behavior for that student in relation to their classroom environment (Mil-
tenberger & Weil, 2013). For example, one could see how frequently or how long 
a behavior is occurring. Furthermore, all data can be graphed to be visually ana-
lyzed. Graphs provide a visual picture of what is occurring with the behavior (see 
Figs. 2.1, 2.2, 2.3 and 2.4). An educator can see when a strategy or interven-
tion (e.g., classroom management or teaching strategies) is introduced and if there 
is a change in behavior related to the introduction of the intervention. This visual inspection of the data is how educators can analyze the occurrence of behavior and ensure that their strategies are producing the behavior change they seek. For this to happen, there are several different methods of data collection an educator may try.
In special education, data collection is often associated with IEP goals, but it 
can be used for a variety of other reasons as well. Anytime an educator is interested 
in whether a teaching strategy or new reinforcement system is working, they can determine that by looking at the data collected. For instance, when an educator has observed that a student is struggling with spelling words and decides to change the way the words are taught, they should look at accuracy on the spelling test before and after the change in teaching strategy. Is the student doing better (e.g., increased accuracy) or worse
(e.g., decreased accuracy) since the change? By looking at these data, an educator 
can see if the change in strategy is working for that student. However, each data 
collection strategy has strengths and weaknesses related to what is being evaluated. 
Any of the data collection systems may be a fit depending on the behavior being 
measured and the context in which it is being measured.
Fig. 2.1 Trends a ascending, 
b descending, and c no trend
Types of data collection include rate, duration, and a variety of skill acquisition data collection methods (Altman, 1974; Carr et al., 2018; Kahng et al., 2021; Miltenberger & Weil, 2013; Repp et al., 1976). Individual skill acquisition and
behavior reduction tend to be the most commonly observed in the classroom set-
ting. Other types of data could be used to evaluate the effects of classroom behavior 
management systems. One common misconception is that data must be collected 
every day, all day, to be used to evaluate the effects of interventions. However, 
data can also be collected occasionally (Kahng et al., 2021; Repp et al., 1976). For 
instance, an educator could collect data every other day, once per week, or even
Fig. 2.2 Examples of variable and stable data patterns 
Fig. 2.3 Graph representing a line graph with two conditions. Note The vertical line shows where 
strategies were changed. This graph shows effective intervention
every other week to evaluate the effects of a classroom behavior management sys-
tem. Furthermore, data could be collected during one portion (e.g., 10 min period 
during math instruction) of the day instead of all day (Repp et al., 1976). One of 
the most important considerations for data collection is the feasibility of the type
Fig. 2.4 Graph representing a line graph with two conditions. Note The vertical line shows where 
strategies were changed. This graph shows ineffective intervention
of data collection chosen. Even if an educator chooses the best way to collect data, if that method is not a realistic possibility for their schedule, their classroom, and so on, accurate data collection is likely to wane over time or not occur at all. In that case, it is best to pick an alternative method of data collection.
Now that we know it is okay and useful to collect data intermittently, let's dive a bit deeper into the different types of data that can be collected. When deciding on data collection, an educator must first decide on the measurement, the method of collection, and the way the data will be presented. All three should align; if recording the rate of a behavior (e.g., problems answered correctly per minute on a timed math worksheet), the datasheet will need to record the frequency of the behavior within a defined time. Ideally, the educator would graph the data to see the behavior visually; however, educators are most likely looking at a grade book or table that can show the changes too. Table 2.2 displays the three aspects an educator should take
into consideration when selecting a data collection strategy.
As demonstrated in Table 2.2, educators have a number of options, but the 
considerations for these types of data collection methods should not end there. 
These different methods will be reviewed in order of perceived difficulty for the 
classroom setting with feasibility of data collection for staff working in a general 
education classroom in mind. 
2.4.1 Interval Recording 
Interval recording may be the easiest form of behavior data collection for evaluating the behavior management system you have implemented in your classroom. This is because it does not require constant observation to collect data (Repp et al., 1976). Interval data collection is a system in which a person marks if
Table 2.2 Data collection decision domains

Decision domain: System of measurement
  Choices: frequency; rate; interval recording; latency; duration
  Examples: frequency of noncompliant behaviors; rate of vocabulary words answered correctly; latency to first response after a teacher question; duration of an elevator speech during a public speaking class

Decision domain: Method of data collection
  Choices: paper/pen; web-based application; clicker with recording
  Examples: individualized printed datasheet; downloaded application; waterproof paper; clicker

Decision domain: Presentation of data
  Choices: percentage of occurrence/independence/success; rate; duration; latency
  Examples: percentage of vocabulary words defined correctly; rate of hand raising; duration of tantrum; latency to first independent answer

Note Decisions to be made when creating a data collection system for graphical analysis
the behavior occurred or did not occur during a portion of an observation period 
(Cooper et al., 2020; Kahng et al., 2021; Repp et al., 1976). 
For instance, an educator may want to track the percentage of intervals in which students engage in vocal disruptions during math class. The math class starts at 10:00 A.M. and ends at 10:55 A.M. The educator could break the 55 min class into eleven 5 min intervals. If they then notate whether the behavior occurred or did not occur during each 5 min interval, they will be able to determine the percentage of intervals in which vocal disruptions occurred during math class. Notice that they notate whether the behavior occurred at any point during a 5 min interval. This is known as a partial interval recording system. This kind of data collection system is usually used for behaviors desired to be decreased, since it tends to overestimate how often a behavior is occurring (Cooper et al., 2020).
Another type of interval recording would require that a teacher mark if the 
behavior occurred for the entirety of the interval or not. This is known as a whole 
interval recording system (Cooper et al., 2020; Kahng et al., 2021). This data 
collection system is typically used when someone wants to see an increase in a 
behavior because it underestimates the occurrence. For example, if a teacher would 
like to see a student remain at their desk, they would only mark the interval if the 
student stayed at their desk for the entire interval. If the student were to move 
away from their desk for any reason and any duration, even as short as 2 s, they 
would not mark the behavior as having occurred for the entire interval. 
Finally, if both of these interval systems are a bit too intensive and not quite 
feasible for the educator’s classroom, they could use a momentary time sample
interval recording system. Like the other two kinds of interval recordings, a person 
would be taking an observation and breaking it down into equal-length intervals. 
They would then only mark an occurrence of a target behavior if it was occur-
ring at the end of the interval. For example, if a momentary time sample interval 
recording system was used for vocal disruption, the data collector would record an 
occurrence of a vocal disruption if it is occurring at the end of the 5 min interval. 
For whole interval and momentary time sample interval recording systems, the 
data collector would calculate the percentage of intervals in the same manner as 
the partial interval recording system. 
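To make the three scoring rules concrete, here is a minimal sketch that scores one set of behavior episodes under the partial-interval, whole-interval, and momentary time sample rules; the function name and episode times are hypothetical:

```python
def score_intervals(episodes, interval_len, n_intervals, rule):
    """Percentage of intervals scored as an occurrence.

    episodes: list of (start, end) times, in minutes, when the behavior occurred
    rule: "partial"   - behavior occurred at any point in the interval
          "whole"     - behavior occurred for the entire interval
          "momentary" - behavior was occurring at the instant the interval ended
    """
    marks = []
    for i in range(n_intervals):
        lo, hi = i * interval_len, (i + 1) * interval_len
        if rule == "partial":
            marks.append(any(s < hi and e > lo for s, e in episodes))
        elif rule == "whole":
            marks.append(any(s <= lo and e >= hi for s, e in episodes))
        elif rule == "momentary":
            marks.append(any(s <= hi <= e for s, e in episodes))
    return 100.0 * sum(marks) / n_intervals

# A 55-min math class split into eleven 5-min intervals, with two
# episodes of vocal disruption (minutes 3-6 and 22-40):
episodes = [(3.0, 6.0), (22.0, 40.0)]
for rule in ("partial", "whole", "momentary"):
    print(rule, round(score_intervals(episodes, 5, 11, rule), 1))
```

Consistent with the text, for this data set the partial-interval rule yields the highest percentage (an overestimate), the whole-interval rule the lowest (an underestimate), and momentary time sampling falls in between.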
2.4.2 Rate 
Rate data collection may be the next easiest form of data collection. This is because the teacher only makes a mark when the specific behavior occurs. Rate is then calculated by taking the total number of occurrences and dividing by the total number of minutes of observation (Kahng et al., 2021). This provides the teacher with the number of responses per minute, but one could also use the number of responses per hour as an evaluation tool. For example, if data were collected over a 15 min period and the target behavior happened 3 times, the rate could be expressed as 1 occurrence every 5 min, or the educator could multiply the 3 occurrences by 4 (there are four 15 min intervals in an hour) to find a rate of 12 instances per hour.
2.4.3 Duration 
Duration is another form of data collection. This type involves collecting informa-
tion of how long a behavior occurs from start to finish, again within a specified 
amount of time (Cooper et al., 2020). Typically, a timer or watch would be used 
to see how long an episode of the behavior occurs and the total time is recorded. 
For instance, duration could be used to see how long a student remains on task in 
an academic setting or how long a student is off-task during an academic lesson. 
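A duration measure simply sums episode lengths within the observation period. A minimal sketch, with hypothetical episode times:

```python
def total_duration(episodes):
    """Total time the behavior occurred; each episode is (start, stop) in minutes."""
    return sum(stop - start for start, stop in episodes)

# Three on-task episodes timed with a stopwatch during a 30-min lesson:
on_task = [(0.0, 8.5), (12.0, 20.0), (25.0, 29.0)]
print(total_duration(on_task))  # 20.5 minutes on task
```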
2.5 Context Within Data Collection 
Educators will also want to collect additional information about the context in 
which those behaviors are occurring. For example, when they take rate data and 
find out that the behavior of tantrums is happening on average 5 times per hour, 
that does not tell them what caused those tantrums and how staff responded. One 
way to get an idea of how often a challenging behavior is happening, to what 
degree, and why it may be occurring is to use a commonly used method of data 
collection called antecedent-behavior-consequence data collection or A-B-C data 
collection (Cooper et al., 2020).
Table 2.3 Example of A-B-C data collection

Antecedent: Johnny was presented with instructions to complete a worksheet. "Okay Johnny, time to finish your math work"
Behavior: Johnny ripped the worksheet into pieces (e.g., property destruction)
Consequence: The teacher told Johnny to go sit in the hallway until he was ready to complete the worksheet
The purpose of this A-B-C data collection system is to figure out the reason 
this behavior is occurring or what the child is getting by engaging in this behavior. 
This type of data collection even reveals patterns associated with this behavior 
(Cooper et al., 2020). Educators may often think to themselves that a student is 
just talking out of turn because they want the educator’s attention, or that a student 
is ripping the worksheet because they don’t want to work. These thoughts are a 
form of A-B-C data collection and can be used to figure out why someone is doing 
something. So, let’s dive a bit deeper into this type of data collection. 
Antecedent (A) is the situation that occurs immediately before the target behavior. The behavior (B) is whatever has been operationally defined as the target behavior. The consequence (C) is what occurred immediately following the behavior (Cooper et al., 2020). It should be noted that antecedents and consequences are always present, even if they seem 'unrelated' to the behavior or involve no obvious change in the environment (Table 2.3).
This type of data collection typically allows educators to get a better picture of 
what types of situations are influencing the occurrence of the target behavior and 
thus the reason why the behavior occurs (or what you may hear referred to as the 
‘function of behavior’ which is discussed more in-depth in Chap. 3). 
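A-B-C records lend themselves to simple tallies that surface the patterns described above. The sketch below uses hypothetical records, and the field names are ours:

```python
from collections import Counter

# Each record captures what happened immediately before and after the
# operationally defined target behavior:
abc_log = [
    {"antecedent": "worksheet presented", "behavior": "rips worksheet",
     "consequence": "sent to hallway"},
    {"antecedent": "worksheet presented", "behavior": "rips worksheet",
     "consequence": "task removed"},
    {"antecedent": "teacher talking to peer", "behavior": "calls out",
     "consequence": "teacher attention"},
]

# Tallying antecedents and consequences can suggest why the behavior
# occurs (e.g., ripping reliably follows work being presented):
print(Counter(r["antecedent"] for r in abc_log))
print(Counter(r["consequence"] for r in abc_log))
```

A tally like this is only a starting point; interpreting the pattern as a function of behavior is the topic of Chap. 3.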
2.6 Data Assessment 
Though it is important to collect data, collecting data without analysis is a waste of 
time. Remember, data is a tool, not an intervention! Data should be collected for the purpose of analysis: How will the data collected inform the teacher's decision making? What do the data tell the teacher about the instruction's impact on the skill being taught? How does the behavior compare before and after the teacher introduced the new instructional program?
Using a graph, instead of a table, can more quickly show you a few things 
about the behavior you are measuring (Cooper et al., 2020). For instance, it is 
easier to see which direction (e.g., trend) the behavior rate/frequency/accuracy/ 
duration/latency is going. The most used graph within ABA is a line graph. The 
line graph shows a change in behavior over time. Whether across hours, days, 
sessions, or weeks, behavior change is shown over time; one data point can be 
compared to the previous or following data point. This comparison then allows 
you to evaluate the effect of the teaching strategy or behavior change strategy.
22 T. C. Ré et al.
First, is the behavior rate going up, going down, or staying the same (e.g., ascending, descending, or no trend; Fig. 2.1)? This is known as the trend (Kimball & Heward, 1993). Second, one would look to see how much change there is from data point to data point. This is known as reviewing the variability of the data (Kimball & Heward, 1993). Variability helps a person see how consistently the student is doing something (Fig. 2.2). This may help an educator identify if there is something going
on at recess or in the educator’s teaching strategy that is changing each day. If 
things are changing each day, it is difficult to predict how a student will learn. 
Therefore, without predictability, it is more difficult to make decisions based on the data, which in turn can impact student outcomes.
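Trend and variability can also be checked numerically. The sketch below (with invented data and helper names) uses a least-squares slope as a stand-in for eyeballing the trend, and the range of the data path as a rough variability measure:

```python
def trend_slope(data):
    """Least-squares slope: positive = ascending trend, negative = descending."""
    n = len(data)
    mean_x = (n - 1) / 2
    mean_y = sum(data) / n
    num = sum((i - mean_x) * (y - mean_y) for i, y in enumerate(data))
    den = sum((i - mean_x) ** 2 for i in range(n))
    return num / den

def variability(data):
    """Range of the data path: smaller = more stable data."""
    return max(data) - min(data)

steady = [4, 5, 5, 6, 7]    # ascending, low variability
erratic = [2, 9, 1, 8, 3]   # little trend, high variability
print(trend_slope(steady), variability(steady))  # → 0.7 3
```

A clear positive slope with a small range suggests predictable learning; a near-zero slope with a large range suggests something in the environment is changing from day to day.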
Once the data are graphed, the teacher can see the effectiveness (or not) of 
the intervention. Has the student’s behavior changed and, if so, has it changed in 
the predicted direction? For example, have targeted vocabulary words increased in 
comparison to the pre-test/baseline condition following your teaching strategies? 
Has calling out behavior decreased following the implementation of a reward pro-
gram for raising hands? Fig. 2.3 depicts one example of a pattern of behavior 
before and after intervention. Responses per minute of correctly spelled words are 
displayed across days. During the first teaching strategy, the rate of correct spelling 
is increasing (i.e., trend) with relatively little variability. During the second teach-
ing strategy, the overall trend is decreasing with highly variable data. A decision 
would need to be made regarding the effectiveness of each teaching strategy. Is 
this a target behavior to increase or reduce? Is variability ideal? Either way, this 
picture allows an educator to make informed decisions regarding the effects of the 
intervention via visual analysis. 
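The phase-comparison decision described above can be sketched numerically as well. The data below are invented to loosely mirror the scenario: one teaching strategy with a rising, stable rate of correctly spelled words, and one with a falling, variable rate:

```python
def slope(data):
    """Least-squares slope over session number; sign gives the trend direction."""
    n = len(data)
    mx = (n - 1) / 2
    my = sum(data) / n
    return sum((i - mx) * (y - my) for i, y in enumerate(data)) / sum((i - mx) ** 2 for i in range(n))

strategy_1 = [3.0, 3.5, 4.0, 4.5, 5.0]   # correct words per minute, steadily rising
strategy_2 = [5.0, 2.0, 4.5, 1.5, 3.0]   # falling overall, highly variable

for name, phase in [("strategy 1", strategy_1), ("strategy 2", strategy_2)]:
    direction = "increasing" if slope(phase) > 0 else "decreasing"
    spread = max(phase) - min(phase)
    print(f"{name}: trend {direction}, variability {spread}")
```

For a behavior targeted for increase, the first phase (increasing trend, spread of 2.0) would support continuing that strategy; the second (decreasing trend, spread of 3.5) would prompt a change.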
2.7 Summary 
In summary, data collection is something that educators are quite familiar with and, 
hopefully, this chapter will give teachers new (and feasible) ways to determine if a 
teaching strategy is resulting in the desired outcome. These authors recognize and 
respect that the responsibilities of today’s educators are nothing short of expansive. 
Efficient and accurate data collection, while on the surface it may seem like one more task to add to the ever-growing list, can help make educators’ lives better. Through the strategies outlined in this chapter, educators will be able to pinpoint exactly the behavior of interest and efficiently communicate it to other team members (i.e., operational definitions). Teachers will then be able to
set objective goals rooted in those definitions. Teachers can then select, from the options outlined in this chapter (e.g., interval, rate, duration, and A-B-C data collection systems), a data collection method that accurately captures the behavior of interest and fits within the needs of the educator’s classroom. Teachers
are then able to display those data in such a way that they can quickly know if the 
teaching strategy is doing what they want it to do.
2 Determining Behavior Change 23
Key Terms 
Operational definition: describing behavior in such a way that anyone who is observing would be able to identify whether the behavior is occurring or not occurring.
Objective goal: a statement that is observable, measurable, and related to opera-
tionally defined behaviors; these goals should specify the mastery criterion. 
Interval data collection systems: a method of data collection in which you indicate whether or not a behavior occurred during each of a number of intervals within an observation; examples include partial interval, whole interval, and momentary time sample recording.
Rate: the number of instances of a behavior within a given amount of time; examples include instances per minute or instances per hour.
Duration: the length of time a behavior occurs, from start to finish.
A-B-C: a method of data collection where the antecedent (A), behavior (B), and 
consequence (C) are notated; this form of data collection can provide information 
about why, or under what context, the target behavior is occurring. 
Trend: the direction in which the path of data points is going. 
Variability: how close the data points are to one another in a given data path. 
Stability of data: when data points are very close to one another. 
Appendix 
Materials and Templates: Data Sheets 
See (Tables 2.4, 2.5, 2.6, 2.7 and 2.8).
Table 2.4 Descriptive blank ABC data sheet
Event Info Antecedent Behavior Consequence 
Time Start: ________ 
Time End: ________ 
Location: ________ 
Activity: ________ 
People Present:
Table 2.5 Checklist ABC data sheet

Event Info:
Time Start: ________
Time End: ________
Location: ________
Activity/Subject: ________
People Present:

Antecedent:
⟁ Low attention
⟁ Removal of item/task: _________
⟁ Denied access of item/task: _________
⟁ Transition out of the room
⟁ Transition between subjects
⟁ Change in routine
⟁ Other: __________

Behavior:
⟁ Hitting peers
⟁ Throwing items
⟁ Running away from teacher
⟁ Mouth noises/humming
⟁ Task refusal
⟁ Calling out

Consequence:
⟁ Task/Item removed
⟁ Task/Item provided
⟁ Attention provided
⟁ Child removed from class
⟁ Coping strategy offered
⟁ Emergency safety procedures
Table 2.6 Interval recording blank data sheet 
Interval # (5 min intervals) Date: 
Activity: 
Date: 
Activity: 
Date: 
Activity: 
Date: 
Activity: 
Date: 
Activity: 
1 
2 
3 
4 
5 
6 
7 
8 
9 
10 
Total # of Intervals Behavior Observed 
% of Intervals of Behavior Observed 
Note this data sheet can be used for partial interval, whole interval, or momentary time sample 
interval data collection
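The bottom rows of the interval data sheet reduce to simple arithmetic. A minimal sketch (the helper name is our own) using one observation of ten 5-minute intervals:

```python
def percent_intervals(observed):
    """observed: list of booleans, one per interval (True = behavior occurred).
    Returns the percentage of intervals in which the behavior was recorded."""
    return 100 * sum(observed) / len(observed)

# Ten 5-minute intervals; the behavior was recorded in 4 of them
intervals = [True, False, True, False, False, True, False, False, True, False]
print(percent_intervals(intervals))  # → 40.0
```

The same calculation applies whether the sheet was filled in with partial interval, whole interval, or momentary time sample recording; only the rule for marking an interval differs.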
Table 2.7 Rate data sheet 
Rate of Behavior Date: Date: Date: Date: Date: 
Start Time: 
End Time: 
Activity: 
Start Time: 
End Time: 
Activity: 
Start Time: 
End Time: 
Activity: 
Start Time: 
End Time: 
Activity: 
Start Time: 
End Time: 
Activity: 
Average Rate 
(Rate 1 + Rate 2 + · · · + Rate 5) divided by 5
Note Tallies would be put into a blank box for the frequency of the target behavior per activity 
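The average-rate formula in the note above can be sketched directly. The counts below are invented, and 30-minute activities are assumed for illustration:

```python
def rate(count, minutes):
    """Instances per minute for one observation period."""
    return count / minutes

def average_rate(rates):
    """(Rate 1 + Rate 2 + ... + Rate n) divided by n."""
    return sum(rates) / len(rates)

# Tallies from five 30-minute activities: 6, 9, 3, 12, and 6 instances
daily_rates = [rate(6, 30), rate(9, 30), rate(3, 30), rate(12, 30), rate(6, 30)]
print(round(average_rate(daily_rates), 2))  # → 0.24
```

Converting each tally to a rate before averaging keeps observations of different lengths comparable.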
Table 2.8 Duration data sheet 
Sample # Date: Date: Date: Date: Date: 
1 Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
2 Start Time: 
End Time: 
Start Time: 
End Time:
Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
3 Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
4 Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time:
End Time: 
5 Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
Start Time: 
End Time: 
Average Duration (Sample 1 + Sample 2 + … + Sample 5) divided by 5
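The average-duration calculation works the same way once each start/end pair on the sheet is converted to a length of time. A minimal sketch with invented times:

```python
from datetime import datetime

def duration_minutes(start, end, fmt="%H:%M"):
    """Length of one episode, from start to finish, in minutes."""
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 60

# Three recorded episodes (start, end)
episodes = [("09:00", "09:04"), ("10:15", "10:21"), ("13:30", "13:35")]
durations = [duration_minutes(s, e) for s, e in episodes]
average = sum(durations) / len(durations)
print(average)  # → 5.0 (minutes)
```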
References 
Altmann, J. (1974). Observational study of behavior: Sampling methods. Behaviour, 49(3–4), 227–267. https://doi.org/10.1163/156853974x00534
Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 1(1), 91–97. https://doi.org/10.1901/jaba.1968.1-91
Baer, D. M., Wolf, M. M., & Risley, T. R. (1987). Some still-current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 20(4), 313–327. https://doi.org/10.1901/jaba.1987.20-313
Carr, J. E., Nosik, M. R., & Luke, M. M. (2018). On the use of the term “frequency” in applied behavior analysis. Journal of Applied Behavior Analysis, 51(2), 436–439. https://doi.org/10.1002/jaba.449
Cooper, J. O., Heron, T. E., & Heward, W. L. (2020). Applied behavior analysis. Pearson.
Kahng, S., Ingvarsson, E. T., Quigg, A. M., Seckinger, K. E., Teichman, H. M., & Clay, C. J. (2021). Defining and measuring behavior. In W. W. Fisher, C. C. Piazza, & H. S. Roane (Eds.), Handbook of applied behavior analysis (pp. 135–154). The Guilford Press.
Kimball, J. W., & Heward, W. L. (1993). A synthesis of contemplation, prediction, and control. The American Psychologist, 48(5), 587–588. https://doi.org/10.1037/0003-066X.48.5.587
Miltenberger, R. G., & Weil, T. M. (2013). Observation and measurement in behavior analysis. In G. J. Madden, W. V. Dube, T. D. Hackenberg, G. P. Hanley, & K. A. Lattal (Eds.), APA handbook of behavior analysis, Vol. 1: Methods and principles (pp. 127–150). American Psychological Association.
Repp, A. C., Nieminen, G. S., Olinger, E., & Brusca, R. (1988). Direct observation: Factors affecting the accuracy of observers. Exceptional Children, 55, 29–36.
Repp, A. C., Roberts, D. M., Slack, D. J., Repp, C. R., & Berkler, M. S. (1976). A comparison of frequency, interval and time-sampling methods of data collection. Journal of Applied Behavior Analysis, 9(4), 501–508. https://doi.org/10.1901/jaba.1976.9-501
3 Functional Behavior Assessment in General Education Classrooms
Tyler C. Ré, Jair Yepez Torres, Jennifer Quigley, 
and Tricia R. Clement 
3.1 Overview 
The importance of determining the function of behavior is highlighted throughout 
this chapter. All behavior serves a function (Skinner, 1963); in its most simplistic 
form, any behavior will produce a consequence, either the occurrence of some-
thing (i.e., getting an object) or the removal/avoidance of something (i.e., getting 
out of a non-preferred task). Determining the function can help you and other edu-
cators to ensure the proper intervention is implemented to help your classroom run 
smoothly. There are two common ways to determine the function of a particular 
behavior or groups of behaviors. One is known as functional analysis (FA), which 
is the experimental manipulation of consequences. Through this systematic analy-
sis, the function of behavior can be determined and lead to the development of a 
function-based treatment. While an FA is a well-known evidence-based strategy to 
determine the function of a behavior (Hanley et al., 2003), most educational envi-
ronments incorporate the use of a functional behavior assessment (FBA). Both an 
FA and an FBA consist of similar components. Both include an indirect assessment 
(e.g., interviews and surveys) and direct assessment (e.g., direct observation), but 
an FA includes an empirical (e.g., testing across different consequences) assess-
ment as well. Based on these differences it is important to note that an FBA can 
develop a hypothesized function, but an FA can determine the empirically sup-
ported function of a behavior (Lewis et al., 2015). These assessments are vital 
to supporting both students and staff because without knowing the reason why a 
behavior occurs, we can only guess at the most effective intervention.
T. C. Ré (B) · J. Y. Torres · J. Quigley · T. R. Clement 
The Chicago School, Chicago, IL, USA 
e-mail: tre1@thechicagoschool.edu 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_3 
In schools, many special education teachers participate in the development of 
a functional behavior assessment (FBA). An FBA is a non-experimental assess-
ment that allows us to figure out why a student is engaging in a specific behavior. 
An FBA consists of two major aspects. First, the assessment administrator will 
use an indirect assessment tool (i.e., questionnaire) to gain a better understanding 
of the situations in which the behavior is exhibited. The second aspect is direct 
observation. Based on the questionnaire, the assessment administrator will deter-
mine a time to come and directly observe the situation that leads to undesirable 
behavior. This observation will most likely occur several times to ensure accurate 
information is gathered. As a teacher, you may already be doing this informally. 
For example, how many times have you thought that a student is talking out of turn 
just to get your attention? If you had that thought and observed the behavior, you informally started to look at the function of a behavior and made an educated guess that the student is engaging in the behavior for attention.
You also know that if you do not intervene appropriately, the learning opportuni-
ties for that student and every other student in your classroom may be negatively 
impacted. Determining the function of a behavior can help you intervene accord-
ingly using a function-based intervention, which will allow you to provide more 
instructional opportunities to achieve better student outcomes. Undesirable behav-
iors can often impact learning and, as we know, if a student isn’t learning, a teacher 
isn’t teaching. 
3.2 Indirect Assessment 
The functional behavior assessment process usually begins with an indirect assess-
ment procedure. An indirect assessment procedure is conducted by gathering 
information from an individual who is in close contact with the student and the 
challenging behavior (Floyd et al., 2005). In a classroom setting, this person may 
be a teacher, teacher’s aide, or any other person in close contact with the chal-
lenging behavior. Indirect assessment measures involve the use of tools to record 
information from these informants, in other words, the behavior is not measured 
directly. For example, these indirect assessments may include interviews, check-
lists, and rating scales (Floyd et al., 2005). These tools are structured to gather 
information about possible antecedent and consequence variables that maintain the challenging behavior. The Functional Analysis Screening Tool (FAST; Iwata & DeLeon, 2005) is a great example of an indirect assessment questionnaire. This
tool presents a list of yes/no questions related to variables that may occur before 
or after the challenging behavior. Once this assessment is scored, the results may 
provide a prospective answer about the possible maintaining variables of the chal-
lenging behavior. Another example is the Functional Assessment Checklist for 
Teachers and Staff (FACTS; March et al., 2000). This assessment tool is divided into two parts. Part A was designed to help the assessor identify which routines and challenging behaviors to target. Part B can then be used to identify environmental variables that may affect the target behavior previously identified. A
variety of other indirect assessments can be listed here such as the Questions 
about Behavioral Function (QABF; Paclawskyj, 1998), preliminary functional 
assessment survey, teacher functional behavior assessment checklist, the aberrant 
behavior checklist, and others (Anderson, et al., 2015). Although these tools may 
vary in the format in which they are presented, they all are based on informa-
tion obtained from an informant and not by observing the student engaging in the 
problem behavior. Because information described in these assessments is based on 
the ability of informants to recall information, these assessments may be prone to
error (Floyd et al., 2005). For this reason, usually, the next step in the FBA process 
is to conduct a direct assessment. 
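To make the scoring idea concrete: indirect tools like those above tally “yes” answers toward candidate functions. The sketch below is illustrative only; the items and the scoring rule are invented, not the actual FAST or FACTS content:

```python
# Hypothetical yes/no items, each tagged with the function it points toward.
# (Illustrative only -- not the real items or scoring of any published tool.)
items = [
    ("Behavior occurs when attention is diverted elsewhere", "attention"),
    ("Behavior stops when you talk to the student", "attention"),
    ("Behavior occurs when a task is presented", "escape"),
    ("Behavior stops when the task is removed", "escape"),
    ("Behavior occurs when an item is taken away", "tangible"),
    ("Behavior occurs even when the student is alone", "automatic"),
]

def score(answers):
    """answers: list of True/False, one per item. Returns 'yes' counts per function."""
    totals = {}
    for (question, function), said_yes in zip(items, answers):
        if said_yes:
            totals[function] = totals.get(function, 0) + 1
    return totals

print(score([True, True, False, False, False, False]))  # → {'attention': 2}
```

However a real instrument weights its items, the output is the same kind of thing: a tentative ranking of functions to be checked against direct observation.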
3.3 Direct Assessment 
Direct assessment procedures involve the collection of information directly from 
the source or in other words by observing and recording environmental vari-
ables that occur before, during, and after the challenging behavior. Different from 
indirect assessments, this method allows the observer to record information avoid-
ing perception or memory errors from the reporters (Alter et al., 2008). Direct 
assessment procedures are usually presented in different formats such as ABC 
recordings, structural analysis, and scatter plots (Anderson et al., 2015). Overall, 
direct assessments involve an observer taking detailed notes on antecedent and con-
sequence variables that occur during the challenging behavior. An observer can use 
a form that has been pre-filled with different antecedent and consequence options that are selected during the observation session. This method is more time-
efficient and allows the observer to spend less time taking notes. Often an assessor 
or observer can use both methods by combining pre-filled forms with open-ended 
notes. The most common tool for direct assessment is the ABC data form. These forms are divided into three sections: the first section (A) refers to antecedent variables that occur immediately before the challenging behavior; the second section (B) stands for behavior, where information about the challenging behavior is described; and the third section (C) stands for consequences, where variables that occur immediately after the behavior are recorded. For example, in an
ABC form, under the A section, the teacher placing a request can be identified as 
an antecedent event, then under the B section, a challenging behavior such as the 
student eloping from the desk can be described; finally, under the C section, the 
teacher’s reprimands or other consequence events can be detailed. 
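A pre-filled ABC form can be sketched as a small validator that only accepts the options printed on the form. The option lists below are hypothetical examples, not a standard instrument:

```python
# Hypothetical pre-filled options for a checklist-style ABC form.
ANTECEDENTS = ["teacher request", "transition", "low attention", "item denied"]
BEHAVIORS = ["elopes from desk", "calls out", "rips worksheet"]
CONSEQUENCES = ["reprimand", "task removed", "peer attention", "item provided"]

def record(antecedent, behavior, consequence):
    """Validate one observation against the form's pre-filled options."""
    for value, options in [(antecedent, ANTECEDENTS),
                           (behavior, BEHAVIORS),
                           (consequence, CONSEQUENCES)]:
        if value not in options:
            raise ValueError(f"not on the form: {value}")
    return {"A": antecedent, "B": behavior, "C": consequence}

entry = record("teacher request", "elopes from desk", "reprimand")
print(entry)  # → {'A': 'teacher request', 'B': 'elopes from desk', 'C': 'reprimand'}
```

Restricting entries to a fixed menu is what makes checklist forms faster in the moment; combining them with an open-ended notes field recovers the detail a menu cannot capture.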
3.3.1 The ABCs 
The ABCs of ABA refer to antecedent-behavior-consequence and is often referred 
to as the three-term contingency (Skinner, 1965). When discussing the role of 
the ABCs in instructional programming, one should look at what is being done 
prior to the student’s targeted behavior and what is directly following the behavior.
If behavior is not changing, the current antecedent and consequence strategies are likely responsible. Throughout the
chapters, the ABCs may be referenced in a variety of ways, but when broken 
down, they refer to the environmental context prior to the targeted behavior and 
the consequences delivered following the behavior. In ABA, consequences are not 
a bad thing. They are just something that directly follows behavior. Only across 
time are we able to tell the effect a consequence has on behavior; the behavior 
increased, decreased, or stayed the same. 
3.4 Functions of Behavior 
When a student engages in a challenging behavior that is disrupting the learning 
environment, a behavior assessment is recommended to determine the function or 
the reason behind this undesirable behavior. This is known as a functional behavior 
assessment (FBA). An FBA will include a combination of indirect measures (e.g., 
survey questions directed to you) and direct observation (e.g., data collected when 
someone observes the student in a class setting). Indirect measures may include 
interviews with teachers and caregivers, questionnaires, or surveys. Direct obser-
vation includes a variety of data collection methods, typically done by a behavior 
analyst or the school psychologist. The overall purpose of these assessments is to 
form a hypothesis as to why the behavior occurs. Behavior analysts call this the 
function of behavior. You have probably encountered a time when you or someone 
you know has said something like: “He is only doing it to get attention.” Or “She 
is just trying to get out of having to do work.” Little did you know that you or 
your friend were discussing the basic forms of the functions of behavior. Without 
getting too technical, there are four potential functions of every human behavior (attention, tangible, escape, and automatic; Cooper et al., 2020). An important note before going into this discussion is that all behaviors are maintained by at least one of these four functions; that is, both undesirable and desirable behaviors are maintained by these same functions.
3.4.1 Attention Function 
Access to attention is one of the four functions of behavior. It is important to note 
that with this particular function, the type of attention is not specified, meaning that it does not matter whether the attention is viewed as desirable or undesirable. An example
of a behavior that is maintained by access to attention is: The child is sitting at 
their desk working quietly and then raises their hand. The result of them raising 
their hand is that you come over to their desk to see what they need. The act 
of raising their hand was an attempt to gain your attention in a desirable way. A 
student can engage in undesirable behavior to get your attention too. For example, 
your class is participating in individual seat-work. A student is sitting at their desk 
working on a worksheet and throws their eraser in your direction. You then address 
the student to inform them that is not appropriate classroom behavior. Here, your
correction and response gave the student attention. It was negative attention but 
still a form of attention. 
3.4.2 Tangible Function 
Access to tangibles is another one of the four functions of behavior. A behav-
ior maintained by this function may occur either by being denied or prevented 
from getting a specific item or from the removal of a specific item or activity. 
For example, a child is playing with blocks during indoor recess. When the class 
is instructed that indoor recess is over, the student playing with blocks yells that 
recess is not over. The teacher doesn’t insist recess is over which allows the student 
to play with the blocks a few more minutes. Another example, everyone has been 
to a grocery store and seen a child yelling and crying about wanting to get some-
thing from the check out candy section. The child is crying and yelling because 
they have been denied the specific piece of candy. Furthermore, if the parent gives 
in and allows the child to get the piece of candy after they have been yelling 
and crying, this behavior is more likely to occur next time they are in the same 
situation. The behavior of crying has been reinforced by access to the candy.
3.4.3 Escape Function 
A third function is escape or avoidance of a task, activity, or interaction. A distinction can be made within this function: escape means that an ongoing task/activity is terminated, whereas avoidance means that the start of the task/activity/interaction is delayed or not presented due to the behavior. An example of escape-maintained behavior may be
when a student yells profanities at the music teacher during a music class activity. 
The teacher sends the student to the office and the student is not allowed to return 
to the class that day or week. An example of avoidance-maintained behavior would 
be a student ripping up the math worksheet. The student is then sent to the hall 
and asked to return when they are ready to complete the worksheet. The student’s 
behavior has just allowed them to potentially avoid completing the worksheet. 
3.4.4 Automatic Function 
The final function of behavior is known as automatic. Automatically-maintained behavior suggests that access to a preferred sensation or removal of an aversive sensation can be achieved without the assistance of another person. This behavior would occur regardless of whether another person is in the area or the person is
alone in a room. Many people engage in automatically-maintained behavior and 
don’t even know it. There are many behaviors thatare categorized as automatic: 
scratching an itch, wiping a runny nose, stretching a cramping muscle, sitting in a 
chair when tired, a person cracking their knuckles, bouncing their knee, humming
to themselves, rubbing their beard, or twirling their hair, just to name a few. However, in the classroom, automatically maintained behavior that may disrupt the rest of the class may include a group of behaviors known as stereotypical responding, such as hand flapping, excessive tapping of the feet, or rocking.
The functions of behavior may be one of the most important aspects of developing a classroom behavior management program. The function of a behavior, together with the other circumstances surrounding its occurrence, guides the development of behavior change procedures and can also be used to increase socially appropriate behaviors.
3.5 Implementation of an FBA 
Once the need for an FBA is identified in a classroom, appropriate consent has been gathered, and relevant participants have been selected, the assessor will initiate the FBA process as follows. For purposes of this example, the challenging behavior to
assess will be elopement from the classroom without permission for a child named 
Jonny. To start the assessment process, the assessor needs to gather information to 
clearly define the challenging behavior and identify times, settings, and variables 
to observe during the direct assessment process. To do so, the assessor may con-
duct an interview with the teacher and teacher’s aide. With this information, it is 
important that the assessor clearly define the challenging behavior with objective
and descriptive language. In this example, elopement from the classroom is defined 
as any instance the student leaves his chair or designated area and moves to the 
classroom’s exit without the teacher’s permission. Once this challenging behavior 
is identified, the teacher or teacher’s aide can be asked to complete an additional 
indirect assessment such as the FAST (Iwata & DeLeon, 2005) or FACTS (March 
et al., 2000). With this information, the assessor is now able to identify the circumstances where the direct assessment observation should take place such as time,
class setting, teacher in charge, etc. Before continuing into the direct assessment 
observation, the assessor should have an idea of what antecedent and consequence 
events may be maintaining the elopement behavior. Knowing these, the asses-
sor will schedule an observation or a series of observations at the setting and 
time when the challenging behavior is more likely to happen. During the direct 
assessment observation, the assessor will fill out an ABC form; these forms can include pre-filled options. In this example, the observer may write down the
following information: 
(A) Antecedent: During music class, the teacher instructed the students to get into 
groups and start a dancing activity. 
(B) Behavior: Immediately after the presentation of the teacher’s request, Jonny 
yelled “look there is a giraffe in the hallway” and ran toward the exit door. 
Jonny opened the door and exited the classroom.
(C) Consequence: The music teacher immediately ran after Jonny, who was standing behind the door in the hallway. The teacher reprimanded Jonny, stating
that he was not allowed to exit the class without permission and that there was 
not a giraffe in the hallway. Jonny entered the classroom and his classmates 
laughed and asked him about the giraffe. 
The assessor may continue observing and recording additional instances of chal-
lenging behaviors. Once this information is obtained, the assessor may determine 
the possible function of the behavior based on the recorded antecedent and consequence events. For this example, the possible functions may include escape from demands or a non-preferred activity (dancing) and access to social attention (laughter and interactions with peers).
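The hypothesis step can be illustrated by tallying recorded consequences against candidate functions. The mapping below is a simplified, invented example; interpreting real ABC data requires the assessor’s judgment, not just counting:

```python
# Hypothetical mapping from recorded consequences to candidate functions.
CONSEQUENCE_TO_FUNCTION = {
    "teacher reprimand": "attention",
    "peer laughter": "attention",
    "task removed": "escape",
    "item provided": "tangible",
}

def hypothesize(abc_records):
    """Tally candidate functions across the consequences observed for a behavior."""
    tally = {}
    for record in abc_records:
        function = CONSEQUENCE_TO_FUNCTION.get(record["C"])
        if function:
            tally[function] = tally.get(function, 0) + 1
    return tally

observations = [
    {"A": "dance activity begins", "B": "elopes", "C": "teacher reprimand"},
    {"A": "dance activity begins", "B": "elopes", "C": "peer laughter"},
    {"A": "worksheet presented", "B": "rips worksheet", "C": "task removed"},
]
print(hypothesize(observations))  # → {'attention': 2, 'escape': 1}
```

A behavior can tally under more than one function, as in the elopement example above, which is why the FBA yields a hypothesis rather than a confirmed function.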
3.6 Summary 
Every behavior that we engage in during our daily lives serves a function or a 
purpose as some people may say. Challenging behaviors shown by students in a 
classroom are no different. Although it is often not practical to conduct a formal experimental analysis in a classroom, FBAs help us understand challenging behaviors and the environmental variables that are maintaining or increasing them. The FBA
process starts with the assessor conducting an indirect assessment, usually conduct-
ing an interview with the teacher or teacher’s aide. Additional indirect assessment 
tools can be used to collect further information on variables affecting challenging 
behavior. With this information, assessors can arrange a direct assessment obser-
vation at times, locations, and with people in which the challenging behavior is 
more likely to occur. During the direct assessment observation, the assessor will 
collect ABC data. The ABC data collected represents what is known as the three-
term contingency or in other words the antecedents, behaviors, and consequences. 
With this information, the assessor can start concluding the possible functions of 
the challenging behavior. The four potential functions that behaviors can serve are 
attention, tangible, escape, and automatic functions. Once one or more of these 
functions are identified, then a possible course of treatment can be developed. An 
appropriate treatment or behavior plan would use the information obtained during 
the FBA process to appropriately create a plan, this process will be described in 
the next chapters. 
Key Terms 
ABC Data. A description of antecedents, behaviors, and consequences collected by 
an observer. 
Antecedent. An environmental event that occurs immediately before challenging 
behavior.
Attention Maintained. A behavior that is being reinforced by access to social 
attention such as peers, parents, teachers, etc. 
Automatic Maintained. A behavior that is being reinforced by access to internal stimulation that cannot be directly measured and that is not socially mediated.
Behavior. Any action in which a living organism engages that produces a change in 
the environment. 
Consequence. An environmental event that occurs immediately after the challenging 
behavior. 
Direct Assessment. The process in which behavior and related variables are recorded 
by observing the actual behavior. 
Escape/Avoidance Maintained. A behavior that is being reinforced by terminating 
or preventing an aversive event. 
Function of Behavior. The observed relation between the occurrence of the behavior 
and the occurrence of a consequence event that reinforces the behavior making it 
more likely to occur in the future.
Indirect Assessment. The process in which the behavior and related variables are 
recorded based on information obtained from other observers (i.e., a teacher or teacher’s aide).
Tangible Maintained. A behavior that is being reinforced by access to objects or 
activities. 
Appendix 
Useful Resources 
FAST; Iwata, B., & DeLeon, I. (2005). The functional analysis screening tool. Gainesville, FL: The Florida Center on Self-Injury, University of Florida. https://depts.washington.edu/dbpeds/Screening%20Tools/FAST.pdf
QABF; Paclawskyj, T. R. (1998). Questions About Behavioral Function (QABF): A behavioral checklist for functional assessment of aberrant behavior. https://arbss.org/wp-content/uploads/2021/05/Questions-about-Behavioral-Function-QABF-Google-Docs.pdf
FACTS; March, R. E., Horner, R. H., Lewis-Palmer, T., Brown, D., Crone, D. A., Todd, A. W., et al. (2000). Functional assessment checklist for teachers and staff (FACTS). Eugene, OR: University of Oregon. https://doe.virginia.gov/support/student_conduct/functional_assessment_checklist.pdf
3 Functional Behavior Assessment in General Education Classrooms 35
ABC Data Collection Template; https://www.kalamazoopublicschools.com/site/ 
handlers/filedownload.ashx?moduleinstanceid=2010&dataid=2002&FileName= 
abc_recording_form_20150909_155712_1.pdf 
References 
Alter, P. J., Conroy, M. A., Mancil, G. R., & Haydon, T. (2008). A comparison of functional 
behavior assessment methodologies with young children: Descriptive methods and functional 
analysis. Journal of Behavioral Education, 17(2), 200–219. https://doi.org/10.1007/s10864-
008-9064-3 
Anderson, C. M., Rodriguez, B. J., & Campbell, A. (2015). Functional behavior assessment in 
schools: Current status and future directions. Journal of Behavioral Education, 24(3), 338–371. 
https://doi.org/10.1007/s10864-015-9226-z 
Cooper, J. O., Heron, T. E., & Heward, W. L. (2020). Applied behavior analysis (3rd ed.). Pearson 
Education, Inc. 
Floyd, R. G., Phaneuf, R. L., & Wilczynski, S. M. (2005). Measurement properties of indirect 
assessment methods for functional behavioral assessment: A review of research. School Psy-
chology Review, 34(1), 16. 
Hanley, G. P., Iwata, B. A., & McCord, B. E. (2003). Functional analysis of problem behavior: 
A review. Journal of Applied Behavior Analysis, 36(2), 147–185. https://doi.org/10.1901/jaba. 
2003.36-147 
Iwata, B., & DeLeon, I. (2005). The functional analysis screening tool. The Florida Center on Self-
Injury, University of Florida. https://depts.washington.edu/dbpeds/Screening%20Tools/FAST. 
pdf 
Lewis, T. J., Mitchell, B. S., Harvey, K., Green, A., & Mckenzie, J. (2015). A comparison of func-
tional behavioral assessment and functional analysis methodology among students with mild 
disabilities. Behavioral Disorders, 41(1), 5–20. https://doi.org/10.17988/0198-7429-41.1.5 
March, R. E., Horner, R. H., Lewis-Palmer, T., Brown, D., Crone, D. A., Todd, A. W., et al. (2000). 
Functional assessment checklist for teachers and staff (FACTS). University of Oregon. 
Paclawskyj, T. R. (1998). Questions about behavioral function (QABF): A behavioral checklist for 
functional assessment of aberrant behavior. 127. 
Skinner, B. F. (1965). Science and human behavior. Free Press. http://ebookcentral.proquest.com/ 
lib/tcsesl/detail.action?docID=4934291 
Skinner, B. F. (1963). Operant behavior. American Psychologist, 18(8), 503–515. https://doi.org/ 
10.1037/h0045185
4Antecedent Interventions: Proactive 
Strategies for Changing Behavior 
Mindy J. Cassano, Holly Bruski, and Sarah Bendekovits 
4.1 Overview 
This chapter discusses a variety of proactive strategies for changing behavior that 
are applicable within the general education classroom setting. Antecedent variables 
(what happens prior to a behavior) can help predict behaviors and will be presented 
in the context of motivating operations (which encourage or discourage a certain 
behavior). In addition, antecedent interventions such as the high-probability 
response sequence, antecedent exercise, providing choices, visual schedules, 
foreshadowing, and countdowns are described, along with examples of how each 
intervention can be applied. 
4.2 Implementation and Application 
As discussed in earlier chapters, in behavior analysis, the term antecedent refers 
to any stimulus that is present or occurs before the target behavior of interest 
(Cooper et al., 2007). Antecedents are the first part of the three-term contingency, 
commonly called the ABC contingency: Antecedent, Behavior (what a person does), 
and Consequence (what occurs after the behavior). However, remember from 
previous chapters that a consequence is not necessarily a negative response or a 
punishment. For example, a teacher asks the class a question (antecedent), a student 
calls out (behavior), and the teacher tells the student to raise their hand and answer 
the question (consequence).
M. J. Cassano (B) · H. Bruski · S. Bendekovits 
The Chicago School of Professional Psychology, Chicago, IL, USA 
e-mail: mjcassano@hotmail.com; mcassano@ego.thechicagoschool.edu 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_4 
37
38 M. J. Cassano et al.
In the early years of behavior analysis, antecedents were not commonly considered 
variables that affected behavior change. However, with the publication of 
Michael’s (1982) discussion of motivating operations, the role of antecedent 
variables in changing behavior came to the forefront. This matters because behavior 
can be changed by better understanding the motivation behind it; strategies can 
then be put into place to manipulate those antecedents and increase positive 
behavior. In the above example, when the teacher responded to the student (even 
though the response corrected the behavior), they inadvertently reinforced the 
calling out. The motivation behind the behavior was attention: the student wanted 
the teacher to call on them. The student still received attention in the form of 
the teacher’s response; therefore, the student is more likely to call out again in the 
future. Knowing this, the teacher can either model the appropriate behavior 
(by raising their own hand with a finger over their mouth) or completely ignore the 
calling out by not responding, and then reinforce (praise) the next time 
that child raises their hand without calling out. Using this strategy will reduce 
the negative behavior (calling out) and increase the positive behavior (raising a 
hand). 
4.3 Motivating Operations (MO) Definitions and Examples 
Antecedent variables are often assessed based on their motivating operations. Moti-
vating operation (MO) is an umbrella term referring to establishing operations 
(EOs) and abolishing operations (AOs; Cooper et al., 2007). EOs increase the value of a 
reinforcer and increase the frequency of a behavior. Therefore, EOs increase the 
motivation to access a reinforcer. For example, if a student had a small breakfast 
before coming to school, being hungry would be the EO to gain access to food. 
Further, if a student is hungry, they may be more likely to engage in those appro-
priate behaviors or tasks that are required in order to get to lunch. A student who 
dislikes math may be more motivated to do their work close to lunch time. Being 
hungry increases the value of food and increases behaviors that gain access to the 
food. Abolishing operations (AOs) decrease the reinforcing effectiveness of a rein-
forcer and decrease the frequency of a behavior. In other words, AOs decrease 
the motivation to access a reinforcer. Using the above example, if a student has just 
had a large snack or is no longer hungry, food would serve as an AO, and the prompt 
“it is almost lunch time” would lose its motivating effect, because the reinforcing 
effectiveness of food is decreased at that time. The fact that the student just ate means 
they will most likely no longer be motivated by food. Because of this decrease in 
reinforcing effectiveness, the student may also be less likely to engage in those 
same appropriate task-oriented behaviors mentioned above, such as completing a 
task (Fig. 4.1).
4 Antecedent Interventions: Proactive Strategies for Changing Behavior 39
Motivating Operations (MOs) 

Establishing Operations (EOs) 
● Increase the reinforcing effectiveness of a stimulus 
● Increase the current frequency of a behavior 

Abolishing Operations (AOs) 
● Decrease the reinforcing effectiveness of a stimulus 
● Decrease the current frequency of a behavior 

Fig. 4.1 Characteristics of motivating, establishing, and abolishing operations 
4.4 Stimulus Control 
It is possible that students may have more challenging behavior in the presence 
of someone they are unfamiliar with, such as a substitute teacher. This is 
likely because the classroom teacher has gained stimulus control of the classroom, 
and the substitute teacher has not. Stimulus control describes when a behavior is 
triggered by the presence or absence of something, and it can either encourage or 
discourage a behavior. In this case, the students have learned what the consequences 
(positive or negative) are with the regular teacher through a history of reinforcement. 
For example, if they do not listen or follow directions (behavior), they may have to 
stay in for recess or may not get extra free time at the end of the day. 
In Chapter 1, the authors discussed that a consequence in the three-term contin-
gency, or ABC contingency, refers to something that occurs after a behavior. 
Behavior tends to be controlled by its consequences. For example, a student may 
be very talkative with friends but not during class. With friends, the student gets 
reinforcement through reciprocal conversations (maybe having a friend laugh at 
their joke). In class, the teacher may either reprimand talking or not respond when 
the student talks out of turn. Students learn that in order to access reinforcement 
(teacher praise, or maybe something tangible if the teacher has a classroom 
behavior management plan), they must work quietly and behave accordingly. 
In general, stimulus control influences the probability of a student complying 
with teacher requests. If stimulus control is achieved, the probability that a stu-
dent will comply with the teacher’s instruction will increase. In contrast, if there is 
little stimulus control, the student will be less likely to comply with instructions. 
Stimulus control can be strengthened by providing consistent consequences when 
a student engages in a particular response. For example, if a teacher always pro-
vides praise (consequence) for on-task behavior (remaining seated, attending to the 
teacher, completing work, etc.), it is likely that students will continue to engage 
in these appropriate behaviors. In contrast, if students are intermittently allowed 
to leave their desks without the teacher checking whether work is completed, they 
will be less likely to remain on task. For a behavior to be under stimulus control, 
the stimulus or event that precedes the behavior must consistently be followed by 
that specific behavior 
or response. Using these examples, the student learns that reinforcement (possibly 
praise) will be given after staying on task, or quietly working. Stimulus control 
is pertinent to increasing the frequency of appropriate behavior in any learning 
situation. 
To establish stimulus control over calling out, teachers can praise (and/or answer) 
students when they raise their hands. For example, teachers can provide behavior-
specific praise by saying they liked that the student raised their hand before 
answering the question. In the same manner, teachers should not respond to 
students who call out. For example, when a student calls out, “Ms. C, Ms. C, I have 
a question,” the teacher can praise another student for raising their hand, make a 
general statement to the whole class that they will only answer students who have 
their hands raised, or model hand raising by raising their own hand. The teacher 
should then immediately answer and praise the student engaging in the appropriate 
behavior. This contingency will influence the probability of a student complying 
with future requests from the teacher. 
4.5 Increasing Compliance 
In addition to stimulus control, there are many different types of antecedent 
interventions that can be used to increase student compliance. Here, compliance 
refers to the completion of a given request or demand. Some of these interven-
tions include techniques such as the high-probability response sequence, antecedent 
exercise, providing choices, visual schedules, foreshadowing, and countdowns. 
4.5.1 High-Probability Response Sequence 
The high-probability (high-p) response sequence builds on the observation that 
once a person starts responding, they tend to continue until something interrupts 
them. The high-p response sequence is an instructional sequence in which two to 
five low-effort instructions that have a high probability of compliance, based on a 
student’s past performance, are delivered in rapid succession, followed by a single 
request that has a low probability of compliance (Mace et al., 1988). The low-effort 
instructions should already be in the student’s repertoire (skills they 
know). 
The high-probability response sequence may be beneficial for a student who 
has a history of not completing a given instruction or is slow at completing tasks. 
The high-probability response sequence should not be used for students who do not 
know how to perform a particular skill; rather, it should be used for students who 
know how to perform the skill but will not. Research has demonstrated that 
high-p sequences have been effective in increasing compliance in medical exam-
ination requests (Riviere et al., 2011), increasing social interactions (Davis et al., 
1994), and increasing compliance during academic tasks (Brosh et al., 2018). In the 
high-p sequence, a teacher presenting a workbook page to a typically noncompliant 
student could precede that low-probability request of a workbook page with three 
high-probability requests such as, “Take out workbook,” “Take out pencil,” and 
“Write name on top.” By preceding the request to start the workbook page with 
three relatively easy requests that the student has a history of reliably completing 
(and is already in their repertoire), the student will be more likely to comply with 
the less preferred task (given they have the required skill to do so). This sequence 
should produce “momentum-like” effects. Demands should be delivered as clear, 
concise statements rather than questions: for example, “Sit down in your chair” 
rather than asking the student if they can sit down over there. Another example is 
asking a student to turn on the lights for the class (a preferred activity), then asking 
them to sit (another high-probability task), followed by “Get started on the task” 
(a non-preferred, low-probability request). 
One key variable that affectsthe efficacy of the high-p response sequence is how 
quickly the requests are delivered. Requests that are delivered in quick succession 
have been shown to be more effective at increasing compliance than requests that 
are delivered more slowly. In reference to the previous example, if the student is 
not told, “take out workbook” until several seconds have lapsed from them sitting 
down, they will be less likely to comply than if the request to take out workbook 
is delivered as soon as the student sits. 
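For teachers who already track compliance data electronically, the selection step of the high-p sequence can be sketched in code. This is an illustrative sketch only, not a procedure from the chapter: the record format, the function name, and the 80% cutoff are assumptions.

```python
def pick_high_p_requests(history, n=3, threshold=0.8):
    """Select the n requests a student has complied with most reliably.

    history   - dict mapping request text to past compliance rate (0.0-1.0);
                the format and the 0.8 cutoff are illustrative choices.
    threshold - minimum rate for a request to count as high-probability.
    Returns the top-n qualifying requests, most reliable first.
    """
    qualifying = [(rate, request) for request, rate in history.items()
                  if rate >= threshold]
    qualifying.sort(reverse=True)  # highest compliance rate first
    return [request for rate, request in qualifying[:n]]

# Example: past compliance rates for one student
past_performance = {
    "take out workbook": 0.95,
    "take out pencil": 0.90,
    "write name on top": 0.85,
    "start the workbook page": 0.20,  # the low-probability target request
}
sequence = pick_high_p_requests(past_performance)
# 'sequence' lists the requests to deliver rapidly before the target request
```

The returned requests would then be delivered in quick succession, immediately followed by the low-probability target request.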
4.5.2 Antecedent Exercise 
There is much talk in education today about giving movement breaks, but 
why? Breaks and movement increase productivity and help our brains learn new 
skills. Antecedent exercise has also been shown to be an effective intervention 
for increasing compliance and decreasing problem behaviors (Cannella-Malone 
et al., 2011). Antecedent exercise refers to engagement in physical activity prior 
to the onset of a demand (Morrison et al., 2011). In addition, Folino et al. (2014) 
discovered that 30 minutes of exercise resulted in 90 minutes of “behavioral
Table 4.1 Movement breaks used to help gain attention and calm bodies 

Cosmic yoga: https://cosmickids.com/ 
GoNoodle: https://www.gonoodle.com/ 
Headspace: www.headspace.com 
Calm app for kids: www.calm.com 
improvements” (p. 447), with the greatest improvement being within the first 
30 minutes. Movement breaks can be beneficial for students who engage in stereo-
typy, self-injury, and disruptive behaviors (Baumeister & MacLean, 1984; Kern 
et al., 1982; McGimsey & Favell, 1988). Kern et al. (1984) found that 15 minutes 
of mild exercise, such as playing ball, had little or no influence on stereotypy, but 
vigorous exercise such as 15 minutes of jogging showed reduced stereotypy. Based 
on previous research, by preceding periods of work with physical activity, a stu-
dent may be more likely to comply with presented tasks. For example, if a teacher 
finds that many students in a classroom are less likely to comply with a read-
ing assignment as compared to other school subjects, they may choose to schedule 
reading after gym. Thirty minutes of gym directly preceding reading class may 
increase compliance with demands placed during reading. There are also times in 
every teacher’s day when they feel they are losing the kids’ focus and attention; 
maybe the math lesson ran late, or they notice many of the students are increas-
ingly wiggly. On the other hand, they may notice that when their students come 
back from recess, they have loads of energy and it is difficult to get them back to 
work. Here are a few examples of videos that can be played to regain students’ 
lost attention and help calm their minds. The examples vary from vigorous workouts 
and guided dances on GoNoodle to calming videos on Headspace, Cosmic Yoga, 
and the Calm app for kids. Depending on the student and the behavior they are 
engaging in, one type of video may be more beneficial than another (Table 4.1). 
4.5.3 Providing Choices 
The provision of choices has also been demonstrated to be an effective means of 
increasing compliance when presenting a nonpreferred task. Through choosing a 
particular item or task, a student is allowed more control over their environment, 
typically resulting in increased compliance and a reduction in disruptive behavior 
(Dunlap et al., 1994). Choices can be as simple as letting the student choose which 
task to complete first or what to earn upon completing a task. For example, a teacher 
might allow the student to choose any five math problems out of the nine listed 
(even throwing in a few extra to start with, knowing they will be giving the student 
a choice), or give a student who dislikes writing the choice to write or type their 
essay. Teachers can also give students a choice of what they want to earn, such as 
a book or a puzzle, after completing the
task. Cole and Levinson (2002) found that providing choices reduced challenging 
behavior for students who were typically noncompliant. By providing reasonable, 
controlled choices, students are more likely to follow directions; it gives them 
a sense of control. In the end, teachers can ask themselves: does it really matter 
how the task is completed as long as it gets completed? 
4.5.4 Visual Schedules 
Visual schedules are often a staple in behavioral interventions, as they allow the 
student to reference what is expected of them and what is to come at any given 
moment. This can often be effective for reducing challenging behavior, especially 
for students who have difficulty with transitions (Dettmer et al., 2000; Dooley 
et al., 2001; Pierce et al., 2013). Visual schedules can be simple or complex in 
nature, and may range from simple work tasks to picture icons depicting spe-
cific ordered tasks (see Fig. 4.2 for examples of early education visual schedules). 
Visual schedules can be used for individual students or for the whole class. They 
help students to know what comes next and how many activities or steps are 
left (see Fig. 4.2a). Students can even remove or check the activity icons that 
have already taken place, so they have a visual of what tasks are still left and what 
tasks have been completed (see Fig. 4.2b). If a student is struggling with a step 
in the schedule, the step can be broken down into smaller parts. For example, a 
general handwashing visual (see Fig. 4.2c) may be beneficial for some students, 
but the broken-down steps of the handwashing routine (see Fig. 4.2d) may be bene-
ficial for others. For the upper elementary grades, teachers may consider writing 
the subjects of the day on a white board for all the students to see, and then they 
can cross off or erase as each activity/class is complete. As students get familiar 
with visual schedules, students may become more independent and compliant with 
transitions to daily activities.
4.5.5 Foreshadowing and Countdowns 
Foreshadowing, or providing a student with an idea of what is to come, can be an 
effective means of preparing individuals for upcoming changes in their environ-
ment and help to increase compliance. Verbal reminders, visuals, and timers can be 
used as foreshadowing stimuli. For example, an educator may give their class a 
verbal reminder, “One minute until we need to line up,” as a foreshadowing 
technique to indicate when a change is expected. In this case, the teacher would 
want to use that warning to let the students know it is almost time to go inside 
after recess. This could help to alleviate or reduce any problem behavior during 
transitions. 
Countdowns are another form of foreshadowing that can be effective at increas-
ing compliant responding to environmental changes. Countdowns can be verbal, 
visual, and/or auditory in nature. For example, counting down from five “5, 4,
a. A sample visual schedule of a daily routine that can be used for the whole class 
or an individual student: Arrival, Calendar, Math, Reading, Bathroom, Lunch/Recess, 
Music, Art, Go home 

b. Unpacking visual schedule checklist: Open backpack and start unpacking; Put 
snack in snack basket; Put lunch in lunch bin; Put homework in homework bin; 
Hang up backpack; Sit at desk and start morning work 

Fig. 4.2 Example of a visual schedule for early grades
3, 2, 1” while simultaneously using their fingers to represent the change in time 
(putting down one finger at a time) is one option; teachers can also incorporate a 
visual timer that depicts the amount of time left. Some visual timers may also beep 
for the last five or 
indicate an upcoming environmental change. Middle schools and high schools tend 
to use warning bells to let students know they have a few more minutes until they 
need to get to class and another bell for when they should be in their classroom. 
For an early education teacher, time trackers and visual timers can be useful to 
help with transitions, improve time management and increase productivity within 
tasks. Time trackers (Fig. 4.3a) can be used to keep students on task and alert them 
to upcoming changes with both light and sound cues. Red visual timers 
(Fig. 4.3b) are helpful for showing the passage of time as the red disc disappears. These
c. Simple handwashing visual schedule: Turn on water and wet hands; Get soap 
and rub hands together; Rinse soap off; Dry hands 

d. Handwashing visual schedule broken down: Turn on water; Wet hands; Get soap; 
Rub hands together for 10 seconds; Rinse soap off; Turn off water; Get paper towel; 
Dry hands 

Fig. 4.2 (continued)
can be used with individual students on their desk or visual cues for the entire 
class.
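For classrooms with a projector or interactive whiteboard, the countdown logic itself is simple enough to script. The sketch below is a minimal illustration, not a tool from the chapter; the function name, messages, and five-second warning window are all assumptions.

```python
import time

def countdown(seconds, warn_at=(5, 4, 3, 2, 1), delay=1.0, announce=print):
    """Announce the last few seconds before a classroom transition.

    seconds - total seconds until the transition
    warn_at - seconds-remaining values at which to give a warning
    delay   - real seconds per tick (set to 0 for a silent dry run)
    Returns the list of announcements made, in order.
    """
    announcements = []
    for remaining in range(seconds, 0, -1):
        if remaining in warn_at:
            message = f"{remaining}..."
            announce(message)
            announcements.append(message)
        time.sleep(delay)
    announce("Time to line up!")
    announcements.append("Time to line up!")
    return announcements
```

Pairing the spoken count with an on-screen display gives students both the auditory and visual cues described above.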
4.5.6 Setting Events 
The foreshadowing example provided above depicts modifications in the environ-
ment that directly affect the behavior. In this example, the antecedent would be 
the teacher giving the one-minute warning, and the behavior would be the student 
coming in from recess. In contrast to antecedents that are part of the three-term 
contingency (ABC contingency), setting events influence behavior (Carter & Driscoll, 
2007), but may not directly affect it. For example, if a student does not sleep well
Fig. 4.3 a Time tracker® can be used to keep students on task using both light and sound cues. 
b Red visual timers are helpful to show the passage of time as the red disc disappears
and is fatigued while at school, they may put their head down on the desk when 
presented with a difficult worksheet, rather than completing the task. Here, lack 
of sleep does not directly determine whether an individual will engage in a spe-
cific behavior, such as putting their head down, but given a specific antecedent 
(presentation of a difficult worksheet), the student is more likely to engage in the 
behavior because of the setting event (lack of sleep). Conversely, if a preferred 
activity was presented, such as playing outside, the student may be less likely to 
put their head down. Setting events can make students more reactive to changes in 
the environment. 
An option to mitigate this behavior would be to use a communication log 
between the teacher and the parents. Teachers could ask parents via email or 
note if there are changes in the home setting that may influence school behavior. Of 
course, this would be cumbersome to do with every student, but we suggest it only 
for students who are exhibiting a change in behavior, different from their personal 
norm. Teachers can also use data sheets similar to the ABC data sheet mentioned 
in previous chapters to track antecedents, behaviors, and consequences in order 
to determine patterns (see Table 4.2 for an example). To determine if a setting 
event is affecting problem behaviors, it is suggested to keep track using these data 
sheets (or anecdotal notes) and look for patterns, such as every Monday the 
student is with one parent, or every Tuesday mom works the night shift, that 
may have an impact on behavior.
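If the data sheets are kept in a spreadsheet, the pattern-finding step can be automated. The sketch below is illustrative only: the record format (dictionaries whose keys mirror the checkbox categories of an ABC data sheet) and the function name are assumptions, not part of the chapter.

```python
from collections import Counter

def setting_event_patterns(records):
    """Count setting events that co-occur with logged problem behaviors.

    records - list of dicts with 'setting_events' and 'behaviors' lists,
              mirroring the checkbox categories on an ABC data sheet.
    Returns a Counter; the most frequent co-occurring setting event first.
    """
    tally = Counter()
    for record in records:
        if record.get("behaviors"):  # only records where a behavior was logged
            tally.update(record.get("setting_events", []))
    return tally

# Example: three observations across a week
log = [
    {"setting_events": ["tired"], "behaviors": ["calling out"]},
    {"setting_events": ["tired", "no breakfast"], "behaviors": ["task refusal"]},
    {"setting_events": ["change in schedule"], "behaviors": []},  # no behavior
]
```

A setting event that keeps appearing at the top of the tally (here, "tired") is a candidate pattern to discuss with parents.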
Setting events are important to consider when a student is displaying problem 
behaviors. Equally important, setting events should be considered when a student 
is performing particularly well. For example, if a student is typically noncompli-
ant when asked to do math worksheets, then one day completes the task without 
displaying any problem behaviors, setting events should be examined to possibly 
determine whether a stimulus change, such as a good night’s sleep, eating a large
Table 4.2 ABC datasheet with setting events 

Event Info: Time Start: ________ Time End: ________ Location: ________ 
Activity/Subject: ________ People Present: ________ 

Setting Events: ☐ Late to school ☐ Feeling sick ☐ Sick yesterday ☐ Didn’t eat 
breakfast ☐ Tired ☐ Change in schedule ☐ Parent said change in home 
☐ Medication changes ☐ Other ________ 

Antecedent: ☐ Low attention ☐ Removal of item/task: ________ ☐ Denied access 
of item/task: ________ ☐ Transition out of the room ☐ Transition between subjects 
☐ Change in routine ☐ Other: ________ 

Behavior: ☐ Hitting peers ☐ Throwing items ☐ Running away from teacher 
☐ Mouth noises/humming ☐ Task refusal ☐ Calling out 

Consequence: ☐ Task/item removed ☐ Task/item provided ☐ Attention provided 
☐ Child removed from class ☐ Coping strategy offered ☐ Emergency safety 
procedure used 
meal, seating changes, or alleviation of a headache could have influenced the stu-
dent’s level of compliance. A data sheet similar to Table 4.2 could be created, or 
anecdotal data collected, to examine these variables. Setting events can provide 
additional information on the likelihood of a given behavior and can help teachers 
predict a behavior’s occurrence based on events that precede the antecedent. 
In addition to looking at the setting events on a data sheet similar to Table 
4.2, the ABC data could be evaluated. Collecting and evaluating ABC data can 
be referred to as a functional behavior assessment (FBA). An FBA can be used 
to determine what typically happens before the behavior (antecedent), what the 
behavior looks like (behavior), and what follows the behavior (consequence) 
(Cooper et al., 2007; Jeong & Copeland, 2020). The results of an FBA can be used 
to determine the function of the behavior (why it’s occurring). Based on this infor-
mation, teachers can alter or enhance the environment to increase the likelihood 
of preferred behaviors occurring and decrease the likelihood of the occurrence of 
problem behaviors. 
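At its simplest, the "why is it occurring" step of an FBA is a tally of which consequence most often follows the behavior. The sketch below is a minimal illustration of that tally, not an FBA procedure from the chapter: the tuple format and function name are assumptions, and the category names follow the functions defined in the Chapter 3 glossary.

```python
from collections import Counter

def hypothesize_function(abc_records):
    """Suggest the most likely function of a behavior from ABC data.

    abc_records - list of (antecedent, behavior, consequence) tuples, where
                  consequence is one of: 'attention', 'escape', 'tangible',
                  'automatic' (the functions in Chapter 3's glossary).
    Returns (most frequent consequence category, its count).
    """
    tally = Counter(consequence for _, _, consequence in abc_records)
    return tally.most_common(1)[0] if tally else (None, 0)

# Example: five observations of calling out
data = [
    ("teacher asks question", "calls out", "attention"),
    ("teacher asks question", "calls out", "attention"),
    ("independent work", "calls out", "attention"),
    ("math worksheet given", "calls out", "escape"),
    ("teacher asks question", "calls out", "attention"),
]
```

Here the tally points toward attention as the hypothesized function, which would steer the teacher toward the attention-based strategies described earlier in the chapter.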
4.6 Summary 
As discussed, an antecedent is the event that occurs right before a 
behavior. Analyzing the events that occur right before a behavior, and taking 
setting events into consideration, can help teachers predict why a behavior is occur-
ring. Based on this prediction or hypothesis, teachers can then develop 
a proactive strategy to enhance or modify the environment to increase positive 
behaviors and reduce negative behaviors. Research has found that antecedent 
interventions such as the high-probability response sequence, antecedent exercise, 
providing choices, visual schedules, foreshadowing, and countdowns can help 
produce socially significant changes in behavior. 
References 
Amor, J. (2021). Cosmic kids yoga. www.cosmickids.com 
Baumeister, A. A., & MacLean, W. E., Jr. (1984). Deceleration of self-injurious and stereotypic 
responding by exercise. Applied Research in Mental Retardation, 5(3), 385–393. https://doi. 
org/10.1016/s0270-3092(84)80059-4 
Brosh, Fisher, L. B., Wood, C. L., & Test, D. W. (2018). High-probability request sequence: An 
evidence-based practice for individuals with autism spectrum disorder. Education and Training 
in Autism and Developmental Disabilities, 53(3), 276–286. 
Cannella-Malone, H. I., Tullis, C. A., & Kazee, A. R. (2011). Using antecedent exercise to decrease 
challenging behavior in boys with developmental disabilities and an emotional disorder. Jour-
nal of Positive Behavior Interventions, 13(4), 230–239. https://doi.org/10.1177/109830071140 
6122 
Carter, M., & Driscoll, C. (2007). A conceptual examination of setting events. Educational 
Psychology, 27(5), 655–673. https://doi.org/10.1080/01443410701309183
Cole, & Levinson, T. R. (2002). Effects of within-activity choices on the challenging behavior of 
children with severe developmental disabilities. Journal of Positive Behavior Interventions, 
4(1), 29–37. https://doi.org/10.1177/109830070200400106 
Cooper, J. O., Heron, T. E., & Heward, W. L. (2007). Applied behavior analysis (2nd ed.). Pearson 
Education Inc. 
Davis, Brady, M. P., Hamilton, R., McEvoy, M. A., & Williams, R. E. (1994). Effects of high-
probability requests on the social interactions of young children with severe disabilities. Jour-
nal of Applied Behavior Analysis, 27(4), 619–637. https://doi.org/10.1901/jaba.1994.27-619 
Dettmer, S., Simpson, R. L., Smith Myles, B., & Ganz, J. B. (2000). The use of visual supports 
to facilitate transitions of students with autism. Focus on Autism and Other Developmental 
Disabilities, 15(3), 163–169. https://doi.org/10.1177/108835760001500307 
Dooley, P., Wilczenski, F. L., & Torem, C. (2001). Using an activity schedule to smooth school 
transitions. Journal of Positive Behavior Interventions, 3(1), 57–61. https://doi.org/10.1177/109 
830070100300108 
Dunlap, G., DePerczel, M., Clarke, S., Wilson, D., Wright, S., White, R., & Gomez, A. (1994). 
Choice making to promote adaptive behavior for students with emotional and behavioral chal-
lenges. Journal of Applied Behavior Analysis, 27(3), 505–518. https://doi.org/10.1901/jaba. 
1994.27-505 
Folino, A., Ducharme, J. M., & Greenwald, N. (2014). Temporal effects of antecedent exercise 
on students’ disruptive behaviors: An exploratory study. Journal of School Psychology, 52(5), 
447–462. https://doi.org/10.1016/j.jsp.2014.07.002 
GoNoodle. Movement and mindfulness for kids. www.gonoodle.com 
Jeong, & Copeland, S. R. (2020). Comparing functional behavior assessment-based interventions 
and non-functional behavior assessment-based interventions: A systematic review of outcomes 
and methodological quality of studies. Journal of Behavioral Education, 29(1), 1–41. https:// 
doi.org/10.1007/s10864-019-09355-4 
Kern, L., Koegel, R. L., & Dunlap, G. (1984). The influence of vigorous versus mild exercise on 
autistic stereotyped behaviors. Journal of Autism and Developmental Disorders, 14(1), 57–67. 
https://doi.org/10.1007/BF02408555 
Kern, L., Koegel, R. L., Dyer, K., Blew, P. A., & Fenton, L. R. (1982). The effects of physical 
exercise on self-stimulation and appropriate responding in autistic children. Journal of Autism 
and Developmental Disorders, 12(4), 399–419. https://doi.org/10.1007/BF01538327 
Mace, F. C., Hock, M. L., Lalli, J. S., West, B. J., Belfiore, P., Pinter, E., & Brown, D. K. (1988). 
Behavioral momentum in the treatment of noncompliance. Journal of Applied Behavior Anal-
ysis, 21(2), 123–141. https://doi.org/10.1901/jaba.1988.21-123 
McGimsey, J. F., & Favell, J. E. (1988). The effects of increased physical exercise on disruptive 
behavior in retarded persons. Journal of Autism and Developmental Disorders, 18(2), 167–179. 
https://doi.org/10.1007/BF02211944 
Michael, J. (1982). Distinguishing between discriminative and motivational functions of stimuli. 
Journal of the Experimental Analysis of Behavior, 37(1), 149–155. https://doi.org/10.1901/jeab. 
1982.37-149 
Morrison, H., Roscoe, E. M., & Atwell, A. (2011). An evaluation of antecedent exercise on behav-
ior maintained by automatic reinforcement using a three-component multiple schedule. Journal 
of Applied Behavior Analysis, 44(3), 523–541. https://doi.org/10.1901/jaba.2011.44-523 
Pierce, J. M., Spriggs, A. D., Gast, D. L., & Luscre, D. (2013). Effects of visual activity sched-
ules on independent classroom transitions for students with autism. International Journal of 
Disability, Development and Education, 60, 253–269. https://doi.org/10.1080/1034912X.2013. 
812191 
Puddicombe, A. (2022). Headspace. www.headspace.com 
Riviere, V., Becquet, M., Peltret, E., Facon, B., & Darcheville, J. C. (2011). Increasing compli-
ance with medical examination requests directed to children with autism: Effects of a high-
probability request procedure. Journal of Applied Behavior Analysis, 44(1), 193–197. https:// 
doi.org/10.1901/jaba.2011.44-193 
Smith, M. A., & Tew, A. (2022). Calm. www.calm.com
5 Getting the Behavior You Want Using 
Reinforcement 
Julie A. Ackerlund Brandt, Mindy J. Cassano, and Tanya Hough 
5.1 Getting the Behavior You Want Using Reinforcement 
Knowing the behavior you want to see and knowing how to get it to happen 
can be two different skills. Oftentimes, someone knows what students should do, 
but it is the process of getting those behaviors to happen that can be the trickiest 
part. The key is quite simple: reinforcement. Don’t mistake simple for easy; using 
reinforcement is not always easy, but the principles break down to the same “sim-
ple” basics every time. The general principle is called differential reinforcement 
(DR) and it means to provide reinforcers, or rewards, for the behaviors that should 
happen again and not for those behaviors that shouldn’t (Hangen et al., 2020; 
Vladescu & Kodak, 2010; Vollmer & Iwata, 1992; Vollmer et al., 2020). Again, 
this definition is deceptively simplistic because there is a myriad of ways that this 
can be done, including shaping, demand fading, and specific DR procedures like 
differential reinforcement of other behavior (DRO), differential reinforcement of 
alternative behavior (DRA), differential reinforcement of incompatible behavior 
(DRI), among others (Jessel & Ingvarsson, 2016; Vladescu & Kodak, 2010). Any 
of these variations can work depending on your goals, and whether you are looking 
to replace a behavior you don’t want to see or teach a new behavior. 
5.1.1 Increasing Behaviors 
Most often, teachers are looking to teach. This includes imparting knowledge 
and new skills to their students. There are many reinforcement strategies to be
J. A. Ackerlund Brandt (B) · M. J. Cassano · T. Hough 
The Chicago School of Professional Psychology, Chicago, IL, USA 
e-mail: jbrandt@thechicagoschool.edu 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_5 
used within their teaching, but they vary in effectiveness. Understanding what 
reinforcers really are is key to this efficacy. In many instances, they are called 
rewards, prizes, etc., but something isn’t a reinforcer until it changes behavior, 
making it a functional (or working) reinforcer. Functional reinforcers may take 
some time to identify for some children, but 
the options break down into a few categories: attention, items, and escape. These 
groups encompass almost anything that you, or any other person, can provide that 
a child might want to behave in a certain way to obtain. 
5.1.1.1 Demand Fading 
Demand fading is often used with escape-maintained or task-avoidant behaviors 
(Piazza et al., 1996), but it is a procedure that can be used with any type of rein-
forcer (Ringdahl et al., 2002). The key components of these procedures are starting 
the student at a level at which they can be successful, and then slowly increasing the 
requirement until they reach the ultimate goal. One example might be working 
on multiplication. Memorizing multiplication tables can be very difficult for some 
children, which makes it a great skill for fading. By starting with a small goal, 
maybe five correct problems within 60 s, and then providing a reward upon reach-
ing that goal, the student will be able to build fluency (see Chap. 7 for a complete 
description) and confidence through their successful experiences. As the goals are 
increased and the student continues to reach them, growing success helps to build 
a strong foundation for learning and confidence in their ability to understand and 
complete difficult tasks. 
5.1.1.2 Shaping 
Another procedure that may help to enhance a skill is shaping. When using shap-
ing, you start with a behavior that your student can already do that is similar to 
a more complex and/or difficult version of the skill; then, you slowly begin to 
change the behavior that is required until you reach the goal (Athens et al., 2007). 
For example, in gym class a student may be working on a gymnastics skill 
that requires substantial balance and coordination. Instead of starting with that dif-
ficult and complex behavior, an easier skill such as standing with one foot slightly 
raised would be a better starting point. As each approximation is mastered and the 
student is able to progress toward the complex behavior, they are able to do so by 
optimizing successes and minimizing failures. 
5.2 Implementation 
Increasing new behaviors can be difficult, especially in a class of 20 or more 
children with varying levels of readiness. Understanding some of these proce-
dures for increasing skills gradually can be very important for helping some of the 
lower-level learners begin to catch up to others. They can be effortful though, and 
oftentimes it helps to have the parents’ support. In some cases, token systems can 
be used in demand fading (Ringdahl et al., 2002) or shaping (Howie & Woods,
1982) and the parents can be responsible for providing stronger or large-scale rein-
forcers. This is much easier and more efficient now with the use of technology. 
For example, using an app called Class Dojo©, a second-grade teacher can give 
students points during the day for task completion, and those points may correspond 
to the amount of time the student can play a video game after getting home from school. 
It’s a program that can be used for class-wide goals but is also used for setting up 
specific goals for students. 
5.2.1 Replacing Behaviors 
Another common challenge in classroom management may be the abundant occur-
rence of various problem behaviors. These behaviors can encompass behaviors that 
affect an individual student’s learning or multiple students’ learning; they may also 
range from small “annoying” behaviors to more aggressive or dangerous behav-
iors. There will always be behaviors that need to be decreased, but it is important 
to remember that for any behavior that is decreased, there should be another behav-
ior to take its place (Vollmer et al., 1993). Every behavior happens for a reason; 
therefore, if a behavior isn’t purposely replaced by another, the reason it occurred 
will still exist and another behavior may develop. 
Purposely replacing one behavior with another usually means using one of 
the three differential reinforcement procedures mentioned earlier: (a) differential 
reinforcement of other behaviors (DRO), (b) differential reinforcement of alternative 
behaviors (DRA), or (c) differential reinforcement of incompatible (DRI) behav-
iors (Boudreau et al., 2015; Vollmer et al., 2020). These are only three versions of 
differential reinforcement procedures but are likely the most useful in a classroom 
situation. 
5.2.1.1 Differential Reinforcement of Other Behavior 
Differential Reinforcement of Other Behavior (DRO) is the most flexible of the 
procedures discussed here. When using DRO, a timer of some sort is used to 
alert when delivery of a reward should occur, but only if any behavior other than 
the problem behavior is occurring (Rey et al., 2020). The time frame used in a 
DRO is short enough to observe something other than the problem behavior. This 
procedure might be useful during rest time because, sometimes, there will be a 
handful of students who will start talking or making noise when they are supposed 
to be quiet. If the problematic behaviors typically start after 5 min, you might plan 
to stop by their desks and provide praise for “being quiet” or “not talking” at 4 or 
even 4.5 min. If praise isn’t working, something like a token to trade in for some 
kind of prize after rest time might be a good option. 
Once the first time frame is successful, you can increase it from 4 to 5 min, then 
from 5 to 7, and continue increasing the time until all the children are remaining 
quiet for the entire time. They may not all be resting, some may be reading or 
playing with a fidget toy, for example, but they have all replaced their “noisy” 
behaviors with something quiet.
5.2.1.2 Differential Reinforcement of Alternative Behavior 
Differential Reinforcement of Alternative Behavior (DRA) is a little more inten-
sive than DRO, because you are looking to replace one behavior with another 
specific behavior (Petscher et al., 2009). When using DRA, try to remember that 
all behavior has a reason and may be a way to communicate something. For exam-
ple, a student who blurts out answers before anyone else can try to answer may be 
working to get your attention because they don’t get any at home. Instead of sim-
ply telling them not to speak out, teaching them to raise their hand would be 
an example of DRA because it is a specific behavior that replaces the original 
problem behavior. To capitalize on the power of attention, providing excited and 
enthusiastic attention when they raise their hand and answer as opposed to a calm 
reminder to raise their hand if they forget, may increase the efficacy. 
Instead of using different types of attention to prompt through the different 
behaviors, DRA may be implemented with a procedure called extinction. If extinc-
tion is used, then the problem behavior does not result in delivery of the reinforcer, 
only the target behavior earns that reward. Using the example from before, this will 
include ignoring the student if they blurt out an answer and only responding, and 
responding excitedly, when they raise their hand. Although this can be an effective 
way to implement the procedure, there are other side effects that might occur. For 
example, something called an extinction burst might happen, where the behavior 
would get worse. This might look like the student repeating the answer louder 
and louder. Another example would be variations in problem behavior, meaning 
the student might try new and similar behaviors, such as standing up, throwing 
their book, or swearing, to get your attention before they start using the planned 
alternative behavior of raising their hand. These issues can often be mitigated by 
talking with the student before class starts or by using the corrective prompting 
procedure mentioned initially. 
Like DRO, DRA should initially be implemented very often and can then be 
faded out. This may mean calling on that student every time they raise their 
hand for a while, but once they are no longer speaking out of turn, calling on 
them every other time they raise their hand. As the student remains successful, 
you can continue to stretch out how often you call on them relative to the other 
children. 
5.2.1.3 Differential Reinforcement of Incompatible Behavior 
Differential reinforcement of incompatible behavior (DRI) is the most intense of 
the versions of differential reinforcement described, but it may also be the most 
effective. When using DRI, rewards are only delivered after a behavior that is 
incompatible with the problem behavior, meaning they cannot happen at the same 
time (Kelly & Bushell, 1987). The reason this may be more successful than DRA 
is that the problem behavior cannot happen if the new behavior is occurring. In 
the DRA example, it would have been possible that the student could raise their 
hand and still blurt out the answer, which might extend the learning time. An 
example of DRI would be a student who reaches out and grabs items off their peers’ 
desks: a reward could be delivered whenever they are sitting with their hands folded 
in their lap or on their desk. Having their hands folded on their desk 
is incompatible with reaching onto their peers’ desks; therefore, increasing that 
behavior will decrease the problem behavior, potentially without having to call 
attention to the problem behavior. 
5.3 Summary 
Across all the differential reinforcement procedures, the key component is know-
ing the functional reinforcer for the problem behavior. Remember, every behavior 
has a reason (or function), and if you want to replace one behavior with another, 
you want to be sure that you’re using the same reason to increase that new behav-
ior. If a student breaks their pencils or rips worksheets to escape work and you 
try to praise task completion, the planned behavior change is not likely to occur. 
However, if the student is breaking their pencils or ripping worksheets so that you’ll 
reprimand them and bring them more materials, then providing attention immediately 
when they get started, and more often for task completion, will likely result in more 
behavior change. By understanding what your students are trying to tell you with 
their behaviors and keeping in mind that you always want to have something to 
replace a problem behavior, you are much more likely to use reinforcement-based 
procedures successfully. 
5.4 Application: Reinforcers Need to Be Worth It 
Having something “worth” working for is very important—especially when teach-
ing new skills. It would be wonderful if all children loved reading or math or 
science just for the sake of learning; however, sometimes the effort needed to learn 
these skills outweighs the natural enjoyment of the activity. This is when social 
reinforcers (meaning something delivered by another person) can be used. Social 
reinforcers can include praise, points, grades, treats, extra recess time, breaks from 
class, tickets, and the list goes on; it’s practically endless, limited only by your 
creativity. 
5.4.1 Attention as a Reinforcer 
Many times, when talking about reinforcers, people think of edible reinforcers. 
Although these can be used, there are many types of attention that can be used just 
as successfully as edibles or toys. Attention can encompass many types ranging 
from smiles and winks to simple praise statements to physical interactions like 
high five or hugs (Harper et al., 2021). Using attention as a reinforcer is common, 
partially because it is readily available and relatively easy to deliver because it 
does not require proximity; meaning a teacher may be across the room and praise
a student’s behavior just as effectively as if they were standing next to the student 
(Floress et al., 2018). 
One of the most used types of attention is praise; however, there are many ways 
that praise can be varied to enhance effectiveness. For example, delivery can make 
a huge difference in the value of praise. Imagine teaching a class full of students 
and, when one answers a question, responding with a small smile and a flat “Yep, 
that’s right” with no change in tone or volume; this may not be effective. It might 
be difficult to even imagine that scenario because it seems counterintuitive; however, this 
happens very often in classrooms when there are a lot of questions being asked and 
time is short. It is important to always try to make your praise as enthusiastic as 
possible to increase the likelihood that it will be a desirable outcome and reinforce 
the behavior you’d like to see again (Clausen et al., 2007; Weyman & Sy, 2018). 
Another variation of praise is based on the content of the statements. General 
praise includes statements which are basic and typically short; some examples 
include “good job” or “that’s awesome.” These statements are easy to remember 
and quickly delivered; however, putting a little extra information in the statement 
can increase the efficacy of it as a reinforcer and aid in learning. Behavior-specific 
praise takes the basic praise statement and enhances it by adding in a description 
of the behavior being praised; for example, “nice job writing your name” or “I like 
how you raised your hand quietly.” This is a simple way to help clarify exactly 
what behavior earned praise and enhance the likelihood that the specific behavior 
will happen again (Polick et al., 2012). 
5.4.2 Strength of Reinforcers 
Not only are reinforcers varied and many, but they can differ wildly across chil-
dren. This is an important consideration because sometimes things that “everyone 
likes” won’t work for some children. For example, praise is often delivered in class 
when a student answers a question correctly. If that student continues to answer 
questions, or even better, tries to answer more questions, this would be an example 
of reinforcement; however, if that student doesn’t raise their hand to answer ques-
tions or offer answers, praise likely isn’t a reinforcer so something else like a token 
or high five might work better. Another example is when children are sent into the 
hall or to the principal’s office. The intention is often to punish—or decrease— 
whatever behavior happened prior to being sent out of the room; however, it may 
actually reinforce the behavior by allowing the child to escape classwork. Some-
times, when a task is especially difficult, the strongest reinforcer is to get away 
from it, so providing a break for trying and completing some of the assignment 
might be a great idea. As the skill becomes easier, it is always possible to increase 
the amount needed to take a break.
5.5 Preference Versus Reinforcement 
In behavior analytic instruction, preference and reinforcement are two different 
things. As a teacher, it is important to identify the different aspects in your class-
room. For example, preferred items are those that we like, enjoy, and may choose 
to have/interact with; whereas reinforcers are those that actually change our behav-
ior. There are many things that one may like, but only certain things will change 
one’s actions in specific situations. Preference assessments aim to identify the most 
preferred items for a person or group of people at the time of the assessment 
(Hagopian et al., 2001). For example, have you ever asked your class what they 
want to earn if the entire class meets an expectation, maybe extra recess time or a 
night of no homework? Once the class comes to an agreement, you have evaluated 
which reinforcer or reward is most preferred by the largest group of the class. 

Preference assessments can be conducted as frequently as needed to identify 
items that a person likes. At times, we may draw conclusions from these assess-
ments and say, if this is their most preferred item, it is likely to also function as a 
reinforcer. The item is then used as a reinforcer and behavior change is measured. 
If the behavior changes in the intended direction (i.e., increase for skill acquisi-
tion, decrease for behavior reduction), the item can be said to be functioning as a 
reinforcer (Hagopian et al., 2001). 
5.6 Summary 
Most of the examples provided are of individual behavior problems or reinforce-
ment considerations, but it is important to keep in mind that it is possible to 
use reinforcement on a large scale across all the students in your class, or even 
throughout an entire school. Examples include group contingencies (see Chap. 7) 
and positive behavior intervention support (see Chap. 6), which are detailed in later chapters. 
These larger-scale programs are based on the “simple” procedure explained in this 
chapter, and all can be broken down and considered based on the goal you’re 
working toward. 
References 
Athens, E. S., Vollmer, T. R., & St. Peter Pipkin, C. C. (2007). Shaping academic task engagement 
with percentile schedules. Journal of Applied Behavior Analysis, 40(3), 475–488. https://doi. 
org/10.1901/jaba.2007.40-475 
Boudreau, B. A., Vladescu, J. C., Kodak, T. M., Argott, P. J., & Kisamore, A. N. (2015). A com-
parison of differential reinforcement procedures with children with autism. Journal of Applied 
Behavior Analysis, 48(4), 918–923. https://doi.org/10.1002/jaba.232 
Clausen, K. A., Alden-Anderson, E., Stephenson, K., Mueller, A., & Klatt, K. P. (2007). The effec-
tiveness of enthusiasm on skill acquisition by children with autism. Journal of Speech and 
Language Pathology and Applied Behavior Analysis, 2, 32–45.
Floress, M. T., Jenkins, L. N., Reinke, W. M., & McKown, L. (2018). General education teach-
ers’ natural rates of praise: A preliminary investigation. Behavioral Disorders, 43(4), 411–422. 
https://doi.org/10.1177/0198742917709472 
Hagopian, L. P., Rush, K. S., Lewin, A. D., & Long, E. S. (2001). Evaluating the predictive validity 
of a single stimulus engagement preference assessment. Journal of Applied Behavior Analysis, 
34(4), 475–485. https://doi.org/10.1901/jaba.2001.34-475 
Hangen, M. M., Romero, A. N., Neidert, P. L., & Borrero, J. C. (2020). “Other” behavior and the 
DRO: The roles of extinction and reinforcement. Journal of Applied Behavior Analysis, 53(4), 
2385–2404. https://doi.org/10.1002/jaba.736 
Harper, A. M., Dozier, C. L., Briggs, A. M., Diaz de Villegas, S., Ackerlund Brandt, J. A., & Jowett 
Hirst, E. S. (2021). Preference for and reinforcing efficacy of different types of attention in 
preschool children. Journal of Applied Behavior Analysis, 54, 882–902. 
Howie, P. M., & Woods, C. L. (1982). Token reinforcement during the instatement and shaping 
of fluency in the treatment of stuttering. Journal of Applied Behavior Analysis, 15(1), 55–64. 
https://doi.org/10.1901/jaba.1982.15-55 
Jessel, J., & Ingvarsson, E. T. (2016). Recent advances in applied research on DRO procedures. 
Journal of Applied Behavior Analysis, 49(4), 991–995. https://doi.org/10.1002/jaba.323 
Kelly, M. B., & Bushell, D. J. (1987). Student achievement and differential reinforcement of 
incompatible behavior: Hand raising. Psychology in the Schools, 24(3), 272–281. https://doi. 
org/10.1002/1520-6807(198707)24:3%3C273::AID-PITS2310240312%3E3.0.CO;2-1 
Petscher, E. S., Rey, C., & Bailey, J. S. (2009). A review of empirical support for differential rein-
forcement of alternative behavior. Research in Developmental Disabilities, 30(3), 409–425. 
https://doi.org/10.1016/j.ridd.2008.08.008 
Piazza, C. C., Moes, D. R., & Fisher, W. W. (1996). Differential reinforcement of alternative behav-
ior and demand fading in the treatment of escape-maintained destructive behavior. Journal of 
Applied Behavior Analysis, 29, 569–572. 
Polick, A. S., Carr, J. E., & Hanney, N. M. (2012). A comparison of general and descriptive praise 
in teaching intraverbal behavior to children with autism. Journal of Applied Behavior Analysis, 
45, 593–599. 
Rey, C. N., Betz, A. M., Sleiman, A. A., Kuroda, T., & Podlesnik, C. A. (2020). The role of 
adventitious reinforcement during differential reinforcement of other behavior: A systematic 
replication. Journal of Applied Behavior Analysis, 53(4), 2440–2449. https://doi.org/10.1002/ 
jaba.678 
Ringdahl, J. E., Kitsukawa, K., Andelman, M. S., Call, N., Winborn, L., Barretto, A., & Reed, G. 
K. (2002). Differential reinforcement with and without instructional fading. Journal of Applied 
Behavior Analysis, 35(3), 291–294. https://doi.org/10.1901/jaba.2002.35-291 
Vladescu, J. C., & Kodak, T. (2010). A review of recent studies on differential reinforcement dur-
ing skill acquisition in early intervention. Journal of Applied Behavior Analysis, 43, 351–355. 
https://doi.org/10.1901/jaba.2010.43-351 
Vollmer, T. R., & Iwata, B. A. (1992). Differential reinforcement as treatment for behavior dis-
orders: Procedural and functional variations. Research in Developmental Disabilities, 13, 
393–417. https://doi.org/10.1016/0891-4222(92)90013-V 
Vollmer, T. R., Iwata, B. A., Zarcone, J. R., Smith, R. G., & Mazaleski, J. L. (1993). The role 
of attention in the treatment of attention-maintained self-injurious behavior: Noncontingent 
reinforcement and differential reinforcement of other behavior. Journal of Applied Behavior 
Analysis, 26, 9–21. 
Vollmer, T. R., Peters, K. P., Kronfli, F. R., Lloveras, L. A., & Ibañez, V. F. (2020). On the defini-
tion of differential reinforcement of alternative behavior. Journal of Applied Behavior Analysis, 
53(3) 
Weyman, J. R., & Sy, J. R. (2018). Effects of neutral and enthusiastic praise on the rate of discrim-
ination acquisition. Journal of Applied Behavior Analysis, 51, 335–344.
Part II 
Group Systems 
Julie A. Ackerlund Brandt 
Now that we have provided you with the building blocks, we will introduce 
some of the most common group-based procedures, which will be helpful for 
most—and potentially all—students in your classrooms! We will start off with 
a description of positive behavior intervention support (PBIS; Goh & Bambara, 
2012), which is one of the most widely used behavior management systems in the 
USA. Many schools use this three-tiered system to encourage positive behaviors 
and discourage negative behaviors (Pinkelman & Horner, 2019). There is a strong 
focus on promoting those positive behaviors using many of the procedures dis-
cussed in Chaps. 4 and 5. It is also a perfect introductory chapter to lead into the 
many procedures that may be applied to a small or large group! 
Whereas PBIS is a system that needs to be implemented with a large group, 
some of the systems in this part can work with a smaller group or even individually. 
For example, working on fluency can be done with one student or can be worked 
on by everyone in the class. Chapter 8 will explain the benefits of fluency and how 
it might be applied in different scenarios. There are also TAGteach, personalized 
systems of instruction, equivalence-based instruction, and precision teaching, which 
can be developed on an individual level or implemented across multiple students. 
At the core of all these systems is the importance of individualized goals and 
progress. Using any of these systems, you have the power to meet each student at 
their level and help them maximize their abilities and reach higher goals. 
Moving to the systems that are typically implemented with larger groups, you 
will read about group contingencies (Groves et al., 2023) and direct instruction 
(Twyman, 2021). “Group contingencies” may be a new term to you, but the concept 
likely isn’t. Much of the research on group contingencies uses the name the 
“Good Behavior Game” to highlight the importance of the behavior(s) you want 
to see happen again and again. In Chap. 7, you will be introduced to the various 
contingency set-ups that you can use, depending on the behavior(s) of interest 
and how many students you are working with. These procedures are often easy to 
implement and fun for teachers and students alike! 
Another system devised for larger groups, and one you may already be familiar 
with, is direct instruction (DI). DI was tested on a large scale in the 1970s, in 
the federally funded Project Follow Through evaluation, and was shown to be the 
most effective of the group-based instructional systems studied. 
Over the past 50 years, there have been multiple iterations and modifications made 
to the initial iteration of this system. There are many different programs that can
be implemented across subjects and with interesting and creative adaptations. DI 
can help learners who may be struggling to catch up to their peers through sys-
tematic and methodical presentations of various skills and is a wonderful example 
of using ABA with groups of children (Mason & Otero, 2021). Overall, this part 
of the textbook is meant to be a bridge from the introductory section, with its 
basic terms and principles, to the last part, which describes some of the most 
successful settings in which these systems have been implemented. In this part, 
all the basics previously described should fall into place and get you excited to 
read about larger, even school-wide, success stories! 
References 
Goh, A. E., & Bambara, L. M. (2012). Individualized positive behavior support in school settings: A meta-analysis. Remedial and Special Education, 33(5), 271–286. https://doi.org/10.1177/0741932510383990
Groves, E. A., Najafichaghbouri, M., Seel, C. J., Fisher, S., Thomas, C., & Joslyn, P. R. (2023). A systematic review of group contingencies in alternative education settings. Education and Treatment of Children. Advance online publication. https://doi.org/10.1007/s43494-023-00095-9
Mason, L., & Otero, M. (2021). Just how effective is direct instruction? Perspectives on Behavior Science, 44(2), 225–244. https://doi.org/10.1007/s40614-021-00295-x
Pinkelman, S. E., & Horner, R. H. (2019). Applying lessons from the teaching-family model: Positive behavioral interventions and supports (PBIS). Perspectives on Behavior Science, 42(2), 233–240. https://doi.org/10.1007/s40614-019-00199-x
Twyman, J. S. (2021). The evidence is in the design. Perspectives on Behavior Science, 44(2), 195–223. https://doi.org/10.1007/s40614-021-00309-8
6 Positive Behavior Intervention System 
Gena Pacitto 
6.1 Overview 
Positive Behavior Intervention Support (PBIS) is a research-based, three-tiered, 
system-wide intervention designed to promote learning outcomes and prevent and 
decrease problem behaviors (Sugai & Horner, 2006). This system is grounded 
in three principles: promoting evidence-based practice, supporting change at the 
systems-level, and promoting change and effective practice that are sustained over 
time (Sugai & Horner, 2006). “The foundation for school wide PBIS lies in the 
application of these features to the whole school context in an effort to prevent, 
as well as change, patterns of problem behavior” (Horner & Sugai, 2005, p. 360). 
Prevention is a main tenet of PBIS and is implemented across three tiers from a 
public health perspective with a variety of interventions designed to prevent prob-
lem behavior (primary) and reduce the impact and frequency of problem behavior 
(secondary and tertiary; Sugai & Horner, 2006). Figure 6.1 is a visual representa-
tion of the tiers that will later be discussed in detail. This chapter will provide 
you with background, implementation, and application of PBIS in the general 
education setting.
Positive Behavior Support (PBS), the original framework prior to the concep-
tualization of PBIS, was developed to improve the quality of life for students with 
severe, challenging behavior and developmental disabilities (Carr et al., 2002). 
PBS has a solid foundation in behavioral science, specifically, applied behavior 
analysis (Sugai & Horner, 2002). Following the implementation of amendments 
to the Individuals with Disabilities Education Act (IDEA ’97) and growing 
awareness of the need for more support for individuals with disabilities, PBS was
G. Pacitto (B) 
The Chicago School of Professional Psychology, Chicago, IL, USA 
e-mail: gpacitto1@thechicagoschool.edu 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_6 
61
Fig. 6.1 Three-tiered prevention continuum of PBIS
extended to multiple settings including home, school, and community, students 
without disabilities, and varied ranges of social and academic problem behav-
iors. After the PBS model was more widely disseminated in public schools, it 
was renamed the Positive Behavior Intervention System (PBIS) in 2004. PBIS 
has become increasingly widespread in recent years, with approximately 25,000 
schools in the United States implementing the interventions (PBIS, 2023). 
6.2 PBIS Framework in Schools-SW-PBIS 
PBIS is not a formal curriculum. Instead, it is a framework by which school 
districts and individual schools can adopt organizational practices that estab-
lish effective and preventative behavior interventions (Horner et al., 2014). PBIS 
encompasses core features that are drawn from decades of systematic research in 
education, mental health, and behavior analysis (Biglan, 1995). This chapter will 
describe the implementation of PBIS both school-wide and in the general educa-
tion classroom. After reading this chapter, you will have a better understanding 
of this three-tiered system and how you can use PBIS to promote more effective 
classroom management. 
It is important to keep in mind that PBIS is not a curriculum that can be pur-
chased, and it cannot be effectively implemented following a one- or two-day
training. It involves commitment from the entire school staff to promote change at 
the systems level. If a commitment to this level of change is made, though, you 
will see your students improve both socially and academically (PBIS.org). 
Implementation of PBIS in a school begins with self-assessment to determine 
what elements are currently in place and to select implementation that builds upon 
the strengths that are already present in the organization. One example of this type of assessment is the School-Wide Evaluation Tool (SET). The goal of the 
SET is to assess the features already in place, determine annual goals for the 
school building, evaluate on-going efforts in supporting positive behavior, design 
and revise procedures in place, and compare results from year to year (Mercer 
et al., 2017). Following initial assessment, the school team will typically receive 
three to six days of training and support from a coach (Scott & Martinek, 2006). 
Following coaching and throughout implementation, teams are expected to collect 
ongoing data on the fidelity of their initial application of the system. These fidelity 
measures are also in place to maintain sustainability of the procedures. Examples 
of coaching and treatment integrity/fidelity data procedures are provided later in 
this chapter. 
As mentioned above, PBIS is characterized by three tiers: Tier 1 (universal), 
Tier 2 (targeted), and Tier 3 (intensive). Each of these tiers is aligned with the 
types of support the specific student or group of students need. 
6.2.1 Tier 1 
Tier 1 is the universal level, which means that support is provided to the entire 
student body in the school. There are core features that are specific to Tier 1 (see 
Table 6.1) including (a) the commitment of staff to proactive strategies and a for-
mal approach to discipline; (b) positively stated behavioral expectations that are 
identified at implementation; (c) the teaching of those expectations in multiple set-
tings across the school day; (d) a reward system that acknowledges those students 
who display the expectations; (e) specific consequences for students who display 
problem behavior; (f) data collection to drive decision-making (Sugai & Horner, 
2009).
Below is a summary of the practices and systems included in Tier 1 (PBIS.org). 
Usually there are three to five behavioral expectations taught schoolwide. They 
are developed by the school-based PBIS team. Following teaching, modeling, 
and practice, students can earn tokens or tickets for rewards. Often school-based 
rewards include a school store, assemblies, or fun days (McDaniel et al., 2015). 
Figure 6.2 is an example of an incentive that may be earned in Tier 1.
If Tier 1 is implemented with fidelity, it is expected that 80% of the student 
body will have their needs met by the universal system (Lewis & Sugai, 1999). 
PBIS teams are encouraged to self-assess for fidelity to monitor implementation 
and identify next steps (Mercer et al., 2017). Typically, PBIS teams are composed 
of different staff members from the building including teachers, paraprofessionals, 
support staff, and administrators. These staff members often volunteer to be a part
Table 6.1 Tier 1 system and practices 
Tier 1 foundational systems Tier 1 practices 
An established leadership team School-wide positive expectations and 
behaviors are taught 
Regular meetings Established classroom expectations aligned 
with school-wide expectations 
A commitment statement for establishing a 
positive school-wide social culture 
A continuum of procedures for encouraging 
expected behavior 
On-going use of data for decision making A continuum of procedures for discouraging 
problem behavior 
Professional development plans Procedures for encouraging school-family 
partnership 
Personnel evaluation plan
Fig. 6.2 Sample Tier 1 reward
of the PBIS teams. Some districts also hire outside PBIS consultants or “coaches”. 
There are a variety of fidelity surveys that are widely used to assess Tier 1 imple-
mentation. Fidelity of implementation and data collection facilitates data-based 
decision making, maintenance, and sustainability of PBIS (Mercer et al., 2017). 
The students who do not make adequate progress with the support provided at the
universal level could be eligible for Tier 2 intervention (McDaniel et al., 2015). 
Eligibility for Tier 2 services is determined by collaboration of the PBIS team and 
school administrators. 
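As a rough illustration of this data-based screening logic, the decision rule might be sketched as follows. This is a minimal sketch, not part of the PBIS framework itself: the use of office discipline referral counts and the threshold of three referrals are hypothetical examples; real PBIS teams set their own criteria collaboratively.

```python
# Hypothetical sketch of Tier 2 screening from universal-level data.
# The data source (office discipline referrals) and the threshold value
# are illustrative assumptions, not values prescribed by PBIS.

def flag_for_tier2(referrals_by_student, threshold=3):
    """Return, alphabetically, students whose referral count meets or
    exceeds the threshold and who may warrant Tier 2 discussion."""
    return sorted(
        student for student, count in referrals_by_student.items()
        if count >= threshold
    )

referrals = {"Ana": 0, "Ben": 4, "Cam": 1, "Dee": 3}
print(flag_for_tier2(referrals))  # ['Ben', 'Dee']
```

In practice a flagged list like this would be one input to the PBIS team's conversation, not an automatic placement decision.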
6.2.2 Tier 2 
The supports that characterize PBIS Tier 2 (see Table 6.2) usually apply to 
10–15% of the students in a school. These supports are usually provided to small 
groups, and they are monitored more regularly when compared to Tier 1. Below 
is a summary of the practices and systems included in Tier 2 (PBIS.org). 
Monitoring and data collection are used to inform further interventions and 
to determine responsiveness to the universal supports and incentives (McDaniel 
et al., 2015). These could be in the form of data collection on frequency of tar-
get behaviors in the classroom or review of discipline records. The interventions 
may include social skills groups, strategies targeting self-regulation, and “check-
in/check-out” (CICO), which is a widely used example of a Tier 2 intervention. 
CICO has also been referred to as the Behavior Education Program (Wolfe et al., 
2016). When implementing CICO, the student is provided increased opportunities 
for positive interaction with adults, in addition to frequent feedback. This process 
involves the target student meeting with an assigned adult mentor every morning 
for “check-in”. During this brief meeting, the mentor may review school rules, 
set a goal collaboratively with the student, and provide the student with a daily 
progress report (See Fig. 6.3). At the end of the school day (check-out) the mentor 
will review the progress report with the student and possibly deliver an incentive 
for goals met. Mentors can be identified as an individual with whom this student 
already has a rapport, for example a special area teacher or previous teacher. Men-
tors are often discussed and chosen at PBIS team meetings (Wolfe et al., 2016). A 
sample of a CICO log is below.
Table 6.2 Tier 2 systems and practices 
Tier 2 foundational systems Tier 2 practices 
An intervention team with a coordinator Increased instruction and practice with 
self-regulation and social skills 
Behavioral expertise Increased adult supervision 
Fidelity and outcome data are collected Increased opportunities for positive 
reinforcement 
A screening process to identify students 
needing Tier 2 support 
Increased pre-corrections 
Access to training and technical assistance Increased focus on possible function of 
problem behaviors 
Increased access to academic supports 
Check In/Check Out Log 

Check In:  Date: ______  Time: ______  Initials: ______ 
Check Out: Date: ______  Time: ______  Initials: ______ 
Fig. 6.3 Check-in/Check-out log 
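The check-in/check-out cycle described above can be sketched as a simple daily calculation. This is only an illustrative sketch: the 0–2 rating scale per class period and the 80% goal are hypothetical values, chosen for the example rather than fixed features of CICO.

```python
# Minimal sketch of one CICO day: teachers rate each class period,
# and at check-out the mentor compares points earned to the day's goal
# to decide whether the incentive is delivered.
# The 0-2 per-period scale and 80% default goal are assumptions.

def check_out(period_ratings, goal_percent=80):
    """Return (percent_earned, goal_met) for a day's ratings, each 0-2."""
    earned = sum(period_ratings)
    possible = 2 * len(period_ratings)
    percent = 100 * earned / possible
    return percent, percent >= goal_percent

ratings = [2, 2, 1, 2, 2]          # five class periods
percent, earned_incentive = check_out(ratings)
print(percent, earned_incentive)   # 90.0 True
```

The same arithmetic is what the mentor does by hand on the daily progress report before deciding whether the goal was met.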
6.2.3 Tier 3 
Some students, approximately 5% of the school population, do not respond to the 
support in the universal and targeted tiers. These students may require the more 
intensive and individualized support available in Tier 3 (see Table 6.3) due to their 
complex and extensive behavior problems and histories (McDaniel et al., 2015). 
Below is a summary of the practices and systems included in Tier 3 (PBIS.org). 
The interventions developed in Tier 3 are function-based and often include 
comprehensive functional behavior assessments (FBA) that are conducted prior 
to developing treatment plans. Data is gathered from various school personnel and 
through direct observation and multiple methods are used to develop a multicom-
ponent intervention. The FBA process is both time-consuming and labor-intensive, 
and these students often need other Tier 3 supports, such as mental health counsel-
ing. Tier 3 supports require multiple professionals and resources, so it is imperative 
that schools make efforts to implement effective and consistent Tier 2 interventions 
to ensure appropriate allocation of resources (McDaniel et al., 2015).
Table 6.3 Tier 3 systems 
and practices 
Tier 3 foundational systems Tier 3 practices 
A multi-disciplinary team Function-based assessments 
Behavior support expertise Wraparound supports 
Formal fidelity and outcome data 
are collected 
Cultural and contextual fit 
6.3 Implementation 

PBIS has been successfully implemented across seven different states (Horner 
et al., 2014). PBIS state coordinators from Colorado, Florida, Illinois, Maryland, 
Missouri, North Carolina, and Oregon completed the State Implementation and 
Scaling Survey (SSIS). The data generated from the survey included the number 
of schools in the state implementing PBIS and the timeline of the implementation 
of different stages of PBIS. All of the states were able to establish scaling (larger 
implementation across more schools) of PBIS due to administrative support, 
expertise in training, and the availability of trained coaches; each achieved 
large-scale implementation by establishing training, coaching, and technical 
expertise at the local level (Horner et al., 2014). 
The goal when facilitating implementation of PBIS is the achievement of effec-
tive school-wide behavior support, not only for students, but for all members of 
the school community. Effective implementation of Tier 1 supports school-wide 
will lead to positive outcomes for the school related to both behavior and overall 
school climate (Bradshaw et al., 2012). Several studies suggest that schools that 
implement PBIS have more positive relationships among staff (Bradshaw et al., 
2008) and significantly fewer suspensions (Bradshaw et al., 2010). 
6.4 Application 
You may be wondering: how do I apply these strategies in my own classroom, 
especially with all the other expectations placed on you? Good news: you are probably already 
implementing Tier 1 and Tier 2 supports without even realizing it! The following 
section will provide you with some examples and tools for the most common 
behavior issues you may be struggling with in your own classroom. 
6.4.1 Tier 1 Supports 
You may have some students who will do anything to avoid completing an 
academic task. There are some antecedent (before the behavior occurs) and con-
sequent (after the student engages in the desired behavior) interventions that you 
can use, and may already be using, in your classroom (Fig. 6.4).
Breaks are a quick, easy-to-implement option for students who struggle with 
completing tasks or become easily overwhelmed by longer academic tasks. 
Breaks can be prompted, or a student can be given a specific number of break 
cards to use throughout the day. Benefits of giving students the option to take 
multiple breaks include increasing motivation, avoiding power struggles, and 
helping the student refocus and refresh. Breaks provide students with the escape they are 
the student refocus and refresh. Breaks provide students with the escape they are 
seeking, without them engaging in inappropriate behavior to access that escape 
(Tiger et al., 2008).
Fig. 6.4 Example of a break 
card
Class-wide reward systems promote positive behavior in the classroom and 
should be a part of every general education classroom. A visual representation 
of the system, like the one below, provides tangible indicators of the students’ 
success. In this example, students may earn pom-poms for prosocial behaviors 
and win a class-wide reward after a certain criterion is met (Fig. 6.5). 
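A class-wide system like this reduces to a running count against a criterion. The sketch below is a hypothetical illustration of that logic; the criterion value, and the idea of a `ClassJar` object, are assumptions for the example, not part of any published PBIS material.

```python
# Sketch of a class-wide reward counter: tokens (e.g., pom-poms)
# accumulate for prosocial behaviors, and the class earns the reward
# once the criterion is met. The criterion value is illustrative.

class ClassJar:
    def __init__(self, criterion=30):
        self.criterion = criterion
        self.count = 0

    def add(self, n=1):
        """Add n tokens; return True once the reward criterion is reached."""
        self.count += n
        return self.count >= self.criterion

jar = ClassJar(criterion=5)
results = [jar.add() for _ in range(5)]
print(results)  # [False, False, False, False, True]
```

The visible jar in the classroom serves the same function as `results` here: students can see exactly how close the group is to the reward.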
You may have a student or group of students who struggle with organization, 
transitions, and following routines. A visual schedule is another Tier 1 support 
that can be easily implemented and can especially benefit younger students.
Fig. 6.5 Example of a 
class-wide incentive system 
Fig. 6.6 Sample visual schedule 
Visual schedules promote independence and responsibility and can increase on-
task behavior. Below is an example of a visual schedule that can be used for 
individual students or posted somewhere visible in the room for the whole class 
(Horner et al., 2005) (Fig. 6.6). 
6.4.2 Tier 2 Supports 
As mentioned earlier in this chapter, you may have students in your classroom 
who require more intensive, individualized support. Several examples are shared 
below, including individual behavior contracts and reward/token boards. When 
students are struggling to adhere to expectations or complete academic tasks in 
your classroom, a reward system or contract can provide consistency and increase 
motivation and effort. These do not need to be time-consuming: simply set a goal 
with the student and determine a reinforcer for them to access when they have 
successfully met that goal. Be sure the expectation is reasonable: you want them 
to “touch” that reinforcement and feel successful! (Figs. 6.7 and 6.8).
6.5 Summary 
PBIS is a three-tiered, research-based framework for both prevention and intervention 
in the school setting. The literature supports the effectiveness of PBIS, but 
it requires both accurate and consistent implementation to be successful. Ideally, 
administration will take an active role in guiding and monitoring their staff 
to ensure that consistency is achieved. If the team collaborates and administration 
guides and supports their staff, PBIS has the potential to be both effective and
Fig. 6.7 Example of a behavior contract 
Fig. 6.8 Sample token board
self-sustaining (Evanovich & Scott, 2016). While you can use these supports in 
your classroom even if your school doesn’t use a PBIS framework, PBIS was 
developed as a system-wide approach. There are several steps you can take to 
kickstart implementation in your own building. 
1. Download the PBIS Implementation Blueprint (can be found on PBIS.org) 
2. Meet with leadership team/administration 
3. Set school-wide goals/determine a purpose for school-wide change 
4. Contact the local or state coordinator. Every state has a team that can help a 
school get started with PBIS. (Links can be found on PBIS.org). 
After reading this chapter, you have hopefully walked away with tools to promote 
evidence-based strategies to assist with your own classroom management. PBIS is 
a system-wide approach, though, that when implemented effectively, can improve 
not only overall student achievement, but the climate and relationships in your 
organization. 
References 
Biglan, A. (1995). Changing cultural practices: A contextualistic framework for intervention research. Context Press.
Bradshaw, C. P., Koth, C. W., Bevans, K. B., Ialongo, N., & Leaf, P. J. (2008). The impact of school-wide positive behavioral interventions and supports (PBIS) on the organizational health of elementary schools. School Psychology Quarterly, 23(4), 462–473. https://psycnet.apa.org/doi/10.1037/a0012883
Bradshaw, C. P., Mitchell, M. M., & Leaf, P. J. (2010). Examining the effects of schoolwide positive behavioural interventions and supports on student outcomes: Results from a randomized controlled effectiveness trial in elementary schools. Journal of Positive Behavior Interventions, 12(3), 133–148. https://doi.org/10.1177/1098300709334798
Bradshaw, C. P., Waasdorp, T. E., & Leaf, P. J. (2012). Effects of school-wide positive behavioral interventions and supports on child behavior problems. Pediatrics, 130(5), e1136–e1145. https://doi.org/10.1542/peds.2012-0243
Carr, E. G., Dunlap, G., Horner, R. H., Koegel, L. K., & Fox, L. (2002). Positive behavior support: Evolution of an applied science. Journal of Positive Behavior Interventions, 4, 4–20. https://doi.org/10.1177/109830070200400102
Center on PBIS. (2023). Positive behavioral interventions and supports (website). https://www.pbis.org
Evanovich, L. L., & Scott, T. M. (2016). Facilitating PBIS implementation: An administrator’s guide to presenting the logic and steps to faculty and staff. Beyond Behavior, 25, 4–8. https://doi.org/10.1177/107429561602500102
Horner, R. H., Sugai, G., Todd, A. W., & Lewis-Palmer, T. (2005). School-wide positive behavior support. In L. Bambara & L. Kern (Eds.), Individualized supports for students with problem behaviors: Designing positive behavior support plans (pp. 359–390). Guilford Press.
Horner, R. H., Kincaid, D., Sugai, G., Lewis, T., Eber, L., Barrett, S., & Johnson, N. (2014). Scaling up school-wide positive behavioral interventions and supports: Experiences of seven states with documented success. Journal of Positive Behavior Interventions, 16, 197–208. https://doi.org/10.1177/1098300713503685
Lewis, T. J., & Sugai, G. (1999). Effective behavior support: A systems approach to proactive schoolwide management. Focus on Exceptional Children, 31, 1–24. https://doi.org/10.17161/foec.v31i6.6767
McDaniel, S. C., Bruhn, A. L., & Mitchell, B. S. (2015). A tier 2 framework for behavior identification and intervention. Beyond Behavior, 24, 10–17. https://doi.org/10.1177/107429561502400103
Mercer, S. H., McIntosh, K., & Hoselton, R. (2017). Comparability of fidelity measures for assessing tier 1 school-wide positive behavioral interventions and supports. Journal of Positive Behavior Interventions, 19, 195–204. https://doi.org/10.1177/1098300717693384
Scott, T. M., & Martinek, G. (2006). Coaching positive support in school settings: Tactics and data-based decision making. Journal of Positive Behavior Interventions, 8, 165–173. https://doi.org/10.1177/10983007060080030501
Sugai, G., & Horner, R. H. (2002). Introduction to the special series on positive behavior support in schools. Journal of Emotional and Behavioral Disorders, 10, 130–135. https://doi.org/10.1177/10634266020100030101
Sugai, G., & Horner, R. H. (2006). A promising approach for expanding and sustaining school-wide positive behavior support. School Psychology Review, 35, 245–259. https://doi.org/10.1080/02796015.2006.12087989
Sugai, G., & Horner, R. H. (2009). Defining and describing schoolwide positive behavior support. In W. Sailor, G. Dunlop, G. Sugai, & R. Horner (Eds.), Handbook of positive behavior support (pp. 307–326). Springer Publishing Company. https://doi.org/10.1007/978-0-387-09632-2_13
Tiger, J. H., Hanley, G. P., & Bruzek, J. (2008). Functional communication training: A review and practical guide. Behavior Analysis in Practice, 1, 16–23. https://doi.org/10.1007/BF03391716
Wolfe, K., Pyle, D., Charlton, C. T., Sabey, C. V., Lund, E. M., & Ross, S. W. (2016). A systematic review of the empirical support for check-in check-out. Journal of Positive Behavior Interventions, 18, 74–88. https://doi.org/10.1177/1098300715595957
7 Group Contingencies 
Tyler C. Ré and Brittany N. Beaver 
7.1 Overview 
Classrooms require a carefully devised set of rules, and many teachers have 
experienced times when students choose not to follow them. Teachers may 
have even asked themselves, “How can I get more students to follow the rules?” 
One strategy is to create group contingencies. Let’s break this term down a bit. 
First, group means to focus on several students instead of a single student. Contin-
gency means a rule that includes the expected behavior and what can be earned if 
the rule is met or taken away if the rule is not met. Therefore, group contingencies 
are rules that aim to alter the occurrence of a behavior for an individual and a 
group by explicitly stating what will happen for adherence or breaking of the rule 
(Litow & Pumroy, 1975). There are three different group contingencies, indepen-
dent, dependent, and interdependent, that will be discussed in this chapter. The 
major benefit of a group contingency is that it applies to a group of people and 
their behavior; therefore, you are not required to manage individualized plans 
for each member of that group. For instance, it is common for individuals to be part of a 
team where the team needs to meet a goal to obtain a reward. This is an example 
of a group contingency. 
7.1.1 Benefits and Challenges 
Group contingencies have many benefits. As already mentioned, implementing 
one contingency for a group of people is much easier than implementing
T. C. Ré (B) · B. N. Beaver 
The Chicago School, Chicago, IL, USA 
e-mail: tre1@thechicagoschool.edu 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_7 
73
different contingencies for everyone within a group (Litow & Pumroy, 1975). For 
instance, it would be rather difficult to implement a plan where one student earns 
5 min on their iPad after a 40-min period of following directions and another 
student earns a walk after 20 min of not calling out during a class lesson. Then 
continue to add seven more students with different rules and expectations, which 
would be nearly impossible for one teacher to implement in a class of up to 20 
students. It would be much easier to say that each member of the group can earn 
something after they have met the rules for a 40-min period. Second, a group 
contingency can stand on its own or can work in conjunction with other teaching 
strategies (Litow & Pumroy, 1975). For example, visual support, such as sched-
ules placed in the classroom and reminder cards on students’ desks, can be used 
to assist students in meeting the criteria. Token economies can also be used by 
visually displaying students’ progress while following the rules and earning the 
reinforcer. Third, there is the possibility of harnessing the peer pressure that occurs 
in most environments to achieve goals in a more proactive manner (Cooper et al., 
2007). Just as members on a team cheer one another on and congratulate one 
another, students can provide encouragement and praise to their peers to increase 
the likelihood they will all meet the expectation. 
As with all strategies there will be some challenges. With a group contingency, 
even though peer pressure is a benefit, it can quickly turn into a challenge. For 
instance, peer support can walk a fine line between encouragement and bullying. 
How teachers handle the situation when one person is the reason the group does 
not meet the goal and other members of the group become angry at the person is 
critical to the success of the intervention. Let’s look at the three different group 
contingencies and figure out if it is possible to minimize negative peer pressure 
and maximize positive peer pressure. 
7.2 Three Types of Group Contingencies 
7.2.1 Independent Group Contingency 
An independent group contingency is one in which every student is working towards the 
same goal (e.g., completing 10 math problems), but each student needs to inde-
pendently meet the specified rule prior to accessing the reward (Litow & Pumroy, 
1975). For instance, say the same reward will be provided for each person for 
completing 10 math problems. The reward is the same, but each person will only 
earn the reward once they have individually met the goal. That is, the reward 
is given to Sally only after she completes the math problems, to Harry only 
after he completes the math problems, and so on. One way this strategy can be 
implemented is by using a token economy; the reward menu can be the same for 
all students, but each student must earn the required number of tokens to purchase 
the specific items. Let’s look at the vignette below to further illustrate a token 
economy system.
7 Group Contingencies 75
7.2.1.1 Vignette 
A school has decided to implement a school wide positive behavior system. As one 
component of this system, a token economy (rewards of tokens or “money” that 
can be exchanged for something else) has been developed. In a token economy, 
“tokens” can be in the form of any visual item, such as poker chips, popsicle 
sticks, school money, or tallies on a poster board. Students can earn tokens for 
engaging in specific behaviors related to being safe, responsible, and respectful. 
In essence, the goal of this system is to catch students being good. For example, 
a student is walking down the hall and holds the door open for a peer. This could 
be an opportunity for a teacher to provide a token for being respectful. Another 
student in the classroom is the only student who put away the materials that were 
used during class. This is another opportunity for the teacher to give a token for 
being responsible. A third student watches peers jump over railings on the sidewalk 
outside of the building. They choose to walk around the railing to enter the school. 
Here the teacher could use this as an opportunity to give a token for being safe. 
Furthermore, within the classroom, a token could be given each time a student 
turns in an assignment by the due date. Once students earn tokens, they need to 
learn why the tokens are valuable. 
To establish token value, a token “store” should be created. Students should be 
able to access the store either on a pre-determined schedule (e.g., every Friday 
after lunch) or when they have earned enough tokens to purchase an item. A pre-
determined schedule may be the easiest strategy to manage as a teacher must be 
available at the store to assist in the trade. Items must be available for differing 
values, just like any store the student will encounter outside of school. This store 
can also serve as a fantastic learning opportunity about money value, money man-
agement, and budgeting. Some of the items may be little trinkets such as pens, 
pencils, erasers, or single candy items. Medium-level items that “cost” more 
could include computer time (e.g., 10 min), card games, balls (e.g., play-size football, 
basketball, or soccer ball), and king-size candy bars. Students 
who want to budget and save (e.g., work towards a long-term goal) may 
be able to purchase a lunch with the principal, a coupon for a free pizza from a 
local purveyor, or the chance to do the morning announcements. These are simply 
suggestions for what could be included in your store; you should generate a menu 
based on your students’ preferences and the resources you have available. Make 
sure that the cost in tokens of each item or activity is proportional to the effort 
required to earn it. 
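The pricing logic of such a store can be sketched in a few lines of code. The menu, prices, and token balances below are hypothetical, not drawn from any particular classroom:

```python
# A hypothetical token-store menu: prices scale with the value of the item,
# mirroring the trinket / medium / long-term-goal tiers described above.
store = {
    "pencil": 2,
    "computer time (10 min)": 10,
    "king size candy bar": 15,
    "lunch with the principal": 50,
}

def purchase(balance, item):
    """Complete a trade if the student has enough tokens; otherwise keep saving."""
    price = store[item]
    if balance >= price:
        return balance - price  # new balance after the trade
    return balance              # not enough tokens yet

print(purchase(12, "computer time (10 min)"))    # 2 tokens left over
print(purchase(12, "lunch with the principal"))  # still 12; keep saving
```

Letting students check prices against their balances in this way doubles as the money-management lesson mentioned above.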
7.2.2 Dependent Group Contingency 
A dependent group contingency requires an individual or a smaller portion of 
the group to meet a goal in order for the entire group to earn access to the reward 
(Litow & Pumroy, 1975). For instance, if one person is regularly interrupting class, 
a procedure for the entire group to earn a reward could be contingent on the 
one person meeting the goal of interrupting fewer times per class period. Another
76 T. C. Ré and B. N. Beaver
example would be if you set a rule that at least 5 students in the class must receive 
a high score on their spelling tests. If at least 5 students achieve this goal, the 
entire class earns an extra 5 min at recess. Hopefully this demonstrates how a 
dependent group contingency can be implemented using two different strategies: 
focusing on either a single student or a small group of students. In either strategy, 
an expectation along with a reward will be selected for how the student or students 
meet the criteria. 
The dependent group contingency is sometimes known as the Hero procedure 
as one student is responsible for earning the reward for the group (Cooper et al., 
2007). This name reflects that one student’s performance determines whether the 
entire group earns the reward. The major concern with this procedure is that if 
students know which student must earn the reward for the group, there 
is a potential for negative peer pressure. However, if the group knows who the 
student is, and the student starts to struggle, the group may provide encourage-
ment to the student to improve the results. Another option is to use a “mystery” 
student procedure. Meaning, the group does not know who the student is and thus 
would not necessarily know who failed to meet the criteria, limiting the opportu-
nity for bullying. But, when the criterion is met, the student should be identified 
so the group can congratulate and thank the Hero. This procedure reduces the risk 
of negative peer pressure, while hopefully increasing the chance of positive peer 
pressure among classmates. 
7.2.2.1 Vignette 
A classroom has chosen to implement the Mystery Hero Procedure (Jones et al., 
2008). In this example the teacher sets a criterion of how many points the students 
should ideally earn during a given class period, morning, or entire school day. 
The criteria should be announced before starting. For example, “I have selected 
someone in this class. If they earn 10 tokens by 10 a.m., the entire class will 
have the rest of the class period to play board games.” During the class period, 
the teacher can either tally points on the board for each student, tally points on 
the desk of each student, or give a tangible token (e.g., coin, sticker, etc.). If the 
targeted student has met the criteria at 10 a.m., the teacher should stop the class 
and inform them that Johnny met the criteria, and everyone can put away their 
materials to play board games. If the targeted student does not meet the criterion, 
the teacher can either continue with the scheduled activity without comment or 
inform the group that the criterion was not met before continuing. 
7.2.3 Interdependent Group Contingency 
The third type is a combination of the two previous strategies called an interde-
pendent group contingency. This contingency requires that each student within the 
class meets the criteria such that the group as a whole meets the goal (Litow & 
Pumroy, 1975). For instance, a goal may be set for each student to score at least 
90% on a test and for the class average to reach 95%. If one student scored 89%, 
the class would not meet the criteria, even if the class average met the 95% goal. This type 
of group contingency is most like the Good Behavior Game (GBG; Barrish et al., 
1969). 
The GBG is a strategy in which the class is divided into at least two teams 
and rules are set for the teams to follow. Most of the time, points are earned for 
following rules; but other versions could include “strikes” for not following rules. 
The team with the most points earns the reward. Another option is to set a goal 
to be met, which allows either or both groups to earn access to the reward. In this 
scenario, both teams can earn if they meet the goal or neither team will earn if 
they do not meet the goal. 
7.2.3.1 Vignette 
A classroom has chosen to combine the two procedures listed above (e.g., token 
economy and mystery hero). The teacher should set up the expectations exactly 
like they did with the mystery hero, with one minor modification. Students can earn 
the opportunity to play board games if two criteria are met. First, individual students will have to earn at least 7 tokens. Second, the mystery student will have to earn at least 10 
tokens. If both expectations are met, then the students who met the contingency 
can play board games. If both expectations are not met, then the students would 
be required to continue working on their assigned tasks. 
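The two-part check in this vignette amounts to a simple pair of conditions. As a minimal sketch, with hypothetical student names and token counts:

```python
# Hypothetical end-of-morning token counts for a small class.
tokens = {"Ava": 9, "Ben": 6, "Cruz": 11, "Dee": 8}
mystery_student = "Cruz"  # selected privately by the teacher

# Second criterion: the mystery student must earn at least 10 tokens.
mystery_goal_met = tokens[mystery_student] >= 10

# First criterion: each student must earn at least 7 tokens to join in.
players = [name for name, count in tokens.items()
           if mystery_goal_met and count >= 7]

print(players)  # ['Ava', 'Cruz', 'Dee'] -- Ben keeps working on assigned tasks
```

If the mystery student fell short, `players` would be empty and the whole class would continue with assigned tasks, exactly as the vignette describes.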
7.3 Implementation of Group Contingency 
Babyak et al. (2000) compared variations of contingencies by analyzing eight 
components. Each strategy was evaluated based on the organization, management, 
specific behaviors, measurement, reinforcement strategy, criterion for reinforce-
ment, delivery systems, and feedback loops (Babyak et al., 2000). In the descrip-
tion below, each component is further described with the addition of a system to 
fade the strategies. Ensuring that all aspects of group contingencies are addressed 
requires planning. This foundation will hopefully support the development of 
group contingencies in your setting. 
1. Organization: What type of behavior needs to be altered? By answering this 
question, teachers can determine which group contingency will best fit their 
classroom based on the benefits and limitations of each strategy. It may also be 
helpful to ask how often challenging behaviors occur and under what contexts, 
meaning are there certain times of day the behavior is more likely to occur? 
Another question to ask is, what kind of procedure or reinforcement system will 
be used? Will positive or negative reinforcement be used? Meaning, students 
will either earn access to preferred items or will earn the option to remove tasks 
from their to-do lists. 
2. Management: In order to determine if the strategy is effective at changing how 
often a negative behavior or positive response occurs, a data collection system 
should be used to evaluate from period to period or day to day. If the strategy
uses points, you can track the number of points earned for each period and 
compare. However, it is recommended to evaluate negative behaviors as well. 
For instance, if the goal is to reduce the number of call outs, data may be col-
lected on the number of call outs or vocal disruptions and the number of times 
the student raises their hand without interrupting. Although collecting data on 
both is recommended, a single measure can be selected if needed; as a teacher, it 
may be difficult to collect data on both if a paraprofessional is not available for 
support. 
3. Target Behavior: A contingency is the rule(s) that lead to accessing a reward. 
Any well-designed group contingency will clearly and concisely state the rules 
and/or expectations for the students to follow. For instance, the rule may be: 
You raise your hand before speaking in class. A contingency specifies what will 
happen if the rule is followed. For instance, if you raise your hand and wait to 
speak during class, you will earn a sticker. 
4. System of Reinforcement: During class, how student(s) are informed whether 
or not the rule was followed needs to be identified. Several options can be used 
to signal reinforcement depending on the structure of the contingency. Some 
options may be to provide points for following a rule (positive reinforcement), 
take points away (a response-cost procedure), deliver vocal praise or a reprimand 
statement, and/or only provide attention to those following the rules (differential 
reinforcement). If points are used, the points earned will be traded in for access 
to preferred items or activities. 
5. Criteria to Meet Contingency: What is the criterion to earn access to the 
reward? The criterion should be based on data collection and target approx-
imately a 50% change in the behavior. For instance, when trying to increase 
hand raising before speaking, data can be collected, by the teacher or some-
one else, on the number of times the students raise their hands over a few class 
periods. If the data show that the class only raised their hands a total of 5 times 
in the class period, maybe the target criteria would be for them to raise their 
hands 7 times during the class period. If the goal is to reduce disruptions and 
the data showed that the class interrupted learning opportunities 10 times during 
the class period, the goal could be for the class to interrupt learning opportuni-
ties only 5 times during the class period. Essentially, what the students must do 
in order to earn something must be identified based on how they are currently 
behaving. 
6. Delivery of Reinforcer: The reward must be selected. This is a vital component. 
This can be achieved by asking the class what kinds of things they like to do in 
school (e.g., extra recess, free time, etc.). However, it is important to note that 
the reward must be highly preferred by a majority of students in order to see a 
change in student behavior. Furthermore, teachers must decide how to inform 
the group if they did or did not meet the criteria. Either way, a great way to 
inform the group of how they did is to make an announcement at the end of 
the class period or day. If the class or a portion of the class has earned the 
reward the teacher should inform them in an excited tone of voice and provide 
time for them to get their reward at the end of the class period or day. If the
class or a portion of the class did not meet the expectations, the teacher can let 
them know they can try again another day. When letting them know they can 
try again, be as natural as possible and use a neutral tone of voice. 
7. Measurement and Evaluation: Based on the target behaviors (e.g., hand rais-
ing, interruptions, etc.), data should be collected and evaluated regularly to 
determine if the procedure used is successful. It is recommended that data be 
collected at least once per week and compared to the previous week’s data to 
determine the procedure’s effectiveness. This allows the teacher to fluidly alter 
the plan if the behavior does not change in the intended way. 
8. Strategy to Fade: As with any strategy aimed to alter how often a behavior 
occurs, a plan to fade the procedure must be created so that the positive behav-
ior starts to occur more naturally. This can be achieved in a few different ways. 
First, the effort required to meet the rule can be gradually increased. For exam-
ple, if the goal was for students to raise their hands 7 times per class period, 
increase the goal to 10 times per class period to earn the reward. Second, the 
value of the reward earned can be reduced while also increasing the amount of 
verbal praise provided to the students. For instance, instead of increasing the 
goal from 7 hand raises to 10, the goal can remain at 7 hand raises, but a less 
valuable reward can be delivered while continuing to deliver behavior specific 
praise for hand raising. Third, the procedure can be implemented on some days 
(e.g., Tuesday this week and Thursday next) and not others or during some 
portions of a class period (e.g., only during math) and not others. Finally, an 
announcement that the procedure is being implemented or not during a class 
period may not always be delivered. If the contingency is going to be run during 
the next math lesson, an announcement about the rule may or may not be made 
before starting, but then at the end of the lesson the class would be informed if 
they raised their hands 7 times and have earned access to a selected reward. 
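Step 5’s rule of thumb, setting the criterion at roughly a 50% change from baseline, can be sketched as a small helper. The function name and the truncating rounding are illustrative assumptions, not part of the chapter’s procedure:

```python
def target_criterion(baseline_count, goal="decrease"):
    """Starting criterion at roughly a 50% change from the baseline count."""
    if goal == "decrease":               # e.g., interruptions: 10 -> 5
        return int(baseline_count * 0.5)
    return int(baseline_count * 1.5)     # e.g., hand raises: 5 -> 7

print(target_criterion(10, goal="decrease"))  # 5 interruptions per period
print(target_criterion(5, goal="increase"))   # 7 hand raises per period
```

Truncating toward zero keeps the first goal modest, which matches the spirit of setting an achievable starting criterion and tightening it later during fading.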
7.3.1 Application of Group Contingency 
The following example of how to ensure the necessary components are planned for 
is based on the vignette from the independent group contingency section above. 
This example uses a third-grade classroom. Additional examples for each type of 
group contingency are provided at the end of this chapter. 
1. Organization: This classroom is trying to increase safe, respectful, and 
responsible behavior in the classroom environment for all students. 
2. Management: A data collection system was created to collect data on the total 
numberof tokens given for the target behavior (e.g., putting materials away, 
raising hand, helping a peer, etc.) and the number of undesirable behaviors, 
such as call outs.
3. Target behavior: Each of the target behaviors should be stated as a rule and 
rule violation. 
a. Rules: 
i. Raise your hand to speak in class. 
ii. Put all materials away when you are finished with them. 
iii. If you are having trouble, ask a friend to help. 
b. Rule Violation: 
i. Do not speak without raising your hand. 
ii. Do not leave the classroom without putting items back where they came 
from. 
iii. Do not yell or cry if you are frustrated and need help. 
c. Reward: Earn tokens each time the student follows a rule. 
4. System of Reinforcement: Students will learn that they earned a token by 
the teacher placing a tally mark on a sheet on their desk. This is a positive 
reinforcement strategy. 
5. Criteria to Meet Contingency: “When you follow a rule, you will earn a token. 
At the end of class, you will be able to exchange tokens for a break activity. If 
you do not have enough tokens, you can continue to work on the assignments.” 
6. Delivery of Reinforcer: Each student will have approximately 15 min before 
lunch and before dismissal to cash in their tokens for the purchased activity/ 
item. 
7. Measurement and Evaluation: Data will be collected daily on the total num-
ber of tokens earned and the number of negative behaviors. Evaluation of the 
success of the program will occur bi-weekly. 
8. Strategy to Fade: Over the course of the year, the target responses to earn 
points will systematically be altered. For example, during the first two months 
of the school year every two instances of desirable behavior will be met with a 
token. By the end of the school year, it may be that after approximately seven 
desirable behaviors, the student will earn a token. 
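The Management and Measurement steps above boil down to tallying counts each period and comparing them over time. A minimal sketch, using hypothetical daily call-out counts:

```python
# Hypothetical call outs per school day, before and after the contingency.
last_week = [10, 9, 11, 8, 10]
this_week = [7, 6, 8, 5, 6]

last_avg = sum(last_week) / len(last_week)  # 9.6 per day
this_avg = sum(this_week) / len(this_week)  # 6.4 per day

# A falling average suggests the procedure is working; if not, alter the plan.
print(f"Call outs: {last_avg:.1f} -> {this_avg:.1f} per day")
```

The same comparison works for desirable behaviors (tokens earned) by checking that the average is rising rather than falling.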
7.4 Summary 
Group contingencies are fantastic tools that allow a teacher to change the behavior 
of a group of students without having individualized rules and expectations for 
each student. Ease of implementation is key to ensuring that the classroom expec-
tations are consistently followed. Group contingencies allow teachers to capture 
and support desirable peer pressure for meeting goals; however, this could lead 
to undesirable peer pressure in the form of bullying. So, teachers need to keep a 
close eye on the type of interactions between students. 
Multiple strategies can be implemented simultaneously with group contingen-
cies. This type of contingency easily intertwines with a positive behavior support 
system (see chapter X). It should be noted that it is okay if the group contin-
gency stands independently; however, research does suggest that classrooms that 
use multiple evidence-based practices result in the best outcomes (Epstein et al., 
2008; Närhi et al., 2017; Sugai & Horner, 2002). 
Appendix A: Independent Group Contingency 
See Table 7.1. 
Table 7.1 Independent group contingency example 
Problem Students do not complete assigned math problems 
during independent work time 
Goal Increase number of problems completed during 
independent work time 
Step 1: Organization • Students will complete the assigned math worksheet in 
the specified time period
• Students will remain engaged with the math worksheet 
for the duration of independent work time or until the 
worksheet is completed
• Independent group contingency
• When each student completes the specified number of 
math problems on the assigned worksheet, the student 
can have access to their choice for the remainder of the 
independent work time 
Step 2: Management • Data collected twice per week
• Number of math problems completed during 
independent work time
• Permanent product- students will turn in the worksheet 
and the teacher will record the number of problems 
completed for each student 
Step 3: Target • Contingency will be written on the board in the front 
of the class
• Contingency will be delivered vocally to the class at 
the start of independent work time 
Step 4: Signaling • Vocal praise will be delivered every 5 to 10 min when 
all students are working on their math 
Step 5: Criteria to meet Contingency • Baseline: 10 out of 15 math problems in 25 min
• Criteria: If students complete 13 math problems, 
students can have access to their choice for at least five 
minutes 
Step 6: Delivery of Reinforcer • Preference assessment in the form of a survey 
delivered to all students
• Students will turn in worksheet when completed, 
teacher will inform students they can have access to 
their choice
• Vocal announcement at the end of independent work to 
inform students who did not meet the criteria they can 
try again next time
(continued)
Table 7.1 (continued)
Step 7: Evaluation • Monitor completion to determine if number of math 
problems completed is increasing 
Step 8: Strategy to Fade • Once the criterion is met for two consecutive weeks, 
increase to 15 math problems
• Decrease duration of reward access
• Deliver reward three times per week
• Deliver reward once per week 
Appendix B: Dependent Group Contingency Example 
See Table 7.2. 
Table 7.2 Dependent Group Contingency Example 
Problem A few students regularly call out during group lessons 
Goal Decrease calling out and increase raising their hands 
Step 1: Organization • Students will raise their hands before asking a question 
or making a comment
• Students will remain engaged (i.e., oriented toward 
teacher or classmates and/or looking at teacher/class 
materials/classmates) for the duration of the class 
period
• Dependent group contingency
• If the selected students meet criteria, students will have 
access to a chosen reward at the end of the school day 
Step 2: Management • Data collected twice per week
• Frequency of students calling out without raising their 
hands
• Teacher will hold a golf counter during class; at the 
end of the class period, the teacher will record the total 
number of call outs on the data sheet 
Step 3: Target • Contingency will be written on the board in the front 
of the class
• Contingency will be delivered vocally to the class at 
the start of the period
• The selected students will not be named 
Step 4: Signaling • Vocal praise will be delivered every 5 to 10 min when 
all students are raising their hands to ask questions 
Step 5: Criteria to meet Contingency • Baseline: 25 times per day
• Criteria: If students call out fewer than 12 times per 
day, access to a selected reward will be provided for 
the last five minutes of the class period
(continued)
Table 7.2 (continued)
Step 6: Delivery of Reinforcer • Preference assessment in the form of a survey 
delivered to all students
• Vocal announcement at the end of the class period-
inform students they can access reward or try again 
tomorrow 
Step 7: Evaluation • Monitor frequency data to determine if calling out 
is decreasing 
Step 8: Strategy to Fade • Once the criterion is met for two consecutive weeks, 
increase to two days below 12 call outs per day
• Three days below five occurrences per day
• Access to reward is delivered once per week on 
Fridays (or the last day of the school week) if 
frequency of call outs the previous days of the week 
was below five per period 
Appendix C: Interdependent Group Contingency Example 
See Table 7.3. 
Table 7.3 Interdependent group contingency example 
Problem Students use cell phones frequently during class 
Goal Decrease cell phone use and increase on-task behavior 
Step 1: Organization • Students will refrain from using cell phones in an 
inappropriate manner (i.e., for more than 5 s at a time, 
in any manner not related to the current task) for the 
duration of the class period
• Students will remain engaged (i.e., oriented toward 
teacher or classmates and/or looking at teacher/class 
materials/classmates) for theduration of the class 
period
• Interdependent group contingency
• If all students meet criteria, students will have free 
access to manipulate cell phone for the final five 
minutes of the class period 
Step 2: Management • Data collected twice per week
• Frequency of students manipulating cell phone
• Teacher will hold a golf counter during class; at the 
end of the class period, the teacher will record the total 
number of manipulations on the data sheet 
Step 3: Target • Contingency will be written on the board in the front 
of the class
• Contingency will be delivered vocally to the class at 
the start of the period
(continued)
Table 7.3 (continued)
Step 4: Signaling • Vocal praise will be delivered every 5 to 10 min when 
all students are refraining from using their phones 
Step 5: Criteria to meet Contingency • Baseline: 10 times per class period
• Criteria: If students use their cell phones fewer than 
five times per class period, free access to their phones 
will be provided for the last five minutes of the class 
period 
Step 6: Delivery of Reinforcer • Preference assessment in the form of a survey 
delivered to all students
• Vocal announcement at the end of the class period-
inform students they can access reward or try again 
tomorrow 
Step 7: Evaluation • Monitor frequency data to determine if cell phone 
manipulation is decreasing 
Step 8: Strategy to Fade • Once the criterion is met for two consecutive weeks, 
increase to two days below five occurrences per class 
period
• Three days below five occurrences per class period
• Access to reward is delivered once per week on 
Fridays (or the last day of the school week) if the 
frequency during the previous days of the week was below 
five per period 
References 
Babyak, A. E., Luze, G. J., & Kamps, D. M. (2000). The good student game: Behavior management 
for diverse classrooms. Intervention in School and Clinic, 35, 216–223. https://doi.org/10.1177/ 
105345120003500403 
Barrish, H. H., Saunders, M., & Wolf, M. M. (1969). Good behavior game: Effects of individ-
ual contingencies for group consequences on disruptive behavior. Journal of Applied Behavior 
Analysis, 2, 119–124. https://doi.org/10.1901/jaba.1969.2-119 
Cooper, J. O., Heron, T. E., & Heward, W. L. (2007). Applied behavior analysis (2nd ed.). Pearson. 
Epstein, M., Atkins, M., Cullinan, D., Kutash, K., & Weaver, K. (2008). Reducing behavior prob-
lems in the elementary school classroom. Washington, DC: National Center for Education 
Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Edu-
cation. https://ies.ed.gov/ncee/wwc/Docs/PracticeGuide/behavior_pg_092308.pdf 
Jones, M., Boon, R. T., Fore, C., III., & Bender, W. N. (2008). “Our Mystery Hero!”: A group 
contingency intervention for reducing verbally disrespectful behaviors. Learning Disabilities: 
A Multidisciplinary Journal, 15(2), 61–69. 
Litow, L., & Pumroy, D. K. (1975). A brief review of classroom group-oriented contingencies. 
Journal of Applied Behavior Analysis, 8, 341–347. https://doi.org/10.1901/jaba.1975.8-341 
Närhi, V., Kiiski, T., & Savolainen, H. (2017). Reducing disruptive behaviours and improving 
classroom behavioural climate with class-wide positive behaviour support in middle schools. 
British Educational Research Journal, 43, 1186–1205. https://doi.org/10.1002/berj.3305 
Sugai, G., & Horner, R. (2002). The evolution of discipline practices: School-wide positive behav-
ior supports. Child & Family Behavior Therapy, 24, 23–50. https://doi.org/10.1300/J019v24n0 
1_03
8 Fluency 
Lacy M. Knutson 
Keywords: fluency · fluency building · acquisition · mastery · precision 
teaching 
8.1 Overview 
All of these individuals are at the top of their respective fields. They perform 
elegantly and without hesitation. They are fluent performers. A fluent performer 
is someone who can perform a skill with both speed and accuracy (Binder, 1996). 
Performing with speed does not mean speed like the Road Runner, as very few 
skills need that degree of swiftness. However, all skills do require a certain level 
of timeliness, appropriate to the task, that makes it valuable and functional. This 
chapter will provide you with information on the value of fluency with respect to 
ensuring that students are performing in ways that will best serve their success in 
and outside of the classroom. 
You may be asking yourself, why bother with fluency? I already know how my 
students are performing in my class. Fluency adds the extra detail of how quickly 
that skill was performed in addition to how accurately (Alberto & Troutman, 2013; 
Binder, 1996). For example, Emily, a sophomore wanting to gain work experience 
for their college applications, is applying for an administrative assistant 
position responsible for taking dictation. The job posting states that during the 
L. M. Knutson (B) 
The Chicago School of Professional Psychology, Chicago, IL, USA 
e-mail: lacy.knutson@sdstate.edu 
All Kids Can Therapy, LLC, Madison, USA 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_8 
86 L. M. Knutson
interview, a sample dictation will be completed. It also states a required skill of 
typing at a rate of 80 words per minute or higher. Clearly, the employer recognizes 
it will not only be important to ensure the applicants are accurate in their dictation, 
but also that the dictation can be completed in a timely manner, thus ensuring that 
whoever is hired is productive and efficient. Including this information in the 
job posting not only communicates the skills necessary for the job but also informs 
job seekers as to whether they have the appropriate skills for the position. Unfortunately, 
Emily only types at a rate of 70 words per minute. Not to despair: even if an 
applicant like Emily does not presently meet the rate that an employer is seeking, 
it provides a clear indication of the type of performance that would make them 
eligible and gives a concrete goal to aim for. Emily decides to pursue the job 
and concludes that to be a valuable applicant, they need to increase their typing 
fluency in preparation for the interview. Fluency therefore is a powerful tool that 
can inform decision making for all invested parties. 
8.2 Learning Phases 
Any parent who has been running late but patiently waited for their young child to 
tie their shoes before running out the door can tell you that there is a difference 
between acquiring a skill and being fluent with that skill. Learning can be broken 
down into two general phases: acquisition building and fluency building (White, 
1985). Acquisition building is the first phase during which a student is learning 
the sequence of steps to perform the skill. During this phase, teachers and aides 
support the student by providing instructions, prompts, and feedback. To monitor 
progress, the system most often used is percent correct measures (White, 1985; 
White & Haring, 1980). 
To prepare for an upcoming interview, Emily has been practicing taking dicta-
tion. Emily listens to a podcast and types what the speakers say. Emily listens for 
5 min and then reviews their work. To review, they play back the podcast while 
reading what has been typed. Any word typed incorrectly or missed is scored as 
an error. Any words matching what the speakers said are scored as correct. The 
next step is to calculate their performance accuracy. On Monday, there were 324 
correctly typed words out of 360 total words—90% accurate. Not too shabby! 
Practice continues throughout the week and by Friday, Emily correctly types 328 
out of 345 words—95% accurate! Even better—or is it? 
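Emily’s percentages come straight from dividing correctly typed words by total words. For readers who like to check the arithmetic, a purely illustrative sketch in Python (the `accuracy` helper is ours, not part of the chapter’s procedure):

```python
def accuracy(correct: int, total: int) -> float:
    """Percent of responses scored correct."""
    return 100 * correct / total

# Monday: 324 of 360 words typed correctly
print(round(accuracy(324, 360)))  # 90
# Friday: 328 of 345 words typed correctly
print(round(accuracy(328, 345)))  # 95
```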
A common misconception is that if a student’s accuracy is increasing (e.g., 
accuracy changes from 90 to 95% correct), the student’s competency in that skill 
must be increasing. However, accuracy measures alone can be misleading, which 
is why the timeliness of a skill is also critical. Let’s give it a 
try.
8 Fluency 87
8.2.1 Concept Activity 
In this two-part concept check, you will see that accuracy doesn’t always tell us 
the full picture. The whole activity should only take you 1–2 min. 
Materials—2 addition math facts worksheets; pencil; timer. 
Part 1 
1. Set your timer to 30 s. 
2. Place one worksheet in front of you and the pencil/pen in your dominant hand. 
3. Press start on the timer and begin answering the worksheet. 
4. When the timer ends, stop your calculations. Pencils down. 
5. Count the number of math facts you completed correctly and incorrectly. 
Part 2 
1. Set your timer to 30 s. 
2. Place one worksheet in front of you and the pencil in your non-dominant hand. 
3. Press start on the timer and begin answering the worksheet. 
4. When the timer ends, stop your calculations. Pencils down. 
5. Count the number of math facts you completed correctly and incorrectly. 
8.2.2 Concept Activity Review 
Since you have been doing basic addition for quite some time, it is likely 
that you were quite accurate in your calculations on both worksheets. Perhaps 
you even scored 100% correct on both worksheets. Where you likely saw 
the largest difference was in the number of digits you were able to write during 
the 30 s timing. You also likely noticed a marked difference in the legibility of the 
digits you wrote with your dominant versus non-dominant hand. These are both 
examples of differences in fluency. Your dominant hand is more fluent with the 
basic writing strokes (horizontal, vertical, curves, etc.) as compared to your non-
dominant hand (unless you are ambidextrous—lucky you!). This disfluency slows 
you down in your writing while not impacting your ability to remain accurate. As 
you can see, accuracy only tells us part of a student’s ability. 
8.3 Application of Fluency Timings 
If instruction stops at the acquisition phase, artificial limitations are placed on 
students, preventing them from becoming proficient with the skill. To ensure students 
are competent, the second phase of the learning process, fluency building, must be included. 
During fluency building, a student gains ease and finesse with the skill, making it
88 L. M. Knutson
useful in everyday life. This is achieved through repeated practice opportunities, 
monitoring, and feedback (Kubina & Yurich, 2012; White, 1985). 
Emily has achieved the first phase of learning, acquisition. Dictation is com-
pleted while maintaining high accuracy—90–95% correct. Now, Emily needs to 
transition to the second phase of learning, fluency building. In this phase of learn-
ing, accuracy is examined, but it is done within the context of time. By monitoring 
how many words are correctly typed in a five-minute timing, Emily can monitor 
how fluent they are becoming with dictation. 
To measure fluency, first define the skill, then determine the current speed, or 
rate, at which the student can perform it. Essentially, this is a count 
(or number of times a response occurred) per unit of time. Classroom examples 
may include: three hand raises in an hour; twelve times out of seat in 30 min; one 
page of a textbook read in 3 min. As you can see, all these examples included 
a count of how many times something occurred, as well as the timeframe in 
which it occurred. Time is a key element to measuring fluency because everything 
humans do occurs within time (seconds, minutes, hours, days, weeks, months, 
years, decades, etc.). All skills require a level of fluency to be functional to the 
task and are therefore suitable for fluency monitoring. Candidates range from basic skills 
such as letter identification, phonetic awareness, basic writing strokes, and fine motor 
skills such as pinching, to more complex behaviors such as the rate of words read cor-
rectly, math facts completed, correct science terms, or number of peer-appropriate 
social exchanges (see Tables 8.1 and 8.2 for additional skills to consider). As you 
can see in Table 8.2, fluency measures can also be used to examine a student’s 
abilities to use appropriate replacement skills instead of more challenging behav-
iors. The possibilities are limitless. To learn even more, check out Chap. 11 on 
how to select and describe skills targeted for fluency programs in a highly precise 
manner. 
So, you’ve decided adding fluency practice into your classroom would be worth-
while, but how are you going to do it? Adding this valuable learning component 
is easier than you may think.
Table 8.1 Possible academic skills to assess student fluency 

Reading                    | Writing                    | Mathematics 
Letter sounds per minute   | Basic strokes per minute   | Names digits per minute 
Words per minute           | Traces letters per minute  | Math facts per minute 
Sentences per minute       | Words per minute           | Skip count per minute 
Summarize ideas per minute | Basic shapes per minute    | Symbol identification per minute
Table 8.2 Possible social-emotional skills to assess student fluency 

Social skills                        | Replacement skills            | Challenging behavior 
Greetings per day                    | Requests for breaks per hr    | Aggressive acts per day 
Eye contact with peers per 10 min    | Requests for help per 30 min  | Self-harm per hr 
Offers toy to peer per 30 min recess | Requests attention per 10 min | Disruption per 60 min 
Cooperative acts per 10 min          | Trips to ‘calm zone’ per day  | Talking out per hr
8.3.1 Measurement Materials 
To complete a fluency building session, you will only need a few items: a clear 
definition of what is being counted, a method to record your count (pencil and paper, 
clicker, golf counter, beans, pennies, etc.), a tool to measure time (stopwatch, 
clock, timer), a data sheet to document your data (see Appendices of Chap. 11), 
and a visual display to monitor your progress (see Appendix in this chapter). 
8.3.2 One-Minute Timings 
Fluency building for many skills can be practiced using one-minute timings 
(Graf & Lindsley, 2002). This saves valuable classroom time. To carry out a one-
minute timing session, you will want to ensure that you have enough materials 
so that the student can perform at a pace where they will not run out before the 
timing ends. If there are not enough materials, this could result in an artificial per-
formance ceiling. Essentially, the timing window and activity should be structured 
to be as barrier-free as possible for the student. With the materials ready, set your 
timer for one minute. Let the student know that they will have one-minute to do 
as much of the work as accurately as they can. They should be told to start right 
away when you start the timer, and to keep going until it beeps/ends. Begin the 
timer (give them a count-down if you like) and let them roll. Or even better, have 
them run the timer themselves or with a student partner. After the timer sounds, 
signaling the end of the window, review the work, totaling both the number of 
items completed correctly and the number of items completed incorrectly. These 
two numbers are the rates you will be using to monitor progress. Since both cor-
rect and incorrect performances vary independently, it is important to keep an eye 
on both.
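When timings are not exactly one minute long, counts are usually converted to per-minute rates so performances stay comparable. A minimal illustrative sketch (the `per_minute_rate` helper is ours):

```python
def per_minute_rate(count: int, timing_seconds: float) -> float:
    """Convert a count within a timing window to a per-minute rate."""
    return count * 60 / timing_seconds

# A 1-min timing needs no conversion; shorter timings are scaled up.
print(per_minute_rate(18, 60))  # 18.0 correct per minute
print(per_minute_rate(4, 30))   # 8.0 incorrect per minute
```

Keeping correct and incorrect counts as separate rates, as the section recommends, just means calling the conversion once for each count.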
8.3.3 Alternative Timings 
For some skills, one-minute timings may not be the best option for monitoring 
fluency. When one-minute timings do not fit the skill being targeted, different timing 
durations can be used (e.g., 10 s interval, 30 s interval, 2-min interval, 8-h interval, 
16-h interval). For example, to monitor the frequency of checking social media on 
a cell phone, an individual may use a 16-h interval, giving them a rate of checking 
per waking day (assuming they slept for 8 h). Additionally, for a student who is 
new to fluency building practice sessions, a one-minute timing might be too long 
for them to sustain a fast-pace. By using a smaller timing window (10 or 30 s), 
the student might be able to maintain an adequate pace, building fluency in small 
increments. Play around with the duration of time to find which suits the skill and 
student’s needs best. Once you have the information from the timing, what do you 
do with it? You use it to guide your instruction. 
After reviewing progress from Monday to Friday, Emily was a bit discouraged. 
Despite being more accurate (90–95%), their typing rate slowed (72 to 69 words 
per minute). Their interview goal now seemed farther away than ever. Rather than 
giving up, Emily made a plan to ensure they were a qualified candidate for the 
position. The first step was to review and identify where errors were being made. 
In reviewing the practice dictation, Emily noticed that the words most often scored 
as incorrect were those that required use of a pinky finger and pressing keys such 
as ‘q’ and ‘p’. Rather than adding typing practice of all letters, Emily decided to 
prioritize building finger dexterity of reaching the ‘q’ and ‘p’ keys. Each practice 
session started by completing a regular 5-min dictation timing and calculating the 
typing rate (words correct and incorrect per minute). Emily used this timing as the 
primary score to monitor progress. They would then complete a 10 s pinky prac-
tice session to improve accuracy with their right hand, followed by a 10 s pinky 
practice session to improve accuracy with their left hand. During these sessions, 
Emily typed as many single ‘p’s and ‘q’s as they could, moving their pinky from 
the resting position key up to the target key and back in 10 s. The total number 
of ‘p’s and ‘q’s typed in 10 s was then calculated. Emily then took a 2-min finger 
stretch/break before completing another pinky practice session for each hand. Fol-
lowing another 2-min finger stretch/break and calculating letter rates, the session 
concluded by completing a final 5-min dictation. Emily compared this dictation 
rate to the first one of the day, examining total words typed as well as accuracy. 
The total process took less than 15 min. By using the performance in the initial 
5-min timing to guide the instruction of focusing on pinky dexterity, Emily was 
able to prioritize skills and save valuable time.
8.3.4 Analyzing Fluency 
In the late 1960s, Dr. Ogden Lindsley developed a highly sensitive tool known 
as the Standard Celeration Chart (SCC) specifically for monitoring performance 
fluency (see Appendix; Pennypacker et al., 2003). This chart allows for accurate 
monitoring and supports decision-making for all types of fluency-related skills. 
With the help of a group of 7th grade students, Dr. Lindsley identified and labeled 
the visual patterns of learning that emerged on the SCC when fluency building data 
were plotted (Graf & Lindsley, 2002). They called these different patterns learning 
pictures. The learning picture that emerges during fluency building guides the next 
step in the decision-making process. For example, a “jaws” learning picture is a 
pattern in which correct responses (indicated by dots; •) are increasing and incor-
rect responses (indicated by an X) are decreasing, resulting in the two data-paths 
separating like a shark opening its mouth wide to take a bite (see Fig. 8.1). This 
improving learning picture means that the program is working well, and there is 
no need to change anything. 
However, a worsening learning picture like a “snowplow,” where incorrect 
responses are increasing and correct responses are decreasing, means that some-
thing needs to be modified in the program (see Fig. 8.2). The sooner, the 
better.
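For the programming-inclined, the decision logic behind these learning pictures can be caricatured in a few lines. This is a deliberate simplification we are adding for illustration: real Precision Teaching decisions are made from celeration lines plotted on the SCC, not from two bare trend numbers.

```python
def learning_picture(correct_trend: float, incorrect_trend: float) -> str:
    """Very rough label for two fluency data paths.

    Trends are week-over-week multipliers (e.g., 1.25 means the rate
    grew by 25%). Real practice fits celeration lines on the SCC.
    """
    if correct_trend > 1 and incorrect_trend < 1:
        return "jaws (improving): keep the program as is"
    if correct_trend < 1 and incorrect_trend > 1:
        return "snowplow (worsening): modify the program"
    return "mixed: inspect the chart before deciding"

print(learning_picture(1.3, 0.8))  # jaws (improving): keep the program as is
print(learning_picture(0.9, 1.4))  # snowplow (worsening): modify the program
```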
While use of the SCC will provide you with the most sensitive display to 
analyze your data, fluency data can be displayed in the more commonly used 
equal-interval graph. So, if you are worried that fluency might not be best for your 
classroom because you are not familiar with the SCC, don’t stress. Start adding 
fluency building to your classroom routine and analyze it in a way that makes 
sense to you. When you’re ready to take on the task of expanding your skills to 
the SCC, consider checking out the Precision Teaching chapter within this book 
(Chap. 11), the Handbook of the standard celeration chart (Pennypacker et al.,
Fig. 8.1 Data showing ‘jaws’ pattern 
Fig. 8.2 Data showing ‘snowplow’ pattern
2003) or the DailyBA’s YouTube video Understand the Standard Celeration Chart 
(SCC) in Eleven Minutes (https://www.youtube.com/watch?v=IhCx6rcfdIY). 
8.3.5 Fluency Aims 
You might be wondering, “What speed am I trying to build my student’s skills 
to?” Emily was lucky that the job ad was very clear about what the employer was looking 
for (i.e., 80 words per minute or higher). However, for many classroom skills, 
this information is not always provided. Thanks to the field of Precision Teaching, 
there has been a lot of work on this question. Precision Teachers advocate for 
expert-level aims (i.e., high performance goal rates). If you are unfamiliar with the 
term ‘aim’, aims are the rate of responses per unit of time at which you want the behavior 
performed (e.g., 200 words per minute). They are the level students should 
‘aim’ to build up to. In The Precision Teaching Book, Kubina and Yurich (2012) 
provide a variety of suggested aims across skills. When looking to add fluency 
building into your classroom, this book would be a great resource. High aims are 
selected for several important reasons. In brief, high fluency aims have been found 
to be superior to lower aims (White, 1985) and result in a variety of additional 
benefits for the student such as performing well in the face of distractions, after 
long breaks in instruction, and in new contexts (Johnson & Street, 2004). To say 
it another way, when students are held to expert level standards, they not only 
can perform the skill in the context they need it, they also can do it despite other 
things happening in the classroom and can retain it over summer break. After all, 
classrooms aren’t always the quietest, least distracting location on the planet. If 
you are limited in funds and are not able to purchase the resources listed above, 
another way of determining an appropriate aim is to identify a student that you 
would consider to be performing at peak performance in the skill. Complete a
timing with that student to identify their current performance rate and use that as 
an aim to build other students’ skills up to. In this way, you are using a ‘classroom 
expert’ to identify an aim for students to build towards. 
8.3.6 More Than Just Timings 
As discussed earlier, all behavior occurs in time, but doing timing checks is 
not enough to build fluent performers. Fluency timings, and analyzing the data 
patterns, should be a roadmap to guide you in making effective and efficient pro-
grammatic decisions. Do I turn left, or do I turn right? For example, you may start 
your reading class by having students do a 1-min timing check to see how many 
words they read correctly and incorrectly per minute. For those learners who are 
on track towards their goals, you might assign them work to expand vocabulary or 
to assist a peer as a model. For learners who are not tracking towards their goals, 
you may decide to give more direct teacher support or assign a 1:1 peer mentor to focus on 
component skills that may be weak (e.g., word decoding). By doing a quick 1-min 
check, each student experiencesan individualized plan for their reading progress 
that is best suited to their needs. What’s better than that? 
8.4 Concluding Thoughts on Fluency 
Fluency building can be the deciding factor between being able to do the skill and 
being able to do the skill in a manner that is relevant for everyday application. 
Accuracy is only half of the learning equation. To be truly valuable, a skill must 
also be performed at the relevant speed. Measuring and monitoring fluency is 
a fairly simple process and can be incorporated into all types of settings. If you 
have the means to count and to time, you have the means to monitor fluency. 
When selecting the fluency level to build a skill to, several resources are available to 
assist, or you can find a high performer and use them as your benchmark. Building 
competent learners includes building accurate and timely performance.
Appendix 
Likeness of a daily per minute standard celeration chart 
References 
Alberto, P. A., & Troutman, A. C. (2013). Applied behavior analysis for teachers (9th ed.). Pearson. 
Binder, C. (1996). Behavioral fluency: Evolution of a new paradigm. The Behavior Analyst, 19(2), 
163–197. https://doi.org/10.1007/BF03393163 
Graf, S., & Lindsley, O. R. (2002). Standard celeration charting 2002. Graf Implements. 
Johnson, K. R. & Street, E. M. (2004). The Morningside model of generative instruction. Cam-
bridge Center for Behavioral Studies. 
Kubina, R. M., Jr., & Yurich, K. L. (2012). The precision teaching book. Greatness Achieved 
Publishing Company. 
Likeness of a Daily per minute Standard Celeration Chart. Standard Celeration Charts are available 
at Behavior Research Company, Box 3351, Kansas City, KS 66103-3351. VM 913-362-5900. 
www.behaviorresearchcompany.com 
Pennypacker, H. S., Gutierrez, A. Jr., & Lindsley, O. R. (2003). Handbook of the standard celera-
tion chart. Cambridge Center for Behavioral Studies. 
The Daily BA. (2021, June 22). Understand the standard celeration chart (SCC) in eleven minutes 
[Video]. YouTube. https://www.youtube.com/watch?v=IhCx6rcfdIY
White, O. R. (1985). Decisions, decisions. British Columbia Journal of Special Education, 9(4), 
305–320. 
White, O. R., & Haring, N. G. (1980). Exceptional teaching (2nd ed.). Charles E. Merrill Publish-
ing Co. 
Lacy M. Knutson is now an assistant professor in the College of Arts, Humanities, and Social 
Sciences at South Dakota State University.
9 Equivalence Based Instruction 
Timothy D. Caldwell and Laura A. Kruse 
9.1 Overview 
The abstract language-based connections between different items, events, and sit-
uations are some of the most complex behaviors that need to be taught. Sidman 
(1971) found an effective way to teach these relations through equivalence-based 
instruction (EBI). In EBI, sequences of stimuli (i.e., items or events within the 
environment) are taught to be associated with each other through match to sam-
ple training (described further below). Teaching this way can lead the individual 
to learn how to connect stimuli together that have never before been directly 
trained. The ability to learn language relations without direct teaching, referred to 
as generative learning, demonstrates one way that instruction can be arranged to 
promote efficiency. This chapter will provide you with an overview of equivalence-
based instruction along with steps to utilize this technology to promote generative 
language learning among your students. 
9.1.1 Equivalence-Based Instruction 
As an educator, the goal is to teach as efficiently and effectively as possible. If each 
individual skill had to be taught directly, there would not be enough time in the 
day to meet the required learning outcomes. Luckily, humans are able to engage
T. D. Caldwell (B) 
Behavior Interventions Inc., Philadelphia, USA 
e-mail: tdc@behaviorinterventions.org 
L. A. Kruse 
The Chicago School, Chicago, USA 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_9 
98 T. D. Caldwell and L. A. Kruse
in generative learning. Generative learning is the ability to demonstrate skills that 
have not been explicitly taught (Critchfield & Twyman, 2014). By taking advan-
tage of this ability, teachers can focus on base skills and then facilitate students 
being able to derive additional behaviors. While many teachers already focus on 
base skills which serve as a foundation for future learning, approaching the teach-
ing of these skills more systematically can lead to greater efficiency in teaching 
procedures. 
Equivalence-based instruction is based on stimulus relations, or the associations 
between various stimuli that are either presented by a teacher or provided as an 
answer by a student, which can allow teachers to maximize learning time. For 
example, an instructional program could be developed to teach children to match 
the picture of a state with the state’s name and the state’s capital. Trials could be 
implemented to teach two of the relations directly with the other stimulus relations 
being learned without direct training. This ability to engage in untrained responses 
is something that is often taken for granted and just assumed to occur. If we 
directly teach many different stimulus relations, most children learn to engage in 
generative learning, which can make additional learning more efficient. 
Sidman (1971) developed much of the early research related to derived learning, 
which was initially called stimulus equivalence (Sidman & Tailby, 1982). Stimu-
lus equivalence defines three ways individuals can engage in generative learning. 
First, there is reflexivity, which is the ability to understand that a stimulus (A) 
equals an identical stimulus (A). Once a student learns to recognize the number 
“5” for example, they should be able to recognize it each time it is presented. Next 
is symmetry, the ability to demonstrate that when taught that stimulus (A) equals 
stimulus (B) the individual can select stimulus (B) to match with stimulus (A). So, 
if a student is taught to match five items (stimulus B) to the written number “5”, 
symmetry would occur if they can also match the written number “5” to the group 
of five items. The third relation is transitivity, which states that if an individual 
is taught that a stimulus (A) equals stimulus (B), and stimulus (A) also equals 
stimulus (C), then the individual should be able to state that B equals C with-
out formal instruction between these two stimuli. Continuing with our examples, 
transitivity would be demonstrated if the student could now identify the group of 
five items (stimulus group A) when provided the spoken word “five” (stimulus C). 
The six different relations that can be learned are shown within Fig. 9.1. Using 
equivalence-based instruction, a teacher should only need to teach two relations 
(A-B, A-C) in order to derive four additional relations (B-A, C-A, B-C, C-B).
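The payoff of teaching only A-B and A-C can be made concrete with a small sketch we are adding for illustration: closing a set of trained relations under symmetry and transitivity yields the four derived relations the text lists (the `derived_relations` function is ours, not part of the EBI literature):

```python
from itertools import permutations

def derived_relations(trained: set[tuple[str, str]]) -> set[tuple[str, str]]:
    """Close a set of trained relations under symmetry and transitivity,
    then return only the relations that were never directly trained."""
    relations = set(trained)
    changed = True
    while changed:
        changed = False
        for a, b in list(relations):
            if (b, a) not in relations:  # symmetry: A-B implies B-A
                relations.add((b, a))
                changed = True
        for (a, b), (c, d) in permutations(relations, 2):
            # transitivity: A-B and B-C imply A-C
            if b == c and a != d and (a, d) not in relations:
                relations.add((a, d))
                changed = True
    return relations - trained

print(sorted(derived_relations({("A", "B"), ("A", "C")})))
# [('B', 'A'), ('B', 'C'), ('C', 'A'), ('C', 'B')]
```

Two trained relations produce four derived ones, matching Fig. 9.1; with more stimulus classes the ratio of derived to trained relations grows even faster.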
The basic examples above of stimulus equivalence are not sufficient to explain 
all generative learning. Derived relational responding, or the ability to demon-
strate untaught skills due to the relationship between stimuli, is a crucial skill 
for students to have. Many upper elementary school teachers use logic puzzles 
(see Fig. 9.2 below) to help strengthen the ability to engage in derived relational 
responding. Students are given some information, such as a person’s name (e.g., 
John, Matt, Tony) and the positions within the race, but not everything is stated 
explicitly. Instead, students are taughtto derive the order of placement based on 
clues provided.
Fig. 9.1 A simplified example of relations demonstrating stimulus equivalence
        1   2   3 
John    -   X   - 
Matt    X   -   - 
Tony    -   -   X 
Clue 1: John did not finish first or last 
Clue 2: Matt finished before Tony 
Fig. 9.2 An example of a simple logic puzzle. Clue 1: John did not finish first or last. Clue 2: Matt 
finished before Tony 
Within these logic puzzles, students learn to derive other relations 
besides equivalence. For example, the puzzle may say that John was not first or 
last. One could derive then that “not first” could mean second or third, and “not 
last” could mean first or second. Therefore, John must have been in second place. 
If the puzzle then states that Matt finished before Tony, students could derive that 
Matt must have finished first (i.e., above Tony and not in second place) and that 
Tony must have finished last (i.e., below Matt and not in second place). The ability 
to derive connections such as these is an example of a relational frame of hierarchy. 
Relational frame theory states that we learn a number of “frames” which allow us 
to relate one stimulus to another even when they are not equivalent (Blackledge, 
2003). In addition to the hierarchy frame, there are frames which allow for oppo-
sitional relations (e.g., large and small: if an elephant is large, it cannot be small) 
and temporal concepts (e.g., if plane A arrived before plane B, then plane B arrived 
after plane A), as well as a few other relation types (Cassidy et al., 2010). 
Practicing this ability to derive relational responses through match-to-sample 
based instruction has been shown to be beneficial even to the general education
student on a more global level. Cassidy et al. (2016) looked at the effects of provid-
ing fluency-based practice of derived relations on the IQ of middle school and high 
school aged students. The students did 45–60 min of training two to three times 
per day completing the entire training in an average of 90 days. From this fluency 
practice, the middle schoolers showed an average increase in IQ of one standard 
deviation and the high school aged students showed significant increases in the 
verbal and numerical reasoning subtests. Cassidy et al. (2016) suggest providing 
high rates of practice in teaching derived relations to students. Match-to-sample 
(see next section) procedures can be used to teach and practice acquiring stim-
ulus relations and relational frames across subjects within the general education 
curriculum. The next section will discuss specific examples within the classroom 
where equivalence-based instruction can improve teaching efficiency. 
9.2 Application 
The main form of teaching derived equivalence and non-equivalence relations is 
match-to-sample (MTS) training. In MTS, a sample stimulus is set apart from 
other stimuli, typically above a group of comparison stimuli that are arranged for 
the student to select the comparison that matches the sample. For example, if a 
picture of a cat is placed above a row of comparison written words (e.g., cat, dog, 
bear), then the correct match would be selecting the written word “cat” from the 
comparison array. In the next trial, a picture of a dog could be the sample stimulus 
with the same or similar written words in the comparison array, (e.g., cat, dog, 
bird). In this case, the written word dog would be correct if selected to match the 
picture of the dog. See Appendix 1 for samples of MTS stimulus cards. 
As relations are trained within the MTS format, tests for derived or untrained 
relations can occur. Within an MTS format, training trials typically contain feedback 
as to the accuracy of a response, whereas test trials do not provide any feedback on 
correct or incorrect responding. If the picture stimuli are trained to the written 
words and then the spoken words are trained to the written words, a test in which 
the spoken words serve as the sample and the pictures within a comparison array 
can demonstrate whether students are able to derive the untrained relations between 
the spoken words and pictures (i.e., transitive responding). If the student is able to 
show these untrained relations, a measure of efficiency in teaching has been gained 
through the use of equivalence-based instruction within an MTS format. 
In order to increase teaching efficiency, teachers should carefully choose which 
base skills they are going to teach. Research suggests that teaching student’s 
expressive language skills such labeling or answering fill-in the blank ques-
tions will more consistently lead to generative learning than receptive language 
skills such as multiple-choice tasks (Sprinkle & Miguel, 2012). Fortunately, there 
are no specific materials that are needed to promote generative learning beyond 
knowledge of the skills you need your students to demonstrate. 
In mathematics, a simple example of stimulus relations is fact families. When 
a student is taught 3 + 2 = 5, an understanding of the equivalence relation 
would allow them to derive that 2 + 3 = 5 as well. If the student understands 
that addition is the opposite of subtraction, then the opposition relation should 
then allow them to derive that 5 − 3 = 2 and 5 − 2 = 3. Another example 
is understanding the relation between greater than (>) and less than (<). Once 
that relation is taught, students should be able to use them in a variety of novel 
instances. For example, if the student is taught that 6 > 3 and they have learned 
that the (>) is the opposite relation of (<), they should be able to derive that 3 < 
6. A third example of teaching using stimulus relations is teaching coins and their 
values. Keintz et al. (2011) taught students to find a picture of a coin given its 
name, find a value when given a coin, and to find a written value when told the 
value. After these three relations were taught, the participants were able to engage 
in 4–7 novel relations such as matching the printed price to the correct coin. 
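The fact-family example can be sketched as a tiny generator, purely for illustration (the `fact_family` helper is ours, not a classroom tool): one taught fact plus the symmetry and opposition relations yields the other three facts.

```python
def fact_family(a: int, b: int) -> list[str]:
    """Derive the full addition/subtraction fact family from one taught fact."""
    total = a + b
    return [
        f"{a} + {b} = {total}",   # taught directly
        f"{b} + {a} = {total}",   # derived: symmetry of addition
        f"{total} - {a} = {b}",   # derived: subtraction opposes addition
        f"{total} - {b} = {a}",
    ]

print(fact_family(3, 2))
# ['3 + 2 = 5', '2 + 3 = 5', '5 - 3 = 2', '5 - 2 = 3']
```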
There are many other examples of using EBI to teach stimulus relations. The 
language arts examples earlier in this chapter demonstrate how you can train a 
student to identify a picture to a written sight word which may lead to matching the 
pictures to the spoken sight words. Similar procedures have also been used to teach 
students a second language (May et al., 2016), to teach young children the contact 
information of their caregivers (LaFond et al., 2020), and to instruct students to 
sort items that could be recycled, composted, or trashed (Bolanos et al., 2020). 
While the examples above have a more limited number of untrained skills that 
can be derived, this technique can be used to teach more complex concepts as 
well. Fienup et al. (2010) used these same strategies to teach college students 
about brain anatomy, two functions of each part of the brain, and the symptoms 
of damage to each area. By merely teaching the relation between the brain lobe 
name to two functions and the brain lobe name to the location and the symptom 
of damage, the students were able to match functions to locations and symptoms, 
showing twice as many untrained relations as trained relations. 
9.3 Implementation 
In order to provide an equivalence-based instructional sequence through MTS 
training, the following steps should be conducted. First, the teacher should take 
all of the stimuli to be taught or tested and arrange them into classes. Each group 
of stimuli can then be classified with a letter (designating the stimulus group); as 
well as a number (designating the stimulus within that group). For example, the 
relations mentioned above would be separated into three groups (A-spoken words 
B-pictures, and C-written words). Additionally, the stimulus examples could be 
numbered as follows (1-cat, 2-dog, and 3-bird). Thus, the picture of a cat would 
be labeledas A1 while the spoken word “dog” would be labeled as C2. This 
classification system is beneficial for arranging teaching and testing trials. 
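The letter/number scheme can be sketched in a few lines of code. This is purely an illustration: the groups and items structures below mirror the cat/dog/bird example above and are not part of any published procedure.

```python
# Label each stimulus with a group letter (stimulus type) and a number
# (item within the group), e.g., B1 = picture of a cat, A2 = spoken "dog".
groups = {"A": "spoken word", "B": "picture", "C": "written word"}
items = {1: "cat", 2: "dog", 3: "bird"}

# Build the full set of labeled stimuli, e.g., {"A1": "spoken word: cat", ...}
stimuli = {
    f"{letter}{number}": f"{kind}: {name}"
    for letter, kind in groups.items()
    for number, name in items.items()
}

print(stimuli["B1"])  # picture: cat
print(stimuli["A2"])  # spoken word: dog
```

Labeling every stimulus this way makes it easy to specify any trial as one sample label plus three comparison labels.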
The next step is designating which stimulus groups will serve as the sample and 
which will serve as the comparison array during training and testing trials. It is 
important to leave some relations untrained so that the efficiency of derived 
learning can be demonstrated. For example, you
102 T. D. Caldwell and L. A. Kruse
could designate the A stimuli as the samples in training session 1, with the B 
stimuli serving as the comparison array. In training session 2, the A stimuli could 
again serve as the samples while the C stimuli serve as the comparison array. 
Tests for the untrained matching of B and C stimuli, with the C stimuli serving 
as the samples, could then demonstrate derived learning. If scores on test 
sessions among the B and C stimuli are low, additional training sessions on the 
A-B and A-C relations should be conducted, followed by further testing of the 
B-C relations until an established criterion score is reached. 
An example of classifying stimulus groups and planning training and testing trials 
is provided in Appendix 2. 
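As a sketch of how such training and testing trials might be planned programmatically (the function and data structures here are my own illustrative assumptions, not part of the Appendix 2 materials; a real session would also randomize trial order and counterbalance positions over many more trials):

```python
import random

ITEMS = {1: "cat", 2: "dog", 3: "bird"}

def make_trial(sample_group, comparison_group, item):
    """Build one MTS trial: a sample (e.g., A1) plus a shuffled
    three-item comparison array from the other group (e.g., B1-B3)."""
    comparisons = [f"{comparison_group}{n}" for n in ITEMS]
    random.shuffle(comparisons)  # vary left/center/right positions
    return {
        "sample": f"{sample_group}{item}",
        "comparisons": comparisons,              # positions left-to-right
        "correct": f"{comparison_group}{item}",  # same item number matches
    }

# Train the A-B and A-C relations; test the untrained B-C relation.
training = [make_trial("A", "B", i) for i in ITEMS] + \
           [make_trial("A", "C", i) for i in ITEMS]
testing = [make_trial("B", "C", i) for i in ITEMS]

for trial in training + testing:
    assert trial["correct"] in trial["comparisons"]
```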
In addition, some software programs can be utilized to implement EBI teaching 
sequences. Blair and Shawler (2020) list computer-based solutions that allow 
teaching in a match-to-sample format, as well as suggestions on what to look for 
in potential software programs. Within the article's supplementary materials, 
there are detailed instructions on building an EBI sequence in PowerPoint™ and 
Google Slides™ (Blair & Shawler, 2020). Lastly, collecting data on accurate 
responding during training and testing trials is important for determining whether 
your student is learning the trained relations and learning how to derive the 
untrained relations. Appendix 3 contains a simple data collection sheet which can 
be utilized to collect these data. Marking the accuracy of training and test trials 
will help you know when it is time to introduce new stimuli to be taught. 
Recording the types of errors made in the last column will allow you to identify 
patterns of incorrect responding so additional training trials with those stimuli 
can be conducted. 
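Assuming trial-by-trial records with the same columns as the Appendix 3 sheet, summarizing accuracy and error patterns might look like the following sketch (the record format and the sample data are invented for illustration):

```python
from collections import Counter

# Hypothetical trial records mirroring the Appendix 3 columns.
records = [
    {"trial_type": "test", "sample": "B1", "correct": True,  "selected": None},
    {"trial_type": "test", "sample": "B2", "correct": False, "selected": "C3"},
    {"trial_type": "test", "sample": "B3", "correct": False, "selected": "C3"},
]

def accuracy(records):
    """Percent correct across the recorded trials."""
    return 100 * sum(r["correct"] for r in records) / len(records)

# Tally which comparison stimuli attract errors, to target retraining.
error_patterns = Counter(r["selected"] for r in records if not r["correct"])

print(f"{accuracy(records):.0f}% correct")  # 33% correct
print(error_patterns.most_common(1))        # [('C3', 2)]
```

Here the student's errors cluster on one written word (C3), suggesting additional training trials with that stimulus.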
9.4 Summary 
Equivalence-based instruction has been shown to promote generative, or untrained, 
learning among school-age students (Keintz et al., 2011; May et al., 2016). The use 
of a match-to-sample format allows us to teach some relations and test to see if the 
student can derive additional relations. As students begin to demonstrate derived 
relational responding, their efficiency in learning new relations will increase. Setting 
up an EBI sequence requires selecting the stimuli to be taught, arranging them 
into stimulus types (e.g., spoken words, pictures, and written words), and creating 
a letter/number name for each stimulus. Teaching trials can then be designed to 
teach some relations and test for others that have not yet been trained. Deriving 
language-based relations is an important skill for every student, and explicit instruction 
to teach generative language can have a significant impact on the effectiveness 
and efficiency of student learning.
9 Equivalence Based Instruction 103
Appendix 1: Sample MTS Stimulus Cards 
Stimulus Group A—Spoken Words: "Bird," "Cat," "Dog" 
Stimulus Group B—Pictures: [picture cards showing a bird, a cat, and a dog] 
Stimulus Group C—Written Words: Cat, Dog, Bird 
Appendix 2: Sample Stimulus Groups and MTS Train/Test 
Sequence 
Stimulus Groups 
A-Spoken word 1-Cat 
B-Pictures 2-Dog 
C-Written word 3-Bird 
Sample Training and Testing Sequence: A-B Relations 
Trial type | Sample stimulus | Comparison 1 (Left) | Comparison 2 (Center) | Comparison 3 (Right) 
Training | A1 (Spoken—Cat) | B2 (Picture—Dog) | B1 (Picture—Cat)* | B3 (Picture—Bird) 
Training | A2 (Spoken—Dog) | B3 (Picture—Bird) | B1 (Picture—Cat) | B2 (Picture—Dog)* 
Training | A3 (Spoken—Bird) | B3 (Picture—Bird)* | B2 (Picture—Dog) | B1 (Picture—Cat) 
Training | A2 (Spoken—Dog) | B2 (Picture—Dog)* | B3 (Picture—Bird) | B1 (Picture—Cat) 
Training | A3 (Spoken—Bird) | B2 (Picture—Dog) | B1 (Picture—Cat) | B3 (Picture—Bird)* 
Training | A1 (Spoken—Cat) | B3 (Picture—Bird) | B1 (Picture—Cat)* | B2 (Picture—Dog) 
Test (B-A) | B2 (Picture—Dog) | A2 (Spoken—Dog)* | A3 (Spoken—Bird) | A1 (Spoken—Cat) 
Test (B-A) | B1 (Picture—Cat) | A3 (Spoken—Bird) | A1 (Spoken—Cat)* | A2 (Spoken—Dog) 
Test (B-A) | B3 (Picture—Bird) | A1 (Spoken—Cat) | A2 (Spoken—Dog) | A3 (Spoken—Bird)* 
*Correct matching response (bolded in the original) 
Sample Training and Testing Sequence: A-C Relations 
Trial type | Sample stimulus | Comparison 1 (Left) | Comparison 2 (Center) | Comparison 3 (Right) 
Training | A1 (Spoken—Cat) | C1 (Written—Cat)* | C2 (Written—Dog) | C3 (Written—Bird) 
Training | A2 (Spoken—Dog) | C3 (Written—Bird) | C1 (Written—Cat) | C2 (Written—Dog)* 
Training | A3 (Spoken—Bird) | C2 (Written—Dog) | C3 (Written—Bird)* | C1 (Written—Cat) 
Training | A2 (Spoken—Dog) | C2 (Written—Dog)* | C1 (Written—Cat) | C3 (Written—Bird) 
Training | A3 (Spoken—Bird) | C1 (Written—Cat) | C2 (Written—Dog) | C3 (Written—Bird)* 
Training | A1 (Spoken—Cat) | C3 (Written—Bird) | C1 (Written—Cat)* | C2 (Written—Dog) 
Test (C-A) | C2 (Written—Dog) | A1 (Spoken—Cat) | A2 (Spoken—Dog)* | A3 (Spoken—Bird) 
Test (C-A) | C3 (Written—Bird) | A3 (Spoken—Bird)* | A2 (Spoken—Dog) | A1 (Spoken—Cat) 
Test (C-A) | C1 (Written—Cat) | A1 (Spoken—Cat)* | A3 (Spoken—Bird) | A2 (Spoken—Dog) 
*Correct matching response (bolded in the original) 
Sample Testing Sequence: B-C, C-B Relations 
Trial type | Sample stimulus | Comparison 1 (Left) | Comparison 2 (Center) | Comparison 3 (Right) 
Test (B-C) | B1 (Picture—Cat) | C2 (Written—Dog) | C3 (Written—Bird) | C1 (Written—Cat)* 
Test (B-C) | B2 (Picture—Dog) | C1 (Written—Cat) | C2 (Written—Dog)* | C3 (Written—Bird) 
Test (B-C) | B3 (Picture—Bird) | C3 (Written—Bird)* | C1 (Written—Cat) | C2 (Written—Dog) 
Test (C-B) | C2 (Written—Dog) | B2 (Picture—Dog)* | B1 (Picture—Cat) | B3 (Picture—Bird) 
Test (C-B) | C3 (Written—Bird) | B1 (Picture—Cat) | B3 (Picture—Bird)* | B2 (Picture—Dog) 
Test (C-B) | C1 (Written—Cat) | B3 (Picture—Bird) | B2 (Picture—Dog) | B1 (Picture—Cat)* 
*Correct matching response (bolded in the original) 
Appendix 3: MTS Data Sheet
Trial Type | Sample Stimulus (list letter and #) | Correct? | If Incorrect (list stimulus selected) 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
Training or Test | __________ | Yes / No | __________ 
References
Blackledge, J. T. (2003). An introduction to relational frame theory: Basics and applications. The 
Behavior Analyst Today, 3(4), 421–433. https://doi.org/10.1037/h0099997 
Blair, B. J., & Shawler, L. A. (2020). Developing and implementing emergent responding training 
systems with available and low-cost computer-based learning tools: Some best practices and 
a tutorial. Behavior Analysis in Practice, 13(2), 509–520. https://doi.org/10.1007/s40617-019-
00405-x 
Bolanos, J. E., Reeve, K. F., Reeve, S. A., Sidener, T. M., Jennings, A. M., & Ostrosky, B. D. 
(2020). Using stimulus equivalence-based instruction to teach young children to sort recycling, 
trash, and compost items. Behavior and Social Issues, 29(1), 78–99. https://doi.org/10.1007/ 
s42822-020-00028-w 
Cassidy, S., Roche, B., & O’Hora, D. (2010). Relational frame theory and human intelligence. 
European Journal of Behavior Analysis, 11(1), 37–51. https://doi.org/10.1080/15021159.2010. 
11434333 
Cassidy, S., Roche, B., Colbert, D., Stewart, I., & Grey, I. M. (2016). A relational frame skills 
training intervention to increase general intelligence and scholastic aptitude. Learning and 
Individual Differences, 47, 222–235. https://doi.org/10.1016/j.lindif.2016.03.001 
Critchfield, T. S., & Twyman, J. S. (2014). Prospective instructional design: Establishing condi-
tions for emergent learning. Journal of Cognitive Education and Psychology, 13(2), 201–217. 
https://doi.org/10.1891/1945-8959.13.2.201 
Fienup, D. M., Covey, D. P., & Critchfield, T. S. (2010). Teaching brain-behavior relations eco-
nomically with stimulus equivalence technology. Journal of Applied Behavior Analysis, 43(1), 
19–33. https://doi.org/10.1901/jaba.2010.43-19 
Keintz, K. S., Miguel, C. F., Kao, B., & Finn, H. E. (2011). Using conditional discrimination 
training to produce emergent relations between coins and their values in children with autism. 
Journal of Applied Behavior Analysis, 44(4), 909–913. https://doi.org/10.1901/jaba.2011. 
44-909
LaFond, T. R., Reeve, K. F., Day-Watkins, J., Reeve, S. A., Vladescu, J. C., & Jennings, A. 
M. (2020). Using stimulus equivalence-based instruction to teach young children their caregivers’ 
contact information. Behavioral Interventions, 36(1), 105–125. https://doi.org/10.1002/ 
bin.1742 
May, R., Downs, R., Marchant, A., & Dymond, S. (2016). Emergent verbal behavior in preschool 
children learning a second language. Journal of Applied Behavior Analysis, 49(3), 711–716. 
https://doi.org/10.1002/jaba.301 
Sidman, M. (1971). Reading and auditory-visual equivalences. Journal of Speech Language and 
Hearing Research, 14(1), 5–13. https://doi.org/10.1044/jshr.1401.05 
Sidman, M., & Tailby, W. (1982). Conditional discrimination versus matching to sample: An 
expansion of the testing paradigm. Journal of the Experimental Analysis of Behavior, 37(1), 
5–22. https://doi.org/10.1901/jeab.1982.37-5
10 Direct Instruction 
Kozue Matsuda 
10.1 Overview 
If the student isn’t learning, the teacher isn’t teaching. 
—Adams and Engelmann (1996) 
Direct Instruction (DI) is a research-based approach to instructional design and a 
powerful teaching system, grounded in logical analysis, that you can use every day 
in any subject in your classroom (Binder & Watkins, 1990; Heward & Twyman, 
2021). Also called explicit instruction, it is an instructional program in which 
the teacher provides explicit teaching of a skillset by demonstrating material to 
students, giving students opportunities to practice, providing systematic feedback, 
and maintaining high rates of student engagement (Engelmann, 1968; Heward & 
Twyman, 2021). Direct Instruction can be used to teach concepts and skills to 
individuals or to groups using choral (unison) responses (Engelmann, 1968), and it 
often involves students responding in planned steps while the teacher follows a 
script. This chapter will provide a general overview of Direct Instruction and how 
it can be incorporated into the general education classroom.
K. Matsuda (B) 
Children Center, Tokyo, Japan 
e-mail: Kozue.matsuda@pepperdine.edu 
Pepperdine University, Malibu, USA 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_10 
109
110 K. Matsuda
10.1.1 What Is Direct Instruction? 
Based on Siegfried Engelmann’s Theory of Instruction, Direct Instruction was 
developed in the 1960s (Bereiter & Engelmann, 1966). Engelmann and Wesley 
Becker developed DISTAR Reading (Direct Instruction System for Teaching and 
Remediation), the first formal DI program, which served as the foundation for all 
the DI programs that followed (Heward & Twyman, 2021). 
Direct Instruction operates on five key philosophical principles, as described by 
Engelmann:
• All children can be taught.
• All children can improve academically and in self-image.
• All teachers can succeed if provided with appropriate training and materials.
• Any low-performance children in a class must be taught at a faster rate if they 
are to catch up.
• All details of instruction must be controlled to minimize the chance students 
will misinterpret the information and to maximize the reinforcing effect of 
instruction (National Institute for Direct Instruction, 2015). 
10.1.2 Why Should Teachers Use Direct Instruction? 
Direct Instruction may assist in the classroom when a teacher faces common 
classroom problems. These may include students not following the teacher's 
instructions, students receiving poor grades, or "unrelated" issues that interrupt 
actual teaching time. 
10.1.2.1 Case Study: Ms. M’s Common Classroom Problem 
Ms. M is assigned to a third-grade classroom in which six of the 20 children 
have behavioral concerns and present with delays in academic development. When 
Ms. M discusses this problem with her principal, she is told to send any students 
engaging in problem behavior to the office. After repeated office visits, Ms. M does 
not observe any improvements in behavior, and the constant interruptions continue 
to disrupt the class. Ms. M also talks with the children’s parents without finding 
a solution. While Ms. M focuses on teaching the six children, the other students 
engage in off-task behavior such as chatting, looking at books, or gazing out the 
windows. As a teacher, it is difficult to "ignore" students' problem behavior. Ms. 
M is often embarrassed by her classroom situation; no matter how hard she tries
10 Direct Instruction 111
to stop the disruptions and manage her classroom, she receives emails from 
"concerned parents" complaining about their child's slow academic achievement. 
Ms. M fears her annual feedback from her school administration. 
10.1.2.2 How to Improve Ms. M’s Classroom with Direct Instruction 
The role of a teacher is to teach. That said, Ms. M's teaching is continually 
interrupted to address specific students' behavioral concerns. In this case, 
implementing Direct Instruction can help provide an appropriate teaching environment 
for the whole classroom as well as for the six students who require additional support. 
Although the results of this approach may sound like magic, Direct Instruction 
is based on the science of behavior and provides an instructional strategy that 
enables the teacher to successfully work with learners of varying abilities. In Ms. 
M's case, she notices that her teaching changes after implementing Direct 
Instruction. First, Ms. M does not have a "chance" to scold those students while 
running Direct Instruction in the classroom. Instead, she must concentrate on her 
DI script, which tells her which cues to provide while watching her students' 
behavior. Second, Ms. M notices that her students, including the six children 
requiring behavioral support, join her teaching WITHOUT her needing to tell them 
to sit nicely. Third, and most importantly, her students smile more often after the 
lesson. Thus, even though the students still show some issues and learning 
difficulties, Ms. M has regained her confidence as a competent teacher. 
10.1.3 Direct Instruction Is Based on Behavior Analytic Research 
10.1.3.1 Direct Instruction (“Capital DI”) 
Siegfried Engelmann and colleagues created Direct Instruction, sometimes referred 
to as “capital DI,” in 1968 as an explicit, carefully sequenced, and scripted model 
of instruction during Project Follow Through, the most extensive experimental 
project in education funded by the U.S. federal government (Watkins, 1997). 
Project Follow Through aimed to determine the most effective way to teach low-
income children in the U.S. from kindergarten to third grade (Watkins, 1997). This 
included more than 200,000 children in 178 communities and evaluated the Direct 
Instruction model as one of 22 educational strategies. After nine years of data col-
lection, students taught using Direct Instruction consistently outperformed those 
within other instructional programs on basic, cognitive, and affective measures 
(Engelmann, 1968).
10.1.4 Direct Instruction Implementation 
Direct Instruction is explicit, systematic, and scripted instruction designed to 
maximize teacher efficiency and effectiveness using behavioral principles (Engel-
mann & Carnine, 1991). Two major principles of Direct Instruction are teaching 
more in less time and controlling the details of the curriculum. 
10.1.4.1 Teach More in Less Time 
The basic principle of Direct Instruction is to increase the amount of learning by 
systematically developing important background knowledge and explicitly applying 
and linking it to new knowledge (Stein et al., 1998). During Direct Instruction 
teaching, teachers need to monitor their own behavior as well as the students' 
learning behavior. In his Theory of Instruction, Engelmann introduced the concept 
of faultless communication, a form of communication that transmits just one 
interpretation of a concept at a time. The basis of faultless communication is 
perhaps best summarized by Adams and Engelmann (1996): "If the student is not 
learning, the teacher is not teaching." If the student does not grasp the concept or 
makes mistakes, it is a fault in the communication, independent of his or her 
characteristics; the determination of "faultless" is structural rather than related to 
student behavior—that is, it is possible to analyze communication to assess whether 
it is faultless without reference to learner behavior. 
Engelmann and Carnine (1991) suggest faultless communication between teachers 
and students can be achieved by controlling the environment and the strategy used 
in designing instructional sequences. The teacher's delivery must be clear, with 
precise wording, and the students' responses are kept consistent by repeating the 
instruction sequence; at the same time, the teacher must control elements outside 
of teaching, such as other students' behavior and noise from other classrooms 
(Engelmann, 1968; Kinder & Carnine, 1991). 
10.1.4.2 Control the Details of the Curriculum 
This concept includes two core components: task analysis and modeling. A task 
analysis breaks complex tasks into a sequence of smaller steps or actions. After 
the teachers identify the skills they want to teach, they may need to conduct a task 
analysis of those skills to teach them in an orderly, systematic manner in which one 
skill builds on another (Joseph et al., 2016). For example, a teacher who wishes 
to teach addition may create a task analysis by breaking “teaching addition” into 
smaller steps and prerequisite skills (e.g., counting, reading numbers). The teacher 
must ensure the students master one step before moving to the next one (e.g., 
knowing how to count before learning to add numbers). 
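The mastery-before-moving-on logic of a task analysis can be sketched as follows (the skill names and the 90% criterion are illustrative assumptions, not prescribed values):

```python
# An ordered task analysis for "teaching addition": each skill is a
# prerequisite for the next, and a student advances only after mastery.
SKILLS = ["counting", "reading numbers", "adding single digits",
          "adding with regrouping"]
MASTERY = 0.9  # illustrative criterion: 90% correct on the skill's probe

def next_skill(probe_scores):
    """Return the first skill in the sequence not yet mastered.

    probe_scores maps skill name -> proportion correct on its probe.
    """
    for skill in SKILLS:
        if probe_scores.get(skill, 0.0) < MASTERY:
            return skill
    return None  # all skills in the sequence mastered

scores = {"counting": 1.0, "reading numbers": 0.95,
          "adding single digits": 0.7}
print(next_skill(scores))  # adding single digits
```

The student in this sketch has mastered counting and reading numbers, so instruction targets single-digit addition before regrouping is introduced.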
“Modeling” refers to demonstrating how to perform a skill (Joseph et al., 2016). 
Many teachers hope for students to demonstrate original behavior and think for 
themselves; however, if the teacher wishes to impart a skill, they must first model 
it and help the student imitate the behavior. Modeling is a type of prompting, 
which can include a visual, auditory, or physical cue to lead the students to the 
correct response. For example, if the teacher wants to teach students to say "hello" 
in French, then he or she needs to model "bonjour" with an auditory prompt. 
10.2 Application 
10.2.1 Components of Direct Instruction 
To utilize Direct Instruction, teachers need the following components: research-
tested curriculum, scripted lesson plans, brisk pacing with unison responding, and 
frequent assessments. 
10.2.1.1 Research-Tested Curriculum 
Direct Instruction always begins with a clear and systematic presentation of con-
tent (Kim & Axelrod, 2005). The choice of content is based on a curriculum that 
has been extensively tested to determine the most effective method to lead students 
to mastery of the subject (National Institute for Direct Instruction, 2015). The cur-
riculum must be explicit about what needs to be learned, how it must be learned, 
and how to teach prerequisite skills in a manner that links to the new material log-
ically and systematically (Kim & Axelrod, 2005). The content of the curriculum 
must include generalizable concepts and skills (Stein et al., 1998). Direct Instruc-
tion is often implemented for at-risk students and thus includes ideas and learning 
strategies determined to have the broadest application and the most fundamen-
tal impact across academic disciplines. The logical and systematic presentation of 
Direct Instruction is carefully designed in a specific sequence for the review, appli-
cation, and mastery of the new topic. Skills are taught in sequence until students 
have mastered them. Each lesson begins with a review test, with the subsequent 
lesson building on previously mastered skills (Stein et al., 1998). 
10.2.1.2 Scripted Lesson Plans 
Direct Instruction is popularly known for its "scripted" lesson plans. Scripted lesson 
plans ensure faultless communication and control the teacher's behavior and other 
environmental variables. If you are teaching science, then you should not make 
"friendly talk" about anything other than the topic you are teaching. Instead, the 
teacher needs to ensure the application of instructional strategies such as active 
student participation, positive reinforcement, brisk pacing, explicit instruction, 
guided practice, distributed review, and constant feedback (Slocum, 2004; Stein 
et al., 1998). 
To successfully implement Direct Instruction and move students toward mastery, 
coaches or facilitators are often present in the classroom to monitor and aid the 
teacher. Placement of these coaches or facilitators is case-specific and depends on 
school funding. Because the well-researched achievement gains of Direct Instruction 
depend on faithful implementation, this level of support is often required to 
implement the model.
10.2.1.3 Brisk Pacing and Unison Response 
The brisk pace of intensive instruction in scripted lessons and the unison response 
are often regarded as the hallmarks of Direct Instruction. During scripted teacher 
instruction, the teacher prompts three to 20 responses per minute from every stu-
dent (Stein et al., 1998). The response is often made in unison as a group, and the 
teacher typically has already modeled the response. For instance, when teaching 
a country's capital city, the teacher will say, "United States, Washington, D.C." 
Then the students will all say the exact same thing. The teacher may need to state 
"United States, Washington, D.C." twice to initiate the DI session. Although the 
pacing is brisk and a scripted lesson is being followed, the teacher must constantly 
observe the class's responses. Therefore, in the initial implementation of the 
Direct Instruction lesson, students are divided into small groups of eight to 12 
students following the initial placement test. All groups receive appropriate Direct 
Instruction for the most rapid possible student progress. Direct Instruction often 
requires pre-grouping based on the students' skills and may allow teachers more 
opportunities to assist struggling students. 
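A sketch of this pre-grouping step, assuming placement-test scores are available (the student names, scores, and helper function are invented for illustration):

```python
def place_into_groups(placements, group_size=10):
    """Sort students by placement score and split them into
    instruction groups of roughly similar skill level."""
    ranked = sorted(placements, key=placements.get, reverse=True)
    return [ranked[i:i + group_size]
            for i in range(0, len(ranked), group_size)]

# Hypothetical placement scores for twelve students.
placements = {f"student{i}": score
              for i, score in enumerate([88, 95, 40, 72, 65, 55, 91,
                                         30, 78, 83, 60, 48])}

for group in place_into_groups(placements, group_size=4):
    print(group)
```

Grouping by placement score lets each group move at the most rapid pace its members can sustain; groups of eight to 12, as described above, would simply use a larger group size.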
10.2.1.4 Frequent Assessments 
Students are assessed at the beginning of the program with a placement test, as 
well as during the program with formal assessments (Kim & Axelrod, 2005). 
Direct Instruction also provides a built-in assessment within the scripted teach-
ing to ensure all students achieve mastery of the subject (Kim & Axelrod, 2005). 
The assessment does not test the ability of the students; rather, it tests whether 
your teaching is accurate and whether the students grasp what you are trying to 
teach. In Direct Instruction, assessment involves the teacher identifying 
and detecting any students who might need additional assistance rather than mark-
ing or ranking their performance. The level of students’ mastery can also provide 
information on how well Direct Instruction is working in the classroom and on 
the teacher’s instruction. The instructional scripts are also continuously revised to 
ensure that most students within a particular performance group achieve a 90% 
success rate when instructed according to the script (National Institute for Direct 
Instruction, 2015). 
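That mastery check can be sketched as a simple decision rule (the score format and the exact thresholds shown are illustrative assumptions):

```python
def lesson_decision(student_scores, mastery=0.9, group_goal=0.9):
    """Decide whether to advance or repeat/revise a scripted lesson.

    Each score is a student's proportion of correct responses; a student
    masters the lesson at >= 90% correct, and the group goal here is for
    90% of students in the performance group to reach that level.
    """
    mastered = sum(1 for s in student_scores if s >= mastery)
    if mastered / len(student_scores) >= group_goal:
        return "advance to next lesson"
    return "repeat lesson / revise script"

print(lesson_decision([0.95, 1.0, 0.92, 0.88]))  # 3 of 4 mastered -> repeat
```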
10.2.2 Sample Direct Instruction Programs 
Direct Instruction has been used for a variety of curricula. For example, commer-
cial programs such as the Direct Instruction System for Teaching and Remediation 
(DISTAR) and Corrective Reading include sets of systematically organized reading 
skills so that newly taught skills build on previously mastered ones (Engelmann & 
Bruner, 1988). A sample of Direct Instruction programs developed and used by 
educators is shown in Fig. 10.1.
Reading 
Corrective Reading 
https://www.nifdi.org/programs/reading/corrective-reading.html 
Reading Mastery Plus 
https://www.nifdi.org/80-programs/374-rm-plus 
Reading Horizons 
https://readinghorizons.com/ 
Teach Your Child to Read in 100 Easy Lessons 
https://www.nifdi.org/programs/reading/teach-your-child-to-read-in-100-easy-lessons.html 
Writing and Spelling 
Spelling Mastery 
https://www.nifdi.org/programs/spelling/spelling-mastery.html 
Expressive Writing 
https://www.nifdi.org/programs/writing-and-language/expressive-writing.html 
Reasoning and Writing 
https://www.nifdi.org/programs/writing-and-language/reasoning-writing.html 
Mathematics 
Connecting Math Concepts 
https://www.nifdi.org/programs/mathematics/cmc.html 
DISTAR Arithmetic 
https://www.nifdi.org/programs/mathematics/distar-arithmetic.html 
Corrective Mathematics 
https://www.nifdi.org/programs/mathematics/corrective-math.html 
Language 
Language for Learning 
https://www.nifdi.org/resources/free-downloads/programs/writing-and-language/language-for-learning.html 
Language for Thinking & Language for Writing 
https://www.nifdi.org/programs/writing-and-language/language-for-thinking.html 
History 
Understanding U.S. History 
https://www.nifdi.org/programs/science-ss/understanding-us-history.html 
Fig. 10.1 A list of subjects for the Direct Instruction program
10.2.2.1 Additional Notes on Reading and Math 
The National Reading Panel emphasizes five elements of effective reading 
instruction—phonemic awareness, phonics, fluency, vocabulary building, and 
comprehension—referred to as "reading to learn" skills (National Institute for Direct 
Instruction, 2000). The Direct Instruction programs cover all of these. Direct 
Instruction programs for mathematics use a strand-based design, meaning they 
are organized into strands that focus on teaching prerequisite skills to mastery 
before teaching more complex skills. In strand-based programs, both teachers and 
students are constantly reviewing mathematical concepts and skills, and new con-
cepts and skills are only introduced after students have demonstrated mastery of 
prerequisite skills (Fig. 10.2).
10.2.3 Direct Instruction Implementation 
Below is a sample of how to implement DI with your students. If teachers are 
sure of what they are going to teach, they can create their own Direct Instruction 
programming. The following steps can be used to implement Direct Instruction:
1. Conduct placement tests and within-program assessments. To discover the 
areas in which students have the most difficulty, test the students in narrow 
areas/concepts. This assessment is not meant to grade the students but rather 
to assist teachers in student placement (even within the classroom). Within-
program assessments allow teachers to place students in the proper level/lesson 
and to track their performance. 
2. Prepare clear teacher scripts. Write down scripts that specify what the teacher 
will say (typically noted in color) and do (noted in regular print), as well 
as what students need to say or do (noted in italics). Each sentence and the 
students’ responses should be short. 
3. Use choral (unison) response and signals. Ask students to respond in choral 
response or unison to increase their opportunities to respond and to receive 
teacher feedback. If students respond vocally, have them do so together. If they 
respond by mini-whiteboard, have them show their boards together. 
4. Use signals to prompt students to respond together. Decide on a cue for stu-
dents to respond. For example, knocking on the wall or tapping the blackboard 
with a pen (that is, making a sound) will provide a clear cue for students to 
respond to the scripts. 
5. Allow students to copy others. Teachers love for students to be independent. 
However, at least initially, let the students copy each other or change their 
answer. 
6. Invite competent students to act as models. As the students get used to the 
unison responses, teachers can start asking individual students who respond 
correctly to show the model behavior. 
7. Continue to assess students’ progress. Within-program assessments help deter-
mine the efficacy of instruction. Assessment helps teachers decide whether
(The teacher provides a worksheet to each student.) 
Teacher: Touch the first problem. Read it. 
Students: 508 – 326 
Teacher: What does the ones column tell us to do? 
Students: Start with 8 and take away 6 
Teacher: What is 8 – 6? 
Students: 2 
Teacher: How many tens are you starting with now? 
Students: 0 – 2 
Teacher: Can you take one ten from zero ten? 
Students: No 
Teacher: Where are you going to get 1 ten? 
Students: From 50 tens 
Teacher: What is 50 tens minus 1 ten? 
Students: 49 tens 
Teacher: Cross out the 50 and write 49 above it. (Check work.) 
Teacher: What is 1 ten and 0 ten? 
Students: 10 
Teacher: What is 10 – 2? 
Students: 8 
Teacher: Write it. (Check work.) 
Teacher: How many hundreds are you starting with now? 
Students: 4 
Teacher: What is 4 – 3? 
Students: 1 
Teacher: Write it. (Check work.) 
Teacher: Read the whole problem and say the answer. 
Students: 508 – 326 = 182 
Fig. 10.2 DI example for mathematics
to repeat lessons to ensure mastery before moving on. Ensure 
assessments occur every day or at regimented points in the program.
8. Never stop. Unless there’s an emergency, the teacher should not stop the script 
and the students should keep responding. It is not easy to ignore students 
demonstrating disruptive behaviors, and students are often puzzled as to why 
the teacher asks them to provide a scripted answer. Despite its name empha-
sizing instruction, Direct Instruction is based on behavior science emphasizing 
what comes after the behavior—that is, feedback. Providing positive feedback 
should change the behavior of individual students who did not initially join the 
choral response. 
9. Know what to do when errors occur. An error occurs, for example, if the 
teacher says, "How many tens in 100?" and the students respond, "100." The 
teacher can use Direct Instruction for error correction; "starting over" should 
be incorporated as well to ensure that students "get it." For example: 
Teacher: My turn. How many tens in 100? Ten. 
Your turn. How many tens in 100? 
Repeat the task after doing several other problems (“How many tens in 100?”). You 
are not explaining their mistakes; you simply teach the correct answer and move 
forward. 
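The model–test–retest sequence above can be sketched as a simple loop. The task wording, the `respond` callback, and the transcript format below are illustrative assumptions, not part of any published Direct Instruction script:

```python
# A minimal sketch of the error-correction sequence described above:
# model the answer, retest immediately, then retest after a delay.
def error_correction(task, correct_answer, respond):
    """Model the answer, retest immediately, then retest after a delay."""
    transcript = [f"My turn. {task} {correct_answer}."]          # model
    transcript.append(f"Your turn. {task}")                      # immediate test
    transcript.append(f"Students: {respond(task)}")
    transcript.append(f"(after several other problems) {task}")  # delayed retest
    transcript.append(f"Students: {respond(task)}")
    return transcript

log = error_correction("How many tens in 100?", "Ten",
                       respond=lambda task: "Ten")  # simulated correct response
print("\n".join(log))
```

Note that the teacher never explains the mistake inside the loop; the model itself carries the correction.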
10.3 Application 
Direct Instruction is often used in elementary school (pre-K to Grade 6) and spe-
cial education programs (Kim & Axelrod, 2005). Direct Instruction materials have 
been developed for most common disciplines throughout the K–12 curriculum, 
including reading, writing, math, science, social studies, and higher-order think-
ing (Adams & Engelmann, 1996). Curriculum materials have generally been used 
both schoolwide and in various subject areas. For reading intervention programs, 
Corrective Reading can be used with struggling students from fourth grade to 
adulthood (Kim & Axelrod, 2005). 
Ziegler and Stern (2016) implemented Direct Instruction for sixth-grade stu-
dents in Switzerland to teach elementary algebraic transformations in German, 
the students’ native language. They assigned students to two groups and used the 
following Direct Instruction methods: pretest on prior algebra knowledge, black-
board instruction, scripted teacher instruction, worksheets, repetition, and a logical 
reasoning test. The results showed that the group receiving Direct Instruction out-
performed the other group and that Direct Instruction was effective in teaching 
complex mathematical concepts. Ziegler and Stern also noted that the current textbook 
curriculum order might not be effective, especially for teaching the required 
mathematical logic that students were capable of learning.
10.3.1 Criticisms of Direct Instruction 
One of the main criticisms of Direct Instruction is that its structured nature 
offers few opportunities for creativity and fails to account for students’ differ-
ences (McMullen & Madelaine, 2014). Because teacher instructions are scripted, 
Direct Instruction is mainly used for core subjects such as English and mathemat-
ics (McMullen & Madelaine, 2014). Students may benefit from a balance between 
structured teaching time and more open-ended activities (McMullen & Madelaine, 
2014). Possible confusion may also arise over the difference between rote and 
explicit instruction. While Direct Instruction has been inaccurately described as 
mass memorization (Stein et al., 1998), it is an approach using behavioral prin-
ciples to teach the targeted concept through explicit and systematic presentations 
of the material with frequent reviews and assessments to ensure mastery (Kim & 
Axelrod, 2005). Direct Instruction assists students’ learning and provides a 
strong foundation for creativity and individuality in the classroom. 
10.4 Summary 
Direct Instruction is an effective structured instructional program that is based on 
a behavior-analytic approach. Researchers have found that Direct Instruction is 
effective when teaching a wide range of subjects to students of many different 
ages. Direct Instruction is currently in use in thousands of schools in Canada, the 
UK, and Australia. Although in the school setting some teachers may have limited 
access to relevant resources, by following some simple rules, they can create a 
Direct Instruction–oriented classroom to promote student learning. 
References 

Adams, G. L., & Engelmann, S. (1996). Research on direct instruction: 25 years beyond DISTAR. Educational Achievement Systems. 
Bereiter, C., & Engelmann, S. (1966). Teaching disadvantaged children in the preschool. Prentice Hall. 
Binder, C., & Watkins, C. L. (1990). Precision teaching and direct instruction: Measurably superior instructional technology in schools. Performance Improvement Quarterly, 3(4), 74–96. 
Engelmann, S. (1968). Relating operant techniques to programming and teaching. Journal of School Psychology, 6(2), 89–96. https://doi.org/10.1016/0022-4405(68)90002-2 
Engelmann, S., Becker, W. C., Carnine, D., & Gersten, R. (1988). The direct instruction follow through model: Design and outcomes. Education and Treatment of Children, 11(4), 303–317. 
Engelmann, S., & Bruner, E. (1988). Reading mastery I: DISTAR reading. Science Research Associates. 
Engelmann, S., & Carnine, D. (1991). Theory of instruction: Principles and applications. ADI Press. 
Heward, W. L., & Twyman, J. S. (2021). Whatever the kid does is the truth: Introduction to the special section on Direct Instruction. Perspectives on Behavior Science, 44(2–3), 131–138. https://doi.org/10.1007/s40614-021-00314-x 
Joseph, L. M., Alber-Morgan, S., & Neef, N. (2016). Applying behavior analytic procedures to effectively teach literacy skills in the classroom. Psychology in the Schools, 53(1), 73–88. https://doi.org/10.1002/pits.21920 
Kim, T., & Axelrod, S. (2005). Direct instruction: An educators’ guide and a plea for action. The Behavior Analyst Today, 6(2), 111–120. https://doi.org/10.1037/h0100061 
Kinder, D., & Carnine, D. (1991). Direct instruction: What it is and what it is becoming. Journal of Behavioral Education, 1(2), 193–213. https://doi.org/10.1007/BF00957004 
McMullen, F., & Madelaine, A. (2014). Why is there so much resistance to direct instruction? Australian Journal of Learning Difficulties, 19(2), 137–151. https://doi.org/10.1080/19404158.2014.962065 
National Institute for Direct Instruction. (n.d.). Basic philosophy. http://www.nifdi.org/what-is-di/basic-philosophy 
Prater, M. A. (1993). Teaching concepts: Procedures for the design and delivery of instruction. Remedial & Special Education, 14(5), 51–62. https://doi.org/10.1177/0741932593014005 
Slocum, T. A. (2004). Direct instruction: The big ideas. In D. J. Moran & R. W. Malott (Eds.), Evidence-based educational methods (pp. 81–94). Elsevier Academic Press. https://doi.org/10.1016/B978-012506041-7/50007-3 
Stein, M., Carnine, D., & Dixon, R. (1998). Direct instruction: Integrating curriculum design and effective teaching practice. Intervention in School and Clinic, 33(4), 227–233. https://doi.org/10.1177/105345129803300405 
Watkins, C. L. (1997). Project follow through: A case study of contingencies influencing instructional practices of the educational establishment. Cambridge Center for Behavioral Studies. http://www.behavior.org/resources/901.pdf 
Ziegler, E., & Stern, E. (2016). Consistent advantages of contrasted comparisons: Algebra learning under direct instruction. Learning and Instruction, 41, 41–51. https://doi.org/10.1016/j.learninstruc.2015.09.006
11 Precision Teaching 
Jared Van and Jennifer Quigley 
The learner is always right! 
Precision Teaching is an evidence-based measurement and decision-making system 
that enables educators to evaluate instruction using the Standard Celeration Chart 
(SCC; Evans et al., 2021; West & Young, 1992). Precision teaching guides decision 
making within any academic program or educational curriculum. The goal 
of precision teaching is to strengthen fluent responding by reinforcing high-frequency 
responding. Instructors and students display their time-series data on the 
standard celeration chart to show the data and tell a story of what is happening. 
The motto of precision teaching is that one should never give up on the learner; 
the data should be assessed and modifications to teaching occur as necessary. The 
standard celeration charted data (see Fig. 11.1) guides all the decisions.
11.1 Overview 
Ogden Lindsley founded precision teaching, though he gave full credit for developing 
it to the teachers he worked with directly. Lindsley 
(1990) found that precision teaching could bridge the gap between behavior analy-
sis and education. Precision teaching developed out of the methods of free-operant 
conditioning, specifically self-recording, rate of response, and standard recording
J. Van (B) 
Penn State University, State College, Pennsylvania, USA 
e-mail: jpv5303@psu.edu 
J. Quigley 
The Chicago School, Chicago, USA 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_11 
Fig. 11.1 Standard celeration chart
(Lindsley, 1990). Free-operant (a behavior that can occur at any time like read-
ing, writing, or raising a hand) conditioning within the classroom enables students 
to respond at their own pace rather than placing limits on the instructional pro-
cedures or materials. For example, in most classrooms 100% is the performance 
ceiling and students are limited to content included in the class/grade level only. 
In a precision teaching classroom, student progress is measured using frequency, 
which removes measurement-imposed ceilings; students can also progress contin-
uously through content as they demonstrate fluency with the previously targeted 
material. Ogden Lindsley chose to use the term frequency instead of rate because 
he believed it would be easier for teachers. Frequency (or rate) is simply count 
divided by time. 
Precision teachers try to avoid percentage correct due to inherent limitations. If 
Johnny took an hour to complete a test and scored 100% and Layla took 10 min 
to complete the same test and scored 100%, it is clear the two are not equal in 
skill competency. Precision teaching avoids percent correct limitations by using 
frequency as a measure of skill acquisition rather than percentage correct. Lind-
sley highlighted the importance of moving beyond acquisition to fluency using 
frequency as a core metric because it was a more sensitive measure (Lindsley, 
1990). 
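The Johnny-and-Layla comparison can be made concrete with frequency as count divided by time; the 20-item test length is an assumed number for illustration:

```python
# Both students score 100%, but frequency (count / time) separates them.
# The 20-item test length is an illustrative assumption.
def frequency(count, minutes):
    """Frequency (or rate): count divided by time."""
    return count / minutes

johnny = frequency(20, 60)  # 20 correct in 60 minutes
layla = frequency(20, 10)   # 20 correct in 10 minutes
print(round(johnny, 2), round(layla, 2))  # 0.33 vs 2.0 correct per minute
```

Percent correct reports both performances as identical; frequency makes Layla's fluency visible.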
Percent correct demonstrates that a student can complete a skill but does not 
specify the time it took the student to complete the skill, how many errors the stu-
dent made while completing the skill, or how long it took the student to achieve 
the correct score. Percent correct also leaves out an important component of all 
behavior, time. All behavior happens with the bounds of time and foregoing its
11 Precision Teaching 123
measurement leaves important information unanswered (e.g., How long a behav-
ior or observation occurred). Counting the number of occurrences in time results 
in frequency. In a classroom implementing precision teaching, students would 
progress through material at their own pace and demonstrate fluency through 
repeated practice. Fluency is the combination of accuracy plus speed that char-
acterizes competent performance (Binder, 1996). Students’ progress to fluency as 
shown by frequency is charted (i.e., visually displayed) and evaluated to ensure 
forward movement is being demonstrated. 
Many single-case experimental studies, group designs, and longitudinal studies 
have shown precision teaching to be effective. The largest study completed, The 
Great Falls Precision Teaching Project (Beck and Clement, 1991), showed pre-
cision teaching to be a successful educational model (Binder, 1988). Elementary 
students participated in 20–30 min of timed practice, charting, and decision mak-
ing per day across four years. After the conclusion of the project, the students who 
participated in the precision teaching model had gained 19–44 percentile points on 
subtests of the Iowa Test of Basic Skills as compared to other children within the 
school district (Binder, 1988). Another set of classroom studies found that just 
by adding daily short and timed practice sessions, student performance can be 
improved in a cost-effective way (Binder, 1988). 
11.2 Precision Teaching Principles 
Precision teachers have four guiding principles: (1) the learner knows best; (2) 
only use observable behavior; (3) measure behavior using frequency; and (4) chart 
data on the standard celeration chart (Kubina, 2012). 
(1) Learner knows best 
The precision teaching phrase “learner knows best” communicates that a student’s 
performance over time demonstrates how instruction affected the student. If accel-
eration dots (•) on the standard celeration chart are going up and deceleration Xs 
are going down at acceptable rates, instruction is working. But if the dots (•) and 
Xs don’t fall within acceptable rates then the instruction needs to be changed. 
That is the student saying, through their data, that something is not working. When 
using precision teaching, there are three ways the data can describe the behavior. 
First, the student demonstrates progress with the targeted behavior. The program 
continues as is and new targets are prepared to be added to the program. Second, if 
the student’s data are not progressing, there is something wrong with the program 
and the teaching strategies need to be examined. Is the skill too complex? Is the 
teaching plan unclear? Is the prompting strategy being implemented incorrectly? 
Rather than the student being blamed for not mastering the material, the teaching 
is reexamined. Finally, if the student’s data show the behavior going in the wrong 
direction, there is something wrong with the program and the teaching strategies 
need to be examined. 
In precision teaching, the students remain active participants. They track their 
own behavior to see their learning (Lindsley, 1992). There are many benefits to 
student tracking. First, this removes a task typically required of the teacher or an 
observer (e.g., aide, direct-care staff, assistant teacher) which enables those edu-
cators to work on other tasks. Second, student tracking has been shown to be as 
reliable as instructor-tracked data and more valid (Lindsley, 1990). Also, effects of 
student tracking have typically been shown to be larger than when teachers graph 
and analyze data. Finally, precision teaching encourages all involved to trust the 
learner. The teacher becomes a learner of the student by observing their perfor-
mance. As the student teaches the teacher what works well and what does not work 
well, the teacher is able to make quick, data-based decisions on instruction. 
(2) Only use observable behavior 
Precision teaching focuses on directly observable behavior. The behavior targeted 
for increase or decrease must be objectively defined and observable for data to 
be collected on the rate of occurrence/non-occurrence. Lindsley (1992) addressed 
the need to also target what are deemed private events or inner behavior such as 
silent reading, problem solving, or monitoring thoughts and feelings. To target an 
inner behavior, the behavior must be defined to the individual. Then an observable 
behavior can serve as an indicator. For example, to target silent reading, the student 
could be asked to read aloud, vocally decode words, or answer questions about 
their readings to check comprehension (White, 1986). 
Another precision guideline basedon directly observable behavior is what to 
address. For example, when choosing a behavior to target the instructor and stu-
dent should focus on what is happening instead of what is not happening. Instead 
of targeting not cursing, the targeted behavior may be using polite language or 
engaging in targeted coping skills. Targeted behaviors should also be movements 
rather than labels. A student may be labeled as shy, clumsy, or sloppy, but such 
terms are not objective descriptions and only serve as loose descriptions of a set of 
behaviors. Instead, a targeted behavior for someone who is aiming to reduce clum-
siness may include jumping rope, completing stages within an obstacle course, or 
kicking a ball into a goal. 
11.2.1 Pinpoint+ 
Behavior analysts describe behaviors using an operational definition, as shown in 
Chap. 1. Precision teaching uses a pinpoint or pinpoint+ (see Fig. 11.2). The 
pinpoint+ is composed of a learning channel, movement cycle, and context. The 
pinpoint by itself is a movement cycle and the context. The + is acquired through
the addition of the learning channel. Derived from measurement science, the 
combination of these three elements provides a precise method for counting and 
detecting behavior (Kubina, 2019). 

Fig. 11.2 Pinpoint+ 
Figure 11.3 shows the Academic-Personal-Social Learning Channel Matrix 
originally developed by Eric Haughton. While other learning channel matrices 
exist, the one selected here pertains to things that typically occur in the classroom. 
Across the left side of the matrix are sensory inputs or stimuli that occur before 
behavior. Sensory inputs describe how we take in information from both the world 
around us and inside our body. Across the bottom are sensory outputs or ways to 
classify observable behaviors a learner can emit or produce. If a student is tak-
ing a spelling test where they hear the word and then have to write it out, the 
input would be “hear” and the output would be “write.” The learning channel 
would be recorded as “hear-write.” A hyphen [-] is typically used to show the 
break between a sensory input and physical output. The formula for writing a 
pinpoint+ for a spelling test could appear as “hear-taps keyboard spelling word on 
laptop” (i.e., learning channel + movement cycle + context).
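A minimal sketch of composing pinpoint and pinpoint+ strings from these parts; the helper names and example values are illustrative, not part of the precision teaching literature:

```python
# Illustrative helpers for composing pinpoint and pinpoint+ strings.
def pinpoint(movement_cycle, context):
    """Pinpoint = movement cycle + context."""
    return f"{movement_cycle} {context}"

def pinpoint_plus(sensory_input, movement_cycle, context):
    """Pinpoint+ = learning-channel input + movement cycle + context,
    with a hyphen marking the break after the sensory input."""
    return f"{sensory_input}-{pinpoint(movement_cycle, context)}"

print(pinpoint("writes word", "on tablet device"))
# writes word on tablet device
print(pinpoint_plus("see", "writes word", "on tablet device"))
# see-writes word on tablet device
```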
One important thing to note about learning channels is to attend to how one 
teaches and how one tests. Sometimes instruction does not fall under the same 
learning channels as testing which can result in poor performance from the learner. 
If students hear questions and are asked to vocalize answers during instruction 
(hear-say) but read questions and write answers during testing (see-write) perfor-
mance in the see-write may not be as high due to receiving instruction in different 
learning channels (i.e., assessment = see-write, instruction = hear-say). Teaching 
and testing within the same learning channels may help resolve these discrepan-
cies. While trans- and inter-channel interaction research are still in their infancy, 
Binder and Haughton (2002) recommend practicing the learning channels needed 
for applying to the real world. 
Learning Channels may introduce readers of this book to information that is 
outside of this book’s scope such as yoking learning channels (combining them) 
and diagnosing learning problems (Kubina & Yurich, 2012, p. 91). If you are 
interested in learning more about learning channels, we recommend looking into 
The Precision Teaching Book by Kubina and Yurich (2012).
Fig. 11.3 Academic-personal-social learning channel matrix
A movement cycle is composed of an observable, clear, active verb (i.e., an action) 
as well as the object that receives the action. Some examples of clear and observ-
able action verbs and objects receiving the action are writes number, touches card, 
grabs pencil, and raises hand. Each example involves some type of movement that 
describes what the behavior looks like with little room for ambiguity. The pre-
cise description helps observers accurately count the number of times the behavior 
occurs in a way multiple observers can agree on each instance of an occurrence. 
Examples of active verbs that are imprecise include execute, perform, discover, 
and perceive. An easy way to pick an active verb is to ask someone to do the 
behavior of interest, write whatever verb(s) best describe the action, and use a dic-
tionary to find the most accurate verb that describes the behavior (Kubina, 2019). 
There are three basic guidelines for the movement cycle: choose an action verb, 
make it simple present tense, and pick the object receiving the action. First, 
pick a verb such as write, touch, grab, or raise. Second, make sure the verb is in 
the simple present tense which means using an “s” at the end of a verb instead of 
“ing.” Writing, touching, grabbing, and raising turn into writes, touches, grabs, and 
raises. The importance of using the simple present tense is to show the repeatabil-
ity of the action. Using grabs denotes a clear beginning and ending of a behavior 
whereas grabbing denotes an ongoing behavior. With clear beginning and end-
ings, observers can more easily detect and count each instance a behavior occurs. 
Third, select the object receiving the action such as name, pencil, ball, or hand. 
The movement cycle could be writes name, touches pencil, grabs ball, or raises 
hand. 
The next component of the pinpoint+ formula is the context. The context further 
describes the movement cycle (i.e., action verb + object receiving the action). For 
a learner who will write, we can describe where they are writing (writes word on
tablet device), for the learner who will imitate an action, we can describe when 
they are repeating it (imitates gross motor action after teacher demonstration), 
and for learners who shake a hand, we can describe with whose hand they are 
shaking (shakes hand of peer). The movement cycle, writes word, has the advan-
tage of producing a “count of one” for the target behavior. Similarly, imitates gross 
motor action and shakes hand all have clear “counts of one.” The context enhances 
the count of one by contextualizing where, when, with whom, or with what the 
movement cycle will occur (Kubina, 2019). 
(3) Measure behavior using frequency 
Ogden Lindsley chose to use the term frequency instead of rate because he believed 
it would be easier for teachers. Frequency (or rate) is simply count divided by time. 
We mentioned some of the reasons for using frequency rather than percent correct 
in the overview of this chapter. 
11.2.2 Accuracy Pairs and Fair Pair Rule 
When we have an acceleration target (identified as the correct behavior) and 
a deceleration target (identified as the incorrect behavior), and we look at the 
relationship between them we have an accuracy pair. This gives us a better under-
standing of performance. If we are looking to increase the number of correct math 
problems written within a minute, we also want to see if the number of incorrect 
responses is going down. Whether you are looking for increasing or decreasing a 
target behavior you should always check to see if there is a correct or incorrect 
form of the behavior as well. 
If you want to work on decreasing a behavior, you should also look for one or 
more desirable behavior to increase. The fair pair rule is when a precision teacher 
looks for desirable behavior(s) to increase for every target behavior they want to 
decrease. From a behavior analytic perspective, it provides a replacement behavior, 
something to do instead of the problem behavior. Teachers should ask themselves, 
“what would I rather have my student do instead?” to identify behaviors for the fair 
pair rule. When we have a correct behavior we want to increase, like answering 
questions correctly, we also want to measure the number of incorrect responses. 
We measure both correct and incorrect to make surethat our instruction is making 
correct responses increase, and incorrect responses decrease. If we have a pro-
gram that is increasing correct responses but also increasing incorrect responses at 
high frequencies, we need to change instruction, so the incorrect responses are not 
increasing, but decreasing.
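The accuracy-pair check described above can be sketched as a simple decision rule. The function name, the trend test (comparing first and last timings), and the sample frequencies are illustrative assumptions:

```python
# Accuracy-pair check: corrects should accelerate while incorrects decelerate.
# The first-vs.-last trend test and the sample data are illustrative.
def instruction_working(correct_freqs, incorrect_freqs):
    """True if corrects trend up and incorrects trend down across timings."""
    corrects_up = correct_freqs[-1] > correct_freqs[0]
    incorrects_down = incorrect_freqs[-1] < incorrect_freqs[0]
    return corrects_up and incorrects_down

print(instruction_working([10, 14, 18], [6, 4, 2]))  # True: keep the program
print(instruction_working([10, 14, 18], [2, 5, 9]))  # False: errors rising, change instruction
```

In practice a precision teacher would make this judgment from the celeration lines on the chart rather than from two data points.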
(4) Chart data on the standard celeration chart 
The standard celeration chart is the engine of precision teaching (Fig. 11.1). The 
vertical axis displays the “Count per Minute” of the targeted behavior. Round dots (•) 
represent “Acceleration points.” They are behaviors selected to increase. Xs rep-
resent “Deceleration points,” and conversely are behaviors identified to decrease. 
Said in simplest terms, dots (•) go up, good; Xs go down, good. Dots (•) go down, 
bad; Xs go up, bad. Placing a line through the middle of the Xs or dots creates 
a celeration line. The frequencies may range from one per minute to 1,000 per 
minute depending on the behavior being measured. The horizontal axis displays 
calendar time, namely successive days. 
After data has been graphed, the slope of the data path, shown on the y axis, 
shows the index of learning (Lindsley, 1992). Celeration, either acceleration or 
deceleration, is demonstrated by the slope of the line and shows how the targeted 
behavior is changing. 
The advantages of an SCC are that the same chart can be used to display 
any and all behaviors; this allows comparison across any and all behaviors as 
well (White, 1986). The SCC also saves time in that there is no time needed to 
determine labels for the chart or orient to the information because all SCCs are 
the same (standard). A standard graphing procedure also removes the potential 
for inadvertently misrepresenting the data. A ratio scale maintains that all ratios 
of behavior change will be represented the same way on the chart (i.e., relative 
change). Line graphs show absolute change but ignore relative change. Relative 
change allows for useful comparisons between high-rate and low-rate behaviors 
on the same chart. For example, if student A knows 2 words and learns 2 more 
that is a 100% increase. The student now knows twice as many words. Two to 
4 is a relatively large increase. Comparing student B who knows 100 words and 
learns 2 more words translates to only a 2% change. The standard celeration chart 
clearly shows student A’s behavior had doubled whereas student B grew only a 
small amount. 
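The word-learning comparison can be expressed as ratio (relative) change, which is what a ratio-scaled chart such as the SCC displays; the helper name is illustrative:

```python
# Relative change is what a ratio-scaled chart like the SCC displays.
def relative_change(before, after):
    """Multiplicative change: 2.0 means the count doubled."""
    return after / before

print(relative_change(2, 4))     # 2.0: student A doubled (a x2.0 jump on the SCC)
print(relative_change(100, 102)) # 1.02: student B grew only 2%
```

An equal-interval line graph would show both students gaining the same two words and hide this difference.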
11.2.3 Basic Chart Elements 
Figure 11.1 shows a daily per minute chart. As discussed previously, around “Suc-
cessive Calendar Days” there are several labeled blanks. The performer is the 
person whose behavior is being counted and charted; this is where the name of 
the learner/student goes. The line in-between performer and counted is provided 
for any identifying information such as age, grade, class, etc. The “counted” blank 
is for the pinpoint (+). The manager indicates who is the person working with 
the student on a frequent (daily) basis such as a teacher. Anyone who advises the 
manager and student has their name placed in the adviser blank, and anyone who 
supervises the adviser goes in the supervisor blank. If there is no adviser or supervisor, a 
“–” or NA can be placed in those spots. The timer blank is for the person 
measuring time. The counter blank is for the person who counts each instance of 
the behavior and records it. The charter blank is for the person dropping dots 
and Xs on the SCC. Sometimes the timer, counter, and charter are all the same 
person, if so, the same name can be placed in all necessary blanks. It is recom-
mended that the performer eventually chart their own behavior to become more 
involved in their own learning process. 
Across the horizontal axis are successive calendar days. Each thick vertical 
line is a Sunday. The days between the thick vertical lines are Monday through 
Saturday. The distance from one solid vertical line to another is one week. Each 
vertical blue line is called a day line. By entering in the day, month, and year at 
the top of the chart, the SCC can be synchronized for our learners. 
Horizontal lines are called counting lines. They go from 1000 at the top to 0.001 
at the bottom with 1 in the middle. Starting at 1 in the middle, when you go up, 
you can count each line up to 10 by ones. Once you get to 10 each line is counted 
in tens (i.e., 10, 20, 30, etc.). Once you reach the 100 s you count by hundreds 
(i.e., 100. 200, 300, etc.). Michael Maloney came up with a helpful rhyme for 
understanding how to count the numbers on the left-hand side, “Big numbers that 
start with 1, tell you what to count by and what to count from” (Maloney, 2013). 
We will explain why scaling the chart like this is important later in the chapter, 
but for now if you put your finger on the 1 line on the first Monday, that spot 
represents one count per minute. It will always represent a count of 1 per minute 
if data is displayed there (assuming your timing bar is at one). If you move your 
finger up to 10 on the same vertical line that is 10 per minute on that Monday. 
Moving your finger back to one and going down to 0.1 means one per 10 min 
(on the right there is a second axis with different times). We can’t have a tenth of 
a behavior, but we can have one behavior occurring every 10 min. As is shown 
in the SCC to go up we multiply by 10 and to go down we divide by 10, so that 
multiplying or dividing by a constant number gives us a constant distance. 
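The constant-distance property can be verified numerically: position on a multiply scale is proportional to log10 of the frequency, so every ×10 step spans the same chart distance. The helper name is an illustrative assumption:

```python
import math

# On the SCC's multiply scale, position is proportional to log10(frequency),
# so multiplying or dividing by a constant covers a constant chart distance.
def chart_position(frequency):
    """Vertical position (in decades) of a frequency on a ratio scale."""
    return math.log10(frequency)

step_up = chart_position(10) - chart_position(1)      # moving from 1 to 10
step_down = chart_position(1) - chart_position(0.1)   # moving from 0.1 to 1
print(step_up, step_down)  # 1.0 1.0: equal ratios, equal distances
```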
We already know some of the most important symbols that go on the SCC, 
the Acceleration dots (indicated by dots; •), and the Deceleration Xs. To show 
how long we observed the learner we place a horizontal dash (-) we call a time 
bar. Many precision teachers use one minute for the amount of time they observe 
a learner. To find where to place the time bar for a learner look to the right-
hand side of the chart and find “1’ min.” Place the time bar at one minute on 
the correct day and now we have charted the observation time. We would place 
the acceleration dot (•) and the deceleration X on the same day line but in their 
appropriate frequency lines. 
You will notice that the chart does not have a frequency line for certain numbers 
like 15. To chart 15, you would have to go between 10 and 20, find the middle 
and drop the dot (•) or X there. It can be tricky at first but with repeated practice 
and reinforcement for correct responses, the probability of correct response in the 
future can increase!
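As a numerical aside on where such in-between values land: because position on a multiply scale is proportional to log10, the exact visual midpoint between the 10 and 20 lines corresponds to their geometric mean (about 14.1), so a dot for 15 sits just above that middle. The helper below is an illustrative sketch:

```python
import math

# Position on a multiply scale is proportional to log10, so the true visual
# midpoint between the 10 and 20 counting lines is their geometric mean.
def fraction_between(value, low, high):
    """Fraction of the distance from the low line to the high line."""
    return (math.log10(value) - math.log10(low)) / (math.log10(high) - math.log10(low))

print(round(math.sqrt(10 * 20), 1))            # 14.1: the exact visual midpoint
print(round(fraction_between(15, 10, 20), 3))  # 0.585: 15 lands just above middle
```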
11.2.4 Celeration = Learning 
Frequency is achieved by counting the behavior and dividing it by time. The fre-
quency data is for one timed performance. When gathering multiple performances 
over time to measure the change in frequency, a new metric is born called celer-
ation. Celeration shows the rate of learning over time and reflects behavior 
change. Celeration = (count/time)/time. 
Taking the “ac” off acceleration and the “de” off deceleration leaves the word 
celeration. Celeration means change but does not describe which direction the 
change goes in. Once five or more dots (•) and Xs are placed on a standard cel-
eration chart we can draw a celeration line. The celeration line has a value that 
specifies how fast or slow a behavior is growing or decaying. Referring back to 
Fig. 11.1, on the right vertical axis below count per minute there is a “celeration 
fan” composed of nine lines. Celeration lines can do one of three things: go up 
(grow or increase), go down (decay or decrease), or stay the same (no change). 
Quantify the celeration line as shown in the celeration fan. Above ×1.0, the numbers 
marked with × show increases in celeration; below ×1.0, the numbers show 
decreases in celeration. Without going too deep into the details, the celeration 
lines determine how fast a student is learning and whether the methods used to teach 
the student are producing results fast enough to be considered effective. If the line 
is at ×1.4, the behavior grew 40% each week. If the line is at ÷2.0, the behavior 
decayed 50% each week. One of the first acceptable growth celerations begins at 
×1.25, for 25% weekly growth (though this is considered low by some contemporary 
precision teachers). Acceptable decay celerations begin at ÷1.25, for 25% weekly 
decay (once again, still low). If a student's performance does not meet accept-
able growth or decay celerations, stop and change instructional methods to better 
support faster (and more accurate) learning.
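For readers comfortable with a bit of code, the celeration arithmetic can be made concrete. The sketch below (Python, purely illustrative; the function name is ours, and precision teachers typically chart by hand or with charting apps) fits a celeration line by least squares on the logarithmic scale and reports the weekly multiplier:

```python
import math

def celeration(frequencies, points_per_week=7):
    """Least-squares fit of a celeration line to daily frequencies
    (count per minute), returned as a weekly multiplier:
    1.4 means x1.4 (40% weekly growth), 0.5 means /2.0 (50% decay)."""
    days = range(len(frequencies))
    logs = [math.log10(f) for f in frequencies]       # the SCC is a log chart
    n = len(frequencies)
    mean_x = sum(days) / n
    mean_y = sum(logs) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(days, logs))
             / sum((x - mean_x) ** 2 for x in days))  # log10 units per day
    return 10 ** (slope * points_per_week)            # multiplier per week

# A behavior that doubles every week:
weekly_doubling = [10 * 2 ** (d / 7) for d in range(14)]
print(round(celeration(weekly_doubling), 2))  # 2.0
```

A returned value above 1.0 sits on the × side of the fan (2.0 means ×2.0, doubling each week); a value below 1.0 would be written as a ÷ value (0.5 means ÷2.0, a 50% weekly decay).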
11.3 Precision Teaching and Education 
Precision teaching differs from many measurement methods within education in 
several ways. Some classes include a few quizzes throughout the presentation of 
the material and a large examination at the end of the class. Precision teaching 
relies on frequent data that is used to monitor student progress and assess whether 
learning is or is not occurring. A core component of precision teaching is that the 
learner knows best; just because something has been taught does not mean that 
learning has occurred. 
In general, precision teaching focuses more on practice than instruction. Once 
the material has been presented, the students practice through to fluency using 
methods such as SAFMEDS (Graf & Auman, 2005) and display their results on the 
standard celeration chart. To assess progress, frequency aims are used as the mas-
tery criterion instead of percent correct. Aims enable students to achieve beyond 
100% and remove the ceiling effect that 0–100% grading creates. Precision teach-
ing has also been found to be more cost effective than traditional teacher training.
One survey of schools implementing precision teaching found that the median cost 
of teacher training was $300 for the first year and $60 for subsequent years (Lindsley, 1991).
11.4 Implementation 
11.4.1 Pinpoint+ the Behavior 
As discussed previously in this chapter, most precision teachers use pin-
point+ instead of operational definitions. The first component of a precision 
teaching implementation is to precisely define the behavior by using the pin-
point+ formula: learning channel + movement cycle + context. One of the best 
ways to test the quality of your pinpoint+ is to show it to someone who was not 
involved in making it and see whether they can count the behavior accurately and reliably, 
or perform the correct form of the behavior themselves.
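As an illustration only (the class and field names below are ours, not precision teaching terminology), a pinpoint+ can be thought of as three composable parts:

```python
from dataclasses import dataclass

@dataclass
class PinpointPlus:
    """The pinpoint+ formula: learning channel + movement cycle + context."""
    learning_channel: str   # e.g., "see/say"
    movement_cycle: str     # e.g., "reads words aloud"
    context: str            # e.g., "from a 3rd-grade passage"

    def __str__(self):
        return f"{self.learning_channel} {self.movement_cycle} {self.context}"

pinpoint = PinpointPlus("see/say", "reads words aloud", "from a 3rd-grade passage")
print(pinpoint)  # see/say reads words aloud from a 3rd-grade passage
```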
11.4.2 Set an Aim 
Setting an aim refers to identifying the final targeted level of performance. This is 
typically a high frequency of correct responses and a zero frequency of incorrect 
responses. As a reminder, in precision teaching frequency refers to the count of 
a learner’s behavior (i.e., targeted response) over time. The aim should describe 
fluent behavior; behavior that is occurring quickly, accurately, and naturally (Lind-
sley, 1990, 1992). Determining an aim may be done by assessing the frequency of 
an individual who is currently completing the skill at an optimal level. For example, 
a teacher might select their class's best reader, measure how many corrects 
(and incorrects) that student gets in a minute, and set that as the aim. This norm-referenced 
criterion has historically been the method of setting an aim. Note that the 
aim is drawn not from the average of the group but from a sampling of fluent performers.
More recently, a balance of retention, endurance, and application have been stud-
ied to determine the optimal frequency across specific classes of behavior (Binder, 
1996). 
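As a minimal sketch of the norm-referenced approach described above (the function name and the numbers are invented for illustration), an aim could be taken from the middle of a small sample of fluent performers rather than from the whole group's average:

```python
import statistics

def norm_referenced_aim(fluent_corrects_per_min):
    """Set an aim from a sampling of fluent performers (not the whole
    group's average): the median corrects per minute of that sample."""
    return statistics.median(fluent_corrects_per_min)

# Corrects per minute from three students already judged fluent:
print(norm_referenced_aim([92, 105, 98]))  # 98
```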
11.4.3 Count and Teach 
Instructors and students display their time series data on the SCC to show data 
and tell a story of what is happening. The behavior being counted is the behavior 
being targeted for increase or decrease (i.e., correct and incorrect responses). If 
the targeted behavior is one of saying or doing, the teacher may simply observe the 
learner and tally each time the behavior occurs. If the task being targeted produces 
a permanent product (e.g., a math test), the teacher may review that product (the 
responses on the test) at the end of a set time period.
One instructional strategy specific to precision teaching is referred to as 
SAFMEDS: Say All Fast, Minute Every Day, Shuffled (Graf & Auman, 2005). 
SAFMEDS is a fluency-building procedure. To implement SAFMEDS, flash cards 
are created with a word or image on one side and the target response on the 
other. Once all cards are created, they are shuffled, and the performer sees one 
side and tries to say the response on the other side, working through the deck for 
one minute. The cards answered correctly and incorrectly are totaled at the end of the minute and the 
frequency is charted. This is repeated across days to demonstrate improvement in 
fluency with the material.
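The daily SAFMEDS timing can be sketched as a short simulation (Python; the card contents, function names, and the fixed seconds-per-card pacing are our illustrative assumptions, not part of the published procedure):

```python
import random

def safmeds_timing(deck, answer, seconds=60, sec_per_card=3):
    """One simulated SAFMEDS timing: shuffle, present card fronts,
    tally corrects and incorrects, return per-minute frequencies."""
    random.shuffle(deck)              # the final "S": shuffled each timing
    corrects = incorrects = 0
    elapsed = 0
    for front, back in deck:
        if elapsed + sec_per_card > seconds:
            break                     # the one-minute timing is up
        elapsed += sec_per_card
        if answer(front) == back:
            corrects += 1
        else:
            incorrects += 1
    minutes = seconds / 60
    return corrects / minutes, incorrects / minutes  # chartable frequencies

# Illustrative deck of term/definition cards, and a perfect performer:
deck = [("frequency", "count/time"),
        ("celeration", "(count/time)/time"),
        ("aim", "target frequency")]
key = dict(deck)
print(safmeds_timing(list(deck), key.get))  # (3.0, 0.0)
```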
11.4.4 Develop a Learning Picture 
The learning picture is developed through graphing correct and incorrect responses 
on the standard celeration chart. The chart is then inspected to determine if any 
trends are occurring. Specifically, changes in frequency per minute across correct 
and incorrect responses should be analyzed. See The Precision Teaching Book 
(Kubina & Yurich, 2012) for visual examples of learning pictures.
11.5 Applications of Precision Teaching 
Precision teaching has been used within programs for a variety of individuals 
across demographics: children who are gifted, children who have been diag-
nosed with learning or attention deficits, undergraduate students in college classes, 
employees of a national bank, individuals diagnosed with depression, medical 
students learning to interpret electrocardiograms (Rabbitt et al., 2020), and individuals testing 
below grade level. Because precision teaching is not an instructional strategy, 
but a method of measurement and decision making, it overlaps well with a variety 
of curricula and instructional strategies. Precision teaching has also been 
combined with Direct Instruction (see Chap. 9; Johnson & Layng, 1992, 1994), 
Personalized System of Instruction (see Chap. 12; McDade & Olander, 1987), and 
the Morningside Model (see Chap. 14). The Center for Personalized Instruction 
at Jacksonville State University developed a Computer-Based Precision Teaching 
Learning System which was based on the precision teaching goal of develop-
ing fluency through reinforcing high frequency behaviors (McDade, 1992). At 
the college level, precision teaching has been incorporated into courses includ-
ing anthropology, archeology, biology, geography, history, mathematics, political 
science, and psychology (McDade & Olander, 1987). Outside of the classroom, 
Binder and Bloom (1989) used precision teaching to improve the fluency of 
bank employees. They identified a clear business advantage through the acqui-
sition of employee skills of reciting facts about products and services. Patterson 
and McDowell (2015) used precision teaching to increase self-management skills
and decrease behaviors associated with depression. All nine participants demon-
strated an increase in self-management behaviors and a decrease in negative inner 
behaviors associated with depression. 

Fig. 11.4 Daily frequency counting data sheet. Kubina and Yurich (2012) 
Some figures (Figs. 11.4, 11.5, and 11.6) from Kubina and Yurich (2012) are 
provided after the conclusion of this chapter to help with data collection prior to 
charting on the SCC.
11.6 Summary/Conclusion 
This chapter aimed to introduce precision teaching and the standard celeration 
chart. Put simply, precision teaching is a way to measure the quality of instruc-
tion using the standard celeration chart, and to make rapid data-based decisions. 
Precision teachers have identified four guiding principles: (1) the learner knows 
best; (2) only use observable behavior; (3) measure behavior using frequency; and 
(4) chart data on the Standard Celeration Chart (Kubina, 2012). Precisionteaching 
can be applied to almost any instructional program and shows the rate with which 
learning occurs. There are a plethora of videos, articles, and books to help begin-
ner precision teachers and even apps that will drop dots (•) and Xs for the user! 
While the standard celeration chart can seem scary and difficult to understand,
once most people "get it" they tend to remain precision teachers for life using the 
motto, "have the heart to chart!" 

Fig. 11.5 Observation session frequency counting data sheet (cross-off version). Kubina and 
Yurich (2012)
11.7 How to Learn More 
There is a large community of precision teachers who work hard to provide a 
wide range of resources. The Standard Celeration Society has a website and 
holds conferences centered on Precision Teaching. Searching "precision teaching" 
or “heart the chart” on YouTube will yield many results that can help in under-
standing the chart. There are podcasts such as “The ABA and PT Podcast” as 
well as “ABA on Call” which devote hours of time describing precision teaching 
strategies, outcomes, and information. There are also centers such as Fit Learning 
who use precision teaching to develop skills in math, reading, writing, and logic. 
Finally, there are quite a few books on precision teaching such as Handbook of The 
Standard Celeration Chart, The Precision Teaching Book, The Precision Teaching 
Implementation Manual, Reflections on Precision Teaching, and Weaving Love and 
Science into Educational Practice.
Fig. 11.6 Weekly frequency counting data sheet. Kubina and Yurich (2012)
References 
Binder, C. (1988). Precision teaching: Measuring and attaining exemplary academic achievement. 
Youth Policy, 10(7), 12–15. 
Binder, C. (1996). Behavioral fluency: Evolution of a new paradigm. The Behavior Analyst, 19, 
163–197. 
Binder, C., & Bloom, C. (1989). Fluent product knowledge: Application in the financial services 
industry. Performance and Instruction, 28(2), 17–21. 
Binder, C., & Haughton, E. (2002). Using learning channels and the learning channel matrix. Paper 
presented at the International Precision Teaching Conference, Harrisburg, PA. 
Beck, R., & Clement, R. (1991). The great falls precision teaching project: A historical examina-
tion. Journal of Precision Teaching, 8(2), 8–12. 
Evans, A. L., Bulla, A. J., & Kieta, A. R. (2021). The precision teaching system: A synthesized 
definition, concept analysis, and process. Behavior Analysis in Practice, 14, 559–576. https://doi.org/10.1007/s40617-020-00502-2 
Graf, S. A., & Auman, J. (2005). SAFMEDS: A tool to build fluency. Graf Implements. 
Johnson, K. R., & Layng, T. V. J. (1994). The Morningside Model of generative instruction. In 
R. Gardner III, D. M. Sainato, J. O. Cooper, T. E. Heron, W. L. Heward, J. W. Eshleman, & 
T. A. Grossi (Eds.), Behavior analysis in education: Focus on measurably superior instruction 
(pp. 173–197). Brooks/Cole. 
Johnson, K. R., & Layng, T. V. J. (1992). Breaking the structuralist barrier: Literacy and numeracy 
with fluency. American Psychologist, 47, 1475–1490. 
Kubina, R. M., Jr., & Yurich, K. K. L. (2012). The precision teaching book. Greatness Achieved 
Publishing Group.
Kubina, R. M. (2019). The precision teaching implementation manual. Greatness Achieved Pub-
lishing Company. 
Lin, F., & Kubina, R. M. (2004). Learning channels and verbal behavior. The Behavior Analyst 
Today, 5, 1–14. 
Lindsley, O. R. (1990). Precision teaching: By teachers for children. Exceptional Children, 22(3), 
10–15. https://doi.org/10.1177/004005999002200302 
Lindsley, O. R. (1991). Precision Teaching’s unique legacy from B. F. Skinner. Journal of Behav-
ioral Education, 1, 253–266. 
Lindsley, O. R. (1992). Precision teaching: Discoveries and effects. Journal of Applied Behavior 
Analysis, 25, 51–57. 
Maloney, M. (2013, July 4). The Maloney method precision teaching series—Part 3 of 5: Common 
conventions of using the standard Celeration Chart. The Maloney Method. Retrieved Octo-
ber 29, 2022, from https://www.maloneymethod.com/resources/precision-teaching-series-part-
3-of-5-common-conventions-of-using-the-standard-celeration-chart/ 
McDade, C. E. (1992). Computer-based precision learning: A course builder application. Behavior 
Research Methods, Instruments, & Computers, 24, 269–272. 
McDade, C. E., & Olander, C. P. (1987). Precision management of instructional technology: A 
program update. Educational Technology, 27, 44–46. 
McGreevy, P. (1983). Teaching and learning in plain English. Plain English Publications (Univer-
sity of Missouri). 
Patterson, K., & McDowell, C. (2015). Using precision teaching strategies to promote self-
management of inner behaviors and measuring effects on the symptoms of depression. Euro-
pean Journal of Behavior Analysis, 10(2), 283–295. https://doi.org/10.1080/15021149.2009.11434326 
Rabbitt, L., Byrne, D., O’Connor, P., Gorecka, M., Jacobsen, A., & Lydon, S. (2020). A pragmatic 
randomised controlled trial of SAFMEDS to produce fluency in interpretation of electrocardio-
grams. BMC Medical Education, 20, 1–9. https://doi.org/10.1186/s12909-020-02021-8 
West. R. P., & Young, K. R. (1992). Precision teaching. In R. P. West & L. A. Hamerlynck (Eds.), 
Designs for excellence in education: The legacy of B. F. Skinner (pp. 113–146). Sopris West, 
Inc. 
White, O. R. (1986). Precision teaching-precision learning. Exceptional Children, 52(6), 522–534.
12 TAGteach 
Robin Arnall 
12.1 Overview 
If you have ever experienced frustration teaching a motor skill-based movement or 
couldn’t figure out why something that works in a typical seated classroom won’t 
work in a setting like a gym class, you aren’t alone. In a seated classroom setting, 
you may not have the opportunity to give individualized praise or feedback to 
multiple students at once. You may also have difficulty addressing specific steps 
of motor movements, such as foundations for athletic skills, life skills, writing, 
etc. To be more specific, a student may struggle with one step of a skill that may 
require targeted teaching. In addition, what works in a typical classroom setting 
may not work as efficiently in other types of classes such as physical education 
classes, when coaching sports, and other health related classes that require complex 
motor skills. 
TAG™, or TAGteach™, is an acronym for Teaching with Acoustical Guidance™, 
a method that pairs a familiar sound with a targeted response (a behavior). TAGteach 
is an effective way to provide quick, immediate feedback in the situations 
mentioned above. Some equate TAGteach with clicker training, but it 
involves more complex teaching strategies. Clicker training simply uses a clicker 
in place of, or along with, an item that serves as a reinforcer (such as a token or 
praise), while TAGteach uses additional proactive strategies. TAGteach is a method 
of instruction that is trademarked and trained through TAGteach International, the 
company which created the principles outlined in this chapter. Since there are 
additional strategies involved with the use of TAGteach, it is recommended that 
teachers who want to pursue training in TAGteach either attend a training seminar 
or go through an online training course through https://www.tagteach.com/. 
Anyone seeking training can choose varying levels of certification. 

R. Arnall (B) 
Russell Sage College, Albany, NY, USA 
e-mail: rarnall@thechicagoschool.edu 

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_12 
TAGteach has its own language for several concepts that may be familiar from teaching. 
It is imperative that these terms are used appropriately when defining TAGteach procedures, 
as they are specific to the organization. For instance, a tagger 
is anything that creates an audible cue or sound to signal to the student that a correct 
response has been made (generally a handheld clicker, but it can be anything that 
creates a recognizable sound like a tap, beep, horn, etc.), and a tag™ is the sound 
that a tagger creates. Another example is a concept that will be used throughout 
this chapter, the tag triangle™, which includes the main instructional components 
used in TAGteach; identifying, tagging, and reinforcing. This chapter will provide 
a general overview of TAGteach methods (including instructional delivery, break-
ing down teaching skills, and rewarding the movements) and how this teaching 
strategy can be incorporated into educational goals and outcomes. 
12.2 Implementation 
TAGteach is based on principles demonstrated through research studies conducted 
in Applied Behavior Analysis (Luiselli & Reed, 2015). These studies span 
different skills and provide a solid foundation for motor skill 
development. The concept of reinforcement is used heavily in TAGteach and 
is already familiar in many classrooms. Reinforcement involves an action or conse-
quence delivered after a behavior that makes the behavior occur more often. For 
example, reinforcement could occur when you praise a student in class for writing 
their name correctly or answering a question correctly. Reinforcement can take 
on various forms, but what if you aren’t able to easily use reinforcers or tokens 
because the targeted behavior is too quick or complex? If you are working on 
an athletic skill such as a high jump, you can’t necessarily interrupt that behavior 
safely or easily. In addition, what if you don’t want to interrupt the positive behav-
ior? If a student is taking a test, for instance, you may want to praise the student for 
working diligently, but you also don’t want to interrupt them and possibly distract 
them. TAGteach takes the concept of reinforcement and pairs the sound 
of a tagger with a reinforcer, such as a token, then fades that additional reinforcer 
and uses only the tagger for steps completed correctly. Praise can be paired with 
the tagger as well, and when used with TAGteach, it could be easier to provide 
precise feedback to specific behavior as it occurs rather than using vocal praise 
alone. 
Shaping is another concept demonstrated through ABA research, which is 
highly effective at teaching new skills. Shaping involves reinforcing approximate 
responses to a complex response until the desired outcome is achieved or reward-
ing small steps to a larger behavior (Skinner, 1975). In TAGteach, shaping is done 
using modified task analysis, which involves breaking steps of a complex response 
into several easier to follow and watch movements. The steps that you identify to
teach are called tag points. Tag points™ are used in TAGteach interventions and 
should meet WOOF criteria. WOOF™ stands for: What you want, One 
criterion, Observable and measurable, and Five words or less. For instance, the tag 
point "extend your arm in a straight line directly above your head" would not 
meet WOOF criteria, but "raise arm above head" would. Making sure your 
tag points meet the WOOF criteria should make teaching easier for both teachers 
and students by keeping the steps clear, direct, and simple.
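Only the final clause of WOOF, five words or less, can be checked mechanically; the other clauses require human judgment. A minimal sketch (the function name is ours, not TAGteach terminology):

```python
def within_woof_length(tag_point, max_words=5):
    """Check only the 'Five words or less' clause of WOOF; the other
    clauses (what you want, one criterion, observable) need a person."""
    return len(tag_point.split()) <= max_words

print(within_woof_length("raise arm above head"))  # True
print(within_woof_length(
    "extend your arm in a straight line directly above your head"))  # False
```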
When starting the teaching process with TAGteach, specific tag points are iso-
lated and taught. As the student learns one tag point, the teacher would then start 
teaching other tag points until the skill is learned. It is important to note that 
TAGteach also involves the use of a “three try rule”, which gives the student up 
to three tries at a tag point. If the student does not demonstrate a tag point by the 
third try, the teacher is encouraged to create a different tag point that the student 
may learn more easily. This would generally involve breaking the tag point further 
down into additional tag points, using smaller movement increments. For example, 
if the original tag point of “bend waist and square shoulders” is not easy for the 
student, the teacher may break down the original tag point into two new tag points 
and work on them separately, such as “bend waist” and “square shoulders”. 
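The three-try rule and tag-point breakdown just described can be sketched as a simple teaching loop (Python; `attempt`, `break_down`, and the recursion are our illustrative framing, not TAGteach-specified procedures):

```python
def teach_tag_point(tag_point, attempt, break_down, max_tries=3):
    """Three-try rule: give up to three tries at a tag point; if none
    succeeds, break it into smaller tag points and teach those instead.
    (This sketch assumes break_down eventually yields doable points.)"""
    for _ in range(max_tries):
        if attempt(tag_point):       # correct response: deliver the tag
            return [tag_point]
    mastered = []
    for smaller in break_down(tag_point):
        mastered += teach_tag_point(smaller, attempt, break_down, max_tries)
    return mastered

# Illustration: the combined movement fails, but its components succeed.
can_do = {"bend waist", "square shoulders"}
result = teach_tag_point(
    "bend waist and square shoulders",
    attempt=lambda tp: tp in can_do,
    break_down=lambda tp: ["bend waist", "square shoulders"],
)
print(result)  # ['bend waist', 'square shoulders']
```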
The way feedback is given is also very important in TAGteach. Generally, for 
feedback to be the most effective, it must be given immediately after the behavior 
occurs (Anderson & Kirkpatrick, 2002). This can be especially difficult when the 
behavior occurs in the middle of a complex and multiple component movement, 
such as those targeted for sports skills. For example, if you are teaching a student 
how to kick a ball correctly, stopping the student as the foot is making contact 
with the ball or rising would be difficult and possibly result in frustration by both 
the teacher and student. Feedback should also be consistent; when it is not, the 
student does not know what is expected of them. TAGteach is helpful in addressing these 
concerns. When using TAGteach, the sound from a tagger is delivered as soon as 
the tag point is demonstrated, so the student knows immediately whether the response was correct. 
For example, if the tag point is "foot touches lower middle" to make contact 
with the ball in a specific spot, as soon as the teacher sees the correct contact of 
the foot to the ball, the tagger should be used within 1–2 seconds to signal to the 
student they demonstrated the tag point correctly. 
Positive phrasing and providing feedback on appropriate behavior versus inap-
propriate behavior is another skill used commonly with teaching (Eckert et al., 
2006). Instead of telling students what not to do, it is recommended to tell them 
what to do in a succinct manner using precise language. This increases their 
chances of success and the likelihood that they will engage in the appropriate behavior in the 
future. For example, if a student is running in the hallway, you may say, "I'd like 
to see walking feet.” instead of “stop running.” TAGteach uses a concept called 
the Focus Funnel™, which involves taking a large amount of instructional infor-
mation and condensing it into smaller components, also helping with the creation 
of tag points. Directions should be given to students in a positive manner and be 
concise. For example, instead of saying, “You are going to write your name, which
will involve holding your pencil using a pincher grasp, moving the pencil appro-
priately to form the letters of your name, and then putting the pencil down when 
you’re done,” a teacher would say, “we are going to start by ‘picking up pencil,’ 
and when you do this, you get a tag (click).” No more information should be given 
than is needed for the student to understand the direction(s) and tag point(s). 
12.3 Application 
12.3.1 Materials Needed 
TAGteach is simple to use for teaching and requires few materials. The teacher 
should be trained and confident in the procedures prior to using them. You can 
train through TAGteach International or be supervised by a certified TAGteacher 
or someone familiar with the method. Teachers can either use TAGteach for individual students or 
individual tag points as the sole person using TAGteach, orfor peer pairings (i.e., 
peers tagging tag points called out by the teacher for other peers), which can be 
an effective method of teaching when working with a larger group. You need a 
tagger: anything that makes a consistent sound. This is generally a handheld 
clicker, a small plastic device with a metal interior that, when pressed, makes 
an audible "click" sound. It can also be any device or movement that creates a 
consistent audible cue, like tapping a table, using a megaphone beep, etc. In a 
classroom setting or when you are the only person using TAGteach, only one 
tagger is needed (as you could just isolate teaching to one student, for one step of 
a skill and rotate “calling out” to different students, etc.). If you want to have peers 
tag each other in a group session, you will need taggers for each pair of students 
participating. You also need tag points for the skill you want to teach, which could 
be identified while teaching in the moment or created beforehand. 
TAGteach, while most heavily supported by research in sports, fitness, and life 
skills for varied students, can be used in any type of classroom 
or educational setting. For instance, in a general education classroom, TAGteach 
could be used for STEM activities that require more motor movements and applied 
practices, reading, writing, math, and any other activity that requires a specific 
action or movement. TAGteach could also be used in life skill-based classrooms 
to support neurodivergent students with necessary skills to develop autonomy such 
as cooking, safety, vocational, and other relevant skills. In specialized classes such 
as physical education, art, music, etc., TAGteach could be used to target specific 
responses such as acquiring the correct way to perform a sports skill, like throwing 
a ball, or even painting a specific design. In all these settings and across skills, 
the way TAGteach is delivered can vary either with the teacher implementing 
TAGteach with groups of students or to individual students, or with peer pairings 
or groupings where the students would tag each other’s behavior.
12.3.2 Procedures 
For teachers who are new to TAGteach, the Focus Funnel concept is generally the 
most difficult to demonstrate. In lieu of providing excessive amounts of informa-
tion, such as, “this is a keyboard and mouse. You use them to do various things, 
such as navigating through a computer system. You can use the mouse to click 
on different icons using a cursor, and use the keyboard to type out various words, 
symbols, and numbers. Also, on the keyboard you have various function keys that 
can do different things within the computer's menu," a teacher using TAGteach 
would condense the information to "we are going to learn how to type your name 
using the letters J, o, h, and n on the computer keyboard. Your first tag point is….” 
The following is an example of a task analysis sheet with tag points that meet 
WOOF criteria (as defined above) for writing the capital letter “A.”
• Pick up pencil
• Hold in right (or left) hand
• Fingertips to point (pincher grasp)
• Pencil touches paper
• Line down on left
• Line down on right
• Line connects in middle
• Lift pencil from paper
• Put pencil down. 
When first learning to tag correct responding, timing is important. The tag (click 
or sound) must be delivered when the response is demonstrated and immediately 
following the response, as mentioned previously. Common errors include allowing 
seconds to lapse between the response and the delivery of the tag or even tagging 
an incorrect response. The following is an example of a teacher teaching a novel 
skill (using a stove) to a student using TAGteach in a life skills classroom setting.
• The teacher provides coaching, such as, “we are going to learn how to use a 
stove today. When I tell you a tag point and you show me correctly, you will 
hear this (click tagger). If you don’t show me correctly, I will give you two 
more chances to try again before we will try another step.”
• Teacher coaches, “dial to medium/high.” Student touches dial but does not turn 
it.
• Teacher provides the cue two more times with no success. Due to the three-try 
rule, the teacher modifies the tag point.
• Teacher coaches, “hand on dial.”
• Student touches the dial.
• Teacher provides a tag. This should happen as the student finishes touching the 
dial, completing the tag point.
• Teacher coaches, “turn the dial.”
• Student turns the dial.
• Teacher provides an immediate tag and continues to other tag points. 
If you have a large group, such as coaching for a sport or teaching a group a 
difficult skill that requires motor movements, you may want to use peer pairings 
with TAGteach to be more efficient with similar results. This could work by setting 
up pairs of students together and giving them brief instructions on when the tag 
should occur. You would then tell them a tag point and have them watch each other 
to see if the tag point happens. This makes it easier to move among students and monitor 
how they are doing. Using peer pairings also avoids singling out a student in front 
of a large group, which some students may not prefer. A peer pairing session 
could look like:
• Teacher pairs students up into partners
• Teacher gives instruction “one student will have the tagger, while the other 
practices jumping jacks. If you have the tagger, I will tell you what your partner 
needs to do. If they do it, you press the clicker as soon as you see it. If you are 
doing jumping jacks, you will listen for your partner to click the tagger. If you 
hear it, keep doing what you are doing. If you don’t, practice the step by itself 
or raise your hand for help.”
• Teacher gives tag point “first look for ‘legs make a mountain’” (i.e., legs spread 
about hip width apart when jumping into position)
• Students doing jumping jacks start practicing, while tagging students watch and 
press the tagger when they see the tag point
• Students needing help raise their hands, teacher demonstrates or helps students 
practice tag point as needed. 
12.4 Summary 
Most of the research studies involving the use of TAGteach teach sports skills, 
making TAGteach ideal for physical education classes or coaching sports. How-
ever, there is growing research for autonomy skills and other types of necessary 
skills such as independent living skills, self-help skills, etc. Luiselli and Reed 
(2015) highlighted several types of sports studies in a comprehensive literature 
review. Specific sports skills taught in the studies include football, martial arts, 
soccer, tennis, basketball, swimming, baseball, track, gymnastics, speed skating, 
dance, and golf skills. Harrison and Pyles (2013) used TAGteach to teach football 
tackling to high school athletes. Brief megaphone beeps served as the tag. After 
participating in the study, students demonstrated more proficiency in tackling. 
Another example includes dance skills. Quinn et al. (2015) 
taught novel dance movements (a turn, kick, and leap) to competitive dancers 
using TAGteach. By the end of the study, the students not only demonstrated pro-
ficiency of the targeted movements but reported enjoying using TAGteach along
with having better dance skills. If using TAGteach for coaching for physical educa-
tion, you could use several approaches. If you don’t have the resources to conduct 
a full class for one student and one skill, you could use a tagger and isolate a 
tag point in the moment while working with either a single student or a group of 
students. You could also give taggers and specific tag points to a group of students 
and teach them how to tag for each other, monitoring the teaching and providing 
additional feedback as needed. 
There is a growing body of literature to suggest that TAGteach can be a use-
ful teaching tool in many educational areas, such as medical training. For instance, Levy et al. (2016) taught surgical skills to medical students using TAGteach. When 
compared to a control group, the intervention group (taught the skills through 
TAGteach) demonstrated all skills targeted in the procedure accurately, while 
the control group demonstrated less accuracy. Even though the students taught 
using TAGteach did not perform the skills more quickly, this study demonstrated 
that TAGteach can improve the accuracy and precision of taught skills. 
Since TAGteach can supplement teaching for varied skills, this would be a 
method of teaching that would be appropriate to use in conjunction with fluency-
based training programs, such as precision teaching (featured in another chapter). 
After a skill has been taught, fluency might be the next appropriate educational 
goal. Some examples could involve writing numbers in a quick and efficient man-
ner, doing a set number of sit ups in a specified amount of time, etc. TAGteach 
could be a first step in teaching a new skill to a student, then precision teaching 
could be a second step to ensure fluency and maintenance of the skill once taught. 
When working with dysregulated or challenging behavior, tokens or other 
types of reinforcers may not be available, consistent, or feasible for a teacher 
to use, particularly in a classroom. TAGteach could instead be used as a signal 
of approval for students as part of a behavior plan. Students may prefer this, 
as it can be discreet yet still effective. For some, there may be a stigma associ-
ated with the use of a hand clicker for humans (as many associate hand clickers 
with animal training); however, students, teachers, and researchers have reported 
preferring TAGteach over other teaching methods for particular skills (Ennett 
et al., 2019; Quinn et al., 2015). Ultimately, student preference should guide the 
choice among different types of feedback, including TAGteach methods. 
Since research using TAGteach methods is still growing beyond sport skills, 
considering other types of skills (e.g., life skills, novel academic skills) is encour-
aged. Using TAGteach with any complex skill could result in positive outcomes. 
TAGteach is just one of many effective teaching methods, and it can easily be 
embedded into a broader program. When appropriate, it is encouraged to combine 
TAGteach with other methods discussed in additional chapters (e.g., precision 
teaching, differential reinforcement, CABAS), rather than use it only as a 
standalone intervention.
144 R. Arnall
References 
Anderson, G., & Kirkpatrick, M. (2002). Variable effects of a behavioral treatment package on the 
performance of inline roller speed skaters. Journal of Applied Behavior Analysis, 35(2), 195– 
198. https://doi.org/10.1901/jaba.2002.35-195 
Eckert, T., Dunn, E., & Ardoin, S. (2006). The effects of alternate forms of performance feedback 
on elementary-aged students’ oral reading fluency. Journal of Behavioral Education, 15(3), 
148–161. https://doi.org/10.1007/s10864-006-9018-6 
Ennett, T. M., Zonneveld, K. L. M., Thomson, K. M., Vause, T., & Ditor, D. (2019). Comparison 
of two TAGteach error-correction procedures to teach beginner yoga poses to adults. Journal 
of Applied Behavior Analysis, 53(1), 222–236. https://doi.org/10.1002/jaba.550 
Harrison, A., & Pyles, D. (2013). The effects of verbal instruction and shaping to improve tackling 
by high school football players. Journal of Applied Behavior Analysis, 46(2), 518–522. 
https://doi.org/10.1002/jaba.36 
Levy, I. M., Pryor, K. W., & McKeon, T. R. (2016). Is teaching simple surgical skills using an 
operant learning program more effective than teaching by demonstration? Clinical Orthopedics 
and Related Research, 474(4), 945–955. https://doi.org/10.1007/s11999-015-4555-8 
Luiselli, J., & Reed, D. (2015). Applied behavior analysis and sports performance. In H. S. 
Roane, J. L. Ringdahl, & T. S. Falcomata (Eds.), Clinical and organizational applications of 
applied behavior analysis (pp. 523–553). https://psycnet.apa.org/doi/10.1016/B978-0-12-420249-8.00021-6 
Quinn, M., Miltenberger, R., & Fogel, V. (2015). Using TAGTeach to improve the proficiency of 
dance movements. Journal of Applied Behavior Analysis, 48(1), 11–24. https://doi.org/10.1002/ 
jaba.191 
Skinner, B. (1975). The shaping of phylogenic behavior. Journal of the Experimental Analysis of 
Behavior, 24(1), 117–120. https://doi.org/10.1901/jeab.1975.24-117
13 Personalized System of Instruction 
Jennifer Quigley 
Personalized System of Instruction (PSI), also known as The Keller Plan, is an 
instructional program created in the 1970s. PSI is based on the theory that learning 
is most effective when it occurs gradually, feedback is delivered rapidly and con-
sistently, and reinforcement is immediate (Eyre, 2007). Since PSI’s development, 
it has been implemented in general education, special education, and advanced 
education settings with success. After an initial increase in usage through the 
1980s, the number of active PSI programs declined until the advent of modern 
technology and the incorporation of personalized feedback across applications 
and programming (Hammerschmidt-Snidarich et al., 2019). Over the last two 
decades, personalized systems of instruction have gained in popularity and use 
through the rise in interactive applications and computer programs. The U.S. 
Department of Education’s National Education Technology Plan Update states that, 
“When carefully designed and thoughtfully applied, technology can accelerate, 
amplify, and expand the impact of effective teaching practices" (U.S. Department 
of Education, 2017, p. 5). 
13.1 Overview 
PSI is an individualized, personalized instructional method. Students create their 
own goals, manage their learning, progress at their own pace, and support and com-
municate with others throughout the process (Eyre, 2007). PSI specifically refers to 
the delivery method of instruction; it can be used across various subjects and con-
tent areas. Within PSI, teachers serve as the facilitators; rather than being depended
J. Quigley (B) 
The Chicago School, Chicago, IL, USA 
e-mail: jquigley1@thechicagoschool.edu 
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 
J. Quigley et al. (eds.), Incorporating Applied Behavior Analysis into the General 
Education Classroom, Springer Texts in Education, 
https://doi.org/10.1007/978-3-031-35825-8_13 
on to provide the knowledge, teachers support students in accessing the content, 
providing feedback, and adding supportive resources to supplement the instruc-
tion (Eyre, 2007). Along with many of the other teaching programs and strategies 
included within this book, individualization of programming, dense schedules of 
reinforcement, and specific feedback are key components of PSI. Though PSI can 
stand alone and be implemented independently, it has also been incorporated into 
other teaching systems such as the Morningside Model of Generative Instruction 
(see Chap. 15) and Comprehensive Application of Behavior Analysis to Schooling 
(CABAS®; see Chap. 16). 
Individualized instruction may seem like the ideal rather than a potential reality 
in the general education setting, but PSI has been shown to be effectively imple-
mented across general education classrooms with the support of administration 
and technology access. Compared with the concepts of individualized education and 
differentiated instruction, the one-size-fits-all practice within the current educa-
tional system can feel limiting. PSI differs from traditional classroom lecture in 
that it enables students to progress at their own pace while all students demonstrate 
mastery across all class material. In addition, with technological advances, mate-
rials are more efficiently adapted and personalized. PSI is data-driven and student 
focused. It can be implemented manually or electronically. Programming includes 
concise, specific teaching objectives within specified learning modules and focuses 
on written study guides over vocal lectures (Akera, 2017). 
13.1.1 Key Components of PSI 
Many of the components of the PSI program will sound familiar, as they closely 
align with what already happens in many classrooms: written materials are 
presented, and quizzes demonstrate mastery. The key differences are the 
preparation involved in setting up the class before it begins and the 
emphasis on self-pacing. To break it down further, we will discuss the five main 
components of PSI shown in Table 13.1. 
Table 13.1 Five components of PSI 

Written communication: Study guides with study questions 
Unit quizzes with mastery criterion: Quiz requiring 90% mastery 
Student pacing: Proceeding through material at an individual pace 
Use of proctors: Student from a higher grade assists in the classroom (peer mentor) 
Lectures as motivation: Reinforcing videos/materials that add to the content 

Note Each component presented with application examples (Eyre, 2007; Johnson, 2008)
13.1.1.1 Written Communication 
PSI emphasizes the importance of written communication between the teacher 
and students. This can include a course syllabus or manual (see Fig. 13.1), written 
study guides (see Fig. 13.2), and supplemental materials. Most of the teacher-
student communication of unit material is completed through written materials 
with clear expectations. The role of the study guide within PSI is to provide the 
student with a clear outline of what is to be learned and what students need to 
do to learn the material (Grant & Spencer, 2003). The study guide may include 
teacher notes to highlight certain lessons, provide practice problems to prepare 
for the readiness test, and/or include a supplementary reading list. Other written 
materials within PSI include the course manual and course objectives. The course 
manual outlines all requirements of the course and the PSI model. Course objec-
tives clearly state what the student is expected to learn upon completion of the 
course via each sequenced unit.
To prepare for a PSI course, the teacher would divide the content of the class 
into modules or units. These can follow directly from assigned chapters or material 
already divided into segments in a typical general education classroom. Within 
each unit, students then are assigned material (the syllabus or manual of the class 
and study guide) that they work through. There is then an assigned criterion or 
knowledge-based quiz to complete at the end prior to moving on to the next mod-
ule. For example, if the module is on reading an assigned chapter from a book, 
the objectives may include (1) identifying three attributes of a lead character, (2) 
summarizing the main idea and three supporting details from the chapter, and (3) 
stating at least one event to come based on foreshadowing. The study guide may 
include fill-in-the-blank excerpts from the chapter to supplement the reading and 
prepare the students for the unit quiz. Then, the unit quiz directly relates to the 
learning objectives identified for the unit. Once the student has reached mastery on 
the unit quiz, they would move on to the next chapter with its own assigned work, 
study guide, and quiz. In between each chapter, supplemental material would be 
available to add on to the content provided within each unit. 
13.1.1.2 Unit Quizzes with Mastery Criterion 
Each learning module within PSI is broken into small units of material that are pre-
sented in a set sequence. The student must demonstrate mastery on each unit prior 
to moving on to the next one. Mastery is defined for each unit such as completing 
a quiz with 90% correct answers (see Table 13.2), writing an essay on the assigned 
topic, or completing a sports exercise with 100% accuracy. For example, they must 
be able to show competency in Chap. 1 before moving on to Chap. 2. Teachers can 
assess if a student understands concepts by testing them with quizzes, asking for 
a physical demonstration, or by reviewing essays. During evaluation, feedback is 
given immediately; correct responses are praised, and incorrect responses receive 
corrective feedback. Students can test out of the material via written submission 
or physical demonstration. For physical demonstration (within physical education 
classes as an example), students could record themselves completing the assigned 
action and submit the video as the assignment. The teacher or student proctors
PSI Course Overview 
Introduction: Welcome to your course! This virtual classroom has everything you need to be 
successful in this course. In this class you will learn how to x, respond to y, and demonstrate 
xyz. As your instructor, I will play the role of supporting you and providing feedback 
throughout this course, but the learning will come from the materials within this course. Your 
xyz course is being taught using the personalized system of instruction (PSI). Unlike what you 
have become accustomed to in other classes, this class allows you to learn at your own pace. 
The pace of progress is set by you; you can proceed as quickly as you can while demonstrating 
success. 
PSI courses are achievement-based. This means that you are expected to master the content of 
the course, attend class regularly, follow the policies/procedures, and complete the 
assignments at your pace. 
Student expectations: Your role in this class is to work through the virtual classroom and 
complete all related activities. You will not need to depend on me to work on the coursework. 
The class is set up for you to work through at your own pace. 
Teacher expectations: As your teacher, my role is in supporting you via the created 
coursework, supporting your proctors in providing feedback and grades to you, and providing 
any additional individual attention needed. This can be seen as a facilitator role in contrast to a 
traditional teacher role. 
Assessment: 
1. Performance Skills: Your performance skills will be measured via roleplay activities 
incorporated into the virtual classroom. During several assignments, you will be asked 
to choose the correct response within the given scenario. 
2. Knowledge Skills: Your skills will be measured via a unit quiz following each 
assigned section. This will include content from each learn unit and will be comprised 
of open-ended and multiple-choice questions. A 95% is required to move to the next 
unit. 
Attendance: To be marked as present, you must bring your laptop and headphones to class. 
0-2 absences: No penalty 
3-5 absences: Deduct 5 points from final average 
6-10 absences: Deduct 10 points from final average 
11+: Deduct a full letter grade from final average 
Fig. 13.1 An example course manual. Note A templated example of a course manual with assess-
ment criteria
Learning Module 1 
In this module, you will practice the xyz skills from Unit 1. You have completed the 
xyz chapter that has reviewed what x is, how to apply y, and incorporating feedback if 
z occurs. When applying y, it is important to remember to check in with the person you 
are interacting with, recruit feedback throughout the interaction, and provide feedback 
prior to ending the conversation. Prior to completing thelearning activity, watch this 
video (linked here) and take notes on the implementation of the feedback component of 
the interaction. 
Three types of feedback provided: 
1) ______________________ 
2) ______________________ 
3) ______________________ 
Learning Activity 
This is an example conversation to practice the xyz skills of feedback that you are 
learning. Your conversation partner, Lucy, has approached you about a concern she 
observed during recess. Following Lucy reading her script about her concerns from the 
playground at recess, provide feedback with specified follow up steps. 
Conversational Goals 
What aspects of the conversational components were included in this exchange? 
What types of feedback would you provide? 
Session Training Activities 
What would you recommend to Lucy as next steps? 
How will you follow up on these steps? 
How can you demonstrate empathy in responding to Lucy’s concerns? 
Fig. 13.2 Study guide example. Note A template of a course study guide with learning activities
Table 13.2 Criterion-based quiz 

Exercise: Provide feedback to a peer 
Skills to demonstrate:
• Note the peer’s concern
• Confirm you have identified the concern
• Provide feedback on the concern using one of the identified forms taught in lesson 1
• Determine follow-up schedule 
For each attempt, a score (satisfactory/unsatisfactory), comments, and the proctor’s initials are recorded. 

Grading: Scores must be listed as Satisfactory (S) to pass this skill-based quiz. If the attempt is 
unsatisfactory (US), you will be asked to practice it again after further review of Unit 1 content. 
Specific feedback will be provided in the comments section for both S and US performances 

Note An example of a criterion-based quiz targeting peer feedback 
would then review and return the assignment with feedback. This process would 
continue until the student submits an assignment that reaches mastery criterion. 
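The mastery loop just described (attempt, immediate feedback, restudy, reattempt) can be sketched in a few lines of Python. This is an illustration only: the 90% criterion follows the quiz example above, and all function names are hypothetical.

```python
# Illustrative sketch of a PSI mastery check, not part of any PSI curriculum.
# A unit quiz is scored, immediate item-by-item feedback is generated, and
# the student either advances or restudies and retakes the quiz.

MASTERY_CRITERION = 0.90  # e.g., 90% correct answers, per the example above

def score_quiz(answers, key):
    """Return the proportion of answers that match the answer key."""
    correct = sum(a == k for a, k in zip(answers, key))
    return correct / len(key)

def attempt_unit(answers, key):
    """Score one attempt; report mastery status and per-item feedback."""
    score = score_quiz(answers, key)
    mastered = score >= MASTERY_CRITERION
    # Feedback is immediate: correct responses are praised, incorrect
    # responses receive corrective feedback showing the expected answer.
    feedback = ["correct" if a == k else f"review: expected {k}"
                for a, k in zip(answers, key)]
    return mastered, score, feedback
```

A score of 4/5 (80%), for example, falls short of the criterion; the student would restudy and retake the quiz with no grade penalty, mirroring the "more opportunities to pass" framing above.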
Within PSI, the focus is on positive and corrective feedback. All feedback is 
helpful; it moves the student closer to the goal. Oftentimes in educational settings, 
students engage in behaviors to escape negative consequences. This may be in the 
form of completing homework to avoid getting a poor grade, participating in class 
to avoid being called on without warning, or attending school to avoid detention or 
suspension. In PSI, these negative consequences are minimized through the inclu-
sion of multiple opportunities to pass each unit, the ability to continue studying 
prior to retaking the unit test, and immediate corrective feedback which shows 
the student the correct answer and how the correct answer is determined. Through 
these components, the stigma of failure is removed; there are only more opportu-
nities to pass the unit. PSI also uses grades as incentives rather than as a ranking 
system. All students can, and must, reach the same grade, though the time it takes 
to get there will differ. The student must pass each unit prior to being presented 
with the next one. 
To incorporate this framework, the teachers may establish a specific number of 
units that must be completed within the quarter or semester to achieve a passing 
grade. For example, in a class that has 10 units per quarter, passing all 10 units 
may equal an A, 9 units a B, 7 units a C, 6 units a D, and 5 or under units as 
a non-passing grade. During the next quarter, the students may pick up where 
they left off or repeat previously presented material to ensure comprehension and 
mastery. Additional materials to supplement the content may be added to increase 
the students’ understanding of the material.
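The quarter-grading framework above can be captured in a short, hypothetical Python function. Note that the source example does not state a grade for eight passed units; this sketch assumes that 7–8 units both earn a C.

```python
# Hypothetical mapping from units passed at mastery to a quarter grade,
# following the 10-unit example above. Speed is irrelevant: the grade
# depends only on how many units the student has mastered.
def quarter_grade(units_passed):
    if units_passed >= 10:
        return "A"
    if units_passed >= 9:
        return "B"
    if units_passed >= 7:   # the source example leaves 8 units unstated;
        return "C"          # this sketch assumes 7-8 units both earn a C
    if units_passed >= 6:
        return "D"
    return "F"              # 5 or fewer units: non-passing
```

Two students finishing the same ten units at very different speeds earn the same grade; a student with fewer mastered units simply continues where they left off the next quarter.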
13.1.1.3 Self-paced Programming 
The self-pacing aspect of PSI refers to each student progressing at their own pace. 
Students move through the material at a speed that is proportionate to their abil-
ity and other competing demands at the time. These competing demands may 
include other required services (e.g., special instruction, speech/language, occu-
pational therapy, physical therapy, behavioral support), unscheduled absences, or 
extracurricular activities (dependent on the grade level and school structure). There 
are no penalties for requiring more or less time to learn; grades are based not on 
the time required for acquisition but on demonstration of unit mastery. Because of 
the self-paced aspect of PSI, PSI courses may be implemented within a tradi-
tional semester model (see above example), but it is best when the whole school 
is following a self-paced model (Akera, 2017). 
In a traditional general education classroom, the time to learn the concepts is 
fixed. For example, Book A is covered for one week, a quiz is given at the end of 
the week, and then the class moves onto Book B. Some students may have passed 
the test and others may not have, yet everyone moves on and through the class 
material at the same time. In contrast, a PSI classroom bases progress on quality 
or mastery of the content. Student 1 may master Book A in two days, student 2 
may master Book A in seven days, and student 3 may take 14 days to master the 
same material. Each student achieves the same quality (i.e., mastery of the topic 
measured by reaching mastery criterion on the Book A quiz) at their own pace. 
13.1.1.4 Use of Proctors (And the Teacher’s Role) 
Proctors within PSI are students who have previously mastered the material being 
taught. Think of a proctor as a peer-model. For younger grades, these proctors 
may be students from the middle school or high school that assist as a learning 
opportunity. In high school courses, proctors may be students in the same course 
that have worked more quickly through the material or who have previously com-
pleted the course. These proctors are available throughout the class; they are used 
to implement the unit tests, grade the assignment, and give a reward for the stu-
dent’s performance. In general, proctors are used to help the student learn. Keller 
(1968) believed proctors significantly enhanced the personal-social aspect of edu-
cation. Proctors were relatable to the students as they had recently been in the 
same situation. Proctors can also be internal or external. Internal proctors would 
be students who are currently enrolled in the same course but are further along in 
the units while external proctors are those that have previously passed the entire 
course (Akera, 2017). In the education setting, external proctors could consist of 
students in higher grade levels who assist as part of extracurricular teaching or 
peer-mentor programs. 
The teacher’s role in PSI is to prepare all the course materials (see Table 13.3). 
PSI requires much of the work for a course to be completed prior to the class 
beginning. All materials, including the study guides, course policy and manual, 
unit objectives, and readiness tests, must be prepared ahead of time which creates 
a front-heavy workload for the teacher. The teacher also supports and guides the 
proctors in their roles as necessary. Once the class is underway,
Table 13.3 The teacher’s role 

Benchmark: Teacher creates PSI course materials that are easily understandable and accessible to the students 
How to verify: 1. Virtual course is posted/in-person course is scheduled. 2. All necessary materials are linked and links are functional/materials are printed. 3. Assess student questions in regard to access/materials 

Benchmark: Teacher spends less than 5% of time in class in a supervisory role 
How to verify: 1. Observed/tracked time by teacher or proctor 

Benchmark: Teacher spends the majority of class time in individual feedback/attention delivery 
How to verify: 1. Observed/tracked time by teacher or proctor 

Benchmark: Benchmarks are achievable 
How to verify: 1. 80% of students complete the course on schedule. 2. Less than 5% of students require more time than allotted for the complete course. 3. Any benchmarks not achieved within the above standards are reevaluated and modified 

Benchmark: Teacher observes/verifies less than 25% of all mastery attempts 
How to verify: 1. Proctor completes 75% or more of mastery attempts, as noted in quiz/unit initials 

Benchmark: No more than 5% of class time is allotted to presentations/group instruction 
How to verify: 1. As tracked by teacher/proctor via timer 

Note Example of teacher requirements and roles within a PSI program 
the teacher monitors student progress, assesses the overall progress of the course, 
and provides individual feedback to specific students. The proctors take on most 
of the daily responsibilities of grading and feedback. 
13.1.1.5 Lectures as Motivation 
Viewing verbal lessons (or lectures) as optional rather than the building blocks 
of the instruction is contrary to the typical approach to education in public 
school systems. In PSI, lessons are used primarily for motivation and are 
optional even in that role. Written material is the primary source of information 
on learning new skills. Lectures were initially listed as a component of PSI as a 
form of motivation, but research completed since has not shown lectures to be a 
beneficial component of PSI in that lectures did not improve student performance 
(Brothen & Wambach, 2001; Johnson & Ruskin, 1977). The lessened emphasis and 
decreased use of lectures, even as a supplemental feature, highlight the data-driven 
feature of PSI. As a model, PSI is fluid and influenced by empirical research. 
If a lecture is used, it may be given at the end of a unit to summarize the infor-
mation, apply the lesson to real-world examples, and draw connections between 
previously mastered material and upcoming lessons. Lectures are aimed to stimu-
late interest in the upcoming material rather than as a vehicle to teach the targeted 
lesson. Let’s say the lesson was on a chapter in a novel. The supplementary lec-
ture, or lesson, may include watching a clip from a movie, playing a game related
Table 13.4 Tech-based home screen 

Menu buttons: Main menu, Game time, Skills check, Share, Break, Exit 
Modules: 
Unit 1 instruction / Unit 1 skills check / Unit 1 quiz 
Unit 2 instruction / Unit 2 skills check / Unit 2 quiz 
Unit 3 instruction / Unit 3 skills check / Unit 3 quiz 

Note A tech-based example of learning modules from a home 
screen of a virtual classroom. All buttons would activate corre-
sponding pages associated with the PSI course 
to the context of the novel, watching a current TV show that highlights parallels 
between the characters of the novel and of the show, or creating a social media 
account for the characters of the novel. Creative, interesting, and attention-getting 
videos and activities can be used to build on student interest in the current topic 
and upcoming lessons. 
13.1.1.6 Virtual Learning and PSI 
The increasing individualization and adaptability of technology have been directly 
applied to PSI programs. This has been further incorporated through the growth 
of virtual and distance learning. Education delivered virtually, either synchronously 
or asynchronously, requires more reliance on the written word (Grant & Spencer, 
2003). Preparation takes place prior to the classes being published or shared, align-
ing closely with the instructional methodology of PSI. Virtual learning using PSI 
also allows for flexibility which can accommodate learners with varying sched-
ules, who are more successful in learning on their own time, or who may have 
disabilities which impact their ability to participate in the traditional classroom 
setting. Students can complete their work at any physical location and can engage 
with interactive and instructional resources (Grant & Spencer, 2003). Table 13.4 
presents a visual example of a home screen display of learning modules within a 
virtual PSI course. Grayed out units are unavailable/inactive until the prior units 
are completed at mastery. Supplementary materials are presented in game time and 
students can choose to share their work with a proctor throughout. 
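The gating behavior described for this home screen (later units grayed out until prior units are mastered) could be implemented as below. The function and data shapes are assumptions for illustration, not the API of any actual application.

```python
# Illustrative unit-gating logic for a virtual PSI classroom: units must be
# taken in sequence, so only the first unmastered unit is newly unlocked;
# already-mastered units remain open for review.
def available_units(mastered, total_units):
    """Return unit numbers the student may open, given the set of
    unit numbers already passed at mastery."""
    units = []
    for unit in range(1, total_units + 1):
        units.append(unit)
        if unit not in mastered:
            break  # everything past the first unmastered unit stays locked
    return units
```

A student who has mastered Units 1 and 2, for instance, sees Units 1–3 active and Units 4 onward grayed out.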
13.2 Implementation 
13.2.1 Measuring Effectiveness 
Effectiveness of a PSI program is shown by student success in mastering the 
assigned units. Quizzes are scored as they would be in general education with 
the caveat of the quick turnaround in feedback. Proctors (or the teacher or teacher 
assistant) return the completed quizzes with supplemental materials as appropriate.
Students engage in supplemental lessons as these interest or motivate them. This 
allows students to focus on their progress while also diving deeper into content 
areas that they are most interested in. Quick response (QR) codes have also been 
integrated into PSI programs to enhance the technology component. QR codes can 
be linked to videos, instructional materials, PDFs, quizzes, and more (Baxter et al., 
2018). 
An early example of implementing PSI within a general elementary educa-
tion setting was completed by Werner and Bono (1977). This study evaluated PSI 
implementation across a second-grade general education classroom. The time and 
schedule allotted to each taught subject remained unchanged. The current text-
books and curriculums were reviewed and adapted to match the PSI structure. 
For example, rather than the current assignment of one word set per day with a 
quiz on Friday within spelling class, the PSI program allowed students to move 
through the four sets of words at their own pace followed by the unit quiz once 
the word sets were completed. The transition from the traditional structure to 
the PSI structure was gradual; students were informed about PSI and what the 
goals of the method were (Werner & Bono, 1977). The class practiced the social 
skills involved in PSI: praising good work, advocating for help, and speaking 
quietly. Proctors were initially the top performers in the class, but eventually 
any student who had completed a set of material to mastery could proctor that 
material. The mastery criterion also varied slightly across the 
PSI implementation. Originally set at 90%, it was later changed to 100% due 
to teacher feedback that these building block skills taught in second grade were 
vital to future success. A visual chart showed student progress across the targeted 
classes: math and spelling. A smiley face indicated completion of the unit. This 
chart was displayed at a shared table at which the teacher sat. Students could drop 
off their completed workbooks with the teacher at this shared space and ask the 
teacher questions, or the teacher could provide direction (Werner & Bono, 
1977). During implementation, it was initially assumed that the intrinsic motivation 
of receiving ‘A’s’ or successfully completing units would provide enough rein-
forcement. When this was observed to be untrue, a point system was added in which 
points were exchanged for toys and candy. This case study is a great example of 
implementing PSI within a general education classroom using the materials and 
resources available; though modifications were required throughout, these could 
be made within the constraints of the standard school system. Consistent and 
careful implementation and analysis are critical to making the adoption of a PSI 
system successful (Werner & Bono, 1977). 
13.2.2 Computer-Aided Personalized System of Instruction 
(CAPSI) 
CAPSI is a combination of PSI and computer-mediated communication imple-
mented in a web-based program (Pear & Novak, 1996). CAPSI is a web-based
Progress in Level 1 
4/10 completed 
Fig. 13.3 Progress example. Note A technology-based screenshot example of progress feedback 
within a PSI-based application 
system that can be applied to teaching any type of coursework (https://home.cc.umanitoba.ca/~capsi/webCAPSI.htm). CAPSI delivers unit tests, 
assigns student responses to peer reviewers, and tracks all student data (Pear & 
Crone-Todd, 1999). The teacher’s role within CAPSI is to design the course mate-
rials, including the study guide and unit tests, assign course readings, and be 
available during regularly scheduled office hours, such as after school or during
homeroom. In addition to being tech-based, CAPSI can decrease the amount of 
work required by proctors by assigning peers within the course as peer-reviewers 
(Pear et al., 2011). Feedback and unit completion can be displayed in a variety 
of technological manners (see Fig. 13.3 for one example). This can be dictated 
dependent on grade level and peers available. CAPSI can also be delivered in 
a hybrid model combining in-person and virtual learning (Svenningsen & Pear, 
2011) making it highly adaptable to a variety of school programs. 
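As a rough illustration of the routing step described above, the sketch below assigns a submitted unit test to peers who have already passed that unit. This is not CAPSI’s actual code or API; every function and variable name here is hypothetical.

```python
import random

# Hypothetical sketch of CAPSI-style peer-reviewer assignment: a submitted
# unit test is routed only to students who have already passed that unit.

def eligible_reviewers(progress, unit, submitter):
    """Students (other than the submitter) who have passed `unit`."""
    return [s for s, passed in progress.items()
            if unit in passed and s != submitter]

def assign_reviewers(progress, unit, submitter, n=2, rng=None):
    """Pick up to `n` reviewers at random from the eligible pool."""
    pool = eligible_reviewers(progress, unit, submitter)
    rng = rng or random.Random()
    return rng.sample(pool, k=min(n, len(pool)))

# Unit numbers each student has passed so far (illustrative data).
progress = {"ana": {1, 2}, "ben": {1}, "cal": {1, 2, 3}, "dee": set()}
reviewers = assign_reviewers(progress, unit=2, submitter="ben")
# Both eligible peers ("ana" and "cal") are assigned here; with an empty
# pool, the teacher would review the submission instead.
```

In practice the program would also cap how many reviews any one student performs and fall back to the teacher when no peer is eligible.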
Proponents of CAPSI highlight its efficient use of human and computer 
resources in course planning and implementation. Because the computer-based 
program delivers the materials and presents the quizzes, the number of students 
with whom one teacher can individually interact is increased, and once the course 
is designed, the teacher can focus on other aspects of teaching rather than being 
physically present in the classroom. The students also
benefit from this type of instruction by serving as peer reviewers (i.e., proctors; 
Pear & Crone-Todd, 1999). Seventy-one percent of students surveyed by Pear and 
Crone-Todd (1999) reported that proctoring the course enhanced their understand-
ing of the course content. In addition to further learning, a positive correlation 
between performance and motivation was found when CAPSI was compared to an 
alternative teaching method (Pedreira & Pear, 2015). CAPSI could enable schools 
to offer a larger variety of classes because the class schedule is not dependent 
on a teacher’s schedule. It can enable more students to take the same classes 
without requiring additional resources. CAPSI also reduces scheduling conflicts 
since it enables students to participate in the course at their convenience (Pear & 
Crone-Todd, 1999).
156 J. Quigley
13.2.3 PSI and Physical Education 
Participation in physical education has been declining in secondary education 
according to the CDC (2011). Health-related fitness directly correlates with par-
ticipation in physical education and has historically been targeted using a range of 
instructional strategies. PSI has been effective in teaching physical activity in that 
it enables more practice opportunities and decreases teacher instruction time (Pre-
witt et al., 2015). Following the core components of PSI, when teaching a skill 
such as volleyball, the curriculum may be broken down into modules targeting 
passing, hitting, blocking, and serving. Each module would then be broken down 
into a “developmentally appropriate progression” (Colquitt et al., 2011, p. 47) that 
students progress through at their own pace. Within each module, students would 
demonstrate the skill as a component of the mastery criterion prior to moving 
on to the next targeted module or skill. Using the volleyball example, students 
would demonstrate mastery of setting the ball before learning to spike and then, 
upon mastery, learn to pass the ball, and so on.
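The gating rule in this progression can be made concrete with a short sketch. The module order follows the volleyball example above; the function names are illustrative rather than drawn from the cited curricula.

```python
# Self-paced mastery gating: a student may attempt a module only after
# mastering every module before it in the fixed progression.

MODULES = ["setting", "spiking", "passing", "serving"]

def next_module(mastered):
    """First unmastered module in the progression, or None when done."""
    for module in MODULES:
        if module not in mastered:
            return module
    return None

def may_attempt(module, mastered):
    """True if every module earlier in the progression is mastered."""
    earlier = MODULES[:MODULES.index(module)]
    return all(m in mastered for m in earlier)

print(next_module({"setting"}))             # spiking
print(may_attempt("serving", {"setting"}))  # False
```

Because `next_module` depends only on the individual student’s mastered set, each student can be at a different point in the sequence at any given time, which is the self-pacing PSI requires.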
In one research-based example, PSI was compared to Direct Instruction (see 
Chap. 10) in teaching secondary physical education in a general education pro-
gram. The PSI program consisted of 16 modules completed over six weeks.
Students completed a knowledge assessment before, during, and after the course to 
determine the effect of the coursework. During the PSI program, students worked 
their way through the material at their own pace and in their chosen order. They 
tracked their progress via trials of ten. At the end of the study, students in the PSI 
group increased their knowledge more than the control Direct Instruction group in 
the same amount of time (Prewitt, 2014). 
13.3 Application 
Student success within PSI is primarily predicted by the student’s grade point 
average prior to enrollment (Pear & Crone-Todd, 1999) and the student’s rate of 
progress at the beginning of the PSI course (Henneberry, 1976). Experience with 
PSI is not a relevant factor in success within a PSI course. Henneberry (1976) 
demonstrated that college students classified as slow starters, that is, students who 
had not completed the fourth unit by the fourth week of class, had significantly 
lower grades and gains at the end of the semester than fast starters.
Allen (1984) assessed the impact of PSI on the writing abilities of undergrad-
uate students. Following completion of the course, ten of the eleven students who 
participated in the undergraduate writing course rated the course as significantly 
improving their writing. Specifically, the students highlighted the breakdown of 
writing projects into smaller segments to receive more frequent feedback and 
the ability to choose to pass or retake after each assignment rather than a tradi-
tional grade being assigned (Allen, 1984). Limitations of the instructional program 
included high turnover prior to the start of the course; students chose to take other 
classes because they were worried about their poor writing skills and wanted to 
avoid demonstrating them so late in their undergraduate careers. The course also 
required a large workload from both students and teacher: eleven students produced 
eighty-one papers, seventy of which were deemed successful, and each paper took 
between 25 and 40 min to edit (Allen, 1984). Though this is an
example from the college level, the application of PSI within high school class-
rooms could directly address these limitations. By requiring specific core courses 
as school programs do, all students would receive and master the same skills 
within these courses due to the setup of PSI. Students would not be passing classes 
without sufficiently demonstrating the targeted skills such as writing, reading, or 
math. 
One barrier currently facing general education teachers is larger class sizes 
with less preparation time. PSI takes the onus off the teacher and shares the
responsibility with the students. The students’ responsibilities within a PSI course 
are the following: (1) start class, (2) bring equipment/materials, (3) use and return 
equipment, (4) take attendance, (5) present the tasks, (6) structure the tasks, (7) 
complete the assessment, and (8) monitor their progress (Metzler, 2005). Once 
the course is set up, teachers can then focus on observing and assessing mastery 
attempts and monitoring overall learning (Colquitt et al., 2011). Students would 
monitor their own progress through the application or software being used. The
requirements of each unit are built in. Students receive feedback on their progress, 
can only move forward when units are mastered, and access only eligible material 
based on their status in the course. Final grades at the end would be determined 
by students completing a specified number of the modules to mastery (e.g., 14 out 
of 17 modules by the end of the term). 
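This mastery-count grading rule is simple enough to sketch directly. Only the 14-of-17 example comes from the text; the grade boundaries below are hypothetical.

```python
# Final grades derive from how many modules a student masters, not from
# averaged test scores. The thresholds below are illustrative.

GRADE_SCALE = {"A": 14, "B": 12, "C": 10}  # minimum modules (of 17) per grade

def final_grade(modules_mastered, scale=GRADE_SCALE):
    """Return the highest letter grade whose threshold is met."""
    for grade, minimum in sorted(scale.items(), key=lambda kv: -kv[1]):
        if modules_mastered >= minimum:
            return grade
    return "Incomplete"

print(final_grade(15))  # A
print(final_grade(11))  # C
print(final_grade(9))   # Incomplete
```

Note that every module counted here was completed to the mastery criterion, so a lower letter grade reflects fewer units finished, not weaker performance on the units that were.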
Potential drawbacks of PSI include fast learners receiving less instruction, stu-
dent progress being reliant on intrinsic motivation, and increased procrastination 
(Colquitt et al., 2011; Hannon et al., 2008; Prewitt et al., 2015). When teachers 
are aware of these limitations, they can build in modifications such as
attending to advanced learners early on in programming, including extra credit for 
completion of additional materials, and setting target dates for completion of spec-
ified units. PSI may also not be the most appropriate teaching system for classes 
whose content relies on group interactions or in-person experiences (Fox, 2004). 
13.4 Summary 
PSI is a student-driven, active-learning instructional program that emphasizes five
key components. These components include emphasis on the written word, unit 
quizzes with mastery criterion, self-pacing, students as proctors, and lectures as 
motivation. With the advances in technology, PSI can now be more easily incorpo-
rated into in-person and virtual learning environments by establishing well written 
objectives with criteria assigned to each unit. PSI allows the teacher to focus 
on feedback and supplemental materials while students can all achieve the same 
success at their own pace. PSI has already been successfully incorporated into 
general education classrooms, as well as into online learning platforms, and the
opportunities for further individualization continue.
Appendix 
See Table 13.5.
Table 13.5 Examples of PSI 
Type of program Examples 
Distance education/learning management CAPSI http://www.webcapsi.com 
ALEKS® http://www.aleks.com 
WebCT® 
Blackboard® 
Canvas 
Class Dojo 
Schoology 
Mastering Physics 
Fitness programs Teach physical wellness within physical education 
(Colquitt et al., 2011; Prewitt et al., 2015) 
Teach tennis skills (Metzler, 1984, 1986) 
Customized learning Cloudmath 
Data-driven programs PracTutor 
Amazon’s TenMarks 
McGraw-Hill Thrive 
Rosetta Stone’s Lexia 
Tutoring System (Paiva et al., 2017) 
Adaptive learning Smart Sparrow 
Knewton 
Mobile learning Computer assisted language learning 
Mobile assisted language learning 
Juju English Vocabulary Learning App 
Note This table displays examples of programs across the type of instruction within personalized 
systems of instruction and is not meant to be all encompassing (Arumugam & Md Noor, 2021; 
Bulger, 2016; Colquitt et al., 2011; Metzler, 1984; Metzler, 1986)
References
Akera, A. (2017). Bringing radical behaviorism to revolutionary Brazil and back: Fred Keller’s 
personalized system of instruction and cold war engineering education. Journal of Historical 
Behavioral Science, 53(4), 364–382. https://doi.org/10.1002/jhbs.21871 
Allen, G. J. (1984). Using a personalized system of instruction to improve the writing skills of 
undergraduates. Teaching of Psychology, 11(2), 95–98. https://doi.org/10.1207/s15328023top 
1102_10 
Arumugam, R., & Md Noor, N. (2021). Mobile apps based on Keller personalized system of 
instruction to promote english vocabulary acquisition. International Journal of Interactive 
Mobile Technologies, 15(23), 4–17. https://doi.org/10.3991/ijim.v15i23.27227 
Baxter, McEntyre, K., & Woodruff, E. A. (2018). Using QR codes to enhance the personalized 
system of instruction. Strategies, 31(1), 45–47. https://doi.org/10.1080/08924562.2018.1395666 
Brothen, T., & Wambach, C. (2001). Effective student use of computerized quizzes. Teaching of 
Psychology, 28(4), 292–294. https://doi.org/10.1207/S15328023TOP2804_10 
Bulger, M. (2016). Personalized learning: The conversation we’re not having. Data & Society, 
22(1), 1–29. 
Centers for Disease Control and Prevention. (2011). Physical activity levels of high school stu-
dents. Morbidity and Mortality Weekly Report, 60, 773–777. 
Colquitt, G., Pritchard, T., & McCollum, S. (2011). The personalized system of instruction in fit-
ness education. Journal of Physical Education, Recreation & Dance, 82(6), 46–54. https://doi. 
org/10.1080/07303084.2011.10598645 
Eyre, J. L. (2007). Keller’s personalized system of instruction: Was it a fleeting fancy or is there 
a revival on the horizon? The Behavior Analyst Today, 8(3), 317–324. https://doi.org/10.1037/ 
h0100623 
Fox, E. J. (2004). The personalized system of instruction: A flexible and effective approach to 
mastery learning. In D. J. Moran & R. W. Malott (Eds.), Evidence-based educational methods 
(pp. 201–221). Academic Press. https://doi.org/10.1016/B978-012506041-7/50013-9 
Grant, L. K., & Spencer, R. E. (2003). The personalized system of instruction: Review and appli-
cations to distance education. The International Review of Research in Open and Distributed 
Learning, 4(2). https://doi.org/10.19173/irrodl.v4i2.152 
Hammerschmidt-Snidarich, S. M., Edwards, L. M., Christ, T. J., & Thayer, A. J. (2019). Lever-
aging technology: A multi-component personalized system of instruction to teach sight words. 
Journal of School Psychology, 72, 150–171. https://doi.org/10.1016/j.jsp.2018.12.005 
Hannon, J., Holt, B., & Hatten, J. (2008). Personalized systems of instruction model: Teaching 
health-related fitness content in high school physical education. Journal of Curriculum & 
Instruction (Greenville, N.C.), 2(2), 20–33. https://doi.org/10.3776/joci.2008.v2n2p20-33 
Henneberry, J. K. (1976). Initial progress rates as related to performance in a personalized system 
of instruction. Teaching of Psychology, 3(4), 178–181. https://doi.org/10.1207/s15328023top 
0304_6 
Johnson, K. (2008). Personalized system of instruction. Encyclopedia of Educational Psychology, 
2, 786–790. 
Johnson, K. R., & Ruskin, R. S. (1977). Behavioral instruction: An evaluative review (pp. xv–182). 
American Psychological Association. 
Keller, F. S. (1968). “Good-bye, teacher . . .” Journal of Applied Behavior Analysis, 1(1), 79–89. 
https://doi.org/10.1901/jaba.1968.1-79 
Metzler, M. (2005). Instructional models for physical education (2nd ed.). Holcomb Hathaway. 
Metzler, M. (1984). Analysis of a mastery learning/personalized system of instruction for teaching 
tennis. In M. Pieron & G. Graham (Eds.), Sport pedagogy: The 1984 Olympic scientific congress 
proceedings (Vol. 6, pp. 63–70). Human Kinetics.
Metzler, M. (1986, April). Teaching tennis by the Keller method: A comparison between “traditional” 
and PSI-based instruction. Paper presented at the annual meeting of the American Educational 
Research Association, San Francisco, CA.
Paiva, R. C., Ferreira, M. S., & Frade, M. M. (2017). Intelligent tutorial system based on personalized 
system of instruction to teach or remind mathematical concepts. Journal of Computer Assisted Learning, 33(4), 
370–381. https://doi.org/10.1111/jcal.12186 
Pear, J. J., Schnerch, G. J., Silva, K. M., Svenningsen, L., & Lambert, J. (2011). Web-based 
computer-aided personalized system of instruction. New Directions for Teaching and Learning, 
2011(28), 85–93. https://doi.org/10.1002/tl.471 
Pear, J. J., & Novak, M. (1996). Computer-aided personalized system of instruction: A program 
evaluation. Teaching of Psychology, 23(2), 119–123. https://doi.org/10.1207/s15328023top230 
2_14 
Pear, J. J., & Crone-Todd, D. E. (1999). Personalized system of instruction in cyberspace. Journal 
of Applied Behavior Analysis, 32(2), 205–209. https://doi.org/10.1901/jaba.1999.32-205 
Pedreira, K., & Pear, J. (2015). Motivation and attitude in a computer-aided personalized system 
of instruction course on discrete-trials teaching. Journal on Developmental Disabilities, 21(1), 
45–51. 
Prewitt. (2014). The personalized system of instruction: Fidelity and effect on health-related fitness 
knowledge and in-class physical activity. Dissertation Abstracts International (Vol. 76, Issue 
3, Suppl. A,
