Randomized controlled trial of seminar-based training on accurate and general implementation of practical functional assessments

Cory J. Whelan
Western New England University and May Institute and The Autism Community Therapists

Gregory P. Hanley
Western New England University

Robin Landa and Emily Sullivan
Western New England University and May Institute

Kara LaCroix
Western New England University and The Autism Community Therapists

Rachel Metras
Western New England University

General and long-term outcomes of functional analysis training have not yet been reported. Within a randomized controlled trial, we trained 18 behavior analytic practitioners to interview caregivers, then design and conduct a personalized analysis as part of a practical functional assessment (PFA). Participants were randomly assigned to groups, and those who experienced the seminar prior to conducting a PFA with a confederate demonstrated more component skills than those who were provided the same materials but did not experience the seminar (mean scores: 87% and 36%, respectively). Participants who experienced the seminar considered the training valuable and reported greater confidence in their ability to achieve control in an analysis. Several participants then conducted a PFA with a client who engaged in severe problem behavior (SPB). Results showed that skills transferred to these authentic applications. Results suggest that a seminar-based training can increase practitioners' ability to functionally analyze problem behavior and leads to subsequent analytic activity.

Key words: functional analysis, IISCA, problem behavior, RCT, staff training

Children and adolescents who engage in severe problem behavior (SPB) often cause disruption to the classroom environment and pose safety risks to themselves, other students, and staff. When confronted with SPB, it is a behavior analyst's ethical responsibility to conduct a functional behavior assessment (FBA) prior to implementing intervention (Professional and Ethical Compliance Code 3.01; Behavior Analyst Certification Board, 2016). In addition, when a student's problem behavior causes significant disruption to his or her access to the educational environment, the Individuals with Disabilities Education Act (IDEA) requires an FBA in order to design effective interventions (IDEA, 2004). A variety of FBA methods exists, and each provides practitioners with various levels of confidence in their identification of the variables that evoke and maintain problem behavior.

This study was conducted in partial fulfillment of a Ph.D. in Behavior Analysis from Western New England University by the first author. All authors trained individuals and groups of practitioners to conduct practical functional assessments at their affiliated places of employment through live and distance-based consulting. We thank Drs. Chata Dickson, Eileen Roscoe, and Rachel Thompson for their feedback on earlier versions of this manuscript.

Address correspondence to: Cory J. Whelan, The Autism Community Therapists, 20 Main St., Suite G, Acton, MA 01720. Email: [email protected]
doi: 10.1002/jaba.845

Journal of Applied Behavior Analysis 2021, 9999, 1–19
© 2021 Society for the Experimental Analysis of Behavior (SEAB).

Functional behavior assessments exist on a continuum of scientific rigor, which includes indirect assessments such as interviews and record reviews; descriptive assessments such as observations of the target behavior in the context in which it typically occurs; and functional analyses (FAs), during which the relevant establishing operations (EOs) and consequences suspected to be influencing the target behavior are manipulated (Kratochwill & Shapiro, 2000). Given that FA is the only method that experimentally manipulates variables suspected to influence behavior (see Beavers et al., 2013; Hanley et al., 2003; Hanley, 2012, for reviews), it is important for behavior analysts to conduct one when assessing SPB. However, practitioners report an almost exclusive reliance on indirect and descriptive assessments when conducting FBAs of SPB in school and residential settings (Ellingson et al., 1999; Oliver et al., 2015; Roscoe et al., 2015). Ellingson et al. (1999) found that, despite the majority of respondents reporting that FAs are the most useful tool for identifying the relevant variables required for effective treatment, behavioral interviews were the most commonly reported FBA method. Roscoe et al. (2015) surveyed 205 behavior analysts and found that the majority of respondents (68%) considered FAs the most informative type of FBA, but that only 35% of respondents reported conducting FAs when assessing problem behavior. Oliver et al.'s (2015) respondents reported using indirect and descriptive assessments most often and FAs most infrequently.

The survey studies also asked behavior analysts why they relied more heavily on indirect and descriptive measures rather than FAs. Respondents reported lack of time or suitable space/materials to conduct an analysis as reasons preventing them from conducting FAs with students who engaged in SPB (Oliver et al., 2015; Roscoe et al., 2015). Roscoe et al.

(2015) also reported that concerns of social unacceptability influenced respondents' use of FAs. Additionally, some respondents reported a lack of training as a barrier.

Although training was cited as a barrier to conducting FAs, multiple studies have evaluated models for training people of varying employment and educational backgrounds to conduct FAs. Several studies used variations of behavioral skills training (BST) to teach participants to implement conditions commonly associated with those of a traditional FA (e.g., attention, play, demand, tangible; Iwata et al., 1982/1994) under simulated conditions (Alnemary et al., 2015; Chok et al., 2012; Iwata et al., 2000; Lambert, Bloom, et al., 2014; Moore et al., 2002; Moore & Fisher, 2007; Phillips & Mudford, 2008; Rispoli et al., 2016; Wallace et al., 2004; Ward-Horner & Sturmey, 2012). In general, these studies incorporated reading material, lecture, video models, role-plays, feedback, and written quizzes as components within a training package, and participants demonstrated improved performance when implementing FA conditions with a confederate client. Erbas et al. (2006) also used BST to teach participants how to implement traditional FA conditions; however, they measured participants' performance with actual clients engaging in problem behavior prior to assessing their skills with confederates.

Several studies (Alnemary et al., 2017; Flynn & Lo, 2016; Griffith et al., 2019; Kunnavatana et al., 2013; Lambert, Lloyd, et al., 2014; Rispoli et al., 2015; Rispoli et al., 2016) have described training packages aimed at teaching people to conduct trial-based FAs (TBFAs; Bloom et al., 2011; Sigafoos & Saggers, 1995). Similar to the traditional FA training literature, these authors used components of BST to train participants to implement a TBFA with confederate clients. Lambert et al. (2013) described training practitioners to conduct TBFAs with actual clients, forgoing the typical approach of training under low-stakes conditions with confederate clients.


Some noteworthy contributions exist in the FA training literature. For example, some studies described how the training with confederate clients was extended to FA conditions with actual clients (Kunnavatana et al., 2013; Moore et al., 2002; Moore & Fisher, 2007; Rispoli et al., 2015; Wallace et al., 2004). Chok et al. (2012) trained participants to interpret FA graphs, respond to undifferentiated data, and select interventions that were appropriate given the identified function, demonstrating that behavior analysts can be trained to complete several aspects of an FA beyond implementing its conditions.

Despite the contributions described above, there remain limitations within the FA training literature. For instance, apart from Flynn and Lo (2016), no published study reports the analysis data from confederate or authentic (i.e., with a real client) experiences. Therefore, the extent to which participants were able to achieve functional control in their analyses is unknown. In addition, even though several studies demonstrated that participants' skills transferred from confederate to authentic experiences, they failed to demonstrate that their participants could design and conduct an FA independent of researcher support. That is, during participants' experiences, researchers either provided instructions regarding which conditions to implement or provided feedback during and/or between brief sessions. Given that Oliver et al. (2015) and Roscoe et al. (2015) discovered limited use of FA in practice, practitioners' ability to independently design and conduct analyses that yield differentiated outcomes is relevant. It may be the case that the surveys continue to reveal reliance on indirect and descriptive assessment methods because practitioners have not been trained to independently conduct FAs that produce meaningful results.

In addition, fewer than half of the training studies reported on the social validity of their procedures or results. Furthermore, participants in Rispoli et al. (2015) commented that, although they considered the TBFA to be an acceptable form of behavioral assessment, they had concerns regarding the length of time required to conduct such an assessment. If participants credited the training packages for providing them with practical tools, they may be more likely to implement FAs when assessing problem behavior in the future. In the same regard, no previous FA training study reported on the extent to which participants use FA following their participation in the study.

Collectively, these training studies show that people of varying levels of experience can be trained to implement the conditions of a traditional FA or TBFA either with confederate clients or with actual clients given live and direct support from an experimenter. The BST methodology described in the literature provides a useful framework for teaching practitioners a variety of skills with respect to FA; however, conducting an FA as part of a functional assessment is more complex than solely implementing conditions with integrity. Practitioners are required to gather relevant information, design conditions based on personalized EOs and reinforcers, and adjust the conditions based on the client's behavior, all while attempting to safely achieve functional control over problem behavior. Furthermore, BST can be time-intensive, particularly when implemented in a one-on-one training arrangement, which limits its scalability.

In this study, we describe a seminar-based approach for imparting capacity to practitioners to conduct practical functional assessment (PFA). Seminar-based approaches often involve teaching a set of skills or content to more than one person at a time and can be done using a variety of teaching methods such as lecture, video modeling, active responding among trainees, and role plays. Seminars can address the above shortcomings of individual BST by allowing the teacher to impart capacity to several trainees at once, and they provide trainees the opportunity to learn from observing their peers practice new skills or receive feedback. In


addition, they are common when training large groups of practitioners. For instance, at the Association for Behavior Analysis International 45th Annual Convention in 2019, seventeen 7-hr and fifty-eight 3-hr instructional workshops (n = 75) were scheduled, resulting in a total of 293 continuing education units (CEUs) available for BCBAs (Association for Behavior Analysis International, 2020). At the 40th Annual Conference of the Berkshire Association for Behavior Analysis and Therapy in 2019, eighteen 3-hr and three 1.5-hr workshops (n = 21) were scheduled, resulting in a total of 63 available CEUs (Berkshire Association for Behavior Analysis and Therapy, 2020).

The PFA process includes an interview-informed synthesized contingency analysis (IISCA; Hanley et al., 2014), an FA in which multiple suspected reinforcers and their respective EOs are synthesized in a single test condition while the same reinforcers are simultaneously and continuously available in an otherwise matched control condition. This approach to functional assessment includes an open-ended interview with caregivers to identify individualized contingencies of reinforcement suspected to be maintaining problem behavior.

We elected to train practitioners on the PFA

process because practitioners responsible for behavioral programming should be skilled in all components of FA, including information gathering, analysis design, and analysis implementation. We decided to train capacity with IISCAs primarily because of their reliable social validation (Beaulieu et al., 2018; Hanley et al., 2014; Jessel et al., 2018; Santiago et al., 2016; Strand & Eldevik, 2017; Taylor et al., 2018) and because of their demonstrated treatment utility (Beaulieu et al., 2018; Chusid Rose & Beaulieu, 2019; Hanley et al., 2014; Herman et al., 2018; Jessel et al., 2018; Santiago et al., 2016; Slaton et al., 2017; Strand & Eldevik, 2017; Taylor et al., 2018).

It is possible that FAs are not being conducted either because they are not considered socially acceptable by BCBAs or their colleagues, because their conduct has not yielded socially meaningful outcomes for practicing BCBAs, or both. The purpose of the current project was to evaluate a model for training BCBAs, BCBA supervisees, ABA graduate students, and classroom staff to conduct functional assessments associated with strong social acceptability and treatment utility. PFAs may be more difficult to implement with integrity than traditional functional assessments and FAs because each IISCA is individualized from an open-ended interview. The integrity with which PFAs were implemented is evaluated and described in this study, as is the probability of a differentiated result. We considered the likelihood of our participants conducting FAs that yielded differentiated outcomes important, given the discrepancy between the FA training literature and the functional assessment survey studies. We report social validity data regarding the acceptability of the FA process from the practitioners who implemented the FAs in this study, as well as data from FAs conducted following the completion of this study with students who engaged in SPB. Finally, as an additional measure of social validity, we gathered reports regarding the extent to which analytic activity continued following the study.

Method

Participants

Eighteen staff from one specialized school for students with autism and other developmental disabilities who engage in problem behavior participated in this study. The school was selected as the research site because it was the first author's place of employment. Researchers were behavior analysis doctoral students and their advisor; none were compensated for their time. A total of 38 staff were nominated for participation by their clinical supervisors. Participation was voluntary, and 18 staff (47% of those nominated) agreed to participate. They ranged in


experience with respect to designing and conducting FAs, their credentials regarding board certification in behavior analysis, and employment duration (see Table 1 for participant details). Each participant signed an informed consent form that outlined what they should expect to experience. Procedures were approved by a university institutional review board (IRB) as well as an internal research committee at the research site.

Prior to random assignment to the waitlist control or experimental group, participants were matched based on three criteria: their experience designing and conducting FAs (e.g., traditional, trial-based, IISCA); their credential (BCBA; BCBA candidate waiting to take the exam; ABA graduate student; or none); and their duration as an employee providing ABA services to children, teenagers, or adults with disabilities. We assigned participants points for each matching criterion. Scoring was as follows: for FA experience, two or more analyses = 2, one = 1, zero = 0; for BCBA status, BCBA = 2, candidate or graduate student = 1, none = 0; for employment duration, 1+ years = 1, 0-1 years = 0. FA experience and BCBA status were variables that we considered more likely to influence performance positively. Therefore, we considered them primary matching criteria and weighted them more heavily than employment duration in our matching procedure. We ranked participants according to this combination of matching factors, and each ranked dyad (e.g., 1 & 2, 3 & 4, 5 & 6…) was randomly assigned to groups (Table 1) using a randomizer application found at www.random.org. Going forward, we refer to participants based on their group and matched pair (e.g., W1, E1, W2, E2, and so on).
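The matching and randomization procedure above can be sketched in code. This is a minimal illustration, not the authors' software: the roster, the `matching_score` and `assign_groups` names, and the use of Python's `random` module in place of the www.random.org application are all our assumptions.

```python
import random

def matching_score(fas_conducted, bcba_status, years_employed):
    """Composite matching score: FA experience (2+ analyses = 2, one = 1,
    zero = 0) + BCBA status (BCBA = 2, candidate/grad student = 1,
    none = 0) + employment duration (1+ years = 1, under 1 year = 0)."""
    fa_pts = 2 if fas_conducted >= 2 else (1 if fas_conducted == 1 else 0)
    status_pts = {"BCBA": 2, "Candidate": 1, "Grad student": 1, "None": 0}[bcba_status]
    duration_pts = 1 if years_employed >= 1 else 0
    return fa_pts + status_pts + duration_pts

def assign_groups(participants, seed=None):
    """Rank participants by matching score, pair adjacent ranks into dyads
    (1 & 2, 3 & 4, ...), then randomly assign one member of each dyad to
    the waitlist group and the other to the experimental group."""
    rng = random.Random(seed)
    ranked = sorted(participants, key=lambda p: matching_score(*p[1:]), reverse=True)
    waitlist, experimental = [], []
    for i in range(0, len(ranked), 2):
        dyad = ranked[i:i + 2]
        rng.shuffle(dyad)  # random within-dyad group assignment
        waitlist.append(dyad[0])
        if len(dyad) > 1:
            experimental.append(dyad[1])
    return waitlist, experimental

# Hypothetical roster: (name, FAs designed/conducted, BCBA status, years employed)
roster = [("P1", 4, "BCBA", 2), ("P2", 2, "BCBA", 3),
          ("P3", 2, "Candidate", 1), ("P4", 0, "None", 0.5)]
wait, expt = assign_groups(roster, seed=1)
```

Because assignment is randomized within ranked dyads, the two highest-scoring participants always land in different groups, mirroring the ranked-pair structure shown in Table 1.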

Measurement

The PFA process was deconstructed into 22 component skills (see y-axis of Figure 2), and trained observers recorded data on each PFA component skill. The interviews and analyses were video recorded, and participant performance was evaluated post hoc using pencil-and-paper data collection. To evaluate participants' ability to design an analysis following an interview, each participant's analysis design was compared to an analysis design constructed by an expert (a behavior analysis doctoral student with extensive experience designing, conducting, and interpreting FAs). The expert conducted an open-ended interview with the experimenter acting as a caregiver, and then

Table 1
Participant Characteristics

| Pair | W Score | W FAs | W BCBA Status | W Duration | E Score | E FAs | E BCBA Status | E Duration |
|------|---------|-------|---------------|------------|---------|-------|---------------|------------|
| 1    | 5       | 4     | BCBA          | 1+         | 5       | 2     | BCBA          | 1+         |
| 2    | 4       | 2     | Candidate     | 1+         | 4       | 2     | Candidate     | 1+         |
| 3    | 3       | 2     | Grad student  | 0-1        | 4*      | 2     | Grad student  | 1+         |
| 4    | 1       | 0     | Grad student  | 0-1        | 1       | 0     | Grad student  | 0-1        |
| 5    | 1       | 0     | Grad student  | 0-1        | 2       | 0     | Grad student  | 1+         |
| 6    | 2*      | 0     | Grad student  | 1+         | 2       | 0     | Grad student  | 1+         |
| 7    | 2       | 0     | Grad student  | 1+         | 1*      | 0     | Grad student  | 0-1        |
| 8    | 0       | 0     | None          | 0-1        | 1       | 0     | None          | 1+         |
| 9    | 0*      | 0     | None          | 0-1        | 1       | 0     | None          | 1+         |

Note: W = waitlist group; E = experimental group; Score = matching score; FAs = FAs designed and/or conducted; Duration = employment duration (years). Participants are displayed in order of ranked pairs. *Conducted authentic PFA.


designed an analysis based on that interview. The resulting analysis design was used as a model against which to compare participants' designs. Participants' analysis designs were rated for generic reliability with the expert. In other words, if the participant identified major categories of reinforcement, such as escape to tangibles or escape to mand compliance, the participant received full credit for that design. Specific reliability, such as escape from a particular task to a particular item or activity, was not required in order to receive full credit for the design.
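One plausible formalization of this category-level scoring rule is a set comparison over the major reinforcement categories each design names; the function name and category labels below are invented for illustration and are not from the study.

```python
def generic_agreement(participant_categories, expert_categories):
    """Full credit when the participant's design names the same major
    reinforcement categories as the expert's model design; item- or
    task-level specifics are deliberately ignored."""
    p = {c.strip().lower() for c in participant_categories}
    e = {c.strip().lower() for c in expert_categories}
    return p == e

# Hypothetical expert design: two synthesized reinforcement categories
expert = ["Escape to tangibles", "Escape to mand compliance"]
# A design naming the same categories earns full credit, even if the
# specific tasks or items differ from the expert's.
participant = ["escape to mand compliance", "escape to tangibles"]
full_credit = generic_agreement(participant, expert)
```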

Data Collection

Observers blind to the matching and random assignment of participants scored the interview and analysis videos. The first author trained data collectors on operational definitions of the component skills. Observers recorded data on each component skill as each opportunity occurred. For example, an opportunity to reinforce the first instance of problem behavior occurred if the confederate or client engaged in problem behavior. An opportunity to progressively initiate establishing operations (EOs) occurred if the confederate or client did not engage in problem behavior upon the presentation of the initial EO. Data collectors recorded whether each participant emitted the target component skill during each opportunity to do so. For each component that could occur more than once during the interview or analysis, such as reinforcing problem behavior during a test condition for 20-40 s or providing salient transitions between establishing operation (EO) and reinforcement (SR) intervals, participants received a percentage-of-occurrence score. For component skills with binary measures (e.g., begins analysis with control condition), participants were given full credit or no credit depending on their performance. Component skills that were not occasioned during the PFA process (e.g., ignoring problem behavior in the control condition could only be measured if problem behavior occurred in the control condition) were omitted from the total PFA percentage correct score.

Using the percentage-of-occurrence scores recorded by the data collectors, the first author assigned either full, partial, or no credit to each participant's component skills. For skills demonstrated in 80% or more of opportunities, participants were given full credit for that component; for skills demonstrated in 30% to 79% of opportunities, participants were given partial credit; for skills demonstrated in 0% to 29% of opportunities, participants were given no credit. The first author then calculated a total PFA percentage correct score by assigning a numerical value to full, partial, and no-credit component scores (see Himle et al., 2004). Scoring was as follows: fully demonstrated skills = 1; partially demonstrated skills = 0.5; skills not demonstrated at all = 0. The total PFA percent correct score was calculated by summing the component scores and dividing by the number of PFA component skills (with the exception of any skills not expected to occur).
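The credit-assignment arithmetic above can be sketched as follows; the component percentages are made-up values, and the function names are ours, not the study's.

```python
def component_credit(pct_of_opportunities):
    """Map a percentage-of-occurrence score to credit using the cutoffs
    described above: 80%+ = full (1), 30-79% = partial (0.5),
    0-29% = none (0)."""
    if pct_of_opportunities >= 80:
        return 1.0
    if pct_of_opportunities >= 30:
        return 0.5
    return 0.0

def total_pfa_score(component_pcts):
    """Total PFA percent correct: summed component credits divided by the
    number of scoreable components. None marks a skill that was never
    occasioned and is omitted from the denominator."""
    scored = [component_credit(p) for p in component_pcts if p is not None]
    return 100 * sum(scored) / len(scored)

# Hypothetical percentage-of-occurrence scores for five component skills
pcts = [95, 60, 10, None, 80]
score = total_pfa_score(pcts)  # credits (1 + 0.5 + 0 + 1) / 4 components = 62.5
```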

Interobserver Agreement

Interobserver agreement (IOA) for interview and analysis component scores for confederate PFAs was assessed by having a second observer collect data on PFA component skills for two participants from the waitlist group, two participants from the experimental group, and one authentic PFA (22% of confederate PFAs; 25% of authentic PFAs). The second observer recorded whether each participant emitted the target component skill during each opportunity to do so, just as the primary data collector did. Agreement percentages were calculated by dividing the number of agreements by the total number of PFA skills and multiplying by 100. IOA averaged 92% (range, 82% to 100%) across selected participants for the confederate PFAs (E1, 100%; W1, 95%; W4, 91%; E4, 82%). IOA was 82% for the authentic PFA. IOA for the authentic IISCA data was calculated for one of the four analyses (25%). The agreement percentage was calculated by


dividing the total number of agreements regarding the occurrence or nonoccurrence of problem behavior by the total number of agreements and disagreements and multiplying by 100. IOA was 100%.
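Both agreement calculations reduce to simple proportions. The sketch below assumes the conventional agreements-over-agreements-plus-disagreements formula for the occurrence/nonoccurrence measure; the observer records are fabricated for illustration.

```python
def skillwise_ioa(primary, secondary):
    """Percentage of PFA component skills on which two observers' records
    match exactly: agreements divided by total skills, times 100."""
    matches = sum(1 for a, b in zip(primary, secondary) if a == b)
    return 100 * matches / len(primary)

def occurrence_ioa(agreements, disagreements):
    """Agreement on the occurrence/nonoccurrence of problem behavior:
    agreements over agreements plus disagreements, times 100."""
    return 100 * agreements / (agreements + disagreements)

# Hypothetical records across the 22 component skills (True = skill emitted)
obs1 = [True] * 18 + [False] * 4
obs2 = [True] * 18 + [False, True, False, False]  # one disagreement
ioa = skillwise_ioa(obs1, obs2)  # 21 of 22 skills agree
```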

Design

A posttest-only group design (Campbell & Stanley, 1963, pp. 25-27) with preassignment matching and then random assignment was used to evaluate the effects of the PFA training seminar on participants' implementation of the PFA process. The randomized, posttest-only design allows for detection of an effect of an independent variable while controlling for interactions between history and testing effects. This design was selected because we expected some learning to occur on behalf of the participants during baseline PFAs conducted with confederates. Therefore, a pretest may have affected performance during posttests. In addition, given the resources needed to conduct each confederate PFA (e.g., coming in before school, staying late after school, time away from other clinical responsibilities for both the participants and researchers), the posttest-only design allowed the researchers to collect the necessary data for each participant in half the time it would have taken to administer a pretest to each participant.

Procedures

Participant Shared Experiences

Prior to the start of this study, we provided all potential participants with a document that described all stages of the experiment. We informed them that they would be randomly assigned to either the experimental or waitlist group and that they would conduct a functional assessment with a confederate client. They were all made aware of the chance that they would conduct the assessment without attending the seminar but, if that was the case, that they would attend the seminar following their assessment. After reviewing that document and consenting, participants from both groups completed prematching questionnaires (5-10 min) and either attended a PFA seminar (3 hr) and then conducted PFAs with confederate clients (10-40 min), or conducted PFAs with confederate clients (10-40 min) and then attended a PFA seminar (3 hr). Members of both groups received feedback on their performance (10-20 min) following their analysis implementation with confederate clients if they volunteered to implement a PFA with an actual client. Researchers followed identical interview and analysis scripts regardless of the group to which the participant was assigned. The PFA seminar was identical in content and duration for both groups, with some differences noted based on participants' questions as the seminars progressed. The experiences for participants in the experimental group differed from those in the waitlist group only with regard to when they attended the PFA seminar. Those in the experimental group attended the seminar prior to conducting a PFA with a confederate client, and those in the waitlist group attended the seminar after conducting a PFA with a confederate client.

PFA Seminar

The independent variable in this study was a 3-hr seminar, with one embedded 15-min break, presented to participants in the experimental group prior to their confederate PFA experience. The seminar, developed by the first and second authors and presented by the first author, was designed to provide participants with the skills required to conduct the open-ended interview, use the information gathered in the interview to design a safe, efficient analysis, and conduct the analysis. Participants were provided with a workbook, a pen, and blank paper to use for note-taking if they chose to do so. Some sections of the workbook were prepopulated with information regarding how to conduct the IISCA, and other sections were left blank to encourage participants to attend to the material being discussed (see Glodowski & Thompson, 2018, for a description of guided notes).


The seminar consisted of several components of BST, including didactic instruction based on a PowerPoint presentation, video examples of trained experimenters implementing PFA component skills with real clients, active responding during which participants collaborated on mini-assessments throughout the presentation, and discussion of four cases among participants. The seminar progressed from general discussion regarding FA safety and efficiency, to a description of and rationale for the PFA process, to examples of how to make adjustments that may result in a greater level of control over problem behavior during an analysis. For instance, the researcher discussed the importance of reinforcing nondangerous topographies of problem behavior that are likely members of the same response class as dangerous topographies (see Warner et al., 2020) to prevent the occurrence of dangerous behavior. In addition, participants were given a task analysis made up of each PFA component skill. Participants were encouraged, but not required, to take notes and ask questions throughout the seminar.

Confederate PFAs

All participants conducted a confederate PFA with an experimenter acting as a caregiver during the interview and as a child engaging in problem behavior during the analysis. All interviews and analyses were conducted on the same day; some were conducted back-to-back with 10-20 min allocated for the design and others were conducted with several hours in between (e.g., interview at 7:30 am and design/analysis at 3:00 pm). During the interview, we gave all participants a writing utensil, a folder with the open-ended interview (Hanley, 2012), and blank pieces of paper. The experimenter told each participant, "This is your chance to get some information to conduct a functional analysis. Here are some materials to do that—you can choose to use them or not. If you prefer to use this time differently, you may. You can stop at any time." Participants were free to use the time as they pleased and the experimenter was instructed to terminate any interview that exceeded 60 min; however, none did.

The experimenter was provided with a script
that outlined several responses to each question on the interview. If the participant asked the question as written in the interview, the experimenter responded with answer A; if the participant asked a follow-up question or inquired about additional detail, the experimenter responded with answer B; if the participant asked an additional question, the experimenter responded with answer C. In other words, each additional question asked by the participant resulted in more qualitatively rich detail regarding the child and/or the EOs and SRs influencing problem behavior. We chose to provide the researcher with a script with several response options so that participants would be required to ask additional questions to gather all the information they needed. The experimenters were provided with the expert's analysis design and instructed to reference that design when unsure how to answer a participant's question. For example, if a participant asked a question about EOs and/or reinforcers that were not in the script, the researcher referenced the expert's analysis design and provided information such that the participant could achieve generic reliability with the expert.

During the design process, we gave participants a writing utensil and a folder with an analysis design form (developed for the PFA seminar) and blank pieces of paper. The experimenter told the participant, "This is your time to design your conditions. Here are some materials to do that—you can choose to use them or not. Let me know when you are ready to proceed with the analysis." Participants were free to use the time as they pleased and the experimenter was instructed to terminate any design process that exceeded 20 min; however, none did.

During the analysis, all participants were provided with a writing utensil, a clipboard, a data sheet, a timer, and a plastic storage bin with the following items: toy cars, crayons, coloring sheets, math worksheets, toothpaste, a toothbrush, puzzles, sight word flashcards, and math flashcards. The contents of the bin consisted of all SR and EO materials that the experimenters were instructed to divulge during the interview. In addition, there were several other reinforcers and EO materials included that were not suggested during the interview. The experimenter told the participant, "This is your time to conduct your analysis. Here are some materials to do that—you can choose to use them or not. You can terminate the analysis at any point. Please identify what you are doing by saying aloud the condition you are running. For example, you could say, 'Starting control condition,' before you start a control condition. Take 3-5 minutes to get set up and we will begin."

The researcher was provided with a description
of how to behave during each condition depending on what the participant did. For example, if the participant refrained from implementing any EOs during the control condition, the researcher did not engage in any problem behavior. However, if the participant implemented any EO during the control condition, the researcher immediately engaged in the least dangerous topography of problem behavior reported to precede or co-occur with the dangerous topographies. If the participant reinforced that behavior, the researcher stopped. If the participant did not reinforce that response, the researcher engaged in the next least-dangerous topography and continued up the response-class hierarchy until the most concerning topography of behavior (i.e., self-injury) was simulated. Unlike previous FA training studies, confederates engaged in problem behavior contingent on the participant implementing an EO instead of on a time-based schedule, to better emulate authentic FAs.
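The confederate's escalation rule can be sketched in code. This is a minimal illustration, not the study's materials: the topography names and the reinforcement-check callback are hypothetical assumptions, standing in for a four-member response class ordered least to most dangerous.

```python
# Sketch of the confederate's escalation rule described above. The
# hierarchy below is a hypothetical example, not the study's materials.
HIERARCHY = ["whining", "yelling", "aggression", "self-injury"]

def confederate_response(eo_presented, participant_reinforces):
    """Simulate the confederate's behavior in one condition.

    eo_presented: whether the participant implemented any EO.
    participant_reinforces: callable taking a topography and returning
        True if the participant delivers the synthesized reinforcers.
    Returns the list of topographies the confederate emitted.
    """
    if not eo_presented:
        return []  # no EO (e.g., a well-run control) -> no problem behavior
    emitted = []
    for topography in HIERARCHY:  # escalate up the response class
        emitted.append(topography)
        if participant_reinforces(topography):
            break  # reinforcement stops the escalation
    return emitted

# A participant who reinforces the first nondangerous topography keeps
# the simulation from escalating to self-injury.
print(confederate_response(True, lambda t: True))   # ['whining']
# A participant who never reinforces sees the full hierarchy simulated.
print(confederate_response(True, lambda t: False))
```

The design choice this mirrors is the contingency-based (rather than time-based) schedule: simulated problem behavior occurs only when the participant's own EO implementation and reinforcement errors make it likely.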

Social Validity

After completion of the confederate PFAs, we asked each participant to rank their confidence in their ability to conduct a safe and efficient functional analysis. Participants were asked to rank their confidence and/or ability in different components of conducting a PFA from 1 (not at all) to 7 (very much so). See Table 2 for specific social validity statements.

Authentic PFAs

We invited all participants to the authentic PFA portion of this study after they completed the confederate analysis portion and attended the seminar (waitlist group only). Two participants from the experimental group and two participants from the waitlist group conducted a PFA with a client. The primary author met with each participant via phone for 10-20 min to review the video of their confederate PFA and provided feedback on any component skill not implemented fully during the process. Clients were nominated for participation by their clinical teams. Caregivers for each client provided informed consent, which outlined what their child would experience during the PFA process. Client assent to participate was evaluated prior to and throughout every session by providing them with the option of coming to session or leaving at any time. If a client chose to opt out of session or leave during a session, that session would be terminated; however, this never occurred.

Participant E7 conducted an authentic PFA with Hannah, an 8-year-old girl diagnosed with autism spectrum disorder who engaged in self-injury, property destruction, aggression, crying, and bolting. She communicated using an alternative and augmentative communication (AAC) device and liked to play with her iPad, music toys, and swings. Hannah was identified for participation in this study due to a recent increase in dangerous behavior resulting in the need for emergency physical restraint procedures to prevent injury to herself and her caregivers.

Participant E3 conducted an authentic PFA with Cam, a 19-year-old man diagnosed with autism spectrum disorder who engaged in aggression, property destruction, foot stomping, and yelling. He communicated vocally and enjoyed playing on his iPad while interacting with his teachers. Cam was identified for participation in this study because he had several inconclusive FAs and he continued to engage in dangerous problem behavior in his school and residence.

Participant W6 conducted an authentic PFA
with Daniel, a 16-year-old young man diagnosed with autism spectrum disorder, attention-deficit/hyperactivity disorder, intellectual disability, cerebral palsy, and Blount disease who engaged in head-directed self-injury, aggression, property destruction, and swearing. He communicated vocally and enjoyed playing with his toys including blocks and electronics while interacting with teachers. Daniel was identified for participation in this study because the severity of his problem behavior had caused injury to himself and staff members.

Participant W9 conducted an authentic PFA with Albert, a 15-year-old young man diagnosed with autism spectrum disorder who engaged in aggression, property destruction, swearing, and vocal protests. He communicated vocally and his preferences included playing keyboard, taking photos/videos and talking about them with his teachers, and playing with toy bugs. Albert was identified for participation in this study because the severity of his aggression had recently led to his school district placing him out of district at a private school for children with autism and severe problem behavior.

During all authentic IISCAs, a researcher was
present to film the analysis and provide guidance to the participant only if it appeared likely that dangerous problem behavior might occur due to participant error (e.g., progressing the EO too quickly, not reinforcing nondangerous topographies). Researchers were instructed to monitor for participant error, or any other variable, that might result in dangerous behavior and intervene to ensure safety for everyone involved. For example, if a participant had progressed an EO too quickly (e.g., abruptly removing all tangibles and instructing the client to complete the most difficult task) or withheld some reinforcers contingent on problem behavior during a test condition, the researcher would have prompted the participant to follow the IISCA task analysis that she received during the PFA seminar. However, this did not occur for any participants; these four participants conducted all steps of the PFA process independently.

Follow-Up Questionnaire

Ten months after attending the PFA seminar, we surveyed all participants regarding their functional assessment practices since training. We asked participants if they were currently able to initiate or implement functional analyses in their settings. We also asked participants how many functional analyses they had designed or conducted since attending the PFA seminar (this same question was asked prior to the training in the screening process, allowing for a comparison of responses).

Results

Confederate PFAs

Total PFA scores from each group are summarized in Figure 1. All PFA performance scores for those in the experimental group were higher than those in the waitlist group. A two-tailed Mann–Whitney U statistic revealed that the PFA seminar led to a statistically significant difference with respect to the target PFA skills (U = 0.0, p < .05), suggesting the seminar was responsible for the improved performance conducting IISCAs. The between-groups effect size statistic describes a relatively large effect (d = 3.49).

Randomization of the matched pairs resulted in no difference in BCBA status across groups; however, the number of FAs conducted favored the waitlist group and the number of years of employment favored the experimental group. Furthermore, Pearson correlations (r), calculated for each matching factor and performance both within and across groups, revealed statistically nonsignificant correlations.

Participants in the waitlist group demonstrated fewer overall component skills of the PFA process than participants in the experimental group (see Figure 2). The mean total PFA score for participants in the waitlist group was 36% correct. By contrast, the mean total PFA score for participants in the experimental group was 87% correct. Within the waitlist group, participant W3 achieved the lowest overall PFA implementation score and participant W6 achieved the highest (7% correct and 71% correct, respectively). In general, participants in the waitlist group demonstrated a majority of the interview component skills at least partially. Only two participants, W1 and W2, demonstrated each design component skill at least partially. The remaining participants omitted at least one design component skill. Performance during the analysis, however, varied among participants in the waitlist group. Participant W6
demonstrated the most analysis component skills fully. A few participants, W2, W4, and W5, demonstrated some skills to proficiency. Others, W1, W3, W7, W8, and W9, demonstrated few or no skills to proficiency.

By contrast, participants in the experimental group demonstrated relatively high levels of PFA component skills. Participant E2 achieved the lowest overall PFA implementation score and participant E5 achieved the highest (73% correct and 96% correct, respectively). In general, participants in the experimental group demonstrated a majority of interview component skills fully, with one notable exception. Several participants received partial or no credit for asking follow-up questions; however, this did not appear to impact their ability to design and conduct their analysis. All participants in the experimental group demonstrated at least half of the design component skills fully, with four participants demonstrating all four design component skills fully.1

Performance during the analysis was consistent across participants in the experimental group, although consistent errors were observed with two component skills in particular. For example, several participants failed to reinforce the first instance of problem behavior and instead waited to reinforce a more dangerous topography (e.g., withheld reinforcers for whining but delivered them for physical aggression). In addition, five participants failed to provide reinforcement for 20-40 s contingent on problem behavior in a test condition, with some participants reinforcing for less than 20 s and others reinforcing for longer than 40 s. However, despite these errors and with the exception of participant E2, all participants in the experimental group achieved total PFA implementation scores of 80% correct or higher.
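For readers who want to reproduce this style of group comparison, the two statistics can be computed directly. This is a minimal sketch, not the authors' analysis code; the score lists are hypothetical placeholders chosen only to mirror the reported pattern of complete group separation.

```python
# Sketch: a Mann-Whitney U statistic and a pooled-SD Cohen's d for two
# groups of PFA implementation scores. The score lists below are
# illustrative placeholders, not the study's data.
from statistics import mean, stdev

def mann_whitney_u(x, y):
    """U for group x: count of (x_i, y_j) pairs where x_i exceeds y_j,
    with ties counted as half."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

def cohens_d(x, y):
    """Between-groups effect size using the pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled = (((nx - 1) * stdev(x) ** 2 + (ny - 1) * stdev(y) ** 2)
              / (nx + ny - 2)) ** 0.5
    return (mean(x) - mean(y)) / pooled

waitlist = [7, 20, 30, 35, 40, 45, 50, 55, 71]       # hypothetical % correct
experimental = [73, 82, 85, 87, 89, 91, 93, 95, 96]  # hypothetical % correct

# Complete separation of the groups yields U = 0 for the lower group,
# mirroring the U = 0.0 reported for the confederate PFA scores.
print(mann_whitney_u(waitlist, experimental))  # 0.0
print(round(cohens_d(experimental, waitlist), 2))
```

A library routine such as scipy.stats.mannwhitneyu would also supply the two-tailed p-value; the hand-rolled version above only illustrates what the U statistic counts.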

Social Validity

Immediately following their confederate PFA experiences, participants from both the waitlist and experimental groups responded to a survey in which they ranked their confidence or ability to implement the PFA process on a scale from 1-7 (1 = not at all; 4 = unsure; 7 = very much so). This question was an attempt to

Figure 1
Total PFA Implementation Scores for Waitlist and Experimental Group Participants
U = 0.0, p < .05; d = 3.49
Note. Mean lines, a two-tailed Mann–Whitney U statistic, and a between-groups effect size statistic (d) are reported.

1 Design data for participant E3 were misplaced and not available for inclusion in the analysis.


Figure 2
PFA Component and Total PFA Scores
Note: Waitlist (36.1%) and experimental (87.4%) means are represented by dashed and solid lines, respectively. Breaks in the data path represent no opportunity to observe the component skill. *Design data unavailable for participant E3.


measure the meaningfulness of the outcomes (Wolf, 1978). Results from that survey are displayed in Table 2. Participants in both the waitlist and experimental groups felt confident in their ability to gather information to design an ecologically relevant, safe, and socially acceptable FA. Participants in the waitlist group felt less confident in their ability to efficiently demonstrate control over problem behavior than participants in the experimental group. Most participants from the waitlist group did not respond to the question regarding the training they received on the PFA process. By contrast, the majority of participants in the experimental group reported that the training they received regarding the PFA process enhanced their ability to design, conduct, and interpret an FA.

A two-tailed Mann–Whitney U statistic and a between-groups effect size statistic (d) are reported for the social validity measures. There was no statistically significant difference between participants' confidence in gathering relevant information to conduct an FA nor in their confidence in conducting an FA that would be safe and socially acceptable to the

Figure 3
Results from Authentic IISCAs
Problem behavior per minute across control and test sessions for Hannah, Cam, Daniel, and Albert.
Note: Data from final iterations presented for Daniel and Albert.

Table 2
Social Validity

Statement: I felt confident in my ability to gather relevant information to design an ecologically relevant FA.
  Waitlist ratings: 6, 6, 5, 6, 4, 6, 6, 6, 3 (mode 6; range 3-6). Experimental ratings: 7, 6, 7, 6, 6, 6, 3, 7, 6 (mode 6; range 3-7). U = 20.5, d = 0.62.

Statement: I felt confident that I could conduct an efficient FA that would yield sufficient functional control.
  Waitlist ratings: 5, 5, 4, 3, 3, 4, 2, 4, 3 (mode 3, 4; range 2-5). Experimental ratings: 6, 6, 5, 6, 5, 5, 2, 6, 6 (mode 6; range 2-6). U = 11.5, d = 1.28.

Statement: I felt confident that I could conduct a safe FA that is socially acceptable to the client and his/her caregivers.
  Waitlist ratings: 5, 7, 5.5, 3, 3, 6, 6, 5, 6 (mode 6; range 3-7). Experimental ratings: 7, 6, 7, 7, 6, 4, 3, 6, one no response (mode 6, 7; range 3-7). U = 24, d = 0.41.

Statement: The training I received regarding conducting PFAs enhanced my ability to design, conduct, and interpret an FA.
  Waitlist ratings: 5, 4, 3, 4, 1, 5, three no responses (mode 4, 5; range 1-5). Experimental ratings: 7, 7, 7, 7, 7, 6, 7, 6, 7 (mode 7; range 6-7). U = 0.0, d = 2.76.

Note: 1 = not at all; 4 = unsure; 7 = very much so. Ratings were collected from participants 1-9 in each group. A two-tailed Mann–Whitney U statistic and a between-groups effect size statistic (d) are reported.


client's caregivers. The between-groups effect size statistic describes a relatively large effect in regard to participants' confidence in their ability to implement an efficient FA that yielded differential outcomes and their interpretation of how their training enhanced their ability to design, conduct, and interpret an FA (d = 1.3 and 2.8, respectively).

Authentic PFAs

Participants' performances during their authentic PFAs are depicted in the final column of Figure 2. Their performance during the authentic PFA process was evaluated identically to the confederate PFA experience. All participants demonstrated the majority of PFA component skills to proficiency: Participant E7's total PFA score during her authentic PFA was 91%, Participant E3's total PFA score was 96%, Participant W6's total PFA score was 96%, and Participant W9's total PFA score was 100%.

The results from the authentic IISCAs are
depicted in Figure 3. Hannah's caregiver reported that problem behavior was most likely to occur when her preferred toys and attention were removed and she was instructed to go to her table to engage in academic demands. During the control condition, Hannah was given continuous access to her preferred toys and attention from Participant E7, and no demands were presented. During the test condition, Participant E7 terminated access to the preferred toys, removed her attention other than providing instructions, and instructed Hannah to transition to the worktable. Contingent on the occurrence of any problem behavior, Participant E7 removed all EOs and delivered access to the synthesized reinforcers. During the analysis, elevated rates of problem behavior were observed during the test condition and zero problem behavior was observed during the control conditions.

Cam's caregiver reported that problem behavior was most likely to occur when Cam had to relinquish his iPad, attention from his staff diminished, staff did not comply with his mands, and staff presented academic demands. During the control condition, Cam was given continuous access to his iPad, attention from Participant E3 in the form of mand compliance and discussion about his videos, and no demands were presented. During the test condition, Participant E3 removed the iPad, did not comply with Cam's mands, and provided instructions to complete an academic task. Contingent on the occurrence of any problem behavior reported to co-occur, Participant E3 removed all EOs and delivered access to the synthesized reinforcers. During the analysis, elevated rates of problem behavior were observed during the test condition and no problem behavior was observed during the control conditions.

Daniel's caregiver reported that problem
behavior was most likely to occur when Daniel was instructed to stop playing with his toys without warning of the upcoming transition to a less-preferred activity. During the control condition, Daniel was allowed continuous access to his preferred toys and conversation about his favorite videos without any instruction to terminate playing and start a new task. During the test condition, Participant W6 removed the preferred items from Daniel and prompted him to complete an academic task without any warning of the transition. Contingent on any problem behavior reported to co-occur, Participant W6 terminated all EOs and delivered access to the synthesized reinforcers. Participant W6 decided to conduct two iterations of the IISCA due to no responding in the first iteration. In the second iteration, Participant W6 placed Daniel's preferred toys out of view during the test conditions. This change in EO presentation resulted in elevated rates of problem behavior in the test condition and no problem behavior in the control condition.

Albert's caregiver reported that problem
behavior was most likely to occur when a teacher interrupted him playing with preferred toys, stopped providing him with attention relevant to those toys/activities, and instructed him to complete a difficult academic task. During the control condition, Albert was allowed to play with a variety of preferred toys including the keyboard, plastic bugs, and an iPad to use for taking pictures and videos. Participant W9 provided him with continuous attention related to those ongoing activities. During the test condition, Participant W9 instructed Albert to stop playing, relinquish his positive reinforcers, and complete a difficult academic task. In the first iteration, Participant W9 provided prompting to complete the task (data not shown). Despite designing these conditions based on caregiver report, the EOs were not strong enough to evoke problem behavior. Participant W9 independently altered the conditions and instructed Albert to relinquish his positive reinforcers and complete a difficult academic task independently. Despite this change, Albert did not engage in any problem behavior during the analysis.
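The contingency shared by the four analyses above (suspected reinforcers freely available in control; EOs in effect and synthesized reinforcement contingent on problem behavior in test) can be sketched as a small data structure. The condition contents are illustrative assumptions, not the participants' actual analysis designs.

```python
# Sketch of the IISCA test/control contingency pattern described above.
# Condition contents are hypothetical examples, not the study's designs.
from dataclasses import dataclass, field

@dataclass
class Condition:
    name: str
    eos_presented: bool                 # are establishing operations in effect?
    reinforcers: list = field(default_factory=list)

def on_problem_behavior(condition):
    """Contingency: in a test condition, remove all EOs and deliver the
    synthesized reinforcers. In a control condition the EOs are already
    absent, so nothing is delivered contingently."""
    if condition.eos_presented:
        condition.eos_presented = False  # remove the EOs
        return condition.reinforcers     # deliver synthesized SRs together
    return []

control = Condition("control", eos_presented=False,
                    reinforcers=["toys", "attention"])  # freely available
test = Condition("test", eos_presented=True,
                 reinforcers=["toys", "attention"])     # contingent only

print(on_problem_behavior(test))  # ['toys', 'attention']
print(test.eos_presented)         # False
```

The point of the sketch is that the reinforcers are synthesized: all suspected reinforcers are delivered together, rather than one at a time as in a traditional multielement FA.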

Follow-Up Questionnaire

Of 18 participants, 12—six from the experimental group and six from the waitlist group—returned the survey (67% return rate). All 12 respondents were working in positions in which they were able to initiate functional assessments either during clinical review or team meetings. Prior to attending the PFA seminar, 33% of respondents (n = 4) reported that they had designed and/or conducted a total of eight FAs in their work history. Ten months after attending the PFA seminar, 100% of respondents (n = 12) reported designing and/or conducting a total of 39 FAs, representing almost a five-fold increase in the use of functional analysis among respondents (see Figure 4).

Discussion

The PFA seminar proved to be an effective and socially validated method for training behavior-analytic practitioners to conduct a practical functional assessment process with confederate and some actual clients. In this study, we addressed several barriers preventing practitioners from using FA as identified by Oliver et al. (2015) and Roscoe et al. (2015). Survey respondents cited inadequate training, lack of time, and social acceptability of FA procedures as barriers to conducting FAs. The participants in the current study were adequately trained via seminar to conduct FAs efficiently (range, 10 to 40 min) and safely (e.g., minimal dangerous problem behavior occurred).

After attending the seminar, participants demonstrated the ability to gather information regarding a response class of problematic behaviors as well as the EOs and reinforcers suspected to be influencing problem behavior. They synthesized the information gathered during the interview to design an ecologically relevant analysis with all members of the response class eligible for reinforcement during the test condition and all suspected reinforcers freely available during the control condition. Participants then conducted their confederate IISCAs and demonstrated control over problem behavior. The PFA seminar resulted in all but one participant in the
Figure 4
Follow-Up Survey Results
Number of reported FAs designed and/or conducted, pre and post PFA seminar (n = 12 respondents).


experimental group demonstrating proficiency (i.e., at least 80% correct implementation) with the PFA process.

Several components of the PFA seminar likely contributed to its positive effects. The emphasis on role plays and active responding, for example, provided participants with multiple opportunities to practice and receive feedback. It may have been the case that participants' performance was influenced by receiving direct feedback and by observing others receive feedback, as the training was provided to a group. In addition, the video examples may have provided effective models of how to perform during a PFA. The effects of video modeling on staff skill acquisition are well documented (see Bovi et al., 2017 and Deliperi et al., 2015 for recent examples) and are likely to have contributed to participants' performances. A seminar-based approach to training staff to conduct FAs might not be as efficacious without all or some of these components.

This training, in addition to feedback following implementation with a confederate, also led to successful implementation of the PFA process with clients who engage in SPB. The interaction between the effects of the seminar and feedback in this study is unknown. Identifying the effects of a seminar experience on PFA implementation without any feedback would be important, as many professionals who attend workshops may not have the opportunity for feedback prior to implementation. Therefore, at this time, our recommendation is for professionals to arrange for observation and feedback on their PFA implementation following workshop experiences.

Four of 18 participants (22%) conducted
authentic PFAs with clients. There are several reasons why more participants did not conduct the authentic PFA. First, it may be the case that only four participants had a client who was due for an updated behavioral assessment. It may also be the case that participants were not ultimately responsible for behavioral assessment for the clients they served and, therefore, felt uncomfortable nominating them for participation. A final consideration may be that, despite wanting to conduct an authentic PFA with a client on their caseload, participants were prevented from doing so because the client was already undergoing some other behavioral assessment. Despite a limited number of authentic PFAs, we encourage future researchers to more systematically evaluate generalized applications of FA methodology.

During the authentic IISCAs, three out of
four participants successfully evoked and reinforced nondangerous topographies of problem behavior, preventing the occurrence of dangerous behaviors that were reported to be members of the same response class. This seems to be an important emphasis for PFA trainings given the strong support for this tactic shown in Warner et al. (2020). Given the variability with which this tactic was implemented with confederate clients, this aspect of the seminar-based training should probably be strengthened in future applications.

Despite a wide variety of expertise and experience among participants in the experimental group, the PFA training resulted in greater reported confidence in conducting analyses that yield differential outcomes. In fact, participants in both the experimental and waitlist groups reported using FAs more often in their practice following the seminar. More specifically, all 12 participants who responded to the follow-up survey reported conducting FAs within 10 months of completing the study. This contrasts with only one third of participants reporting FA activity prior to the PFA seminar. A limitation of the current study is that we did not demonstrate experimental control over the 39 future applications of FA reported by our survey respondents. Future research should push out the scheduling of the waitlist group's seminar experience by several months from that of the experimental group so that a more experimentally rigorous understanding of the general impact of the PFA seminar can be realized.


Future researchers might also consider how to augment the effects of the seminar for participants who do not demonstrate proficiency. Griffith et al. (2019) provided individualized instructions to participants who did not demonstrate proficiency following a self-instruction package and small group training. It is possible that similar, individualized teaching would enhance participants' PFA skills. Future researchers might also consider using video modeling, similar to Moore and Fisher (2007), as a method for improving performance following the seminar. Participants who failed to meet criteria with a confederate might also benefit from more support during application with a client. Supported application with a real client would allow for the expert to provide coaching and feedback on all component skills. This support could be provided on-site or at a distance given the advances in telehealth technology (Peterson et al., 2017; Wacker et al., 2013).

Another limitation of the current study was
that we did not teach participants how to engage in the iterative process that is sometimes involved in PFAs. Three out of four authentic IISCAs were differentiated, which is consistent with previous studies replicating IISCAs in clinical settings (Jessel et al., 2016, 2018). However, it is possible that Participant W9 would have achieved control over problem behavior with Albert had we spent more time discussing what to do when the first iteration does not result in a differentiated outcome. A refinement of the PFA component skills might include problem solving such that control over problem behavior is achieved.

Another next step for future research would be to evaluate the effects of a similar seminar on designing and implementing treatment based on the results of a PFA. A seminar might be a useful way to disseminate basic information about function-based treatments. Future researchers should consider evaluating the extent to which such a training might augment a collaborative implementation process in which experts consult with practitioners learning to implement interventions. Given the complex decision-making skills required to implement treatment protocols that result in meaningful reductions in problem behavior, a seminar without supported application would likely not be an effective training method.

REFERENCES

Alnemary F., Wallace, M., Alnemary, F.,Gharapetian, L., & Yassine, J. (2017). Application ofa pyramidal training model on the implementationof trial-based functional analysis: A partial replication.Behavior Analysis in Practice, 10(3), 301-306. https://doi.org/10.1007/s40617-016-0159-3

Alnemary, F. M., Wallace, M., Symon, J. B. G., &Barry, L. M. (2015). Using international videoconfer-encing to provide staff training on functional behav-ioral assessment. Behavioral Interventions, 30(1), 73-86. https://doi.org/10.1002/bin.1403

Association for Behavior Analysis International (ABAI)(2020). 45th Annual Convention Program. Retrievedfrom https://www.abainternational.org/events/program-details/event-detail.aspx?intConvId=57&by=Workshop&date=05/23/2019#divMastrTopBanner0

Beaulieu, L., Van Nostrand, M. E., Williams, A. L., &Herscovitch, B. (2018). Incorporating interview-informed functional analyses into practice. BehaviorAnalysis in Practice, 11(4), 385-389. https://doi.org/10.1007/s40617-018-0247-7

Beavers, G. A., Iwata, B. A., & Lerman, D. C. (2013).Thirty years of research on the functional analysis ofproblem behavior. Journal of Applied Behavior Analy-sis, 46(1), 1-21. https://doi.org/10.1002/jaba.30

Behavior Analyst Certification Board (BACB) (2016).Professional and Ethical Compliance Code. Retrievedfrom https://www.bacb.com/wp-content/uploads/2017/09/170706-compliance-code-english.pdf

Berkshire Association for Behavior Analysis and Therapy(BABAT) (2020). 40th Annual Conference Program.Retrieved from https://babat.org/wp-content/uploads/2019/03/BABAT-2019-Conference-Program-revised-.pdf

Bloom, S. E., Iwata, B. A., Fritz, J. N., Roscoe, E. M., &Carreau, A. B. (2011). Classroom application of atrial-based functional analysis. Journal of AppliedBehavior Analysis, 44(1), 19-31.https://doi.org/10.1901.jaba2011.44-19

Bovi, G., Vladescu, J. C., DeBar, R. M., Carroll, R. A., & Sarokoff, R. A. (2017). Using video modeling with voice-over instruction to train public school staff to implement a preference assessment. Behavior Analysis in Practice, 10(1), 72-76. https://doi.org/10.1007/s40617-016-0135-y


Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Houghton Mifflin.

Chok, J. T., Shlesinger, A., Studer, L., & Bird, F. L. (2012). Description of a practitioner training program on functional analysis and treatment development. Behavior Analysis in Practice, 5(2), 25-36. https://doi.org/10.1007/BF03391821

Chusid Rose, J., & Beaulieu, L. (2019). Assessing the generality and durability of interview-informed functional analyses and treatment. Journal of Applied Behavior Analysis, 52(1), 271-285. https://doi.org/10.1002/jaba.504

Deliperi, P., Vladescu, J. C., Reeve, K. F., Reeve, S. A., & DeBar, R. M. (2015). Training staff to implement a paired-stimulus preference assessment using video modeling with voiceover instruction. Behavioral Interventions, 30(4), 314-332. https://doi.org/10.1002/bin.1421

Ellingson, S. A., Miltenberger, R. G., & Long, E. S. (1999). A survey of the use of functional assessment procedures in agencies serving individuals with developmental disabilities. Behavioral Interventions, 14(4), 187-198. https://doi.org/10.1002/(SICI)1099-078X(199910/12)14:4<187::AID-BIN38>3.0.CO;2-A

Erbas, D., Tekin-Iftar, E., & Serife, Y. (2006). Teaching special education teachers how to conduct functional analysis in natural settings. Education and Training in Developmental Disabilities, 41(1), 28-36. https://www.jstor.org/stable/23879866

Flynn, S. D., & Lo, Y. (2016). Teacher implementation of trial-based functional analysis and differential reinforcement of alternative behavior for students with challenging behavior. Journal of Behavioral Education, 25(1), 1-31. https://doi.org/10.1007/s10864-015-9231-2

Glodowski, K., & Thompson, R. (2018). The effects of guided notes on pre-lecture quiz scores in introductory psychology. Journal of Behavioral Education, 27(1), 101-123. https://doi.org/10.1007/s10864-017-9274-7

Griffith, K. R., Price, J. N., & Penrod, B. (2019). The effects of a self-instruction package and group training on trial-based functional analysis administration. Behavior Analysis in Practice, 13(1), 63-80. https://doi.org/10.1007/s40617-019-00388-9

Hanley, G. P. (2012). Functional assessment of problem behavior: Dispelling myths, overcoming implementation obstacles, and developing new lore. Behavior Analysis in Practice, 5(1), 54-72. https://doi.org/10.1007/BF03391818

Hanley, G. P., Iwata, B. A., & McCord, B. E. (2003). Functional analysis of problem behavior: A review. Journal of Applied Behavior Analysis, 36(2), 147-185. https://doi.org/10.1901/jaba.2003.36-147

Hanley, G. P., Jin, C. S., Vanselow, N. R., & Hanratty, L. A. (2014). Producing meaningful improvements in problem behavior of children with autism via synthesized analyses and treatments. Journal of Applied Behavior Analysis, 47(1), 16-36. https://doi.org/10.1002/jaba.106

Herman, C., Healy, O., & Lydon, S. (2018). An interview-informed synthesized contingency analysis to inform the treatment of challenging behavior in a young child with autism. Developmental Neurorehabilitation, 21(3), 202-207. https://doi.org/10.1080/17518423.2018.1437839

Himle, M. B., Miltenberger, R. G., Flessner, C., & Gatheridge, B. (2004). Teaching safety skills to children to prevent gun play. Journal of Applied Behavior Analysis, 37(1), 1-9. https://doi.org/10.1901/jaba.2004.37-1

Individuals with Disabilities Education Act, 20 U.S.C. § 1400 (2004). Sec. 300.530 (d) (1) (ii). Retrieved from https://sites.ed.gov/idea/regs/b/e/300.530/d/1/ii

Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1994). Toward a functional analysis of self-injury. Journal of Applied Behavior Analysis, 27(2), 197-209. https://doi.org/10.1901/jaba.1994.27-197 (Reprinted from Analysis and Intervention in Developmental Disabilities, 2(1), 3-20, 1982)

Iwata, B. A., Wallace, M. D., Kahng, S., Lindberg, J. S., Roscoe, E. M., Conners, J., Hanley, G. P., Thompson, R. H., & Worsdell, A. S. (2000). Skill acquisition in the implementation of functional analysis methodology. Journal of Applied Behavior Analysis, 33(2), 181-194. https://doi.org/10.1901/jaba.2000.33-181

Jessel, J., Hanley, G. P., & Ghaemmaghami, M. (2016). Interview-informed synthesized contingency analyses: Thirty replications and reanalysis. Journal of Applied Behavior Analysis, 49(3), 1-22. https://doi.org/10.1002/jaba.316

Jessel, J., Ingvarsson, E. T., Metras, R., Kirk, H., & Whipple, R. (2018). Achieving socially significant reductions in problem behavior following the interview-informed synthesized contingency analysis: A summary of 25 outpatient applications. Journal of Applied Behavior Analysis, 51(1), 130-157. https://doi.org/10.1002/jaba.436

Kratochwill, T. R., & Shapiro, E. S. (2000). Conceptual foundations of behavioral assessment in schools. In E. S. Shapiro & T. R. Kratochwill (Eds.), Behavioral assessment in schools (pp. 3-15). Guilford Press.

Kunnavatana, S. S., Bloom, S. E., Samaha, A. L., & Dayton, E. (2013). Training teachers to conduct trial-based functional analyses. Behavior Modification, 37(6), 707-722. https://doi.org/10.1177/0145445513490950

Lambert, J. M., Bloom, S. E., Clay, C. J., Kunnavatana, S. S., & Collins, S. D. (2014). Training residential staff and supervisors to conduct traditional functional analyses. Research in Developmental Disabilities, 35(7), 1757-1765. https://doi.org/10.1016/j.ridd.2014.02.014


Lambert, J. M., Bloom, S. E., Kunnavatana, S. S., Collins, S. D., & Clay, C. J. (2013). Training residential staff to conduct trial-based functional analyses. Journal of Applied Behavior Analysis, 46(1), 296-300. https://doi.org/10.1002/jaba.17

Lambert, J. M., Lloyd, B. P., Staubitz, J. L., Weaver, E. S., & Jennings, C. M. (2014). Effect of an automated training presentation on pre-service behavior analysts’ implementation of trial-based functional analysis. Journal of Behavioral Education, 23(3), 344-367. https://doi.org/10.1007/s10864-014-9197-5

Moore, J. W., Edwards, R. P., Sterling-Turner, H. E., Riley, J., DuBard, M., & McGeorge, A. (2002). Teacher acquisition of functional analysis methodology. Journal of Applied Behavior Analysis, 35(1), 73-77. https://doi.org/10.1901/jaba.2002.35-73

Moore, J. W., & Fisher, W. W. (2007). The effects of videotape modeling on staff acquisition of functional analysis methodology. Journal of Applied Behavior Analysis, 40(1), 197-202. https://doi.org/10.1901/jaba.2007.24-06

Oliver, A. C., Pratt, L. A., & Normand, M. P. (2015). A survey of functional behavior assessment methods used by behavior analysts in practice. Journal of Applied Behavior Analysis, 48(4), 817-829. https://doi.org/10.1002/jaba.256

Peterson, K. M., Piazza, C. C., Luczynski, K. C., & Fisher, W. W. (2017). Virtual-care delivery of applied-behavior-analysis services to children with autism spectrum disorder and related conditions. Behavior Analysis: Research and Practice, 17(4), 286-297. https://doi.org/10.1037/bar0000030

Phillips, K. J., & Mudford, O. C. (2008). Functional analysis skills training for residential caregivers. Behavioral Interventions, 23(1), 1-12. https://doi.org/10.1002/bin.252

Rispoli, M., Burke, M. D., Hatton, H., Ninci, J., Zaini, S., & Sanchez, L. (2015). Training Head Start teachers to conduct trial-based functional analysis of challenging behavior. Journal of Positive Behavior Interventions, 17(4), 1-10. https://doi.org/10.1177/1098300715577428

Rispoli, M., Neely, L., Healy, O., & Gregori, E. (2016). Training public school special educators to implement two functional analysis models. Journal of Behavioral Education, 25(3), 249-274. https://doi.org/10.1007/s10864-016-9247-2

Roscoe, E. M., Phillips, K. M., Kelly, M. A., Farber, R., & Dube, W. V. (2015). A statewide survey assessing practitioners’ use and perceived utility of functional assessment. Journal of Applied Behavior Analysis, 48(4), 830-844. https://doi.org/10.1002/jaba.259

Santiago, J. L., Hanley, G. P., Moore, K., & Jin, C. S. (2016). The generality of interview-informed functional analyses: Systematic replications in school and home. Journal of Autism and Developmental Disorders, 46(3), 797-811. https://doi.org/10.1007/s10803-015-2617-0

Sigafoos, J., & Saggers, E. (1995). A discrete-trial approach to the functional analysis of aggressive behavior in two boys with autism. Australia and New Zealand Journal of Developmental Disabilities, 20(4), 287-297. https://doi.org/10.1080/07263869500035261

Slaton, J. D., Hanley, G. P., & Raftery, K. J. (2017). Interview-informed functional analyses: A comparison of synthesized and isolated components. Journal of Applied Behavior Analysis, 50(2), 252-277. https://doi.org/10.1002/jaba.384

Strand, R. C. W., & Eldevik, S. (2017). Improvement in problem behavior in a child with autism spectrum diagnosis through synthesized analysis and treatment: A replication in an EIBI home program. Behavioral Interventions, 33(1), 102-111. https://doi.org/10.1002/bin.1505

Taylor, S. A., Phillips, K. J., & Gertzog, M. G. (2018). Use of synthesized analysis and informed treatment to promote school reintegration. Behavioral Interventions, 33(4), 364-379. https://doi.org/10.1002/bin.1640

Wacker, D. P., Lee, J. F., Dalmau, Y. C., Kopelman, T. G., Lindgren, S. D., Kuhle, J., Pelzel, K. E., & Waldron, D. B. (2013). Conducting functional analyses of problem behavior via telehealth. Journal of Applied Behavior Analysis, 46(1), 31-46. https://doi.org/10.1002/jaba.29

Wallace, M. D., Doney, J. K., Mintz-Resudek, C. M., & Tarbox, R. S. F. (2004). Training educators to implement functional analyses. Journal of Applied Behavior Analysis, 37(1), 89-92. https://doi.org/10.1901/jaba.2004.37-89

Ward-Horner, J., & Sturmey, P. (2012). Component analysis of behavior skills training in functional analysis. Behavioral Interventions, 27(2), 75-92. https://doi.org/10.1002/bin.1339

Warner, C. A., Hanley, G. P., Landa, R. K., Ruppel, K. W., Rajaraman, A., Ghaemmaghami, M., Slaton, J. D., & Gover, H. C. (2020). Toward accurate inferences of response class membership. Journal of Applied Behavior Analysis, 53(1), 331-354. https://doi.org/10.1002/jaba.598

Wolf, M. M. (1978). Social validity: The case for subjective measurement or how applied behavior analysis is finding its heart. Journal of Applied Behavior Analysis, 11(2), 203-214. https://doi.org/10.1901/jaba.1978.11-203

Received August 4, 2020
Final acceptance March 25, 2021
Action Editor, Jeffrey Tiger

Supporting information

Additional Supporting Information may be found in the online version of this article at the publisher’s website.
