[vc_row][vc_column width=”2/3″][vc_custom_heading text=”Acceptability of Traditional & Electronic Daily Behavior Report Cards” font_container=”tag:h2|font_size:42|text_align:center” css=”.vc_custom_1460306474539{padding-top: 1em !important;padding-right: 1em !important;padding-left: 1em !important;}”][/vc_column][vc_column width=”1/3″][vc_column_text css=”.vc_custom_1460306621861{margin-top: 2em !important;}”]Sean Duncan & Kasey Johnson Duncan

KIPP Believe College Prep & Nicholls State University

Meagan Medley, PhD & Carmen Broussard, PhD

Nicholls State University[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_empty_space][vc_custom_heading text=”Lit Review” font_container=”tag:h2|font_size:34|text_align:left”][vc_column_text]One of a school psychologist’s responsibilities is to support the learning environment through consultation (Erchul & Martens, 2010). During this process, school psychologists help the consultee select and implement effective, research-based interventions, often using a problem-solving model (Kratochwill & Bergan, 1990).

Treatment acceptability is related to selecting and implementing effective interventions, as it is correlated with treatment integrity and student outcomes (Kratochwill & Bergan, 1990).

The Daily Behavior Report Card intervention has been found to be effective for addressing student behaviors (Vannest, Davis, Davis, & Mason, 2010), and acceptable to school staff (Chafouleas, 2006).

Purpose of Study: The purpose of the current study was to measure the impact that two different formats of Daily Behavior Report Cards (DBRC) had on pre-post intervention teacher acceptability ratings in two different settings.[/vc_column_text][vc_empty_space][vc_empty_space][vc_text_separator title=”Results” el_class=”h2″][/vc_column][/vc_row][vc_row][vc_column][vc_empty_space][vc_column_text]The IRP-15 yields a score range from 15 (least acceptable) to 90 (most acceptable). Results show that average total scores for teachers using the paper-and-pencil DBRC were lower than for teachers using the online version. This was true at both Pre and Post intervention.
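The scoring above can be illustrated with a minimal sketch. The IRP-15 comprises 15 items rated on a 6-point Likert scale (1 = strongly disagree to 6 = strongly agree), which is why totals range from 15 to 90; the ratings below are hypothetical, not data from this study.

```python
def irp15_total(ratings):
    """Total acceptability score for the IRP-15.

    ratings: 15 Likert ratings, each from 1 (strongly disagree)
    to 6 (strongly agree). Totals range from 15 (least acceptable)
    to 90 (most acceptable).
    """
    if len(ratings) != 15:
        raise ValueError("IRP-15 requires exactly 15 item ratings")
    if any(not 1 <= r <= 6 for r in ratings):
        raise ValueError("each rating must be between 1 and 6")
    return sum(ratings)

# A teacher rating every item 4 would total 60.
print(irp15_total([4] * 15))  # -> 60
```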

Average total scores for teachers using paper-and-pencil DBRC decreased from Pre to Post intervention, while average total scores for teachers using the online version did not.

A global rating of the intervention was considered using Item 15 of the IRP-15, which states, “Overall, this intervention would be beneficial for the child.” For teachers using the paper-and-pencil version, the median score was 5.1 at pre-intervention and 5.0 at post-intervention. For the online version, the median score was 5.0 at pre-intervention and 6.0 at post-intervention.

Student Analysis: The effect of the DBRC on student behavior was examined through visual analysis and calculation of Percent of Nonoverlapping Data (PND). For the paper-and-pencil group, one student showed strong improvement, two students showed moderate improvement, and one student showed no improvement. For the online group, one student showed strong improvement, one student showed moderate improvement, and two students showed no improvement.[/vc_column_text][vc_empty_space][/vc_column][/vc_row][vc_row][vc_column][vc_single_image image=”560″ alignment=”center”][vc_empty_space height=”4em”][vc_single_image image=”561″ alignment=”center”][/vc_column][/vc_row][vc_row][vc_column width=”1/4″ css=”.vc_custom_1460306009607{margin-right: .25em !important;margin-left: .25em !important;}”][vc_custom_heading text=”Method” font_container=”tag:h2|font_size:36|text_align:left”][vc_column_text]Participants
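The PND metric used in the student analysis above can be sketched as follows. PND is the percentage of intervention-phase data points that fall beyond the most extreme baseline point in the therapeutic direction; the counts below are hypothetical, not data from this study.

```python
def pnd(baseline, intervention, improvement="decrease"):
    """Percent of Nonoverlapping Data (PND) for a single-case (AB) design.

    PND = percentage of intervention-phase points beyond the most
    extreme baseline point in the therapeutic direction.
    """
    if improvement == "decrease":
        extreme = min(baseline)
        nonoverlap = [x for x in intervention if x < extreme]
    else:  # target behavior should increase
        extreme = max(baseline)
        nonoverlap = [x for x in intervention if x > extreme]
    return 100.0 * len(nonoverlap) / len(intervention)

# Hypothetical counts of classroom disruptions per day:
print(pnd([8, 10, 9, 11], [7, 6, 9, 5, 4]))  # -> 80.0
```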

Demographics are presented in table form. Participants were recruited via online solicitation using Survey Monkey. Southern states were primarily represented: ~75% of participants were from AL, MS, TN, LA, TX, SC, VA, KY, & FL. All other participants resided in 14 non-Southern states.

Dependent Variables

Ratings on a 5-point Likert scale (1 = strongly disagree to 5 = strongly agree) for measurability, observability, and social importance on items related to active engagement. See table.

Independent Variables

  • Item Wording: Positive vs Negative
  • School Role: Teachers vs School Psychologists


  1. Paired-samples t-test between wordings
  2. Repeated-measures ANOVA between school roles

[/vc_column_text][/vc_column][vc_column width=”3/4″ css=”.vc_custom_1460306024862{margin-right: 1em !important;margin-left: 1em !important;}”][vc_custom_heading text=”Discussion” font_container=”tag:h2|font_size:36|text_align:left”][vc_column_text]The purpose of this study was to measure teachers’ acceptability of two formats of Daily Behavior Report Cards. Results indicated that both paper-and-pencil and online formats were acceptable. While the online version was observed to be slightly more acceptable to teachers, the paper-and-pencil version yielded slightly more positive outcomes in regard to child behavior in the classroom.

In analyzing individual teacher responses across items, trends were variable. Four teachers (3 paper-and-pencil, 1 electronic) had lower post-intervention scores than pre-intervention scores. In comparison, four teachers (1 paper-and-pencil, 3 electronic) had increased scores post-intervention. Two teachers’ total scores increased by more than 10 points, while one teacher’s total score decreased by 16 points.

This study demonstrated the feasibility and acceptability of using the DBRC to address classroom behavior concerns in two formats. While an evidence base exists for the paper-and-pencil format of the DBRC, this study suggests that an online version may have the same broad applications. This is a timely application, as teachers’ access to technology for a variety of assessment and teaching purposes is increasing. As with any intervention, training and accurate implementation of procedures will be imperative for both the paper-and-pencil and online versions of the DBRC.

Limitations and Future Research

  • One potential limitation is the variability of behavior being addressed with the DBRC. Behaviors were selected by teachers as important rather than selected by the researcher. It is not known if selected behaviors were more or less amenable to this intervention.
  • Another limitation may be the training that was provided, as it was assumed that all teachers had some background in implementing general behavioral interventions.
  • The length of the baseline and intervention phases was insufficient for some students. The intervention was planned for a prescribed period of time, but adjustments were required due to other scheduled activities at the school.

[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_btn title=”Download as PDF” size=”lg” align=”center” button_block=”true” link=”||”][vc_tta_accordion][vc_tta_section title=”References” tab_id=”1460230362939-fe2487c6-b826″][vc_column_text]Chafouleas, S. M. (2006). Good, bad, or in between: How does the behavior report card rate? Psychology in the Schools, 14(2).

Dougherty, E. H., & Dougherty, A. (1977). The daily behavior report card: A simplified and flexible package for classroom behavior management. Psychology in the Schools, 14(2), 191-195.

Erchul, W. P., & Martens, B. K. (2010). School consultation: Conceptual and empirical bases of practice (3rd ed.). New York, NY: Springer.

Kratochwill, T. R., & Bergan, J. R. (1990). Behavioral consultation in applied settings: An individual guide. New York, NY: Kluwer Academic/Plenum Publishers.

Martens, B. K., Witt, J. C., Elliott, S. N., & Darveaux, D. X. (1985). Teacher judgments concerning the acceptability of school-based interventions. Professional Psychology: Research and Practice, 16, 191-198.

Vannest, K. J., Davis, J. L., Davis, C. R., & Mason, B. A. (2010). Effective intervention for behavior with a daily behavior report card: A meta-analysis. School Psychology Review, 39(4), 654-672.

Who Are School Psychologists? (n.d.). Retrieved April 23, 2015, from