Attendance Works News

September 3rd, 2017

Final AAM 2017 Webinar: Portraits of Change

Join us for the final Attendance Awareness Campaign 2017 webinar as we discuss Portraits of Change: Aligning School and Community Resources to Reduce Chronic Absence, a new brief from Attendance Works and the Everyone Graduates Center.

Co-authors Hedy Chang and Robert Balfanz will highlight key findings from their national and state analysis of how many schools face high levels of chronic absence and discuss the implications for state and local action.

Presenters will share inspiring examples of how their communities reduced chronic absence, even when it reached high levels in a school, district or particular student population. These insights are even more important as a growing number of states incorporate chronic absence into their accountability systems for school improvement. Download the full brief, executive summary and state-by-state chronic absence data charts before the webinar. Consider spreading the word about the report and downloading our messaging materials for Facebook, Twitter or your newsletters.

Our presenters for this webinar: Alicia Lara, United Way Worldwide; Robert Balfanz, Everyone Graduates Center, Johns Hopkins University; District Leaders including Lorri Hobson, Cleveland Public Schools; Ramona Halcomb and Robin Shobe, Oregon Department of Education; Carrie Zimbrick, Willamina School District; and Hedy Chang, Attendance Works. Register Today!

Please note: We are likely to exceed the webinar room capacity of 500! Once you register, you will receive the webinar recording, PowerPoint slides and other materials whether or not you attend. You might consider organizing a separate session to watch with a group, using the recording and discussion guide. Guests are welcome to log in 15 minutes before the webinar begins.

Posted in Research

August 31st, 2017

High Quality Attendance Data is More Important than Ever

This is the second in our blog series highlighting attendance-related issues that are emerging as states work through the complexities of responding to the Every Student Succeeds Act (ESSA). ESSA gives states new flexibility to evaluate school performance using more than just test scores. As this table shows, many state ESSA accountability plans include chronic absenteeism, or other related indicators, as measures of school quality. Beginning in 2018, schools in these states will be held responsible for meeting student attendance and other achievement goals. (Read our first blog, Making the Most of Attendance Indicators.)

Why is the quality of attendance data so important?

Including chronic absence in accountability systems means that states will depend on these data to identify schools that need extra support or interventions. More importantly, it means that schools are more likely to have the data they need to intervene throughout the year to prevent the student failure and disengagement that so often result from poor attendance. This is the great power of attendance data: it can provide critical, “just-in-time” information to school-level staff, and cumulative, end-of-year data to parents, state leaders and other stakeholders.

At the same time, the adoption of chronic absence as an indicator can present a hurdle for schools, because it requires daily data collection and many data collectors. This reality means that states and districts must establish daily attendance-taking systems that are easy to use, hard to ignore, securely maintained, and capable of producing timely and actionable reports for educators. In addition, data definitions and collection protocols must yield comparable statewide and summative reports that are clear and accessible to stakeholders.

How do you know your state’s student attendance data are high quality?

High quality student attendance data have several characteristics: they can be compared across districts and schools; they are transparent and secure; and they are supported by district policies and procedures. Ask the questions below about your state’s student attendance data (or ask your state’s education leadership to answer them!). Then consider the following suggestions to improve the quality of these data:

Are the data comparable across districts and schools?

    • Is there a consistent definition of what constitutes a day of attendance across the state? This question is especially relevant in middle and high schools where students typically change classes throughout the day.
    • Is the definition of all attendance-related measures, particularly chronic absence, clear and consistent across districts and schools?
    • Are chronic absence rates calculated consistently and correctly across the state?
    • Is there guidance about when attendance should be taken during the day and how to modify records, for example, when a student is late?
    • Is there a neutral setting for recording attendance in student data systems? If not, are students considered present until they are marked absent or vice versa?
    • Can student data systems detect when schools don’t submit attendance? Are staff assigned to follow up on unsubmitted attendance?

Steps to improve comparability

  • Don’t wait for students to be enrolled for most of the school year before compiling and assessing attendance data. Ensure that as many students as possible are included by setting inclusion thresholds (such as a minimum number of days enrolled) that still capture the majority of students; see the sketch after this list.
  • Choose a neutral value as the default setting in student data systems so that students must be marked either present or absent.
  • Have a process in place to audit student attendance data and attendance-taking processes.
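
To illustrate how a consistent definition, an inclusion threshold and a check for unsubmitted attendance fit together, here is a minimal Python sketch. It assumes the 10-percent-of-enrolled-days definition of chronic absence discussed later in this series; the MIN_ENROLLED_DAYS threshold, record layout and student data are hypothetical, so treat it as a sketch rather than a prescribed implementation.

```python
# Minimal sketch of a chronic absence calculation with an inclusion threshold.
# Assumptions (not prescribed by this post): chronic absence means missing
# 10% or more of the days a student was enrolled; students count once they
# have been enrolled at least MIN_ENROLLED_DAYS; None means attendance was
# never submitted for that record.

MIN_ENROLLED_DAYS = 10  # hypothetical inclusion threshold

students = [
    # (student_id, days_enrolled, days_absent)
    ("S001", 180, 17),
    ("S002", 180, 5),
    ("S003", 45, 6),
    ("S004", 180, None),
]

def is_chronically_absent(days_enrolled, days_absent, threshold=0.10):
    """Missing 10% or more of enrolled days counts as chronic absence."""
    return days_absent / days_enrolled >= threshold

included, chronic, unsubmitted = 0, 0, []
for student_id, enrolled, absent in students:
    if absent is None:                 # flag records with no submitted attendance
        unsubmitted.append(student_id)
        continue
    if enrolled < MIN_ENROLLED_DAYS:   # apply the inclusion threshold
        continue
    included += 1
    if is_chronically_absent(enrolled, absent):
        chronic += 1

rate = chronic / included if included else 0.0
print(f"Chronic absence rate: {rate:.1%} of {included} included students")
print(f"Records with unsubmitted attendance: {unsubmitted}")
```

With these made-up records, the sketch reports one chronically absent student out of three included and flags one record with unsubmitted attendance.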

Are the data transparent?

    • Are attendance data readily accessible to stakeholders and educators and tailored depending upon their roles? Can they find the student attendance data they need, such as chronic absence rates, to make decisions or take action?
    • Does the state provide information about the data, such as clear explanations of how student attendance data are collected, how indicators are calculated, and whether there are data limitations?
    • Has the state communicated why student attendance data are valuable to meeting its goals (beyond accountability purposes) to districts and the public?

Steps to improve transparency

  • Engage a diverse set of stakeholders to get input on how student attendance data are collected and whether the data are accessible and understandable.
  • Ensure that educators, parents and community partners have easy access to data dashboards that allow them to break chronic absence data down by student, grade, teacher, student sub-group, and geography so that they can intervene before students miss too many school days; see the sketch after this list.
  • Have a process by which stakeholders can question the data when results do not seem accurate.
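
As a rough illustration of the kind of breakdown such a dashboard could offer, here is a small sketch using pandas; the column names, subgroups and records are invented, and the 10 percent threshold follows the definition used elsewhere in this series.

```python
# Sketch of a dashboard-style breakdown: chronic absence rates by grade and
# by student subgroup, using made-up records and column names.
import pandas as pd

records = pd.DataFrame({
    "student_id":    ["S001", "S002", "S003", "S004"],
    "grade":         [6, 6, 9, 9],
    "subgroup":      ["EL", "Not EL", "EL", "Not EL"],
    "days_enrolled": [180, 180, 180, 180],
    "days_absent":   [20, 4, 25, 9],
})

# Flag each student using the 10-percent-of-enrolled-days definition.
records["chronically_absent"] = (
    records["days_absent"] / records["days_enrolled"] >= 0.10
)

# Share of students who are chronically absent, by grade and by subgroup.
print(records.groupby("grade")["chronically_absent"].mean())
print(records.groupby("subgroup")["chronically_absent"].mean())
```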

Are the data secure?

    • Are individual-level student attendance data being kept safe according to FERPA standards?
    • Is student privacy protected in the process of collecting attendance data?
    • Do the state’s procedures ensure that only those who need to see individual-level attendance data have access?

Steps to improve data security

  • Empower a data governance body to ensure that student attendance data are securely collected and maintained. The Department of Education created a toolkit that simplifies the Family Educational Rights and Privacy Act (FERPA).
  • Proactively communicate about how the data are protected.
  • Provide school leaders with professional development about protecting individual student attendance data.

Updating district policies and procedures for attendance supervision

  • Accurate data also require Local Education Agency policies and administrative regulations that spell out which staff members are responsible for maintaining accurate data.
  • Board policies and administrative regulations must be consistent with state laws requiring a system for accurately tracking pupil attendance, so that pupils with attendance problems are identified early and offered appropriate support services and interventions.

State leaders should be prepared to look critically at student attendance data and to answer questions about the processes and decisions behind the numbers. The quality of these data depends upon the daily buy-in of teachers and school-level staff. Over time, public scrutiny of the data can help improve their quality, and higher quality can increase trust in using the data to inform practice and policy.

Much of the power of chronic absence emerges when it is used as a diagnostic measure, rather than a year-end accountability measure. If these attendance data are to be of high quality and used, not just reported, schools and districts must be encouraged – not punished – for honest reporting and proactive use of the data.

We would like to express our appreciation to Elizabeth Dabney, Director of Research and Policy for the Data Quality Campaign, and Jane Sundius, Senior Fellow for Attendance Works, for significantly contributing to the content of this blog.

We’d like to hear from you about ESSA planning and implementation. Please share your comments with Sue Fothergill, Associate Director of Policy, at Sue@attendanceworks.org.

Posted in Research

July 27th, 2017

Making the Most of Attendance Indicators

This blog is the first of a series in which we highlight attendance-related issues that are emerging as states work through the complexities of responding to the Every Student Succeeds Act (ESSA). We hope to stimulate conversation among those on the front lines of ESSA planning and implementation. We’d like to hear from you. Please share your comments with Sue Fothergill, Associate Director of Policy, at Sue@attendanceworks.org.

The recently submitted state plans for implementing the Every Student Succeeds Act (ESSA) show that chronic absence is gaining traction as an indicator of school quality and student success. As this chart shows, the majority of officially submitted ESSA plans (14 out of 17) include some variant of chronic absence as an accountability indicator, and many other states with plans still in preparation seem likely to follow suit.

Attendance Works is excited by the opportunity that this increased focus on chronic absence provides because it has the potential to increase student achievement substantially. We now know that excessive student absences are a proven, widespread, and consequential problem in American schools. National data from the Office for Civil Rights show that at least 6.8 million public school students missed 15 or more days of school in 2013-14, and that chronic absence affects at least 89 percent of the nation’s school districts. Several high quality research studies show that chronic absence leads to lower achievement, disengagement and, often, dropout. Yet chronic absence can be reversed and, when attendance improves, student achievement is likely to improve.

While many states have added, or are considering adding, attendance measures to their accountability systems, the nature of the indicator and the definition of chronic absence differ, as do attendance goals and intervention points. We believe that some of these differences are critical because the choices states make may determine how powerful attendance and chronic absence are as measures of school quality and student success. One of the most promising developments is that ten of the states with formally submitted ESSA plans have chosen to define chronic absence as missing 10 percent of school days. We recommend that states adopt this particular definition for two critical reasons: first, it has a proven ability to identify students who are at very high risk of academic failure due to absences; and, second, using it will allow for comparisons across states and districts nationwide, even if the lengths of their school years differ.

Using a positive indicator  

A second approach that is surfacing in a few states is the use of a positive metric, one that attempts to measure the percentage of students who have good attendance, rather than the percentage chronically absent. This approach is laudable. Our resources are chock-full of positive messaging, student and family engagement strategies and other approaches to improving student attendance in this way.

However, choosing an effective, appropriate positive indicator is not so simple.

Take, for example, the most obvious positive metric: the percentage of children who are not chronically absent, that is, those who attend more than 90 percent of days. This type of metric won’t distinguish between students who just met the measure by missing 17 days (just under the 10 percent threshold in a typical 180-day school year) and those who had much better attendance, such as missing 5 days out of an entire school year. It’s important to understand that, while chronic absence can reliably identify students at high risk of failure due to absences, its inverse should not be considered a reliable indicator of students at low risk of academic failure due to absences. This is because research suggests that low-risk students attend, on average, 95 percent or more of school days.
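
A quick worked example, assuming a typical 180-day school year (the post itself does not fix a year length), shows how little the inverse metric separates these two students:

```python
# Both students clear a "more than 90 percent attendance" bar, even though
# one missed 17 days and the other only 5. The 180-day year is an assumption.
SCHOOL_YEAR_DAYS = 180

for days_absent in (17, 5):
    attendance = (SCHOOL_YEAR_DAYS - days_absent) / SCHOOL_YEAR_DAYS
    print(f"{days_absent} absences -> {attendance:.1%} attendance; "
          f"counted as 'not chronically absent': {attendance > 0.90}")
```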

So why shouldn’t states adopt a standard of 95 percent as satisfactory attendance?

What we have heard from some states is that too many of their schools, and particularly their high schools, would not meet this standard. At the same time, choosing a 90 percent or higher attendance indicator and giving it a positive label (such as “persistent” or “consistent” attendance) can send the wrong message: namely, that students need only attend more than 90 percent of school days to be successful. This may have the unintended consequence of setting the bar for attendance too low.

For this reason, and others we have outlined in our policy brief, Chronic Absence: Our Top Pick, Attendance Works believes that the chronic absence indicator, defined as missing 10 percent or more of school days, is the best bet for state accountability systems. Including chronic absence in accountability systems and improvement plans ensures that schools and districts respond to chronic absence as the emergency that it is.

However, if a state is committed to a positive attendance measure, our next recommendation is to use two indicators: satisfactory attendance, defined as attending 95 percent or more of school days, and chronic absence, again, defined as missing 10 percent or more of school days. Taken together, these indicators would communicate the level of attendance that gives students the best chance of success and would ensure that students at high risk of academic failure due to their attendance receive the attention they need. What about states that cannot include two measures in their accountability systems and are committed to using the more-than-90 percent attendance measure?

We suspect they will have to do additional education and training to send the message that satisfactory attendance is 95 percent or better. They will need to use a range of strategies, such as school report cards, workshops with families and professional development for school staff, to communicate that students and families should set attendance goals significantly above 90 percent to ensure that absences are not contributing to poor academic outcomes.
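
For readers who want to see how the two recommended indicators fit together, here is a minimal sketch; the per-student inputs and the middle “at risk” label are our own illustration, not terms defined in this post.

```python
# Classify a student using the two indicators discussed above: satisfactory
# attendance (95% or more of enrolled days attended) and chronic absence
# (10% or more of enrolled days missed). Students in between land in a
# middle band that we label "at risk" purely for illustration.

def attendance_band(days_enrolled, days_absent):
    rate = (days_enrolled - days_absent) / days_enrolled
    if rate >= 0.95:
        return "satisfactory attendance"
    if rate <= 0.90:   # equivalent to missing 10% or more of enrolled days
        return "chronic absence"
    return "at risk"

# Examples for a 180-day school year:
print(attendance_band(180, 5))   # 97.2% attended -> satisfactory attendance
print(attendance_band(180, 12))  # 93.3% attended -> at risk
print(attendance_band(180, 20))  # 88.9% attended -> chronic absence
```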

Now it’s your turn. For those readers in the thick of dealing with these issues and decisions, tell us what your state is doing. How is it handling attendance indicators? How can we make the most of the ESSA opportunity and maximize the impact of attendance indicators? Send an email to Sue Fothergill at Sue@attendanceworks.org.

Jane Sundius, Senior Fellow, and Hedy Chang, Executive Director, Attendance Works

Posted in Featured Article, State News
