What Comes Next?

The past month or so has been the most challenging in memory for most of us in the education field. Four weeks ago, it's safe to say, nobody would have thought schools across the country would be completely shut down.

But here we are. For the past few weeks, schools have had to invent and implement distance learning almost entirely from scratch – a challenge in and of itself for students pursuing college degrees, much less for kindergartners. Heroic effort has gone into that work.

In the blockbuster hit "Hamilton," Lin-Manuel Miranda's King George asks the newly formed United States a difficult question: "What comes next?"

We ask the same question. What comes next? What happens in the fall, when schools (hopefully) open back up? What does instruction look like then, when students have missed a third of the prior year of instruction, even despite our best efforts at distance learning?

ion is teaming up with experts across the country to bring you the "What Comes Next" webinar series, starting May 4th. We'll post more details as they become available, but we are planning a series of 60-90 minute webinars that will address topics important to resuming instruction in the fall.

Check back for more details. We’ll post them everywhere when we have them sorted out.

Hang in there, everyone – the good news is, school will open up again! Every single day, we are one day closer to the end of this pandemic.

Distance Learning Experiment

This post is by guest blogger, Chris Birr, EdS. Birr is a member of the ion Board of Directors, a School Psychologist, MTSS Coordinator and deep thinker. Chris lives in suburban Columbus, Ohio with his wife, two daughters and dog.

This is likely a frustrating time for teachers and parents. Teachers are scrambling to figure out how to teach from a distance. Most parents are trying to figure out how to work from a distance while managing learning from home for their children. There are no blueprints for this since we have never had a situation where everyone stayed home. In the past week, a couple of resources have come out that begin to highlight promising practices related to education in our current situation.

Dr. John Hattie recently released a summary regarding the effects of school closure. As we all worry about what an extended shutdown will mean for our students, Dr. Hattie raised a couple of points. First, there is not a lot of data on student learning during extended shutdowns or closures. Summer vacation has minimal effects, and school closures due to teacher strikes or catastrophic events have also had fairly minimal effects on student achievement. None of these situations is ideal, but context can help.

A few of the takeaways are worth noting. I am paraphrasing, but the summary is linked here to read yourself. First, do not panic if this goes on for 10 or so weeks; it is spring, and growth tends to be lower in spring anyhow. Next, worry more about the technical subjects that parents tend to know less about (I read that as math). As teachers, use opportunities to obtain feedback about what students do not know and fill in the gaps, without resorting to busy-work. Lastly, it is not the amount of time that matters; what matters is how we use the time with our students.

REL Mid-Atlantic and a team of researchers recently provided a webinar to discuss evidence-based practices, a decision-making framework, and approaches to addressing equity in the age of school-closure. Personally, I am a big REL supporter and this webinar appeared to be a welcome, genuine attempt to put some parameters around what can be controlled and provide recommendations regarding our best hypotheses for what might work in the near future.

Below are highlights of the critical takeaways from the webinar. Readers are directed to the REL Mid-Atlantic site to access the on-demand version. Many of the items below are questions that were posed and issues to consider when planning or refining distance learning opportunities. Again, no one has all the answers, but the following are posed as items to consider to improve service delivery.

Remote learning consists of synchronous and asynchronous methods. Synchronous instruction joins students and teachers in real time via technology. Asynchronous instruction requires students to work on their own (worksheets, videos, emails).

Synchronous instruction matters

Critical elements to include in synchronous instruction:

  • Time to interact with the teacher
  • Feedback, tutoring, AND support
  • Project-based learning – creating meaning matters to students
  • Gaming or virtual simulations – use highly engaging methods
  • Plan beyond content delivery – provide additional resources (supplement)
  • Educators can help collect data to inform practices (e2iCoach)
  • Use evidence to refine practices

Have a Plan B – not all students have access to technology or adequate internet

  • Options include picking up materials during meal service (when food is picked up) or making phone calls

Equity – not all students can access equipment and the internet

  • Partner with internet providers, create safe areas to access hotspots, or use public television and radio to deliver instruction
  • Support Parents- have virtual office hours or modes of communication where parents can seek assistance
  • Choice of Instruction- Does instruction meet the needs of ALL?
  • Special education, English learners, students experiencing homelessness or economic hardship, culturally responsive instruction – are these students receiving what they need to complete work and remain engaged?
  • Rigor for all? Does instruction engage those who struggle and those who require enrichment?

As an attendee of the webinar, I appreciated the presenters' timely and thoughtful approach. They clearly presented background on what is known about distance learning and a starting point for improving instruction. The takeaway that both synchronous and asynchronous opportunities to learn are necessary may give some teachers a starting point where guidance from district leadership has been less clear. District leaders, in turn, received recommendations that may assist with planning or confirm work recently begun. As usual, this webinar confirmed my support of the RELs.

Going back to Hattie’s summary, do not panic. Take a page from REL and look at what we do know about distance learning. Look to add some active involvement with students and provide methods for them to complete work on their own. Provide feedback and try to find ways to keep engagement high.

Implementing all of this at once would be daunting and frustrating. Based on the information provided, I would pick one strategy to implement and seek proficiency over time – for example, district-approved modes of providing synchronous instruction. Keep expectations realistic for students and parents, and track progress, even by simply tallying contacts or the percentage of engagement in the class. Refine and try to increase engagement. Select, plan, attempt, reflect, refine, repeat.

I hope that we will all be past this and moving to a new normal in the coming months. Best case, the strategies and skills learned now can be applied in isolated situations for students who struggle with regular attendance. Again, the hope is to re-open and engage in normal practices in the near future. Nevertheless, if we are faced with rolling closures or another virus arrives, this is the time to develop the skills and practices to increase effectiveness and efficiency while minimizing disruptions.

No State Test? No Problem!


Anecdotally, it sounds like most states are canceling end-of-year standardized tests this year. For some schools, that could be a substantial loss of student data and of methods to assess achievement in their system. However, for most districts and schools, the loss of year-end state test data should not be significant. Again, anecdotally and from experience, schools have more than enough student data, and this could be a moment for schools to step back and improve assessment use, fidelity of implementation, and interpretation. Among the questions swirling is how to plan for students coming back with summer slide coupled with pandemic slide; assessment can provide some context for who your most critical students are and how to plan for the entire grade level.

Side note: assessment is not the end-all, be-all, but it is a critical component of the problem-solving model (thanks, Florida MTSS). We need to know who needs more intensive instruction and how the entire group is doing from season to season.

Assuming your school has some type of universal screener such as easyCBM, MAP, STAR, or AimsWeb Plus, and fall screening is in place, you are in great shape. Even better if you have data from fall and winter of this year. Here are a few questions that I've heard and seen so far.

How do we know which students are going to be ok once we are back in school?

Use your previous data and screen all students in the fall. Establish criteria or a cut score to identify students who are on track for proficiency and those who are off track and need more intensive instruction. Next year (every year, really), we should be accelerating growth for all students, so fall, winter, and spring data will provide feedback on whether students are making adequate growth during the year.

What is the best way to set a cut score?

Select either a normative or a criterion-referenced cut score. This cut score will indicate which students are on track to reach proficiency and which are off track and in potential need of more intensive instruction (i.e., intervention).

Criterion-referenced (preferred). Most screeners provide a linking study. For some, the screener and state test results are equated, providing an indication that, based on the fall screening score, a student is likely to score in a certain range on the spring state test. Students who reach a certain proficiency cut score are likely to demonstrate proficiency on the state test.

If your screener does not provide a linking study, research-based methods can be used to develop seasonal cut scores. This is more difficult and requires some statistical judo. Partner with a local university or graduate students who can help. Bonus points if you can arrange a research opportunity using district data; just check your district policies regarding research.

Normative cut scores (less preferred). If a linking study is not provided, research-based methods are not realistic, and/or time is limited, a normative cut score could be selected. The drawback is that the cut score is based on a percentile rank and not directly linked to performance on the state test. Comparing a national score to a local population obscures the data a bit, and predictions are less reliable. Best case, look at the state test, determine the state percentile needed for proficiency, and apply that percentile to the fall scores in each grade. For example, if a student must score at the 65th state percentile for proficiency on the spring test, set the fall cut score at the 65th percentile on your screening assessment. This is not ideal but is a shade better than arbitrarily setting the cut score at the 50th percentile.
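To make the percentile example concrete, here is a minimal sketch of flagging students against a 65th-percentile cut. The scores are invented, and for simplicity the local distribution stands in for the screener's national norm table, which is a further simplification – use the screener's published norms when available.

```python
import numpy as np

# Hypothetical fall screening scores for one grade (scale and values are
# made up for illustration)
fall_scores = [148, 152, 160, 171, 175, 180, 184, 190, 197, 205, 212, 220]

# Suppose proficiency corresponds to the 65th state percentile; apply the
# same percentile to the screening distribution to set the fall cut score
cut_score = np.percentile(fall_scores, 65)

# Students below the cut are flagged as potentially off track
below_cut = [s for s in fall_scores if s < cut_score]
print(f"Fall cut score: {cut_score:.2f}")
print(f"{len(below_cut)} of {len(fall_scores)} students flagged as off-track")
```

The point of the sketch is only the mechanics: one percentile, one cut, one flagged list, repeated per grade.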

· If using ORF for screening, look at the 50th percentile as a cut score. Faster is not always better in this situation, but students reading below this score may need repeated reading or another intervention. Also include accuracy in the decision rules: if students score below 93% accuracy, more information may be needed to make instructional decisions, since low accuracy could indicate a decoding deficit. (Christ, 2016)
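The ORF rule above combines two signals, so it helps to write it down as an explicit decision function. This is a hedged sketch: the 93% accuracy guideline follows the Christ (2016) point above, but the 50th-percentile rate is an invented placeholder, not a published norm.

```python
# Illustrative 50th-percentile words correct per minute (NOT a real norm;
# substitute the value from your screener's norm table)
P50_WCPM = 72.0

def orf_decision(wcpm: float, accuracy: float) -> str:
    """Screening decision from oral reading rate and accuracy."""
    if accuracy < 0.93:
        # Low accuracy may signal a decoding deficit: gather more information
        return "collect diagnostic data"
    if wcpm < P50_WCPM:
        # Accurate but slow: fluency-building intervention may help
        return "consider fluency intervention"
    return "on track"

print(orf_decision(55, 0.90))  # → collect diagnostic data
print(orf_decision(55, 0.97))  # → consider fluency intervention
print(orf_decision(90, 0.98))  # → on track
```

Checking accuracy first matches the reasoning in the bullet: a fast but inaccurate reader needs different follow-up than a slow, accurate one.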

We have seasonal cut scores, but the state test has changed and we don't have results from this year. Can we use the old screening cut scores in the fall?

Cut scores generally hold up over time even if the state changes the test. Regarding cut-score stability, it might be wise to continue using the same cut scores on the seasonal screening assessments until new linking studies are conducted or the state test has a few years of use (Klingbeil, Van Norman, Nelson, & Birr, 2018). If the cut score indicated proficiency on a prior state test, the prior fall cut scores will likely provide a good indication of whether students are on track for spring proficiency. Although using past cut scores is not perfect, educators in your system will likely be aware of the scores and able to interpret results from the fall screening. Also, the proficiency standards assessed should be similar even if the test or scales change. The goal here is not perfection, but good enough. An unpopular statement, but using reliable and valid assessment data to make decisions is considerably more efficient and effective than professional judgment (Begeny, Krouse, Brown, & Mann, 2011).


The point is not to overemphasize testing and data when we return. On the contrary, these recommendations are intended to streamline assessment procedures, limit the amount of testing completed, and improve efficiency in data-based decision-making. When we return, baseline data on all students should be used to develop an effective plan of instruction for ALL. Current achievement levels are a starting point, but educators also need to look at whether students are demonstrating strong growth throughout the next year. Set up systems that maximize efficiency and effectiveness so teachers can spend less time on data analysis and more time planning to change trajectories for all students.

Next Year!


Disclaimer: this post is focused on academic skill attainment. Social-emotional learning is critical, and our focus should be on supporting everyone's emotional well-being during and following this crisis. The following is a recommended plan for the recovery phase after the pandemic.

Today, I had two conversations about what next year might bring. In both, educators were worried about an onslaught of anxious parents and requests for special education referrals or academic interventions. We have been off for several weeks now, and by the time we return, we could have missed a quarter or more of instruction. Luckily, most CBM or assessment norm charts indicate that winter to spring is generally the period with the least growth. If we had to miss a quarter of school, it was, fortunately, this one.

There has been some online chatter already that fall screening will be critical as we enter the next school year. I hope that the pandemic will be behind us and that treatments and vaccines are on the way. Regardless, I predict that many educators will feel pressure to respond to justifiably anxious post-homeschool teachers, better known as parents.

Most states will likely cancel state tests this spring. Personally, I have no issue with a year of lost state test data, provided that teachers are supported in using fall screening data to identify gaps or needs. In many districts, the practice of triangulating results will be used to confirm who they believe really needs intervention.

Triangulating data should stop. The more you assess, the more confusing results can become (VanDerHeyden, Burns, & Bonifay, 2018); eventually, anyone can gather enough data to support a belief. Use a valid and reliable screening assessment and, primarily using its results, form instructional groups (that do change) for students who need more instruction. If you really want to collect more data, use a gated screening method and re-screen only the kids who score toward the lower end of the middle, where it is undecided whether they need more instruction (Van Norman, Nelson, Klingbeil, Cormier, & Lekwa, 2018).
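A gated screen can be pictured as a three-way split: only the undecided middle band gets a second measure. The sketch below is illustrative only; the cut scores and band are invented, not taken from the cited studies, and would come from your own screener.

```python
# Invented thresholds for illustration; set them from your screener's
# cut scores, not these values
RISK_CUT = 185      # clearly below: plan intervention now
ON_TRACK_CUT = 205  # clearly above: no further testing needed

def gate(score: float) -> str:
    """First-stage decision: only the undecided middle gets a second measure."""
    if score < RISK_CUT:
        return "intervention group"
    if score < ON_TRACK_CUT:
        return "second-stage screen"
    return "on track"

# Three hypothetical students, one per band
for name, score in [("A", 170), ("B", 192), ("C", 230)]:
    print(name, gate(score))
```

The payoff is efficiency: most students are tested exactly once, and the extra assessment time is spent only where the first score could not settle the question.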

If I ruled the world, I would use this crisis as an opportunity to increase data literacy and strengthen a district’s data-based decision-making framework. Here are the steps I would take, again, if I ruled the district and was focused on academic skill growth.

1. Establish and enforce fidelity to fall screening practices – ensure all students are screened within the same window of time. Identify assessment champions in each school (based on knowledge and interest, not role). Support the champions and encourage principals to advertise them as the assessment go-to in the building.

2. Immediately begin intervention or intensive instruction for students who needed it last year. Form groups for the students with intensive needs but leave room to add or shuffle groups when screening is completed.

· Summer will be a time to look at the previous season of screening and estimate how many students per grade needed intervention. Budget a few extra seats per grade, and look at creative ways to schedule groups that begin immediately and can accommodate more students as needs become clearer. Predicting is possible and necessary.

· Consider a schedule renovation. The schedule drives your school; now is the time to make a change. Create the schedule you always needed.

3. Develop and deploy interpretation guidelines. What is the proficiency target? What is the score that indicates risk or a need for more intensive instruction? What is the score that indicates more information is needed? Set it and create one-page cheat sheets and a plan to roll it out.

4. Create grade-level groups based on performance categories. Oregon RTI has 100% meetings and templates that structure teams to examine how many students are in the minimal, basic, proficient, and advanced ranges by season. This may be the ideal time to implement continuous improvement activities that use this type of tracking system.

5. Select measures for collecting more information. Keep the menu to one progress-monitoring measure (CBM) and 2-3 diagnostic assessments. Ensure all assessments have adequate reliability and validity (paging school psychs).

6. Monitor all students in interventions, plus a selected group of students who are on "the bubble". Although 12 weeks of data is best, look at trends after 4-6 weeks. If students are not trending up, intensify, add, or create groups based on skill needs. This could be a lot of monitoring, so seek online or group-administered CBM to increase efficiency. Times are unique; the perceived best case may not be realistic for a while.
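"Trending up" in step 6 is usually judged by the slope of the weekly monitoring scores against an aim line. Here is a minimal sketch of that check; the student's scores and the goal rate are invented for illustration, and real decision rules would also consider variability around the trend.

```python
import numpy as np

# Hypothetical weekly progress-monitoring scores for one student over
# six weeks of intervention (e.g., words correct per minute)
weeks = np.arange(6)
scores = np.array([41.0, 42.0, 44.0, 43.0, 46.0, 47.0])

# Ordinary least-squares slope = average gain per week
slope, intercept = np.polyfit(weeks, scores, 1)

GOAL_RATE = 1.5  # invented aim-line growth rate, in score units per week
if slope < GOAL_RATE:
    print(f"Slope {slope:.2f}/week is below the goal rate: intensify or regroup")
else:
    print(f"Slope {slope:.2f}/week meets the goal rate: continue")
```

Running this weekly per monitored student turns "look at trends after 4-6 weeks" into a concrete, repeatable rule rather than an eyeball judgment.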

We have all heard how unprecedented the last few months have been. As we restart school, beginning a new year will be unlike any of the past. With this, now is the time to increase efficiency and effectiveness. Although restarting will be difficult, planning and allocating resources will give each school the best chance of meeting the needs of ALL students.