Reading and Math Interventionist: AIMSweb Assessment Issues in an Urban/Suburban School District in Northeastern Kansas

Any time you begin a new assessment program, there will be "growing pains." In the district where I teach we had our share, but by the end of the second AIMSweb assessment session (Winter 2012), we felt like we had a pretty good handle on it for the next session (Spring 2012). Below, I'll discuss some of the challenges we faced and the solutions, or attempted solutions, we implemented.

First of all, let me describe our situation. We had a district-wide testing team, which was a first for our district. Needless to say, it was also the first time our district had truly implemented an assessment system; we had not used a universal screener district-wide before. Previously, each building tested its own students, usually with a formative or diagnostic test.

Our team consisted of our building-level interventionists. Our instructional coaches played a role, but for the most part did not administer any assessments. They helped with organizing materials, served as "runners" to get kids to and from the testing area, helped make the schedule, and so on. It should be noted that by "helped" I mean they basically did it all by themselves.

Our testing team received training in how to administer and score the AIMSweb measures over the course of two to three sessions. We essentially followed the AIMSweb Training Workbook, used the video examples to practice scoring, and received information from presenters in person and via webcam. We did not get instruction from certified AIMSweb trainers, and I think this made a difference. In my opinion it would have been helpful to have training from someone who had actually administered the assessment measures; we had some very specific questions that could not be answered.

Our first session (Fall 2011) went fairly well considering we knew next to nothing going in. Our main challenge was scoring. We didn't agree on what to do about the whole "does the answer have to be written in the blank or not" question on the MCAP (Math Concepts & Applications) and MComp (Math Computation) measures. The instructions are fairly explicit: according to AIMSweb, if the answer isn't in the blank, it is marked incorrect. However, there is a grey area. The standardized student directions for the MCAP specifically say to write the answer in the blank; the standardized student directions for the MComp do not. Unfortunately, this meant that some scorers counted such answers wrong and some didn't, so we did not have consistent scoring the first time around.

The lesson we learned (I hope) is that you need to have those kinds of issues decided before you ever administer the test. We talked about what it meant if a student didn't write the answer in the blank. We decided it meant the student couldn't follow directions, not that he or she could or couldn't compute the problem correctly. We asked ourselves, "What are we trying to measure with this test?" We decided we were not trying to determine whether a student can follow directions. It did not matter whether the answer was in the blank or not; we just needed to all be scoring the same way.
So we eventually decided not to count an answer wrong as long as it was somewhere in the box and correct.

Another issue was that there was no "team leader" for our testing team. We had someone we could call, but no one on site. In hindsight it would have been valuable to have a "go to" person assigned or appointed to the team. This could be one of the interventionists, or someone who does not do any testing. It would have saved us quite a bit of time when we had to figure out how to score the math measures: that person could have made an executive decision or called someone to find out, and there would have been no disagreement about what to do in a particular situation.

Another major issue we still have is what to do with the data, both getting it out to the teachers and explaining what it means. We did eventually print parent report letters and discuss the results at conferences. What we found we really need is for the teachers to receive some AIMSweb training. We need to know how to read the data, how to interpret and analyze it, and how to talk to parents about it. Some people are not familiar with percentile ranks, norms, standardization, and so on. (For example, a student at the 25th percentile scored as well as or better than 25 percent of the norm group, which is not the same thing as getting 25 percent of the items correct.)

Progress monitoring is another area that has not been perfectly implemented. Some schools progress monitor once per week, some once every two weeks, and some barely once a month. There is a lot to consider when progress monitoring. Some of the more important questions are:

- How often do you progress monitor?
- Should you progress monitor or strategically monitor a particular student?
- When progress monitoring, is it really necessary to "drill down" or "test backwards" until you find the level at which to monitor the student? (That, by the way, takes a long time.)
- How do you set goals for the student, and what formula do you use? (One common approach is sketched at the end of this post.)
- What do you do if the student reaches his or her goal? Is the student automatically dismissed from intervention?
- What if the student is not on a trajectory to meet the goal? Does he or she automatically go to Tier 3 interventions?
- How many data points are necessary to make a decision about a student?

Hopefully you will be able to get some of these questions answered before you start your district-wide assessment system. It will save you a great deal of time and effort, and you will be able to focus on what matters: what to do with the students who are at risk according to the screener.
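To make the goal-setting question concrete, here is a minimal sketch of the arithmetic under one common curriculum-based measurement approach (an illustration, not AIMSweb's official procedure): set the goal at the baseline score plus an expected weekly rate of improvement (ROI) times the number of weeks of intervention, then compare the slope of the student's actual weekly probe scores to the ROI needed. The baseline, ROI, and probe scores below are hypothetical; real expected-growth values would come from AIMSweb's norm tables.

    # One common CBM goal-setting approach (illustrative values only):
    # goal = baseline + expected weekly rate of improvement (ROI) x weeks.

    def goal_score(baseline, weekly_roi, weeks):
        """Project an end-of-period goal from a baseline score."""
        return baseline + weekly_roi * weeks

    def trend_slope(scores):
        """Least-squares slope of weekly probe scores, e.g. words
        read correctly or digits correct per week."""
        n = len(scores)
        mean_x = (n - 1) / 2              # mean of week indices 0..n-1
        mean_y = sum(scores) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
        den = sum((x - mean_x) ** 2 for x in range(n))
        return num / den

    # Hypothetical second grader: baseline of 40 words read correctly,
    # expected to gain 1.5 per week over an 18-week intervention period.
    goal = goal_score(baseline=40, weekly_roi=1.5, weeks=18)   # 67.0

    weekly_probes = [40, 42, 41, 45, 44, 47, 49]
    slope = trend_slope(weekly_probes)                         # about 1.43
    print(goal, round(slope, 2), slope >= 1.5)                 # not quite on track

A student whose trend line runs just below the needed ROI, like the one above, is exactly the case where the team needs an agreed-upon rule in advance: how many data points below the aim line before the intervention changes or the student moves to Tier 3.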