Is it recommended to compare student achievement in the first year that students and teachers use laptops? Will Sunnyside Schools compile and share this data with principals? I would be interested.
If this is going to happen, I think 5th grade teachers would also be interested in this information. Right now, we have been given high expectations with minimal training. I think there are too many variables. We have also implemented new intervention strategies as well as other instructional tools.
I agree with your statement about having high expectations with minimal training. I was also a bit frustrated that the 1:1 handbooks were just printed and handed out recently. Being a Phase 1 school, the handbooks should have been handed out at the parent meeting in the Fall, so that all the parents, teachers, and students understood the expectations.
I would have to agree with this. It is very important for the handbooks to be presented to all involved (parents, teachers, students). Although phase one teachers tried to take into account what the expectations would be for the one-to-one program, it would have been extremely beneficial to have all the material in writing before distributing laptops to students, so that ALL the expectations could be reviewed. This would have allowed teachers to set and enforce the expectations at the beginning of the program.
We have had trouble getting our hands on extra handbooks for students who have misplaced theirs. The novelty of the laptops and the introduction time has affected benchmark scores, but we need to see what these scores do over time; then we can adjust or modify instruction and discover how to teach more effectively with the laptops. I think it is important to give it time, just like anything else that is introduced.
We have been given minimal training, one week, and at our school two of us were using the computers all day in every lesson and using them meaningfully, and we saw our benchmark scores drop. The district needs to invest in training teachers on how to integrate technology effectively while teaching the standards.
I just don't think that this is possible. Because several schools have only recently received their laptops, there is no way that the information gathered on teachers and students at all schools can be useful. I know that for Phase 2 schools we were given a deployment date that fell close to benchmarks and AIMS in April. The priority of technology will not be fully in place until next year. That is just my opinion.
I think the first year of any new program should be considered a benchmark. Also, if we are going to look at student achievement, shouldn't we have some specific goals in place? We would need a focus for the year and then collect data.
Also, we as a district, need to make sure that we have the 9 key implementation factors in place before we start judging how one-to-one teachers are doing.
I also think we want to be careful not to make this another judgment on teacher success. I agree that we need to look at the implementation factors first. Of course, teachers are part of the process, but since we have not all had the same Internet accessibility (at school and in student homes), access to technology support tools such as projectors, and time with laptops, it doesn't seem wise to use this year as a valid indication of the effects of 1:1 computing. I would really like to know how the schools in the Project RED study began and the timeline for their results. I may not have looked carefully enough at the data--it may have been included.
It would not be meant to measure teacher success. Principals have been tasked to select the very best teachers for the laptop implementation at the fifth grade. If we did not believe in the teachers, they would not be teaching fifth grade. Any data available is always helpful for making improvements where needed. This has been a benchmark year, and I hope we can learn from it. I can name numerous things we should have done and did not. No one is to blame. Things do happen, and we have to be patient and adjust to change. It is never too late as long as all parties are willing.
You make a good point, Stephanie. I noticed that the data collected from several of the schools in the study stretched over a period of 7 years (2002-2009). Change takes time, and I am sure the effects of 1:1 computing did not show up overnight. I, too, believe that it would be unwise to use this year as a valid indicator, particularly since our school has only had laptops for a few weeks.
I don't think the first year's data should be looked at; since not all students received laptops at the same time, the data would be skewed. I think this first year should be a learning year, and data should be collected next year when everyone has the technology at the beginning of the year.
I think if you are going to look at student achievement, you need to look after this next school year. Our teachers are just getting projectors now, and our 5th graders have just gotten their laptops. How could you possibly measure student success without a year of our learning curve under our belts? I will be excited to see the results when the time comes.
For measuring purposes I would agree, Cathryn, that the first year should actually be a full year; for many schools, the end of this school year will only mean 4 months with the laptops. However, I would be interested to see the effect the laptops had over those 4 months if we only looked at it "in-house," just to see what happened for bad or good, identify what went right, and maybe what we need to change. Maybe writing went up, so we can see a shift where students are more eager to write on the computer; or maybe math scores went up because we used the computers more for math. I would just like to use it for planning next year.
I have observed that newer teachers are more able to implement technology into instruction but struggle more with instruction in general. How do you meld the knowledge new teachers have and the experience of veterans? This would be a good place for us to drill down.
I was thinking about this very same thing. What are the teaching demographics of the district? I would venture to guess that most teachers hired in the last five years are "digital natives." Their comfort level with technology is much greater than that of a teacher who has been teaching for 25 years. This may be the place where collaboration can really have a positive impact. Each group has strengths that could help the other. Let's find ways of forming these collaborative teams toward the betterment of each group.
I agree with Ray in that all teachers are learning how to meld these new technologies with solid teaching practices. This would appear to be one of the next steps we take.
I will be interested to see how the students score on standardized tests after using the computers in Fifth Grade.
That would be very interesting to look at once we get the AIMS results. I know that any new implementation takes 3-5 years to see results, but the information we receive from this year would be some great baseline data for the district.
It would be great to look at data. We would be able to ask ourselves what factors in technology could have played a part in the results as well as what outside factors could have contributed to the data.
Yep, Kim, I am waiting with bated breath for results. So far everything is coming up roses. This district needs more statewide recognition, not for our egos (but that is good too), but for what will follow in the form of support. Other districts will follow in our footsteps, and hopefully the words "public schools" will have the respect they deserve. It'll be a new day!!!
I think that we should use this year's AIMS data as a baseline. Teachers have had minimal training, so we have to wait until training has been given to them in order to collect reliable data.
I think this is a very intriguing topic. I heard rumors that some Phase 1 schools were significantly below their anticipated benchmark scores earlier this year...and I don't think it's a stretch to guess that some of the time spent working out the kinks for 1 to 1 might have played a part. I think the district will take all of this into account when analyzing student achievement, and I certainly hope that teachers and schools aren't compared given the reality of the variables. My hope and belief is that within a short period of time, everything will be ironed out and technology will be the catalyst for levels of student achievement well beyond what we've achieved in the past.
It would be interesting to see the data for both Phase 1 and Phase 2 schools to see what the relationships are. I don't think that teachers should be compared based on the level of technology usage in the classroom as many different factors have affected that throughout the school year. However, the data should be looked at with a critical eye and be used as baseline data for the coming years. As with everything, there is always room for improvement and growth. It has been a great 'beginning' year and there is a lot to be learned and implemented in the next school year.
I think it would be interesting to compare the growth of the first phase schools to the second phase schools to see if the impact was larger in either group. This could serve somewhat as a baseline for the upcoming school year.
One can only assume that this technology will lead to greater things with each year being better than the last.
"One can only assume that this technology will lead to greater things with each year being better than the last"
I agree, the focus should be improvement from year to year. As the teachers get better with technology, knowledge and instruction...as the district improves with accessibility...as the students benefit from quality instruction and higher levels of engagement... we will continue to make gains, I'm sure!
Karen, you are absolutely correct. We need to give teachers and students time to make the transition from traditional instruction to digital learning. It certainly wouldn't be fair to look at this year's data given that there were some infrastructure issues and that not all 5th grade classes started at the same time. Let's be patient...I think we are all anxious to see the change in our growth, but let's do it when things are more securely in place.
Yes, I agree! According to Project RED on page 129, "Understand that student achievement will increase over time when a guaranteed curriculum and instructional program are integrated with 21st century tools. Communicate that research shows teachers need 3 to 5 years to seamlessly integrate technology and instruction." So, it is too early to do any assessments on teachers or schools to see if technology is making a difference.
I would love to see this. I hope that we look at all the different groups that we have as well. I would like to see how the computers changed my students. I know that I see in the classroom what they did and how much more independent they are. As nervous and scared as I was to have the computers in my classroom, my students were not. Once they got some basic knowledge of making PowerPoint presentations and working with Windows, they played with them and really have learned more by themselves, with my guidance, than I could have taught them the old way.
I'm curious how technology has impacted the writing process. I sit here and read the forum posts, noticing the amount people write and the ease with which thoughts are understood between the participants. I wonder if our students have found writing to each other as easy? Have there been some comprehension issues?
I can see us looking at the data to set some goals for ourselves for the coming years, more as a formative than a summative measure, in all areas.
I agree that looking at the data would be interesting, but again you are comparing this year's students against last year's students, so can we really say that laptops had an impact, positively or negatively? I think next year it would be helpful to assess students' technology knowledge at the beginning of the year with a survey similar to the one we just had to take before spring break, then have those same students take the same survey at the end of the year to see how much they improved. I also wonder if the laptops have had any effect on attendance. I know this year has been tough with illnesses, but I wonder if classrooms with laptops had better attendance than classrooms that did not start the year with them?
While the struggles we have had in implementing this very exciting initiative may make any achievement data or inferences from that data a little spotty, one place we should be able to see growth is in the Technology Learning Assessment taken by 5th graders this year and last. It is similar to the technology assessment teachers recently completed (same company, same website), and directly measures many areas of technology usage and learning. Students took a pre-test in the fall and will take a post-test in late spring. These data should be available by early summer.
Good point, Anne. 5th grade students in Phase 1 schools will have had an entire year of work within the classroom and with the LTF teacher-librarians. Phase 2 schools will not have the same amount of time devoted to technology, but will have double the time for at least a semester. It will be interesting to see if there are significant differences in the Technology Assessment scores for the two different sets of schools.
I was thinking about the www.learning.com assessment. I think we'll see some improvement in at least three of the five strands: Internet, MS Office apps, and computer fundamentals. Our experiences with user profiles and wireless access have given everyone a deeper understanding of the last.
I've already seen a tremendous improvement while teaching multimedia presentations. The 5th graders seemed so much more comfortable maneuvering around PPT. I'm excited to see the results!
That is a great point, Anne. Since one important part of raising achievement through the use of technology is having students be fluent with the devices, tools and apps they will need to use in their learning, this should give us some good insights. It will be interesting to see what differences there might be between Phase 1 and Phase 2 schools, and what the learning curves might be for students in mastering some basic tech skills.
Unfortunately, phase 2 teachers have just received their laptops. Any comparisons to phase 1 teachers would not be accurate. Everyone needs more adjustment time. Rushing into evaluation is unduly stressful for all concerned.
I agree it would be interesting to see data however I would be hesitant to attribute too much of it to the one to one especially in a phase 1 school. The transition has been bumpy and only time will truly tell. I do know that it has to impact the readiness our students feel for the real world and the possibilities they see that are open to them.
The data would be interesting to see as well, but there are so many factors that contribute to the results that I would be very hesitant to attribute them to the use of technology. I would also be hesitant to compare last year's 5th grade results to this year's; that would be comparing apples to oranges.
I'm thinking about 3 years of data collection at least. We need to be patient in this respect. We'll get better with time and hopefully the data will show. I'd like to see how the 6th graders do next year AND I'd like to see what happens next year to the students who are 4th grade now.
Connecting student achievement with student/teacher laptops is not the best idea this year. There has been so much change this year, and I feel we still aren't using the laptops to their full potential. I also observe that teachers use the laptops in very different ways; some use them more than others. In order for there to be appropriate data, we will have to be more purposeful and set clear goals and expectations for teachers and students.
The study specifically states that it will take 3 to 5 years to see substantial gains in achievement. However, I'm afraid that if we do not see gains immediately, there will be a lot of people making a lot of noise about whether it is worthwhile and/or effective. Don't get me wrong, I believe there will be gains, but we need to make sure that people know that realistically it will take 3 to 5 years. We need to follow through on this and keep improving until we have "seamless" integration. We have all done a great job of working with the technology, but we have a long way to go.
This is an excellent idea. I'd really appreciate any data showing how our students have progressed after implementing the laptops.
As a teacher I would want that information, but how will the looping schools affect the data? All other schools' teachers will be in year two, but looping schools' teachers will be in year one...that will probably affect things.
We need to collect data. This is our first year and we have had two phases, but we need to start somewhere. Many people have posted that we should compare data. We should collect data with our 5th graders and again next year when they are 6th graders.
I would like to see some information from across the district to see if the students and classes that have 1-to-1 are making more improvement on test scores than those that don't have it. I would also like to see the areas of improvement and what was done to achieve it. Let's share what we have learned so we can help our students improve.
As an administrator, I would like to see if the other areas of 1:1 implementation have been affected besides the AIMS test scores. My school was part of Phase I. Did the 1:1 initiative affect discipline/behavior and attendance? These are important data to reflect on as well.
I know the test scores are very important, but what happens if we don't meet the AMO in fifth grade and it causes our school to move down a level in AZLearns? I worry about this. I know we are all on a huge learning curve, but the facts are the facts: how well will we do on AIMS this school year now that the AMO numbers have moved up?