PATRICIA JOHNSON: Good afternoon, everybody. It's great to be here. I always love to be here with you. How many of you work with or have a partnership with or a relationship with one of Ed's MSP programs? Raise your hand.
See, isn't that great? When we started this program, I don't know, five or six years ago, we were not sure that there would be such overlap. And so, it's really good to see that that is actually happening. I'm going to talk a little bit about some examples of that when I conclude.
What we're going to do with this session is-- I'm going to give you the whirlwind tour of the third annual report that describes what the Department of Education's investment in the MSP program has yielded. It is a descriptive look at what we fund, along with some preliminary evaluation data that we can share. And then, I'm going to turn to two projects that are representative of the 501 projects that we're currently funding. And I'll talk a little bit more about them as I introduce each one of them. And then finally, we'll conclude with some ways in which we see the Department of Education's MSP program currently collaborating, and hopefully developing new ways of collaborating as these two programs move forward.
So, for those of you that are new or don't know, Congress created two MSP programs in the Department of Education. There is legislation that describes what we must do with these funds. It's called Title II Part B of NCLB. And the basic conceptual model is that it's a professional development program in which we are asked to develop partnerships between high-needs school districts and STEM faculty in institutions of higher education. The law also says that other partners may participate. I think almost half of our projects have school of ed. partners as well, other school districts, and other non-profit and for-profit organizations. And the idea is to provide intensive professional development, which will lead to classroom instruction improvements, which will then lead to student achievement gains in math and science. And so, that's our charge from the legislation. And this is how we run the program, really quickly. There's a lot more to it, but I don't think that that's the point of this thing today.
Congress appropriates funds to us, to the department. And then, on a formula basis, we divvy up those funds among the 50 states and Puerto Rico and the District of Columbia. And no state can receive less than one half of one percent of the funds. So the smallest states receive just under a million dollars, and the bigger states receive $23 to $25 million. The states are then required to run competitions to make awards. And they must be partnerships. And the partnerships must be with STEM faculty, as I said, and high-needs school districts.
Then, the law also requires that the individual projects that were funded by the states report to us at the U.S. Department of Education on an annual basis, on the impact that their program is having on teacher learning and on student learning. So we, the Department, are collecting information on the projects, working with the states. So it's a very interesting model and quite unique in the pantheon of programs that are run at the Department.
So, where are we currently? These data are from the 2006 appropriation, which we just finished, which we now have a review of. I won't go into the funding stuff, but that's what we have. Over 3,000 faculty from institutions of higher education are participating in Ed-funded MSP projects across the country. Over 3,800 different organizations are partnered to form 501 projects across the country. They offer 49,000 hours of professional development to more than 56,000 teachers, which impacts classroom instruction for over 2 million students. So that's the big picture.
Q: [off microphone]
I'm sorry, that number is probably wrong. But it is 56,000 teachers, and I'll figure out what the right number is for the professional development. Because I'll tell you how many it is, and we can do the math, actually, on average. Sorry about that.
We have a $181 million appropriation in 2006. It's remained at about that level for the last couple of years. And, as I said, the smallest grant to the states was about $900,000 and the largest was $25 million. And the projects that were funded by the states-- now the states get to make decisions about the size and scope of the projects that they fund. So we have some states that have given all the money to one entity. And we have other states that have divvied it out in quite small grants. But most of the grants are between $100,000 and $500,000, and most of them run two or three years. Most of them, now, seem to be three years.
Okay, so on average, we have six IHE faculty participating per project. Now, that includes both the Ed. School faculty and the STEM faculty. It also includes anthropologists and social scientists and evaluators. And we have 56,000 teachers, which ranges from ten teachers in one project to over a thousand in one of our statewide projects. The average number of teachers served is 113. But the median number, which is the sort of typical example, is 44 teachers per project.
There's different models for professional development. The typical model, or the largest number of projects, are what we call individual teacher models, which means that a university or a school puts together a program and then invites teachers from the participating schools and school districts. And teachers decide to join as individuals in order to enhance their own professional development. About 83% of the projects use that model. Another 17% use the teacher leader model, which is that the teachers that are trained in the P.D. are trained to then go back and work with other faculty and other teachers in their schools. And this is important because it has to do with the way we look at the evaluation and what we evaluate, based on the different models that are used.
So, how many hours of professional development are offered through the programs? You can see that we have different models of professional development. Again, the law strongly urges the use of summer institutes followed up during the school year. And, as you can see, most of the projects have followed that strong recommendation. Only 3% of our projects offer summer institutes alone, with no follow-up, and those offer about 82 hours of professional development in a given 12-month period.
The large majority, 62%, sponsor summer institutes with follow-up during the school year. And the average project offers about 125 hours of professional development, 66 in the summer and 59 in the school year. And the follow-up activities, we'll talk about those later, but they include distance learning, lesson study--lesson study is very widely used--Saturday programs, and in-school coaching. But these are just direct hours of professional development, not the other hours that they may be working with their colleagues on this. And the "other" category includes a lot of other things, like work during the school year, courses that are offered at the college campus, distance learning opportunities, and the like. And those offer an average of about 83 hours of professional development per participant.
So, you can see that the programs are offering fairly intensive, very intensive professional development, quite different from a lot of the other P.D. programs that have been sponsored by the Department of Education. Okay, again, I said this was the whirlwind tour. We are required, in the legislation, to look at the impact on teacher knowledge. And so, we have started-- we are requiring that our projects do pre- and post-testing on teachers' content knowledge. And our colleagues at Westat who are here have worked with us to help develop an instrument, an Excel chart, really, that helps to test for the significance of those gains for the teachers. And so, the projects report to us on the pre- and post-test scores of the individual teachers. And then, the instrument that we've developed helps to test whether the teachers made significant gains.
Now, this is, again, across all of the projects. So, each of the projects has its own pre- and post- instruments that we allow. We don't require that any particular instrument be used. And this 2006 year was our pilot testing of that. So, not all of the teachers had pre- and post-testing data. But, you can see that, for those that did, in mathematics, 71% of the teachers who participated in the professional development made statistically significant gains in their knowledge. And in science, 80% of the teachers made statistically significant gains in their content knowledge. And I can talk more about the instruments that were used and what all that means. But it gives us a good baseline. And, what it tells us is that the vast majority of teachers are learning, to a significant degree, what's being taught and what's being expected in the professional development.
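[Editor's note: The significance testing described above was done with an Excel instrument developed with Westat; the details of that tool are not public here. As a hypothetical illustration only, the sketch below shows the kind of pre/post comparison involved--a paired t-test on made-up scores for ten teachers--using only the Python standard library.]

```python
import math
import statistics

# Illustrative pre- and post-test content-knowledge scores for ten teachers.
# These numbers are invented; each real MSP project used its own instrument.
pre  = [52, 48, 61, 55, 49, 58, 50, 47, 53, 56]
post = [60, 55, 66, 59, 57, 64, 58, 51, 62, 63]

gains = [b - a for a, b in zip(pre, post)]
mean_gain = statistics.mean(gains)          # average point gain per teacher
sd_gain = statistics.stdev(gains)           # sample std. dev. of the gains

# Paired t statistic: mean gain divided by its standard error.
t_stat = mean_gain / (sd_gain / math.sqrt(len(gains)))

# Two-tailed critical value of t with 9 degrees of freedom at alpha = 0.05.
T_CRIT = 2.262
significant = abs(t_stat) > T_CRIT

print(f"mean gain = {mean_gain:.1f} points, t = {t_stat:.2f}, "
      f"significant = {significant}")
```

A project would run this kind of test on its own cohort; the report's 71% (math) and 80% (science) figures count the teachers whose gains came out significant.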
Again, we're looking at longitudinal information and using the proficiency scores that are available in mathematics. These are baseline data on the percentage of students scoring proficient. So, what's interesting is that we have been able to do a comparison for some of the projects that were able to give us these data, which are not all the projects. We found that, in 2005, there was a six percent growth in proficiency in those teachers' classrooms.
So, what we're comparing are the proficiency levels in a teacher's classroom the year before they participated in the professional development, and then the proficiency levels of the students after they had benefited from instruction after the teacher had participated in the professional development. And, what we found overall, across all the projects and across teachers and different models, was that there was, in 2006, a seven percent gain in students' scores for mathematics.
And so, what does that mean? Is that good or bad is the question. That sounds good, right? So, we went back to the data that the Department of Education collects, to find out what the national average is, looking at proficiency scores. And it was 3.5%. So, across thousands of students in this project, as compared to the national average, we think that it's an indication that something positive is going on for students. So those data are also being collected. And again, these are baseline.
Whirlwind, here we go. Evaluation designs. We do ask that the projects conduct evaluations so that we can find out what works and how it works. But it's not required. It's something that we urge as a way of contributing to the knowledge base. And so, in this year, we had six projects that were conducting an experimental design and 173 that were attempting quasi-experimental designs. And the others are all kinds of things, like case studies and some very interesting research designs. But they're just not experimental designs. So, a lot of our projects are trying to do good work with limited resources.
So, we looked at the final reports of the projects that claim to do quasi-experimental or experimental designs. And again, the numbers that you saw were for projects in any phase of their project. So we had 29 projects that had a quasi-experimental design with a matched comparison group. And, of those 29, eight were classified as having a strong quasi-experimental design in one or more of the following categories. So, we're looking at the projects in three areas: developing teachers' content knowledge, improving teachers' practice, and student achievement. And we have a rubric that's been developed to help us look at the final reports from their projects, to look at their research designs.
And again, getting back to the other one, still, there's a lot more data in there that I could share, but we don't have time here. And we'll talk about it tomorrow in our breakout session. But, we see some emerging information.
We also did a regression analysis of project characteristics associated with gains in teachers' content knowledge. Again, this is just a preliminary look that we think is illuminating. But, what we found is that, in mathematics, teachers in the individual teacher model had greater gains than teachers in the teacher leader model, which, you know, I'm sure there are some explanations for that we could talk about later. But in science, there was no difference.
Another finding, on project leads, was pretty interesting, and I think would be interesting to you. Most of the projects had either an LEA or an institution of higher ed. as the project lead. But, there is a subset of grants where other organizations--like one you'll hear about, REESE, and others that are here--were the project lead. What we found was that there were statistically significantly greater teacher content gains when there was an "other" lead. That's just preliminary, but I think it's something we should look at and might want to think about.
And the numbers of teachers participating-- this is sort of a backward way of saying the smaller the cohort of teachers, the larger their gains seem to be in the aggregate in mathematics. Okay, so in summary real quickly: $181 million. The average MSP was $337,000. The median was about $200,000. The typical program served 44 teachers. The majority of the projects were aimed at helping individual teachers develop their content knowledge. Of the teachers that were pre- and post-tested, 71% made statistically significant gains in their content knowledge in mathematics and 80% in science.
Overall, the proficiency level in classrooms of teachers who participated across the country increased by about six percent. That's compared to a 3.5% increase in the national average. And we had seven percent in science, but there are no equivalent proficiency scores in science. So we can't make that comparison.
And that's the whirlwind tour. I know that's a lot of data. There's a lot to think about. We have an executive summary of this, as well as a report that is on our website. But I wanted to let you know, sort of real quickly, a description of what the Department of Education's investments have been.