While the last post looked at elementary school performance, this post examines college-readiness performance, the transition to post-secondary institutions of higher education, and performance in four-year universities. Again, all data used in this review are publicly available. This time, the data come from the Academic Excellence Indicator System (http://ritter.tea.state.tx.us/perfreport/aeis/) section of the Texas Education Agency website, from the High School to Higher Education Data section of the Texas Higher Education Coordinating Board website, or from student-level data purchased from TEA.
Table 1 below has four areas:
–percentage of first-time 9th grade students who had 8th grade TAKS reading and math scores of at least 2200;
–percentage of students meeting the TEA college-readiness standard (a TAKS scale score of 2200 or greater, OR at least 500 on the reading/math SAT and a total score of at least 1070, OR at least 19 on the reading/math ACT and an ACT composite of at least 23);
–percentage of students advancing from high school to college in the fall semester after spring graduation; and,
–percentage of students in four-year universities with less than a 2.0 GPA during the first year of college.
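The college-readiness standard in the second item combines three qualifying routes (TAKS, SAT, or ACT). As a sketch, the logic looks like the function below; the argument names are my own illustration, not TEA field names:

```python
def tea_college_ready(taks=None, sat_section=None, sat_total=None,
                      act_section=None, act_composite=None):
    """True if a student meets the TEA college-readiness standard in one
    subject (reading or math) via any of the three routes described above.
    All argument names are illustrative; None means the score is unavailable."""
    # Route 1: TAKS scale score of 2200 or greater
    if taks is not None and taks >= 2200:
        return True
    # Route 2: at least 500 on the matching SAT section AND a 1070 total
    if None not in (sat_section, sat_total) and sat_section >= 500 and sat_total >= 1070:
        return True
    # Route 3: at least 19 on the matching ACT section AND a 23 composite
    if None not in (act_section, act_composite) and act_section >= 19 and act_composite >= 23:
        return True
    return False
```

Note that the SAT and ACT routes each require BOTH a section score and an overall score; a strong section score alone does not qualify.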
The percentage of 9th grade students scoring at 2200 or above on BOTH the mathematics and reading TAKS in the 8th grade is included because it is the single most important input in explaining college-readiness. Indeed, the correlation between this percentage and the percentage of students meeting the TEA college-readiness standard is greater than .500.
As shown in the yellow and pink rows, the IDEA Secondary School (the school changed names during the time spans under examination, but continued enrolling the same students) had incoming 9th grade students extremely similar to those of the South Texas Health and Science Academies, both magnet schools of choice in the Rio Grande Valley. Thus, these two schools serve as the best comparison schools to IDEA.
TABLE 1: College Readiness, High School to College Transition, and College Performance for IDEA Secondary School and Selected Rio Grande Valley High Schools
Also note that only McAllen Memorial (MM) had incoming students even remotely as well-prepared as the incoming 9th grade students at IDEA. And, since the percentage of students scoring at or above 2200 when entering high school is highly predictive of the percentage of students achieving TEA’s college-readiness standard, IDEA had the proverbial “leg up” on almost all other public high schools in the Rio Grande Valley. But MM certainly appears to add far more value in moving students into the college-readiness range and, while still lagging IDEA in the percentage of students entering a 4-year college, has certainly done a far better job of preparing students for college-level work.
The results get more interesting, however, when we get to section three: percentage of students entering four-year colleges. While IDEA sent a greater percentage of students to 4-year colleges than other public schools, IDEA sent a lower percentage than the two schools with similar student characteristics.
Finally, and most intriguingly, 38% of the IDEA graduates who entered four-year colleges had less than a 2.0 GPA. In short, almost 40% of IDEA graduates were at serious risk of “flunking out” of college. This was a greater percentage than at every other school on the list and, as far as I could tell, every school in the Rio Grande Valley (it is quite laborious scrolling through the THECB pdf). In fact, I purposefully selected some of the schools with the LOWEST percentage of students meeting the TEA college-readiness standard to see if they would have a higher percentage of students failing college-level coursework. Yet even these schools had lower percentages of students with a GPA below 2.0.
So, what can we learn from this data? Well, the data about IDEA is somewhat mixed.
Yes, IDEA has a high percentage of students designated as college-ready, but that is largely explained by the characteristics of students entering the school in the 9th grade. IDEA performs just slightly better than predicted from the incoming 9th grade scores.
Yes, IDEA has a high percentage of students entering 4-year colleges. But the percentage is below that of schools with similar student achievement.
Finally, many IDEA students seem woefully under-prepared for college even though the majority of the students do not enter a Tier I university. In fact, for the last cohort available (the graduates of spring 2009), 44% of students earned a GPA lower than 2.0. That strongly suggests that IDEA is not preparing students very well for success in 4-year colleges.
One possible explanation for this is that IDEA focuses on preparing students for the TAKS–engages in “teaching to the test”– rather than preparing students to think, analyze, write, and question as is expected of college students. Other explanations are certainly viable, but no one has studied this in-depth. Someone certainly should investigate and determine the reason behind the poor college performance of IDEA graduates.
So, we have to ask: what is the value-added of IDEA? Does the school add value to the students starting the 9th grade? The data here suggest it does not, or at least not much.
Let’s look closer.
Let’s start by looking at the relationship between the percentage of incoming 9th grade students scoring at or above 2200 on both the math and reading TAKS in the 8th grade and the percentage of students meeting the TEA college-readiness standard for both mathematics and reading. For the 2008 graduating class, IDEA slightly outperformed the level at which it was predicted to perform based on incoming student achievement.
FIGURE 1: Percentage of Incoming 9th Grade Students (2005) at “College-Ready” Standard in 8th Grade (2004) and the Percentage of Graduates (2008) meeting the TEA College-Readiness Standard
In 2009, as shown in Figure 2, IDEA performed marginally better than the level at which it was predicted to perform based on the TAKS scores of incoming 9th grade students. Essentially, it performed as well as expected.
FIGURE 2: Percentage of Incoming 9th Grade Students (2006) at “College-Ready” Standard in 8th Grade (2005) and the Percentage of Graduates (2009) meeting the TEA College-Readiness Standard
In 2010, as shown in Figure 3, IDEA performed at the level at which it was predicted to perform based on the TAKS scores of incoming 9th grade students. Interestingly, each successive cohort regressed toward the line of prediction. In other words, the first cohort performed better than predicted, the second cohort performed slightly better than predicted, and the third cohort performed as predicted.
FIGURE 3: Percentage of Incoming 9th Grade Students (2007) at “College-Ready” Standard in 8th Grade (2006) and the Percentage of Graduates (2010) meeting the TEA College-Readiness Standard
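The “predicted” level in Figures 1–3 comes from a simple bivariate regression fit across schools: each school’s college-ready percentage is regressed on its incoming-2200 percentage, and a school’s value-added shows up as its residual (actual minus predicted). A minimal sketch with invented placeholder numbers (the real school-level figures are in the AEIS reports):

```python
def ols_slope_intercept(xs, ys):
    """Ordinary least squares fit y = a + b*x for paired school-level data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Invented pairs: (% of incoming 9th graders at 2200+ in 8th grade,
# % of graduates meeting the TEA college-readiness standard).
incoming = [10, 25, 40, 55, 70]
ready    = [20, 30, 45, 55, 68]

a, b = ols_slope_intercept(incoming, ready)
# Residual for a hypothetical school with 40% incoming and 45% college-ready;
# a positive residual means the school beat its prediction.
predicted = a + b * 40
residual = 45 - predicted
```

A school sitting on the regression line (residual near zero) is performing “as predicted,” which is the pattern the third IDEA cohort shows.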
Let’s turn to SAT scores. It is very difficult to fairly compare SAT scores, so this analysis must be interpreted with a great deal of caution. The most important variable explaining average school SAT scores is the percentage of students taking the SAT. Generally, the greater the percentage of participating students, the lower the scores, all other factors being equal. Unfortunately, IDEA did not correctly report to TEA the percentage of students taking the SAT for any of the three cohorts. Thus, I cannot include that variable in any analysis.
However, let’s examine the relationship between the percentage of college-ready graduates in the classes of 2008, 2009, and 2010 and the average SAT scores of those graduating classes. The following three figures display these relationships, with 2008 first and 2010 last. One would suspect a fairly strong relationship between the two measures, especially since scoring at least 500 on the SAT mathematics or reading section and at least 1070 on the composite confers college-ready status. The following three figures confirm this supposition. Indeed, the correlations for each of the three years are .600 or greater.
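Correlations like these can be checked directly from the published school-level numbers. A minimal pure-Python sketch of the Pearson correlation, again with invented placeholder data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for paired school-level measures."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented placeholder pairs: (% college-ready graduates, school avg SAT / 20)
# to illustrate the strong positive association described above.
r = pearson_r([10, 25, 40, 55, 70], [18, 33, 41, 60, 66])
```

A correlation of .600 or greater on this scale indicates that the college-ready percentage alone explains a substantial share of the school-to-school variation in average SAT scores.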
Note that in each of the three years, IDEA performs worse than predicted based on its percentage of college-ready graduates.
FIGURE 4: Percentage of 2008 Graduates Designated as College-Ready and Average SAT Scores for 2008 Graduates
However, remember that we do not know the percentage of students taking the SAT, and inclusion of this variable could change the outcomes for IDEA. So let’s assume that every single IDEA student took the SAT. Since greater participation is associated with lower average scores, this is the most conservative assumption we can make: it gives IDEA the benefit of the doubt in any analysis.
Yet, when I employed regression analysis to explain average school SAT scores with the percentage of poor students, the percentage of incoming 9th graders scoring at 2200 in math and reading in the 8th grade, school size, and the percentage of graduates achieving college-ready status, IDEA always performed significantly below where the regression predicted. This was true for each of the three years.
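A multivariate fit of this kind is ordinary least squares solved through the normal equations. The sketch below uses two predictors and planted coefficients so the mechanics are visible; the predictor names and all numbers are invented stand-ins for the real school-level variables named above:

```python
def ols_multi(X, y):
    """Least-squares coefficients for y = X @ beta via the normal equations.
    X is a list of rows, each beginning with a 1 for the intercept."""
    k = len(X[0])
    # Build the normal equations: (X'X) beta = X'y
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back-substitution
    beta = [0.0] * k
    for r in reversed(range(k)):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Invented rows: [intercept, % poor students, % incoming 9th graders at 2200+],
# with average SAT generated from planted coefficients 900, -2, and +1,
# so the fit should recover them exactly.
X = [[1, 90, 10], [1, 80, 20], [1, 60, 40], [1, 40, 55]]
y = [730, 760, 820, 875]
beta = ols_multi(X, y)
```

In the real analysis, a school “performing significantly below prediction” is one whose actual average SAT sits well under the value this kind of fit computes from its own predictors.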
We do not know why IDEA underperforms in this area. However, as mentioned previously, one possible explanation is that IDEA is engaging in “teaching to the test” strategies that inflate scores and weaken the connection between the scores and the knowledge and skills students actually possess (see Koretz, 2008, Measuring Up: What Educational Testing Really Tells Us).
In closing, the data on IDEA charter schools again raise serious questions about the claims of superiority made by IDEA leaders. Much, much more work needs to be undertaken to answer these questions. IDEA could shed some light by sharing data so researchers could examine these issues, and by allowing researchers to interview and survey students in IDEA schools. This is the type of research EVERY school organization should be doing.