Has the Percentage of Texas Public School District Staff Who Directly Impact Student Outcomes Changed over the Last 20 Years?

Posted on April 10, 2011


In the last post, I showed that the decrease in the percentage of teachers was largely due to an increase in support staff. But "support staff" is a catch-all covering a wide variety of individuals. As it turns out, a fair percentage of support personnel are located at schools and have a direct impact on students, and a significant percentage provide teacher support. Below I review this in (agonizing) detail. The goal is to document the percentage of individuals in positions that directly impact student outcomes, and thus how many positions could be cut without directly impacting those outcomes. I want to do this to disprove claims by the governor, other politicians, and lobbyists that school districts are choosing to lay off individuals who directly impact student outcomes. First, we must determine the number of individuals in such positions and the amount of money expended on their salaries. Next week, I will examine whether districts would have enough money under HB 1 to avoid cutting individuals who directly impact student outcomes. But first, we need to determine who impacts student outcomes.

Why all the Detail?

Stay with me through the post (or skip to the end for the big bang if you are swamped at work, then re-read all the way through when you have time). The reason I go through the agonizing detail is that too many politicians, pundits, and media outlets make claims without providing any substantiation. This is how we get into trouble. What we need is an independent organization to provide unbiased reports that are substantiated with data. TEA could provide such data, but TEA staff communicated to me that they simply do not conduct any research that is not prescribed by federal or state law. This is unfortunate, because data such as those in this post would give policymakers and the public a common understanding of the situation and thus support more informed decisions.

On to the DATA!!!

As shown in Table 1, the greatest numerical increase was for those providing student support, with an increase of almost 14,000 FTEs over the 20-year period. In addition, there was an increase of almost 7,000 FTEs in student special education support. Thus, almost 21,000 of the approximately 38,000 additional FTEs were in positions directly impacting student outcomes. There was also a large increase in administration, with general support growing by almost 12,000 FTEs over the same period.

Table 1: Support Personnel by Broad Job Area (1991, 1996, 2001, 2006, & 2011)

As shown in Figure 1, the data at this broad level suggest that about 54% of the increase in support personnel was in positions that directly impacted students (student support plus student special education support). Around 31% of the increase was for general support. Thus, most of the increase was for student support, but there was also an increase for other support. While we could stop at this level, and many people do, let's dig deeper. Why? Because we can . . . and the more detail, the better we can determine what is really going on.
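A quick sanity check on those shares, using the rounded figures quoted above (the exact table values put the shares at about 54% and 31%; rounding nudges them slightly):

```python
# Approximate FTE increases quoted above; exact values are in Table 1.
student_support = 14_000    # student support FTEs added, 1991-2011
special_ed_support = 7_000  # student special education support FTEs added
general_support = 12_000    # general support FTEs added
total_increase = 38_000     # total support-personnel FTEs added

direct_share = (student_support + special_ed_support) / total_increase
general_share = general_support / total_increase

print(f"direct impact: {direct_share:.0%}")     # ~55% with these rounded inputs
print(f"general support: {general_share:.0%}")  # ~32% with these rounded inputs
```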

Table 2, shown below, provides more detail on the specific job titles of those in the broader job codes above. The largest increases were for counselors, librarians, other campus professionals, teacher facilitators, and other non-campus professionals. Clearly, counselors and librarians provide direct student support. But what about other campus professionals and non-campus professionals?

Table 2: Support Personnel by Specific Job Description (1991, 1996, 2001, 2006, & 2011)

But what about these other campus professionals? What do they do?

Well, Table 3 below details the job titles of these individuals, arranged in broad categories of my own making. The greatest numerical increase by far was for those providing teacher support and assistance. Combining all areas that directly impact student outcomes (student support, discipline management, and special education) yields an increase of over 1,000 FTEs. But this table shows that not all of the other campus professionals have a direct impact on student outcomes.

Table 3: Number of FTEs for Other Campus Professionals in 1991 and 2011 by Job Title

Who are these non-campus professionals? The TEA role code describes them as holding non-instructional positions. But let’s see. You never know until you check.

The details are below in Table 4. Almost all of these non-campus professionals were in either general administration or operations. Thus, almost all of these individuals do not directly impact student outcomes.

Table 4: Number of FTEs for Non-Campus Professionals in 1991 and 2011 by Job Title

So, let’s summarize.

In Table 5, we see that the majority (79%) of other campus and non-campus professionals are NOT in positions that directly impact student achievement. Thus, we need to revise Figure 1, since using the broader categories misclassifies a number of individuals.

But let's not lose the forest for the trees here. We are talking about 20,000 of the more than 61,000 support-personnel FTEs in 2010-11.

But clearly, the data as categorized by TEA at the role-code level is not detailed enough to determine the percentage of individuals in positions that directly impact student achievement, and that is what we ultimately want to know.

Table 5: Number of Other Campus and Non-Campus Professionals by Job and Relationship to Student Outcomes


To determine the number of individuals in positions that impact student outcomes, I had to go to the individual employee data from TEA. This data includes each person’s assignment and partial FTE (full-time equivalent) for that assignment as well as the partial base pay for that assignment.
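A minimal sketch of how such person-level records can be rolled up; the field names and values here are hypothetical shorthand, not TEA's actual column names or figures:

```python
# Hypothetical rows from a person-level file: one row per assignment,
# with a partial FTE and partial base pay for that assignment.
rows = [
    {"role": "teacher", "fte": 1.0, "base_pay": 48000.0},
    {"role": "teacher", "fte": 0.5, "base_pay": 24000.0},
    {"role": "superintendent", "fte": 1.0, "base_pay": 150000.0},
]

# Roll partial FTEs and partial base pay up by role.
totals: dict[str, dict[str, float]] = {}
for row in rows:
    t = totals.setdefault(row["role"], {"fte": 0.0, "base_pay": 0.0})
    t["fte"] += row["fte"]
    t["base_pay"] += row["base_pay"]

print(totals["teacher"])  # {'fte': 1.5, 'base_pay': 72000.0}
```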

Table 6 shows the role codes, role code descriptions, and the designation of each role as having a direct or indirect impact on students. All teachers, educational aides, principals, assistant principals, counselors, librarians, social workers, work-based learning coordinators, visiting teachers, truancy officers, and all special education staff have a direct impact on students. This is true regardless of whether the individual is designated as being located at a school or central office.

Other campus professionals fall into several areas, depending on their specific job title. Some are school-based student support staff, others are teacher support staff, and still others are district support staff.

Superintendents, associate superintendents, assistant superintendents, directors, executive directors, coordinators, managers, business managers, HR directors, and athletic directors are all designated as central office administration. Everyone else is considered central office support.
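The classification rule described above can be sketched as a simple lookup; the role names are paraphrased from this post, not TEA's official role-code labels:

```python
# Roles designated above as having a direct impact on students,
# regardless of whether the person sits at a school or central office.
DIRECT_ROLES = {
    "teacher", "educational aide", "principal", "assistant principal",
    "counselor", "librarian", "social worker", "visiting teacher",
    "work-based learning coordinator", "truancy officer",
    "special education staff",
}

def impact(role: str) -> str:
    """Return 'direct' or 'indirect' student impact for a role."""
    return "direct" if role.lower() in DIRECT_ROLES else "indirect"

print(impact("Teacher"))         # direct
print(impact("Superintendent"))  # indirect
```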

Table 6: Role Codes and Descriptions and Designation of Direct or Indirect Student Impact

Staff Directly and Indirectly Influencing Student Outcomes

Okay, enough playing around, let's get to the bottom line. Below I show the number and percentage of all staff in positions that directly or indirectly impact student outcomes. As shown above, determining which roles and jobs directly impact student outcomes is pretty straightforward for most roles and positions. The support staff positions, in particular the "other campus professionals" and the "other non-instructional/non-campus professionals," are where the difficulty lies. As it turns out, however, some of the administrator positions are miscoded as well. Using the individual person-level data allows me to identify each person's role and job and determine whether the position directly or indirectly impacts student outcomes.

Before showing the results, let’s review who directly impacts student outcomes:

1) Teachers and educational aides, since they are in classrooms every day with students;

2) Principals, assistant principals, nurses, counselors, and librarians; anyone who has worked in schools or who had children in schools knows that all of these individuals help students achieve educational outcomes; and,

3) Special education specialists, whether at a school or at the central office, who make a critical difference in the lives of students with special needs; without these staff, many students would suffer academically and emotionally.

Those who do not directly impact student outcomes are:

1) School-based curriculum and instructional specialists (many would argue that they do directly impact student achievement, but I’ll play it conservatively);

2) Central office leaders (superintendents, associate superintendents, and managers), administrative staff (business-related positions and human resource staff), curriculum and instructional specialists, and operations personnel (plant maintenance, food processing, transportation, etc.); and,

3) Auxiliary staff (bus drivers, custodial staff, etc.).

As we see in Table 7, the total percentage of staff directly impacting student achievement in 1991 was 67.8%. By 2011, the percentage had dropped by all of 0.1 percentage points, to 67.7%. Oh, the bureaucracy!!!! We have gotten so top-heavy in the last 20 years. We have plenty of room to cut positions without harming students!!

Table 7: Number and Percentage of All Staff in Positions Directly and Indirectly Influencing Student Outcomes

What???? Let’s check that again with a simple bar graph!

I don’t know about you, but those bars look pretty much the same. As Charles Barkley says, “I could be wrong, but I’m probably not!”
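In numbers, what those matching bars amount to (percentages from Table 7):

```python
direct_1991 = 67.8  # % of all staff in direct-impact positions, 1990-91 (Table 7)
direct_2011 = 67.7  # % in 2010-11, per the 0.1-point drop reported above

change = round(direct_2011 - direct_1991, 1)
print(f"change: {change:+} percentage points")  # change: -0.1 percentage points
```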

Could other organizations have done this same analysis? Sure; the data from TEA is accessible to anyone with a checkbook. Yet why dig down to the very details when you can make your own ideological point using data that you never checked and that can't be checked by anyone else?

Hope you enjoyed the post. Next up: if about 67% of staff directly impact student outcomes, can districts keep all of them under the current House Budget? Perhaps also an additional post on teacher evaluations and how Senator Shapiro's bill could end up costing beginning teachers their jobs through a statistical error.