Yesterday I noticed an article in the Chicago Tribune (linked by the Chronicle of Higher Ed.) about an uproar surrounding a report by the National Council on Teacher Quality (NCTQ) that "grades" college programs designed to train educators. Like legal education programs, educator training programs have seven-year accreditations, and their graduates must pass a statewide exam after graduation. As with all rankings, the ranked do not agree with the criteria used to create the rankings. One commentator analogized the report's methodology to "evaluating the quality of restaurants by only requesting menus be mailed to the evaluator — without sampling the food or visiting the site." This seems an overstatement, however, as the report details its methodology in its Appendix:
NCTQ based its evaluation of each program’s design on many different sources of data. Each analysis started with an initial review of course catalogs and other program material posted publicly by the institution to identify much of the core data that are required for the review: institutional admissions standards and an education school’s own admission policy; general education course requirements; course requirements for secondary teachers in their subject area(s); professional course requirements and descriptions; graduation requirements; course schedules; and teaching assignments and faculty listings. Our solicitation of materials from Illinois institutions requested the syllabi for particular courses necessary for evaluation (for specified sections – selected at random – if multiple sections were offered), information on graduate and employer surveys, student teaching handbooks, and 10 elementary or secondary schools at which student teachers were placed. We augmented our analysis of these sources with additional information obtained from surveys of personnel in public school districts: principals of those schools in which a program places its student teachers, and superintendents who hire a program’s graduates.
If this sounds like a lot of work, that's because it is. The report evaluated over 50 schools, and each school's preliminary assessment took over 40 hours. I have no dog in the educator training fight (I live in a college town, so I obviously know professors in our School of Education, and our babysitter is a student there), but I think it is interesting to set these criteria alongside those of the USNWR law school rankings, which seem to rule the world.
As far as I know, the USNWR does not look at our course listings, our syllabi, or our materials. It does not look at our clinical programs, externship programs, or summer associate placements. The USNWR does look at our admissions data and does survey hiring partners, judges, and academic peers. I don't believe USNWR surveys graduates, non-law-firm employers, or law firm clients. And they sure don't do site visits!
Now, none of this is to say that the education training rankings are more or less accurate than the law school rankings; it's just interesting to hear what ranking victims in other disciplines hate about their ranking criteria!