ZNC Dandy wrote: Criteria used, as reported in the article...
Thanks for the criteria. Here's my opinion:
-Modern Romantic orchestra
I assume this refers to size and instrumentation. That makes it more of a qualifier than a criterion.
-concert performances
This seems like a decent quantifiable measure, though it raises the question of whether free performances are counted. Not a big deal here, as most orchestras play in halls and charge admission; free concerts can usually be counted on one hand for most North American groups.
-recording output
This also seems to be a good measure, but it has real flaws. I remember years ago when "Laserlight" started putting out recordings of a lot of the standard repertoire. Some were good, but most were nothing to write home about, and quite a few were played by a former Eastern Bloc orchestra. So there is some padding that might influence the final results here. Also, some groups, like St. Louis, really are "off the beaten path" with regard to recording projects.
-contributions to local and national communities
This one I found a bit odd. Considering most orchestras (at least in the USA) have a large number of benefactors, I would be wary of one that is doing a lot of "contributing" above and beyond driving PR campaigns and raising awareness among younger children. This can also be quite a dynamic measure.
-ability to maintain an iconic status in an increasingly corporate climate
I'd love to see the quantifiable measurements in this category. Major Seven? Big Three? Big Five? Top tier? Tier II? Blah blah blah. This category sounds overly subjective, with the "increasingly corporate climate" thrown in to make it appear to have more gravitas.
There's some fabric for you. Please continue the discussion.

I still stand by my original conclusion: lists are created to generate revenue, not to actually quantify results or inform real decisions.