I don’t think America will ever have a national theater. Yes, theater is popular, but it’s not culturally important to everyone. American theater didn’t start until the 1700s, because that’s when America was formed, while European countries have histories that stretch back centuries. America wasn’t as involved in developing theater as England and France were; to some extent we just adapted their plays. The article mentions that when America was first founded, the culture of the Puritans didn’t leave space for theater (source).

I think the timeline difference between America and European countries was one factor, but I also think the structure of the US didn’t allow for theater, at least not in dedicated theater buildings. The US is huge, and for much of its history it was hard to travel those long distances. Theater would have been regional, and it would have lagged behind westward expansion. Until the railroad and the Industrial Revolution, it wouldn’t have been easy to bring theater everywhere, or to show one production to a lot of people.
I was surprised to learn that America did have a national theater historically, but the fact that it was a Great Depression-era program also made sense. Perhaps when theaters reopen the government could support them again, but I feel like that’s not likely. Theater just isn’t important to everyone. It’s also inaccessible to many people: ticket costs are rising, especially on Broadway, and there’s often an idea that regional theater is somehow lesser.
I could keep going, but at the end of the day, theater is not universally important in the US. And unless that (or something else) changes, we won’t see a national theater. Maybe this is where theater’s story ends: it started with the Greeks as a universally loved and publicly subsidized art form, and now many people have never seen, and may never see, a live play. Maybe theater will collapse and disappear. Or maybe it means we need to change the theater system.