This was the first time I'd listened to one of the Science and Tech podcasts, and I'm sorry to say I couldn't make it past the 11-minute mark. The whole first segment seemed to be based on nothing more than a headline, and no one appeared to have been told in advance that it would be a topic of discussion. The participants should get some sort of heads-up so they aren't visibly rushing to educate themselves on a topic during the podcast itself.
It's pretty clear from the National Nuclear Security Administration's own press release (http://nnsa.energy.gov/mediaroom/pressreleases/trinity) that the Trinity supercomputer will not "manage" nuclear weapons in the way the podcast at first seemed to imply. In fact, the press release does not use that word to describe Trinity's function at all. Trinity's purpose is to handle the modeling and simulation for the "Stockpile Stewardship Program." That program, simply put, is the government's way of making sure the weapons still work, estimating how well they can be expected to work over the years, and judging whether or not an accident might be imminent. Having weapons that function, and knowing how they can be expected to function, is a critical part of the deterrence equation, whether you believe in deterrence or not. Knowing when weapons become nonfunctional or dangerous and need to be dismantled is also serious business.

All of these models are likely to be painfully specific, and statistical simulations of this sort are notoriously demanding in both processing power and storage space. You'll need a lot of both if you're trying to account for every conceivable outcome.
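To give a sense of why that adds up, here's a toy Monte Carlo sketch in Python. It has nothing to do with the classified physics codes LANL actually runs; the degradation model, threshold, and numbers are all made up. The only point is that the error bars on this kind of estimate shrink slowly with sample count, so "accounting for every conceivable outcome" quickly turns into enormous compute and storage requirements.

```python
import random

def component_ok(age_years, rng):
    """Made-up aging model: a margin erodes by a small random amount each year."""
    margin = 1.0
    for _ in range(age_years):
        margin -= rng.gauss(0.01, 0.02)  # hypothetical annual drift + shock
    return margin > 0.5                   # hypothetical pass/fail threshold

def estimate_reliability(age_years, n_samples, seed=0):
    """Estimate the fraction of sampled scenarios in which the component still passes."""
    rng = random.Random(seed)
    passes = sum(component_ok(age_years, rng) for _ in range(n_samples))
    return passes / n_samples

if __name__ == "__main__":
    # Halving the statistical error roughly requires 4x the samples,
    # and each sample in a real simulation carries far more state than
    # a single number, which is where the appetite for a machine like
    # Trinity comes from.
    for n in (1_000, 10_000, 100_000):
        print(n, estimate_reliability(age_years=30, n_samples=n))
```

Again, that's a single scalar per scenario; a real stockpile simulation tracks full 3D physics per scenario, so the same scaling argument lands you at petabytes rather than kilobytes.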
Until quite recently, the United States got a lot of this basic data by detonating warheads just to see what would happen. Having a supercomputer means you don't have to set off nuclear weapons just to check in on them.
EDIT: This recent LANL video helps give some sense of what the "stockpile" has encompassed in the past and the kind of work that goes into maintaining it: https://www.youtube.com/watch?v=dWA5Z32tiKM