We simply don't have enough AAA- and AA-rated data to be statistically confident in these distinctions ex ante, which is why AAA and AA+ rated securities differ very little in yield, usually by only about 10 basis points (0.1%) on average. Here's the data from Moody's, which excludes munis and ABS:
Average Cumulative Issuer-Weighted Global Default Rates (%), 1920-2009
Note that the +/- suffix, like the grades you got in college, just adds further granularity (Moody's uses the less obvious 1, 2, and 3 suffixes, 1 being + and 3 being -), but with one exception: there is no AAA+ or AAA-! So AAA to AA+ is one 'notch'. The main thing to realize is that default rates are approximately log-linear in ratings category, and I would say this is a general law. People perceive things in log space (decibels, the Richter scale, brightness, acidity), and so an "AA" is 2-5x as risky as an "AAA".
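To see what log-linearity means in practice, here's a minimal sketch. The default rates below are made-up illustrative numbers chosen only to exhibit the pattern, not actual Moody's figures; under log-linearity, each one-notch downgrade multiplies default risk by a roughly constant factor.

```python
import math

# Hypothetical 10-year cumulative default rates (%) by notch -- illustrative
# numbers only, NOT actual Moody's data.
rates = {
    "AAA": 0.1, "AA+": 0.25, "AA": 0.6, "AA-": 1.5,
    "A+": 3.8, "A": 9.5,
}
vals = list(rates.values())

# Under log-linearity, the ratio between adjacent notches is roughly constant.
ratios = [b / a for a, b in zip(vals, vals[1:])]

# Least-squares slope of log(rate) vs notch index, by hand with the stdlib.
n = len(vals)
xs = range(n)
ys = [math.log(v) for v in vals]
xbar = sum(xs) / n
ybar = sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)

print([round(r, 2) for r in ratios])  # adjacent-notch risk multiples, ~2.5x each
print(round(math.exp(slope), 2))      # implied per-notch risk multiplier
```

The fitted per-notch multiplier lands in the 2-5x range mentioned above; with real Moody's data the multiplier drifts somewhat across the scale, but the log-linear approximation holds up surprisingly well.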
Compiling data on ratings performance would be a great project for our many regulators, because the rating agencies compile these default studies themselves and self-servingly exclude various data points (note the complete absence of AAA defaults, even though several AAA-rated mortgage-backed CDOs went down, because ABS aren't included in the general tables!). It's basically impossible to compile these without some regulatory authority, and it would be straightforward and very useful.
Yet it appears ratings are pretty good ordinal rankings over 5-10 years. The key is that moving from AAA to AA+ is in one sense small (about 0.03% in annual default rate), but in another a paradigm shift: from a state where risk is 'as low as conceivable' to one where it isn't, and this will focus Treasury buyers on the real probability of a US default ($14T in debt, but around $75T if you include Social Security, Medicare, and Medicaid obligations).
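A back-of-the-envelope check shows why the 10 bp yield gap dwarfs the 0.03% default-rate gap: only part of the spread compensates for expected loss, the rest is premium for leaving the 'as low as conceivable' regime. The recovery rate below is a hypothetical assumption, not a figure from the data above.

```python
# Is a ~10 bp yield spread consistent with a 3 bp annual default-rate gap?
delta_p = 0.0003   # 3 bp (0.03%) higher annual default probability at AA+
recovery = 0.50    # hypothetical recovery rate in default (assumption)

expected_loss_gap = delta_p * (1 - recovery)  # extra expected loss per year
spread = 0.0010                               # the ~10 bp observed yield gap
risk_premium = spread - expected_loss_gap     # residual beyond expected loss

print(f"{expected_loss_gap * 1e4:.1f} bp expected-loss gap, "
      f"{risk_premium * 1e4:.1f} bp residual premium")
```

Under these assumptions only about 1.5 bp of the spread is explained by expected losses; the remaining ~8.5 bp looks like compensation for the regime change itself, consistent with the 'paradigm shift' reading of a one-notch downgrade at the top of the scale.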