fastcompany.com launched with ~750,000 pieces of content on day 1. They had performance and scaling problems initially, but those stemmed specifically from the fact that large-scale faceted search of the entire content base turned out to be the most popular feature, and they weren't using a dedicated search indexing system.
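To make the faceted-search point concrete, here's a rough sketch (the schema and numbers are made up for illustration, not fastcompany.com's actual setup) of why facets are expensive in plain SQL: every facet widget on the page needs a count per facet value for the current filter, which means an aggregate scan per facet on every page view. That's exactly the work a dedicated search server like Solr/Lucene precomputes in its index.

```python
import sqlite3

# Hypothetical content table -- illustrative only, not Drupal's real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE node (nid INTEGER PRIMARY KEY, type TEXT, year INTEGER, topic TEXT)")
conn.executemany(
    "INSERT INTO node (type, year, topic) VALUES (?, ?, ?)",
    [("article", 2000 + i % 9, ["design", "tech", "leadership"][i % 3]) for i in range(1000)],
)

# One facet ("topic") = one GROUP BY aggregate over the filtered result set.
# With ~750k rows and several facets, that's several full scans per request
# when done directly in the database.
facet_counts = conn.execute(
    "SELECT topic, COUNT(*) FROM node WHERE type = 'article' GROUP BY topic"
).fetchall()
print(facet_counts)
```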
The New York Observer converted to Drupal a while ago, and its scaling problem had nothing to do with the amount of content; it was the straightforward "how do we handle Drudge and the Huffington Post both linking to us at the same time during election season?"
The Onion, Lifetime Television, and a number of other pretty large sites use Drupal. Mother Jones magazine just converted to it. NowPublic.com, the crowdsourced news site, also runs on Drupal and has since the (much slower) days of Drupal 4.7.
The key scaling issue is not really how many discrete pieces of content you have, but rather what kind of slicing and dicing your queries do. Those are optimized ad hoc, like any other SQL query. Out of the box, Drupal tends to optimize for small-to-medium sites; the larger stuff requires prodding around at the indexes and paying attention to how you build your Views-based pages (since those are basically just presentation logic wrapped around SQL).
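The "prodding around at the indexes" step can be sketched like this. The table and index names are hypothetical, but the pattern is real: a typical Views-style listing query ("latest 10 published articles") does a full scan plus a sort until you add a composite index matching the filter and sort columns, which you can confirm with the database's query-plan output (EXPLAIN in MySQL, EXPLAIN QUERY PLAN in the SQLite used below).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE node (nid INTEGER PRIMARY KEY, type TEXT, status INTEGER, created INTEGER)")

# The kind of SQL a Views-based listing page boils down to.
query = """SELECT nid FROM node
           WHERE type = 'article' AND status = 1
           ORDER BY created DESC LIMIT 10"""

# Without a supporting index: full table scan, then a sort, on every page view.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# A composite index covering the filter columns and the sort column lets the
# database walk the index in order instead.
conn.execute("CREATE INDEX node_type_status_created ON node (type, status, created)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before)
print(after)
```

Reading the two plans side by side is usually all it takes to see whether a slow Views page needs an index or a rewrite.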
As an earlier poster noted, if you don't need lots of user-customized content ('stuff my friends have posted,' 'what my buddies are doing,' etc.) the amount of expensive querying drops dramatically.
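The reason per-user content is the expensive case can be shown in miniature (table names here are illustrative, not Drupal's actual schema): a site-wide "latest content" list is the same for every visitor, so you run it once and cache the rendered page, while a "what my buddies posted" list joins against each user's own friend list, so it has to run per logged-in user, per request, and mostly defeats page caching.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE node (nid INTEGER PRIMARY KEY, uid INTEGER, created INTEGER);
    CREATE TABLE friend (uid INTEGER, friend_uid INTEGER);
""")
conn.executemany("INSERT INTO node VALUES (?, ?, ?)", [(i, i % 5, i) for i in range(50)])
conn.executemany("INSERT INTO friend VALUES (?, ?)", [(1, 2), (1, 3)])

# Identical for every visitor -- run once, cache the page.
site_wide = conn.execute(
    "SELECT nid FROM node ORDER BY created DESC LIMIT 10"
).fetchall()

# Different per user -- the join runs on every request for every logged-in user.
feed = [row[0] for row in conn.execute(
    """SELECT n.nid FROM node n
       JOIN friend f ON n.uid = f.friend_uid
       WHERE f.uid = ?
       ORDER BY n.created DESC LIMIT 10""",
    (1,),
)]
print(feed)
```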