Different data structures support different use cases. For instance, a distributed hash table would be a good choice if you can make do with the limited API of a dictionary/map interface.
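Just to make that "limited API" concrete, here is a rough sketch of the kind of interface I mean. The class and method names are made up for illustration (a real DHT would route each key to a node behind the scenes), but the point is that the caller only ever sees get/put/delete on single keys:

```python
# Illustrative only: the narrow get/put/delete interface a distributed
# hash table (or any key-value store) typically exposes. An in-memory
# dict stands in for the distributed backend.
class KeyValueStore:
    def __init__(self):
        self._data = {}  # stand-in for the distributed backend

    def put(self, key: str, value: bytes) -> None:
        self._data[key] = value

    def get(self, key: str) -> bytes | None:
        return self._data.get(key)

    def delete(self, key: str) -> None:
        self._data.pop(key, None)


store = KeyValueStore()
store.put("user:42", b'{"name": "Alice"}')
print(store.get("user:42"))  # b'{"name": "Alice"}'
```

If your queries fit that shape, you get horizontal scalability almost for free; the moment you need lookups by anything other than the key, you're fighting the data structure.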
If you need to run more complex queries against your database, then pick a database that supports that use case efficiently. The database landscape is very varied, and there is probably a database out there that fits your needs.
For range queries, the BigTable clones (and anything upwards in query expressiveness) are probably worth considering.
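The reason range queries are cheap in BigTable-style systems is that rows are kept sorted by key, so a range query is "seek to the start key, then scan forward". A toy sketch of that idea, using the standard library's `bisect` over a sorted key list as a stand-in for the store (the key names are invented for the example):

```python
import bisect

# Toy stand-in for a BigTable-style store: rows kept sorted by key, so a
# range query is "seek to the start key, scan until the end key".
rows = sorted({
    "event:2024-01-01": "new year",
    "event:2024-03-15": "release",
    "event:2024-07-04": "launch",
    "event:2025-01-01": "retro",
}.items())
keys = [k for k, _ in rows]


def range_scan(start: str, end: str):
    """Yield all (key, value) pairs with start <= key < end."""
    i = bisect.bisect_left(keys, start)
    while i < len(keys) and keys[i] < end:
        yield rows[i]
        i += 1


# Everything in 2024:
print(list(range_scan("event:2024", "event:2025")))
```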
Even if the database with the right data structure isn't fast enough or doesn't scale far enough on its own, you can still pull tricks like sharding, replication, and clever use of caching or search indexes.
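Sharding, for example, mostly comes down to routing each key deterministically to one of N database instances. A minimal sketch of hash-based routing (the shard names here are just labels; in practice they'd be separate database connections):

```python
import hashlib

# Illustrative hash-based sharding: route each key deterministically to
# one of N shards. The same key always lands on the same shard.
SHARDS = ["db-shard-0", "db-shard-1", "db-shard-2", "db-shard-3"]


def shard_for(key: str) -> str:
    # md5 is used only for an even, stable distribution, not for security.
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]


print(shard_for("user:42"))
print(shard_for("user:43"))
```

The catch is that simple modulo routing reshuffles most keys when you add a shard, which is why consistent hashing schemes exist; but the basic idea is this simple.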
How you weigh the constraints of consistency, availability, resilience, throughput and latency depends on the specific problem, and you may find that you need more than one type of database to implement the optimal solution.
Don't artificially make things harder for yourself; premature optimization, you know.
I work at a MySQL shop, and it's fast enough for our hundreds of thousands of transactions a day (millions of queries), and reliable enough that we trust it when our software is used in live TV events.
Sorry that I can't give you a more concrete answer than this.