"Handle up to ~100,000 insert commands a second" - is this peak, or normal operation? If normal operation, your 'millions of records stored' is likely to be billions...
With questions like this, I think it is useful to understand the business 'problem' further, as these are non-trivial requirements. The question is whether the problem justifies this 'brute force' approach, or whether there are alternative ways of looking at it that achieve the same goal.
If the volume is genuinely needed, then you can consider methods of aggregating / transforming the data to make it easier to manage: bulk loading instead of single-row inserts, discarding multiple updates to the same record, or loading to multiple databases and aggregating downstream as a combined set of ETLs. There is a rough sketch of the first two ideas below.
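To make the "discard multiple updates to the same record" plus bulk-loading idea concrete, here is a minimal sketch. Everything in it is made up for illustration (the `CoalescingWriter` name, the `events` table, the batch size), and `sqlite3` is used only as a self-contained stand-in for whatever your real target database is - the pattern is the same with any driver:

```python
# Write coalescing sketch: buffer incoming insert/update commands, keep
# only the latest version of each record key, and flush the batch as one
# multi-row round trip instead of one round trip per command.
import sqlite3

BATCH_SIZE = 10_000  # flush threshold; tune against your latency budget


class CoalescingWriter:
    def __init__(self, conn, batch_size=BATCH_SIZE):
        self.conn = conn
        self.batch_size = batch_size
        self.pending = {}  # record_id -> latest value; later writes win

    def submit(self, record_id, value):
        # Multiple updates to the same record collapse into one entry here,
        # so the database only ever sees the final state per batch.
        self.pending[record_id] = value
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.pending:
            return
        rows = list(self.pending.items())
        self.pending.clear()
        # One executemany per batch instead of one statement per command.
        self.conn.executemany(
            "INSERT INTO events (id, value) VALUES (?, ?) "
            "ON CONFLICT(id) DO UPDATE SET value = excluded.value",
            rows,
        )
        self.conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, value TEXT)")
    writer = CoalescingWriter(conn, batch_size=3)
    # Five commands arrive, but ids 1 and 2 repeat, so the table ends up
    # with only three rows holding the final state of each record.
    for rid, val in [(1, "a"), (2, "b"), (1, "a2"), (3, "c"), (2, "b2")]:
        writer.submit(rid, val)
    writer.flush()
    print(conn.execute("SELECT * FROM events ORDER BY id").fetchall())
    # -> [(1, 'a2'), (2, 'b2'), (3, 'c')]
```

In a real system you would flush on a timer as well as on batch size, and the same coalescing step is where you could shard batches across multiple target databases if one instance can't absorb the write rate.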