It depends on how you have implemented it, but it sounds like a bad idea; you don't give a lot of details.
If you are storing 100000 records in the same cache bucket and it gets invalidated, then the next request must retrieve all 100000 records before it can proceed, even if it only needs a few of them. If multiple users request the resource at the same time, they will all be queued until the first request finishes repopulating the cache. It sounds like that could be your problem.
It might be better to store the records in individual cache buckets, or in small batches, depending on how much you know about the data and when it will be needed. Then a single request only needs to fetch the records it actually uses before it can proceed, and invalidating one record does not throw away the other 99999.
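To make the idea concrete, here is a minimal sketch of per-record caching using an in-process dict as the cache. All names here (`get_record`, `load_record_from_db`, the `record:<id>` key scheme) are hypothetical; in practice the cache would be something like memcached or Redis, but the shape is the same: each record gets its own key with its own TTL, so a miss only costs one database lookup.

```python
import time

# Hypothetical in-process cache; stands in for memcached/Redis etc.
_cache = {}  # key -> (expires_at, value)
TTL = 300    # seconds

def cache_get(key):
    entry = _cache.get(key)
    if entry and entry[0] > time.monotonic():
        return entry[1]
    return None  # missing or expired

def cache_set(key, value, ttl=TTL):
    _cache[key] = (time.monotonic() + ttl, value)

def load_record_from_db(record_id):
    # Placeholder for the real database lookup.
    return {"id": record_id}

def get_record(record_id):
    """Fetch one record, caching it under its own key."""
    key = f"record:{record_id}"
    value = cache_get(key)
    if value is None:
        value = load_record_from_db(record_id)
        cache_set(key, value)
    return value
```

With this layout, a request that needs three records touches three keys, and invalidating one record (deleting `record:42`) leaves everything else warm instead of forcing a full reload.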
I don't know how much this helps, since the question doesn't give much detail. Feel free to update your question, and you might get better answers.