Caching is a good thing if you need it and a terrible thing if you don't. Is your app showing strain? Have you benchmarked it? Have you isolated the bottleneck to the point where you could select an optimization if you had a cartload of them? The reason my answer is a question is that no two scaling issues have the same solution, so without a much clearer idea of where your app is hung up, it's hard to say whether caching will help.
On a side note, unless your app's principal function is querying users and roles, you will probably find the traffic down that path is not particularly dense. Typically you hit it once, squirrel the information away in the session, and off you go. If you put a lot of work into denormalizing and caching, you may not find it pays the dividends you expect. And don't forget you now have to test both the cache-hit and cache-miss scenarios to make sure everything functions identically.
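That "hit it once and squirrel it away" pattern is usually only a line or two. A minimal sketch, assuming a `current_user` helper and a `roles` association (both names are my assumptions, not anything from your app):

```ruby
class ApplicationController < ActionController::Base
  # Look the roles up on the first request, then reuse the session copy.
  def current_role_names
    session[:role_names] ||= current_user.roles.map(&:name)
  end
  helper_method :current_role_names
end
```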
And more directly to your question:
- memcached is a good LRU cache. There are other good KV stores like Redis and Tokyo Cabinet. (See the `Rails.cache` sketch after this list for what using it looks like in practice.)
- Counter caching is low-hanging fruit if (and only if) you have a frequently accessed counter. For example, if you need to know how many friends a user has, but don't need to know anything about the friends, you can just denormalize friends.count (a sketch follows the list).
- I'm not sure what caching IDs would buy you unless the roles remain pretty static. You have to develop a strategy for freshening them when the roles change and ... well, then you have to test the cache-hit and cache-miss scenarios.
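If you do go the memcached route, Rails's cache store wraps it, and the same call is where you handle freshening. A rough sketch of caching role IDs with an expiry plus an explicit invalidation hook; the key format, the `roles` association, and the method names are illustrative assumptions, not your app's API:

```ruby
# In an initializer or environment config, point the cache store at memcached:
#   config.cache_store = :mem_cache_store, "localhost:11211"

class User < ActiveRecord::Base
  has_many :roles

  # Cache miss: query the database and store the result.
  # Cache hit: return the stored IDs without touching the database.
  def cached_role_ids
    Rails.cache.fetch("user/#{id}/role_ids", :expires_in => 10.minutes) do
      role_ids
    end
  end

  # Call this whenever the user's roles change so stale IDs don't linger.
  def expire_role_ids!
    Rails.cache.delete("user/#{id}/role_ids")
  end
end
```

Note that the two branches of `cached_role_ids` are exactly the cache-hit and cache-miss paths you'll need to cover in your tests.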
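For the counter case, Rails has the denormalization built in as `counter_cache`. A minimal sketch, assuming a `friendships` join table and a `friendships_count` integer column on `users` (both names made up for illustration):

```ruby
class User < ActiveRecord::Base
  has_many :friendships
end

class Friendship < ActiveRecord::Base
  # Keeps users.friendships_count in sync on create/destroy,
  # so reading user.friendships_count never issues a COUNT(*) query.
  belongs_to :user, :counter_cache => true
end
```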
My experience is that the biggest bang you can get for your optimization buck is caching pages. After that, fragments, then down to database-level and application-level tweaking.
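As a rough illustration of the page-caching end of that spectrum (older Rails ships `caches_page`; in newer versions it lives in the actionpack-page_caching gem, and the controller here is invented for the example):

```ruby
class ProductsController < ApplicationController
  # First request writes public/products.html; after that the web server
  # serves the file itself and Rails is never invoked for this action.
  # Use expire_page when the underlying data changes.
  caches_page :index

  def index
    @products = Product.all
  end
end
```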
Have you profiled your app?