tags:

views:

26

answers:

2

Hi,

I understand the basic concept of caching, but there is one thing I can't understand, even after looking at some examples.

Assume my first request comes at 9am: the content is read from the DB and stored in the cache. When another user then requests the same content, the system serves it from the cache file instead of the DB.

Example 1: I set the cache to expire after 1 hour.

9:00am, first request: the system reads the content from the DB and stores it in a cache file.
9:15am, second request: the system retrieves the content from the cache instead of the DB.
9:24am: a few pieces of content are modified in the DB.

9:30am, third request: does the system now retrieve the content from the DB or from the cache? How does the system know the DB has been updated?

That is my doubt.

Example 2: If I don't set an expiry time, when does the system retrieve the updated content from the database and store it in the cache file?

I'm stuck here.

I apologize if this thread is a duplicate.

+2  A: 

Simple: whenever you update a record in the database, you delete any cached copy of it. This forces the cache to be updated the next time the record is requested.

It should work like this:

$data = retrieveData($id);

retrieveData() does this:

  • Is the data for $id in the cache? Good, return it.
  • If it isn't, fetch the data from the database, write a copy to the cache and return it.

When updating data:

updateData($data);
  • updateData() saves the new $data to the database.
  • It deletes any copy of $data in the cache if there is one.

This means that:

  • The first time you retrieve a record from the database there's no cache. Once you have retrieved it though it'll be cached.
  • The next time the same record can be taken from the cache.
  • When the record is updated, the cache is deleted.
  • The next time the record is requested, there's no cache, so it'll be retrieved from the database and the cache updated again.
  • Rinse, repeat.
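The flow above can be sketched in a few lines. This is an illustrative Python sketch, not the poster's code: a dict stands in for the cache file, and fetch_from_db / the db dict are hypothetical stand-ins for the real data layer.

```python
# Cache-aside sketch: a dict stands in for the cache file,
# and a dict-backed "database" stands in for the real DB.
db = {1: "original article"}
cache = {}

def fetch_from_db(record_id):
    # Hypothetical data-layer call; here it just reads the dict.
    return db[record_id]

def retrieve_data(record_id):
    # Serve from the cache if present; otherwise read the DB,
    # store a copy in the cache, and return it.
    if record_id not in cache:
        cache[record_id] = fetch_from_db(record_id)
    return cache[record_id]

def update_data(record_id, new_value):
    # Write to the DB first, then invalidate any cached copy,
    # so the next retrieval re-reads the fresh record.
    db[record_id] = new_value
    cache.pop(record_id, None)

print(retrieve_data(1))           # miss: reads the DB, fills the cache
print(retrieve_data(1))           # hit: served from the cache
update_data(1, "edited article")  # cache entry deleted
print(retrieve_data(1))           # miss again: fresh copy from the DB
```

The key point is the invalidation in update_data: the cache never has to "know" the DB changed, because the writer deletes the stale copy at update time.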
deceze
OK, but if someone is retrieving records from the cache when it gets deleted, how do they continue their work? Won't they get stuck for a bit?
Bharanikumar
@Bhar See update.
deceze
Yes, correct, but don't get me wrong: say user2 starts retrieving a record at 9:15am (reading it from the cache file) and the admin updates that same record at 9:16am. User2 is still reading the current article while the admin is updating its content in parallel. When the admin submits the update, the current cache file is deleted and a new one created while user2 is still working with that record, so I guess user2 would then hit a file-not-found problem (that is, a concurrency problem)... sorry, my question was wrong.
Bharanikumar
@Bhar *Caching in general* should work as described. What are you really asking about? HTTP caching? Do you want to notify the user that an update has occurred? What would happen if a user were to write back to the database but based his submitted data on an outdated copy? There are many answers to this problem, but they depend heavily on the specifics of your situation.
deceze
+2  A: 

There are two possible interpretations of what you are saying. One is to do with the database keeping results in memory, which is hardly caching.

On the other hand there is HTTP caching, and by the looks of it (assuming that is what you are talking about) you have got it quite wrong. With HTTP caching, user1 requests a page that expires in 3 hours; any time he requests that page within the next 3 hours, no hits come to your server at all. This is (obviously) separate for each user: when user2 comes along, he has no idea who user1 is or what user1 has in his cache, so user2 requests the data from your server, after which future requests from user2 can be served from user2's own cache.
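For the time-based variant, the client simply checks whether its cached copy is still within the freshness lifetime before contacting the server at all. A minimal sketch (Python for illustration; the times are seconds, and max_age_seconds plays the role of the 3-hour expiry above):

```python
# Is a cached response still fresh? Compare its age against max-age.
def is_fresh(fetched_at, now, max_age_seconds):
    return (now - fetched_at) < max_age_seconds

# User1 fetched the page at t=0 with a 3-hour (10800 s) lifetime.
print(is_fresh(0, 7200, 10800))   # 2 hours later: still fresh, no request sent
print(is_fresh(0, 11000, 10800))  # past 3 hours: stale, re-request the page
```

Until the copy goes stale, the server never sees the request, which is why it also never gets a chance to tell the client the content changed in the meantime.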

HTTP also has a mechanism for caching data of unknown lifetime: every time the client requests a page, it adds an If-Modified-Since: <date last fetched/modified> or If-None-Match: <server-specified hash> header. The server can then send a 304 Not Modified HTTP status code if the content has not changed, and does not have to send the body of the file again.
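The validation side can be sketched as a server deciding between a full 200 response and a bodiless 304. This is an illustrative Python sketch, not a full HTTP implementation; respond and current_etag are hypothetical helpers, with an MD5 hash of the content standing in for the server's ETag:

```python
import hashlib

def current_etag(content):
    # Server-specified hash of the current content (stands in for an ETag).
    return hashlib.md5(content.encode()).hexdigest()

def respond(content, if_none_match=None):
    # If the client's If-None-Match value matches the current hash,
    # answer 304 Not Modified with no body; otherwise send the content.
    etag = current_etag(content)
    if if_none_match == etag:
        return 304, None, etag   # content unchanged: no body resent
    return 200, content, etag    # full response with a fresh ETag

status, body, etag = respond("hello page")       # first request: 200 + body
status2, body2, _ = respond("hello page", etag)  # revalidation: 304, no body
print(status, status2)  # 200 304
```

Unlike the expiry-based approach, every request still hits the server, but an unchanged page costs only a tiny header exchange instead of the full body.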

tobyodavies
If I understand your answer, you are saying that for each user's request the system reads the content from the DB server and stores it in a cache. So requests from 5 users mean 5 cache files will be created, right?
Bharanikumar
I believe the question is not so much about HTTP caching, but rather about disk-caching a rendered result for a defined time.
nikc
Except he's talking about 'the system' doing it for him, and the only system tagged is PHP, so the only kind of caching I can assume is happening is HTTP. I'm aware of the first version, but @deceze covered that pretty well.
tobyodavies