The most important aspect of the problem is how you perform the query that finds new records for the user. You only want to pull back records that you haven't already retrieved earlier on the page. One way to achieve this is to store a variable on the client that tracks the chronologically latest record retrieved so far, and then only query the server for the user's records that fall after that point. For this to work, your records need a timestamp, or their primary keys need to be guaranteed to sort after those of earlier records.
// jQuery example: ask the server only for records newer than the
// chronologically latest one already on the page
$.getJSON(
    "http://domain/url?after=" + lastTimeStamp,
    function (data) {
        // Render the data, then advance lastTimeStamp to the newest
        // record in this batch so the next request excludes it
    }
);
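To keep that marker current, you can run the request on a timer and advance it from each response. Here's a minimal sketch of that loop; the poll interval, the response shape (an array of records with a timeStamp field), and the renderRecords helper (sketched further down) are all assumptions, not part of any real API:

// Hypothetical polling loop: every few seconds, fetch records newer
// than lastTimeStamp and advance the marker from the response
var lastTimeStamp = 0; // the newest timestamp seen so far

setInterval(function () {
    $.getJSON(
        "http://domain/url?after=" + encodeURIComponent(lastTimeStamp),
        function (data) {
            renderRecords(data); // see the rendering sketch below
            // Advance the marker past the newest record in this batch
            $.each(data, function (i, record) {
                if (record.timeStamp > lastTimeStamp) {
                    lastTimeStamp = record.timeStamp;
                }
            });
        }
    );
}, 5000); // poll every 5 seconds; tune to your needs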
Once you've ensured that the query retrieves only the minimum number of records the client needs, you also want to make sure the query runs quickly. Indexing the aforementioned timestamp column and the foreign key column that associates each record with its user will help. Also, if you use a stored procedure, the DB engine's cached execution plan should make the query come back much quicker.
If your users will be returning to the page frequently, and you don't have a lot of concurrent users, you could also cache query results on the server.
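As a rough illustration of the caching idea (written in JavaScript for consistency with the other snippets, though your service would likely be .NET), here's a tiny in-memory cache keyed by user and marker; the names getRecordsCached and queryFn and the TTL value are made up for the sketch:

// Illustrative only: cache query results per user and marker so that
// repeated polls with the same "after" value skip the database
var cache = {};     // key -> { data, expires }
var TTL_MS = 5000;  // a short TTL keeps results reasonably fresh

function getRecordsCached(userId, after, queryFn) {
    var key = userId + ":" + after;
    var hit = cache[key];
    if (hit && hit.expires > Date.now()) {
        return hit.data;               // serve the cached result
    }
    var data = queryFn(userId, after); // hit the database
    cache[key] = { data: data, expires: Date.now() + TTL_MS };
    return data;
}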
If speed is the only factor, it doesn't matter a whole lot which server-side technology you use to host the code that sends the records to the client. What is important is that you only send back raw data. In other words, I would recommend against using an ASP.NET UpdatePanel. Instead, I would create a web service, perhaps in WCF, that encodes objects as JSON to reduce the size of the response and the time it takes to parse it. Then I'd have the client run JavaScript code to generate the HTML. DOM manipulation, even through tools like jQuery, is quite fast.
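For instance, the client-side rendering can be as simple as the following; the #records container and the text field are assumptions about your markup and schema:

// Hypothetical client-side rendering of the JSON response
function renderRecords(records) {
    var $list = $("#records"); // assumed container element
    $.each(records, function (i, record) {
        // Build a list item per record; .text() also escapes the content
        $("<li/>").text(record.text).appendTo($list);
    });
}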