I'm trying to improve the performance of a website written in classic ASP.

It supports multiple languages; the problem lies in how this was implemented. It has the following method:

GetTranslation(id,language)

Which is called all over the shop like this:

<%= GetTranslation([someid],[thelanguage]) %>

That method just looks up the ID and language in SQL and returns the translation. Simple.

But it is incredibly inefficient: on each page load there are around 300 independent calls to SQL, each fetching an individual translation.

I have already significantly improved performance for many scenarios:

  • A C# tool that scans the .asp files and picks up references to GetTranslation
  • The tool then builds up a "bulk-cache" method that (depending on the page) takes all the IDs it finds and in one fell swoop caches the results in a dictionary.
  • The GetTranslation method was then updated to check the dictionary for any request it gets, and only go to SQL if the ID isn't already in there (and cache its own result if necessary)
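In sketch form, the updated lookup order is (the names here are illustrative, not the actual code):

Function GetTranslation(id, language)
    Dim key, value
    key = id & "|" & language
    ' Hit the pre-built dictionary first...
    If translationCache.Exists(key) Then
        value = translationCache(key)
    Else
        ' ...and only fall back to SQL on a miss,
        ' caching the result for the rest of the request.
        value = LookupTranslationInSql(id, language)
        translationCache.Add key, value
    End If
    GetTranslation = value
End Function

(translationCache is assumed to be the Scripting.Dictionary populated by the generated bulk-cache call, and LookupTranslationInSql a wrapper around the single-row query.)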

This only goes so far.

When translation IDs are themselves stored in the database, I can't pick these up (at least not easily).

Ideally the GetTranslation method would, on each call, build up one big SQL string that would only be executed at the end of the page request.

Is this possible in ASP? Can I have the result of a <%= ... %> to be a reference to something that is later resolved?

I would also sincerely appreciate any other creative ways I might improve the performance of this old, ugly beast.

A: 

<%= x %> simply translates into Response.Write(x). There's no way to make that deferred.

In fact, Classic ASP has no way to make anything deferred, as far as I can remember.

You've already done a lot in terms of writing the caching tool. Next step would be writing a tool to convert these ASP pages to ASP.NET.

John Saunders
Damn. I feared this might be the case.
joshcomley
There are reasons to dump Classic ASP. There were even reasons to create "ASP Plus", which became ASP.NET. The flexibility of ASP was not one of those reasons.
John Saunders
Managerial decisions.. can't dump it just yet unfortunately.
joshcomley
A: 

Your cache is the best saving you can make here. You could do away with the complication of the .NET pre-cacher and just have each call to GetTranslation check your dictionary for its entry; if it's not there, fetch it and cache it in Application scope. That would be lightning fast. Then there's just the problem of refreshing the cache every now and then, but that happens for you roughly every 24 hours anyway when the worker process gets recycled.
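A minimal sketch of that lazy, Application-scoped cache (the function and key names are assumptions, not anyone's actual code):

Function GetTranslation(id, language)
    Dim key, value
    key = "tr-" & id & "-" & language
    value = Application(key)
    If IsEmpty(value) Then
        value = LookupTranslationInSql(id, language)
        Application.Lock      ' serialise writes to shared Application state
        Application(key) = value
        Application.Unlock
    End If
    GetTranslation = value
End Function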

If you need it to be more up to date than that, you could pull out all your references as you already do with your .NET cacher. Then create a new function that fetches all the entries for a given set of IDs, and write a call to it near the top of each ASP page; it would hit your DB with one SQL statement, stashing the results in a local dictionary. Then modify GetTranslation to use the values from this dictionary. The generated calls would need to be kept up to date, but that could be part of your build process, or just a job that runs every hour/night.
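The generated per-page call might look roughly like this (the helper names, table layout and the conn object are assumptions for illustration):

' Generated near the top of the page by the scanning tool,
' listing every translation ID the page is known to use.
CacheTranslations "12,57,308", "EN"

Sub CacheTranslations(idList, language)
    Dim rs, sql
    ' idList comes from the code scanner, never from user input,
    ' so building the IN clause by concatenation is safe here.
    sql = "SELECT id, text FROM Translations " & _
          "WHERE language = '" & language & "' " & _
          "AND id IN (" & idList & ")"
    Set rs = conn.Execute(sql)   ' conn: an open ADODB.Connection
    Do While Not rs.EOF
        pageTranslations(CStr(rs("id"))) = rs("text")
        rs.MoveNext
    Loop
    rs.Close
End Sub

GetTranslation would then read from the page-scoped pageTranslations dictionary before touching SQL.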

Pete Duncanson
The second solution you proposed is what I mentioned I had already done :)
joshcomley
Hmmm, re-reading it you might be right. I knew you'd done some of the work; I was trying to get across the idea of updating the ASP files themselves via your C# code, but again, re-reading your question, that might be what you are doing anyway, in which case apologies ;) How many translations are we talking about anyway? Could you not just grab them all and cache them? Guessing not, but you never know.
Pete Duncanson
A: 

We use an i18n class with a namespace and language attribute in our e-commerce system. The class has a default function called 'translate' which basically performs a dictionary lookup. This dictionary is loaded, using the memento pattern, from a text file containing all the translations for the namespace and language.

The skeletons for these translation files are generated by a custom tool (written in VBScript, actually) which parses the ASPs for i18n($somestring) calls. The filenames are based on the namespace and language, e.g. "shoppingcart_step_1_FR.txt". The tool can actually update/extend existing translation files when we add new translatable strings to the ASP code, which is very important for maintenance.

The performance overhead of this method is minimal. Due to the segmentation, our largest translation file contains about 200 translatable strings (including static image URLs), so loading it per request has very little effect on performance. I guess one could cache the translation dictionaries in the Application object using some third-party thread-safe dictionary, but IMHO this isn't worth the trouble.
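Loading one of those files could look roughly like this (a "key=value" line format is an assumption; the actual files may differ):

Function LoadTranslations(namespace, language)
    Dim fso, file, line, parts, dict
    Set dict = Server.CreateObject("Scripting.Dictionary")
    Set fso = Server.CreateObject("Scripting.FileSystemObject")
    Set file = fso.OpenTextFile( _
        Server.MapPath(namespace & "_" & language & ".txt"))
    Do While Not file.AtEndOfStream
        line = file.ReadLine
        parts = Split(line, "=", 2)   ' split on the first "=" only
        If UBound(parts) = 1 Then dict(Trim(parts(0))) = Trim(parts(1))
    Loop
    file.Close
    Set LoadTranslations = dict
End Function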

An extra tip, use variable replacement in your string to improve translatability. For example use:

  <%=replace(i18n("Buy $productname"), "$productname", product.name)%>

instead of

  <%=i18n("Buy")%> <%=product.name%>

The first method is much more flexible for the translators. A lot of languages have different sentence structures.

Thanks, again that's very similar to what I've done. I've made it super-clever with my own maintenance/crawler tool. It's now super, super quick!
joshcomley
A: 

I don't think you can do delayed execution in Classic ASP. As for suggestions on improving the performance: you could have a class like this:

Class TranslationManager
        Private Sub Class_Initialize
        End Sub

        Private Sub Class_Terminate
        End Sub

        Private Function ExistsInCache(id, language)
            ExistsInCache = _
                Not IsEmpty(Application("Translation-" & id & "-" & language))
        End Function

        Private Function GetFromCache(id, language)
            GetFromCache = Application("Translation-" & id & "-" & language)
        End Function

        Private Function GetFromDB(id, language)
            ' GET THE RESULT FROM DB (populates resultFromDB)
            Application("Translation-" & id & "-" & language) = resultFromDB
            GetFromDB = resultFromDB
        End Function

        Public Default Function GetTranslation(id, language)
            If ExistsInCache(id, language) Then
                GetTranslation = GetFromCache(id, language)
            Else
                GetTranslation = GetFromDB(id, language)
            End If
        End Function
End Class

And use it like this in your code

Set tm = New TranslationManager
translatedValue = tm([someid], [thelanguage])
Set tm = Nothing

This would definitely reduce the calls to the DB. But you need to be very careful about how much data you put into the Application object; you don't want to run out of memory. It's best to also track how long the translations stay in memory and have them expire (be deleted from the Application object) when they haven't been accessed for some time.

çağdaş
The reason I didn't go for something like this in the end was several-fold: Application doesn't support collections (not a major issue, but added hassle); I didn't want the Application store to grow indefinitely; and I didn't want to cache the actual result of the translation, because what is bound to a translation ID often changes. It now finds which IDs are going to be looked up per page at each request and bulk-caches the results. It still does exactly the above for anything that got missed (but scoped to the request as opposed to the application).
joshcomley
Damn you can't have new lines in comments!<br/>I wonder if a HTML break works...
joshcomley
A: 

The approach I use in my own CMS is to load all the static language strings from the database into the Application object at start-up, using unique keys based on the variable name, language ID and CMS instance. Then I use page-level caching into an array as the page gets created, so repeatedly used variables get cached and lots of trips to the Application object are avoided where possible. I use an array instead of a dictionary because of the way my CMS works: each page is created in sections, with each section isolated from the others, so I'd be creating multiple dictionaries per page, which is undesirable.

Obviously the viability of this solution depends entirely on the number of variables and translations you have, plus how many variables you need to retrieve on each page.
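As a rough sketch, the start-up load might live in global.asa's Application_OnStart (the table, column and helper names here are made up for illustration):

Sub Application_OnStart
    Dim rs
    Set rs = GetConnection().Execute( _
        "SELECT name, languageId, text FROM LanguageStrings")
    Do While Not rs.EOF
        ' Key combines variable name, language ID and CMS instance.
        Application("str-" & rs("name") & "-" & rs("languageId") & _
                    "-" & instanceId) = rs("text")
        rs.MoveNext
    Loop
    rs.Close
End Sub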

RobV