So I've got my jQuery autocomplete 'working,' but it's a little fidgety: I call the web service method every time keydown() fires, so I get lots of calls hanging, and sometimes to get the "auto" to work I have to type the word out and backspace a bit - I'm assuming the return value comes back a little slow. I've limited the query results to minimize time. Is there anything I can do to make this a little snappier? This thing seems near useless if I don't get it a little more responsive.

javascript

$("#clientAutoNames").keydown(function () {
        $.ajax({
            type: "POST",
            url: "WebService.asmx/LoadData",
            data: "{'input':" + JSON.stringify($("#clientAutoNames").val()) + "}",
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (data) {
                if (data.d != null) {
                    var serviceScript = data.d;
                }
                $("#autoNames").html(serviceScript);
                $('#clientAutoNames').autocomplete({
                    minLength: 2,
                    source: autoNames,
                    delay: 100,
                    focus: function (event, ui) {
                        $('#project').val(ui.item.label);
                        return false;
                    },
                    select: function (event, ui) {
                        $('#clientAutoNames').val(ui.item.label);
                        $('#projectid').val(ui.item.value);
                        $('#project-description').html(ui.item.desc);
                        pkey = $('#project-id').val();
                        return false;
                    }
                }).data("autocomplete")._renderItem = function (ul, item) {
                    return $("<li></li>")
                        .data("item.autocomplete", item)
                        .append("<a>" + item.label + "<br>" + item.desc + "</a>")
                        .appendTo(ul);
                };
            }
        });
    });

WebService.asmx

<WebMethod()> _
Public Function LoadData(ByVal input As String) As String
    Dim result As String = "<script>var autoNames = ["
    Dim sqlOut As Data.SqlClient.SqlDataReader
    Dim connstring As String = *Datasource*

    Dim strSql As String = "SELECT TOP 2 * FROM v_Clients WHERE (SearchName Like '" + input + "%') ORDER BY SearchName"
    Dim cnn As Data.SqlClient.SqlConnection = New Data.SqlClient.SqlConnection(connstring)
    Dim cmd As Data.SqlClient.SqlCommand = New Data.SqlClient.SqlCommand(strSql, cnn)
    cnn.Open()

    sqlOut = cmd.ExecuteReader()
    Dim c As Integer = 0
    While sqlOut.Read()

        result = result + "{"
        result = result + "value: '" + sqlOut("ContactID").ToString() + "',"
        result = result + "label: '" + sqlOut("SearchName").ToString() + "',"
        'result = result + "desc: '" + title + " from " + company + "',"
        result = result + "},"

    End While
    result = result + "];</script>"
    sqlOut.Close()
    cnn.Close()

    Return result
End Function

I'm sure I'm just going about this slightly wrong or not doing a better balance of calls or something.

Greatly appreciated!

+1  A: 

You could use data caching in the web service and keep some data on the server rather than going to the DB on subsequent calls.

If the data in v_Clients does not have too many records, you can just put the whole thing in the cache, and use LINQ to query it.

http://msdn.microsoft.com/en-us/library/system.web.httpcontext.cache.aspx

Ed B
I'd cache a lot more but there are more than several hundred entries and growing - this is an internal CRM
jphenow
+7  A: 

First off, you MUST adjust your query. You're not using parameterized queries, so you're just begging for a SQL injection attack.

Again, you MUST fix that first!

Once you've done that, play with the "delay" value. I assume that's the amount of time autocomplete waits between keystrokes before sending the query to the server? You want it to wait for your users to pause in their typing before sending the request, to avoid overloading your server with extraneous calls.

You should also use a try..catch..finally and dispose of your connection when you're done with it.

By the way, if your query is performing poorly, try adding an index on that column. 'Like' statements are terrible if you don't have an index...
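On the delay point, a common pattern is to debounce the keydown handler so the AJAX call only fires once the user pauses typing. A minimal sketch - the debounce helper, the sendQuery name, and the 300 ms wait are all illustrative, not part of jQuery UI:

```javascript
// Minimal debounce: fn runs only after `wait` ms pass with no further calls.
function debounce(fn, wait) {
    var timer = null;
    return function () {
        var args = arguments, self = this;
        clearTimeout(timer);                 // cancel the pending call, if any
        timer = setTimeout(function () {
            fn.apply(self, args);            // fire once typing has paused
        }, wait);
    };
}

// Hypothetical wiring: the expensive AJAX call goes inside sendQuery.
// $("#clientAutoNames").keydown(debounce(sendQuery, 300));
```

With this in place, holding down a key or typing quickly produces a single server call instead of one per keystroke.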

jvenema
Thank you for pointing that out. I've heard about SQL injection - could you suggest how I parameterize the query? I'm new to ASP.NET. I hadn't worried much about it yet since this is a small internal CRM, but as you point out how serious it can be, I'll address it.
jphenow
For that matter, the server code lacks error-handling. It should have a try block around the bulk of the code, with a catch block that logs the error and returns something safe, like an empty string. It should also ensure that all disposable objects are disposed of, preferably by using the `using` statement. Along with parameterizing, these are basic requirements for a stable server.
Steven Sudit
I should probably also mention that the return value should be created using String.Format, not concatenation.
Steven Sudit
@jphenow: http://www.aspnet101.com/2007/03/parameterized-queries-in-asp-net/
jvenema
I mentioned elsewhere that I injected the results into the page a little hackishly - if you're familiar with how jQuery's autocomplete works, would you suggest I change how I send the list of possible strings to the page? It was the best workable way I could conceive, but again, it feels a little hackish.
jphenow
@jphenow: I might adjust it a little; JSON.stringify({ input: $("#clientAutoNames").val() }) would be cleaner.
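For what it's worth, pulling that into a tiny helper makes the point concrete and lets JSON.stringify handle all the quoting and escaping (the function name is mine):

```javascript
// Build the POST body for the web service call; JSON.stringify escapes
// quotes and special characters in the typed text for us.
function buildPayload(value) {
    return JSON.stringify({ input: value });
}
```

So buildPayload("O'Brien") produces a body that round-trips safely, unlike a hand-concatenated string.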
jvenema
I was a little more concerned with how I return the value from the web service - it doesn't seem very efficient
jphenow
Oh, yeah. I'd do the same in reverse on the server. Create a class and define it with the appropriate properties (ContactID, SearchName). Then just use a JSON serializer (like the one built into .NET [1]) to serialize it out. You shouldn't need the script tags. ([1] http://msdn.microsoft.com/en-us/library/system.runtime.serialization.json.datacontractjsonserializer.aspx)
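Client-side, the success handler would then map the deserialized rows straight into the shape jQuery UI's autocomplete expects, with no script tags involved. A sketch, assuming the serialized property names match the v_Clients columns in the question (ContactID, SearchName):

```javascript
// Map deserialized server rows into jQuery UI autocomplete items.
// Property names are assumed from the question's v_Clients columns.
function toAutocompleteItems(rows) {
    return rows.map(function (r) {
        return { value: r.ContactID, label: r.SearchName };
    });
}

// In the $.ajax success handler you would then do something like:
// $('#clientAutoNames').autocomplete('option', 'source',
//     toAutocompleteItems(JSON.parse(data.d)));
```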
jvenema
+1  A: 

You want to avoid calling the server on each keystroke unless absolutely necessary.

Is it querying a huge dataset? If not, you can cache the whole dataset in a local variable on page load.

If the dataset is huge, you can think about smart caching, i.e. caching the data that is used most of the time and then getting the less common scenarios on demand.

Also, prevent autocomplete on the first few characters, i.e. require at least 4 before making the call.

You'll have to use your judgement here, but measure and observe as you tweak, to get a good idea of performance, data reliability, etc.
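The first two ideas can be sketched like this - the loadOnce/fetchAll names and the 4-character threshold are illustrative, not fixed API:

```javascript
var MIN_CHARS = 4;      // don't hit the server for very short terms
var allClients = null;  // populated once on page load, reused afterwards

// fetchAll stands in for whatever call loads the full dataset once.
function loadOnce(fetchAll) {
    if (allClients === null) {
        allClients = fetchAll();
    }
    return allClients;
}

// Gate the autocomplete call on a minimum term length.
function shouldQuery(term) {
    return term.length >= MIN_CHARS;
}
```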

Sonic Soul
+2  A: 

You should first consider caching the results coming back from the database before doing anything else. This will limit the number of queries you execute against the database. If you are running these queries through a single web server, consider using System.Web.Cache to cache the results.

Any other advice on how to optimize will depend on more information than you have provided -- i.e. are you running across a web farm, how many database servers do you have, what are your performance requirements, etc.

I would also recommend you read the Wikipedia entry on SQL injection. I believe ASP.NET will reject requests that include single quotes by default, but you should still fix your code.

Eric Hauser
You see how I'm currently adding the queried things into a div selector in the default.aspx page? Would caching it on the service just make it 'query' the service or do you suggest I add it to the page in a better fashion?
jphenow
I would ignore the page aspect until you get the latency correct on the server side implementation. Caching it at the page level could certainly help (i.e. for when the user hits backspace), but you want to get the server side right. You do have an index on the SearchName column?
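A page-level cache like that can be as simple as a term-keyed lookup in front of the AJAX call - a sketch with names of my own choosing, where fetchFn stands in for the real request:

```javascript
// Return cached results for a term, calling fetchFn only on a cache miss.
// Backspacing to a previously typed term then costs no server round-trip.
function cachedLookup(term, cache, fetchFn) {
    if (!(term in cache)) {
        cache[term] = fetchFn(term);
    }
    return cache[term];
}
```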
Eric Hauser
Yeah, it's indexed
jphenow
+1  A: 

Some suggestions for "snappier"

Depending on the requirements, you might build in preventative measures on the client side against too many server calls and implied call latency. For example,

  • auto complete can only kick in after at least 3 characters have been typed; otherwise the results are too broad to matter;
  • the list should continually get smaller as the user types more, so cache the original list on the client-side and filter it as the user continues typing more chars. If the user backspaces past the original result set then grab a new one from the server and do the same with it.
  • employ a client-side timer and issue calls no sooner than 2 or 3 seconds apart: keydown sets a flag when the text box changes, and when the timer fires it checks that flag to decide whether to pull data again.

Granted, some of these might not contribute to snappiness directly, but it depends on how you use them. For example, the timer delay doesn't logically contribute to speed, but compared with the alternative of the server being inundated with calls and responding slowly, it might. Mileage always varies with this type of optimization, and you often trade server work for client-side speed.
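The timer idea in the last bullet might look like this; tick() would be driven by setInterval every couple of seconds, and all the names here are illustrative:

```javascript
// Poll-style throttle: keydown only marks the box dirty; the actual
// server call happens at most once per timer tick.
function makePoller(sendFn) {
    var dirty = false;
    var lastTerm = "";
    return {
        keydown: function (term) {   // call from the keydown handler
            dirty = true;
            lastTerm = term;
        },
        tick: function () {          // call from setInterval(..., 2000)
            if (dirty) {
                dirty = false;
                sendFn(lastTerm);    // one call, with the latest term
            }
        }
    };
}
```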

John K
I can work on the top ones - the last one has me caught. I was working on that, but the way I currently send the return value back to the page makes it difficult to "add" to a list - it's more like rewriting the list. Do you have any ideas on how I could send over a list that can be manipulated?
jphenow
If you receive back a JSON data response then it's already a JavaScript structure (hash or array), so just keep a reference to it in your JS code: `var originalData = whateverJQueryReturned;`. Then write a function that iterates over that hash or array and builds a new data structure limited to values matching what's currently in the text box - that subset is what you display. Your function would run against that original list each time, cutting it down into a new list as the user types.
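That filtering step can be a short function over the cached list; a sketch assuming each item carries a label property, as in the question's code:

```javascript
// Narrow the cached list to items whose label starts with the typed text.
function filterList(originalData, typed) {
    var t = typed.toLowerCase();
    return originalData.filter(function (item) {
        return item.label.toLowerCase().indexOf(t) === 0;
    });
}
```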
John K
I wouldn't worry about merging or adding to lists on the client side. Replace sounds good as a start, then you can get into more optimization measures if needed.
John K