Hello, I have an AJAX autocomplete on an ASP.NET page. It calls a web service method that returns postal codes:

public string[] GetNames(string prefixText, int count, string contextKey)
{
    prefixText = prefixText.Trim();
    List<string> names = new List<string>();

    // Only look up codes when the prefix starts with a digit.
    if (prefixText.Length > 0 && char.IsDigit(prefixText[0]))
    {
        // Fall back to the default group when no context key is supplied.
        XmlNodeList list = string.IsNullOrEmpty(contextKey)
            ? cpsForAgences["groupe"]
            : cpsForAgences[contextKey];

        int i = 0;
        foreach (System.Xml.XmlNode node in list)
        {
            if (node.InnerText.ToLower().StartsWith(prefixText))
            {
                names.Add(node.InnerText);
                if (++i >= count)
                    break;
            }
        }
        names.Sort();
    }

    // Every code path must return; an empty array for non-digit prefixes.
    return names.ToArray();
}

On the client side, before displaying the responses, the extender calls Sys.Serialization.JavaScriptSerializer.deserialize():

try {
    var pair = Sys.Serialization.JavaScriptSerializer.deserialize('(' + completionItems[i] + ')');
    if (pair && pair.First) {
        text = pair.First;
        value = pair.Second;
    } else {
        text = pair;
        value = pair;
    }
}

For postal codes beginning with '0', the result returned by Sys.Serialization.JavaScriptSerializer.deserialize() differs from completionItems[i], which holds exactly my value. Why this behaviour? How can I avoid it? Thanks!

+1  A: 

I'd try using an integer instead of a String.
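A hedged sketch of how that suggestion plays out on the client: once the value is an integer there is no leading zero for the deserializer to misread as octal, though the zero then has to be re-padded for display (the five-digit width below is an assumption about the postal-code format).

```javascript
// Serialized as an integer, "123" has no leading zero, so
// deserialization cannot misread it as an octal literal.
var code = parseInt("0123", 10);            // 123
var roundTripped = eval("(" + code + ")");  // 123, unchanged

// Re-pad for display (assumes five-digit postal codes).
var display = ("00000" + roundTripped).slice(-5);
console.log(display);                       // "00123"
```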