I have a list of zip-codes that I need to search through using jQuery.

I have the zip-codes in a CSV file like this:

2407;ELVERUM
2425;TRYSIL
2427;TRYSIL
2446;ENGERDAL
2448;ENGERDAL

The list is pretty big: over 4000 entries of zip-code and corresponding city.

What's the fastest way to search through the list in the browser? JSON? If that's the case, how can I convert the list to JSON, or to another format if that's better? Something like this:

{
     "2407": "ELVERUM",
     "2425": "TRYSIL"
}

Can someone show me the best way to do this?

Update: Would it be possible/faster to search the loaded CSV file with just a regex?

Update 2: I'm looking for an exact match, and it's only going to search once the input has 4 digits.

Update 3: Here is my code:

$('#postnummer').keyup(function(e) {
    if ($(this).val().length == 4) {
        // Code to search the JSON for an exact match.
    }
});

$.getJSON("data.json", function(data) {
});

Can anyone show me how to do this using the code above?

A: 

For large data sets, JSON parsing would be slow. I suggest you use a simple custom format instead; the one you already have is fine:

2407;ELVERUM
2425;TRYSIL

Then you can parse like:

var data = dataContent.split('\n').map(function(line){ return line.split(';'); })

Here data is an array of two-element arrays: the first element is the zip-code and the second is the city name. Then, to search through it, you would do something like:

var foundItems = data.filter(function(p){ return p[0].indexOf(query) != -1; });

This gets you an array of matches. If you are only interested in exact matches, using a map (a plain object) would perform better. In that case, in place of the first line, you might do this to build the mapping:

var map = {};
dataContent.split('\n').forEach(function(line){
    var p = line.split(';');
    map[p[0]] = p[1];
});
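
A lookup is then just a property access on the map. Wired into the keyup handler from the question, it could look roughly like this (a sketch; it assumes map has been built from the CSV before the user starts typing):

$('#postnummer').keyup(function(e) {
    var query = $(this).val();
    if (query.length == 4) {
        var city = map[query];   // undefined if the zip-code isn't in the list
        if (city) {
            console.log(query + ': ' + city);
        }
    }
});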
toby
Wouldn't this (the map function) be as slow as (if not slower than) parsing the JSON itself? Also, the problem with arrays is that every lookup is O(n).
Chetan Sastry
It doesn't work. The foundItems array is just empty. Here's my code:

$('#postnummer').keyup(function(e) {
    if($(this).val().length == 4) {
        var foundItems = data.filter(function(p){ return p[0].indexOf($(this).val()) == 0; });
        console.log(foundItems);
    }
});
var map = {};
$.ajax({
    url: "tilbud5.csv",
    cache: false,
    success: function(dataContent){
        data = dataContent.split('\n').map(function(line){ return line.split(';'); })
    }
});

Any thoughts?
mofle
Chetan - You are right, looping through the lines of data is O(n), and so is parsing the JSON. But in practice, split() is many times faster than eval() - the function used to parse JSON, which is essentially calling the JavaScript interpreter. Looping through an array to find a match is slower than a map, but if you want to do partial matching, that's what you have to do, short of creating an index. But it looks like mofle just wants an exact match, for which a map will suffice, which was my last example.
toby
Also, see this great article on this topic: http://code.flickr.com/blog/2009/03/18/building-fast-client-side-searches/
toby
You aren't going to be able to do all those splits in an interpreter faster than the same interpreter can parse an object literal. One call to split may be faster than one call to eval, but here you are claiming that 4001 calls to split are faster than one call to eval. The only reason to use splitting is if you need to do incremental parsing of the results, for instance if you were parsing so many zip codes that you needed to use setTimeout to break up the work. On Safari, I can parse an object with about 40,000 properties in ~80ms, so I doubt this is one of those cases.
Kelly Norton
@Kelly Norton one call to eval is not just one call to eval, though. The JavaScript engine has to parse the same amount of data as 4000 splits. I'm not saying which is faster, just saying the comparison isn't quite true.
Gabriel Hurley
Kelly - You are correct. I was wrong, and I am sorry for being misleading. The speeds of the two are comparable, as Ross of Flickr states in his article. Also, you were very apt to point out incremental parsing as a reason to use split(); that is what I did in a project to avoid freezing the browser.
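
A rough sketch of that incremental idea (the chunk size of 500 lines is arbitrary):

var lines = dataContent.split('\n'), map = {}, i = 0;
(function chunk() {
    // parse a slice of the file, then yield back to the browser
    for (var end = Math.min(i + 500, lines.length); i < end; i++) {
        var p = lines[i].split(';');
        map[p[0]] = p[1];
    }
    if (i < lines.length) setTimeout(chunk, 0);
})();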
toby
+2  A: 

At 4,000 entries, you should just parse it as JSON using the form you suggested:

{
     "2407": "ELVERUM",
     "2425": "TRYSIL"
}

If you are planning to search by looking for the exact match of a zipcode, this will also give you the fastest search time. If you do something where the user types "24" and you need to find all zipcodes that begin with "24", then you will need something a little more advanced.
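
For example, "a little more advanced" could be as simple as scanning the keys of the parsed object for a prefix (a sketch; findByPrefix is a hypothetical helper, and zips is the parsed object from the snippets below):

function findByPrefix(zips, prefix) {
    var hits = [];
    for (var code in zips) {
        if (code.indexOf(prefix) === 0) {
            hits.push([code, zips[code]]);   // collect [zip, city] pairs
        }
    }
    return hits;
}

findByPrefix(zips, "24") would then return every entry whose zip-code starts with "24".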

I'm not sure what mechanisms jQuery provides for parsing JSON. The way it is typically done is to use eval:

var zips = eval("(" + data + ")");

Or, on modern browsers, you can use the faster and safer native JSON object:

var zips = JSON.parse(data);
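
Putting that together with jQuery, one way is to fetch the file as plain text and parse it once (a sketch; data.json is an assumed file name):

var zips = {};
$.get("data.json", function(raw) {
    // raw is the file contents as a string; parse it once and keep the result
    zips = JSON.parse(raw);   // zips["2407"] would then be "ELVERUM"
}, "text");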
Kelly Norton
Ok, thanks, but any idea how I can convert the CSV to JSON? How can I search through the parsed JSON?
mofle
It needs to be noted that either way you have to convert CSV to JS data. If you do CSV -> JSON -> JS data, realize you are adding extra processing time on the server so that the client gets JSON, which requires minimal conversion overhead on the client. If you want to use this method, you should convert the CSV to JSON on the server side, with something like this: http://tamlyn.org/2009/06/csv2json-convert-csv-to-json/. Otherwise it would only make sense to parse the CSV directly; you could use one of the jQuery CSV plugins: http://code.google.com/p/js-tables/wiki/CSV
So, if you want to go down this route, it depends on the server-side language used. For example, if it is PHP you will want to use fgetcsv() and then json_encode() it. Or, if you want to store it as a static file, you can use the tamlyn.org link above and save the output as the JSON file you want.
+2  A: 

This is a web page that will convert your CSV to JSON from a URL. You can use it locally on your computer. It uses jQuery and the CSV and JSON plug-ins. Note: this script is a quick hack specific to the CSV given.

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"&gt;&lt;/script&gt;
<script src="http://js-tables.googlecode.com/svn/trunk/jquery.csv.min.js"&gt;&lt;/script&gt;
<script src="http://jquery-json.googlecode.com/files/jquery.json-1.3.min.js"&gt;&lt;/script&gt;
<script>
jQuery(function($){

    $('#conv').click(function(){
        $.get($('#myurl').val(), function(data){
            var csvobj = {};
            // parse the semicolon-separated CSV into an array of [zip, city] rows
            var csvray = $.csv(';')(data);
            $(csvray).each(function(){
                csvobj[this[0]] = this[1];
            });
            // emit the result as "areacodes={...}", ready to be saved as data.json
            $('#jsondata').val( "areacodes=" + $.toJSON(csvobj) );
        });
    });

});
</script>
Url to CSV: <input type="text" id="myurl" value="tilbud5.csv" />
<input type="button" id="conv" value="convert url to json" />
<br/>
<textarea id="jsondata" rows="1000" cols="100"></textarea>

Using the JSON data, here is just an example:

$('#postnummer').keyup(function(e) { 
    if($(this).val().length == 4) { 
        alert(areacodes[$(this).val()]);
    } 
});

$.getJSON("data.json?callback=?");
Great, this was exactly what I was looking for. One problem though: when I load the test page with your code, the browser locks up for about 5 seconds while loading/parsing the JSON file. Any solution to this? It's kinda critical, since nobody wants a freezing browser ;)
mofle
Put areacodes= in front of the JSON data, e.g. areacodes={"2407":"ELVERUM"}. Then use <script src="data.json"></script>. Or you can use $.getJSON("data.json?callback=?");
You will also want to remove var areacodes = {}; if you do it this way.
Thanks, that did it ;)
mofle