I thought I'd share a technique for filtering large sets of data quickly in a web page. The scenario: a biggish data set of 100,000 records, each with three properties, which we want to filter by text matching.
Here is the finished article with dummy generated data. Give it a try. Just type any 3 or 4 characters and watch it do its thing.
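The original generator isn't shown, so here's a rough sketch of how dummy data like that might be produced. The property names (`name`, `city`, `job`) and the value patterns are my assumptions, not the demo's actual fields:

```javascript
// Generate `count` dummy records, each with three string properties.
// Property names and values are placeholders for illustration only.
function makeDummyData(count = 100000) {
  const data = [];
  for (let i = 0; i < count; i++) {
    data.push({
      name: 'name' + i,
      city: 'city' + (i % 500), // repeat values so searches get multiple hits
      job: 'job' + (i % 50),
    });
  }
  return data;
}
```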
The first step is to set up a function to convert the data array into an HTML list. We keep this fast by setting a limit on how many items we will show. In this demo I've set it to 10, but even increasing it to 100 makes little difference to the speed. The key thing is not to get into big numbers which put a strain on the DOM. So we don't loop through all the items, only the first few. A nice small loop, which runs quickly.
Once we have the filtered-down array, we can call our function to rebuild the HTML list. It all happens in the blink of an eye.
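The filtering itself can be a plain, case-insensitive substring match across the three properties. This is a sketch under the same assumed property names (`name`, `city`, `job`), not the demo's exact code:

```javascript
// Return only the records where any of the three fields
// contains `query`, ignoring case.
function filterRecords(records, query) {
  const q = query.toLowerCase();
  return records.filter(r =>
    r.name.toLowerCase().includes(q) ||
    r.city.toLowerCase().includes(q) ||
    r.job.toLowerCase().includes(q)
  );
}

// Wired to a text input, each keystroke filters and then rebuilds
// the list from the reduced array, as described above:
// input.addEventListener('input', e => {
//   const filtered = filterRecords(data, e.target.value);
//   // ...rebuild the HTML list from `filtered`...
// });
```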
As a final touch I've added a count message function which gets called each time we filter. This just helps to see what's going on - how many records are being shown and the size of the reduced data set.
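A count message like that can be a one-line formatter fed the three numbers each time we filter. The wording and the element id here are my own placeholders:

```javascript
// Summarise the current state of the filter:
// how many items are shown, how many matched, and the full data size.
function countMessage(shownCount, matchCount, total) {
  return `Showing ${shownCount} of ${matchCount} matching records (${total} total)`;
}

// Called after each filter pass, e.g.:
// document.querySelector('#count').textContent =
//   countMessage(Math.min(filtered.length, 10), filtered.length, data.length);
```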