javascript - Paginating through a CRUD API
I'm writing a client to query a CRUD web API, using socket.io.get('/api').
The problem is: I want to paginate the results, so that I can start displaying data while the client is still receiving it.
The results from the API come as JSON, like:

[
  { "id": "216754", "date": "2015-07-30T02:00:00.000Z" },
  { "id": "216755", "date": "2015-08-30T02:00:00.000Z" }
]

The API lets me construct a URL query that limits the size of each result array. For example, I can make the query /api?skip=10&limit=10, which gives me the results from item 10 to item 19. I want to be able to keep looping and receiving results until the results array has a length of less than 10 (which would mean I've reached the end of the dataset). This needs to be asynchronous, so that I can start working on the data right from the start and update whatever work I've done each time a new page is received.
Is infinite scroll what you're trying to do? Or do you want to call the pages asynchronously and be able to receive page 3 before page 2? Reading your question, I understand it's the second.
In that case you can't rely on "until the results array has a length of less than 10", since you want to launch all the calls at the same time.
You should make a first query that retrieves the number of records. Then you'll know how many pages there are, can generate the URLs you need, and can call them asynchronously.
It would look like this (code not tested):
var nbItemsPerPage = 10;

socket.io.get(
  '/api/count', // <= you have to code a controller that returns the count
  function (resCount) {
    var nbPages = Math.ceil(resCount / nbItemsPerPage);
    for (var i = 0; i < nbPages; i++) {
      // JavaScript runs through the loop without waiting for the responses
      (function (pageNum) {
        socket.io.get(
          '/api',
          { skip: nbItemsPerPage * pageNum, limit: nbItemsPerPage },
          function (resGet) {
            console.log('Result of page ' + pageNum + ' received');
            console.log(resGet);
          }
        );
      })(i); // <= immediately-invoked function, passing "i" as an argument.
      // Indeed, by the time the callback runs, the value of "i" would have changed;
      // the immediately-invoked function creates a new scope that stores pageNum for every iteration.
    }
  }
);
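For the /api/count endpoint mentioned in the comment above, a minimal sketch could look like the following, assuming the API is a Sails.js blueprint-style API backed by a Waterline model. The Item model, the count action and the custom route are hypothetical names, not something from your existing API:

// api/controllers/ItemController.js (hypothetical controller and model names)
module.exports = {
  // Returns the total number of records as a plain number
  count: function (req, res) {
    Item.count().exec(function (err, total) {
      if (err) return res.serverError(err);
      return res.json(total);
    });
  }
};

// config/routes.js — expose the action on a custom route:
// 'get /api/count': 'ItemController.count'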
On the other hand, if you're trying to achieve an infinite scroll page, you have to load page n+1 only after you've received the content of page n, and in that case you can rely on results.length < 10.
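A minimal sketch of that sequential approach (not tested), reusing the same socket.io.get call shape as the code above: each page is handed to a callback as soon as it arrives, and the next page is requested only if the current one came back full.

var nbItemsPerPage = 10;

function loadPage(pageNum, onPage) {
  socket.io.get(
    '/api',
    { skip: nbItemsPerPage * pageNum, limit: nbItemsPerPage },
    function (results) {
      onPage(results, pageNum);          // work on this page right away
      if (results.length === nbItemsPerPage) {
        loadPage(pageNum + 1, onPage);   // a full page: there may be more records
      }
      // a short (or empty) page means the end of the dataset has been reached
    }
  );
}

loadPage(0, function (results, pageNum) {
  console.log('page ' + pageNum + ':', results);
});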