Pagination for multiple items
Let's say I have 100 000 rows of data in my database and a REST API to query them. Obviously I can't fetch all 100 000 rows at once, so ideally I'd show around 25 rows per page in my UI, 25 on the next page, and so on. Question is, how do I do this efficiently? How do I request the right amount from the server and so on
Thanks
56 Replies
Check out serverside pagination
You can either do it using cursors or offsets
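to make the offset/cursor distinction concrete, here's a sketch in plain JS over an in-memory array (in SQL the equivalents would be `LIMIT`/`OFFSET` vs a keyset `WHERE id > ? LIMIT ?`; the names here are just for illustration):

```javascript
// 100 fake rows standing in for the database table
const rows = Array.from({ length: 100 }, (_, i) => ({ id: i + 1 }));

// Offset pagination: page number + page size -> skip/take
function offsetPage(page, pageSize) {
  const start = (page - 1) * pageSize;
  return rows.slice(start, start + pageSize);
}

// Cursor pagination: "give me pageSize rows after this id"
function cursorPage(afterId, pageSize) {
  const items = rows.filter((r) => r.id > afterId).slice(0, pageSize);
  // next cursor is the last id we returned (null when exhausted)
  const nextCursor = items.length ? items[items.length - 1].id : null;
  return { items, nextCursor };
}

console.log(offsetPage(2, 25).map((r) => r.id)); // ids 26..50
console.log(cursorPage(50, 25).nextCursor); // 75
```

offsets are simpler to reason about; cursors stay stable when rows are inserted/deleted between requests, which is why infinite-scroll UIs usually prefer them.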
using react query you have
useInfiniteQuery
. trpc has some examples of this (the pattern is the same even if you're on REST):
"normal": https://nextjs.trpc.io (src https://github.com/trpc/examples-next-prisma-starter)
with RSC: https://rsc.trpc.io
Interesting, where are the infinite posts saved?
you mean db? yea postgres + prisma https://github.com/trpc/next-13/blob/a5762c93c297193253de6f34a8c06e61ac06ff82/server/routers/post.ts#L43-L57
No I meant on the client
when you load more the new ones get put into the querycache
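not React Query's actual internals, but the shape of what `useInfiniteQuery` caches is roughly this — each fetch appends a page under the query key, and the UI renders the flattened list:

```javascript
// Minimal model of an infinite-query cache (query key -> list of pages)
const cache = new Map();

function addPage(queryKey, page) {
  const entry = cache.get(queryKey) ?? { pages: [] };
  entry.pages.push(page); // each "load more" appends one fetched page
  cache.set(queryKey, entry);
  return entry;
}

addPage("posts", { items: ["a", "b"], nextCursor: 2 });
addPage("posts", { items: ["c", "d"], nextCursor: null });

// the UI renders all cached pages flattened into one list
const allItems = cache.get("posts").pages.flatMap((p) => p.items);
console.log(allItems); // ["a", "b", "c", "d"]
```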
So RQ handles caching
yea
Okay maybe this is enough for me to go on
The term I mentioned is what that query is doing 🥺
So by serverside pagination the endpoint would be something like /myendpoint&pages=1
Or like /myendpoint&entries=25
Or..?
yea exactly.
?page=5&page_size=20
i think are more common params, or skip/take, first/after etc
the question mark probably is the divider here?
so if I do const { page, page_size } = params; it knows to split from the question mark?
and julius, https://rsc.trpc.io/, this page queries like this: ?page=5&page_size=20, or something along those lines
I'm assuming when you scroll to the bottom, the params go up one
and it fetches more
yea, query param parsing can prob be a bit different in different frameworks though
I'm using node here
an express app
yea so like this https://masteringjs.io/tutorials/express/query-parameters
Mastering JS
Query Parameters in Express
Express automatically parses the URL query string and stores the parsed parameters in
req.query
. Here's what you need to know.
yea that's the gist
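yep — express splits at the `?` for you and puts the parsed params on `req.query` as strings. you can see the same thing with Node's built-in `URLSearchParams`; the defaults and the cap below are just my suggestion, not anything express does for you:

```javascript
// Roughly what express does with the query string, plus coercion/clamping
// (req.query values are always strings, so parse them yourself)
function parseListParams(url) {
  const query = new URLSearchParams(url.split("?")[1] ?? "");
  const page = Math.max(1, parseInt(query.get("page") ?? "1", 10) || 1);
  const pageSize = Math.min(
    100, // cap so a client can't request everything at once
    Math.max(1, parseInt(query.get("page_size") ?? "25", 10) || 25)
  );
  return { page, pageSize };
}

console.log(parseListParams("/myendpoint?page=5&page_size=20"));
// { page: 5, pageSize: 20 }
console.log(parseListParams("/myendpoint"));
// defaults: { page: 1, pageSize: 25 }
```

in an express route you'd do the same with `req.query.page` and `req.query.page_size`, then turn them into `LIMIT pageSize OFFSET (page - 1) * pageSize`.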
Okay cool, a follow-up question to my issue. I read the 100k rows from a local csv file, is there a quick and easy way to check if my database is initialized (so the file has already been read once) so I don't have to read it again on every npm start
First thought is to check if database.db file exists
you can use the csv as your db xd
nah not an option
aight, well you can have a seed script or smth which loads the csv into your db
what's a seed script
Database seeding
Database seeding is populating a database with an initial set of data. It's common to load seed data such as initial user accounts or dummy data upon initial setup of an application.
I'm loading the csv into my db in any case
Are you using any database software?
yes that is what I need but that script cannot be run if the db already has the rows
Usually they'll have import functionality and you can just give it ur csv then it'll do its thing
I'm using sqlite
Idk the right term for it but something like datagrip
Like an SQL ide
the csv lines won't go as they are to the db
so can't really do it like that
Yeah it has functionality for that too
You can map it to columns based on the column in the csv
You can also just seed it with a portion of the data instead of all 100k of it
If you don't want to persist/having trouble persisting it
I'd like the data to persist in the db but I want to avoid reading it more than once if the DB exists
if you know that if row1 exist then row 100k exist you can just do a single read and run seed script if it comes back empty
on application launch
if there could be "holes" it's trickier
that's what chatgpt told me but I'm thinking there's got to be something better, but I guess I can do it like that
So why do you have to keep re-seeding the database?
or just set up the seed script so that it stops if it finds duplicates
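sqlite actually has this built in: `INSERT OR IGNORE` skips rows whose primary/unique key already exists, so re-running the seed is harmless. the same idea modeled with a Map (the row shape is just an example):

```javascript
// Models SQLite's INSERT OR IGNORE: rows keyed by primary key,
// so a re-run of the seed inserts nothing new.
const table = new Map();

function seed(rows) {
  let inserted = 0;
  for (const row of rows) {
    if (!table.has(row.id)) { // INSERT OR IGNORE keeps the existing row
      table.set(row.id, row);
      inserted++;
    }
  }
  return inserted;
}

const csvRows = [{ id: 1, name: "a" }, { id: 2, name: "b" }];
console.log(seed(csvRows)); // 2 — first run inserts everything
console.log(seed(csvRows)); // 0 — second run is a no-op
```

the catch vs the count check: this still iterates and validates all 100k lines on every start, it just doesn't duplicate them, so the count-first approach is cheaper for your case.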
Shouldn't it already be seeded since it ran once
I'm just curious
I guess you're missing my point I only need to seed it once
I'm just confused why it's a difficult thing, is all. As nice as that can be
what I want is that if the 100k rows aren't in the db, then insert them, otherwise just continue
then just
pnpm db-seed
once and the rest of the time pnpm start
?
I want that last message to come across as nice as possible
I'm not trying to be antagonistic
I'm just wondering how to perform the check that the seed has been done
it's a bit hard to follow what you want to do. you say that you need a single seed but still want to check if it exists in runtime?
maybe I'm explaining it horribly, let me rephrase
cause ideally you'd just have a
load-csv
/seed
script you can run once (or whenever you have reset the db for some reason)
so I have 100k csv lines. I need to check that these are okay, validate them etc, then add them to my database. in my scenario, these lines are only iterated and validated if they are not already in the database. meaning this csv file is checked once and then not checked again UNTIL the database is permanently deleted and the script is restarted.
then the rest of the times you just start the app and the db will be seeded from before
yep exactly like that
I run my script for the 1st time, seed happens. I close my app
then I run my app again, it's already seeded and it uses the same db, no need to seed
so ideally my run command is actually "command seed && command run app"
yep so
yes, but ideally I just have npm run which is actually npm seed && npm run app, and the npm seed command checks if seed is needed.
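one way to wire that up is in package.json scripts — the file names here are just examples, and seed.js would be the script that exits early when the db is already populated:

```json
{
  "scripts": {
    "seed": "node scripts/seed.js",
    "start": "npm run seed && node server.js"
  }
}
```

then `npm start` always runs the seed first, and the seed itself is the thing that decides whether there's anything to do.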
setting up environments is really a separate thing entirely and your app shouldn't really be responsible for it
on the topic of npm and pnpm, what's the pros and cons of pnpm?
agreed
Having to query once to check and then maybe do it is probably what you're gonna have to do, if you want to be able to seed every time or handle duplicates in ur seeding query
if you want that then just read if the file exists, or try read a single line from the db
yep