Fastest Way to search for something in big data
I have a lot of data and I need to search in it and get the response as fast as possible, for example:
if you search for
Sou
you should get:
Even though I know how to make this, the list of data is huge, so I am looking for the fastest way to do it. The list is originally stored in a txt or json file and loaded from that file once, so no need for optimization there.
It goes into a data structure that I haven't decided on yet (I need the best one for this), and then I will need to search for stuff; the searching will happen often.
Let me know if you need any more info.
⌛
This post has been reserved for your question.
Hey @MoonSouhayl! Please use /close or the Close Post button above when your problem is solved. Please remember to follow the help guidelines. This post will be automatically closed after 300 minutes of inactivity.
TIP: Narrow down your issue to simple and precise questions to maximize the chance that others will reply in here.
Is it only full-text search?
Solr and ElasticSearch are pretty good for that kind of thing.
Is it just prefix search? If so, you can store it as a tree
character by character
ye
or alternatively sort the whole thing once if you can afford that
ye i can do that
sort the whole thing and then do a binary-search-like lookup
except that at the end, instead of stopping at a single element, you collect the surrounding elements that share the prefix
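rough sketch of that in Python, assuming your entries are plain strings (the word list here is just made-up example data):
```python
import bisect

def prefix_search(sorted_words, prefix):
    """Return every entry in an already-sorted list that starts with `prefix`."""
    # Binary search for the first position where a match could appear.
    start = bisect.bisect_left(sorted_words, prefix)
    matches = []
    for word in sorted_words[start:]:
        if not word.startswith(prefix):
            break  # sorted order means no matches can appear after this point
        matches.append(word)
    return matches

# Hypothetical data, sorted once after loading from the txt/json file.
words = sorted(["Souhayl", "Sound", "Soup", "South", "Tea"])
print(prefix_search(words, "Sou"))  # ['Souhayl', 'Sound', 'Soup', 'South']
```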
or make something like a prefix tree (a trie)
Alright
Thank you so much
if you have it like that, you can do a very efficient prefix search
but if you just have it sorted, you can search it pretty fast as well
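here's a minimal sketch of the prefix tree idea in Python as well, same made-up word list; just an illustration of the shape, not tuned for huge data:
```python
class TrieNode:
    def __init__(self):
        self.children = {}   # char -> TrieNode
        self.is_word = False

class Trie:
    def __init__(self, words=()):
        self.root = TrieNode()
        for w in words:
            self.insert(w)

    def insert(self, word):
        # Walk/create one node per character, then mark the end of a word.
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def starts_with(self, prefix):
        # Walk down the prefix; if any character is missing there are no matches.
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return []
            node = node.children[ch]
        # Collect every complete word below this node.
        results = []
        stack = [(node, prefix)]
        while stack:
            cur, path = stack.pop()
            if cur.is_word:
                results.append(path)
            for ch, child in cur.children.items():
                stack.append((child, path + ch))
        return results

trie = Trie(["Souhayl", "Sound", "Soup", "South", "Tea"])
print(trie.starts_with("Sou"))  # the four "Sou" words, in no particular order
```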
💤
Post marked as dormant
This post has been inactive for over 300 minutes, thus, it has been archived.
If your question was not answered yet, feel free to re-open this post or create a new one.
In case your post is not getting any attention, you can try to use /help ping.
Warning: abusing this will result in moderative actions taken against you.