Imagine a PHP search function using MySQL full-text search. If the search gets many hits, it's preferable to spread them across several pages. What is the best way to handle this?
Of course it's possible to run the search again when the user goes to the next page, but isn't that inefficient? Does MySQL cache recent searches so the second search would run quickly, or do I have to implement some kind of cache myself? Or is that just a bad idea?
What PHP + MySQL do behind the scenes is very fast. Even with a million rows in your table, you could run the query ten times in a loop just for fun and return only the last result to the browser.
What actually makes it slow is the web server transferring the plain text back to the browser.
Yes, MySQL does have a query cache, but it's of limited use in most situations. It only serves queries that are byte-for-byte identical (the match is case-sensitive): if millions of clients send the exact same query, MySQL skips executing it again and returns the previous result, provided the table hasn't changed since then.
In your case, implement paging: every query should use LIMIT, so you fetch only as many rows as your lower and upper bounds allow. The way you describe the problem, it sounds like you don't use LIMIT at all, and your script filters the full result set to split it across pages. That's a no-no: your script engine won't have enough allocated memory to hold the whole result.
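A minimal sketch of the paging arithmetic described above. The page size, page number, and the SQL shown in the comments are illustrative assumptions, not anything prescribed by the original answer:

```python
def page_bounds(page, per_page=10):
    """Return (offset, limit) for a 1-indexed page number.

    These two numbers would be interpolated into a query such as
    (hypothetical example):
        SELECT ... WHERE MATCH(body) AGAINST ('term') LIMIT <offset>, <limit>
    so MySQL returns only the rows for the requested page.
    """
    if page < 1:
        raise ValueError("page numbers start at 1")
    return ((page - 1) * per_page, per_page)

# Third page, 10 rows per page: skip 20 rows, fetch 10.
offset, limit = page_bounds(3)
print(offset, limit)  # 20 10
```

The key point is that the slicing happens inside MySQL via LIMIT, not in the script after fetching everything.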
MySQL queries are slow. Not as slow as the browser, but if you run many queries against huge tables it can take seconds.
But if I do a full-text search, MySQL will still have to search everything even if I use LIMIT, right? And another problem with LIMIT: if I use it, how will I know how many extra pages there are?
The first query should use COUNT(*), so it doesn't use up resources returning rows; it just tells you how many matches there are, and therefore how many pages you need.
The second query should use LIMIT; otherwise it will return all the matching rows.
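The two-query approach above can be sketched as follows. The page size and the SQL in the comments are assumptions for illustration; only the COUNT-then-LIMIT pattern comes from the answer itself:

```python
import math

def total_pages(total_rows, per_page=10):
    """Pages needed to show total_rows results, at least 1.

    total_rows would come from the first query, e.g. (hypothetical):
        SELECT COUNT(*) FROM articles WHERE MATCH(body) AGAINST ('term')
    The second query then fetches one page at a time:
        SELECT ... LIMIT <offset>, <per_page>
    """
    return max(1, math.ceil(total_rows / per_page))

print(total_pages(95))   # 10
print(total_pages(101))  # 11
```

Rounding up means a final partial page (e.g. 5 leftover rows) still gets its own page link.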