A limit on the number of matches in the PHP SphinxAPI

The function SetLimits ( $offset, $limit, $max=0, $cutoff=0 ) imposes a limit of $max=1000 on the size of the returned record set:
$sphinx->SetLimits (0, self::LIMIT_A, 1000 );
If you increase this value, either in the call or in sphinxapi.php itself, the Sphinx daemon returns an error:
Query failed: searchd error: per-query max_matches=10000 out of bounds (per-server max_matches=1000).
How can I get the next 1000 values? The code below does not work:
$sphinx->SetLimits (1000, self::LIMIT_A, 1000 );

3 Answers

Plenty of people don't mind downvoting, but an actual answer, as you can see, is apparently too much to ask...
Looking up the Sphinx documentation apparently takes a lot of brains.
But there were enough to write the question :D
You need to increase the limit not in a constant and not in sphinxapi.php, but in your Sphinx config.

www.sphinxsearch.com/docs/current.html#api-func-setlimits — the SetLimits documentation, which explains max_matches
www.sphinxsearch.com/docs/current.html#conf-max-matches — the max_matches config directive

In the config, just set max_matches to the desired value.
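For illustration, a minimal sketch of what that looks like (the value 20000 is an arbitrary assumption, not a recommendation, and the rest of the searchd section is omitted):

searchd
{
    # ... other searchd settings ...
    max_matches = 20000   # per-server cap; per-query max_matches may not exceed this
}

After restarting searchd, the client may then pass a per-query max up to that value:

$sphinx->SetLimits ( 0, 1000, 20000 );   // offset, limit, per-query max_matches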
Maybe this will be useful to someone. It's not the best solution, but:
my client had a requirement to display/save all the search results.

Stepping through the results page by page takes a long time: fetching 19567 results in steps of 1000 took me about 6-7 seconds.
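(For reference, such page-by-page fetching might look like the sketch below; the variable names and step size are assumptions, and it only works once max_matches in the config covers the full range, as the other answer notes.)

// hypothetical paging loop, fetching 1000 records at a time
$step = 1000;
$all  = array();
for ( $offset = 0; ; $offset += $step ) {
    // the per-query max must cover offset + limit and stay within the server cap
    $sphinx->SetLimits ( $offset, $step, $offset + $step );
    $res = $sphinx->Query ( $Query, 'index_name' );
    if ( empty($res['matches']) )
        break;
    $all += $res['matches'];   // SphinxAPI keys 'matches' by document ID
    if ( $offset + $step >= $res['total_found'] )
        break;
}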

I solved the task like this:

// the listing is schematic, since these are pieces torn out of the middle of the code, but you get the idea

// pre-query: find out how many records were found for the key/other parameters
$sphinx->SetLimits( 0, 1, 1 );
$pre_res = $sphinx->Query($Query, 'index_name');

// set limits to cover the entire range of matched records at once
$sphinx->SetLimits( 0, $pre_res['total_found'], $pre_res['total_found'] );
$res = $sphinx->Query($Query, 'index_name');

This method selects 40728 records in roughly 2 seconds. Of course, the time is approximate and depends on a number of factors.
It's not the best solution, but in my case I only needed to collect the IDs of the found records and save them to the database for further work. I haven't found a prettier way, only this "axe" :( .. but the "axe" copes with the task reasonably well ;)
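A minimal sketch of that last step (database saving omitted; the result layout is the standard SphinxAPI one, where 'matches' is keyed by document ID):

// collect the IDs of the found documents
$ids = array();
if ( !empty($res['matches']) )
    $ids = array_keys ( $res['matches'] );
// ... save $ids to the database for further work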