How to protect a DB with critical data from the tyranny of slow queries?

0 like 0 dislike
25 views
Situation: one server runs both the webserver and MySQL. MySQL is used, first, by a PHP script running under Apache on this same server and, second, by remote users over TCP. Both work with the same database. The performance of the "local Apache + MySQL" link is critical; "remote users + MySQL" is not.


A "remote" user launches a clumsy query, for example a REGEXP SELECT on a non-indexed column of a 20-million-row table. For the 3-5 minutes it runs, it drags down all other queries to this database, which under normal conditions fly. As a result, the critical web part stops responding at an acceptable speed. How can remote users be allowed to fire off their junk queries without harming the local connections to the database? The "local" part and the remote users connect to the database as different MySQL users. Splitting the database across two servers is not an option. There is spare capacity on the server (8 cores, 24 GB of RAM).
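One stock mitigation worth noting for the situation above (my addition, assuming a reasonably recent server): since MySQL 5.7.8 a SELECT can be capped with the MAX_EXECUTION_TIME optimizer hint (in milliseconds), so a runaway remote scan is aborted by the server instead of starving the web workload. A minimal sketch of a wrapper that injects the hint; the helper name is hypothetical:

```python
import re

def add_time_cap(query: str, ms: int) -> str:
    """Inject a MAX_EXECUTION_TIME optimizer hint (MySQL 5.7.8+) right
    after the leading SELECT keyword; the server then aborts the
    statement once it has run for `ms` milliseconds."""
    # Only SELECT statements support this hint; leave others untouched.
    return re.sub(r"^\s*SELECT\b",
                  lambda m: m.group(0) + " /*+ MAX_EXECUTION_TIME(%d) */" % ms,
                  query, count=1, flags=re.IGNORECASE)

print(add_time_cap("SELECT * FROM big WHERE col REGEXP 'x'", 30000))
# SELECT /*+ MAX_EXECUTION_TIME(30000) */ * FROM big WHERE col REGEXP 'x'
```

The same cap can be set connection-wide with `SET SESSION max_execution_time = 30000;` for the remote accounts, leaving the local PHP connections unlimited.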

7 Answers

0 like 0 dislike
Move the reads of the remote users to a slave server (it can live on the same piece of iron, just on a separate HDD), and keep writes on the master. Split reads and writes with MySQL Proxy: forge.mysql.com/wiki/MySQL_Proxy_RW_Splitting.
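A minimal sketch of the replication side of this setup, assuming a second mysqld instance on the same box (server ids and paths are placeholders, not from the answer):

```ini
# master instance (the one Apache writes to) -- my.cnf
[mysqld]
server-id = 1
log-bin   = mysql-bin

# slave instance (the one remote users read from), in its own my.cnf
[mysqld]
server-id = 2
relay-log = relay-bin
read_only = ON
datadir   = /data/slave    # separate disk, as the answer suggests
```

The slave is then pointed at the master with `CHANGE MASTER TO ...` and started with `START SLAVE`; the remote accounts connect only to the slave's port.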
0 like 0 dislike
You can configure replication (the question does not say whether the remote clients modify the DB): let the master serve the local server, and point the remote clients and the backup jobs at the slave. Backing up tables of over 1 million rows, for example, is also best done against the slave server.
0 like 0 dislike
Write a daemon that, at the desired frequency, checks the remote user's current MySQL queries and kills those that have been running longer than some threshold.
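A sketch of the core of such a daemon, assuming it polls INFORMATION_SCHEMA.PROCESSLIST; the function name, account names, and threshold are my own, and the connection/KILL plumbing is left as comments:

```python
def queries_to_kill(processlist, user, max_seconds):
    """Pick the thread ids of `user`'s queries running longer than
    `max_seconds`.  `processlist` is a list of dicts with the columns
    the daemon would fetch via:
      SELECT ID, USER, COMMAND, TIME FROM INFORMATION_SCHEMA.PROCESSLIST;
    For each returned id the daemon would then issue:  KILL QUERY <id>;
    """
    return [row["ID"] for row in processlist
            if row["USER"] == user
            and row["COMMAND"] == "Query"       # skip idle (Sleep) threads
            and row["TIME"] > max_seconds]

# Example snapshot: the remote user's REGEXP scan has run for 240 s.
snapshot = [
    {"ID": 11, "USER": "web_local",   "COMMAND": "Query", "TIME": 0},
    {"ID": 12, "USER": "remote_user", "COMMAND": "Query", "TIME": 240},
    {"ID": 13, "USER": "remote_user", "COMMAND": "Sleep", "TIME": 900},
]
print(queries_to_kill(snapshot, "remote_user", max_seconds=60))  # [12]
```

pt-kill from Percona Toolkit implements exactly this pattern (with matching by user, time, and query text) if you would rather not maintain your own daemon.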
0 like 0 dislike
"Splitting the database across two servers is not an option"? Why not, when you have already been advised to set up a separate slave? Is a master-slave setup really not in your plans? It improves not only performance but also reliability. Yes, you will have to design it and solve the synchronization problems, but the result may exceed your expectations.
0 like 0 dislike
Use Sphinx.
0 like 0 dislike
On the hardware side, fast SCSI/SAS drives can help; on the software side, tuning the sort and random-read buffers (sort_buffer_size, read_rnd_buffer_size and the like).
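In my.cnf terms, assuming the answer means the per-session sort and random-read buffers (the values below are illustrative, not a recommendation):

```ini
[mysqld]
sort_buffer_size     = 4M   # per-session buffer for ORDER BY / GROUP BY sorts
read_rnd_buffer_size = 2M   # per-session buffer for reading rows after a sort
```

Note that these are allocated per connection, so oversizing them on a server with many remote clients can backfire.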
0 like 0 dislike
As an enterprise person, it generally seems wild to me to give users who are not developers or support engineers, and who don't know SQL or databases in general, direct SQL-level access to a large, loaded database with critical data. (By the way, what do you mean by access via TCP? The ability to run queries manually from a MySQL client, something like SQLyog?)

Do your users really need to run their own poorly written queries? Can't they make do with a set of canned report queries or something like that? Can't they hand the queries they want to run to a competent dedicated person who will review and run them? And how do you guard against accidental data deletion and the like? Do the users have read-only privileges on all database objects?

If it really is critical for you to give users such access, then, as already said here, set up a slave server with replication, or configure per-user resource quotas on CPU, IO bandwidth, and memory (for the accounts under which your users connect to the DB). I'm not strong in MySQL, so I don't know how it quotas resources.
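On the quota question: MySQL has no built-in per-user CPU, IO, or memory quotas (8.0's resource groups cover only CPU, and per thread group rather than per user). The closest stock knobs are per-account rate limits. A hedged sketch that only formats the statement (account name, host, and numbers are placeholders):

```python
def account_limit_sql(user, host, queries_per_hour, user_connections):
    """Build the MySQL per-account limit statement.  Uses the
    ALTER USER ... WITH syntax of MySQL 5.7+; older servers set the
    same limits via GRANT ... WITH.  These are counters, not true
    CPU/IO quotas."""
    return ("ALTER USER '%s'@'%s' WITH "
            "MAX_QUERIES_PER_HOUR %d MAX_USER_CONNECTIONS %d;"
            % (user, host, queries_per_hour, user_connections))

print(account_limit_sql("remote_user", "%", 500, 3))
# ALTER USER 'remote_user'@'%' WITH MAX_QUERIES_PER_HOUR 500 MAX_USER_CONNECTIONS 3;
```

MAX_UPDATES_PER_HOUR and MAX_CONNECTIONS_PER_HOUR are available in the same clause if writes or logins also need capping.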
