View Full Version : Too much data

08-23-2006, 08:33 PM
I have a database with currently over 2 million records and we're starting to run into problems. Obviously queries on that table are starting to slow down - we had a client today who wanted to upload 12 million offers, which we had to say no to.
The issue we have is that on the first page you fill in some search criteria, and then on the second page the options you're shown are narrowed down by what you chose on the first page. E.g. you want to go to the Med for 2 weeks in September: the list of available departures and board types is based on what we have available for 2 weeks in September in the Med. As we get more offers, this search takes longer - the more we grow, the worse the performance.

I just really don't know what to do. I've optimised the queries the best I can. They come up in EXPLAIN with range and eq_ref as the access types and the extras all look fine, but 0.25s is too long. With 2 of those queries on the page we can only serve 2 users per second - 120 per minute (double with 2 CPUs) - before we start to get problems. My boss wants every page coming in at 1/4 of a second, which isn't even remotely feasible, but we have to do what we can.
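One common way to handle this kind of "narrow the options on page two" search is to roll the big offers table up into a small precomputed summary table, refreshed when offers are loaded rather than on every search. Below is a minimal sketch of the idea, using Python's sqlite3 in place of MySQL; the table and column names (offers, destination, departure, board_type) are assumptions for illustration, not the poster's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in for the large offers table (millions of rows in practice).
cur.execute("""CREATE TABLE offers (
    destination TEXT, departure TEXT, board_type TEXT, price REAL)""")
cur.executemany(
    "INSERT INTO offers VALUES (?, ?, ?, ?)",
    [("med", "2006-09-04", "half board", 499.0),
     ("med", "2006-09-11", "all inclusive", 699.0),
     ("med", "2006-09-11", "half board", 549.0),
     ("caribbean", "2006-09-04", "all inclusive", 999.0)])

# Aggregate once, e.g. after each batch upload, into a summary table
# that is tiny compared to the raw offers.
cur.execute("""CREATE TABLE offer_summary AS
    SELECT destination, departure, board_type, COUNT(*) AS n
    FROM offers
    GROUP BY destination, departure, board_type""")

# The second search page then reads the small summary instead of
# ranging over millions of offer rows.
rows = cur.execute("""SELECT departure, board_type, n
    FROM offer_summary
    WHERE destination = 'med'
    ORDER BY departure, board_type""").fetchall()
print(rows)
```

The trade-off is staleness: the summary lags the raw data until the next refresh, which is usually acceptable for "what departures/board types exist" style lists.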

Any bright ideas?

08-23-2006, 08:48 PM
I know from that other thread you had a couple weeks ago that you already put indexes on the tables and such.

But what kind of hardware is MySQL running on? At some point the hardware is going to become your limiting factor. I've never used MySQL with that much data before, so I don't know where that point is exactly, but I have with Microsoft SQL Server, and there it eventually came down to upgrading the server.

08-23-2006, 11:30 PM
We've got a pretty decent server - dual 3.2GHz Xeons, 4GB RAM, running Debian Sarge.
I know at some point we'll have to add more servers, but I don't think my boss is ready to take on the expense and inconvenience of load balancing right now.

08-24-2006, 04:23 PM
At the very least, are you limiting the number of rows returned?
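Limiting the result set means the server does a bounded amount of work per request, regardless of how many offers match. A minimal sketch of paging with LIMIT/OFFSET, again using sqlite3 for illustration with an assumed offers table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE offers (id INTEGER PRIMARY KEY, price REAL)")
# 500 dummy offers with prices 100..599.
cur.executemany("INSERT INTO offers (price) VALUES (?)",
                [(p,) for p in range(100, 600)])

# Fetch only one page of results instead of the whole match set;
# combined with ORDER BY on an indexed column, this keeps per-request
# cost roughly constant as the table grows.
page = cur.execute(
    "SELECT id, price FROM offers ORDER BY price LIMIT 10 OFFSET 20"
).fetchall()
print(len(page), page[0])
```

Note that large OFFSET values still force the server to walk past the skipped rows, so keyset pagination (WHERE price > last_seen ... LIMIT 10) scales better for deep pages.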