08-20-2011, 10:43 PM
The title roughly gives away what I'm going to be asking.
Basically, I have a MySQL table in the nested set model, and it holds the categories, subcategories, etc. for a large business directory that's being developed.
I was thinking that, instead of querying the table on every page load (which I'm guessing may cause lag if there are lots of visitors), I could have the database table dumped into a structured XML file. So instead of querying the db all the time, the site just pulls the info from the XML file.
Now I'm wondering if opening an XML file all the time would use fewer resources than querying the db.
Is there any way to test this sort of thing? I want to get it right now, while the site's being made, rather than 'hotfix' it once it's up and running.
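If you want to actually measure it, the usual approach is to time both paths with a pair of timestamps. Here's a rough standalone sketch (Python, just so it runs out of the box with no server; sqlite3 stands in for MySQL, and the table and category names are made up). The same idea works in PHP with microtime(true) around each step:

```python
import sqlite3
import time
import xml.etree.ElementTree as ET

# Build a throwaway table of a few hundred fake categories.
# (sqlite3 stands in for MySQL here; all names are invented.)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE categories (id INTEGER PRIMARY KEY, name TEXT)")
db.executemany("INSERT INTO categories VALUES (?, ?)",
               [(i, "category_%d" % i) for i in range(500)])
db.commit()

# Dump the same data to an XML string, as the proposed cache file would hold.
xml_data = "<categories>%s</categories>" % "".join(
    '<category id="%d">category_%d</category>' % (i, i) for i in range(500))

# Time a single indexed lookup in the DB...
start = time.perf_counter()
row = db.execute("SELECT name FROM categories WHERE id = ?", (250,)).fetchone()
db_time = time.perf_counter() - start

# ...versus parsing the whole XML document and scanning for the same entry.
start = time.perf_counter()
root = ET.fromstring(xml_data)
name = next(c.text for c in root if c.get("id") == "250")
xml_time = time.perf_counter() - start

print(row[0], name)        # both approaches find the same entry
print(db_time, xml_time)   # compare the two timings on your hardware
```

Run it a few times and compare the two numbers on your own hardware rather than trusting anyone's guess.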
08-21-2011, 03:18 AM
Added after response written: Sorry, I rambled a little.
So I guess it probably really matters how big your categories/subcategories table is. How many rows does the table have?
So my answer is purely speculative, and in some ways I am not even sure it matters how big the table is, but I think it will be faster to pull the information from the DB.

The reason being: PHP is stateless, with the exception of sessions. It has no memory of what happened on the last page, so you are either going to have to load the XML file on every page or query the DB. You would then have to parse that XML file line by line until you find the entry you are looking for. Not only is that not very efficient, but I think it will end up requiring more computing power and more bandwidth (although the bandwidth hit will be limited), because you are sending everything on every page load.

The database, on the other hand, is indexed and can return just a single result. It can take shortcuts to find the information it is looking for because it is indexed, and the software is optimized to find information quickly. Parsing an XML file is almost certainly going to be slower. Regardless of how large the table is, and of the type of storage hardware the server uses, you will probably find that the database can return the result in a few thousandths of a second. There is no way you can parse XML faster. Every other step (making the connection to the server, the server response) is shared between the two; the XML scenario has the added disadvantage of having to download everything.
EDIT: There is actually a flaw in that reasoning. You would not have to download the XML file every time; you could read it on the server... but you still need to find the file, open it, read it and close it. Databases work more efficiently by keeping tables cached in memory (correct me if I am wrong). Querying the database will still be faster.
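For what it's worth, you can actually watch the database take those shortcuts. A small sketch (Python's built-in sqlite3 instead of MySQL, just because it needs no server; MySQL's EXPLAIN gives you the same kind of information): asking the planner how it would run each query shows an index lookup versus a full scan, and the full scan is essentially the XML-file situation, i.e. check every entry.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE categories (id INTEGER PRIMARY KEY, name TEXT)")

# Filtering on the indexed column: the planner can jump straight to the row.
# The plan text typically reports SEARCH ... USING INTEGER PRIMARY KEY.
indexed_plan = db.execute(
    "EXPLAIN QUERY PLAN SELECT name FROM categories WHERE id = 42").fetchall()
print(indexed_plan)

# Filtering on an unindexed column forces a full scan of the table,
# which is what parsing a whole XML file amounts to.
# The plan text typically reports SCAN.
scan_plan = db.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM categories WHERE name = 'x'").fetchall()
print(scan_plan)
```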
08-21-2011, 04:05 AM
Cheers for that, there are a few hundred entries in the table. I'll stick to the db for now then.
08-21-2011, 04:59 AM
Cheers... With a few hundred entries I wouldn't worry about one query per page load. If you want to minimize load times, get rid of any unnecessary queries. Remember, that is what the database software is designed to do: store and retrieve data... quickly.
In my experience, it doesn't really seem to be queries that slow things down all that much. I am still trying to figure out what is slowing down one of my scripts, so this is a topic I have spent some time looking at, and it isn't the queries. I put microtime() timers on each step in the script I am looking at. Loops inside loops seem to really slow things down, and possibly large, nested arrays. If anyone can offer more insight into what slows PHP down, I would certainly be interested in reading it.
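The microtime() trick translates to any language. A quick sketch of the kind of timer I mean (Python's time.perf_counter() standing in for microtime(true); the loop body is a made-up example), showing why loops inside loops hurt:

```python
import time

items = list(range(2000))

# Nested loops: every item is compared against every other item, O(n^2).
start = time.perf_counter()
matches = 0
for a in items:
    for b in items:
        if a == b:
            matches += 1
nested_time = time.perf_counter() - start

# Same result from a single pass over the data, O(n).
start = time.perf_counter()
matches_fast = sum(1 for a in items)
flat_time = time.perf_counter() - start

print(matches, matches_fast)    # both count 2000 matches
print(nested_time, flat_time)   # the nested version does ~2000x the work
```

Wrapping each suspect block in a pair of timestamps like this quickly shows you where the time actually goes.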
08-21-2011, 05:04 AM
Hey mate... is Blue Panda yours?