
View Full Version : Last 100 Records Only



ubh
03-23-2009, 03:24 AM
Ok, I have been trying to figure this one out for about an hour now.

I have a data table with an unknown number of records. I want to echo out only the LAST 100 records, but in ASC order rather than DESC.

Am I DOOMED?

This echoes out only the first 100 records:

SELECT * FROM chatroom ORDER BY id ASC LIMIT 100

Whereas this echoes out the last 100, but in the wrong order:

SELECT * FROM chatroom ORDER BY id DESC LIMIT 100

oracleguy
03-23-2009, 05:21 AM
What about?


SELECT * FROM (SELECT * FROM chatroom ORDER BY id DESC LIMIT 100) ORDER BY id ASC

It might not be the most elegant solution but it should work (I didn't test it). Perhaps someone else can suggest a better method.

ubh
03-23-2009, 05:35 AM
Hum I got a message back from the server.. never seen this one before.



Every derived table must have its own alias

Old Pedant
03-23-2009, 05:46 AM
Yeah, it's a silly MySQL requirement.

Just add in a fake alias:



SELECT x.* FROM (SELECT * FROM chatroom ORDER BY id DESC LIMIT 100) AS x ORDER BY id ASC
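In case it helps, here's a rough PHP sketch of running that query and echoing the rows (untested; it assumes a mysqli connection in $conn and a message column, which are illustrative, not taken from your table):


$sql = "SELECT x.* FROM (SELECT * FROM chatroom ORDER BY id DESC LIMIT 100) AS x ORDER BY id ASC";
$result = mysqli_query($conn, $sql);
while ($row = mysqli_fetch_assoc($result)) {
    // oldest of the last 100 comes out first, newest last
    echo htmlspecialchars($row['message']) . "<br />\n";
}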

Old Pedant
03-23-2009, 05:48 AM
If it's not obvious, the inner SELECT is considered a derived table.

NOTE: You know, you *could* do this via a VIEW:



CREATE VIEW last100posts
AS
SELECT * FROM chatroom ORDER BY id DESC LIMIT 100

And then your ad hoc query becomes


SELECT * FROM last100posts ORDER BY id ASC

ubh
03-23-2009, 05:53 AM
Old Pedant, you saved the day again!! LOL, so ok I guess I have one more question.

Out of, let's say, hundreds of rows I now have my last 100 (thanks to you). What if I wanted to keep these 100 rows but delete the rest, or rather all rows before this last 100?

Muhhahah I'm evil, I know...

Old Pedant
03-23-2009, 06:03 AM
Oh, that's trivial!

If your id field is an AUTOINCREMENT field and if you don't delete rows for any other reason, then just do:


DELETE FROM chatroom WHERE id < ( SELECT MAX(id) FROM chatroom ) - 99


Why work harder?
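One caveat worth checking (I haven't tested this here): MySQL usually refuses to let a DELETE's subquery select from the very table being deleted from (the "You can't specify target table ... for update in FROM clause" error). If you run into that, the usual workaround is to wrap the subquery in a derived table:


DELETE FROM chatroom
WHERE id < ( SELECT max_id FROM ( SELECT MAX(id) AS max_id FROM chatroom ) AS t ) - 99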

Old Pedant
03-23-2009, 06:08 AM
I might note that selecting the last 100 records by an autoincrement primary key is NOT going to be slowed down by having, say, 100,000 records in the table.

All that will happen in the DB engine is that it will hit the primary key index and *EASILY* find the last 100 values. Only then will it go grab the actual 100 records.

Hmmm...that's certainly true with SQL Server, at least. I would hope that MySQL would be that efficient.

If not, you COULD always do


SELECT * FROM chatroom
WHERE id > (SELECT MAX(id) FROM chatroom) - 100
ORDER BY id

*possibly* that would be faster than the LIMIT clause. Dunno. Haven't done much benchmarking with MySQL.

NOTE: If you kept the ID of the last inserted record in a globally accessible (to all threads) PHP variable, then it would CERTAINLY be most efficient to do


$sql = "SELECT * FROM chatroom WHERE id > $LASTID - 100 ORDER BY id"
...

(Again, assumes that id is a primary key, but surely it is.)
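To make that concrete, here's an untested sketch of the idea, assuming the APC extension is available for the shared value, a mysqli connection in $conn, and a made-up message column:


function add_post($conn, $text) {
    $safe = mysqli_real_escape_string($conn, $text);
    mysqli_query($conn, "INSERT INTO chatroom (message) VALUES ('$safe')");
    // remember the newest id where every request can see it
    apc_store('chat_last_id', mysqli_insert_id($conn));
}

function last_100($conn) {
    $lastId = apc_fetch('chat_last_id');
    if ($lastId === false) {
        // cache is cold: fall back to asking the table
        $row = mysqli_fetch_assoc(mysqli_query($conn, "SELECT MAX(id) AS m FROM chatroom"));
        $lastId = (int) $row['m'];
    }
    return mysqli_query($conn, "SELECT * FROM chatroom WHERE id > " . ((int) $lastId - 100) . " ORDER BY id");
}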

ubh
03-23-2009, 06:12 AM
Oh, that's trivial!

If your id field is an AUTOINCREMENT field and if you don't delete rows for any other reason, then just do:
Code:

DELETE FROM chatroom WHERE id < ( SELECT MAX(id) FROM chatroom ) - 99

Why work harder?

Oh crap, lucky I didn't run that. I just thought about it: if this gets run more than once between new posts getting added to my data table, you might delete them all lol!

Ok, new scenario. Obviously I am doing this for a chat application, so my logic was to have 100 and only 100 records in my table.

For each new row that's added to the bottom, the first row at the top gets removed, only ever allowing a total of 100 rows at a time, in ASC order... so now I am all confused again.

walks away to smoke a cig and ponder.

Old Pedant
03-23-2009, 06:19 AM
Use a circular buffer.

Slots are numbered 0 to 99.

You have an outside variable named "currentStart".

When you add a new post, you do:


LOCKTHEBUFFER
currentStart = (currentStart + 99) % 100   // step back one slot, wrapping 0 back around to 99
buffer[currentStart] = posting             // the newest post now lives at currentStart
UNLOCKTHEBUFFER

To get the last 100 posts you do:


for ( index = currentStart + 99; index >= currentStart; --index )
{
showPost( buffer[ index % 100 ] );   // oldest of the 100 first, newest (currentStart) last
}


Whether the circular buffer is in memory (has to be "global" memory, accessible to all threads) or in the DB is up to you.

&&&&&&

added: Naturally, the currentStart variable/value has to be globally accessible as well.

Sure you don't want to do this in ASP code? Take about 3 minutes to write it. <grin/>
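For the in-the-DB flavor, here's an untested PHP sketch of the same idea (the chatbuffer table and its columns are made up for illustration); a running post number decides which of the 100 slots gets overwritten:


// assumes something like: CREATE TABLE chatbuffer (slot INT PRIMARY KEY, postnum INT, message TEXT)
function buffer_add($conn, $postnum, $text) {
    $postnum = (int) $postnum;
    $slot = $postnum % 100;   // overwrite the oldest of the 100 slots
    $safe = mysqli_real_escape_string($conn, $text);
    mysqli_query($conn, "REPLACE INTO chatbuffer (slot, postnum, message)
                         VALUES ($slot, $postnum, '$safe')");
}

function buffer_last_100($conn) {
    // postnum keeps chronological (ASC) order even though the slots wrap around
    return mysqli_query($conn, "SELECT * FROM chatbuffer ORDER BY postnum ASC");
}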

ubh
03-23-2009, 06:19 AM
I might note that selecting the last 100 records by an autoincrement primary key is NOT going to be slowed down by having, say, 100,000 records in the table.

All that will happen in the DB engine is that it will hit the primary key index and *EASILY* find the last 100 values. Only then will it go grab the actual 100 records.

Hmmm...that's certainly true with SQL Server, at least. I would hope that MySQL would be that efficient.

If not, you COULD always do
Code:

SELECT * FROM chatroom
WHERE id > (SELECT MAX(id) FROM chatroom) - 100
ORDER BY id

*possibly* that would be faster than the LIMIT clause. Dunno. Haven't done much benchmarking with MySQL.

NOTE: If you kept the ID of the last inserted record in a globally accessible (to all threads) PHP variable, then it would CERTAINLY be most efficient to do
Code:

$sql = "SELECT * FROM chatroom WHERE id > $LASTID - 100 ORDER BY id"
...

(Again, assumes that id is a primary key, but surely it is.)


Hum ok, yeah you've seen my logic in thinking, and yes it's autoincrement. I was only going to limit it to 100 records because of unknown stress on MySQL... Does anyone else know whether retrieving data by a specific primary key index out of thousands of records will slow things down, or will it be fine?

Honestly, if it's not going to slow anything down, GREAT, because I would like to keep all the records for admin purposes.

ubh
03-23-2009, 06:27 AM
Sure you don't want to do this in ASP code? Take about 3 minutes to write it.

Naw, my hosting plan won't support it unless I upgrade. Plus I have yet to learn a single thing about ASP.

I started with PHP and MySQL because it's all provided to me for free with an Apache server... just what I could get my hands on first, sort of thing.

I think I will just keep all the records in my table until I get a reason to do otherwise. Everything is working great, so if it's not broken, why fix it, I guess... right? lol

Old Pedant
03-23-2009, 06:41 AM
I was just teasing you about using ASP.

But anyway, a google search turned up this:
http://lab.arc90.com/2007/06/appcache_for_php5.php

Looks like that would do the job. Hmmm....well, maybe. Nothing in those short docs about locking an object by one thread to avoid deadlocks. You might have to implement locking yourself. Probably a simple two-phase lock would be enough. But you could download it all and find out.
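Setting that particular library aside, if the standard APC extension is available, one common way to fake a simple lock (untested sketch; the key name and timeout are made up) is to lean on apc_add(), which only succeeds when the key doesn't already exist:


function chat_lock($timeoutSecs = 5) {
    $deadline = microtime(true) + $timeoutSecs;
    // apc_add() fails if another request already holds the key, so spin until it's free
    while (!apc_add('chat_buffer_lock', 1, $timeoutSecs)) {
        if (microtime(true) > $deadline) {
            return false;      // gave up waiting
        }
        usleep(10000);         // back off 10 ms and retry
    }
    return true;               // the TTL means a crashed request can't hold the lock forever
}

function chat_unlock() {
    apc_delete('chat_buffer_lock');
}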

ubh
03-23-2009, 06:49 AM
Thanks man for all your help. This all seems a bit over my head, so I will get the pot of coffee rolling and see what I come up with tonight.

Who knows we might see a chat area on my site by morning ha ha.

Let me know if there is anything I might be able to help you out with my friend.


