
View Full Version : Retrieving data about most frequently visited page



bostjank
11-20-2003, 10:57 AM
Hi!

I have a database (MS Access) filled with data about web pages that were visited. How can I get the data about which page was most frequently visited - if possible without checking the number of visits on each page and then comparing those numbers.


Thanks,
Bostjan

Spudhead
11-20-2003, 11:52 AM
I'm a trifle lost. Your access database contains a record of hits logged on various .html pages, or your access database is functioning as a content repository with rows containing the html for individual 'pages' which is pulled in by some ASP?

And you want to know which number out of a group of numbers is highest, without comparing those numbers?? Umm, that's going to be awfully tricky.

I think I've missed something here.

bostjank
11-20-2003, 12:03 PM
My table looks like this


PageName Date Time
================================
page1.asp ... ...
home.htm ... ...
back.htm ... ...
page1.asp ... ...

In this example the page named "page1.asp" was visited most frequently. I was just wondering if there is a function I could use to get the most frequent value in a specific column.

Roelf
11-20-2003, 12:57 PM
select PageName, count(PageName) as hits from YourTable group by PageName order by hits desc
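A GROUP BY query like Roelf's can be sanity-checked with SQLite, whose dialect is close enough here (the table and column names are made up for the demo, mirroring the sample table above):

```python
import sqlite3

# In-memory demo database; PageHits and its columns are assumptions
# based on the sample table posted in the thread.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PageHits (PageName TEXT, HitDate TEXT, HitTime TEXT)")
conn.executemany(
    "INSERT INTO PageHits (PageName) VALUES (?)",
    [("page1.asp",), ("home.htm",), ("back.htm",), ("page1.asp",)],
)

# Count hits per page and sort so the most-visited page comes first.
rows = conn.execute(
    "SELECT PageName, COUNT(PageName) AS hits "
    "FROM PageHits GROUP BY PageName ORDER BY hits DESC"
).fetchall()

print(rows[0])  # ('page1.asp', 2)
```

The first row of the result is the most frequently visited page, so no manual comparison loop is needed.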

raf
11-20-2003, 01:21 PM
Just a thought:
do you really need a record for each hit? Can't you get that from a logfile and analyse it with some log-analysis tool?

I always use one page table where I store the page's title, description, keywords, css, creation time, last update time and the number of hits. If a page is requested, I select the title etc. to build the html header and increment the counter variable. Then it's easy to instantly get the hits per page.

the select with group by might take up some resources if you have some traffic over time (10 pages x 100 visitors a day x 100 days = 100k records ...)
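raf's counter-in-the-page-table scheme can be sketched like this (SQLite standing in for Access; the Pages schema and page names are assumptions for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One row per page, with a running hit counter alongside the metadata.
conn.execute(
    "CREATE TABLE Pages (PageName TEXT PRIMARY KEY, Title TEXT, Hits INTEGER DEFAULT 0)"
)
conn.execute("INSERT INTO Pages (PageName, Title) VALUES ('page1.asp', 'Page 1')")

def serve_page(page_name):
    # Fetch the metadata needed to build the html header...
    (title,) = conn.execute(
        "SELECT Title FROM Pages WHERE PageName = ?", (page_name,)
    ).fetchone()
    # ...and bump the hit counter in the same request.
    conn.execute("UPDATE Pages SET Hits = Hits + 1 WHERE PageName = ?", (page_name,))
    return title

serve_page("page1.asp")
serve_page("page1.asp")

(hits,) = conn.execute(
    "SELECT Hits FROM Pages WHERE PageName = 'page1.asp'"
).fetchone()
print(hits)  # 2
```

With the count maintained on each request, finding the most-visited page is a single `ORDER BY Hits DESC` over a table with one row per page, rather than a GROUP BY over one row per hit.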

bostjank
11-20-2003, 01:39 PM
It could of course be analysed with a log analyser, but my boss doesn't want to do it that way - especially because the traffic analysis must include per-registered-user analysis.

Besides, the MDB file will be always downloaded to a local machine for analysis.

raf
11-20-2003, 02:07 PM
If I were your boss, I would open Access, go to queries, open a new one, hit the query wizard and then choose the Crosstab Wizard. You can define there what you need in the column headers (pages?), what you need in the rows (users) and what aggregation you want in the fields (hits) and presto!

It's a Jet SQL specific query that is used (something like TRANSFORM + GROUP BY select + PIVOT). You could run through the wizard and afterwards look at the generated SQL and use that inside your ASP page (or simply use the Access query as such).
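The TRANSFORM/PIVOT syntax is Jet-specific and has no direct SQLite equivalent, but the same users-by-pages crosstab can be built from one GROUP BY in a few lines (the PageHits schema with a UserName column is an assumption for the demo):

```python
import sqlite3
from collections import Counter

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PageHits (PageName TEXT, UserName TEXT)")
conn.executemany(
    "INSERT INTO PageHits VALUES (?, ?)",
    [("page1.asp", "alice"), ("page1.asp", "alice"),
     ("home.htm", "alice"), ("page1.asp", "bob")],
)

# One GROUP BY over (user, page) yields the cells of the crosstab:
# users in the rows, pages in the columns, hit counts in the fields.
crosstab = Counter()
for user, page, hits in conn.execute(
    "SELECT UserName, PageName, COUNT(*) FROM PageHits "
    "GROUP BY UserName, PageName"
):
    crosstab[(user, page)] = hits

print(crosstab[("alice", "page1.asp")])  # 2
```

The `Counter` keyed by (user, page) plays the role of the pivoted grid that the Access wizard would generate.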

bostjank
11-20-2003, 03:40 PM
If I were my boss... :p

M@rco
11-22-2003, 03:43 AM
So what's the problem with Roelf's post? It's the textbook answer to your question!

bostjank
11-22-2003, 06:58 AM
Roelf's post is perfect - it does exactly what I wanted.

raf
11-22-2003, 12:20 PM
Originally posted by M@rco
So what's the problem with Roelf's post? It's the textbook answer to your question!

It may be the perfect solution for an unnecessary problem.

If you would frequently run this sort of query online, with a 'group by page' or 'group by page, user' on an intensely used site, then it would cause an unnecessary performance drop. Not the query as such, but the table locking, which would prevent new inserts from being processed (aka requested pages being fully processed and responded to).

If you run it offline, then there are better alternatives, like generating the crosstab, which drastically cuts down the code needed to process the returned recordset.

It's also reinventing the wheel and making your server do double work, since the logfiles can give you the same info (you only need to keep a session table with one record mapping the IP number to a UserID per session). Using the logs will be safer + will save you an insert in the page table for each page request.

M@rco
11-22-2003, 12:38 PM
I totally agree.

I was merely prompting bostjank to reply and acknowledge Roelf's post, since it seemed that it had gone unnoticed.

;)

raf
11-22-2003, 01:04 PM
I see :)

I suppose that the more experienced you are, the more solutions you know to each problem, but the fewer problems you have to solve, because you've learned to effectively use all the built-in features.

M@rco
11-22-2003, 01:31 PM
Absolutely. ;)

bostjank
11-25-2003, 03:49 PM
I do not agree with the "reinventing the wheel..." part - why would I (or our customers, for whom the solution is intended) use (buy) specialized software for analyzing web logs if all we need for our purpose is to retrieve the name of the most frequently used page?

Especially because specialized software cannot operate with the retrieved value and use it in the functions our information system already has implemented...

But once again - I'm very grateful for your help.

Bostjan

raf
11-25-2003, 07:55 PM
Originally posted by bostjank
why would I (or our customers, for whom the solution is intended) use (buy) specialized software for analyzing web logs if all we need for our purpose is to retrieve the name of the most frequently used page?

Because
- you are not the first, and probably not even the 1,000,000th, webmaster who needs to do that
- lots of people have indeed already written specialized software (is 'specialized' a bad word?) that does just that
- there is free software available for all webservers that does just that
- using a log analyser is less resource-hungry for gathering and processing the info
- you probably don't work for free, so maybe your customer would be happier if they could cut back on unnecessary development cost
- there is even an ASP Counters component that does exactly what you want: counting page hits or whatever you specify inside your code, and making the data available to all ASP code.

Recreating features that millions of webmasters use does look like reinventing the wheel to me. But I don't mind.


Originally posted by bostjank

Especially becuase specialized software cannot operate with the retrieved value and use it in the functions our information already has implemented...
:confused: This is new to me. Why would a log analyser not be able to save the stats in a db, so that they become available to your special functions?

Besides, we're talking about analysing historical data, so why spend runtime and database resources on it? (I'm probably just too old not to mind people wasting resources.)

bostjank
11-26-2003, 07:13 AM
I agree with most of what you said, but the ONLY function that specialized software offers that is useful for this project is the name of most frequently used page.

That software cannot provide us with the same analysis for each of our registered and other identified users.

So - why not write code to do that? It is not meant to analyse the overall statistics of visits, but merely give us one element of those statistics, specifically designed for tracking identified users.



EZ Archive Ads Plugin for vBulletin Copyright 2006 Computer Help Forum