Thread: Analyzing traffic statistics
08-19-2004, 12:24 AM #1
Join Date: Dec 2003
Analyzing traffic statistics
This article was originally written at whreviews.com. We would like to thank Dan for this great article. Also check out his website for more articles.
Why is everyone worried about the amount of traffic?
Traffic is at the heart of the Internet. Traffic is what makes the Internet so alive and kicking. An Internet without all these people searching for information, products and services (you and me included) would be like the oceans without water. Traffic is the Internet.
This wonderful, miraculous thing that has changed the lives of many by providing such an abundant amount of information, fast and easy to find, requires money to exist, though.
The web becomes increasingly commercial as the years pass by. The educational purposes and idealistic concepts that were so closely tied to the Internet have lost most of their ground in the fight with the mighty dollar. The 'net is no longer a place where the geeks of the day share their knowledge in a utopian fraternity.
Today's Internet is a savage place where the bigger fish eats the smaller fish, where monopolies are starting to emerge; a place that requires new laws, created along the way, and by the time they arrive some people do get hurt.
It's a lot like the Wild West from a few points of view, but the gold rush is now the traffic rush. Just as in those days gold meant money and was the subject of the day, nowadays web traffic means money and is on the lips of every website owner on the planet.
What has all this got to do with web hosting?
Well, as a matter of fact, for most people this has everything to do with hosting. Why? Because the server that hosts your website knows exactly how many pages it served, when they were served and who requested them.
The web server can, and will if instructed to, save this kind of data in what is commonly known as a raw log file. The data in these log files is written in a standard format, most commonly the Common Log Format or the Combined Log Format.
A raw log file is basically a text file which can be viewed with an application such as Notepad. I guess any person can understand, at least partially, the information that is stored in such a file just by looking at it.
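To give you an idea of what's in there, here is a minimal sketch of pulling the fields out of one log line. The sample line and the regex assume the Apache Common Log Format; the IP, date and page are hypothetical data, not from any real log:

```python
import re

# A hypothetical line in the Apache Common Log Format:
# host ident authuser [date] "request" status bytes
line = '127.0.0.1 - - [19/Aug/2004:00:24:00 -0500] "GET /index.html HTTP/1.1" 200 5120'

# Regex matching the seven Common Log Format fields
pattern = re.compile(
    r'(?P<host>\S+) (?P<ident>\S+) (?P<user>\S+) '
    r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\S+)'
)

m = pattern.match(line)
fields = m.groupdict()
print(fields['host'])     # the visitor's IP address
print(fields['request'])  # the page that was requested
print(fields['status'])   # the HTTP status code
```

As you can see, even without any analyzer software, each line already tells you who asked for what, when, and whether the server delivered it.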
Raw logs are a standard feature in hosting packages now, but you should make sure nevertheless. Raw logs are important if you plan to use advanced client-based software to analyze the traffic your website receives.
Analog, Webalizer and Awstats
For smaller sites, web-based software such as Analog, Webalizer or Awstats is usually enough. These programs run on the server and are most of the time included in the hosting package. If you don't plan to use your own software to analyze the logs, look for a host that offers these in the package you're planning to buy. Note that these are common features; almost all hosts include them in the package because they're freeware, so not offering them is rather strange.
Analog is the most basic of the three, but it's perhaps the most popular because it's been around for a while. Webalizer offers similar information but in a different format and somewhat more detail. Awstats is quite nice and is the best looking of the three.
All three analyzers can help you find out new things about your visitors. You'll not go WOW when using them, but you'll know a thing or two about your visitors nevertheless. Just so you can get an idea of what they look like, here are their demos: Analog, Webalizer and Awstats.
Because we tend to be so concerned with numbers, the first thing we want to know is the number of visitors the site received. More important however is what we learn about those visitors.
Which pages do they prefer? Where did they come from? From the search engines? Which search engine? What were they searching for in that search engine? From a link exchange page? On what website? From a directory? From a message board? Etc.
The answers to those questions can provide information that can and should be acted upon. Knowing the sources of traffic can tell you a bit about how targeted that traffic is. It can also give you hints on what to do to increase the traffic in the future. Suppose you notice that much of your traffic comes from directories. That should be a hint that this is an effective technique to bring in even more traffic.
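The "where did they come from" question can be answered by tallying the referrer field from the logs. A minimal sketch, assuming you have already extracted the referrer URLs from a Combined Log Format file; the URLs below are hypothetical examples:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical referrer URLs pulled out of a combined-format log file
referrers = [
    "http://www.google.com/search?q=cheap+diamonds",
    "http://www.google.com/search?q=diamond+rings",
    "http://dmoz.org/Shopping/Jewelry/",
    "-",  # "-" means a direct visit with no referrer
]

# Tally visits by referring domain to see which sources send traffic
sources = Counter(urlparse(r).netloc or "direct" for r in referrers)

for domain, hits in sources.most_common():
    print(domain, hits)
```

A tally like this immediately shows whether a directory, a search engine or direct visits dominate your traffic, which is exactly the hint described above.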
One great thing is that traffic from theme-related directories is rather targeted. We'll discuss (or rather I'll discuss... by myself) the quality of traffic a bit later.
You'll also notice that not all pages bring in the same amount of traffic from the search engines, and that not all search engines bring in the same amount of traffic. As I write this, Google is surely the most generous search engine. Although Yahoo and MSN have comparable audiences, they fail to send free traffic to websites. Why? Money is the answer, but let's not get into that! The subject here is traffic statistics and analysis, not search engines and their behavior.
As I was saying... some pages receive more traffic than others. Once you know which pages get the traffic, you know on which pages you should concentrate your efforts to get more sales/revenue/newsletter sign-ups or whatever you want.
I had the pleasure of experiencing Urchin, a commercial web-based log analyzer which was included in the hosting package that I used. My initial impression was that the interface is a lot nicer compared to its free siblings Analog and Webalizer, and that it's a bit more powerful too.
However, after using it a bit more I found it to be a great tool, and I would certainly recommend finding a host that has it installed, especially if you're not going to pay for raw log analysis software. Sure, it's not crucial to get Urchin, but it's a very nice feature nonetheless.
It helped me understand more about my visitors, about the way they travel inside the site, which helped me make decisions regarding the appearance of the site. If it proved to be useful for my site which is mainly informative in nature, it must be ten times more useful for a purely commercial website.
Raw logs analysis
Raw logs are very useful indeed. Once you download them you can use all kinds of software to analyze them. Some programs are powerful, others not so powerful; some have nice graphics and charts, others have ugly ones.
A nice program that I've used is WeblogExpert. I like that it's easy to understand and it has nice graphics and stats. I also noticed that it develops very fast! It even has a free lite version which is quite good! Sure, there are lots of other programs out there and this is just one piece of software that I happened to like. There's plenty to choose from on Dmoz.
Downtime and statistics
It can be useful to have detailed statistics, because if your website is popular enough to ensure a relatively constant flow of visitors, you can detect downtime from a lack of traffic in the logs. Say your website receives an average of about 200 visitors/hour, but never fewer than 75. If for some reason you spot a 3-hour gap in the stats, it's most likely that the site was down for that period of time.
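That downtime check can be automated with a few lines. A minimal sketch, assuming you have already aggregated your log into hourly visit counts; the numbers and the 75-visitor threshold are the hypothetical figures from the example above:

```python
# Hypothetical hourly visit counts for part of a day; a site that
# normally never drops below 75 visitors/hour suddenly shows zeros
hourly_visits = [180, 210, 195, 0, 0, 0, 190, 205]

THRESHOLD = 75  # the site's historical minimum visitors per hour

# Flag the hours where traffic fell below the usual minimum:
# these are the hours the site was most likely down
suspect_hours = [h for h, v in enumerate(hourly_visits) if v < THRESHOLD]
print(suspect_hours)  # → [3, 4, 5]
```

Of course this only works for sites with a steady baseline of traffic; for a low-traffic site, an empty hour in the logs proves nothing.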
I also noticed, although I'm not sure if this is a rule, that each time a server went down, the raw log file for that month was reset to zero. This can also be a hint that the server had a few problems.
Is it all about the numbers?
As I was saying at the beginning of this article, everybody is concerned with the amount of traffic. While higher numbers of visitors usually translate into more sales or other desired actions on the part of the visitors, it's often the quality of traffic that determines profitability. The return on investment (ROI) is a much-touted profitability indicator.
It can be seriously improved if the traffic you bring to your website is highly targeted. Suppose your website sells diamonds. If you bring traffic from a programming forum, that will result in very few sales - if any. If you bring in people searching for "cheap diamonds" in a search engine, you stand a considerably higher chance of landing a sale because that traffic is targeted.
Also, it's not always about bringing more traffic, but about getting more from the traffic you're already getting. It's a known fact that sales can be increased impressively simply by changing the wording of the presentation. One could argue that it's a lot easier to increase sales by 50% by altering your sales presentation rather than by getting 50% more traffic. It is my belief that the best solution is to concentrate efforts on both techniques.
For a reasonably successful website, making decisions based on simple raw logs, Analog or Webalizer is not very smart. There are advanced solutions out there that can track with pinpoint accuracy the paths of visitors, the visits that resulted in sales, etc. Implementing such advanced tracking is a good decision once your website can afford it. Here are some of the competitors in this field: Clicktracks, WebSideStory, Webtrends and Urchin.
Logs can be very useful for any website owner. Often this small amount of information is enough to determine great and very beneficial changes to a website. A hosting package can provide good tools to analyze your traffic and any webmaster should use these tools to ensure that the website performs at its best.
08-20-2004, 08:51 PM #2
Join Date: Feb 2004
This was a very simple-to-understand explanation of analyzing site traffic. Couldn't have said it better myself!
Not many people understand how big a role this should play in running a website, because it helps you see what is working and what is not.
Great stuff!

The Web Hosting Show - The Voice of the Web Hosting World
Think of it as talk radio mixed with Web hosting discussion for both Web hosts and Web hosting clients! New episode every Monday!