
Blogged thoughts is our web blog. Expect views, opinions, rants and tirades about everything and anything.


 



Blogged thoughts

| by the www.akamarketing.com team

Archive for August, 2006


Keyword Density Optimization on MSN

Thursday, August 24th, 2006

Recently I’ve been having a small bit of luck on MSN by working my pages towards average keyword densities for elements such as the title tag, meta description, h1 tags and general body content. I ‘computed’ the averages by using the densities of the top ten sites for my chosen keywords on MSN.com. For example, for the term ‘search engine optimisation’ on MSN.com I am currently ranked 5th. This is outstanding considering that before I conducted this latest round of on-page optimisation I was not even in the top 50. It’s interesting to know that MSN still pays a lot of attention to on-page elements, much more than Google and Yahoo anyhow.
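For anyone curious how such a density figure can be computed, here is a minimal sketch: count how many words in a block of text belong to occurrences of the target phrase, as a share of all words. The function name and the sample title are my own inventions for illustration, not from any particular tool; to get an average you would run this over each of the top ten pages and average the results.

```javascript
// Rough keyword density: (words belonging to phrase matches / total words) * 100
function keywordDensity(text, phrase) {
  const words = text.toLowerCase().match(/[a-z0-9']+/g) || [];
  const phraseWords = phrase.toLowerCase().split(/\s+/);
  let matched = 0;
  // Slide the phrase over the word list and count matching positions
  for (let i = 0; i <= words.length - phraseWords.length; i++) {
    if (phraseWords.every((w, j) => words[i + j] === w)) {
      matched += phraseWords.length;
    }
  }
  return words.length ? (matched / words.length) * 100 : 0;
}

// An 8-word title in which 'search engine' occurs twice -> 50% density
const title = "Search engine optimisation tips for search engine beginners";
console.log(keywordDensity(title, "search engine")); // 50
```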


A look at the AOL search data disclosure

Thursday, August 10th, 2006

OK, so we probably all know about AOL’s recent boo-boo when they inadvertently released search data pertaining to over 650,000 users. Just how serious is this though? What potential problems or embarrassments may it cause for any users who might have searched for their personal names, addresses, social security numbers or other private information which might cause them to be identified? TJ McIntyre is a lecturer in the School of Law in UCD (University College Dublin) and has authored three very interesting posts on this very high-profile incident on his blog ‘IT Law in Ireland’, which is located at http://www.tjmcintyre.com/.

In one of his posts TJ refers to the NY Times, which ran a story about 62-year-old Thelma Arnold, who was ‘traced’ because of her detailed search queries. “My goodness, it’s my whole personal life,” she said. “I had no idea somebody was looking over my shoulder.” Thelma Arnold, though, wasn’t searching for anything too strange; things like ‘60 single men’ and ‘dog that urinates on everything’ were among her somewhat embarrassing searches. If others are identified their searches may not be so harmless, as queries such as ‘how to secretly poison your ex’ and ‘how to kill a wife’ are to be found among the massive 2GB of data which was released.

A lot of people are extremely annoyed with AOL over this, and thus calls for an AOL boycott are widespread. Anyhow, check out these posts; I found them very interesting.


Google Analytics - exclude your visits even with a dynamic IP

Tuesday, August 8th, 2006

Hi folks, I hope all my visitors from Ireland enjoyed the long weekend; I know I certainly did. Today I’m going to go through the steps you need to take to have Google Analytics filter out and exclude all data from your own personal visits to your website, even if you are on a dynamic IP. I visit this site a lot to check the blog and forums in particular, so I do not want my visits artificially inflating my ‘real’ visitor data. If your site is generally static you will not have a need to visit it that often, so perhaps filtering out your own visits is not needed, but only you can determine this.

Up until recently I did not know this could be done. Since I’m an Esat/BT/IOL (or whatever they’re calling themselves these days) broadband customer, I do not have a permanent static IP which I could use to identify my machine and thus exclude by IP. Additionally, I couldn’t filter by my network location, which is ‘Ireland On-Line Broadband Customers’, as this would of course filter out visitor data for all IOL broadband customers.

The solution, then, is to use cookies as opposed to IP addresses. The overall idea is to set a cookie and then use the filter interface to instruct Google Analytics to filter out and ignore all associated visit data from all machines which have this cookie set on them. To set the cookie you need to create a new page on your domain with the following code:

<body onLoad="javascript:__utmSetVar('no_report')">

Please note that this code is in addition to the Google Analytics tracking code that you have on every page of your website. Next you need to visit this page from all computers that you would like to exclude from your reports, to set the cookie on each machine.
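For reference, a complete cookie-setting page might look something like the sketch below. This is only an illustration: the page title is mine, UA-XXXXXX-X is a placeholder for your own account number, and the urchin.js snippet is the standard Analytics tracking code of the era, which you should copy from your own account rather than from here.

```html
<html>
<head><title>Analytics exclusion page</title></head>
<body onLoad="javascript:__utmSetVar('no_report')">
<!-- Regular Google Analytics tracking code (must load before onLoad fires) -->
<script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
<script type="text/javascript">
_uacct = "UA-XXXXXX-X"; // placeholder: your Analytics account number
urchinTracker();
</script>
This page sets the 'no_report' cookie so Analytics can exclude this machine.
</body>
</html>
```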

The final step is to actually create the filter via your Google Analytics account. For this you will need to create an exclude filter to remove data from visitors with this cookie set. Follow the instructions at http://www.google.com/support/analytics/bin/answer.py?answer=27207&topic=2970 to create a filter with the following settings:

Filter Type: Custom filter > Exclude
Filter Field: User Defined
Filter Pattern: no_report
Case Sensitive: No

That’s it. To verify this is working correctly, I recommend creating and visiting (a couple of times) a temporary page (say temp1.html) which has the regular Analytics tracking code on it, before you visit the page which sets the cookie. The next day, in the content reports section of Google Analytics, you can locate the hits to this page (which must have come from you, as no one else knows about the page). After this, visit the page which has the cookie-setting code and then revisit temp1.html a couple of times; you should find no new hits to temp1.html when you recheck your stats the following day.

If you use multiple browsers you will need to visit the cookie-setting page from all of them, as they all store cookies in different locations. Any questions, let me know.


A look at the Matt Cutts SEO Videos (1-5)

Friday, August 4th, 2006

Matt Cutts is a Google employee who works on their anti-spam team. Recently he has started to publish a number of videos on Google Video (where else?) in which he attempts to answer a number of different Google SEO and sometimes general Google questions. Yesterday morning (August 3rd) he added his 9th and 10th videos.

I’ve seen blog posts all over the place about these videos, but the source posts are of course from Matt’s own blog, which is located at http://www.mattcutts.com/blog/, so to see what he has to say about them go there. Actually, you’d be better off going to http://www.mattcutts.com/blog/type/movies/, as his posts pertaining to the videos are conveniently one after another on his movies category page, whereas his blog’s homepage has other posts scattered in between. Anyhow, in this post I will provide an overhead look at the first five of these videos. These are just rough summaries and are not transcripts as such. Although these summaries do cover all the important points from each video, I still recommend you see the videos for yourself, and for this reason I have included direct links to them.

Video #1 - Qualities of a good site
Matt talks about some general guidelines and recommendations to increase visibility on Google…
He says that the #1 mistake people make with their sites is that they are not crawlable. Matt says to try to go through your site using something like the Lynx text browser; if you can get through your site with Lynx you’re going to be in pretty good shape as far as crawlability is concerned. Also consider using Google Sitemaps.

The main thing which Matt advises is to make sure people relevant to your niche know about you (i.e. get links from them). Think also about a ‘hook’ for your site, something viral: for example, really good content like newsletters, tutorials and videos which link back to your site. Matt gives the example of some tutorials he read about making his videos professional; at the end of these tutorials the providers said something like ‘oh and by the way, you can use our equipment to make these professional videos, visit www.alink.com’. Matt says content can be a great way to get links and mentions things like syndicating tutorials, Digg and Slashdot. Fundamentally, he says, you need something different which sets you apart from the pack.

A question is asked about the DMOZ snippet. Matt talks about why Google would use the DMOZ snippet for your listing as opposed to your meta description tag. He says Google will pick whichever snippet is better for the query, so it is query dependent. He also mentions using the noodp tag, which will stop Google from using the ODP description for your listing snippet.

Does Google favour bold over strong tags? Matt says they favour bold, however it’s so slight that you need not worry about it.

Video #2 - SEO myths
Too many sites on the same server? Too many sites with IPs too similar? Too many sites which all use the same JavaScript?
Matt says 99% of webmasters are OK, but if you have something like 2,000 sites, he poses the question: do they/can they all have unique, relevant content?

Launching millions of pages at the same time? Matt says things have changed in Google; it’s now better to launch a few thousand pages, and then a few thousand more later. Try a softer launch, as otherwise your pages could attract unwanted scrutiny.

Video #3 - Optimize for Search Engines or Users?
Which is more important, SEO or end-user optimisation? Matt says both: the first for the initial attraction of visitors, the second for the conversion of visitors to customers. The trick is to try to make sure your users’ interests and the search engines’ interests are as aligned as possible (i.e. good content is liked by both users and engines, as is good navigation).

Matt recommends Sitemaps for checking that your site is ‘clean’, as it should flag crawl and other errors which were found.

W3C - Matt says W3C validation does not help in the search engines, as some figures estimate 40% of all HTML pages have syntax errors and there’s no way Google would remove or penalise all these. He mentions that in general it’s a good idea, but for SEO purposes making compelling content should go at the top of your priority list.

Video #4 - Static vs. Dynamic URLs
Does Google treat static and dynamic URLs differently? Matt says generally they are treated the same way in terms of ranking; PageRank flows to dynamic URLs in the same way as to static URLs. He refers specifically, though, to an example URL from the website of the person asking the question, which has 5 parameters, and in response he says you can use too many parameters. Matt says it’s best to stick with 2 or 3 parameters at most and to avoid long numbers, as Google can think they are session IDs. He also mentions using mod_rewrite if you think you may have problem URLs.
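To illustrate the mod_rewrite suggestion, here is a hypothetical .htaccess rule; the /product/123 URL shape and the product.php script with its id parameter are invented for this example, not taken from the video:

```apache
# Expose a clean, parameter-free URL like /product/123
# while internally serving the dynamic script product.php?id=123
RewriteEngine On
RewriteRule ^product/([0-9]+)/?$ /product.php?id=$1 [L]
```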

Can Google inform webmasters via Sitemaps if their sites were hacked? Matt says it’s a good question, but Google does not have the resources available yet.

Is it safe to use geo-targeting software to provide different discount and marketing messages to people from different countries, without Google thinking we are cloaking? Matt talks about the way Google defines cloaking, which is showing search engines different content than you show users. He says geo-targeting software is not cloaking under Google’s guidelines. Matt mentions that the thing which will get you in trouble is if you start treating Google differently. He says not to make a special country just for Google… such as Googlebotistan (his example, not mine). Treat Googlebot just like a regular user: Googlebot comes from a United States IP, so show it the page you would show normal human visitors from the United States and you will not have any problems.

Video #5 - How to structure a site?
Will acquiring related domain names and using a 301 redirect to the final website after the acquisition get a site banned or penalised? Matt says in this case no, because the domain name was related; but if you run a music site, for instance, and then all of a sudden get a tonne of links from sites about debt consolidation, cheap online etc… that might raise a few eyebrows.

What’s the best way to theme a site using directories? Matt says to concentrate on a tree-like architecture and to break things down by topic; so, for example, if you’re selling clothes you would have shoes as one directory and sweaters as another.

Is it OK to serve static pages to Googlebot instead of dynamic URLs, if the dynamic URLs are unindexable? Matt cautions the person who asked the question about getting into the realms of cloaking, and tells them to see if there’s a way to unify what search engines and users see in terms of URLs. If you do decide to serve static pages to search engines, you have to be sure that regular users who visit these pages are not redirected (i.e. if Google can see it, users should see it too).

Will A/B split testing raise suspicions of cloaking? Matt says it’s best to split test in an area which search engines are not allowed to index. He mentions using robots.txt or .htaccess files to make sure Googlebot doesn’t index your split tests. He again cautions against doing anything special for Googlebot: treat it like a regular user and you should be fine.
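A robots.txt entry along those lines might look like this; the /split-tests/ directory name is a made-up example of wherever you keep your test pages:

```
# Keep all crawlers out of the split-testing area
User-agent: *
Disallow: /split-tests/
```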

Well folks, as you can see from above, Matt has already been quite helpful to the webmaster community with these videos… and that was only the first five. The most interesting thing for me from the first five videos was that he said using 301 redirects with acquired but related domains is OK; I was too afraid to try this in the past, but now it seems to me that it could be an excellent way to get a tonne of related backlinks at discount prices. Anyhow, I hope this post helped; it wasn’t exactly a transcript, but it wasn’t far off. I’ll go through videos 6-10 in a couple of days. Please let me know your comments on Matt’s first five videos; remember, I did put together this post for you, so let me know what you think.



Yahoo UK & Ireland overlook Irish market yet again

Wednesday, August 2nd, 2006

Yahoo UK & Ireland have launched another service which has again overlooked the Irish market. No matter who denies it, the United Kingdom and the Republic of Ireland are inextricably linked in culture, business, language, history etc., and therefore to ignore the Irish market is just plain idiocy. What makes it worse is that the site, although on a co.uk domain, is referred to as Yahoo UK & Ireland.

The service in particular is Yahoo Local; have a look at the following sample search I performed on Yahoo Local… says it all really. This becomes particularly annoying when one combines it with the fact that Yahoo pay-per-click advertising is available for geo-targeting in 14 European countries but not Ireland, despite the fact that Yahoo has its European headquarters in Dublin. Is Ireland not a centre of IT excellence? Do we not host major companies like Yahoo, Google, Amazon, Dell, Intel and Oracle, to name but a few? Have we not got Internet businesses to promote and market too? OK, enough ranting from me.
