ASP.NET Programming Tips - Tip #2

by agrace 30. September 2010 22:39

Double your BlogEngine Traffic in 60 Seconds

Some time ago, I blogged about the increasing importance of performance (read: page load speed) in the determination of "rank" by Google.

Then I noticed that most of my blog posts included an image and that there was a considerable lag, so I decided the quickest fix was simply to go into my settings and decrease the number of posts per page from five to three.

In just five months, the number of hits per month has gone from 9,977 in May to 13,542 in September, roughly a 36% increase. In that time period, there were only three blog posts put out. Draw your own conclusions!
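For the record, here's the arithmetic behind that figure (hit counts taken from the post):

```csharp
using System;

class TrafficGrowth
{
    static void Main()
    {
        // Monthly hit counts quoted above
        int may = 9977;
        int september = 13542;

        // Percentage increase = (new - old) / old * 100
        double increase = (september - may) * 100.0 / may;

        Console.WriteLine($"{increase:F1}% increase"); // 35.7% increase
    }
}
```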

Tags:

SEO




The last thing I ever wanted to be was an SEO snake oil salesman. But as an ASP.NET Web Devigner (Devigner = Developer + Designer), it's not something that I can ignore. I recently took on a project to improve the SEO of our local tourism website. I'm not going to delve into the project details here, but suffice it to say that a project like this can offer much insight in normal times; I probably couldn't have picked a worse time to decide on SEO strategies for a website. The last few months have seen radical shifts in SEO priorities in general, and in Google's algorithms in particular. Don't forget: SERPs (Search Engine Result Pages) now return real-time results from social networking sites such as Twitter - more on the implications of this below.

Against this backdrop, I decided on an initial analysis of the site using the new IIS SEO Toolkit - get this tool and use it! This identified about 650 no-no's, so I spent a week eliminating these one by one and wrote some code to take care of the meta tags and the like. The most important decision I made was to agree with the client to monitor the site for SEO hits, good or bad, for the coming six months. It behooves any contractor to take this course when they know in advance that they may not know anything at all!

IIS Toolkit

 

Google had no choice but to find some way to reduce the amount of spam clogging its data centers and to apply some qualitative heuristic to measure the relevance of sites. So they recently rewrote their entire indexing infrastructure (codename "Caffeine"), which caused no end of panic among the SEO heads! To this end, PageRank seems to be playing a much smaller part than ever before. And whatever small part it is playing will be very much influenced by a site's performance. In fact, performance is going to figure heavily in how well a site fares with Google overall. I can see myself getting more involved in this, since it is going to affect clients' pocketbooks in a very discernible way - my prediction for 2010!

WHAT'S GOING TO MATTER

* Personalization
Whether you're signed in or not, Google can use your search history to tweak the relevance of your own searches. Signed in, you can opt to turn personalization on or off. Signed out, a cookie records your search history for 180 days. I'm not a big fan of this, because I want my results to be the natural consequence of my ability to creatively grep precisely what it is I'm looking for. But that's just me, and I can readily see how this step is necessary for Google to provide "meaningful" results to people. Personalization lends even more credence to the diminishing importance of PageRank.

* Conversions
This is the number of successful transactions divided by the total number of unique visitors. On an e-commerce site, for example, the conversion rate is the number of sales divided by the total number of unique visitors, and you can measure it with Google Analytics. Check out Google's Conversion University.
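A worked example of that formula, with made-up figures:

```csharp
using System;

class ConversionRate
{
    static void Main()
    {
        // Hypothetical figures for an e-commerce site
        int sales = 45;            // successful transactions
        int uniqueVisitors = 1800; // total unique visitors

        // Conversion rate = sales / unique visitors, as a percentage
        double rate = sales * 100.0 / uniqueVisitors;

        Console.WriteLine($"Conversion rate: {rate:F1}%"); // Conversion rate: 2.5%
    }
}
```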

* Universal Search
Remember that search results now include video, images, blogs, and books. I have been running some tests on the blog results, and my impression is that the big sites with large traffic are just getting stronger. Even entering the title of my blog (The ASP.NET Community Blog) does not show me in the first ten pages of results! I've seen other developers complaining of a lack of transparency - but then again, we're talking about search algorithms, which are as tightly guarded as a duck's arse, and that's watertight. No surprise there, but it's still unsettling, because the cause and effect of SEO tweaks seems even less predictable now.

THE REAL IMPLICATIONS

1) Sites with basic SEO errors will be penalized.

2) Sites with poor performance will be penalized. If you stop and think about it, there must be a huge increase in the amount of content that Google has to index in light of real-time results pouring into their data centers every second. Something has to give. Check out the performance of your site using the Google-recommended WebPageTest application.

UPDATE

Best Practices for Speeding up your Website


Tags:

Google | SEO



ASP.NET SEO and the Canonical Tag

by agrace 21. February 2009 08:03

Recently, Google, Microsoft and Yahoo announced support for a new canonical tag in an effort to combat duplicate URLs (read: duplicate content) on websites. Quite simply, you can add an HTML <link> tag to the <head> section of your page to indicate the preferred, or canonical, version of the page URL.

If you have something like this:

http://www.mysite.com/products.aspx?productid=123

 

or

http://mysite.com/products.aspx?productid=123

 

you can have the search spider interpret both as this:

http://mysite.com/products.aspx

 

It works like a 301 redirect but is just a hint to the search spider. In other words, the search engines are free to recognize it or not. It's mind-boggling the amount of work developers have had to do to get around this problem up to now - and it was this simple to fix at the end of the day?
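To make the comparison concrete: the canonical tag asks the spider to collapse the URL variants above down to one address, which is exactly what a 301 would force. A rough sketch of that mapping (the CanonicalHelper class is my own illustration, not part of any ASP.NET API):

```csharp
using System;

static class CanonicalHelper
{
    // Collapse the URL variants from the post down to the canonical form:
    // drop a leading "www." and any query string. Hypothetical helper,
    // not from the original post.
    public static string Canonicalize(Uri url)
    {
        string host = url.Host.StartsWith("www.", StringComparison.OrdinalIgnoreCase)
            ? url.Host.Substring(4)
            : url.Host;
        return "http://" + host + url.AbsolutePath;
    }
}

class Demo
{
    static void Main()
    {
        var a = new Uri("http://www.mysite.com/products.aspx?productid=123");
        var b = new Uri("http://mysite.com/products.aspx?productid=123");

        Console.WriteLine(CanonicalHelper.Canonicalize(a)); // http://mysite.com/products.aspx
        Console.WriteLine(CanonicalHelper.Canonicalize(b)); // http://mysite.com/products.aspx
    }
}
```

To issue an actual 301 in ASP.NET, by contrast, you would set Response.Status to "301 Moved Permanently" and write a Location header yourself (or, from .NET 4.0 on, call Response.RedirectPermanent).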

To implement this for your ASP.NET products page, with its GridView of pageable, sortable widgets, you could do the following:

using System;
using System.Web.UI;
using System.Web.UI.HtmlControls;

public partial class Products : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Build the canonical <link> and inject it into the page head.
        // Requires <head runat="server"> so that Page.Header is not null.
        HtmlLink canonicalTag = new HtmlLink();
        canonicalTag.Href = "http://mysite.com/Products.aspx";
        canonicalTag.Attributes["rel"] = "canonical";
        Page.Header.Controls.Add(canonicalTag);
    }
}

 

ASP.NET renders the following: 

<head>
    <title>Products Page</title>
    <link href="http://mysite.com/Products.aspx" rel="canonical" />
</head>

 

There is one wrinkle here, and it depends on your doctype. Under XHTML, the self-closing slash that ASP.NET renders is exactly what the W3C recommendation requires. Under an HTML 4.01 doctype, however, that closing slash is invalid; the correct HTML format is:

    <link href="http://mysite.com/Products.aspx" rel="canonical">

So where does this leave us? For me, relaxing the doctype to anything less than XHTML Transitional is not an option, so the self-closing output suits me fine. But if you have to serve plain HTML, does that mean using an HtmlTextWriter to customize the output for this one tag, or is there some easier way? Has anyone got a suggestion?
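One low-tech workaround, if you do need the slashless form: skip HtmlLink and emit the markup verbatim. A sketch (the CanonicalMarkup helper is my own, not an ASP.NET API):

```csharp
using System;

static class CanonicalMarkup
{
    // Build the <link> tag as a raw string so we control the exact markup
    // (no trailing slash). Hypothetical helper, not from the original post.
    public static string CanonicalLinkTag(string href)
    {
        return "<link href=\"" + href + "\" rel=\"canonical\">";
    }
}

class Demo
{
    static void Main()
    {
        Console.WriteLine(CanonicalMarkup.CanonicalLinkTag("http://mysite.com/Products.aspx"));
        // <link href="http://mysite.com/Products.aspx" rel="canonical">
    }
}
```

On the page itself, assuming a <head runat="server">, you would then add it with Page.Header.Controls.Add(new LiteralControl(CanonicalMarkup.CanonicalLinkTag("http://mysite.com/Products.aspx"))); LiteralControl writes its text through untouched, so no slash is appended.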


Tags:

ASP.NET | SEO