Double your BlogEngine Traffic in 60 Seconds
Some time ago, I blogged about the increasing importance of performance (read: page-load speed) in how Google determines "rank".
Then I noticed that most of my blog posts include an image, and that the page loaded with considerable lag. I decided the quickest fix was to go into my settings and decrease the number of posts per page from five to three.
In just five months, hits per month have gone from 9,977 in May to 13,542 in September, a 35% increase. Only three blog posts went out in that period. Draw your own conclusions!
Google Cannot Solve your Coding Problems
Stop and think before you reach for Google. Googling for code samples should be the final step in designing a solution, not the first. Start by defining the problem; if you can't define the problem, how can you solve it? Then, and only then, start working on a black-box solution of your own: identify the basic inputs, the general logic (phrase it in a single sentence; this is the black box), and the outputs. Even if you're not artistically inclined, always opt to take pencil to paper; it will help.
Finally, with a clearer perception of what you need to do, use Google to zero in on relevant code samples. Doing this in reverse order leads us to re-imagine the problem to fit somebody else's solution. I imagine 80-90% of programmers, myself included, are guilty of this everyday form of insanity.
In my never-ending search for a tool or technology to manage my inflow of information from the Web, I opened a Twitter account that could also act as a set of pseudo-bookmarks. My IE8 and Firefox bookmarks already stretch to the floor when expanded. Plus, I'm convinced there must be some unwritten law of HCI dictating that once something is saved as a bookmark, it is never opened again. Bookmark drop-downs are about as user-friendly as crotch rot. I've looked at Evernote and OneNote, the latter being the more promising to date. I plan to take another look at the products from Microsoft Live Labs.
Recently, I passed the 700-tweet mark and decided to archive what I had. But how? I gleaned the following tidbits from "Twitter Tips, Tricks, and Tweets" by Paul McFedries. You can save a local copy of your tweets by entering the following URL in your browser:
where account = your Twitter account name, and
n = the number of tweets to retrieve
This will open your n most recent tweets as XML in your browser. From there, you can import the saved XML file into Excel 2007 as follows:
1) In Excel, click the Data tab
2) Choose From Other Sources -> From XML Data Import
3) Click the cell where you want the data to be imported
4) Save the workbook, using the ID of the last tweet as the file name
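If you would rather skip the Excel wizard, the same import can be scripted. Below is a minimal Python sketch that parses a saved timeline file into rows and writes them out as CSV (which Excel opens directly). The `<statuses>`/`<status>` element names and the embedded sample data are assumptions about the shape of the XML the old Twitter API returned, not taken from this post.

```python
# Sketch: parse a saved Twitter XML timeline into (id, created_at, text)
# rows and emit CSV. Element names are assumed from the old v1 XML format.
import csv
import io
import xml.etree.ElementTree as ET

SAMPLE_XML = """<statuses>
  <status>
    <id>6839319097</id>
    <created_at>Mon Dec 21 14:00:00 +0000 2009</created_at>
    <text>Archiving my tweets...</text>
  </status>
</statuses>"""

def tweets_to_rows(xml_text):
    """Extract (id, created_at, text) tuples from a saved timeline file."""
    root = ET.fromstring(xml_text)
    rows = []
    for status in root.findall("status"):
        rows.append((
            status.findtext("id"),
            status.findtext("created_at"),
            status.findtext("text"),
        ))
    return rows

rows = tweets_to_rows(SAMPLE_XML)

# Write the rows to an in-memory CSV; in practice you would open a file.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["id", "created_at", "text"])
writer.writerows(rows)
print(rows[0][0])  # the ID of the first (and here, only) tweet
```

For a real archive, replace `SAMPLE_XML` with the contents of the file you saved from the browser.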
The reason I include the last tweet's ID in the file name is to make the next archive run easy: I can pick up where I left off by using the following syntax:
If you look at the Excel sheet, you will see that the last tweet had an ID of 6839319097. This will download an XML version of all my tweets since then. Now I have a proper archive of my tweets that I can search :-)
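The incremental idea above boils down to a numeric comparison: tweet IDs from this era increase over time, so any freshly fetched tweet with an ID greater than the last archived one is new. The sketch below illustrates that filter; the fetched IDs and texts are invented for the example.

```python
# Sketch: keep only tweets newer than the last archived ID.
# Tweet IDs increase over time, so a numeric comparison suffices.
last_archived_id = 6839319097  # the ID saved in the file name above

fetched = [  # hypothetical newly fetched (id, text) pairs
    (6839319097, "already archived"),
    (6902210001, "a newer tweet"),
    (6950000123, "the newest tweet"),
]

new_tweets = [t for t in fetched if t[0] > last_archived_id]
print(len(new_tweets))  # → 2
```

Appending `new_tweets` to the existing spreadsheet keeps the archive free of duplicates.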
From here, you might want to take a look at the TweetSharp API, a complete .NET library for building Twitter applications in C# and .NET 3.5. I've started reading a great book on the topic: "Professional Twitter Development with Examples in .NET 3.5" by Daniel Crenna.