Top 10 ways to increase traffic to your wiki

From EggeWiki
Revision as of 01:45, 30 September 2007 by Egge (talk | contribs)

On my wiki I journal various tips, usually technical in nature. Often I write a small piece of code or figure out the solution to a strange error message, and I like to post it to my wiki. Most [search engine optimization] articles relate to general websites or blogs. Wikis generally do not have a regular readership; people come across them through search engines or direct links. Most SEO tips are about creating stickiness and brands. What I want is different: people who find my wiki should find the information they were looking for and leave.

Why do I post to my own wiki instead of to [x]? Over the years I've published various helpful guides and answers. For a while I was posting to a well-known site which lets 'experts' publish answers to questions. The initial business model was that corporate users would pay for 'expert' credit. Unfortunately, the site turned into the web's biggest SEO link farm. Often when I search for solutions to computer problems, this site shows up as the number one page in Google. I've gone so far as to install a Greasemonkey script to remove these links. The other obvious place to post these sorts of answers is to newsgroups. Well, many of the things I've figured out don't have a pre-existing question. I figure if I write the answer, people with the question will find it.

So, I've taken to creating a wiki with all sorts of information which I find useful. Where I have relevant information, I want people to be able to find it. Here are some tips to make your wiki more accessible to search engines.

Top 10 ways to increase traffic to your wiki

  1. robots.txt
    • Every site should have a robots.txt file. This file keeps robots out of areas where they don't need to be, and points them to the areas they should be indexing. A robot does not need to index the history or click on edit links. I disallow these, and this removes some of the 'noise' from the search engine. Secondly, the robots.txt file lists my sitemaps.
  2. site maps
    • Most wiki software can create a [site map]. This is important, as it lets robots know exactly what has changed on your website and what priority they should index things in. This keeps robots from indexing the same pages all the time, and helps them find pages which might not be linked to.
  3. Wikipedia
    • Wikipedia has a specific policy against original research on its site. If you've written original research, you may want to put the content on your site and include a reference to it on Wikipedia. For example, I receive a lot of links from Wikipedia to my Java implementation of a [CUSIP] check. I don't think Wikipedia needs to have code samples in every language on its site, but many people find the link to the Java or C code useful. I'm opposed to adding non-relevant external links on Wikipedia, and I actively try to remove them.
  4. [Google Webmaster Tools]
    • This is a great service provided by Google which lets you monitor your crawl stats and query search terms. You can see if non-relevant parts of your website are being indexed, and what search terms people type in to find your site.
  5. Yahoo! Site Explorer
    • Not as useful as Google's tool, but you'll want to make sure that Yahoo has a good index of your site, and that it's picked up your site map.
  6. [AWStats]
    • This is a great, free tool, and many hosting providers have it already set up. You should look at the query search terms, as well as any errors. Additionally, it makes sense to look for abnormal traffic. This is often the only clue that your site is being covertly spammed. For a while my [Magpie RSS] reader tool was being used by spammers to make it appear that their content was part of my site.
  7. [The long tail]
    • I know my writings serve only a small niche. If you have a wiki, don't try to compete with Wikipedia; rather, write about specifics which you know a good deal about. One of my more popular wiki pages is, and has been, GPS_Meets_Google_Maps. I wrote it when Google Maps first came out, and many people since then have wanted to view their runs on a Google map. There probably aren't a whole lot of people who own a Forerunner 201 and also want to put the data onto a Google Map, but those who do often come across my article. Most of the time you have no idea what will become popular, so write about unique things, and you'll find a niche audience.
  8. keep spam free
    • Make sure that spammers aren't polluting your site. Once spammers start to get links into your site, your page rank will go down. You should update your wiki software at least once a year. Old software often contains security holes through which spammers can write scripts to directly edit your site. A few times when I ran phpWiki, my entire site was replaced by a spammer. One thing to do is to subscribe to all edits on your site. Even with this, if there is a security hole, you won't get notifications of the changes. MediaWiki has been quite good with regard to security and updates. If there weren't active development of MediaWiki, I'd probably have to switch to something else. Many of my pages rank #1 on Google with the right set of search terms. Even with CAPTCHAs, spammers will find it worth their time to attack these sorts of pages.
  9. have a quality web host with good availability
    • Well, I haven't done so well on this front. For the price, LinkSky provides a really good service. However, the web server often gets overloaded and doesn't respond. I'd like to upgrade to at least a VPS. If I were making money from my website, I might be able to justify the increased cost. If you're hosting your own site, you can set up Squid caches so most traffic doesn't have to involve database calls and large amounts of PHP. This is how Wikipedia is able to scale. When huge volumes of people hit a dynamic page, you either have to have lots of servers or effective caching of the content which isn't changing.
  10. DMOZ
    • It's not what it used to be, and unfortunately, is a far cry from what it could be, but it's still an important directory to be a part of. Make sure your site is listed once (and only once) in the directory.
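To make item 1 concrete, a robots.txt for a wiki might look like the following. The paths here are illustrative, not copied from my site; the right Disallow lines depend on how your wiki software builds its edit and history URLs:

```
# Keep robots out of dynamic views: edit links, page histories, special pages
User-agent: *
Disallow: /index.php?
Disallow: /index.php/Special:

# Point crawlers at the sitemap
Sitemap: http://example.com/sitemap.xml
```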
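The sitemap in item 2 is just an XML file in the sitemaps.org format, listing each URL with its last modification date, change frequency, and priority. A minimal hand-written example (the URL and values are placeholders; your wiki software will generate this for you) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/wiki/GPS_Meets_Google_Maps</loc>
    <lastmod>2007-09-30</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```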
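The Squid caching mentioned in item 9 usually means running Squid as a reverse proxy ("accelerator") in front of the web server, so repeat requests for unchanged pages never touch PHP or the database. A minimal configuration sketch, where the hostname and ports are placeholders and the exact directives vary by Squid version, might look like:

```
# Squid listens on port 80 and answers for the wiki's public hostname
http_port 80 accel defaultsite=wiki.example.com

# Uncached requests are forwarded to the real web server on port 8080
cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=wikibackend
```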
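For readers curious what the CUSIP check mentioned in item 3 involves, here is a minimal sketch of the standard CUSIP check-digit algorithm in Python. This is an illustration of the algorithm, not the Java code linked from Wikipedia:

```python
def cusip_check_digit(cusip8):
    """Compute the check digit for the first 8 characters of a CUSIP."""
    total = 0
    for i, ch in enumerate(cusip8.upper()):
        if ch.isdigit():
            v = int(ch)
        elif ch.isalpha():
            v = ord(ch) - ord('A') + 10   # A=10, B=11, ..., Z=35
        elif ch == '*':
            v = 36
        elif ch == '@':
            v = 37
        elif ch == '#':
            v = 38
        else:
            raise ValueError("invalid CUSIP character: %r" % ch)
        if i % 2 == 1:                    # double every second character
            v *= 2
        total += v // 10 + v % 10         # add the digits of v
    return (10 - total % 10) % 10

def is_valid_cusip(cusip):
    """A 9-character CUSIP is valid if its last digit matches the checksum."""
    return (len(cusip) == 9 and cusip[8].isdigit()
            and int(cusip[8]) == cusip_check_digit(cusip[:8]))
```

For example, `is_valid_cusip("037833100")` (Apple's CUSIP) returns True.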
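Spotting the abnormal traffic mentioned in item 6 doesn't strictly require AWStats. A rough sketch in Python that counts hits per URL in a common-format access log and flags pages getting far more than the average (the log format regex and the outlier threshold are assumptions, not anything AWStats itself does):

```python
import re
from collections import Counter

# Matches the request path in a common/combined access log line
# (an assumption about your web server's log format).
REQUEST_RE = re.compile(r'"(?:GET|POST) (\S+)')

def hit_counts(log_lines):
    """Count hits per requested path."""
    counts = Counter()
    for line in log_lines:
        m = REQUEST_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

def flag_outliers(counts, factor=5):
    """Flag paths with more than `factor` times the average hit count."""
    if not counts:
        return []
    average = sum(counts.values()) / len(counts)
    return [path for path, n in counts.items() if n > factor * average]
```

A page like a proxied RSS reader being abused by spammers tends to show up this way: one URL with wildly more hits than everything else on the site.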

Things I'm not so sure about

  1. Spelling
    • It makes sense to have things spelled correctly, but misspelling is a common SEO tactic. Misspelling certain words will give you rankings for the misspellings, but may not give you traffic. For example, as of this writing, I have the top page for [curoisity]. When I noticed this, I fixed the spelling on the page and left a note regarding the original misspelling. We know people misspell and mistype all the time, but I think it looks really bad to have misspellings on your site.
  2. Social bookmarks
    • I'm experimenting with adding social bookmarks on my site.
  3. Create a top 10 list
    • People seem to like to link to these sorts of articles. We'll see if this page becomes one of my top 10 wiki pages.