Anyone who uses affiliate programs to make money online should know the benefits of masking affiliate links. If you are promoting or selling multiple products, raw links like blah.blah.te.dah.com/purchase.php?productid=1&affid=bob not only look terrible but also expose your affiliate ID, opening you up to click theft and a slew of other issues. They are also much harder to insert cleanly into your content. Additionally, if you want to track statistics for your affiliate programs, masking gives you that option as well.
By masking your links you can take that unsightly three-mile-long link and shorten it to something like yoursite.com/likes/ProductName, which is not only far more user friendly but also gives you control of the anchor text while keeping your affiliate IDs hidden from prying eyes. There is, however, one downside: our favorite little bots will follow those links, thinking they are local links on your site, and in certain circumstances will even index the content on the destination pages as if it were content on your site. I highly doubt you want the search results for your site to return clones of pages for every affiliate program you have ever promoted (can you say duplicate content?).
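The masking itself is usually done with a server-side redirect. As a rough sketch (assuming an Apache server with mod_alias enabled, and reusing the example URLs above as placeholders), a rule in your .htaccess file might look like this:

```apache
# Hypothetical example: send visitors of the masked link
# to the real affiliate URL with a temporary (302) redirect.
Redirect 302 /likes/ProductName http://blah.blah.te.dah.com/purchase.php?productid=1&affid=bob
```

A 302 (rather than 301) is a common choice here so search engines treat the target as a temporary destination instead of permanently associating it with your site.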
An easy way to fix this is to simply tell the search bots where they aren't allowed to go. This can be done very easily by inserting the following lines into your robots.txt file (which every site should have anyway). You can use this format for as many areas of your site as you would like to keep bots out of. Remember, though, that not all bots will honor it, so it is not "secure", but most honorable bots, like Google's, will follow the instructions in your robots.txt file.
User-agent: *
Disallow: /likes/
This tells any bot that comes to your site not to crawl any URL whose path starts with /likes/. Can it get any easier than this? Obviously you will want to modify the rule to fit your needs. By using the Disallow directive of robots.txt you can very easily control the indexing of your site. It's smart SEO, and it also saves the bots time: they don't waste resources following and indexing things you don't want indexed, freeing them up to crawl the content on your site that you actually do want indexed.
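If you want to sanity-check a rule before relying on it, you can parse your robots.txt with Python's standard urllib.robotparser module. A minimal sketch, using the example rule and placeholder URLs from above:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from above, inlined for the test.
robots_txt = """\
User-agent: *
Disallow: /likes/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Masked affiliate links under /likes/ are blocked...
print(rp.can_fetch("*", "https://yoursite.com/likes/ProductName"))  # False
# ...while the rest of the site stays crawlable.
print(rp.can_fetch("*", "https://yoursite.com/blog/some-post"))     # True
```

In practice you would point RobotFileParser at your live robots.txt with set_url() and read(), but inlining the rules makes the check self-contained.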
Nice tip Jesse!
I implemented the affiliate masks but haven’t done the robot thing yet, good reminder :)