This is the continuation of the 20 Tips to SEO series. We’ve already covered Google penalties, image optimization, and link building. Today, let’s talk about optimizing the robots.txt file for websites and blogs.
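Before digging into optimization tips, here is a minimal, hypothetical robots.txt for illustration: it lets every crawler index the site while keeping an assumed admin area out of the index, and points crawlers at a sitemap (the paths and sitemap URL are examples, not a recommendation for any specific site):

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

The file lives at the root of the domain (e.g. example.com/robots.txt); rules in it are advisory, and well-behaved crawlers such as Googlebot honor them.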
Link: Daily Blogger
All too often, internal marketing teams or “the tech guys” at a company assume they know exactly what’s best for a website’s SEO strategy. This developer mindset favors whatever is programmatically correct: things are done the way they have always been done, and “search” is given little value because it’s dismissed as a marketing function.
For example, many developers are of the opinion that changing a site’s structure, adding new content pages, editing title tags, and so on makes no difference. WRONG!
There are some fundamental rules that can be applied universally to any website, and they simply cannot be ignored. So, without further ado, let me climb atop my mountain of SEO wisdom and expound some of these gems to developers and marketers across the globe.
SEO Doctor is made with both beginners and experienced SEOs in mind. Its scoring system and recommendations are based on official SEO documentation, namely the Google Webmaster Guidelines, the Google Image Guidelines, and the Google SEO Starter Guide, as well as my own experience.
Main features of SEO Doctor are:
* Points out potential problems and assigns a score to your pages based on currently accepted SEO methodology
* Shows link structure and helps track PageRank flow across your pages
* Detects and warns you about pages that are not indexable by search engines, using the most comprehensive checks available in any tool
* Gives one-click access to the most popular SEO tools so you can inspect a site further
* Fully customizable
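One of the checks listed above, detecting pages a search engine cannot index, can be partly reproduced with the Python standard library. The sketch below is not SEO Doctor’s implementation; it simply parses a hypothetical set of robots.txt rules and asks whether a given crawler may fetch specific URLs (the rules and URLs are made up for the example):

```python
# Sketch: check whether a crawler is allowed to fetch a URL under robots.txt
# rules. Uses only the standard library; the rules below are hypothetical.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /admin/ is disallowed for all user agents, so this URL is blocked.
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))
# Everything else falls under the Allow rule.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))
```

Note that robots.txt only covers crawling; a page can also be excluded from the index via a `noindex` meta tag or HTTP header, which a fuller indexability check would need to inspect separately.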