The Top Ten Mistakes Webmasters Make

The skill of crafting a website for success in the search engines has evolved over the years. Online businesses that have focused on branding have held steadier positions, while those that skated by on tricks have been hit hard following the Panda and Penguin updates. Some mistakes are caused by intentionally using tricks to game the search engines for better rankings; others occur simply from not understanding best practices. Here are the top ten mistakes webmasters make when constructing and promoting their websites:



10. Spam


The definition of spam has changed over the years. What is considered “white hat” one day can be quickly rendered “black” the next, upon an algorithm update. This offense takes place off the website, in places where one can easily create a link to help the website rank better (e.g., forums, Web 2.0 websites, directories). Some webmasters generate the spam themselves; others hire the least expensive SEO company and get a high volume of low quality links. In an attempt to improve the quality of the search results, Google has implemented two major Penguin updates over the last couple of years, with numerous tweaks in between.

Today, spam is commonly recognized as:

  • Forum posting – The worst spam is generated by automated bots. Spammers leave brief compliments that add no value to the threads they post in, yet drop anchor text links in their signatures.
  • Blog comment spam – Where the comment form asks for a name, the spammer enters the target keyword instead, linking it to the designated website. Many leave irrelevant comments, making it clear they never read the blog post.
  • Paid blog networks – Networks of blogs that paying members can post links on. Many of the blogs are built on dropped domains that have been refilled with content supplied by the paying member.
  • Spun articles – A single article is rewritten numerous times, either manually or by merging paragraphs from different articles, to create hundreds of versions. The search engines are now able to detect spun articles and treat the resulting links as low quality.

9. Unnatural Link Profile


An unnatural link profile may consist of a high concentration of spam, but the term “unnatural” covers a broader range of link building behavior. “Unnatural” can include the speed at which the links were built, the anchor text used in the links and the types of websites from which the links were derived. If the webmaster obtained too many of the same type of link, e.g. directory links, the search engines will frown upon the link profile.

If a website were coasting along at the rate of 20 new links per month and suddenly built 200 the following month, the search engines would be able to detect an “unnatural” link growth. To remain safe, webmasters should avoid any sudden leaps in link growth and try to slowly implement the increase over a longer period of time. Links should be obtained from a wide variety of sources, the majority from websites relevant to the target website.

One of the biggest impacts of the Penguin update fell on webmasters who had built far too many exact match anchor text links, primarily to the home page of the website. One way to check a website’s anchor text concentration is to use Moz’s Open Site Explorer: enter your website URL in the search field and open the “anchor text” tab, which lets you examine the spread across your domain or for a particular page.

A natural link profile will have its highest concentration of backlinks in the form of the domain name, e.g. Website Name, website name, www.websitename.com, websitename.com, http://www.websitename.com and http://websitename.com. Most links built by visitors to a website combine those forms with links containing no target keywords, e.g. click here, visit site, read more and strings of random words.

If you discover that the majority of the anchor text pointing to your website contains exact match keywords, you will need to remove some of those links, modify the anchor text to appear more natural or build more natural links to dilute the concentration of exact matches. If your website has recently experienced a dramatic decrease in traffic, the exact match anchor text links should be examined further to see whether they require removal or modification. A healthy link profile will have less than 10% exact match anchor text links, and that acceptable percentage is expected to shrink with further algorithm updates.
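
For a rough, do-it-yourself check of that concentration, the short sketch below tallies anchor text from a backlink export. It assumes you have downloaded your link data (for example from Open Site Explorer) as a CSV containing an “anchor_text” column; the file name, column name and keyword list are placeholders you would replace with your own.

    import csv
    from collections import Counter

    EXPORT_FILE = "anchor_text_export.csv"   # placeholder: your backlink export
    ANCHOR_COLUMN = "anchor_text"            # placeholder: column holding the anchor text
    EXACT_MATCH_KEYWORDS = {"blue widgets", "cheap blue widgets"}  # placeholder: your money keywords

    counts = Counter()
    with open(EXPORT_FILE, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            counts[row[ANCHOR_COLUMN].strip().lower()] += 1

    total = sum(counts.values())
    exact = sum(n for text, n in counts.items() if text in EXACT_MATCH_KEYWORDS)

    if total:
        print(f"Exact match anchors: {exact} of {total} ({exact / total:.1%})")
        print("Most common anchor text:")
        for text, n in counts.most_common(10):
            print(f"  {n:5d}  {text}")

If the exact match share comes out well above the 10% guideline mentioned above, that is the part of the link profile to dilute or clean up first.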

8. Black Hat Tricks


Some of the common black hat tricks that worked well on websites years ago were doorway pages, hidden text, keyword stuffing and cloaking. Today, none of these tactics work, and all of them can trigger search engine penalties.

  • Doorway pages may be listed in the search engines and clicked on by the visitor, but they send the visitor on to a different page, sometimes via a meta refresh.
  • Cloaking serves one version of a page to the search engines and a different version to visitors, usually through a server-side script.
  • Hidden text can be buried inside JavaScript code or rendered in the same color as the page background, so the visitor never sees it.
  • Keyword stuffing – cramming too many keywords into the text of a page, and sometimes into the meta tags as well – has been cracked down on by the Penguin update.

Sometimes webmasters unintentionally harm their websites by using a meta refresh to send visitors to the right page after content has been moved. This is the wrong approach: the old page should be redirected using a 301 redirect, with the code added to the .htaccess file.
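
On an Apache server, that code can be as small as one line in the .htaccess file; the paths and domain below are placeholders, and the directive assumes the mod_alias module is available.

    # Permanently (301) redirect the moved page to its new location
    Redirect 301 /old-page.html http://www.example.com/new-page.html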

7. Poor or Low Quality Content


Prior to the Panda update, the word count on an individual page did not factor into its ability to rank well in the search engines. Today, a “high quality” page is typically considered one with a minimum of 300 words, plus proper formatting, images, videos and the like to make it visually appealing and informative to the website visitor.

6. Duplicate Content


Many times, the webmaster does not realize his website contains duplicate content. Some content management systems generate multiple URLs for the same page automatically. Some duplication is caused by creating multiple pages for the same product to account for different colors and configurations. Another example is a page that offers search filter options and generates a different URL for each filter result. Many of these instances are easily fixed by applying a rel="canonical" tag on the duplicate page – code that tells the search engines to credit the page’s content back to the page referenced in the tag. When there are multiple copies of a page, links can be built to all of them, splitting up the link juice that should flow to one version of the page.
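
The canonical tag itself is a single line placed in the head section of each duplicate page; the URL below is a placeholder for the preferred version of the page.

    <link rel="canonical" href="http://www.example.com/widgets/blue-widget/">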

But the worst offenses are deliberate. If a webmaster steals another website’s content and places it on his own website, he is scraping content and often violating copyright. The search engines can determine which website is the original source of the content by the page's date stamp on the server. The scraper site may be penalized by Google, especially in cases where the problem occurs throughout the website.

5. Copying Competitors


Many webmasters get caught up in what the Smiths next door have. If the webmaster is so focused on how his competitor ranks that he copies everything his competitor does, he will always be chasing his competitor’s tail to keep up with him in the rankings. The greater concern is that if one sees a competitor getting away with a technique that violates Google’s Webmaster Guidelines and copies that technique, eventually the search engines will catch on and both the competitor and the website mimicking the technique will suffer. When webmasters copy what works, they also copy the competitor’s mistakes.

4. Lack of Trustworthiness


Trustworthiness is an especially important factor if your website is an e-commerce store. Today, shoppers are wary of websites they’ve never ordered from before. If your contact page is just a form with no address or phone number, customers will worry about getting a problem resolved if you do not respond to their emails. If you do not respond to customers in a timely manner, you will quickly develop a bad reputation. Few shoppers write glowing reviews – far more are apt to write poor reviews following bad experiences.

Trust is also demonstrated by presenting a safe browsing experience through secured web pages, preventing the customer’s personal and credit card information from being stolen. Displaying a logo from a recognized SSL certificate authority such as VeriSign gives shoppers assurance that their information will remain secure. This, in turn, can translate to a higher ROI.

3. Poorly Planned Redesigns


Often, webmasters redesign their websites to incorporate new features, cleaner navigation and a more attractive design. A lot can go wrong in the process, such as failing to redirect all of the old URLs to the new ones, removing too much content or only temporarily redirecting pages through 302 redirects. Old and new pages should be carefully mapped out to prevent the loss of pages and rankings when the new design is launched.
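
As a sketch of what that mapping can look like on an Apache server, the .htaccess lines below combine one-to-one 301 redirects for individual pages with a pattern-based rule for a restructured section; every path and pattern here is a placeholder.

    # One-to-one mappings for individual moved pages
    Redirect 301 /about-us.html /about/
    Redirect 301 /contact.html /contact/
    # Pattern-based mapping for a whole restructured section (mod_alias RedirectMatch)
    RedirectMatch 301 ^/blog/[0-9]{4}/(.*)$ /articles/$1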

2. Misinformation


There is no shortage of webmasters whose main purpose is to find high-CPC (cost-per-click) domains to monetize. They purchase a domain, hire someone to write the content and fill the pages with affiliate links. In cases where the writer is unfamiliar with the niche, or the webmaster fills the pages with uninformative text optimized for certain keywords, the information can be boring at best and completely wrong at worst. Poor information is certain to chase website visitors away, creating a high bounce rate and possibly earning the webmaster a bad reputation.

1. Outdated Content


While many businesses have caught on to the benefits of an online presence, they treat their websites as though they were billboards along the highway. Page freshness is now a ranking factor, and websites that sit idle for years have little chance of doing well in the search engines. Websites should be updated regularly with new content. One way to accomplish this is to include a blog that adds new posts on a regular basis.

It is much more time consuming and costly to go back and fix on-site issues than to build a website properly from the outset. Ideally, if you work with a programmer who is familiar with SEO best practices, the code will be cleaner, the URLs search-engine friendly and content duplication properly handled through the robots.txt file. When a website is updated with new content or new features, the webmaster should research how those features may affect the website's performance and how to avoid any issues that could arise as a result. Any off-site work performed by a third party should adhere to Google's Webmaster Guidelines, the goal being to build strong links from high authority, related websites at a steady pace. Beware of keyword ranking promises that sound too good to be true, for those are the promises most likely to end in a penalty.