Filing a reconsideration request can fix a website that was hit by the Penguin algorithm.
Reconsideration requests are to be used for getting a website out of a manual penalty. The Penguin algorithm works on its own, and even if someone at Google decides that your website no longer deserves to be downgraded by Penguin, 99.99% of the time there is nothing they can or will do to help you.
You need a robots.txt file in order for your website to be crawled by Google
When there is no robots.txt file telling the bots NOT to crawl the content, it is assumed that they may crawl, and they will do so. A robots.txt file is not required for crawling to happen.
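To illustrate the point (the file below is a hypothetical example, not from the article), robots.txt exists to restrict crawling, not to enable it. A fully permissive file like this is equivalent to having no file at all:

```
# Permissive robots.txt — an empty Disallow means "crawl everything",
# which is exactly what happens when no robots.txt exists at all.
User-agent: *
Disallow:

# By contrast, this would block compliant bots from the whole site:
# User-agent: *
# Disallow: /
```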
Adding links to a disavow list sends Google negative signals about that website
This myth was recently busted by John Mueller (Webmaster Trends Analyst at Google Switzerland) in a Google+ hangout. Being added to a disavow list does not negatively affect the website that was added, and it does not count as any kind of negative vote against that domain.
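For context, the disavow file uploaded to Google is just a plain-text list of URLs and domains, one per line (the domains below are hypothetical examples):

```
# Lines starting with # are comments.
# Disavow a single page:
http://spam-example.com/bad-links.html
# Disavow an entire domain and its subdomains:
domain:spam-example.com
```

Appearing in such a file tells Google which links *you* want ignored when assessing your own site; it says nothing negative about the listed domains themselves.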
Using strong tags instead of bold tags, and emphasis tags instead of italic tags, is better for SEO
Matt Cutts has stated multiple times that these tags are treated equally by Google’s algorithms.
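In other words, as far as ranking is concerned, these two markups are interchangeable (the sentence below is an invented example):

```html
<!-- Both lines are treated identically by Google's algorithms -->
<p>This phrase is <strong>important</strong> and so is <em>this one</em>.</p>
<p>This phrase is <b>important</b> and so is <i>this one</i>.</p>
```

There may be other reasons to prefer one pair over the other, but ranking is not one of them.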
Websites can be penalized for having duplicate content
There is no penalty for duplicate content. Duplicated content is generally just ignored. When your website's content has been duplicated by someone else on a different domain, Google will do its best to figure out who the original author was and will likely choose to display that website's content. This doesn't always mean that the first website to have the content indexed will be considered the original source, and it doesn't mean that only the original source will automatically be shown. It is possible for this to leave a webmaster with the short end of the stick, but it is certainly not a penalty against a website.
Disavowing all of the bad links to a website that is under manual penalty will get the penalty removed
Almost always, more work than this will be needed to get out of a manual penalty for having bad links to your website. Reconsideration requests are reviewed by real people, and they are specifically looking to see what kind of link removal efforts were made to get the website out of its penalty.
Foreign/vanity TLDs are treated equally by Google
Here’s the lowdown on vanity TLDs: they are generally treated much differently than TLDs native to a searcher’s country, but with exceptions, and not always in a consistent way. A vanity TLD is actually designated for a specific country, so Google will usually assume that content on domains with that TLD is most relevant to users within the country the TLD is designated for.
For example, domains using the .ca (Canada) TLD will be shown much more often to users in Canada than anywhere else in the world.
However, some TLDs like .co (Colombia) are being used often now by American companies as vanity domains, so Google is choosing to treat these a bit more like a .com domain which is shown much more often to American users.
Lesson: vanity domains are probably going to be really bad from an SEO standpoint, but a few of them, like .co, can still be viable options.
Asking someone to link to your website is against webmaster guidelines
This is allowed. The important factor is that it is the editor’s choice to link to you, and it is their choice how they link to you (which anchor text, destination URL, etc.). When this is their choice, it is an editorial link.
This walks a fine line regarding what is editorial and what is not. Carefully stated: if you are giving them the code to put onto their website (with the anchor text and destination URL already chosen by you), that would usually be considered questionable or manipulative. Simply asking for a link does not necessarily violate the Webmaster Guidelines.
All New Websites Go Through an SEO Sandbox Period
A common experience for owners of new websites is to see them get indexed, obtain their rankings in the search engines for a period of a few days or weeks, and then the website will be dropped far down in the SERPs for a period of months. This has traditionally been known as the sandbox effect. Not all websites go through this.
The only way to get out of a manual penalty is to file a reconsideration request
While for most practical purposes this should be considered the proper course of action, it’s not the only way out of a penalty. Manual penalties do expire. There is no set period of time known to the SEO community, but Google has indicated that this will usually happen after a period of time measured in years, not months.
Meta descriptions and meta keywords should always utilize the primary keywords that you want to rank for
Neither of these is used as a ranking factor by Google, according to John Mueller, and I have known this to be the case for years now. This doesn’t mean they are not used for anything, but stuffing them with keywords is not going to help you rank any higher.
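For reference, these are the two tags in question (the wording and content here are hypothetical examples):

```html
<head>
  <!-- Meta keywords: ignored by Google for ranking purposes -->
  <meta name="keywords" content="seo, seo tips, seo myths">
  <!-- Meta description: not a ranking factor, but may be shown
       as the snippet under your listing in the search results -->
  <meta name="description" content="Common SEO myths, busted one by one.">
</head>
```

So the description still matters for click-through, just not for where you rank.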
In some cases here at seo4anyone, we have stopped filling out the meta descriptions altogether for certain pieces of content. This increases the chances that users will be shown a snippet that is directly relevant to what they searched for, rather than a meta description showing the majority of users what YOU want to show them (which is not always the thing most relevant to their query and most likely to earn a click-through).
Just a few weeks ago, Matt Cutts revealed that he had begun doing the same thing on his own blog.