The Seven Deadly Sins of SEO: #7 “Avoid Black Hat Techniques”

July 17, 2013
Filed under SEO For Advanced

They appear every so often on internet marketing forums: people claiming to have discovered a foolproof “black hat” search engine optimization technique. Their technique, available for a price, will supposedly propel your website to the top of the search engine listings – and of course they guarantee you’ll never get caught.

Now, think about it. While we’d all like to believe there are methods that can get us to number one in Google with no effort whatsoever, it just isn’t true. Google is huge, and it’s smart. There’s no denying that those employing “black hat” techniques (a phrase describing methods that violate the terms of service of Google or other search engines) may experience success at first, but it won’t be long term. Not ever. In fact, they’ll be lucky if it works for a few days.

Let’s say these people, these forum peddlers, really had discovered a flawless technique to guarantee themselves top positions in search engine results. Do you think they’d be selling their method for a couple of bucks on forums? No, of course not. If their method really worked, they’d be creating small affiliate websites in every profitable niche, working their black hat SEO magic and sitting back to watch the profits roll in. Furthermore, the more they publicize their method, the more likely it is that Google will discover it – so why would they risk it?

They wouldn’t, because these methods don’t exist. Avoid them. Don’t waste money – on buying the method, or on building it into a website – on something that is doomed to fail.

The Seven Deadly Sins of SEO: #6 “Title Stacking”

July 7, 2013
Filed under SEO For Advanced

When it comes to search engine optimization, one of the most useful tools in a web developer’s arsenal is the <title> tag in the HTML code. Unlike articles, which must be built around keywords (never an easy task), the <title> tag is a section of code you can pack with your keywords – without having to worry about context, readability, and all the other things an article needs. The extra bonus is that your main page can have a <title> tag full of keywords, when keywords are often hard to work into a simple “welcome to this website” page.

The usefulness of the <title> tag is also one of its major problems. The tag is so powerful, so influential, and so easy to use that those employing shady black hat SEO techniques quickly learned to manipulate it. They discovered that by using more than one set of <title> and </title> tags in the HTML code of a web page, they could fit in many more keywords – and thus rise up the search rankings. Using multiple sets of <title> tags is, understandably, known as title stacking.

Get caught doing it by search engines, and you’ll be dropped from the search results quicker than you can say ‘jackrabbit’. It might work for a while, but the overall quality and reliability of your site will soon be called into question – because you will get caught. Use one set of <title> tags only, and keep the keywords relevant to your site.
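As a quick illustration, here is a sketch of how easily a stacked page gives itself away, using Python’s standard html.parser. The HTML strings and keyword choices are made-up examples; a real audit would run this over your actual page source.

```python
from html.parser import HTMLParser

class TitleCounter(HTMLParser):
    """Counts <title> opening tags so stacked titles can be flagged."""
    def __init__(self):
        super().__init__()
        self.title_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.title_count += 1

def count_titles(html: str) -> int:
    parser = TitleCounter()
    parser.feed(html)
    return parser.title_count

# A clean page: one title, with relevant keywords.
good = "<html><head><title>Window Cleaning Tips</title></head><body></body></html>"

# A stacked page: multiple titles crammed with keywords.
bad = ("<html><head>"
       "<title>cheap windows</title>"
       "<title>window cleaning</title>"
       "<title>best window cleaner</title>"
       "</head><body></body></html>")

print(count_titles(good))  # 1
print(count_titles(bad))   # 3
```

Anything greater than one is a red flag – for you, and for a search engine bot parsing the same markup.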

The Seven Deadly Sins of SEO: #5 “Hosting Viruses, Malware or Other Nasties”

June 29, 2013
Filed under SEO For Advanced

This may seem obvious; no search engine is going to rank you well in its results if its bots discover that there are spyware, malware, viruses or any other kind of internet nasties contained within your website. In fact, if a bot does discover such content, your site will most likely be removed and blacklisted for good.

So that’s simple – and most of you won’t even be considering hosting that kind of content anyway, so there’s nothing to worry about, right? Not necessarily. Many sites are subject to hacking, which can leave them infected with the nasties that search engines (and internet users in general, for that matter) hate so much. Even sites with thoroughly strong security can be hacked and infected without the owner’s knowledge. So you could be merrily promoting your site, working on its content and ensuring your SEO is tip-top, unaware that your site is infected and only a few steps away from being blacklisted forevermore.

There are a few things you can do to prevent this. The first is obvious, but crucial: visit your site regularly with your anti-virus software running, and check that everything seems okay. Secondly, you can get a good idea of what other people think of your site by installing a Firefox add-on called “Web of Trust”. This displays a ring in one of three colors near the browser menu for each website: green means the website is ‘safe’, orange means ‘doubtful’ and red means ‘avoid this site’. These ratings are user-generated, so installing the add-on lets you check that no one is experiencing problems with your site.

The Seven Deadly Sins of SEO: #4 “Linking To Bad Sites”

June 22, 2013
Filed under SEO For Advanced

Have you ever heard the phrase ‘falling in with a bad crowd’? Well, if you link to websites that search engines consider ‘bad’, that’s the search engine optimization equivalent of falling in with a bad crowd. While your website may not be intrinsically ‘bad’ in itself, if you promote (by linking) sites that violate the terms and conditions of major search engines, you’ll be tarred with the same brush. While it’s unlikely your site will be completely blacklisted, you may see a sharp fall in ranking position – or be removed from the search rankings altogether.

This, of course, raises the question: how do I know what a ‘bad’ site is? After all, if someone links to you, you’re probably going to want to do the decent thing and return the favor. That’s what so much of website building, networking and promotion is all about – right? So how can you be sure you’re not destroying your own search engine chances by linking back to a site that search engines consider bad?

It’s tricky, but the basic answer is to use your gut. How does the website look? Does it look professionally designed and properly maintained? Is the content unique, or does it all sound familiar? Is the English terribly written?

On a more technical basis, you can check the PageRank of the site, and also its standing with Alexa. This should give you a good understanding of the general standing of the website in question, and whether or not it’s the kind of crowd you want to be associating with. Also familiarize yourself with the Google terms of service, and scan the site for any obvious violations. If it passes, feel free to post a link back.

The Seven Deadly Sins of SEO: #3 “Duplicate Content”

June 15, 2013
Filed under SEO For Advanced

Among those well versed in internet marketing, duplicate content is something of a sticky issue. The heart of the problem is what constitutes duplicate content: some internet marketers insist that anything previously published on any other website qualifies, while others say it only matters when the same text is repeated within the same website.

The exact definition is unclear, and the search engines are not particularly forthcoming on the issue. However, if you are found to be using duplicate content on your website and a search engine does take issue with it, you can kiss goodbye to a good ranking with that search engine.

It is more likely – though not certain – that the duplicate content rule applies to text reused within the same site. You should not, for example, create lots of pages all using the same article with no changes. This is the lesser version of duplicate content, though some marketers still insist that using the same article or text from anywhere on the internet will trigger a duplicate content penalty.

The idea, of course, is to avoid plagiarism and for search engines to avoid publishing results that show the same text over and over again. To be absolutely sure you’re not committing the duplicate content sin, always write and use original content, both within your website and externally. That way, you can be sure – no matter who is right and wrong in the debate – that you aren’t going to be penalized for it.
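If you want a quick sanity check that you haven’t accidentally reused the same text on several of your own pages, a crude approach is to fingerprint each page body and look for collisions. This is only a sketch: the page paths and text are invented, and real search engines use far fuzzier matching than an exact hash.

```python
import hashlib

def content_fingerprint(text: str) -> str:
    """Hash of the text with whitespace and case normalized,
    so trivially reformatted copies still collide."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical page bodies from the same site.
pages = {
    "/home":    "Welcome to Ace Window Cleaning.",
    "/about":   "welcome   to ace window cleaning.",   # same text, reformatted
    "/contact": "Call us on the number below.",
}

seen = {}
for url, body in pages.items():
    fp = content_fingerprint(body)
    if fp in seen:
        print(f"{url} duplicates {seen[fp]}")  # flags /about as a copy of /home
    else:
        seen[fp] = url
```

A collision here only proves exact duplication after normalization; genuinely rewritten, original content will never trip it – which is exactly the safe territory the advice above recommends.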

The Seven Deadly Sins of SEO: #2 “Cloaking”

June 8, 2013
Filed under SEO For Advanced

All the major search engines compete to make their search results as relevant, up to date and informative as possible. For a search engine to be considered effective, and therefore gain users, it relies on its reputation for providing the right information for any given search term.

They’re right to guard that reputation. Imagine you were looking for some tips on how to clean your windows, and you used a search engine you’re unfamiliar with. If you visited a site through this new search engine and it brought you to an adult website, you wouldn’t be too happy, would you? In fact, you’d probably dismiss the search engine as useless, and wouldn’t bother to use it again.

That’s why search engines take an issue known as ‘cloaking’ so very seriously. Since their livelihoods depend on search results being accurate and informative, search engines have a duty to their own business ethics – as well as their users – to frown upon cloaking, and they do. Do it, and your website will be removed from search results and most likely blacklisted.

So what is cloaking? Cloaking is the practice of serving human visitors to your website something very different from what a search engine bot crawling your website sees. If you cloak effectively, you could indeed disguise your adult site as something as harmless as window cleaning – and you’d benefit from a good SEO ranking. You’d also, unfortunately, ruin the search engine’s results – and they can’t be having that. When it comes to cloaking: avoid it.
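To make the mechanism concrete (not as a recommendation!), here is a minimal sketch of user-agent cloaking. The crawler signatures and page texts are illustrative assumptions; real cloaking also keys off IP ranges, and real search engines counter it by crawling with ordinary browser user agents.

```python
# Substrings that commonly identify search engine crawlers (illustrative list).
CRAWLER_SIGNATURES = ("googlebot", "bingbot", "slurp")

def is_crawler(user_agent: str) -> bool:
    """Guess whether a request comes from a search engine bot."""
    ua = user_agent.lower()
    return any(sig in ua for sig in CRAWLER_SIGNATURES)

def serve_page(user_agent: str) -> str:
    """Return different content depending on who is asking - the cloak."""
    if is_crawler(user_agent):
        # The bot sees innocent, keyword-rich content...
        return "Top 10 tips for cleaning your windows"
    # ...while human visitors see something entirely different.
    return "Adult content the crawler never sees"

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0) Firefox/115.0"))
```

The two print calls return different strings for the same page – precisely the mismatch between bot view and human view that gets sites blacklisted.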

The Seven Deadly Sins of SEO: #1 “Hidden Text”

June 1, 2013
Filed under SEO For Advanced

Anyone with a basic understanding of search engine optimization will know that text on a website plays a large part in how you are ranked in search engines. In fact, it could be argued that the textual content of a website is actually the most important thing for search engines.

It’s therefore natural for the cunning mind to wonder if it’s possible to introduce sections of ‘hidden text’. Imagine you’re not the best writer in the world, and you don’t want to spend a lot of money outsourcing content creation. Yet at the same time, you’re aware of the importance that search engines place on textual content. So rather than writing poor articles yourself, trying to jam your keywords in, you can simply write the keywords into a spare section of your website – and then change the font color so it is the same, or virtually the same, as the background of the page. Suddenly, your website is stuffed with keywords, but without having to publish poor articles or ruin the look and feel of your website in general.

This practice goes by a variety of names, including font matching and keyword stuffing. However, whatever you call it, it’s a bad deal.

Why? Well, the reason is obvious – it’s a cheat. Google, and the other major search engines, place importance on text content because they want their search results to be relevant. Hidden text defeats the point of this, and if you’re caught doing it, you will have your website banned from the search engine – for good.
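And getting caught is easy, because the trick is mechanically detectable. Here is a deliberately naive sketch of spotting font-matched text: it only handles inline styles and six-digit hex colors, and the page markup is invented; real crawlers do far more (computed CSS, off-screen positioning, tiny fonts).

```python
import re

# Hypothetical page: white background, with one paragraph of white-on-white keywords.
PAGE = """
<body style="background-color:#ffffff">
  <p>Genuine content about window cleaning.</p>
  <p style="color:#ffffff">cheap windows best cleaner buy now</p>
</body>
"""

def find_hidden_text(html: str) -> list:
    """Return paragraph texts whose inline color matches the page background."""
    bg = re.search(r'background-color:\s*(#[0-9a-fA-F]{6})', html)
    if not bg:
        return []
    hidden = []
    for m in re.finditer(r'<p style="color:\s*(#[0-9a-fA-F]{6})">([^<]+)</p>', html):
        if m.group(1).lower() == bg.group(1).lower():
            hidden.append(m.group(2))
    return hidden

print(find_hidden_text(PAGE))  # ['cheap windows best cleaner buy now']
```

If a thirty-line script can flag the paragraph, a search engine certainly can.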

Why Is SEO Important?

May 26, 2013
Filed under SEO For Advanced

Think back to the days before the internet, when everyone looked for business listings in the Yellow Pages or a similar hard copy directory. Now, imagine you’re trying to find a plumber. You go to the “P” page, flick through, and only see three plumbers listed.

It’d be a massive advantage for the three plumbers, wouldn’t it? To effectively be all that potential customers see. Well, that’s the power of SEO.

When someone uses a search engine, they type in a set of words to bring up results that are relevant to them. The search engine ranks the applicable websites in terms of ‘relevancy’. Any website which has correctly used search engine optimization will be judged by Google to be ‘relevant’. So if you have correctly used SEO, your website will appear somewhere near the top of the search engine results page – hopefully in the top three.

Why is that so important? Well, studies have shown that the vast majority of those using search engines only click on websites listed in the top three results of any search engine results page. This is where it compares to being one of only three businesses listed in an old-style paper directory like the Yellow Pages. Master SEO, and your website will appear high in search results, in the slots (one to three) that users actually click. That naturally leads to more people visiting your website, and that in turn means more business, more customers, and ultimately more money.

What Is Robots.txt?

May 19, 2013
Filed under SEO For Advanced

For a search engine to keep its listings up to date, and present the most accurate search results, it performs an action known as a ‘crawl’. This essentially means sending a ‘bot’ (sometimes known as a ‘spider’) out to crawl the internet. The bot finds new pages, updated pages, and pages it did not previously know existed. The end result of the crawl is that the search engine’s index is updated to include all of the pages found on the last crawl. It’s simply a method of finding sites on the internet.

However, there may be some instances where you have a website page you do not want included in search engine results. For example, you may be in the process of building a page, and do not want it listed in search engine results until it is completed. In these instances, you need to use a file known as robots.txt to tell a search engine bot to ignore your chosen pages within your website.

Robots.txt is basically a way of telling a search engine “don’t come in here, please”. When a bot finds a robots.txt file, it ‘reads’ it and duly ignores all the URLs disallowed within. Pages listed in the file therefore do not appear in search results. It isn’t failsafe; robots.txt is a request for bots to ignore the pages, rather than a complete block, but most bots will obey the information found within the file. When you are ready for a page to be included in a search engine, you simply modify your robots.txt file and remove the URL of the designated page.
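A robots.txt file for the unfinished-page scenario above might look like the two-line example below; the file path and domain are made up. Python’s standard library even ships a parser, urllib.robotparser, which lets you check how a well-behaved bot will interpret your rules before you deploy them:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt asking all bots to skip an unfinished page.
ROBOTS_TXT = """\
User-agent: *
Disallow: /under-construction.html
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Finished pages remain crawlable; the disallowed page is not.
print(parser.can_fetch("*", "http://example.com/index.html"))               # True
print(parser.can_fetch("*", "http://example.com/under-construction.html"))  # False
```

When the page is finished, delete the Disallow line and the next crawl can pick it up – remembering, as above, that this is a polite request rather than an access control.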

Should You Outsource Your SEO Work?

May 14, 2013
Filed under SEO For Advanced

Search engine optimization is one of those odd, new Millennium techniques that has arrived right at the time when the average user can become an expert. Yes, you need to study SEO, but by and large the mechanics of it can be self-taught using internet forums and help guides. Yet there are companies that offer to perform SEO work for other people, and they manage to stay in business – how do they do that when it’s a skill that most people can learn? Maybe it isn’t all so easy after all…

People who are new to SEO may quickly consider themselves experts. Search Twitter and you will find a thousand profiles cheerfully insisting that the person running the account is an SEO expert. “SEO expert” is not (like doctor, or dietitian) a legally protected term, so anyone can claim the title. And many users may genuinely feel they are experts, with nothing to gain from an SEO company. That’s the problem with new technologies.

If you are looking to launch a website to sell a product or service, you’re probably looking around the internet to see what you need to know. Eventually, the term SEO – and its importance – will crop up, along with the help guides telling you how to do it yourself. Doing it yourself is a lot cheaper than outsourcing to a company, so why should you bother?

Put simply, no one can be an expert in a short period of time. The people who run SEO service companies really are experts, who have studied the art of SEO for long periods of time. You will always, unless you can spare several months to learn it all yourself, get better results with them.
