Google algorithm updates arrive only once in a while, and for the last two years they have been few and far between. The latest update, Panda, has affected a lot of people, especially those who do not have quality content on their websites. After all, it was designed to weed out spammy sites from those that produce genuinely helpful content.
The Panda update has changed the SEO landscape considerably. Website owners are no longer rushing to produce just any type of content; they are now concentrating on quality instead of quantity. SEO, or search engine optimisation, now has to account for content quality, user experience, design and usage metrics.
The good news is that website owners and writers need not be in the dark, because Google has released a few clues that give insight into how its engineers think and work, so everyone can create SEO-friendly posts. However, content already on your site must be brought in line with the algorithm's rules so that you are not hurt by Panda.
Duplicate content is one of the most frowned-upon issues in SEO. Internal duplicate content, however, is an entirely different thing from plagiarised content: it happens when you create unique URLs for the same piece of content, meaning two URLs point to the same page. You may be penalised if Google discovers this, because it looks as if you are trying to game the algorithm, albeit unintentionally, and Google does not take kindly to that.
To solve this problem, eliminate the duplicate content or consolidate it under a single URL. What Google wants to see on sites is fresh, engaging content.
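One common way to consolidate duplicate URLs, sketched here with hypothetical addresses, is a rel=canonical tag in the page head, so every variant URL tells search engines which single version of the page is the preferred one:

```html
<!-- Placed in the <head> of every duplicate or variant URL,
     e.g. https://www.example.com/shoes?sort=price -->
<!-- Points crawlers at the one preferred version of the page -->
<link rel="canonical" href="https://www.example.com/shoes" />
```

With the tag in place, ranking signals for the variant URLs are generally consolidated onto the canonical address rather than split between duplicates.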
Internal near-duplicates are pages whose content is changed only slightly, a few strategic keywords and headers swapped, to generate another article. The content is the same across all pages save for those keywords. This usually happens on large e-commerce sites where products vary only in colour and size.
Large websites tend to index numerous pages of internal search results, and this is something Google does not like. Why? Because Google does not want to deliver another search page to the user; it only wants to deliver useful, relevant information.
Although this might not affect small or medium-sized sites, everyone should still know the fix: block these pages from being crawled or mark them noindex. This can be time-consuming, but it needs to be done to avoid problems.
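As a sketch, assuming your internal search results live under a hypothetical /search path, there are two standard alternatives: disallow crawling in robots.txt, or let the pages be crawled but keep them out of the index with a robots meta tag:

```
# robots.txt – stop crawlers from fetching internal search result pages
User-agent: *
Disallow: /search
```

```html
<!-- Or, on each search-results page, allow crawling
     but keep the page out of the search index -->
<meta name="robots" content="noindex, follow" />
```

Note that these are alternatives, not a pair: if robots.txt blocks the page, crawlers never fetch it and so never see the noindex tag.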
Low copy density, heavy site design
If you put too much focus on design and neglect the copy, Google may penalise you. The solution? Write detailed, original content and tone down the design. Simplicity is king, so determine what your site can live without and get rid of it.