Joining link exchange systems
This is a fairly old tactic, and it no longer works. Search engines want links to be essentially 'natural': citations that a page earns because it provides useful information and tools. Link exchanges, by contrast, produce unnatural link patterns that are very easily detected.
Do not waste time joining link exchanges to build crude reciprocal link networks. Link building itself remains very important, because pages that fit well into the Web's link graph are useful to users. Build links to pages that cover the same topic and genuinely help your visitors. Better still, pages on the same topic may link to your website without requiring a link back.
Duplicate content
Duplicate content typically arises in two ways:
- Many webmasters deliberately create doorway pages: websites with similar, or even identical, content to the original page, presented in different variations to promote the company's products or services.
- Sometimes, within the same website, identical content appears at several different URLs. For example, the same blog post can be reached through its article link, its category page, the archive, the RSS feed, and the homepage.
The problem with duplicate content is that Google wants to offer searchers a diverse choice of content, so it picks only a single page out of a set of duplicates. Duplicate content therefore wastes the search engine's crawling time and your Web server's bandwidth, and sometimes the version shown in the search results is not the one you want users to reach.
What can you do to avoid duplicate content? Refer to the article on duplicate content above and look for ways to reduce it. In addition, there are tools that let you indicate which version should be indexed while excluding the extra versions.
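One common tool for this is the rel="canonical" link element. As a minimal sketch (the URL below is a hypothetical example, not something to copy as-is), you place it in the head of each duplicate version and point it at the version you want indexed:

```html
<!-- In the <head> of a duplicate page, e.g. a category or archive view.
     The href is a hypothetical example: point it at the version you
     actually want indexed. -->
<link rel="canonical" href="https://example.com/blog/seo-tips/">
```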
Using Session IDs in URLs
Before going into detail: if you are not yet familiar with the basic components of a URL, please refer to the article on the basic components of URLs and on static versus dynamic Web pages.
Google indexes Web pages continuously. How often Googlebot visits depends on the site's ranking and on how frequently its content is updated; earning a high ranking takes long-term persistence. In addition, Google and other search engines favor static-looking pages: the parameters appended to the end of a URL are treated by the search engine as part of that URL.
If your dynamic pages carry a session ID parameter, the crawler can easily fall into an endless loop while indexing your site, because each visit is assigned a new session ID and Googlebot treats every such URL as a new page. Session IDs thus create exactly the kind of duplicate content described above: Google spends a lot of time indexing useless variants, you spend extra bandwidth serving them, and your ranking suffers.
Although Google's algorithms have become much better at handling session IDs, you should store the session in a cookie rather than passing it as a parameter in the URL (a small configuration sketch follows). Remember that only about 2% of users have cookies disabled.
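If your site runs PHP on Apache with mod_php, one way to keep the session ID in a cookie only is the .htaccess sketch below; this is an illustration under those assumptions, not a universal recipe (on other setups the same settings belong in php.ini):

```apacheconf
# Store the session ID in a cookie only, never in the URL
php_flag session.use_only_cookies on
# Do not rewrite links to carry the session ID
php_flag session.use_trans_sid off
```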
You should also try to create friendly URL paths (with keywords in the URL), for example by rewriting URLs with mod_rewrite in an .htaccess file, or by configuring Permalinks in WordPress; a rewrite sketch is shown below.
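As an illustration of the mod_rewrite approach (the script name article.php and the slug parameter are hypothetical), an .htaccess file on Apache could map a keyword-friendly path onto the underlying dynamic URL:

```apacheconf
RewriteEngine On
# Map /articles/some-keyword-slug to the dynamic script behind it
# (article.php and "slug" are hypothetical names for this sketch)
RewriteRule ^articles/([a-z0-9-]+)/?$ article.php?slug=$1 [L,QSA]
```

With such a rule, visitors and crawlers see /articles/seo-tips while the server still serves the same dynamic page.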
Websites built with Flash
On the technical side, a Web page built entirely in Flash can be eye-catching, but it is certainly hard for it to rank highly in search engines. As discussed in the article on SEO for Flash websites on Google, even though search engines can read and index Flash, a Flash-only site rarely ranks well for hot, highly competitive keywords. One simple reason is that Google likes text; and once a page presents a lot of text, Flash is reduced to providing visual effects.
Using too much JavaScript
JavaScript can be very effective in website design. The problem is that Google has trouble understanding JavaScript source code. Google has made, and will keep making, progress here, but content that relies heavily on JavaScript still communicates poorly with search engines.
For optimization, SEOs usually keep JavaScript separate: where it is needed, include it from an external file, or use CSS instead of scripts in the head or body of the page. This helps the crawler understand the main content of the page and index it easily, so everyone benefits; a short sketch follows.
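As a minimal sketch of that separation (the file names styles.css and menu.js are hypothetical), the markup stays plain, indexable text while behavior and presentation are loaded from external files:

```html
<head>
  <title>SEO tips for webmasters</title>
  <!-- presentation and behavior live in separate files (hypothetical names) -->
  <link rel="stylesheet" href="/css/styles.css">
  <script src="/js/menu.js" defer></script>
</head>
<body>
  <!-- the main content remains plain text that crawlers can read -->
  <h1>SEO tips for webmasters</h1>
  <p>Article text goes here...</p>
</body>
```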
Cloaking technique
Cloaking is a 'black hat' SEO technique that serves search bots different content from what regular users see. It is another old trick that many spammers used in previous years.
Search engines today detect this fraud easily, for example by regularly sending out new crawlers that do not identify themselves, precisely in order to uncover cloaking. There are many cloaking variants for tricking search bots, more than can be listed within the scope of this article, but they are all discovered sooner or later. This is a 'black hat' SEO trick to avoid.
When cloaking is discovered, the offending website is banned. So do not use this technique; solve the problem with other methods instead.
Conclusion: SEO tips
From the analysis above, vietSEO summarizes two main points that webmasters and SEOs need to pay attention to when applying SEO tips:
- Learn how search engines work so you can help them understand the content of your website. The issues explored above all have one thing in common: they make it difficult for search engines to index and interpret Web content. So build a website that interacts well with search engines and provides them with unique content.
- Don't waste time trying to fool search engines. Their algorithms are smart enough to detect tricks, not to mention the human effort spent fighting spam. Even if you slip past a search engine, it is only temporary, and the price to pay once you are caught will be far higher. Fooling search engines is not a long-term strategy. Invest your time, energy and money in useful content, tools and the other promotion you would do even if search engines did not exist.