
Common SEO Issues and How to Solve Them

This post outlines some of the most common problems we encounter when doing site audits, along with a few less common ones at the end. Hopefully the solutions here will help you when you run into these problems in your digital marketing (web SEO).


  1. URL canonicalization / multiple versions of the homepage

     URL canonicalization is the process of ensuring that a single, consistent ("canonical") URL is presented to search engines for each page, to avoid the appearance of duplicate versions of the same content on your website. The problem with having multiple URLs for one page is that link equity is not consolidated between them. And if search engines come to believe that you have a ton of duplicate content on your site, they may consider the site to be low quality.

    Here are three common canonicalization issues we see:
    • The homepage is reachable at a variety of URLs, such as example.com, example.com/home, example.com/english/home, example.com/home.php
    • The entire website works with or without the www subdomain: example.com and www.example.com both serve the homepage without any redirect.
    • URLs work with or without a trailing slash: example.com/products and example.com/products/

    The reason this is such a common problem is that it is very subtle. Website designers typically focus on the user experience, and URLs play only a minor role in that regard. In some cases, however, canonicalization issues can drastically affect SEO across the whole site.

    How to cope:
    We prefer to solve this problem by adding a 301 redirect from each duplicate version of the page to the correct version. You can also solve the problem with the rel="canonical" tag.
    Another solution is to crawl the site using a tool like Screaming Frog to find the internal links pointing at duplicate pages. You can then go in and edit those pages so they point directly to the correct URL, rather than having internal links pass through a 301 and lose a bit of link equity.
    Tip - You can usually decide whether this is actually a problem by looking at the Google cache of each URL. If Google does not recognise the duplicate URLs as the same page, you will often see different PageRank levels and different cache dates.
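As an illustration, a canonicalization redirect of this kind might look like the following on an Apache server (a sketch assuming mod_rewrite is enabled; the hostnames are placeholders for your own site):

```apache
# Sketch: 301-redirect every non-www request to the www hostname,
# so only one version of each URL is presented to search engines.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```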

  2. Uppercase vs. lowercase URLs

     This problem is most common on sites built with .NET. It stems from the server being configured to respond to URLs containing uppercase characters without redirecting or rewriting them to the lowercase version.
     This problem is less common than it used to be, because search engines have generally become much better at choosing the canonical version and ignoring duplicates. However, I have seen too many examples where search engines do not get this right, which means you should make it explicit rather than relying on them to figure it out for you.

    How to cope:
    There is a URL Rewrite module that can help solve this problem on IIS 7 servers. The tool has a handy option in its interface for enforcing lowercase URLs. If you enable it, a rule is added to the web.config file that solves the problem.
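The rule the module writes to web.config looks roughly like this (a sketch; the rule name is arbitrary):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- 301-redirect any URL containing uppercase characters
           to its lowercase equivalent -->
      <rule name="Enforce lowercase URLs" stopProcessing="true">
        <match url="[A-Z]" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```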

  3. Query parameters appended to the end of URLs

     In this case, the user clicks through URLs that are relatively SEO-friendly, but quite often you can end up with a URL like this:
     www.example.com/product-category?colour=12

    This example filters the product category by a particular colour. Filtering like this is useful for users, but may not be great for search, especially if customers are not searching for that type of product by colour. In that case, the URL is not a great landing page for targeting specific keywords.
    Another issue that tends to eat up tons of crawl budget is when these parameters are combined. To make things worse, sometimes the parameters can be combined in different orders yet return the same content.

    For example:
    • www.example.com/product-category?colour=12&size=5 
    • www.example.com/product-category?size=5&colour=12 
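To see why these count as duplicates, here is a small Python sketch (a hypothetical helper, not part of any particular crawler) that normalizes a URL by sorting its query parameters; both orderings collapse to the same canonical form:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize_query(url):
    """Return the URL with its query parameters sorted alphabetically,
    so parameter order no longer produces distinct duplicate URLs."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    params = sorted(parse_qsl(query, keep_blank_values=True))
    return urlunsplit((scheme, netloc, path, urlencode(params), fragment))

a = canonicalize_query("http://www.example.com/product-category?colour=12&size=5")
b = canonicalize_query("http://www.example.com/product-category?size=5&colour=12")
print(a == b)  # True - both normalize to the same canonical URL
```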

    How to cope:
    The first step is to decide which pages you want Google to crawl and index. This decision should be driven by your keyword research: cross-reference every database attribute with your core target keywords. Let's continue with Go Outdoors as our example:

    Here are our main keywords:
    • Waterproof jackets
    • Hiking boots
    • Ladies running pants

    On an eCommerce website, each of these products will have attributes associated with them in the database. Some common examples include:
    • Size (e.g. Large)
    • Colour (e.g. Black)
    • Price (e.g. £49.99)
    • Brand (e.g. North Face)
    Your task is to find out which of these attributes form part of the keywords people use to find products. You also need to determine which combinations (if any) of these attributes your audience searches for.

    In doing so, you may find there is high search volume for keywords that combine "North Face" + "waterproof jacket." That means you will want a landing page for "North Face waterproof jackets" to be crawlable and indexable. You will also want to make sure the database attributes have SEO-friendly URLs, so rather than "waterproof-jacket/?brand=5" you would choose "waterproof-jackets/north-face/." Finally, make sure the URL is part of your site's navigation structure to ensure a good flow of PageRank, and so that users can find the page with ease.

    If the URLs are not yet indexed, a simple step is to add the URL pattern to your robots.txt file. You may need to play around with some regex to achieve this. Be sure to test with the Fetch as Google feature in Webmaster Tools. It is important to note that if the URLs are already indexed, adding them to robots.txt will NOT get them out of the index.
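For example, to keep crawlers away from the colour and size parameter combinations above, the robots.txt rules might look like this (the patterns are illustrative and use Google's wildcard syntax):

```
User-agent: *
Disallow: /*?*colour=
Disallow: /*?*size=
```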

    Adding the rel="canonical" tag acts as a plaster over this issue, in the hope that you can fix it properly later. Add the rel="canonical" tag to the URLs you do not want indexed, pointing at the most relevant URL that you do want indexed.
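Continuing the example, the parameterised page would carry a tag like this in its head, pointing at the version you want indexed (the URL is illustrative):

```html
<head>
  <link rel="canonical" href="http://www.example.com/product-category" />
</head>
```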

  4. Lack of parseable content

     Many of the websites we audit lack substantial parseable content. Because internal resources are not allocated to writing original content (and because most content has to be reviewed by various departments, from legal to PR), many companies fail to include more than 150 words of meaningful content on a landing page. This results in fewer opportunities to optimize keywords or include links. Ideally, an important landing page should have at least 300-400 words of content. You can calibrate this by checking the pages that rank above yours and seeing how much content they carry.

  5. Soft 404 error pages

     A soft 404 means you cannot spot genuinely broken pages or identify the areas of your site where users are getting a bad experience. From a link-building perspective (I had to mention it somewhere!), it is not a good situation either: someone may have linked to a broken URL, but that link will be hard to track down and redirect to the correct page.
    How to cope:
    Fortunately, this is a relatively simple fix for a developer, who can set the page to return a 404 status code instead of 200. While you're there, you can have fun and build a cool 404 page for your users' enjoyment. To find soft 404s, you can use the dedicated reporting feature in Google Webmaster Tools.
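The developer-side fix amounts to this kind of logic (a minimal, framework-free sketch; the page map is illustrative):

```python
# A handler that returns a real 404 status code for unknown pages
# instead of a "soft" 200 that search engines would index.
KNOWN_PAGES = {"/": "Home", "/products": "Products"}  # illustrative site map

def respond(path):
    """Return (status_code, body) for a requested path."""
    if path in KNOWN_PAGES:
        return 200, KNOWN_PAGES[path]
    # A soft 404 would return (200, "Page not found") here, which search
    # engines would treat as a real page. Return a genuine 404 instead.
    return 404, "Page not found"

print(respond("/products"))      # (200, 'Products')
print(respond("/missing-page"))  # (404, 'Page not found')
```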

  6. 302 redirects instead of 301 redirects

     Once again, this is an easy one for developers to get wrong because, from the user's perspective, there is no visible difference. Search engines, however, treat the two very differently. To recap: a 301 redirect is permanent, and search engines treat it as such, passing link equity across to the new page. A 302 redirect is temporary, and search engines will not pass link equity because they expect the original page to return at some point.
    How to cope:
    To find 302-redirected URLs, I suggest using a crawler like Screaming Frog or the IIS SEO Toolkit. You can then filter for 302s and check whether they should really be 302s, or whether they should be 301s instead.
    To solve the problem, ask your developer to change the rules so that a 301 redirect is used instead of a 302.
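Filtering a crawl for 302s can be as simple as the following Python sketch (the data here is illustrative, standing in for a Screaming Frog export of URLs and status codes):

```python
# Sketch: filter a crawl export of (url, status_code) pairs
# down to the temporary redirects that need reviewing.
crawl = [
    ("http://www.example.com/old-page", 302),
    ("http://www.example.com/about", 200),
    ("http://www.example.com/old-category", 302),
    ("http://www.example.com/moved", 301),
]

temporary_redirects = [url for url, status in crawl if status == 302]
print(temporary_redirects)
```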
That wraps up our discussion of common SEO issues and how to solve them. I hope it helps you all.
