Content duplication is one of the major SEO problems that many websites run into, often with no clear indication of why it is happening even when everything seems fine.
If you are unaware of what we are talking about, it seems you have not submitted your URL to Google Search Console (formerly Google Webmaster Tools). Digital marketing agencies and website design companies use this tool to find and fix your website's errors in Google Search. We advise everyone to know about it, because it will help you see whether your SEO agency is doing a good job for you. Please click below:
How to Submit a URL to Google Search Engines
If you search Google for this issue, the answers will say that somewhere there is some duplication of URLs.
That is true, but there can be causes you have not yet imagined. We run a large website with many pages, complexities, and functions, and it has been producing the same issues you are probably facing, so we put a team on figuring out what was wrong with our SEO.
Even our Search Engine Optimization team, with 5-10 years of experience, could not figure out what the problem was. But now we have grasped the real issue and are in a position to solve it.
We will go through some very simple steps so that you can clean up your website immediately.
Robots.txt (if you want to know more about robots.txt click here)
Try to keep it clean. One reason is that anything blocked by robots.txt leaves a mark in your Search Console report, which honestly annoys us. Our advice is to keep it clean; otherwise, anything blocked by robots.txt will show up something like this.
However, this will not affect your SEO ranking. But when the report is clean, you can feel confident that everything is going well.
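As a minimal illustration of a "clean" file (the sitemap path here is an assumption, using the domain this post uses as an example elsewhere), a robots.txt that blocks nothing and simply declares the sitemap would look like this:

```
# Allow every crawler to access everything; nothing blocked,
# so nothing gets flagged in Search Console
User-agent: *
Disallow:

# Point crawlers at the sitemap
Sitemap: https://webnet.com.pk/sitemap.xml
```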
Try to keep only your website's files in the main directory, and turn any other directory into a subdomain.
Duplicate URL by Google
Duplicate content is content that also appears, identically or nearly so, on other pages. It is a tricky matter, but when search engines crawl many URLs with similar content, Google has recently said it does not matter much. John Mueller of Google said:
"if you had the same content in textual form on your website where it’s clearly duplicate content then what would happen there is we would pick one of those versions to show in Google Search.
It’s not the case that we would say: “oh this website has some duplicate content, we will not show it at all in Google.”
Rather we will say: “There are two versions here. We will pick one of these to show and we will just not show the other one.”
Why doesn't duplicate content affect your ranking? Firstly, because many retailers use the same content, images, and descriptions across their websites. Secondly, if you have duplicate content, Google may simply skip crawling that page, wasting whatever unique content it contains.
Finally, even if your content is ranking well, search engines may pick the wrong URL as the "original". Canonicalization helps you control which version of your duplicate content is treated as the original.
The problem with URLs
You might be thinking "Why would anyone duplicate a page?" and wrongly assume that canonicalization isn’t something you have to worry about. The problem is that we, as humans, tend to think of a page as a concept, such as your homepage. For search engines, though, every unique URL is a separate page.
For example, search crawlers might be able to reach your homepage in all of the following ways:
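For illustration, assuming https://webnet.com.pk (the domain used as an example later in this post) is the site, all of these URLs can serve the same homepage:

```
http://webnet.com.pk
http://www.webnet.com.pk
https://webnet.com.pk
https://www.webnet.com.pk
https://webnet.com.pk/
https://webnet.com.pk/index.html
```

To a human these are all "the homepage"; to a search engine, each one is a separate page.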
This can only be resolved with 301 redirects pointing to the version you have set in Search Console, for example https://webnet.com.pk. All your URL variants should redirect to that one.
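As a sketch, assuming an Apache server with mod_rewrite enabled (your server and preferred host may differ), an .htaccess rule like this sends every non-HTTPS or www variant to the canonical https://webnet.com.pk with a 301:

```apache
# Redirect any http:// or www. variant to the canonical host with a 301
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://webnet.com.pk/$1 [R=301,L]
```

On nginx the same effect is usually achieved with a separate `server` block that returns a 301 to the canonical host.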
This problem actually causes a lot of indexing issues, and many companies need a developer to fix it. This is why SEO requires a team; it is not a one-man job (a freelancer doing SEO alone).
The other way is canonicalization: canonical tags are required on all your pages. Of course, this is time-consuming, but it is manageable if you have few pages. If you have many pages, it is better to resolve the problem with 301 redirects.
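A canonical tag is a single line in the page's head. For example, assuming https://webnet.com.pk/services is a page you want indexed (a hypothetical path, for illustration), that page and every duplicate of it would carry:

```html
<head>
  <!-- Tells search engines which URL is the preferred version of this page -->
  <link rel="canonical" href="https://webnet.com.pk/services" />
</head>
```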
If you have 301 redirects in place but are still facing the same error: "Duplicate, submitted URL not selected as canonical".
The problem is that even once all of this is resolved, Google somehow keeps giving the same error. Initially the number of affected pages is usually small, but as Google keeps crawling, the number increases. Are you still facing the same problem?
No? Wow, you have RESOLVED it. Still, to understand and avoid the problems others face even after fixing everything, I would recommend reading on.
If your answer is YES, the solution is just one minute away.
The tricky part of a canonical tag is pointing it at the URL you actually want Google to crawl. A lot of people do not know this, but it is very important, so we will say it again: the canonical URL must be the exact current URL you want Google to crawl, the same one that is in your sitemap. Then give Google some time; depending on when its bots crawl the page, the error will eventually resolve. If you can see that Google has crawled your URL and the problem is still not fixed, make sure that the very same URL is in your sitemap.
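That exact-match rule can be checked mechanically. Here is a minimal Python sketch (the regexes, sample markup, and URLs are illustrative assumptions, not code from this post) that compares a page's canonical tag against the sitemap's loc entries:

```python
import re

def canonical_url(html):
    """Extract the href of a <link rel="canonical"> tag from page HTML."""
    match = re.search(r'<link[^>]*rel="canonical"[^>]*href="([^"]+)"', html)
    return match.group(1) if match else None

def sitemap_urls(xml):
    """Collect every <loc> entry from sitemap XML."""
    return set(re.findall(r"<loc>(.*?)</loc>", xml))

# Hypothetical sample data for illustration
page = '<html><head><link rel="canonical" href="https://webnet.com.pk/"></head></html>'
sitemap = "<urlset><url><loc>https://webnet.com.pk/</loc></url></urlset>"

# The canonical URL must appear, character for character, in the sitemap
print(canonical_url(page) in sitemap_urls(sitemap))  # True
```

In practice a real HTML/XML parser is more robust than regexes; this only illustrates the rule that the canonical URL and the sitemap URL must match exactly, down to the scheme, host, and trailing slash.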
Fixing your sitemap is very important, and it should be perfect. I hope you will have no problems now, and if you do, you can ask us for help.
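For reference, a sitemap entry is just the canonical URL wrapped in a url element. A minimal sitemap.xml, with the homepage as its only entry for illustration, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Each <loc> must be the exact canonical URL of the page -->
    <loc>https://webnet.com.pk/</loc>
  </url>
</urlset>
```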