How does this benefit me in practice?

The theory behind log files is all well and good, but the real question is: what conclusions and measures can you derive from the large amount of data? Hopefully the following two examples make the topic a little more tangible.

Example 1: Unwanted parameters

A log file analysis for one of our customers showed us that Google crawled an enormous number of URLs with parameters. As a simple measure, we specified in the parameter handling that Google should not crawl these URLs:

[Screenshot: parameter handling in Google Search Console]

We could see the result in the log files: a year later, Google only crawled a handful of the unwanted parameter URLs.

[Screenshot: crawled parameter URLs]

Example 2: Incorrect Canonicals

After a ranking crash in two countries, we were unable to find any errors in our crawls or on the important landing pages for another customer. So we looked at the log files.
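To make that step a little more concrete: a quick way to see what Googlebot actually requests is to filter the raw access log for its user agent and count the URLs that carry a query string. The following is a minimal sketch under those assumptions; the file name access.log, the combined log format, and the simple user-agent check are illustrative placeholders, not the exact setup from this case.

```python
import re
from collections import Counter

# Minimal sketch: count the parameter URLs that Googlebot requested,
# based on an access log in the common "combined" format.
LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]+".*"(?P<agent>[^"]*)"$')

googlebot_requests = 0
parameter_requests = 0
parameter_names = Counter()

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LOG_LINE.search(line)
        if not match or "Googlebot" not in match.group("agent"):
            continue  # only interested in Google's crawler here
        googlebot_requests += 1
        path = match.group("path")
        if "?" in path:
            parameter_requests += 1
            # Count hits per parameter name, e.g. "?sort=price" -> "sort"
            for pair in path.split("?", 1)[1].split("&"):
                parameter_names[pair.split("=", 1)[0]] += 1

print(f"Googlebot requests: {googlebot_requests}")
print(f"Requests for parameter URLs: {parameter_requests}")
print("Most common parameters:")
for name, hits in parameter_names.most_common(10):
    print(f"  {name}: {hits}")
```

For a real analysis you would also want to verify that the requests actually come from Google (for example via a reverse DNS lookup), since the user agent alone can be spoofed.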
There we saw that the majority of crawled URLs contained a parameter:

[Screenshot: parameter URLs crawled by Googlebot]

1563 of 168 crawled URLs contained a parameter, even though none should actually exist. All of these parameter URLs were unwanted pages. It turned out that the canonicals of these pages pointed to themselves, even though they were duplicate content and should have pointed to the page without parameters. Because the pages Google crawled were not linked anywhere, no SEO tools could find the incorrect, duplicate pages. As a consequence, we removed the parameters from the canonicals, and the rankings were back:

[Screenshot: corrected canonicals]
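Because the affected pages were not linked anywhere, a regular crawl starting from the homepage never reached them; only the URLs from the log file revealed them. A simple way to test such URLs is to fetch each one and compare its canonical tag with the parameter-free version of the URL. The sketch below illustrates that idea; the example URLs, the user agent, and the naive regex for the canonical tag are assumptions, not the tooling actually used in this case.

```python
import re
import urllib.request
from urllib.parse import urlsplit, urlunsplit

# Naive regex: assumes rel="canonical" appears before href in the link tag.
CANONICAL = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def strip_params(url: str) -> str:
    """Return the URL without query string and fragment."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

def check_canonical(url: str) -> None:
    request = urllib.request.Request(url, headers={"User-Agent": "canonical-check"})
    with urllib.request.urlopen(request, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    match = CANONICAL.search(html)
    canonical = match.group(1) if match else "(no canonical found)"
    expected = strip_params(url)
    verdict = "OK" if canonical == expected else "WRONG"
    print(f"{verdict}  {url}")
    print(f"       canonical: {canonical}")
    print(f"       expected:  {expected}")

# Parameter URLs taken from the log file (placeholder examples).
for parameter_url in [
    "https://www.example.com/category/?sort=price",
    "https://www.example.com/category/?page=2",
]:
    check_canonical(parameter_url)
```

A self-referencing canonical on a parameter URL would show up in such a check as WRONG, which is exactly the pattern behind the ranking drop in this example.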
How can I derive measures from log files?

In the beginning, many people find themselves at a loss because of the mass of data.