Hello friends, this is a very interesting and important article. Google Search Console is the headquarters of your website.
Google Search Console tells you how your website performs in the search engine. You can see a lot of important information there. No other tool can show you this information the way Google Search Console does.
Today I am going to show you some basic uses of Google Search Console. Before reading this article, you have to add your website there.
Without adding the website, you cannot enter Google Search Console.
Let's start the tactics one by one:
Add a Sitemap to Google Search Console
This is a one-time process. The sitemap is a very important part of a website; every website should have one, and no website should be without it. We need to generate the sitemap. A sitemap is a simple XML file that lists the URLs of your site.
You can easily add or remove a sitemap using Google Search Console. The sitemap is the source the crawler uses to crawl your website easily, because all the links of your website are in it.
It gives the search engine crawler the correct path to follow.
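For reference, a minimal sitemap looks like the sketch below. The URLs and dates are made-up placeholders; your sitemap generator will fill in your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page of your website -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/first-post.html</loc>
  </url>
</urlset>
```

The crawler reads each `<loc>` entry to discover your pages, which is why every link of your site should be in this file.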
Now go to your Google Search Console account. I will show you the correct way to add the sitemap to Google Search Console. By following these steps, you can easily add your sitemap to Search Console.
First, open your Google Search Console. Now click on the website where you want to add the sitemap. Choose the correct website.
After that, you can see different options on the left side. Among them is the drop-down option Crawl. Simply click on it, and a new list will open.
In this new list, you get a separate option for the sitemap. Add your sitemap there.
The sitemap for Blogger is different from other websites. If you are using Blogger, I have written a separate article for it.
Checking the robots.txt file
Friends, this is also a very important part of a website. It tells crawlers whether they may crawl your website. As you know, there are many types of crawlers, such as Google's crawlers, Ninja Robot, etc. If you want to disallow any of them from crawling your website, you can do it here.
The robots.txt file sits on your website's server. I have written a detailed article on robots.txt for Blogger.
There I have given complete guidance on setting up the robots.txt file for Blogger. The crawler gets its crawling signals through the robots.txt file, and the sitemap of your website can also be listed in this file. Therefore, it is not good to ignore it.
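As a reference, a typical robots.txt file looks like this sketch. The path and sitemap URL below are placeholders for illustration only:

```
# Rules for all crawlers
User-agent: *
Disallow: /search
Allow: /

# Tell crawlers where your sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` group gives rules to one crawler (or `*` for all), and the `Sitemap` line is how the sitemap gets listed in this file.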
Search Console tells you about your robots.txt file. It shows whether your robots.txt is working properly, how crawlers react to it, and whether there are any errors in it. In short, you can find your errors using Google Search Console.
This way, you can increase your website's visibility in the search engine.
After adding the robots.txt file to your server, you can check its status in Google Search Console.
How to check the status of robots.txt in Google Search Console?
First, click on the drop-down button Crawl. After that, you can see the robots.txt Tester option. Simply click on it, and you can see your robots.txt file.
You can check these types of bots with this tool: Googlebot, Googlebot-News, Googlebot-Image, Googlebot-Video, Googlebot-Mobile, Mediapartners-Google, and AdsBot-Google.
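To see what the tester is doing behind the scenes, here is a small sketch using only Python's standard library. It answers the same question as the tool: may a given bot fetch a given URL? The rules and URLs are made-up examples:

```python
# Sketch of a robots.txt check: which bot may fetch which URL?
# The rules and URLs below are made-up examples for illustration.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: Googlebot-Image
Disallow: /private/

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under the "*" group, so it may crawl everything...
print(parser.can_fetch("Googlebot", "https://www.example.com/private/photo.jpg"))
# ...but Googlebot-Image is blocked from /private/.
print(parser.can_fetch("Googlebot-Image", "https://www.example.com/private/photo.jpg"))
```

This is how per-bot rules work: each bot is matched to its own `User-agent` group first, and only falls back to `*` if no specific group names it.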
Fetch as Google
This is also one of the most important parts of Google Search Console. If you do this process for your website, it increases your website's visibility.
You can do it in your Google Search Console account by following these steps.
First of all, go to your website in Google Search Console. Now click on the Crawl button. After that, you can see the options. Now click on the Fetch as Google button.
Congrats, you have entered successfully.
Now you need to add every link of your site here, one at a time. You get two options for adding links: the first is for desktop and the second is for mobile devices.
Copy your links one by one and paste each into the empty space as shown in the image. You can click either the Fetch or the Fetch and Render button.
This is your choice, but I recommend Fetch and Render because it also shows you how Googlebot renders the page, which is more useful for you.
After submitting a link, you will see a new option named Request Indexing. Click on it. A window will open with two choices; choose the second one.
Crawl this URL and its direct links is the correct option for you, because you have done internal linking in every article. With this option, the crawler gets permission to crawl those linked pages again.
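Since Fetch as Google takes one link at a time, it helps to have the full list of your URLs ready before you start. A small sketch (Python standard library only, with a made-up sitemap) that pulls every URL out of a sitemap file:

```python
# Sketch: extract every <loc> URL from a sitemap, so you can paste
# them into Fetch as Google one by one. The sitemap content below
# is a made-up example for illustration.
import xml.etree.ElementTree as ET

SITEMAP_XML = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/first-post.html</loc></url>
</urlset>
"""

# Sitemaps use the sitemaps.org XML namespace, so map a prefix to it.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP_XML)

# Collect the text of every <loc> element: one URL per page.
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
for url in urls:
    print(url)
```

In practice you would read your real sitemap file instead of the inline string, then work through the printed list one link at a time.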