The “Blocked Due to Unauthorized Request (401)” error in Google Search Console (GSC) is often confusing, largely because it can stem from several different causes and appear in more than one context.
While we could write separate guides for each scenario, this article focuses specifically on identifying and fixing the 401 error within your own website as seen in your GSC dashboard.
We’ll explain what this error means, how to identify affected pages, and the steps needed to resolve it effectively.
We may explore how to troubleshoot this error when accessing someone else’s site in a future article. But for now, let’s concentrate on optimizing your site’s performance in Google Search Console.
What Is the “Blocked Due to Unauthorized Request (401)” Error?
When you encounter the “Blocked Due to Unauthorized Request (401)” error in GSC, it means that Googlebot attempted to crawl certain pages on your site but was denied access because the request lacked the required authentication.
In other words, Googlebot reached a page it was supposed to crawl (based on your sitemap or internal linking structure) but received a 401 status code, the standard HTTP response indicating that a request is missing valid authentication credentials. This prevents the bot from crawling, indexing, or ranking that content, and GSC logs it as a blockage.
Unless a page is explicitly excluded through a robots.txt rule or a noindex meta tag, Google assumes it has permission to access it. When the server instead demands authentication and returns a 401, Google records the page under this error and stops trying.
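To make the mechanics concrete, here is a minimal sketch (Python’s standard library only, purely for illustration; the port and realm name are arbitrary) of a server that answers 401 whenever a request arrives without credentials. This is essentially the response Googlebot receives from a password-protected page:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class ProtectedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Googlebot never sends credentials, so it always hits this branch.
        if self.headers.get("Authorization") is None:
            self.send_response(401)  # "Unauthorized": authentication required
            self.send_header("WWW-Authenticate", 'Basic realm="Members only"')
            self.end_headers()
            self.wfile.write(b"401 Unauthorized")
            return
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Members-only content")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ProtectedHandler).serve_forever()
```

The important detail is the WWW-Authenticate header: it tells the client that credentials are required, and Googlebot simply stops there rather than attempting to log in.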
Where to Find 401 Error Pages in Google Search Console
To identify which pages are triggering this error:
- Log into your Google Search Console account.
- In the left-hand navigation panel, select “Indexing” > “Pages.”
- You’ll be presented with a report that includes various indexing issues. Scroll down to view the different error categories.
- Look for the issue labeled “Blocked due to unauthorized request (401)” and click on it.
- GSC will list all URLs impacted by this specific error.
You can then export the list or examine each URL individually to begin the troubleshooting process.
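If you export the list, a short script can re-check each URL and confirm which ones still return a 401 before you start digging. A rough sketch using the third-party requests library, assuming you have saved the exported URLs as a plain text file with one URL per line (the filename is hypothetical):

```python
import requests

# Hypothetical filename: adjust to wherever you saved the GSC export.
with open("gsc_401_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        response = requests.get(url, timeout=10, allow_redirects=True)
        marker = "STILL 401" if response.status_code == 401 else "OK?"
        print(f"{response.status_code:>3} {marker:<9} {url}")
    except requests.RequestException as exc:
        print(f"ERR {url} ({exc})")
```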
Common Causes of the 401 Error in GSC
Several issues may cause Googlebot to encounter a 401 error when trying to access your site:
1. Authentication-Required Pages
Some sections of your site might be protected by a login, either intentionally or accidentally. Googlebot never logs in or submits credentials, so it cannot crawl these pages.
2. Incorrect Website Configuration
CMS settings (like WordPress privacy options), server misconfigurations, or restrictive .htaccess rules can block access unintentionally.
3. Firewall or CDN Restrictions
Security systems, CDNs (like Cloudflare), or web application firewalls (WAFs) can misinterpret Googlebot’s activity as a threat and block it. These tools may need to be configured to allow known crawlers.
4. robots.txt or Meta Directives
While robots.txt rules normally block crawling rather than return a 401, misconfigured rules can still send mixed signals. Similarly, combining noindex directives, authentication, and blocking rules on the same pages can produce unintended crawl restrictions.
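To rule out mixed signals from robots.txt, a quick check with Python’s standard-library robot parser can confirm whether Googlebot is even allowed to request the URL in the first place (the domain and path below are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Placeholders: substitute your own site and one of the affected URLs.
robots_url = "https://www.example.com/robots.txt"
page_url = "https://www.example.com/members/pricing"

parser = RobotFileParser(robots_url)
parser.read()  # fetches and parses the live robots.txt

if parser.can_fetch("Googlebot", page_url):
    print("robots.txt allows Googlebot; the 401 is coming from the server itself.")
else:
    print("robots.txt disallows this URL; fix that rule before investigating the 401.")
```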
5. Problematic Plugins or Extensions
Website security plugins may add an authentication layer to pages unintentionally. If you’ve recently added or updated a plugin, especially those related to security or membership, it’s worth reviewing their settings.
How to Fix the “Blocked Due to Unauthorized Request (401)” Error
Thankfully, the 401 error can usually be resolved with a structured approach. Start with basic diagnostics and work toward more technical checks:
1. Manually Check the URL
Try accessing the affected URL directly in your browser (e.g., Chrome or Firefox). See if a login prompt appears or if the content is restricted. If all users are required to authenticate, Googlebot will be blocked as well.
If this authentication is unnecessary, consider removing it or adjusting access permissions for that page.
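The same check can be scripted. A rough sketch with the requests library (the URL is a placeholder), comparing a default request with one sent under one of Googlebot’s published user agent strings and flagging any WWW-Authenticate header the server returns:

```python
import requests

url = "https://www.example.com/protected-page"  # placeholder: use an affected URL

header_sets = {
    "default request": {},
    "Googlebot user agent": {
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    },
}

for label, headers in header_sets.items():
    response = requests.get(url, headers=headers, timeout=10)
    print(f"{label}: HTTP {response.status_code}")
    # A WWW-Authenticate header confirms the server is demanding credentials.
    if "WWW-Authenticate" in response.headers:
        print("  Server requests authentication:", response.headers["WWW-Authenticate"])
```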
2. Examine Server Logs
Review your server logs to identify how often the 401 error occurs and for which URLs. Logs can provide timestamps, request headers, and IP addresses, which are helpful for distinguishing between user and bot activity.
Frequent 401s on the same page often point to systemic access issues that require attention.
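If your server writes standard access logs, a short script can tally 401 responses by URL and by user agent. This is a rough sketch that assumes the common Apache/Nginx “combined” log format and a hypothetical log path; adjust both for your setup:

```python
import re
from collections import Counter

# Assumed path and "combined" log format; adjust for your server.
LOG_PATH = "/var/log/nginx/access.log"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

url_counts, agent_counts = Counter(), Counter()

with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and match.group("status") == "401":
            url_counts[match.group("path")] += 1
            agent_counts[match.group("agent")] += 1

print("Top URLs returning 401:")
for path, count in url_counts.most_common(10):
    print(f"  {count:>5}  {path}")

print("\nTop user agents receiving 401:")
for agent, count in agent_counts.most_common(5):
    print(f"  {count:>5}  {agent}")
```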
3. Audit Website Configuration
Check your .htaccess, server permissions, and CMS settings. Make sure no unintended access restrictions are applied to public-facing pages.
This step is especially important if you’ve made recent changes to website themes, page settings, or user roles.
4. Use the URL Inspection Tool in GSC
Google Search Console’s URL Inspection Tool allows you to test a specific page to see how Googlebot renders it.
- Paste the affected URL into the search bar at the top of the GSC dashboard.
- Click “Test Live URL.”
- Check for crawlability and any HTTP status codes returned.
If the tool still detects a 401 error, then the issue persists and needs deeper investigation.
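If you have many URLs to re-test, the same live inspection is also exposed through the Search Console URL Inspection API. The sketch below is one possible approach, assuming the google-api-python-client package and a service account that has been added as a user on the verified property; the key file, property URL, and page URL are placeholders:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholders: your service-account key file and verified GSC property.
KEY_FILE = "service-account.json"
SITE_URL = "https://www.example.com/"
PAGE_URL = "https://www.example.com/protected-page"

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters"]
)
service = build("searchconsole", "v1", credentials=credentials)

# urlInspection.index.inspect returns Google's view of the page,
# including whether the last crawl attempt was blocked.
result = service.urlInspection().index().inspect(
    body={"inspectionUrl": PAGE_URL, "siteUrl": SITE_URL}
).execute()

index_status = result.get("inspectionResult", {}).get("indexStatusResult", {})
print("Verdict:    ", index_status.get("verdict"))
print("Page fetch: ", index_status.get("pageFetchState"))
print("Crawled as: ", index_status.get("crawledAs"))
```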
5. Review Firewall/CDN Settings
Ensure your firewall or CDN (e.g., Cloudflare, Akamai) isn’t blocking Googlebot. These systems often have bot protection settings that can be overly aggressive.
Add Googlebot’s IP ranges to your allowlist or configure your firewall to recognize and permit its user agent string.
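Because the user agent string alone can be spoofed, it helps to verify suspect requests before changing allowlist rules. Google’s documented verification is a reverse DNS lookup followed by a confirming forward lookup; here is a minimal sketch using Python’s standard library (the sample IP is only an example, so substitute addresses from your own logs):

```python
import socket

def is_googlebot(ip_address: str) -> bool:
    """Verify an IP via reverse DNS, then confirm with a forward lookup."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)  # reverse lookup
    except OSError:
        return False
    # Genuine Googlebot hosts resolve under googlebot.com or google.com.
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # The forward lookup must map the hostname back to the same IP.
        _, _, forward_ips = socket.gethostbyname_ex(hostname)
    except OSError:
        return False
    return ip_address in forward_ips

# Example IP taken from a log line; replace with addresses you want to check.
print(is_googlebot("66.249.66.1"))
```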
6. Check Plugins and Extensions
Deactivate any recently installed or updated plugins (particularly security, membership, or access control plugins) and see if the issue resolves.
If the 401 errors disappear after deactivation, explore the plugin settings or consult the plugin developer for a fix.
7. Update All Systems and Software
Keeping your CMS, plugins, theme files, and server software up to date ensures compatibility and reduces the risk of unintended access issues.
An outdated plugin or misaligned version of PHP, for example, could cause conflicts that restrict access to pages.
8. Seek Professional Support (If Needed)
If you’ve tried all of the above and are still encountering 401 errors, it may be time to consult a web developer or server administrator.
An experienced professional can perform deeper diagnostics, adjust server configurations, and ensure that best practices are followed.
Conclusion: Keep Googlebot Crawling Freely
Understanding and addressing the “Blocked Due to Unauthorized Request (401)” error in Google Search Console is critical for maintaining your website’s visibility in search results. Whether the issue is due to authentication, misconfigurations, or overly strict firewalls, the key is identifying and resolving access blocks swiftly.
The earlier you fix these errors, the sooner Googlebot can resume crawling and indexing your content, ensuring it reaches your intended audience.
Related Articles:
How to Fix “Crawled – Currently Not Indexed” in Google Search Console