Fixing Crawlability Warnings on Squarespace
TL;DR:
- Crawlability warnings mean search engines can't access parts of your site
- Save all edits in Squarespace and refresh your page before rescanning
- Fix broken links and server issues straight away
- Connect to Google Search Console for better search visibility
- Use both SEOSpace and Google Search Console to double-check your fixes
When SEOSpace flags crawlability warnings, it means search engines like Google are having trouble accessing parts of your website. This usually happens because of broken links, server hiccups, or settings that need adjusting.
The good news is these issues are normally straightforward to fix.
Check and Save Your Edits First
Start by making sure all your changes in the Squarespace editor are properly saved. This sounds basic, but it's the most common reason why SEOSpace still shows errors after you think you've fixed them.
After saving, do a hard refresh of your browser (Ctrl+Shift+R on Windows, Cmd+Shift+R on Mac in most browsers). A normal refresh can serve up a cached version of the page, hiding your recent updates from SEOSpace scans.
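If you want to be certain the published page reflects your edit (and not a cached copy), you can fetch it directly, outside the browser entirely. Here's a minimal Python sketch; the URL and the text to look for are placeholders you'd swap for your own:

```python
import requests

# Fetch the live page fresh, outside the browser and its cache.
# The URL and search text are placeholders - use your own page and edit.
url = "https://www.example.com/updated-page"
html = requests.get(url, timeout=10).text

print("Edit is live" if "Your new heading" in html else "Edit not visible yet")
```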
Fix Broken Links
Check any links that SEOSpace has flagged as broken. Sometimes these show up because of temporary server problems, but other times they're genuinely broken and need fixing.
Go through each flagged link and test it manually. If it's broken, either fix the URL or remove the link entirely. Broken links don't just hurt your SEO – they frustrate visitors too.
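If there are more than a handful, checking them by hand gets tedious. Here's a rough Python sketch that batch-checks flagged URLs; the list below is made up, so paste in the links SEOSpace actually reported:

```python
import requests

# Placeholder list - replace with the URLs SEOSpace flagged
flagged_links = [
    "https://www.example.com/about",
    "https://www.example.com/old-blog-post",
]

for url in flagged_links:
    try:
        # HEAD is lighter than GET, but some servers reject it,
        # so fall back to a full GET on any error status
        resp = requests.head(url, allow_redirects=True, timeout=10)
        if resp.status_code >= 400:
            resp = requests.get(url, allow_redirects=True, timeout=10)
        print(f"{resp.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"FAILED  {url}  ({exc})")
```

Anything returning 404 or a 5xx code deserves a closer look; a one-off timeout may just be the kind of temporary server problem SEOSpace sometimes flags.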
Connect to Google Search Console
Make sure your Squarespace site is properly connected to Google Search Console. This connection helps with crawlability issues and gives you direct insight into how Google sees your site.
Google Search Console will show you exactly which pages Google can't crawl and why. It's more detailed than SEOSpace and comes straight from Google itself.
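If you verify ownership with Google's HTML tag method, Search Console hands you a snippet like the one below (the token here is a placeholder) to paste into your site's header code injection in Squarespace:

```html
<meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />
```

Squarespace also offers a built-in Search Console connection, which is usually the simpler route if it's available on your plan.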
Common Crawlability Problems
Pages Not Found (404 errors): These usually happen when you've deleted or moved pages without setting up redirects. Fix them by either restoring the page or creating a 301 redirect to the new location.
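In Squarespace, redirects live in the URL Mappings panel (under Settings). Each line maps an old path to a new one with a 301 (permanent) redirect; the paths here are hypothetical:

```
/old-page -> /new-page 301
/blog/2023/retired-post -> /blog/current-post 301
```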
Server Errors (5xx errors): These indicate problems on the server side, which Squarespace hosts and manages for you. Contact Squarespace support if they show up regularly.
Blocked by robots.txt: Check your site's robots.txt file to make sure important pages aren't accidentally blocked from crawling. Squarespace generates this file automatically and only blocks system pages by default, so genuine blocks are rare, but they're worth ruling out.
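You can view the file directly at yourdomain.com/robots.txt, or test a specific page with Python's built-in robots.txt parser. A minimal sketch, with placeholder URLs:

```python
from urllib import robotparser

# Point the parser at your site's robots.txt (placeholder domain)
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Ask whether Googlebot is allowed to crawl a given page
page = "https://www.example.com/blog/my-post"
print(f"Googlebot allowed: {rp.can_fetch('Googlebot', page)}")
```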
Slow Loading Pages: Pages that take too long to load can time out during crawling. Optimize images and remove unnecessary elements to speed things up.
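For a rough sense of how quickly your server responds, you can time a request. This only measures time to the first response, not a full page render, so treat it as a first check rather than a proper speed test (the URL is a placeholder):

```python
import requests

# elapsed measures time from sending the request to receiving the
# response headers - it does not include image or script loading.
resp = requests.get("https://www.example.com/slow-page", timeout=30)
print(f"{resp.status_code} in {resp.elapsed.total_seconds():.2f}s")
```

For full page-speed detail, Google's PageSpeed Insights is the better tool.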
Testing Your Fixes
Once you've made changes, wait a few minutes, then run another SEOSpace scan. If your fixes are in place, the crawlability issues should clear up quickly.
For more thorough testing, use Google Search Console's URL Inspection tool. This shows you exactly how Google sees any specific page on your site.
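As a quick supplement, you can also request a page with a Googlebot user-agent string and check the status code it gets back. Real Googlebot also renders JavaScript, so this is only a rough spot-check (the URL is a placeholder):

```python
import requests

# Googlebot's published user-agent string; the server may treat it
# differently from a normal browser request.
headers = {
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"
}
resp = requests.get("https://www.example.com/page-to-test",
                    headers=headers, timeout=10)
print(resp.status_code)
```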
FAQ
Q: How do I make sure SEOSpace recognizes my fixes?
A: Save all changes in Squarespace, refresh your browser completely, then run a new SEOSpace scan. Give it a few minutes between making changes and rescanning.
Q: What should I do about broken links?
A: Test each flagged link manually. If it's broken, either fix the URL or remove the link. Don't leave broken links on your site as they hurt both SEO and user experience.
Q: Why should I connect to Google Search Console?
A: Google Search Console gives you direct information from Google about crawling issues. It's more detailed than third-party tools and helps you understand exactly what Google can and can't access on your site.
Q: How long does it take for fixes to show up?
A: SEOSpace should recognize most fixes within minutes. Google Search Console can take longer to update, sometimes a few days for major changes.
Jargon Buster
SEOSpace: An SEO tool that scans Squarespace sites for optimization problems and crawlability issues.
Crawlability: How easily search engines can access and read your website's pages. Good crawlability means search engines can find and index your content.
Google Search Console: Google's free tool that shows you how your site performs in Google search and flags any technical issues.
404 Error: A "page not found" error that happens when someone tries to visit a page that doesn't exist.
robots.txt: A file that tells search engines which parts of your website they should or shouldn't crawl.
Wrap-up
Crawlability warnings might sound technical, but they're usually quick fixes once you know what to look for. The key is being systematic – save your changes, refresh your browser, fix any broken links, and double-check everything with both SEOSpace and Google Search Console.
Regular monitoring helps you catch these issues before they become bigger problems. Set a reminder to check your crawlability status monthly, especially after making significant changes to your site.
Ready to dive deeper into Squarespace SEO? Join Pixelhaze Academy for comprehensive training and ongoing support.