Managing a growing website is no small task. I often found myself juggling between multiple tools—Google Search Console, WordPress plugins, and even third-party services—to keep track of my website’s health. Whether it was identifying broken links, tracking redirects, or analyzing server-side metrics like HTTP status codes and load times, I was constantly switching between tabs. Not only was this time-consuming, but it also introduced a significant risk of missing crucial insights, especially when managing over 400 URLs.
This scattered approach made me realize the need for a centralized system. I wanted a single platform where I could monitor all my URLs, analyze their server-side performance, and quickly spot potential issues—all without the hassle of opening multiple tools. That’s when I decided to harness the power of Google Sheets and Apps Script to build an automated URL monitoring system.
By creating custom functions in Google Apps Script, I was able to bring everything into one place. From fetching HTTP status codes to tracking server response times and even identifying redirect chains and canonical URLs, my Google Sheet transformed into a comprehensive monitoring dashboard. What once felt like a tedious process is now automated, efficient, and easy to manage.
In this blog, I’ll explain exactly how I built this solution. Whether you manage a small blog or a large website with hundreds of URLs, this guide will walk you through automating URL monitoring using Google Sheets and Apps Script. Let’s dive in and simplify your workflow.
Benefits of Automating URL Monitoring
Automating URL monitoring offers significant advantages, especially when you’re managing multiple pages or a growing website. Below are the key benefits that I experienced after centralizing and automating my URL tracking process:
1. Time Savings
Before automation, I had to manually check URL health across tools like Google Search Console and WordPress plugins. This process consumed hours, especially with a large list of URLs. Automating these checks now saves me that time and effort by providing real-time results in a single Google Sheet.
2. Easy Management
Instead of switching between different platforms, I now have all the critical server-side metrics—like HTTP status, load time, redirects, and metadata—in one place. This centralization allows me to identify and resolve issues quickly without losing track of what’s already been analyzed.
3. Proactive Problem Detection
Automating URL monitoring helps spot issues like broken links (404 errors), redirect loops, or missing metadata before they negatively impact user experience or SEO rankings. This proactive approach allows me to fix problems before they escalate.
4. Scalability
Whether you’re managing 10 URLs or 1,000, an automated system scales effortlessly. Manual monitoring becomes impractical with a large number of URLs, but automation allows me to handle more without any added effort.
5. Customizability
The system I built isn’t just limited to HTTP status codes. With custom Apps Script functions, I can monitor additional metrics like robots.txt content, meta robots tags, and even hreflang configurations for multilingual websites. This flexibility means I can tailor the solution to meet my specific needs.
6. It’s Free
Instead of relying on expensive third-party tools for monitoring, this solution leverages Google Sheets and Apps Script—both of which are free to use. It’s an affordable way to maintain professional-grade website monitoring.
By automating URL monitoring, I not only streamlined my workflow but also ensured that my website remains in top shape. If you’ve ever felt overwhelmed by managing URLs manually, automation can be a game-changer, allowing you to focus on improving your website rather than maintaining it.
Prerequisites
Well, it’s not rocket science, but just in case you’re doing this kind of thing for the first time, here’s what you’ll need to perform this task:
- Google Account: Obviously, you need a Google account.🌚 You’ll need access to Google Sheets and Apps Script, both of which are part of Google Workspace.
- Basic JavaScript Knowledge: While I’ll provide ready-to-use code, understanding JavaScript basics will help you customize the scripts if needed.
- List of the URLs: Create or prepare a Google Sheet with the URLs you want to monitor in the first column. This will act as the input for your script.
- Patience for Testing: The process can take a little time if you have a large number of URLs, so patience is a must.
That’s all you need to get started. In the next section, I’ll guide you through setting up your Google Sheet and Apps Script for URL monitoring.
URL Monitoring Using Google Sheets & Apps Script
So, now that we’ve covered the benefits, prerequisites, and everything else, it’s time to dive into the actual process of automating URL monitoring. In this section, I’ve written out the steps I followed to set up my Google Sheet and write the Apps Script code.
I hope by the end of this section, you’ll have a fully functional system to monitor HTTP status codes, load times, redirect chains, and more—all from a single spreadsheet.
Let’s begin with setting up Google Sheets.
Setting Up Google Sheets
Create a New Google Sheet:
- Open Google Sheets and create a blank sheet.
- Rename it to something meaningful like “URL Monitoring Dashboard.”
Add Column Headers:
- In the first row, create headers such as: URL, HTTP Status, Load Time, Redirect Chain, Content Type, Canonical URL, Meta Robots, Hreflang Tags.
- These headers will organize the metrics you’ll track.

Input URLs:
- In the first column (under the URL header), list all the URLs you want to monitor.

Format Columns:
- Select the entire URL column, and go to Format → Number → Plain Text to ensure that URLs are interpreted correctly.
- Similarly, format other columns as needed (e.g., plain text for most results).
Once your sheet is ready, it will serve as the input and output workspace for your automated monitoring system. In the next step, we’ll move on to writing the Apps Script code that powers the monitoring process.
Writing the Apps Script
With your Google Sheet set up, now we’re going to write the Apps Script code that will automate the monitoring process. Google Apps Script allows you to create custom functions and automate tasks directly from your spreadsheet.
Here’s how you can get started:
1. Open the Apps Script Editor
- In your Google Sheet, go to Extensions → Apps Script.
- This will open a new tab with the Apps Script editor where you can write and manage your code.
2. Understand the Key Functions
In this code, I’ve written custom functions to monitor various metrics:
- HTTP Status Code: Fetches the HTTP response status (e.g., 200, 404) for a URL.
- Load Time: Measures the server response time for a given URL.
- Redirect Chain: Tracks the sequence of URLs in a redirect chain.
- Content Type: Retrieves the type of content served by the URL (e.g., text/html).
- Canonical URL: Extracts the canonical URL specified in the page’s HTML.
- Meta Robots Tag: Fetches the value of the meta robots tag, if present.
- Hreflang Tags: Lists hreflang tags for multilingual pages.
3. Paste the Code
Copy this Apps Script code that I’ve prepared:
function getHttpStatusCode(url) {
if (url === "") return "No URL provided";
try {
var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
return response.getResponseCode();
} catch (e) {
return "Error: " + e.message;
}
}
function getLoadTime(url) {
if (url === "") return "No URL provided";
try {
var startTime = new Date().getTime();
UrlFetchApp.fetch(url, { muteHttpExceptions: true });
var endTime = new Date().getTime();
return ((endTime - startTime) / 1000).toFixed(2) + " seconds";
} catch (e) {
return "Error: " + e.message;
}
}
function getRedirectChain(url, maxRedirects = 10) {
if (url === "") return "No URL provided";
var chain = [];
var currentUrl = url;
for (var i = 0; i < maxRedirects; i++) {
try {
var response = UrlFetchApp.fetch(currentUrl, {
muteHttpExceptions: true,
followRedirects: false,
});
var statusCode = response.getResponseCode();
chain.push(currentUrl + " (" + statusCode + ")");
if (statusCode >= 300 && statusCode < 400) {
var location = response.getHeaders()["Location"];
if (!location) break;
// The Location header may be a relative path; resolve it against the current origin
if (location.charAt(0) === "/") {
  var origin = currentUrl.match(/^(https?:\/\/[^\/]+)/i);
  if (origin) location = origin[1] + location;
}
currentUrl = location;
} else {
break;
}
} catch (e) {
chain.push("Error: " + e.message);
break;
}
}
return chain.join(" -> ");
}
function getContentType(url) {
if (url === "") return "No URL provided";
try {
var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
return response.getHeaders()["Content-Type"];
} catch (e) {
return "Error: " + e.message;
}
}
function getCanonicalUrl(url) {
if (url === "") return "No URL provided";
try {
var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
var content = response.getContentText();
var canonicalMatch = content.match(
/<link[^>]*rel="canonical"[^>]*href="([^"]*)"[^>]*>/i
);
return canonicalMatch ? canonicalMatch[1] : "No canonical URL found";
} catch (e) {
return "Error: " + e.message;
}
}
function getRobotsTxt(url) {
if (url === "") return "No URL provided";
try {
var parsedUrl = url.match(/^(https?:\/\/[^\/]+)/i);
if (!parsedUrl) return "Invalid URL";
var robotsUrl = parsedUrl[1] + "/robots.txt";
var response = UrlFetchApp.fetch(robotsUrl, { muteHttpExceptions: true });
if (response.getResponseCode() === 200) {
var content = response.getContentText();
return content.slice(0, 500) + (content.length > 500 ? "..." : ""); // Return first 500 characters
} else {
return "No robots.txt found or not accessible";
}
} catch (e) {
return "Error: " + e.message;
}
}
function getMetaRobots(url) {
if (url === "") return "No URL provided";
try {
var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
var content = response.getContentText();
var metaRobotsMatch = content.match(
/<meta[^>]*name="robots"[^>]*content="([^"]*)"[^>]*>/i
);
return metaRobotsMatch ? metaRobotsMatch[1] : "No meta robots tag found";
} catch (e) {
return "Error: " + e.message;
}
}
function getHreflangTags(url) {
if (url === "") return "No URL provided";
try {
var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
var content = response.getContentText();
var hreflangMatches = content.match(
/<link[^>]*rel="alternate"[^>]*hreflang="[^"]*"[^>]*>/gi
);
if (hreflangMatches) {
return hreflangMatches
.map(function (tag) {
var langMatch = tag.match(/hreflang="([^"]*)"/i);
var hrefMatch = tag.match(/href="([^"]*)"/i);
return langMatch && hrefMatch
? langMatch[1] + ": " + hrefMatch[1]
: "Invalid hreflang tag";
})
.join("\n");
} else {
return "No hreflang tags found";
}
} catch (e) {
return "Error: " + e.message;
}
}
4. Save and Test
- Once you’ve pasted the code, click the Save icon or press Ctrl + S.
- Test each function individually:
  - Go back to your Google Sheet and use a function like =getHttpStatusCode(A2) to verify that it fetches the expected result.
  - Similarly, test other functions such as =getLoadTime(A2) or =getRedirectChain(A2).
5. Debug and Adjust
Although it shouldn’t throw any errors, here’s what to do just in case:
- Check the Apps Script editor for issues if a function throws an error. Common problems might include invalid URLs, network restrictions, or API limits.
- Make adjustments as needed to improve functionality or handle edge cases.
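One adjustment worth considering is resilience to transient network failures. Below is an optional, hedged sketch of a small retry wrapper around UrlFetchApp.fetch; the function name fetchWithRetry, the attempt count, and the one-second pause are my own choices, not part of the original script:

```javascript
// Optional sketch: retry a flaky fetch a few times before giving up.
// Uses Utilities.sleep (a real Apps Script service) to pause between tries.
function fetchWithRetry(url, attempts) {
  attempts = attempts || 3; // default to three tries
  var lastError;
  for (var i = 0; i < attempts; i++) {
    try {
      return UrlFetchApp.fetch(url, { muteHttpExceptions: true });
    } catch (e) {
      lastError = e;
      Utilities.sleep(1000); // wait a second before retrying
    }
  }
  throw lastError; // all attempts failed
}
```

You could then swap `fetchWithRetry(url)` in for the direct `UrlFetchApp.fetch` calls inside the custom functions if you find occasional timeouts in your results.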
6. Automate the Script
- Once all functions are working, you can optionally set up triggers (e.g., time-driven or manual triggers) to automate monitoring at regular intervals.
- To do this, go to Triggers in the Apps Script editor and set up the desired frequency.
Okay, now we’re almost done. After pasting the code and saving it, now we’ll come back to the sheet to explore how to use these functions effectively to track your metrics in real-time.
Calling the Functions in Google Sheets
After setting up the Apps Script code, you can now call these functions directly in your Google Sheet to monitor your URLs. Here’s how to do it:
1. Enter the Custom Function in a Cell
- In your Google Sheet, select the cell where you want to display the result.
- Type the custom function in the format:
  - =getHttpStatusCode(A2) – This will return the HTTP status code for the URL in cell A2.
  - =getLoadTime(A2) – This will measure the load time of the URL in cell A2.
  - =getRedirectChain(A2) – This will show the redirect chain for the URL in cell A2.
  - =getContentType(A2) – This will fetch the content type of the URL in cell A2.
  - =getCanonicalUrl(A2) – This will extract the canonical URL from the page’s HTML.
  - =getMetaRobots(A2) – This will return the meta robots tag for the URL in cell A2.
  - =getHreflangTags(A2) – This will list all hreflang tags for multilingual pages.

In my case the URLs started in cell A2; adjust the cell reference if yours start elsewhere.
2. Copy Down the Formula
After entering the formula in one cell, you can copy it down for the entire column:
- Hover over the bottom-right corner of the cell (it will show a small blue square).
- Double-click the square, and it will auto-fill the formula for the entire column.
3. Results
Each function will display its result in the cell where it’s called:
- HTTP Status Codes: Expect values like 200 (OK), 404 (Not Found), or 500 (Server Error).
- Load Time: This will show the response time in seconds (e.g., 1.23 seconds).
- Redirect Chain: A chain of URLs indicating the redirects (e.g., https://example.com -> https://www.example.com (301)).
- Content Type: Displays the MIME type (e.g., text/html, application/json).
- Canonical URL: Shows the canonical URL specified in the HTML (if any).
- Meta Robots Tag: Provides the content of the meta robots tag (e.g., noindex, nofollow).
- Hreflang Tags: Lists language-region pairs for multilingual pages (e.g., en-US: https://example.com/us).
By calling these functions directly in Google Sheets, you’ll have a real-time monitoring system that tracks essential server-side metrics for all your URLs.
Why Client-Side Metrics Cannot Be Measured?
Now, you might be wondering: if I can do all this, can I also measure other SEO metrics like Core Web Vitals? You’re not alone, my friend. I thought about this too.
When I first started working on this URL monitoring thing, my initial goal was to track everything—server response, load time, redirects, and even performance metrics like LCP (Largest Contentful Paint), FCP (First Contentful Paint), CLS (Cumulative Layout Shift), and INP (Interaction to Next Paint). However, I quickly realized a major limitation: Google Apps Script cannot measure client-side metrics. Here’s why.
Server-Side vs. Client-Side Data
To understand why Apps Script cannot measure certain performance metrics, it’s important to differentiate between server-side and client-side data.
Type of Data | Processed By | Metrics That Can Be Measured | How Apps Script Handles It |
---|---|---|---|
Server-Side Data | Web server | – HTTP status codes (200, 404, 500) – Redirect chains – Load time (server response time) – Content type – Canonical URLs & meta robots tags (extracted from raw HTML) | ✅ Can be fetched using UrlFetchApp since it’s returned in the HTTP response. |
Client-Side Data | User’s browser (after server response) | – LCP (Largest Contentful Paint): Measures when the largest visible element appears. – FCP (First Contentful Paint): Measures when the first visual content loads. – CLS (Cumulative Layout Shift): Tracks unexpected layout shifts. – INP (Interaction to Next Paint): Evaluates page responsiveness to user interactions. | ❌ Cannot be fetched using UrlFetchApp because it requires JavaScript execution and browser rendering. |
Google Apps Script operates on the server-side, meaning it can only retrieve raw HTML, HTTP headers, and network response details. It does not execute JavaScript, render web pages, or measure browser-based interactions.
Basically, Client-side performance metrics require these:
- JavaScript Execution: Many modern websites use JavaScript frameworks like React, Vue, or Angular to dynamically render content. Apps Script does not execute JavaScript within fetched HTML, meaning it cannot detect elements that load after JavaScript execution.
- Rendering Engine: Metrics like LCP and CLS depend on how the browser visually paints elements on the screen. Apps Script does not have a browser rendering engine like Chrome, so it cannot measure how content appears over time.
- User Interaction Simulation: INP measures how quickly a webpage responds to a user action like clicking a button. Apps Script operates in a server-side environment and cannot simulate real user interactions.
Because of this, metrics like LCP, FCP, CLS, and INP— which depend on how a page visually loads and responds to interactions—cannot be measured using Apps Script alone.
But then you might ask: is this totally impossible? Well, no. I’ve figured out a few ways to get these metrics:
How to Measure Client-Side Metrics?
Since Apps Script cannot capture these values, here are the best alternatives:
Option 1: Use Google PageSpeed Insights API
While Apps Script cannot directly measure LCP, FCP, CLS, and INP, you can use Google PageSpeed Insights API to fetch these metrics. Unlike Apps Script, PageSpeed Insights runs a Lighthouse audit on the page in a real browser environment.
Example API request:
https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://example.com
This returns performance data, including LCP, FCP, CLS, and INP.
✅ Advantage: Uses Google’s real-world performance data.
❌ Disadvantage: API requests can be slow and have rate limits.
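To sketch how this could plug into the same spreadsheet, here is a hedged example of a custom function that calls the PageSpeed Insights API and pulls a few lab metrics out of the Lighthouse audits. The function name getPageSpeedMetrics and the decision to return the human-readable displayValue strings are my own choices; for regular use, you should also append a free API key (`&key=...`) to avoid the strict unauthenticated rate limits:

```javascript
// Hedged sketch: fetch lab Core Web Vitals from the PageSpeed Insights
// API (v5). The audit keys below follow Lighthouse's response format.
function getPageSpeedMetrics(url) {
  if (url === "") return "No URL provided";
  try {
    var apiUrl =
      "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=" +
      encodeURIComponent(url);
    var response = UrlFetchApp.fetch(apiUrl, { muteHttpExceptions: true });
    var data = JSON.parse(response.getContentText());
    var audits = data.lighthouseResult.audits;
    return [
      "LCP: " + audits["largest-contentful-paint"].displayValue,
      "FCP: " + audits["first-contentful-paint"].displayValue,
      "CLS: " + audits["cumulative-layout-shift"].displayValue,
    ].join("\n");
  } catch (e) {
    return "Error: " + e.message;
  }
}
```

Like the other functions, this can be called from a cell as `=getPageSpeedMetrics(A2)`, but keep in mind each call runs a full Lighthouse audit and can take 15-30 seconds.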
Option 2: Use Web Vitals JavaScript Library
For ongoing monitoring, you can integrate Google’s Web Vitals library into your website. This captures real user metrics and sends them to an analytics dashboard.
Example:
import {onLCP, onINP, onCLS} from 'web-vitals';
onLCP(console.log);
onINP(console.log);
onCLS(console.log);
Automating the Process
Okay, so now let’s say you want to regularly check whether all the URLs on your website are working perfectly. But manually checking hundreds of URLs every day? That’s not practical. I know that because I tried it, and it quickly became exhausting.
So, what you can do is automate the process so that it runs at scheduled intervals without requiring manual intervention. Here’s how I’ve done this:
How to Automate the URL Monitoring Process?
Google Apps Script allows you to set up triggers that run your script at fixed intervals. Instead of manually calling each function every time you want to check the URLs, you can schedule the script to run daily, weekly, or at any custom frequency.
Step 1: Open the Apps Script Editor
- In your Google Sheet, go to Extensions → Apps Script.
- Make sure your script (the one containing functions like getHttpStatusCode()) is saved and working.
Step 2: Create a Function to Run All Checks
Since we need to check multiple URLs at once, we should write a script that:
- Loops through each URL in the sheet.
- Calls all the relevant monitoring functions.
- Stores the results back in the sheet automatically.
You can create a new function, like runBatchChecks(), which goes through each URL and fills in the monitoring data.
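As a sketch of what that batch function could look like, the example below assumes the column layout suggested earlier in this post: URLs in column A, with HTTP Status, Load Time, Redirect Chain, and Content Type in columns B through E. The column indexes are assumptions; adjust them to match your own sheet:

```javascript
// Hedged sketch: loop over every URL in column A and write the results
// of the custom functions (defined earlier in the same project) into
// columns B-E.
function runBatchChecks() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  var lastRow = sheet.getLastRow();
  if (lastRow < 2) return; // nothing below the header row

  var urls = sheet.getRange(2, 1, lastRow - 1, 1).getValues();
  for (var i = 0; i < urls.length; i++) {
    var url = urls[i][0];
    if (!url) continue; // skip blank rows
    var row = i + 2;
    sheet.getRange(row, 2).setValue(getHttpStatusCode(url)); // HTTP Status
    sheet.getRange(row, 3).setValue(getLoadTime(url));       // Load Time
    sheet.getRange(row, 4).setValue(getRedirectChain(url));  // Redirect Chain
    sheet.getRange(row, 5).setValue(getContentType(url));    // Content Type
  }
}
```

Writing values from the script (instead of per-cell formulas) also means the results persist between runs, which is handy for tracking history.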
Step 3: Set Up a Time-Based Trigger
- Open Apps Script and go to Triggers (Click the Clock Icon on the left sidebar).
- Click + Add Trigger and set up the following:
  - Choose function to run: runBatchChecks (or whatever you named it).
  - Select deployment: Head
  - Select event source: Time-driven
  - Select type of time-based trigger: Choose a schedule (e.g., every day, every hour, or every week).
- Save the trigger.
Now, your script will automatically execute at the defined schedule, fetching fresh monitoring data at regular intervals.
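If you prefer creating the trigger in code rather than through the UI, Apps Script’s ScriptApp service can do the same thing. This is a hedged sketch: it assumes your batch function is named runBatchChecks, and the 6 AM daily schedule is an arbitrary choice:

```javascript
// Hedged sketch: create the time-driven trigger programmatically.
// Deleting old triggers for the same handler first prevents duplicates
// if you run this function more than once.
function createDailyTrigger() {
  var triggers = ScriptApp.getProjectTriggers();
  for (var i = 0; i < triggers.length; i++) {
    if (triggers[i].getHandlerFunction() === "runBatchChecks") {
      ScriptApp.deleteTrigger(triggers[i]);
    }
  }
  ScriptApp.newTrigger("runBatchChecks")
    .timeBased()
    .everyDays(1)
    .atHour(6) // run once a day around 6 AM (in the script's time zone)
    .create();
}
```

Run createDailyTrigger() once from the editor, and the schedule is in place.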
Why Automate the Process?
Once you’ve set up automated URL monitoring, you’ll never have to manually check your URLs again. Here’s why automation makes sense:
Challenge Without Automation | Solution With Automation |
---|---|
Checking URLs manually is time-consuming. | The script runs automatically at set intervals. |
Human errors can lead to missed issues. | The script ensures consistent and accurate checks. |
It’s difficult to track historical performance. | Regular updates let you see trends over time. |
You might forget to check URLs regularly. | Automation ensures your site is monitored continuously. |
By automating URL monitoring, you ensure that any issues—broken links, incorrect redirects, or slow-loading pages—are detected immediately without any manual effort.
With the monitoring system now fully automated, the next step is to analyze the results and use them to improve your website’s health. Let’s move on to the next section.
SEO Benefits of URL Monitoring
Alright, so now you have a fully automated system tracking your URLs. Great! But you might be wondering: “How does this actually help my website’s SEO?” Let me break it down for you.
Imagine Google’s bots visiting your website. They’re like strict examiners—crawling your pages, taking notes, and deciding whether your site deserves a good ranking. If they find broken links, slow pages, redirect loops, or missing metadata, they’re going to dock points from your SEO score faster than a teacher catching a student copying in an exam.
Here’s why regular URL monitoring is crucial for your website’s technical SEO health:
1. Fixing Broken Links Before Google Notices
404 errors are SEO nightmares.
- A page that was once live but now returns a 404 Not Found sends bad signals to search engines.
- If users and crawlers hit broken links frequently, Google assumes your site is poorly maintained, which can negatively affect rankings.
- With automated monitoring, you can spot and fix 404 errors before they impact your traffic.
✅ SEO Impact: Ensures a smooth user experience and helps retain search rankings.
2. Keeping Redirects Under Control
Redirects aren’t bad, but too many of them? That’s a problem.
- A simple 301 redirect from an old page to a new one is fine.
- But if Page A redirects to Page B, which redirects to Page C… now you’ve got redirect chains that slow down load time and confuse search engines.
- Worse, a poorly set up 302 (temporary) redirect instead of a 301 can prevent Google from passing link equity.
✅ SEO Impact: Monitoring redirect chains ensures proper SEO-friendly redirects and prevents unnecessary delays in page indexing.
3. Improving Page Load Speed (Google Loves Fast Sites!)
Google’s algorithm considers page speed a ranking factor.
- If your pages take forever to load, visitors leave. Google notices. Your rankings drop.
- A slow page doesn’t just hurt SEO—it kills conversions too.
- Regular monitoring of server response time helps you spot performance issues before they affect rankings.
✅ SEO Impact: Faster pages improve user experience, lower bounce rates, and boost rankings.
4. Ensuring Canonical Tags Are Set Correctly
Duplicate content confuses search engines.
- If two URLs serve the same content (example.com/page vs. example.com/page?ref=twitter), Google might pick the wrong one to rank.
- A canonical tag tells Google which version to index.
- Monitoring canonical URLs regularly helps avoid duplicate content issues that could impact search rankings.
✅ SEO Impact: Prevents duplicate content penalties and ensures correct indexing.
5. Keeping Robots.txt and Meta Robots in Check
Ever had a page disappear from Google without explanation? It could be because of your robots.txt file or meta robots tag.
- A misconfigured robots.txt file can accidentally block search engines from indexing important pages.
- A meta robots tag with noindex on a critical page? Say goodbye to organic traffic.
- Automating these checks ensures that you’re not unintentionally telling Google to ignore your best pages.
✅ SEO Impact: Ensures search engines crawl and index the right pages.
6. Tracking Hreflang Tags for Multilingual Sites
If you run a multilingual website, hreflang tags tell search engines which page version to show to users in different regions.
- Incorrect or missing hreflang tags can lead to the wrong pages ranking in search results (like showing your French page to an English-speaking audience).
- With automated monitoring, you can ensure your multilingual SEO is on point.
✅ SEO Impact: Helps search engines serve the right content to the right audience, improving international SEO.
SEO isn’t just about writing great content or getting backlinks. It’s also about technical health. Regular URL monitoring ensures that Google sees your website as a reliable, well-maintained platform—which means better rankings and a smoother experience for users.
So, if you want higher rankings, better user experience, and fewer SEO headaches, automated URL monitoring isn’t just a nice-to-have—it’s a must.
And now that you know the SEO impact, let’s wrap things up. 🚀
Conclusion
Managing a website isn’t just about publishing content—it’s about keeping things running smoothly. And trust me, manually checking URLs, redirects, and SEO-critical elements is a time sink you don’t want.
By automating URL monitoring with Google Sheets and Apps Script, you now have a system that:
✅ Checks for broken links before they affect SEO.
✅ Tracks redirects to prevent unnecessary chains.
✅ Measures load times to keep your website fast.
✅ Monitors canonical tags and robots.txt to ensure proper indexing.
✅ Keeps hreflang tags in check for international SEO.
This isn’t just about saving time—it’s about staying ahead of SEO issues before they hurt your traffic. And the best part? It runs on autopilot.
Now, instead of wasting hours tracking URLs across multiple tools, you have one place where everything is monitored and updated. Your website stays optimized, Google stays happy, and you focus on growth instead of firefighting.
So, if you haven’t already, set up your Google Sheet, add the script, and start monitoring your URLs today. Your future self (and your rankings) will thank you. 🚀
People Also Ask:
How can I monitor my website’s uptime using Google Sheets and Apps Script?
By combining Google Sheets with Apps Script, you can create a tool that periodically checks your website’s status. This involves writing a script that sends HTTP requests to your site’s URLs and records the responses in the spreadsheet. This setup allows for automated tracking of your website’s availability.
Is it possible to detect content changes on a website using Google Apps Script?
Yes, you can monitor a website for content changes using Google Apps Script. This typically involves fetching the website’s HTML content and comparing it to a previously stored version to detect any differences. However, be aware that some websites may have dynamic content that changes with each visit, which can complicate this process.
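A minimal sketch of that comparison approach, assuming MD5 hashing via Utilities.computeDigest and PropertiesService for storing the last-seen hash (both real Apps Script services; the function name and the "hash:" property-key prefix are my own choices):

```javascript
// Hedged sketch: hash the fetched HTML and compare it with the hash
// stored on the previous run. Dynamic pages will trigger false
// positives, as noted above.
function hasContentChanged(url) {
  var html = UrlFetchApp.fetch(url, { muteHttpExceptions: true }).getContentText();
  var digest = Utilities.computeDigest(Utilities.DigestAlgorithm.MD5, html)
    .map(function (b) {
      // bytes come back signed (-128..127); convert each to two hex chars
      return (b + 256).toString(16).slice(-2);
    })
    .join("");
  var store = PropertiesService.getScriptProperties();
  var previous = store.getProperty("hash:" + url);
  store.setProperty("hash:" + url, digest);
  // The first run has nothing to compare against, so report "no change"
  return previous !== null && previous !== digest;
}
```

Pairing this with a time-driven trigger gives you a simple change detector; for noisy pages, you could hash only a stable fragment of the HTML instead of the whole document.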
How do I set up automated checks in Google Sheets to monitor URLs?
In Google Sheets, you can write custom functions using Apps Script to check URLs. For instance, you can create a function that fetches a URL and returns its HTTP status code. By applying this function across a list of URLs in your sheet, you can monitor their statuses. Additionally, setting up time-based triggers in Apps Script allows these checks to run automatically at specified intervals.
Can I receive alerts if my website experiences downtime using Google Sheets and Apps Script?
Yes, by utilizing the MailApp service in Apps Script, you can configure your script to send email notifications when your website returns an unexpected status code, such as 404 or 500. This ensures you’re promptly informed of any issues affecting your site’s availability.
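As a hedged sketch of such an alert, the function below re-checks every URL in column A using the getHttpStatusCode function from this post and emails a summary of any failures. The recipient address is a placeholder, and the 400+ threshold is my own choice; wire this to a time-driven trigger for continuous alerting:

```javascript
// Hedged sketch: scan column A, flag URLs returning 4xx/5xx status
// codes, and send one summary email via MailApp.
function alertOnDowntime() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  var lastRow = sheet.getLastRow();
  if (lastRow < 2) return; // no URLs below the header row

  var urls = sheet.getRange(2, 1, lastRow - 1, 1).getValues();
  var failures = [];
  for (var i = 0; i < urls.length; i++) {
    var url = urls[i][0];
    if (!url) continue;
    var status = getHttpStatusCode(url); // reuses the function from this post
    if (typeof status === "number" && status >= 400) {
      failures.push(url + " returned " + status);
    }
  }
  if (failures.length > 0) {
    MailApp.sendEmail(
      "you@example.com", // placeholder: replace with your address
      "URL monitoring alert: " + failures.length + " failing URL(s)",
      failures.join("\n")
    );
  }
}
```

Sending one summary email per run (instead of one per URL) also helps you stay within Apps Script's daily email quota.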
Are there limitations to using Google Sheets and Apps Script for website monitoring?
While this approach is effective for basic monitoring, there are some limitations:
1. Execution Time: Scripts have a maximum execution time, which can be restrictive for monitoring a large number of URLs.
2. Quota Limits: There are daily quotas for sending emails and making external requests.
3. Dynamic Content: Monitoring websites with frequently changing content can lead to false positives when detecting changes.