monitoring – The Official Blog – https://www.alertbot.com/blog/

Synthetic Monitoring: Frequently Asked Questions
https://www.alertbot.com/blog/index.php/2025/03/05/synthetic-monitoring-frequently-asked-questions/ (Wed, 05 Mar 2025)

Synthetic Monitoring: Frequently Asked Questions

One of the most important features in a comprehensive, enterprise-grade web monitoring solution is synthetic monitoring. Below, we answer some frequently asked questions so that you can clearly understand what it is, how it works, and why it’s essential rather than optional.

 

What is synthetic monitoring?

Simply put, synthetic monitoring is a method of simulating the journeys that visitors take on a website and then evaluating performance. The main purpose is to proactively identify errors or bottlenecks (so your team can fix them), including hard-to-find flaws that may be associated with variables such as browsers, devices, geography, or networks. Synthetic monitoring is also ideal for improving and optimizing performance (i.e., making transactions and workflows faster and simpler).

 

What are some critical insights that synthetic monitoring can reveal?

Synthetic monitoring can provide answers to core questions such as:

  • How fast is our website response time at the moment?
  • Are all our complex transactions (e.g., filling out forms, adding items to carts, etc.) functioning correctly and optimally?
  • What areas of our website receive a limited amount of traffic, and is this normal and expected or a potential problem?
  • If we are experiencing a failure or slowdown, where exactly is it?

 

How does synthetic monitoring work?

Essentially, there are three steps to setting up and implementing synthetic monitoring:

  1. Create scripts that simulate visitor interaction and behavior.
  2. Collect data gleaned from scripts.
  3. Analyze collected information so you can fix identified problems and optimize performance.
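As a rough illustration (not AlertBot’s actual implementation), the three steps above can be sketched in a few lines of Python. The journey URLs and step names here are hypothetical:

```python
import time
import urllib.request

# Hypothetical journey: the URLs and step names below are illustrative.
JOURNEY = [
    ("Homepage", "https://www.example.com/"),
    ("Product page", "https://www.example.com/product"),
]

def run_synthetic_check(journey):
    """Steps 1-2: simulate the visitor's journey and collect timing data."""
    results = []
    for name, url in journey:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                ok = 200 <= resp.status < 300
        except OSError:  # DNS failure, timeout, HTTP error, etc.
            ok = False
        results.append({"step": name, "ok": ok,
                        "seconds": time.monotonic() - start})
    return results

def slowest_step(results):
    """Step 3: analyze the collected data to find the bottleneck step."""
    return max(results, key=lambda r: r["seconds"])["step"]
```

A real product runs checks like this on a schedule, from multiple locations and browsers, and alerts on failures; the sketch just shows the simulate-collect-analyze loop.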

 

What is the difference between synthetic monitoring and journey monitoring?

They are the same thing, although generally the term synthetic monitoring is more common across leading web monitoring solutions.

 

How can synthetic monitoring help improve competitive advantage?

Synthetic monitoring is ideal for benchmarking performance against competitors. You can also use it to simulate traffic from different geographic locations to track APIs, SaaS products, etc. This can be especially helpful for identifying peak markets. And synthetic monitoring can help safeguard your business during times of anticipated traffic spikes (e.g., Black Friday, Cyber Monday, etc.), by alerting you of any problems right away so you can make sure your website doesn’t miss a beat.

 

How can synthetic monitoring help improve third-party vendor compliance and performance?

You can use synthetic monitoring to help ensure that SaaS vendors are meeting their Service Level Agreement (SLA) commitments.
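For instance, checking SLA compliance from monitoring data boils down to comparing measured uptime against the committed target. A minimal sketch (the 99.9% target is just an example):

```python
def uptime_percent(total_checks, failed_checks):
    """Percentage of monitoring checks that succeeded."""
    return 100.0 * (total_checks - failed_checks) / total_checks

def meets_sla(total_checks, failed_checks, sla_target=99.9):
    """True if measured uptime meets the vendor's committed target."""
    return uptime_percent(total_checks, failed_checks) >= sla_target
```

For example, 5 failures out of 10,000 checks is 99.95% measured uptime, which clears a 99.9% SLA; 20 failures (99.8%) would not.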

 

We are using, or thinking of using, real user monitoring. Is this sufficient?

In the distant past this was probably fine, but these days synthetic monitoring is far superior and widely recommended by experts. Here is why: real user monitoring (RUM) evaluates website performance, in the moment and over time, using data collected from real visitors rather than simulated traffic.

In theory, this is good. But in practice, it’s problematic because there can be use cases and workflows that visitors may not trigger, but nevertheless represent performance pitfalls and other vulnerabilities. The scope of synthetic monitoring is much wider and deeper, and it’s not limited to what visitors may or may not have done in the past, or are doing at the current time. RUM is a lake, while synthetic monitoring is an ocean.

 

How can we learn more about synthetic monitoring?

Easy! Just sign up for a FREE TRIAL of AlertBot. There is no billing information required, no installation, and you’ll be set up in minutes. And there is even better news!

AlertBot’s celebrated multi-step synthetic monitoring script recorder is simple and easy to use. Just click record, interact with your website (e.g., fill out forms, add items to your cart, etc.), and then upload your completed script to dive deep into the granular workflow details. You will clearly see what’s working and what isn’t, as well as what should be improved to optimize visitor experience. There is NO programming required!

 

Start your FREE TRIAL of AlertBot now: click here.

Unleashing the Web Guru: How Website Monitoring Boosts Traffic
https://www.alertbot.com/blog/index.php/2023/07/31/unleashing-the-web-guru-how-website-monitoring-boosts-traffic/ (Mon, 31 Jul 2023)

Unleashing the Web Guru: How Website Monitoring Boosts Traffic

by Louis Kingston

In the vast, mystical realm of the internet, where websites come to life and cat videos rule the land, there resides a hidden hero – Website Monitoring. Armed with lightning-fast reflexes and a vigilante’s keen eye, this unsung champion is the secret sauce to soaring traffic.

Picture this: your website is a thriving carnival, with merry-go-rounds of content and rollercoasters of creativity. But, alas, like an absent-minded wizard, you’ve forgotten to keep an eye on the gates. Enter Website Monitoring, the loyal gatekeeper who ensures no trolls sneak in to mess up your virtual fiesta. With a mischievous grin, it sends you real-time alerts the moment any gremlins try to mess with your website’s uptime. Your website’s downtime days are numbered!

Now, let’s journey into the realm of speed. In a world where every second counts, your website’s performance is its very heartbeat. But fret not, dear web adventurers, for Website Monitoring is the swiftest hare in the web-jungle. Armed with its trusty stopwatch, it tracks your page loading times like a hyperactive roadrunner, shouting, “Faster! Faster!” before your visitors can even say, “Are we there yet?” Voilà! Your website now zooms like a caffeine-fueled cheetah on the digital savannah.

Oh, but the fickle web travelers; they change their minds like chameleons change colors. Fear not, for Website Monitoring is here to unravel this enigma. With its mystical analytics, it becomes your crystal ball, revealing the mysteries of visitor preferences and behaviors. You’ll know what they like, what they loathe, and what they yearn for more than a lifetime supply of authentic New York style pizza. Armed with this newfound wisdom, you’ll sprinkle enchanting content like fairy dust, keeping your visitors spellbound and coming back for more.

Behold the battlefield of the mighty search engines, where websites engage in an epic struggle for visibility. But alas, valiant webmasters, Website Monitoring dons its armor of SEO prowess. It crawls through the darkest corners of the interwebs, sniffing out broken links and bad keywords like a digital bloodhound. Armed with this knowledge, you’ll climb the search engine ranks like a warrior scaling Mount Everest – and trust me, you won’t need oxygen!

In this whimsical tale of website wonders, we’ve unveiled the magical powers of Website Monitoring – the tireless protector of uptime, the guardian of speed, the oracle of analytics, and the knight of SEO. So, dear webmasters, heed this advice: with Website Monitoring by your side, you’ll wield the mighty sword of traffic-increase like a modern-day King Arthur.

Embrace the power of Website Monitoring and may your website’s journey throughout your site be filled with joy, triumphs, and an army of loyal visitors marching towards your digital domain!

Say goodbye to web nightmares and embrace the hero you deserve: AlertBot! Our supercharged website monitoring service is the ultimate sidekick you need to keep your online kingdom running smoothly. With AlertBot by your side, you’ll enjoy 24/7 vigilance, lightning-fast alerts, and more data than you can shake a unicorn horn at. So, what are you waiting for? Join the epic quest for flawless websites and unleash the power of AlertBot today – because even Gandalf would agree, “You shall not pass…without website monitoring!”

 

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

A Closer Look at AlertBot’s Failure Reporting Feature
https://www.alertbot.com/blog/index.php/2023/02/21/a-closer-look-at-alertbots-failure-reporting-feature/ (Tue, 21 Feb 2023)

The year was 1995. Michael Jordan returned to the NBA. Amazon sold its first book. Windows 95 unleashed the era of taskbars, long filenames, and the recycle bin. And when people weren’t dancing the Macarena, they were flocking to see Apollo 13 and hear Tom Hanks utter the phrase that would launch millions of (mostly annoying) impersonations: “Houston, we have a problem.”

Thankfully, the eggheads in space and the eggheads on the ground worked tirelessly (and apparently smoked a whole lot of cigarettes) to get the crew home. But it was the pivotal moment when the failure was first reported that triggered the spectacular problem-solving process. If it happened an hour — or maybe even a few minutes — later, then the outcome could have been tragic instead of triumphant.

Admittedly, the brave, intrepid professionals in charge of keeping their organization’s website online and functional DON’T have to deal with life-and-death scenarios. But they DO need to deal with problems that, if left unsolved, will significantly damage competitive advantage, brand reputation and sales (immediately if we’re talking e-commerce, and eventually if we aren’t). And that’s where AlertBot’s failure alerting feature enters the picture.

What is Failure Alerting?

Failure alerting is when designated individuals — such as a SysAdmin, CTO, CIO, CEO, and so on — are proactively notified when something goes wrong with a website, such as downtime, errors, slowness, or unresponsive behavior.

As a result, just like in Apollo 13, the right people can take swift, intelligent action to fix things before visitors/customers sound the alarm bell, or worse, head out the (virtual) door and go straight to a competitor without looking back.

Notification Methods

AlertBot customers can choose any or all of the following methods to notify team members of a website failure event:

  • Email
  • Text Message
  • Phone Call

For example, a SysAdmin could receive an email, a text message, and a phone call the moment something goes wrong.

Automatic Escalation

Now, if we were in NASA Mission Control circa 1970, someone wearing really thick horn-rimmed glasses would rise above the cigarette smoke and ask: What happens if the SysAdmin doesn’t receive the email, text message, and phone call? It’s a good question, and there is an even better answer: don’t worry about it.

AlertBot’s failure reporting feature can be configured to escalate the website failure warning if certain individuals don’t respond within a specific timeframe. For example, if a SysAdmin is indisposed for any reason (driving, sleeping, etc.), then after two minutes the alert can be pushed to another designated team member such as the CTO. And if the CTO doesn’t respond within two minutes, then the alert can be pushed to the CIO, and so on.

Ideally, the individual (or individuals) who receive the first alert see it immediately and take rapid action. But if they don’t or can’t, then the alert is escalated accordingly. It is important to note that all of this happens automatically, removing human error from the escalation process.
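The escalation logic described above can be sketched as follows. The roles and chain are illustrative (not AlertBot’s configuration), and a real system would wait the configured timeout between alerts rather than loop instantly:

```python
# Illustrative escalation chain; the roles are examples only.
ESCALATION_CHAIN = ["sysadmin", "cto", "cio"]

def escalate(chain, acknowledged_by=None):
    """Alert each contact in turn, stopping as soon as one acknowledges.

    Returns the list of contacts who were alerted. A real system sends
    the email/text/call and waits the configured timeout (e.g., two
    minutes) for an acknowledgment before moving down the chain.
    """
    alerted = []
    for contact in chain:
        alerted.append(contact)  # send email / text / phone call here
        if contact == acknowledged_by:
            break  # someone acknowledged: stop escalating
    return alerted
```

So if the SysAdmin is driving and the CTO acknowledges, the chain stops at the CTO; if nobody responds, every contact in the chain ends up alerted.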

Granted, none of this is as entertaining as watching Apollo 13. There’s no rousing soundtrack or Tom Hanks. Heck, there’s not even Kevin Bacon.

But when it comes to fixing website problems as quickly as possible, organizations know that the less drama, the better. That’s precisely what AlertBot’s multi-channel, auto-escalating failure reporting feature delivers. We don’t need an Oscar. We just need extremely satisfied customers — and we have a lot of those.

 

Next Up: Reviewing Failure Events Online

In our next blog, we’ll explore how to review failure events online to pinpoint issues and diagnose problems. Stay tuned!

Launch a free trial of AlertBot’s acclaimed site uptime monitoring solution. No credit card. Nothing to download. Get started in minutes. And if you decide to purchase our solution, there are NO setup fees!

Multi-Step Monitoring: Why it’s Essential and How it Works
https://www.alertbot.com/blog/index.php/2022/06/06/multi-step-monitoring-why-its-essential-and-how-it-works/ (Mon, 06 Jun 2022)

Multi-Step Monitoring: Why it’s Essential and How it Works

The term “essential” is thrown around pretty loosely these days. That new show about the hospital (no, not that one… not that one either… yeah, that one) is advertised as essential viewing. A newly released track by a hip hop artist describing how little they need to release new tracks in order to live much, much better than the rest of us? That’s essential listening. And how can we forget that new muffin that cannot legally be advertised as a muffin, because it is technically more of a candy. That’s essential snacking (“mmmmmm… pseudo muffin”).

But then on the other end of the hype spectrum, there are things that are legitimately essential, because going without them could lead to dire consequences — or maybe even a catastrophe. And for e-commerce companies, one tool that truly qualifies as essential is multi-step monitoring.

What is Multi-Step Monitoring?

In a break with tradition in the complex world of technology, multi-step monitoring is pretty much what it sounds like: a way to track the various steps that customers take as they move through pages on a website. This way, businesses can proactively identify and fix problems such as buttons that don’t work, forms that won’t submit, links that don’t go anywhere, pages that take too long to load, and so on.

Why is Multi-Step Monitoring Essential?

Most customers who run into problems don’t shrug them off. They get mad. And that compels them to hit the brakes and head for the exit. In fact, a whopping 88% of online consumers are less likely to return to a site after just one bad experience. So, yeah, preventing about 9 in 10 customers from disappearing is important. One might even say that it’s… wait for it… ESSENTIAL!

How Multi-Step Monitoring Works

In AlertBot, configuring multi-step monitoring is remarkably easy, and doesn’t require an advanced degree in Hypercomplex Supergeekery, with additional specialized certifications in Megaultra Nerdology. Here is how it works (a video tutorial is also available):

  • Step 1: Log in to AlertBot.
  • Step 2: Go to “Monitors”.
  • Step 3: Set up a new monitor.
  • Step 4: Select the TrueBrowser® Multi-Step Monitor option.
  • Step 5: Download the AlertBot Recorder (available for PC currently — this step only has to be completed once).
  • Step 6: Give the monitor a name (e.g. “Amazon 1 Multi-Step Monitor”).
  • Step 7: Launch the AlertBot Recorder, input the URL of the site (e.g. Amazon.com), and record a script simply by simulating actions that a customer would take. It is also a good idea to label steps/phases (e.g. “Homepage”, “Add to Cart,” etc.), which can be helpful when analyzing reports later on.
  • Step 8: Save the script with a unique name (e.g. “Amazon test”).
  • Step 9: Upload the script into TrueBrowser® Multi-Step Monitor (which was launched in Step 4).
  • Step 10: Hit the “Test” button.
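Conceptually, the recorded journey from the steps above boils down to an ordered list of labeled steps. This sketch is purely illustrative (it is not AlertBot’s script format, and no programming is required to use the recorder); the element selectors are hypothetical:

```python
# Conceptual sketch of what a recorded multi-step journey captures:
# an ordered list of labeled phases, each with an action and a target.
script = [
    {"phase": "Homepage", "action": "open",
     "target": "https://www.amazon.com/"},
    {"phase": "Add to Cart", "action": "click",
     "target": "#add-to-cart-button"},   # hypothetical selector
    {"phase": "Checkout", "action": "click",
     "target": "#checkout-button"},      # hypothetical selector
]

def phases(script):
    """The phase labels later used to break down per-step timings."""
    return [step["phase"] for step in script]
```

Labeling the phases (Step 7) is what makes the resulting report readable: each timing and pass/fail result is attributed to a named phase rather than a raw URL.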

And that’s all there is to it. When the test is complete (this can take up to two minutes), a report is automatically generated that shows:

  • The duration of each phase/step in the process.
  • Whether each process was successful or unsuccessful.
  • A waterfall chart capturing a breakdown of everything that loads on each individual page (e.g. request times, file transfers, etc.).
  • Raw browser request data that reveals anything that is not working, or that could be contributing to a degraded user experience (e.g. loading large files or images that cause slowdowns).

Tests can be run at any time to verify that problems are fixed and improvements are made. It’s remarkably easy. And yes, it’s essential.


Learn More

Discover the benefits of multi-step monitoring. Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

 

How a Superior Site Uptime Monitoring Solution Could Save Your Organization $1.85 Million
https://www.alertbot.com/blog/index.php/2021/12/17/how-a-superior-site-uptime-monitoring-solution-could-save-your-organization-1-85-million/ (Fri, 17 Dec 2021)

We all know the pleasure we feel when we dig into an old pair of jeans and pull out a crumpled $5 bill, or when we finally get around to vacuuming our car (“Hey, I don’t remember eating onion rings in here”) and find a few bucks in loose change. It’s as if the universe has taken a moment to smile on us.

Now imagine that, instead of finding enough money to buy some more onion rings (“Oh yeah, I remember when I ate onion rings in here — wow, that was a long time ago”), you get your hands on a cool $1.85 million. Pleasure isn’t the word for that. Euphoria is.

Well, in a sense, that is what owners, investors, and anyone else who has a financial stake in your organization could feel if you choose a superior site uptime monitoring solution. Why? Because new research has revealed that $1.85 million is the average price tag that organizations pay to recover from a ransomware attack — a figure that has more than doubled in the last year. Let’s unpack this by taking a look at ransomware, and then explaining the link to site uptime monitoring.

What is Ransomware?

Essentially, ransomware is a type of malware that infects a computer and blocks access to it unless the victim pays a fee (a.k.a. a ransom). And if that were not nefarious enough, there are two other things about ransomware that need to be part of the story.

The first is that victims are given a very limited amount of time to pay up. If they fail to do so, then the threat — which is often carried out — is that they will permanently lose access to their data, or their data will end up being disclosed on the dark web or elsewhere. The second is that even after they pay the ransom in full, only 8% of victims get 100% of their data back, and 29% get up to 50% of their data back. In the legitimate business world, this kind of chronic non-fulfillment would lead to excessive customer loss, and probably lawsuits and investigations. But on the cybercrime landscape, it’s standard operating procedure. There is no complaints department or review site (“We were very disappointed in this hacker who failed to return all of our data, but we are adding a star because communication was prompt”).

Where a Site Uptime Monitoring Solution Enters the Picture

A superior site uptime monitoring solution cannot block ransomware attacks. For strategies and tactics on that front, we recommend this helpful article at eSecurityPlanet.com, and this site by the Cybersecurity and Infrastructure Security Agency (CISA).

However, a superior site monitoring solution CAN do something that hackers earnestly hope potential victims do not realize: immediately alert them to a ransomware attack — even if it’s at 3:00 am — so they can rapidly roll out an uncorrupted backup and carry on without disruption or (and here is the euphoric part) having to pay $1.85 million or more in ransom/recovery costs.

Then, the organization can move to fortify cybersecurity defenses and reduce the size of the attack surface (probably by deploying many of the recommendations highlighted by the sources listed above), ultimately reducing the likelihood of future ransomware attacks.

The Bottom Line

Ransomware is on the rise, with the number of reported incidents surging 183% between the first two quarters of 2021. A superior site uptime monitoring solution won’t stop these attacks or frankly even slow them down. Hackers are notorious for doing things over and over again until they stop working — and unfortunately, ransomware is quite profitable. But it can give organizations the warning and time they need to strengthen their defenses, and in the process potentially save an average of $1.85 million.

Launch a free trial of AlertBot’s superior site uptime monitoring solution. No credit card. Nothing to download. Setup in minutes.  

 

Debunking 3 Website Availability Monitoring Myths
https://www.alertbot.com/blog/index.php/2021/05/27/debunking-3-website-availability-monitoring-myths/ (Thu, 27 May 2021)

Debunking 3 Website Availability Monitoring Myths

by Louis Kingston

Some myths in life are harmless, or even helpful. For example, Santa Claus has come in very, very handy for parents who want to nudge their kids from the naughty list to the nice one. And let’s give a round of applause to the Tooth Fairy, whose promise of nominal financial compensation has turned the prospect of losing a tooth from a meltdown trigger into a motivational factor.

However, other myths are on the opposite end of the spectrum: they lead to stress and costs. The bad news is that there are some rather notorious website availability monitoring myths out there. But the good news is that debunking them is simple. Here we go:

Myth #1: Free website monitoring tools are just as good as paid versions.

The Truth: So-called free website monitoring tools are riddled with gaps and vulnerabilities — simply because they’re free, and the folks who make them aren’t trying to provide a public service or earn some good karma. They’re in business, and that means there’s always (always!) a hook. Here are some of the drawbacks: zero technical support, excessive false positives, reduced test frequencies, limited testing locations, and s-l-o-w product updates. For a deeper dive into these pitfalls, read our article here.

Myth #2: Buying website availability monitoring from your host is a smart idea.

The Truth: Your web host probably offers website availability monitoring, and keeps pestering you to buy it. What’s the harm? Well, here’s the harm: your web host is a web host. That’s their jam. They don’t specialize in website monitoring, which means that customers like you are going to pay for their lack of competence and capacity. And on top of this, your web host has an inherent conflict of interest when it comes to giving you the full picture — because your hosting agreement includes uptime standards. As such, they may be less inclined to be fully transparent if they fall below this standard. Or to put it bluntly: they might lie, and you’ll have a really hard (if not impossible) time trying to detect and prove it. For more insights on why it’s a bad idea to buy website monitoring from your host, read our article here.

Myth #3: Website availability monitoring is just about website availability monitoring.

The Truth: This last myth is especially tricky. Yes, website availability monitoring is about website availability monitoring. But that’s not where it ends. Comprehensive (i.e. the kind your business needs) website monitoring also analyzes key aspects such as website usability, speed and performance — because there are situations where a website can be available, but not accessible or optimized. To learn more about why comprehensive website availability is not just a technical necessity but also a customer experience requirement, read our article here.

The Bottom Line 

Does your kid have a toothache that threatens to go DEFCON 1? Tag-team the myths of the Tooth Fairy and Santa to avert a meltdown (and hey, you might even score some extras out of the deal, like getting them to clear the dishes after dinner or clean out the cat litter — kids are tough negotiators, but see what you can get).

But if you want to keep your business safe and strong, then steer clear of all myths, and equip yourself with the clarifying truths revealed above.

And speaking of clarifying truths: AlertBot TRULY offers world-class, surprisingly affordable and end-to-end comprehensive website availability monitoring — which is why it’s trusted by some of the world’s biggest companies. See for yourself by starting your free trial now.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

The Basics of DNS Monitoring: What It Is, How It Works, and Why It’s Essential for Your Business
https://www.alertbot.com/blog/index.php/2020/06/19/the-basics-of-dns-monitoring-what-it-is-how-it-works-and-why-its-essential-for-your-business/ (Fri, 19 Jun 2020)

The Basics of DNS Monitoring: What It Is, How It Works, and Why It’s Essential for Your Business

by Louis Kingston

On Star Trek, there’s an incredibly useful device called the universal translator. As you’d expect, it allows everyone to understand each other. For example, if Captain Jean-Luc Picard bumped into a race of aliens that bore a striking resemblance to Commander Riker’s beard, then they could set a date for some Earl Grey tea (hot) thanks to the universal translator. Without it, there might be grave misunderstandings and the firing of photon torpedoes.

DNS: The Next Generation

Well, the internet has its own kind of universal translator, which is somewhat less gloriously called the Domain Name System, or DNS for short. Essentially, DNS is the naming system that converts domain names into Internet Protocol (IP) addresses so that computers can identify and communicate with each other on the internet as well as on private networks. Without the universal language of DNS, surfing the web wouldn’t be surfing at all. It would be more like wading through quicksand, because we’d all have to keep track of hundreds, if not thousands, of IP addresses.

How DNS Works

Let’s say that you type “Google.com” into your web browser. Behind the scenes, your browser sends out a request to a recursive name server in order to get the IP address for Google.com (if the recursive name server doesn’t already have the answer cached, it works its way through the DNS hierarchy — from the root servers to the top-level-domain servers to the authoritative name server for that domain — to find it). Ultimately, provided that the website in question exists, the browser is given an IP address that tells it precisely where to go.

Now, does this mean that you could type in the IP address and cut out the middleman? Yes. For example, if you really wanted to, you could type 172.217.10.14 — one of Google’s IP addresses — into your browser and head straight to Google.com without passing a DNS (or collecting $200). But why would you want to? DNS allows you to remember simple names instead of long numeric addresses.
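You can watch the same translation a browser performs by asking your system’s resolver directly; a minimal Python sketch:

```python
import socket

def resolve(hostname):
    """Return an IPv4 address for a hostname, using the same system
    resolver a browser consults behind the scenes before connecting."""
    return socket.gethostbyname(hostname)
```

For example, `resolve("localhost")` yields `127.0.0.1`. For a large site the answer varies by location and over time, which is exactly why it’s easier to let DNS remember the numbers for you.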

Why DNS Monitoring is Essential: Part 1

The first reason why your business needs DNS monitoring should be self-evident: if for any reason your site name isn’t resolving, then visitors won’t be able to reach it. For all intents and purposes, it will be down. Constant and automated monitoring checks to see that everything is working and there is no need for anyone to scream “RED ALERT!”

Why DNS Monitoring is Essential: Part 2

DNS monitoring also checks to see that the name resolution process is swift vs slow. Why is this so important? Consider this:

  • A study by Kissmetrics found that a one second delay in load time can send conversion rates plunging by seven percent.
  • Google, a company notorious for never confirming or denying anything to do with its ultra-secret search engine algorithm, has bucked tradition and formally verified that page speed is a significant SEO ranking factor for mobile and desktop searches (now please hold still for the memory wipe procedure).
  • Thanks (or make that “no thanks”) to a phenomenon that psychologists dub the perception of speed, visitors don’t just dislike slow websites: they hate them with a passion that borders on — and often surpasses — blinding hatred. With apologies to Shakespeare: Hell hath no fury like a visitor delayed.

Why DNS Monitoring is Essential: Part 3

Hackers frequently target DNS servers to redirect visitors to sites that deliver malware. Even scarier, hackers can obtain SSL encryption certificates that allow them to intercept and decrypt email and virtual private network (VPN) credentials.

The Bottom Line

DNS Monitoring lets you know three things that are more important than not plugging in a hair dryer when the U.S.S. Enterprise goes to warp speed: that your site is up, that your DNS server has not been hijacked by hackers, and that it’s resolving quickly. Without this information, the only way you will know that something is wrong is when angry customers or panicked colleagues start calling.

Boldly Go with AlertBot!

AlertBot automatically and continuously monitors your DNS servers (regardless of where they are located) to ensure that everything checks out, including A records (IPv4), AAAA records (IPv6), aliases (CNAME), SMTP mail server mappings (MX records), DNS zone delegates (NS records), SOA serial numbers, and more. And if an issue is suspected or detected, your team is immediately alerted so they can take action and solve the problem.
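As a point of comparison, Python’s standard library alone can fetch the basic A and AAAA answers; the richer record types listed above (CNAME, MX, NS, SOA) require a dedicated DNS library. A sketch:

```python
import socket

def check_basic_records(hostname):
    """Collect the IPv4 (A) and IPv6 (AAAA) answers the system resolver
    returns for a hostname. Richer record types (CNAME, MX, NS, SOA)
    need a dedicated DNS library rather than the standard resolver API."""
    infos = socket.getaddrinfo(hostname, None)
    return {
        "A": sorted({ai[4][0] for ai in infos if ai[0] == socket.AF_INET}),
        "AAAA": sorted({ai[4][0] for ai in infos if ai[0] == socket.AF_INET6}),
    }
```

A monitoring service layers scheduling, multiple vantage points, and alerting on top of lookups like this one.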

Start a free trial now and boldly go with AlertBot!

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

Word (and Warning) to the Wise: Site Downtime isn’t Just a Technical Issue — it’s a Customer Experience Problem
https://www.alertbot.com/blog/index.php/2020/01/13/word-and-warning-to-the-wise-site-downtime-isnt-just-a-technical-issue-its-a-customer-experience-problem/ (Mon, 13 Jan 2020)

Word (and Warning) to the Wise: Site Downtime isn’t Just a Technical Issue — it’s a Customer Experience Problem

by Louis Kingston

Businesses of all sizes — from small startups to large enterprises — are spending an enormous amount of money and time to deliver outstanding customer experience (CX). For example, they’re deploying contact centers, implementing customer-friendly return and warranty policies, training their workforce to be customer-centric, and the list goes on. And now, according to research by Walker Insights, CX is poised to overtake price and product as the most influential brand differentiator. To put this another way: customers are happily willing to pay a higher price, and for a more limited selection, if they’re getting the attention, performance, respect and results they expect — and frankly, demand.

The CX Gap that is Swallowing Customers

However, despite the fact that the CX party has been going on for a while and there’s no slowdown in sight, there’s a gap that many businesses are overlooking — one that is swallowing up their current and future customers, and transporting them directly to the competition: site downtime.

Here’s the thing: traditionally, site downtime has been primarily, if not exclusively, viewed through a technical lens, similar to a car breaking down or a roof springing a leak. And there is obviously truth in this perception. But it’s not the whole story, because customers out there on the virtual landscape equate site experience with customer experience. As such, when a site goes dark, they don’t think: “This customer-centric business has a technical problem with their website, and is surely going to fix it ASAP.” Instead, they think: “Wow, if this is what their website is like, then the rest of the business must be just as dysfunctional.”

Now, is this perception fair? Frankly, no. The vast majority of businesses — let’s say 99% of them — with site downtime truly care about delivering good (if not great) CX. These are the same businesses that, as noted above, are spending plenty of money and time on CX-related investments and training. They seriously and urgently want to get CX right.

But when their website breaks down or blows a virtual tire, this legitimate, longstanding investment and CX commitment is undermined — and customers react accordingly. Here are some of the grisly numbers:

  • 50% of customers say they have abandoned a transaction or purchase due to poor customer service.
  • 51% of customers say they will never do business with a company again after one instance of poor customer service.
  • 74% of customers say they are likely to switch brands if the purchasing process is too difficult.
  • 95% of customers tell others about poor customer service.

The Bottom Line 

The takeaway here isn’t that businesses need to care more about CX — because they know this already, and (hopefully) are acting on this understanding. Rather, it’s that businesses need to see the direct, immediate link between poor CX and site downtime. It’s not just a technical issue. For current and future customers, it’s the difference between whether they move forward on the buyer’s journey and serve as a profitable brand advocate, or whether they head for the exit and never look back.

Protect Your Reputation + Impress Your Customers 

AlertBot delivers world-class, surprisingly affordable monitoring that immediately notifies you when your site is not operational. You can then take rapid, focused action and solve the problem before your customers form the wrong impression — and never give you a second chance to make it right. Launch your free trial of AlertBot today.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes https://www.alertbot.com/blog/index.php/2019/07/22/if-you-build-it-they-wont-come-5-big-scary-and-costly-e-commerce-site-mistakes/ Mon, 22 Jul 2019 06:55:52 +0000 https://alertbot.wordpress.com/?p=623 Photograph of a corn field set against a bright blue sky. Test on it reads "If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes"

If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes

by Louis Kingston

In the 1989 flick Field of Dreams, Kevin Costner turns his Iowa cornfield into a baseball field because a voice tells him: if you build it, he will come. The “he” in question is his late father, and the movie has a magical, uplifting ending that makes us want to dream again (and possibly, play baseball or eat some corn).

Well, many folks who launch e-commerce sites also believe that: if I build it, they will come. This time, “they” means throngs of happy, profitable customers. Except…they don’t. And before long, the site is forced to scale down or shut down. Even writing to Kevin Costner doesn’t help — even if you promise to watch a double feature of The Postman and Waterworld (not recommended without a physician’s approval).

The bad news is that this kind of misery happens all the time. The good news — actually, make that the amazing, glorious, Field-of-Dreams-ending-like news — is that preventing this doom and gloom is largely a matter of avoiding these five big, scary and costly e-commerce site mistakes:

  1. Lousy UX

Tiny buttons that are impossible to click on a mobile device without a magnifying glass and hands the size of a Ken doll. Search functions that neither search nor function. Elusive top level categories. Gigantic banners that pop open and chase customers around from page to page, like a kind of online shopping Terminator (“I’ll be baaaaaack!”). These are just some of the many ways that lousy UX destroys e-commerce sites.

The remedy? Monitor all pages and multi-step processes (e.g. login areas, signups, checkout, etc.) to identify bottlenecks where customers routinely encounter errors or unresponsive behavior, and fix any gaps and leaks right away. Learn more about doing this here.

  2. S…l…o…w…n…e…s…s

Just how vital is speed? Behold these grisly statistics:

  • A one-second delay in load time can send conversion rates plunging by seven percent. (Source: Kissmetrics)
  • 70% of customers say that a website’s loading time affects their willingness to purchase. (Source: Unbounce)
  • As page load time increases from 1 second to 3 seconds, the probability of bounce increases by 32%; from 1 second to 5 seconds, by 90%; and from 1 second to 10 seconds, by 123%. (Source: Google)

The remedy? Be ruthless about making your e-commerce site as fast as possible (and then make it even faster). Here are the usual suspects: bloated HTML, ad network code, unoptimized images, and private data transmitted over public networks. There are other culprits, but look here first — you’ll be amazed at how much speed you unleash.

  3. Not Focusing on SEO — or Focusing too Much on SEO

Let’s talk about health. Some people have poor health because they don’t exercise at all. Their daily calisthenic routine involves digging in the couch for the remote. And then on the other end of the spectrum, there are people who work out too much — we’re talking extremely unhealthy levels. You know the type.

The same phenomenon occurs in the e-commerce world when it comes to SEO. Some sites don’t focus on SEO, which means they aren’t going to get found by the 35% of customers who start their buyer’s journey from Google. And others focus so much on SEO that they neglect other channels and tactics — including good, old fashioned pure promotion.

The remedy? Definitely make SEO part of the visibility strategy. But don’t make it the end-all-and-be-all of online existence. It’s important, but it’s not everything.

  4. Bad Customer Service

Customer service is as important in the online world as in the brick-and-mortar world, and in some cases it’s even more important, because exiting the buyer’s journey is so simple — as is writing a scathing zero-star review that would have made Roger Ebert wince. Unfortunately, many e-commerce sites treat customer service as an afterthought or a necessary evil, rather than an asset that should be leveraged to optimize customer experience and generate loyalty.

The remedy? Make customer service — characterized by the ease, speed, and quality of responsiveness and resolution — a big part of the plan. It’s not an expense, but an investment.

  5. Lack of Original, Compelling Content

E-commerce sites aren’t vending machines, yet many of them seem to take their inspiration from these handy contraptions that dispense candy and soda in exchange for money and the push of a button (be careful you don’t press the wrong one — you might end up with that oatmeal cookie that has been there since 2007, and not the Snickers bar that you’re craving).

However, most customers — even those who are very focused on getting a specific item, like a pair of sneakers, a smartphone, or a hotel room — want and expect to access relevant information to help them make a safer, smarter purchase decision. This could be videos, infographics, social proof (e.g. testimonials, reviews, case studies, etc.), articles, blog posts, and downloadable assets like ebooks, checklists, and so on.

The remedy? Don’t skimp on creating original, compelling content. As a bonus, this will help with SEO and can connect you with profitable customers who are not in your primary target market.

The Bottom Line

Competition on the e-commerce landscape for the hearts, minds, and indeed, wallets of customers is ferocious. Avoiding these mistakes will go a long, long way to helping your e-commerce site survive and thrive.

You may even make enough profit to retire early, buy a cornfield in Iowa, and then turn it into a baseball field that inspires the feel-good movie of the year. Hey, it worked once before, right?

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

AlertBot Showdown: VIVE vs Oculus https://www.alertbot.com/blog/index.php/2019/06/27/alertbot-showdown-vive-vs-oculus/ Thu, 27 Jun 2019 19:48:56 +0000 https://alertbot.wordpress.com/?p=611 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are wearing Virtual Reality head sets and holding the controls. Text reads "AlertBot Showdown: Oculus vs Vive" with the word SHOWDOWN very large at the bottom.

As technology continues to morph with the times, the virtual reality experience keeps becoming more widespread and immersive. Two of the leading brands in the VR game are unmistakably VIVE (HTC) and Oculus. Both companies are leaders in the ever-expanding digital world of virtual reality, and both have released, or have plans to release, new headset models this summer.

While these brands may corner the market on connecting to the virtual realm, we wondered how they stack up when it comes to the world wide web and their own individual website performance.

To test their web performance quality, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both VIVE.com and Oculus.com from May 1st through May 22, 2019. Given the high regard in which these companies are held because of their products, we expected their web performance to be strong.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both VIVE’s and Oculus’s sites performed quite well. Neither saw significant downtime, but each experienced some sluggish speeds, and even load timeouts on a couple of rare occasions.

VIVE.com experienced 99.91% uptime, with just a few errors recorded due to slow load times. None of these events lasted longer than a couple of minutes, and none of them amounted to any significant downtime. Because of this, we still consider their performance to be quite solid. (VIVE.com 8/10)

Oculus.com performed similarly, with 99.98% uptime and similar slow page load errors that didn’t amount to significant downtime but at least put a minor hiccup in their performance. They experienced four times fewer of these errors than VIVE, so they came out just a tiny bit on top. (Oculus.com 8.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring. We calculate the speed as an overall average across all locations during the time span selected for this Showdown.
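As a toy illustration of that averaging step, here is a sketch in Python. Every location name and number below is invented for the example, not a measurement from this Showdown:

```python
# Toy illustration of the averaging described above. Each location holds
# a list of timed page loads in seconds; all figures here are made up.
samples = {
    "North Carolina": [1.6, 1.8, 1.7],
    "Texas": [7.2, 7.6],
    "Oregon": [1.9, 2.1, 1.8, 1.7],
}

# Overall average: pool every sample from every location, then divide.
all_times = [t for times in samples.values() for t in times]
overall_avg = sum(all_times) / len(all_times)

# Per-location averages, the kind of figure a geographic comparison uses.
per_location = {loc: sum(ts) / len(ts) for loc, ts in samples.items()}

print(round(overall_avg, 2), per_location)
```

Note that a slow region (Texas, in this made-up data) can drag the overall average well above what most visitors actually experience, which is why the per-location breakdown matters.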

The speeds for both websites were also relatively close to each other. VIVE.com’s best speed, on average, was seen on Monday, May 13 at 3.2 seconds, which isn’t bad. Their best time of day, however, was on Tuesday, May 21 at 5am with 1.6 seconds. That’s definitely better, although it’s doubtful that they usually see a high volume of traffic that early in the morning. VIVE.com’s worst averaged day was Thursday, May 23rd at 5.1 seconds. However, their worst time was on Wednesday, May 22nd at 2pm with a much less admirable 8.8 seconds. The site’s overall average speed across the entire test period was 3.78 seconds. (VIVE.com 8/10)

Oculus.com performed very similarly. Their best day on average was Thursday, May 2nd with 3.7 seconds. Their best response time was at 9am on Wednesday, May 15 with 2.05 seconds. Oculus.com’s worst averaged day was also (like VIVE’s) Thursday, May 23rd at 4.37 seconds (although that’s slightly better than VIVE’s worst). However, their worst time of day was on Wednesday, May 1st at 6am with 7.49 seconds (making their slowest time a full second faster than VIVE’s slowest). The site’s overall average speed across the entire test period was 3.96 seconds (just a smidge slower than VIVE’s). (Oculus.com 8/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. For this portion of the test, we compare the overall average speeds of each individual location captured during the selected period of time for this Showdown.

Previously, California had reigned supreme as the fastest state in the U.S. But lately, other states have been stepping up, dethroning The Golden State. This time, North Carolina wins (for both sites), with VIVE.com moving at a breezy 1.69 seconds in The Old North State. Oregon came in second at 1.8 seconds, with Arizona at 2 seconds. Comparatively, Washington state saw the slowest speed, coming in at a shameful 10.9 seconds, with Washington, DC in second at 7.55 seconds and Texas in third at 7.43 seconds. (VIVE.com 8/10)

Oculus.com was also under two seconds, at 1.9 seconds in North Carolina. Their second fastest was 2.2 seconds in Nevada, followed by 2.3 seconds in Oregon. Overall, they were pretty close to VIVE. However, while Oculus saw a better overall “slowest” location, their second and third slowest were a little worse: Washington, DC came in at 8.66 seconds, then Washington state at 8.65 seconds, and Texas at 8.55 seconds. For the most part, though, the sites performed rather closely. (Oculus.com 8/10)

Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to see if we can order their latest VR headset.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.VIVE.com into our Chrome browser, it took 1 minute and 36 seconds (and a wealth of clicks) to come to the conclusion that you cannot order anything from their website (at least not easily, even though there’s a shopping cart icon on their menu bar), and that viewing a map to “Try VIVE Today” tells us that we have to live in Livingston, UK if we want to visit a store.

For www.Oculus.com, it took 3 clicks and 16 seconds to add the Oculus Quest 64 GB headset to our cart and be ready to checkout.

For these tests, we attempt to go into them without much prior knowledge of the site’s user side functionality to give it an unbiased test, so we’re pretty surprised at how drastically different the user experience was here. To give VIVE a fighting chance – even before trying Oculus’s site – we tried choosing a different headset in the event that maybe the most recent one isn’t available yet, and it still didn’t help. Perhaps the problem is that we’re performing the test from the US and VIVE’s parent company, HTC, appears to be UK-based. After further investigation, however, it appears that the only way to get to a purchasing option on VIVE’s site is to look at the “comparison” portion of the products page. Still, it seems odd that they wouldn’t make it easier and clearer to order their products. (Also, it appears that the webpage ends when you’re scrolling through, but it merely eventually changes the panel you’re “stopped” on as you scroll down, and then it moves you down the page to the next panel before stopping you again. It’s a neat design, perhaps, but no doubt a little confusing at first.)

With that in mind, here are the Usability scores:

(VIVE.com 5.5/10)
(Oculus.com 9/10)

 

Verdict

Both sites performed respectably, but one unexpectedly outperformed the other in speed and, especially, usability. So, we’re pleased to announce this Showdown champion to be…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Oculus.com"

AlertBot Showdown: Playstation vs Xbox https://www.alertbot.com/blog/index.php/2018/04/06/alertbot-showdown-playstation-vs-xbox/ Fri, 06 Apr 2018 19:30:53 +0000 https://alertbot.wordpress.com/?p=517 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying video game system controllers. Text reads "AlertBot Showdown: Playstation vs XBox" with the word SHOWDOWN very large at the bottom.

It may have been squashing a goomba while punching a coin out of a brick, dodging barrels being thrown by a grumpy gorilla, sorting oddly shaped falling blocks into interlocking patterns or simply catapulting miffed fowl at a group of defenseless pigs on your mobile phone, but chances are high that everyone has played a video game at one point in their life.

Poor web performance is no game any self-respecting owner of a website should play. We recently aimed our sights at the gaming industry and picked out two heavy hitters to evaluate: Xbox and Playstation. While their websites may not be the main point of interest for gamers, they’re relied upon for information, updates and even online digital game sales. Their online gaming servers may be the most important thing to keep running smoothly in gamers’ minds, but these top players in the industry will want to make sure their website stays up and always accessible.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from February 4, 2018 to February 25, 2018. Both sites performed well—as can be expected from parent companies Microsoft (Xbox) and Sony (PlayStation)—but, as usual, one performed just slightly ahead of the other, even if not by much.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both websites experienced 100% uptime, but each encountered minor errors that served as a few speedbumps along the way. Still, it wasn’t enough to qualify as downtime.

Xbox.com, despite its 100% uptime, experienced around 50 “slow page” warnings and over 20 page load timeouts (where something on the page takes a bit longer to load, slowing the page’s overall performance down). Xbox.com also returned an SSL Certificate expiration notice. However, none of these qualified as significant outages, and for that we still have to give them props. (Xbox 9/10)

Playstation fared the same with 100% uptime, and a lot better when it came to the little errors. They only registered 7 timeouts and 5 slow page loads, and for that we give them slightly higher marks. (Playstation 9.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

Speed is crucial to the gamer – be it game load times (who else hates waiting for spinning icons before getting past a cut scene or moving on to a new map?) or server responsiveness – so a speedy game company website is key. Xbox.com experienced pretty quick load times, with its best day being February 24th with an average of 4.6 seconds. Its best response time, however, was on February 23rd at noon with 2.2 seconds. On the flipside, its worst day was February 12 with 6.7 seconds (which isn’t all that bad), but its worst hour proved to be on February 11th at 11pm with a sluggish 13.1 seconds. (Xbox 8.5/10)

Surprisingly, Playstation turned out to be just a little bit slower, with their best day average being 6 seconds on February 22nd. Their best time by the hour was on the same day at noon with 2.3 seconds, just a hair slower than Xbox’s best time. Their worst day was a full second longer on February 11th with 11.7 seconds, and their worst time by the hour was also 13.1 seconds, but on February 10th at 7am. (Playstation 8/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California seems to win out most of the time as the fastest location for load times, and for Xbox.com it was no different. California saw load speeds of 2.1 seconds on average, with Florida coming in second at 2.2 seconds. Georgia, however, saw the worst average time at 10.3 seconds, with Missouri coming in second at 9.2 seconds. (Xbox 8.5/10)

Playstation.com actually turned in slightly more sluggish results geographically, too. Their best location was California, as well, but it was 2.5 seconds, and Florida was a close second at 2.7 seconds. Playstation’s slowest spots were also in Georgia and Missouri, at 12.6 seconds and 11.2 seconds, respectively. It’s not the worst we’ve seen, but Xbox clearly performed better. (Playstation 7.5/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to add a digital download of a popular video game to the shopping cart and start the checkout process.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.xbox.com into our Chrome browser and clicking around to find the Xbox One games, choosing the featured one (which, in this case, was Dragonball FighterZ), clicking “Buy Now” and getting to the account login screen, it took 1 minute and 10 seconds. From the homepage, it took 7 clicks to get to the checkout process. It had been a while since we last visited their site, so our experience was fresh, but we encountered some significant slow loading times when getting to the product page. We actually added an additional click to the process because the “Buy Now” button didn’t load properly at first (and did nothing upon its first click). Overall, we got to do what we set out to do, but the process could have gone a lot smoother.

We were hoping for a better experience from Playstation, and we got one. From the point of typing www.playstation.com into our Chrome browser, it took 4 mouse clicks and 35 seconds to find a featured video game (in this case, Bravo Team), and get to the checkout stage (which was also an account login screen). There was some delay on first clicking on the game title, but it still loaded quickly and allowed us to get to the end of the process fast.

Both sites allowed us to get the job done in a rather speedy manner, but Playstation’s site gave us a much more positive experience.

With that said, here are the Usability scores:

(Xbox 8/10)
(Playstation 9.5/10)

 

Verdict

Both sites performed very well, but that positive user experience helped push one over the other, albeit only slightly. So while it was a tough call to make, we have come to a conclusion —

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Playstation.com"

Synthetic Monitoring for SaaS: Keeping a finger on the pulse of your cloud app https://www.alertbot.com/blog/index.php/2018/02/22/synthetic-monitoring-for-saas-keeping-a-finger-on-the-pulse-of-your-cloud-app/ Thu, 22 Feb 2018 12:06:12 +0000 https://alertbot.wordpress.com/?p=511 An illustration showing cartoonish hands on a computer keyboard. The computer monitor, as well as a cell phone screen and tablet show pie charts and graphs on them. Graphics of clouds and gears are also on the image.
Synthetic Monitoring for SaaS
Keeping a finger on the pulse of your cloud app
by Penny Hoelscher

Congratulations, you have just adopted an awesome Software as a Service (SaaS) solution for your organization. Perhaps you have implemented a popular application – like Office 365, SalesForce or Dropbox – to support your staff and enhance collaboration between teams. Now you need to ensure that your employees and / or customers are happy too.

At this point, a common misconception arises: the belief that a SaaS application relieves the business of all responsibility for monitoring it. Hold that belief, and it is just a matter of time before your business is rudely awakened to reality when customers start complaining about outages or poor performance on social media, and overloading the support desk with calls.

A negative customer experience when utilizing one of your SaaS applications can affect your bottom line. Unfortunately, you cannot totally rely on your provider to keep the system ticking; even the big guys experience outages and cyber attacks. Synthetic monitoring provides a solution, a way for you to keep your finger on the pulse of your cloud services.

Taking responsibility for SaaS applications

Effective SaaS monitoring is measured by how positive the end-user experience is. For instance, if a user cannot log in to an application to retrieve a file you sent them, they will not be happy. Can you leave it up to a SaaS provider to keep you up-to-date when they have a problem? No. In fact, it is not unusual for SaaS providers to delay making press statements when they experience problems, or not to announce them at all. Organizations are fast realizing that they must take responsibility for proactively monitoring the performance of the SaaS applications they use.

In addition, in 2016 Gartner predicted that by 2018, 50 percent of enterprises with more than 1,000 users would use cloud products to monitor and manage their use of SaaS and other forms of public cloud. This reflects the growing recognition that, although clouds are usually stable, monitoring applications requires explicit effort on the part of the cloud customer.

Why do you need to monitor your SaaS applications yourself?

  • Whatever your Service Level Agreement (SLA) says, if customers cannot access your application, they are more likely to call you than your service provider, so you need to know exactly how your service is doing, rather than wait for irate customers to notify you.
  • Again, whatever your SLA says, if you are not monitoring your SaaS application, you cannot know whether the conditions of your SLA are actually being met.
  • A SaaS provider gathers generic data about all of its customers. That data may not be directly relevant to you, or sufficient to generate a meaningful performance analysis, and it will not give you all the benefits that, as we shall see, a synthetic monitoring solution can provide.

Monitoring the customer experience (CX)

Synthetic monitoring has immense benefits for monitoring SaaS applications. It can help you keep a finger on the pulse of your SaaS application by addressing the following core issues that affect the customer experience and can affect your bottom line:

  • Bottlenecks in the UX: Is your application functioning as predicted? How is it performing?
    When customer functions like logging in, using shopping carts or search fail, visitors may defect to one of your competitors. Synthetic transaction tests easily simulate these types of transactions, enabling you to be certain everything is working as it should and that recent software updates have not broken existing functionality. It is much easier to measure performance when you have a baseline. Synthetic monitoring applications, like AlertBot, have user-friendly portals that provide charts, graphs and reports to highlight deviations from the baseline. They make it that much easier to keep your finger on the pulse.
  • Error sources and reporting: If there is a problem, where is it?
    Synthetic monitoring is a powerful predictive tool. According to one leading industry professional, “Synthetic monitoring doesn’t rely on complex predictive algorithms, it doesn’t take a data scientist with a Ph.D to interpret the results, and it doesn’t require additional spending on IT infrastructure. What it does is predict, to a fair degree of accuracy, how your application will perform in which geographies and isolate the root cause of any detected bottlenecks.”
  • Downtime and outages: Is your application online and accessible?
    While synthetic monitoring is traditionally associated with customer-facing websites, it is platform-agnostic and works just as well monitoring SaaS applications. For instance, it can identify what is competing for resources on your network. Resource hogging applications can degrade the overall performance of your service.
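To make the idea concrete, here is a hedged, minimal sketch of a synthetic availability check using only Python's standard library. Real synthetic monitoring (AlertBot's included) drives full browsers through scripted multi-step transactions; this stand-in covers only a single HTTP fetch, and the URL and threshold parameters are arbitrary example values:

```python
# Minimal synthetic availability check: fetch a URL, time it, and flag
# deviations from a baseline. Thresholds here are illustrative only.
import time
import urllib.request

def synthetic_check(url, baseline_seconds=1.0, tolerance=3.0, timeout=10):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
            body = resp.read()
    except Exception as err:
        # Connection refused, DNS failure, timeout: the site is "down."
        return {"up": False, "error": str(err)}
    elapsed = time.monotonic() - start
    return {
        "up": 200 <= status < 400,
        "status": status,
        "seconds": round(elapsed, 3),
        "bytes": len(body),
        # Flag a slowdown relative to the baseline even while "up."
        "slow": elapsed > baseline_seconds * tolerance,
    }
```

Run on a schedule from multiple locations, results like these become the uptime and response-time baselines that deviation alerts are measured against.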


5 top advantages of using synthetic monitoring for SaaS applications

  1. Quickly assess the overall impact and performance of new features on your system
  2. Monitor multiple cloud centers simultaneously as well as performance in scenarios where traffic is ramping up aggressively
  3. Validate baseline performance prior to signing off on an SLA with your provider, and objectively monitor it afterward
  4. Utilize automatically generated diagnostics and reports that enable you to share information and collaborate with your SaaS provider
  5. Determine if a cloud service is down and exactly who is affected in what geographical locations


Synthetic monitoring for SaaS is growing in leaps and bounds

According to a MarketsandMarkets.com report, “Synthetic Monitoring Market by Monitoring Type (paywall),” the enterprise synthetic application monitoring market is expected to grow from $919.2 million in 2016 to $2,109.7 million by 2021, at a CAGR of 18.1 percent from 2016 to 2021. The report predicts that “SaaS application monitoring is expected to gain maximum traction during the forecast period.” Don’t get left behind.

Conclusion

A 451 Research study found that the rapid growth of public cloud services and network virtualization has often outstripped management and monitoring capabilities, creating “blind spots” in network operations’ ability to maintain internal uptime and performance benchmarks. If you only recently climbed on the SaaS bandwagon, it is likely that your existing system monitoring tools are not cloud-friendly.

You may need some help from the experts to help you keep your finger on the pulse of your new SaaS application. Mosey along to AlertBot for more information about a holistic synthetic monitoring solution.

]]>
AlertBot Showdown: HomeDepot vs Lowes https://www.alertbot.com/blog/index.php/2017/10/11/alertbot-showdown-homedepot-vs-lowes/ Wed, 11 Oct 2017 09:57:06 +0000 https://alertbot.wordpress.com/?p=449 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying planks of wood. Text reads "AlertBot Showdown: The Home Depot vs Lowe's" with the word SHOWDOWN very large at the bottom. Tiny hardware nails are sprinkled around the image.

Living in an age where nearly every industry is driven by ecommerce, it should come as no surprise that this includes the home improvement world. Home Depot and Lowes are titans in their industry, and both have a strong online presence. But when it comes to who may have the better performing site, we set out to nail down one true winner.

For our fifth website Showdown, the AlertBot team got out their proverbial measuring tape and slipped on a stylish apron to dig into the performance of HomeDepot.com vs Lowes.com.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from August 11, 2017 to August 31, 2017. Not surprisingly, the performance for these heavy lifters proved to be rather resilient for both sites. Neither service’s site experienced significant downtime, but as usual, one did prove to perform a little better than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

HomeDepot.com performed quite well over the tested time period, experiencing no failure events. At most, it had a couple hiccups, like a short-lived Timed Out error or a Slow Page File notice, but none of these occurrences caused any amount of significant downtime. (HomeDepot 9/10)

On the other hand, Lowes’ site experienced one failure event on August 21st, when the site was not responding for roughly three minutes around 12:21 in the afternoon. When errors like these occur, AlertBot retests them from a second location to confirm whether the error is widespread or just a brief localized outage. In this instance, the error persisted across tests from different locations, qualifying it as actual site downtime before the issue resolved. (Lowes 8/10)
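The multi-location confirmation step described above can be sketched in a few lines. This is a hypothetical illustration of the general technique (not AlertBot's actual code): a failure seen from a single probe location is treated as a localized blip unless additional locations reproduce it.

```python
# Hypothetical sketch of multi-location failure confirmation.
# probe_results maps a probe location name to whether its check passed.

def classify_failure(probe_results, confirm_threshold=2):
    """Classify a check result across probe locations.

    Returns "up" if every location passed, "localized blip" if fewer than
    `confirm_threshold` locations failed, and "confirmed downtime" otherwise.
    """
    failed = [loc for loc, ok in probe_results.items() if not ok]
    if not failed:
        return "up"
    if len(failed) < confirm_threshold:
        return "localized blip"
    return "confirmed downtime"
```

With this logic, a single failing probe (like the transient errors in some of our tests) never counts as downtime, while a failure reproduced from a second location does.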

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

HomeDepot.com has a great deal of graphics on the front page, which typically slows sites down considerably. However, it didn’t seem to slow this site down much. HomeDepot.com’s best day, on average, was Tuesday, August 29th with an impressive load time of 1.1 seconds. The “worst” day average was still a quick 1.9 seconds. When evaluating the site’s speed by hour, the site loaded in just 0.8 seconds at 1 AM on Sunday, August 20th. The worst hour was also on August 20th, at 2 PM with 5.1 seconds. Overall, HomeDepot.com’s speed is quite good. (HomeDepot 9.5/10)

Lowes.com has drastically less content on its front page, but it performed considerably slower than HomeDepot.com did. Sadly, Lowes’ best day was actually slower than HomeDepot’s worst, with an average of 6 seconds on Sunday, August 13th. Lowes.com’s worst day was Monday, August 26th with 7.1 seconds. That’s not horrendous, but with sites being expected to perform faster and faster these days, a respected retail giant like Lowes needs to up its speed game. On an hourly average basis, its best time was 11 PM on Wednesday, August 23rd with 7.1 seconds (again, its fastest time is slower than HomeDepot’s slowest). Its worst load time by hour was Sunday, August 27th at 1 PM with a sluggish 10.1 seconds. (Lowes 8/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

Usually when we look at site speeds across the United States, sites tend to perform better in California than anywhere else. This isn’t the case for HomeDepot.com, however. For Home Depot, Florida appeared to experience the fastest web transaction (less than one second), while the slowest transaction test was in California (but still only 2.3 seconds). After Florida, it experienced the next fastest web transactions in New Jersey and North Carolina (both at 1 second). (HomeDepot 9/10)

Lowes.com had the fastest web transaction in California at 3 seconds. The next fastest was North Carolina, already up to 4.3 seconds. The slowest performance occurred in New York at a whopping 9.4 seconds (with the second-slowest being Georgia with 9.3 seconds). (Lowes 7.5/10)

 

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater. For this Showdown, we’ll see what the experience is like to use their respective websites to add a common product to the shopping cart.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.homedepot.com into our Chrome browser and entering “leather gloves” into the search box, choosing one and adding it to the cart, it took 25 seconds. From the front page, it took 5 clicks to get to the “Checkout now” process. It wasn’t bad, but we found the Lowes process just a bit smoother.

From the point of typing www.lowes.com into our Chrome browser, it took 4 mouse clicks and 20 seconds to get the gloves into the shopping cart and view the cart. The “Add to cart” button is much more obvious and visible on Lowes’ site, where it took a moment to locate it on Home Depot’s site. And while both sites offer a “compare” option so you can look at product features side by side, it wasn’t very noticeable on HomeDepot’s site, while it was more prominent on Lowes.com.

The aesthetic of both websites isn’t bad, but Lowes has a crisper and more streamlined appearance and functionality. Both sites get the job done pretty quickly, but we had a slightly smoother experience with Lowes. With that said, here are the Usability scores:

(HomeDepot 9/10)       (Lowes 10/10)

 

Final Verdict

Both sites performed respectably, but HomeDepot.com clearly performed faster and was more reliable than Lowes.com. Although we may have preferred the shopping experience on Lowes.com just a little bit more, one cannot ignore the slower site performance.

So, for the fifth AlertBot Showdown, the site that gets to join the ranks of previous winners Apple, FedEx, and Burger King is…

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "HomeDepot.com"

]]>
AlertBot Showdown: Burger King vs McDonalds https://www.alertbot.com/blog/index.php/2017/08/28/alertbot-showdown-burger-king-vs-mcdonalds/ Mon, 28 Aug 2017 18:25:35 +0000 https://alertbot.wordpress.com/?p=436 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying hamburgers and wearing hats. Text reads "AlertBot Showdown: Burger King vs McDonald's" with the word SHOWDOWN very large at the bottom.

Whether you’re picking up a Kids meal for your littlest picky eater or satisfying a hankering for greasy and salty French fries, chances are you’ve found yourself in line at a drive-thru for McDonald’s or Burger King at some point in your life. But these two massive burger chains also have an online presence, and while you’re not exactly going to try to order a single or double patty to be shipped to your home, you might find yourself visiting the websites for either fast food giant to look up their menus or latest promotions.

So for this, our fourth website Showdown, the AlertBot team rolled up their sleeves, grabbed a handful of ketchup packets, and sat down to take the wax paper wrap off of these two websites to see just how the sites for BK and Mickey D’s performed in comparison to one another.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for three weeks, spanning from June 5, 2017 to June 26, 2017. Not surprisingly, the performance proved to be reliable for both sites. Neither service’s site went down, but as usual, one did prove to perform a little faster than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both sites performed quite well during the time period, but McDonald’s site experienced a hiccup on the first day of the test, June 5. It was a timed-out warning (meaning the site failed to load in the expected time period), but it didn’t last longer than a couple minutes, and didn’t seem to affect the site for very long. Otherwise, their site was pretty stable. (McDonald’s 9/10)

On the other hand, Burger King’s site didn’t experience any confirmed failure events at all, and it enjoyed complete uptime during the test period. However, it did see two transient errors—one a slow page notice and one a brief timed-out notice—each lasting less than a minute and affecting the site’s overall performance from a single location. When errors like these occur, AlertBot tests them from a second location to confirm whether the error is widespread or just a brief localized blip. In these instances, the error only occurred from one test location and didn’t qualify as a downtime event. (Burger King 9.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Both sites are quite graphics-heavy, so it doesn’t surprise us that they may experience some slowness at times.

McDonald’s loading speeds averaged around 9.5 seconds per day, with its best time being 10 AM on Monday, June 12 at 5 seconds and its best day being Monday, June 26th with an average of 8.8 seconds. Its worst day was Monday, June 5th, when the load time crawled to an average of 12.7 seconds, while the worst time was on Wednesday, June 7th at 11 PM with a pitiful 17.6 seconds. (McDonald’s 8.5/10)

Burger King performed significantly better by comparison. Overall, the site averaged 3.6 seconds for its load time, which is pretty good. Its best day was Wednesday, June 19th when it averaged 3.5 seconds, with its best load time being on Wednesday, June 14th with a speedy 1.8 seconds load time at 6 AM. Monday, June 5 was the worst day, seeing a 6.1 seconds load time (which was still better than McDonald’s BEST day), and their worst time being Saturday, June 17th at 10 AM with 8.5 seconds. (Burger King 9.5/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

It seems to be the norm for California to record the fastest speeds, and the same holds true for McDonald’s. However, surprisingly, New Jersey was the next fastest state on the list. Comparatively, the fast food legend saw its slowest load times in Georgia and Utah. (McDonald’s 9/10)

Burger King, for the most part, saw stronger returns across the board, with California, Colorado, Virginia, Missouri, Washington and Texas all pinging approximately 1 msec. Their slowest locations were North Carolina and also Utah. (Burger King 10/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we tested out how the experience of tracking a real package might look when using two popular parcel services. For this Showdown, we’ll see what the experience is like to use their respective websites to look up the menu and nutritional information on each company’s signature burgers.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.mcdonalds.com into our Chrome browser and navigating until we could find the Big Mac nutritional info, it took 26 seconds. We were held up at first by a prompt on the front page that asked us to join their email list. The browser also wanted to access our location. From closing out the pop-up on down to finding the Big Mac info, it took five mouse clicks.

Now, from the point of typing www.burgerking.com into our Chrome browser, it took four mouse clicks and 18 seconds to get to the Whopper’s nutritional info. BK’s design is much simpler, so we see why their load times were faster.

We liked the aesthetic of both websites, but McDonalds has a slightly more modern feel in its design. However, their graphics are all-around larger and they have more going on on the page, which could be why their overall load times are slower than Burger King’s.

So, with all things considered, with the goal being able to find the nutritional info on each chain’s most popular burger, here are the Usability scores:

(McDonalds 9/10)       (Burger King 10/10)

 

Final Verdict

Neither site dramatically outperformed the other, but it’s safe to say that Burger King edges out McDonalds in speed and overall performance. (Just for fun, we should follow this up with a who-has-the-better-French-Fries competition!)

So, for the fourth AlertBot Showdown, the site that gets to join the ranks of previous winners Apple, FedEx and Fandango is…

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "BurgerKing.com"

]]>
Why is Website Performance Monitoring Necessary? https://www.alertbot.com/blog/index.php/2017/01/17/why-is-website-performance-monitoring-necessary/ Tue, 17 Jan 2017 11:00:04 +0000 https://alertbot.wordpress.com/?p=343 Photo of two hands holding a tablet horizontally with illustrations of graphs and icons floating off the face of the tablet.

Given that we live in a highly digitized world today, websites, blogs and web-stores are now an essential component of any business and brand. While waiting for a site’s content to load can be annoying for a user, it can also be potentially disastrous for business.

That, however, is only one reason to monitor the performance of your website. Here are four more:

1.     Loss of Sales and Web-Traffic

First and foremost, businesses maintain websites and have web-stores to promote commercial growth. Now, imagine a situation where you’ve gone to a store and the service is impossibly slow. The salesmen and women are hardly making an effort to engage or help you and you just decide to take your business elsewhere. The same happens to a shopper when they visit a website that takes ages to load. Instead of making a sale, you lose web-traffic and potential customers. You can prevent this by monitoring how your website is performing.

2.     Potential Damage to Brand Image

Customers talk, and they are interested in what others like them have to say. While most brands depend on marketing ploys to promote sales, the importance of word-of-mouth advertisement cannot be discounted. If you leave a bad impression on one customer, chances are that word will spread about it, tainting, if not tarnishing, your hard-earned reputation and brand image. Who wants that?

3.     Error Detection

Website performance monitoring is the best way to prevent errors. It’s all too common for ecommerce sites to hit a snag and run into trouble.  If your site is regularly maintained and monitored, you’ll not only be able to fix a problem sooner; you might even be able to detect it beforehand and prevent it completely.

4.     Quality Maintenance

Just as quality assurance is essential for a physical store, it’s equally important for a website and web store. By using a performance testing and maintenance tool, software or application, you will be able to standardize and retain the quality of your website. Not only will that help preserve the website’s ranking on Google, it will also help drive online traffic. As it is, Google ranking is affected by even the smallest changes in website speed and downtime; this is the whole reason websites are search engine optimized in the first place.

So, if you’re even partially convinced that your website needs performance monitoring, why not start the AlertBot 14-day free trial, today?

]]>
AlertBot Celebrates 10th Year of Website Monitoring https://www.alertbot.com/blog/index.php/2016/04/11/alertbot-celebrates-10th-year-of-website-monitoring/ Mon, 11 Apr 2016 10:50:35 +0000 https://alertbot.wordpress.com/?p=185 AlertBot Logo

Allentown, PA / April 11, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading real-time web application monitoring solution, AlertBot, celebrates a decade of website and server monitoring. Downtime of any length can be costly for any website or online retailer; AlertBot’s Website Monitoring Service provides best-in-class site monitoring using its TrueBrowser® technology to launch real web browsers and test websites inside those browsers, including mission-critical financial transactions conducted on e-commerce-driven websites, login pages and other mission-critical pages. AlertBot serves over 10,000 users with 200 million website checks per month using its network of over 100 locations, spanning 6 continents worldwide.

“AlertBot measures every facet of a website to help our clients improve the user experience; our testing helps clients make adjustments that result in measurable gains – for instance, a major e-commerce player measured gains of $1.4 million for every second of response time their platform improved – that small improvement netted them $18 million in revenue!” states Pedro Pequeno, President of InfoGenius.com, Inc. He continues: “Over the past 10 years, AlertBot has been deployed and proven in countless real-world applications by some of the leading names in the e-commerce space.”

AlertBot’s Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine. This data gives businesses including Blue Cross/Blue Shield, Chrysler, Mutual of Omaha, Sony, Microsoft & Dell Computing the information they need to ensure their applications are always running error-free and providing a quality user experience.

An illustration showing a robot with a party hat and holding a birthday cake. Text reads "AlertBot Celebrates 10 Years"

About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission-critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.

About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.

]]>
Lessons from the Dentist for Your Website’s Performance https://www.alertbot.com/blog/index.php/2016/02/25/lessons-from-the-dentist-for-your-websites-performance/ Thu, 25 Feb 2016 14:10:47 +0000 https://alertbot.wordpress.com/?p=173 Every successful website needs to undergo tests and retests (and re-retests) to ensure that the site performs well for all visitors. It’s one thing to test how your site looks in different web browsers, but what about how it actually performs?

Running an AlertBot waterfall chart is just one example of something IT and website managers can do to see how the site is performing with load times. Before you release your site’s new design, complete overhaul or its grand debut, it’s wise to give your site a thorough testing first. A simple test with AlertBot’s waterfall chart can reveal images or third party code that might be clogging up your load time — or many other possible hang-ups. Your site might look fine and dandy, but if the page is taking too long to load in this fickle web-browsing age we’re living in, it could be very costly for your business.

(Above is a real, abbreviated example of AlertBot’s waterfall charts)
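In spirit, a waterfall chart is just per-resource timing laid out on a shared timeline: when each asset started loading, how long it took, and which ones are the hang-ups. Here's a toy, hypothetical sketch of that idea (not AlertBot's implementation), using invented resource timings:

```python
# Toy waterfall: each entry is (url, start_ms, duration_ms).

def slowest_resources(entries, top=3):
    """Return the `top` longest-loading resources, slowest first."""
    return sorted(entries, key=lambda e: e[2], reverse=True)[:top]

def render_waterfall(entries, scale_ms=100):
    """Render a crude ASCII waterfall, one bar per resource."""
    lines = []
    for url, start, duration in entries:
        pad = " " * round(start / scale_ms)          # offset = start time
        bar = "#" * max(1, round(duration / scale_ms))  # width = duration
        lines.append(f"{pad}{bar}  {url} ({duration} ms)")
    return "\n".join(lines)
```

Run against real timing data, a chart like this makes an oversized hero image or a slow third-party script jump out immediately as the longest bar.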

In the end, it’s really like a visit to the dentist; they often give you tips and guides on how to prevent cavities and other oral problems, while helping you maintain good oral hygiene. Maybe using mouthwash or flossing daily will help keep your gums healthy and your teeth strong. Likewise, with the right web performance tools and tests, you can ensure quality conversions and hopefully prevent any possible decay in your site’s performance.

See for yourself with AlertBot’s free 14-day trial!

]]>
Use AlertBot To Monitor The Competition https://www.alertbot.com/blog/index.php/2015/08/25/use-alertbot-to-monitor-the-competition/ Tue, 25 Aug 2015 18:23:45 +0000 https://alertbot.wordpress.com/?p=137

Use AlertBot To Monitor The Competition

When most of us think of “website monitoring,” we usually think about how it applies to our own websites. However, website monitoring really has more uses than we may realize or consider.

Truth be told, while we use AlertBot to keep an eye on our own websites and pinpoint problems that need fixing, we can actually set up monitors for any site—not just our own. This means we can monitor the competition as well.

The upside to monitoring the competition is that you can get an idea of how a competing website might be performing from around the world, and gauge whether your website is competing as well in those areas. Furthermore, you can see how long their page load times are and find out what features on their website may be slowing them down. It could help you figure out what to avoid in your own design or focus on what to do better in your market, for example.

Photograph of rooftop spyglass

You can test-drive this concept with our risk-free 14-day trial. Try it out today and start gathering actionable data on your website – and your competition’s!

]]>
What’s Google Up To With Recently Spotted “Slow” Icon? https://www.alertbot.com/blog/index.php/2015/02/27/whats-google-up-to-with-recently-spotted-slow-icon/ Fri, 27 Feb 2015 22:50:22 +0000 https://alertbot.wordpress.com/?p=99 Earlier this week, on Tuesday, Google+ user K Neeraj Kayastha discovered a new technique Google’s search engine may be getting ready to implement abroad that will warn mobile users of potentially sluggish links before clicking.

Neeraj posted screenshots from his personal Android browser showing a new red “SLOW” icon displayed next to links for YouTube and even a Google search result (scholar.google.co.in, to be exact). Today, we tried to replicate the same result on an iPhone, but were unable to bring up any “Slow” icons on our search results. (And comments on Neeraj’s report page seemed to reflect similar experiences.)

Screenshot of a mobile screen with Google search results

So what does this mean? It’s possible that Neeraj happened to stumble on a brief Google testing of an upcoming new search result feature, and if this is indeed on the horizon for the near future, website owners may want to do all they can to avoid that little dreaded scarlet branding.

Should this feature roll out soon, now would really be the ideal time to find a website monitoring solution for your business’s website to ensure visitors and new clients aren’t deterred by Google’s little warning.

Click here for a list of solutions and more info on how AlertBot can help.

]]>