website performance – The Official Blog
https://www.alertbot.com/blog/

Synthetic Monitoring: Frequently Asked Questions
https://www.alertbot.com/blog/index.php/2025/03/05/synthetic-monitoring-frequently-asked-questions/
Wed, 05 Mar 2025

One of the most important features of a comprehensive, enterprise-grade web monitoring solution is synthetic monitoring. Below, we answer some frequently asked questions so that you can clearly understand what it is, how it works, and why it’s essential rather than optional.

 

What is synthetic monitoring?

Simply put, synthetic monitoring is a method of simulating the journeys that visitors take on a website and then evaluating performance. The main purpose is to proactively identify errors or bottlenecks (so your team can fix them), including hard-to-find flaws that may be associated with variables such as browser, device, geography, or network. Synthetic monitoring is also ideal for improving and optimizing performance (i.e., making transactions and workflows faster and simpler).

 

What are some critical insights that synthetic monitoring can reveal?

Synthetic monitoring can provide answers to core questions such as:

  • How fast is our website response time at the moment?
  • Are all our complex transactions (e.g., filling out forms, adding items to carts, etc.) functioning correctly and optimally?
  • What areas of our website receive a limited amount of traffic, and is this normal and expected or a potential problem?
  • If we are experiencing a failure or slowdown, where exactly is it?

 

How does synthetic monitoring work?

Essentially, there are three steps to setting up and implementing synthetic monitoring:

  1. Create scripts that simulate visitor interaction and behavior.
  2. Collect data gleaned from scripts.
  3. Analyze collected information so you can fix identified problems and optimize performance.
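The three steps above can be sketched in miniature. The snippet below is a hedged illustration only: the URLs, the two-second latency budget, and the function names are assumptions, and a real synthetic monitor (AlertBot included) replays full browser journeys rather than bare HTTP requests.

```python
import time
import urllib.request

def run_journey(urls):
    """Steps 1 and 2: simulate a visitor's page-by-page journey and
    collect timing/success data for each step."""
    results = []
    for url in urls:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                ok = resp.status == 200
        except Exception:
            ok = False  # network error, timeout, or HTTP failure
        results.append({"url": url, "ok": ok,
                        "seconds": round(time.monotonic() - start, 3)})
    return results

def analyze(results, budget=2.0):
    """Step 3: flag steps that failed or exceeded the latency budget."""
    return [r for r in results if not r["ok"] or r["seconds"] > budget]
```

Running `analyze(run_journey([...]))` on a list of journey URLs would surface only the steps that need fixing or optimizing.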

 

What is the difference between synthetic monitoring and journey monitoring?

They are the same thing, although generally the term synthetic monitoring is more common across leading web monitoring solutions.

 

How can synthetic monitoring help improve competitive advantage?

Synthetic monitoring is ideal for benchmarking performance against competitors. You can also use it to simulate traffic from different geographic locations and to track APIs, SaaS products, and more, which can be especially helpful for identifying peak markets. And synthetic monitoring can help safeguard your business during anticipated traffic spikes (e.g., Black Friday, Cyber Monday) by alerting you to any problems right away, so you can make sure your website doesn’t miss a beat.

 

How can synthetic monitoring help improve third-party vendor compliance and performance?

You can use synthetic monitoring to help ensure that SaaS vendors are meeting their Service Level Agreement (SLA) commitments.
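As a rough illustration of the arithmetic involved (the function name and the 99.9% default are assumptions for this sketch, not AlertBot’s API), availability measured by periodic synthetic checks can be compared directly against an SLA target:

```python
def sla_compliance(checks_passed, checks_total, sla_target=99.9):
    """Return (availability_pct, meets_sla) for a monitoring window,
    where availability is the share of synthetic checks that passed."""
    if checks_total == 0:
        raise ValueError("no checks in window")
    availability = 100.0 * checks_passed / checks_total
    return round(availability, 3), availability >= sla_target
```

For instance, 9,989 passing checks out of 10,000 works out to 99.89% — just under a 99.9% SLA, which is exactly the kind of shortfall worth raising with a vendor.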

 

We are using, or thinking of using, real user monitoring. Is this sufficient?

In the distant past this was probably fine, but these days synthetic monitoring is far superior and widely recommended by experts. Here is why: real user monitoring (RUM) evaluates website performance, in the moment and over time, using data collected from real users rather than simulated visits.

In theory, this is good. But in practice, it’s problematic because there can be use cases and workflows that visitors may not trigger, but nevertheless represent performance pitfalls and other vulnerabilities. The scope of synthetic monitoring is much wider and deeper, and it’s not limited to what visitors may or may not have done in the past, or are doing at the current time. RUM is a lake, while synthetic monitoring is an ocean.

 

How can we learn more about synthetic monitoring?

Easy! Just sign up for a FREE TRIAL of AlertBot. There is no billing information required, no installation, and you’ll be set up in minutes. And there is even better news!

AlertBot’s celebrated multi-step synthetic monitoring script recorder is simple and easy to use. Just click record, interact with your website (e.g., fill out forms, add items to your cart, etc.), and then upload your completed script to dive deep into the granular workflow details. You will clearly see what’s working and what isn’t, as well as what should be improved to optimize visitor experience. There is NO programming required!

 

Start your FREE TRIAL of AlertBot now: click here.

3 Ways Site Uptime Monitoring Boosts SEO
https://www.alertbot.com/blog/index.php/2024/04/30/3-ways-site-uptime-monitoring-boosts-seo/
Tue, 30 Apr 2024

About 25 years ago, if someone told you to “Google” something, you’d probably smile, nod politely, and walk (or perhaps run) away. But now, Googling is the unofficial international pastime. Consider these statistics:

  • 53% of all trackable website traffic comes from search engines — primarily Google, which commands 91.75% of total worldwide search engine market share.
  • Google handles around 2 trillion (that’s 12 zeroes) searches per year.
  • 39% of customers were influenced by a relevant search.

Clearly, the ability to show up for relevant search queries — a.k.a. search engine optimization (SEO) — matters enormously, and there’s no slowdown on the horizon. On the contrary, SEO will only play a bigger role in the digital marketing mix going forward, for two simple and satisfying reasons: it’s much more affordable than conventional marketing and advertising, and it works. And you don’t need an MBA or a Bloomberg terminal on your desk to know that affordable + works = popular.

But less clear is the connection between site uptime monitoring and SEO. In fact, at first glance (and second and third as well), there may seem to be no connection at all. However, as any SEO expert worth their Google Search Console will attest, there is a significant link — positive or negative. Below we highlight three ways that site uptime monitoring can boost SEO:

  1. Keep Visitors from Bouncing to the Competition

Would-be visitors aren’t the only ones who are frustrated when sites are not accessible — Google takes a dim view of this as well. Now, to avoid triggering paranoia, be assured that Google has said that occasional, short-lived downtime typically won’t negatively impact search rankings. However, ongoing or prolonged downtime is another matter entirely, and can lead to a major rankings downgrade. Site uptime monitoring automatically alerts your SysAdmins, CTOs, and other relevant individuals when a site goes down, so that immediate steps can be taken to get things back online — and make both visitors and (especially) Google happy.

  2. Identify and Fix Broken Elements

Google wants to provide searchers with relevant, quality site recommendations. The first part of that equation is largely determined by elements like keyword optimization, PageRank, and domain authority. But the second is determined by what visitors actually experience once they arrive on a site. Site uptime monitoring helps you proactively identify broken elements like links and buttons, so that they can be fixed before Google’s web crawler notices them and starts handing out SEO citations.

  3. Boost Page Loading Speed

For a long time, SEO experts demanded that Google reveal whether page loading speed was a factor in evaluating sites — and consequently in search engine rankings. And for a long time, Google sat back with its arms crossed and silently smiled (when you make north of $300 billion in revenue a year, you get to do fun stuff like that). However, Google eventually confirmed the worst-kept secret in the SEO kingdom: speed is, indeed, a factor for search. Site uptime monitoring helps you keep a close eye on page loading times, so that you can ensure that your site blazes like a brand new luxury sedan on the Autobahn, and not like a rusted-out 1984 Reliant K-car that shouldn’t go faster than a bike and can’t really make left turns.

The Bottom Line

Site uptime monitoring is not a magic wand that will transport your site (or sites) to the coveted number one spot for relevant keywords. But as discussed above, it will significantly help your business gain an advantage in the search engine jungle — which means more visibility, more clicks, and more customers.

Start your FREE TRIAL of AlertBot now, and discover why it is the trusted site uptime monitoring solution for some of the world’s biggest organizations. There’s no billing information required, no installation, and you’ll be set up within minutes. Click here.

7 Deadly e-Commerce Checkout Sins
https://www.alertbot.com/blog/index.php/2023/07/11/7-deadly-e-commerce-checkout-sins/
Tue, 11 Jul 2023

Back in the 1970s when bell bottoms roamed the world and 8-tracks reigned supreme, the Eagles warned us that Hotel California was a place where you could “checkout anytime you like, but you can never leave.”

Well, in the 21st-century e-commerce landscape, a similar dilemma faces customers who want to buy everything from gardening equipment to a new car: they can try to check out anytime they like, but they can never buy.

Below, we highlight seven deadly e-commerce checkout sins that lead to lost sales and reputation damage:

 

  1. It takes too long to buy stuff.

Patience may be a virtue, but most customers aren’t in the mood to refine this noble characteristic when they’re ready to buy stuff. After all, they’ve already invested their valuable time choosing item(s). They want to cross this task off their to-do list right away. In fact, 70% of customers say that page speed impacts their willingness to buy from an online retailer.

  2. The process is too complex.

People who buy things online are intelligent and savvy. But that doesn’t mean they want to feel as if they’re putting together IKEA furniture when going through the checkout process. They want the experience to be straightforward and simple.  They just want to provide the required information — and nothing more. Less is definitely more.

  3. Not displaying a progress bar.

A progress bar tells customers where they are in the checkout process (e.g. cart summary, sign-in, address, shipping, payment), so they know that things are headed towards a satisfying, successful conclusion. Without this information, they can get irritated when they expect the next screen to say “thank you for your purchase” but instead find yet another form to fill out.

  4. No ongoing form validation.

This one is tricky. Waiting for customers to get to the end of a form before telling them that they need to fix one or multiple fields can lead to an “I can’t be bothered with this, I’m getting out of here” reaction.

The best practice here is to configure form validation to scan and report as customers move from one field to another, or possibly one section to another (e.g. shipping address to payment information). Admittedly, some customers will still be irked by these “please fix the error” messages. But sending small notes as they move through the form/section is still better than forcing them to back up after they’ve reached the finish line.
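To make the idea concrete, here is a minimal sketch of the per-field (“ongoing”) validation logic described above, with each field checked as the customer leaves it rather than at submit time. The field names and rules are hypothetical, and a production site would run equivalent logic client-side in the browser:

```python
# Illustrative per-field rules (assumptions, not a real checkout's schema).
RULES = {
    "email": lambda v: "@" in v,
    "zip": lambda v: v.isdigit() and len(v) == 5,
}

def validate_field(name, value):
    """Check one field as soon as the customer moves past it.
    Returns an inline error message to display, or None if the field is OK."""
    rule = RULES.get(name)
    if rule and not rule(value):
        return f"Please fix the {name} field"
    return None
```

The point of the design is timing: the same rules run either way, but surfacing the message field-by-field spares customers the dreaded wall of errors at the finish line.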

  5. No guest checkout.

For businesses, granular customer data can be far more valuable than an actual purchase. However, online sellers need to resist the temptation to force all customers to create an account before they can check out. Otherwise, they are going to lose customers — not necessarily because those customers are reluctant to share their data, but because they just aren’t in the mood to pick a username and password and then validate their email address.

With this in mind, sellers should provide incentives for customers to create an account by, for example, informing them that doing so will enable them to track order fulfilment, save time in the future, etc.

  6. Failing to EXPLICITLY mention all costs.

Customers hate discovering surprise costs at checkout. Ideally, sellers can avoid this problem entirely by having zero extra costs of any kind. But realistically, most sellers need to charge shipping/handling (at least until a threshold is met), and potentially other fees based on the item(s) being purchased, the location of the customer, and other factors.

The best way for sellers to deal with this is to make potential/inevitable extra costs explicit. Burying these details at the bottom of a page, and in font so tiny that customers need a telescope to read them, is more than worthy of a pair of Bad Idea Jeans.

  7. Dysfunctional buttons, fields and other elements.

Nothing screams “please don’t buy from us” louder than a checkout process where buttons, fields and other elements don’t work, or when customers are presented with a dreaded 404 Page Not Found (ironically, the funnier or more creative this page might be, the more incensed customers can get — as if the seller is shrugging off their pain and suffering).  Using a solution like AlertBot to automatically and continuously test page integrity — and proactively send alerts when something goes wrong or doesn’t work — is an absolute must.

The Bottom Line

The e-commerce landscape is fiercely competitive, and it typically takes much less for online customers to head for the virtual exits than it does for in-store customers to head for the physical exits. Online sellers need to ensure that they aren’t committing any of the seven deadly — and wholly preventable — e-commerce sins described above. Otherwise, instead of fostering engaged customers, they will trigger outraged ones.

A Closer Look at AlertBot’s Failure Reporting Feature
https://www.alertbot.com/blog/index.php/2023/02/21/a-closer-look-at-alertbots-failure-reporting-feature/
Tue, 21 Feb 2023

The year was 1995. Michael Jordan returned to the NBA. Amazon sold its first book. Windows 95 unleashed the era of taskbars, long filenames, and the recycle bin. And when people weren’t dancing the Macarena, they were flocking to see Apollo 13 and hear Tom Hanks utter the phrase that would launch millions of (mostly annoying) impersonations: “Houston, we have a problem.”

Thankfully, the eggheads in space and the eggheads on the ground worked tirelessly (and apparently smoked a whole lot of cigarettes) to get the crew home. But it was the pivotal moment when the failure was first reported that triggered the spectacular problem-solving process. If it happened an hour — or maybe even a few minutes — later, then the outcome could have been tragic instead of triumphant.

Admittedly, the brave, intrepid professionals in charge of keeping their organization’s website online and functional DON’T have to deal with life-and-death scenarios. But they DO need to deal with problems that, if left unsolved, will significantly damage competitive advantage, brand reputation and sales (immediately if we’re talking e-commerce, and eventually if we aren’t). And that’s where AlertBot’s failure alerting feature enters the picture.

What is Failure Alerting?

Failure alerting is when designated individuals — such as a SysAdmin, CTO, CIO, CEO, and so on — are proactively notified when something goes wrong with a website, such as downtime, errors, slowness, or unresponsive behavior.

As a result, just like in Apollo 13, the right people can take swift, intelligent action to fix things before visitors/customers sound the alarm bell, or worse, head out the (virtual) door and go straight to a competitor without looking back.

Notification Methods

AlertBot customers can choose any or all of the following methods to notify team members of a website failure event:

  • Email
  • Text Message
  • Phone Call

For example, a SysAdmin could receive an email, a text message, and a phone call the moment something goes wrong.

Automatic Escalation

Now, if we were in NASA Mission Control circa 1970, someone wearing really thick horn-rimmed glasses would rise above the cigarette smoke and ask: what happens if the SysAdmin doesn’t receive the email, text message, and phone call? It’s a good question, and there is an even better answer: don’t worry about it.

AlertBot’s failure reporting feature can be configured to escalate the website failure warning if certain individuals don’t respond within a specific timeframe. For example, if a SysAdmin is indisposed for any reason (driving, sleeping, etc.), then after two minutes the alert can be pushed to another designated team member such as the CTO. And if the CTO doesn’t respond within two minutes, then the alert can be pushed to the CIO, and so on.

Ideally, the individual (or individuals) who are sent the first alert receive it immediately and take rapid action. But if they don’t or can’t, then the alert is escalated accordingly. It is important to note that all of this happens automatically, so the hand-off never depends on someone remembering to pass the alert along.
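The escalation behavior described above boils down to walking an ordered contact chain until someone acknowledges the alert. The sketch below is a simplified illustration under assumed names (AlertBot’s actual escalation is configured in its interface, not hand-coded):

```python
def escalate(chain, acknowledged_by, timeout_minutes=2):
    """Notify contacts in order; each unacknowledged alert escalates to the
    next contact after the timeout. Returns (contact, minutes_elapsed) pairs."""
    notified = []
    elapsed = 0
    for contact in chain:
        notified.append((contact, elapsed))
        if contact in acknowledged_by:
            break  # acknowledged: stop escalating
        elapsed += timeout_minutes
    return notified
```

With the chain `["sysadmin", "cto", "cio"]` and no acknowledgements, the alert reaches the CIO four minutes after the first notification.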

Granted, none of this is as entertaining as watching Apollo 13. There’s no rousing soundtrack or Tom Hanks. Heck, there’s not even Kevin Bacon.

But when it comes to fixing website problems as quickly as possible, organizations know that the less drama, the better. That’s precisely what AlertBot’s multi-channel, auto-escalating failure reporting feature delivers. We don’t need an Oscar. We just need extremely satisfied customers — and we have a lot of those.

 

Next Up: Reviewing Failure Events Online

 In our next blog, we’ll explore reviewing failure events online to pinpoint issues and detect problems. Stay tuned!

Launch a free trial of AlertBot’s acclaimed site uptime monitoring solution. No credit card. Nothing to download. Get started in minutes. And if you decide to purchase our solution, there are NO setup fees!

What is Proactive ScriptAssist and Why is it a Game-Changer?
https://www.alertbot.com/blog/index.php/2022/12/06/what-is-proactive-scriptassist-and-why-is-it-a-game-changer/
Tue, 06 Dec 2022

Sometimes — not often, but every now and then — we come across an invention that is so remarkably useful, that we wonder: how did I survive without this?

High speed internet comes to mind. So do GPS devices. And who wants to imagine a world without the cronut?

Well, it’s time to add one more invention to the list: Proactive ScriptAssist.

The Back Story
Websites are not static things. They change over time; sometimes in minor ways, and other times in major ways (for fun, check out the Internet Archive’s Wayback Machine to see what some of your favorite websites looked like in the past — like Apple’s home page from 1996 which invites folks to learn about “the future of the Macintosh”).

Now, for visitors, the fact that websites constantly change is not a problem. In fact, it’s often a good thing because the change is an update, addition, or improvement of some kind.

But for IT and InfoSec professionals who are in charge of (among other things) website monitoring in their company, these changes can — and often do — trigger all kinds of bugs and errors. Fields and forms stop working, elements stop loading (or they load v..e..r..y….s..l..o..w..l..y), and there can be security vulnerabilities as well.

Multi-Step Monitoring
Thankfully, there is a way to verify that everything is working before site visitors start sounding the alarm bells — or worse, disappearing never to return.

This method is to use an easy-to-use web recorder to create scripts of what site visitors actually (or typically) do on various web pages, and then replay them to make sure that everything is working properly. This is highly effective. That’s the good news.

The not-so-good news is that when changes occur — even fairly small ones — re-scripting monitors can be a complex process that, in some scenarios, may require a level of expertise and experience that some IT/InfoSec professionals don’t have.

What’s the solution to this obstacle? Let’s all say it together: Proactive ScriptAssist!

About Proactive ScriptAssist
Available EXCLUSIVELY from AlertBot, Proactive ScriptAssist is an optional plan that includes the following:

  • Our team watches over an account, and proactively re-scripts any monitors that fail. We do all of the work, and our team has years of experience. After all, we created the technology, and we know how it works!
  • Failing monitors are evaluated within 3 hours, and the customer is notified of the situation.
  • Failing monitors are re-scripted within 3 to 24 hours (our response time is rapid, but the actual duration depends on the complexity — some re-scripting efforts take longer than others).
  • Customers get unlimited re-scripting and configuration updates from our team year-round.

Plus, if needed our team offers advanced support over remote desktop sessions (join.me sessions). This is not always necessary, but it is another layer of help just in case.

The Bottom Line
Inventions that changed our lives: High speed internet. GPS. Cronuts. And now, AlertBot’s Proactive ScriptAssist. It’s an elite list, and one that we’re honored to join.

Learn More
Ready to make your IT/InfoSec teams weep with joy (which is nothing like the weeping they did that time the intern wiped out the backup)?

If you’re a current AlertBot customer, then contact your Account Manager today.

If you haven’t yet experienced AlertBot, then start your free trial today. You’ll be set up in minutes. No billing information, nothing to install, and no hassle.

Now, if you’ll excuse us, we’re going to read about the future of the Macintosh while enjoying a cronut or two (or 5).

Multi-Step Monitoring: Why it’s Essential and How it Works
https://www.alertbot.com/blog/index.php/2022/06/06/multi-step-monitoring-why-its-essential-and-how-it-works/
Mon, 06 Jun 2022

The term “essential” is thrown around pretty loosely these days. That new show about the hospital (no, not that one… not that one either… yeah, that one) is advertised as essential viewing. A newly released track by a hip hop artist that describes how little they need to release new tracks in order to live much, much better than the rest of us? That’s essential listening. And how can we forget that new muffin that cannot legally be advertised as a muffin, because it is technically more of a candy. That’s essential snacking (“mmmmmm…. pseudo muffin”).

But then on the other end of the hype spectrum, there are things that are legitimately essential, because going without them could lead to dire consequences — or maybe even a catastrophe. And for e-commerce companies, one tool that truly qualifies as essential is multi-step monitoring.

What is Multi-Step Monitoring?

In a break with tradition in the complex world of technology, multi-step monitoring is pretty much what it sounds like: a way to track the various steps that customers take as they move through pages on a website. This way, businesses can proactively identify and fix problems such as buttons that don’t work, forms that won’t submit, links that don’t go anywhere, pages that take too long to load, and so on.

Why is Multi-Step Monitoring Essential?

 Most customers who run into problems don’t shrug them off. They get mad. And that compels them to hit the brakes and head for the exit. In fact, a whopping 88% of online consumers are less likely to return to a site after just one bad experience. So, yeah, preventing about 9 in 10 customers from disappearing is important. One might even say that it’s… wait for it… ESSENTIAL!

How Multi-Step Monitoring Works

In AlertBot, configuring multi-step monitoring is remarkably easy, and doesn’t require an advanced degree in Hypercomplex Supergeekery with additional specialized certifications in Megaultra Nerdology. Here is how it works (a video tutorial is also available):

  • Step 1: Log in to AlertBot.
  • Step 2: Go to “Monitors”.
  • Step 3: Set up a new monitor.
  • Step 4: Select the TrueBrowser® Multi-Step Monitor option.
  • Step 5: Download the AlertBot Recorder (available for PC currently — this step only has to be completed once).
  • Step 6: Give the monitor a name (e.g. “Amazon 1 Multi-Step Monitor”).
  • Step 7: Launch the AlertBot Recorder, input the URL of the site (e.g. Amazon.com), and record a script simply by simulating actions that a customer would take. It is also a good idea to label steps/phases (e.g. “Homepage”, “Add to Cart,” etc.), which can be helpful when analyzing reports later on.
  • Step 8: Save the script with a unique name (e.g. “Amazon test”).
  • Step 9: Upload the script into TrueBrowser® Multi-Step Monitor (which was launched in Step 4).
  • Step 10: Hit the “Test” button.

And that’s all there is to it. When the test is complete (this can take up to two minutes), a report is automatically generated that shows:

  • The duration of each phase/step in the process.
  • Whether each process was successful or unsuccessful.
  • A waterfall chart capturing a breakdown of everything that loads on each individual page (e.g. request times, file transfers, etc.).
  • Raw browser request data that reveals anything that is not working, or that could be contributing to a degraded user experience (e.g. loading large files or images that cause slowdowns).

Tests can be run at any time to verify that problems are fixed and improvements are made. It’s remarkably easy. And yes, it’s essential.


Learn More

Discover the benefits of multi-step monitoring. Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

 

Debunking 3 Website Availability Monitoring Myths
https://www.alertbot.com/blog/index.php/2021/05/27/debunking-3-website-availability-monitoring-myths/
Thu, 27 May 2021

by Louis Kingston

Some myths in life are harmless, or even helpful. For example, Santa Claus has come in very, very handy for parents who want to nudge their kids from the naughty list to the nice one. And let’s give a round of applause to the Tooth Fairy, whose promise of nominal financial compensation has turned the prospect of losing a tooth from a meltdown trigger into a motivational factor.

However, other myths are on the opposite end of the spectrum: they lead to stress and costs. The bad news is that there are some rather notorious website availability monitoring myths out there. But the good news is that debunking them is simple. Here we go:

Myth #1: Free website monitoring tools are just as good as paid versions.

The Truth: So-called free website monitoring tools are riddled with gaps and vulnerabilities — simply because they’re free, and the folks who make them aren’t trying to provide a public service or earn some good karma. They’re in business, and that means there’s always (always!) a hook. Here are some of the drawbacks: zero technical support, excessive false positives, reduced test frequencies, limited testing locations, and s-l-o-w product updates. For a deeper dive into these pitfalls, read our article here.

Myth #2: Buying website availability monitoring from your host is a smart idea.

The Truth: Your web host probably offers website availability monitoring, and keeps pestering you to buy it. What’s the harm? Well, here’s the harm: your web host is a web host. That’s their jam. They don’t specialize in website monitoring, which means that customers like you are going to pay for their lack of competence and capacity. And on top of this, your web host has an inherent conflict of interest when it comes to giving you the full picture — because your hosting agreement includes uptime standards. As such, they may be less inclined to be fully transparent if they fall below this standard. Or to put it bluntly: they might lie, and you’ll have a really hard (if not impossible) time trying to detect and prove it. For more insights on why it’s a bad idea to buy website monitoring from your host, read our article here.

Myth #3: Website availability monitoring is just about website availability monitoring.

The Truth: This last myth is especially tricky. Yes, website availability monitoring is about website availability monitoring. But that’s not where it ends. Comprehensive (i.e. the kind your business needs) website monitoring also analyzes key aspects such as website usability, speed and performance — because there are situations where a website can be available, but not accessible or optimized. To learn more about why comprehensive website availability is not just a technical necessity but also a customer experience requirement, read our article here.

The Bottom Line 

Does your kid have a toothache, threatening to go to DEFCON 1? Do a myth tag team of the Tooth Fairy + Santa to avert a meltdown (and hey, you might even enjoy some extras out of the deal like getting them to clear the dishes after dinner or clean out the cat litter — kids are tough negotiators, but see what you can get).

But if you want to keep your business safe and strong, then steer clear of all myths, and equip yourself with the clarifying truths revealed above.

And speaking of clarifying truths: AlertBot TRULY offers world-class, surprisingly affordable and end-to-end comprehensive website availability monitoring — which is why it’s trusted by some of the world’s biggest companies. See for yourself by starting your free trial now.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

4 Essential Failure Analysis Reports for Monitoring Website Performance & Uptime
https://www.alertbot.com/blog/index.php/2021/02/09/4-essential-failure-analysis-reports-for-monitoring-website-performance-uptime/
Tue, 09 Feb 2021

by Louis Kingston

In Apollo 13, flight director Gene Kranz (he of the legendary buzz cut) declared that failure is not an option. It would be nice if the same commandment held for websites. However, even an infinity of buzz cuts cannot change the fact that, alas, sometimes websites fail. And so, the question becomes: how do you minimize the likelihood, duration and severity of website failure?

The answer probably isn’t enough to inspire a movie. But it’s more than enough to help businesses detect and remedy underlying problems with their website before they become full-blown catastrophes: use failure analysis reports.

There are four types of failure analysis reports that every business should be generating on a regular basis: Waterfall Reports, Web Page Failure Reports, Downtime Tracking, and Failure Event Logs.

  • Waterfall Reports

Waterfall Reports enable businesses to analyze the performance of every object that loads on their web pages (e.g. scripts, stylesheets, images, etc.), in order to identify common sources of bottlenecks, errors and failures. Waterfall Reports also display HTTP response headers, which help track down the source of slowdowns and breakdowns.

  • Web Page Failure Reports

Many business websites have dozens of pages, and e-commerce websites can easily have more than 50, 100, or even 1,000. Manually hunting for problems across all of them is tedious and often futile. That’s where Web Page Failure Reports come to the rescue: they capture a screenshot of what a page displayed during a failure event, alongside the associated event data. This information can then be used to fix issues before they trigger visitor/customer rage.

  • Downtime Tracking

No, Downtime Tracking isn’t the name of one of those bands that never smile when they sing. Rather, it’s a type of report that contains statistics on website and server downtime. Understanding the size, scope and source of downtime issues is critical to resolving them.  

  • Failure Event Logs

Knowing that a web page — or element(s) within a web page — are failing is important, but it’s not the full story. Failure Event Logs fill in the gaps by providing detailed information about what tests were performed, the geographical locations affected, and the errors identified.
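To make the Waterfall Report idea concrete, here’s a toy sketch (purely illustrative, and not AlertBot’s actual reporting engine) that renders per-object load timings as a text waterfall, so you can eyeball which object is the bottleneck:

```python
def render_waterfall(resources, width=40):
    """Render a crude text waterfall from (name, start_ms, duration_ms) tuples.

    Each bar is offset by when the object started loading and scaled
    by how long it took, so slow objects stand out visually.
    """
    end = max(start + dur for _, start, dur in resources)
    scale = width / end
    lines = []
    for name, start, dur in resources:
        pad = int(start * scale)              # offset: when loading began
        bar = max(1, int(dur * scale))        # bar length: how long it took
        lines.append(f"{name:<12} {' ' * pad}{'#' * bar} {dur}ms")
    return "\n".join(lines)

# Hypothetical timings for one page load
timings = [
    ("index.html", 0, 120),
    ("style.css", 120, 80),
    ("app.js", 130, 300),
    ("hero.png", 150, 450),
]
print(render_waterfall(timings))
```

In this made-up example, the image `hero.png` draws the longest bar, which is exactly the kind of bottleneck a real Waterfall Report surfaces.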
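A Failure Event Log, at its simplest, is a structured record of what test ran, where, and what went wrong. Here’s a minimal, hypothetical sketch of such a record and a helper that groups failures to spot patterns (field names and data are ours, not AlertBot’s schema):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class FailureEvent:
    """One failure record: which test ran, where, and what went wrong."""
    test_name: str
    location: str   # monitoring station, e.g. "us-east"
    error: str      # e.g. "HTTP 503", "timeout after 30s"
    occurred_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def summarize(events):
    """Count failures by (error, location) to reveal geographic or error patterns."""
    counts = {}
    for e in events:
        key = (e.error, e.location)
        counts[key] = counts.get(key, 0) + 1
    return counts

# Hypothetical events
events = [
    FailureEvent("checkout-flow", "us-east", "HTTP 503"),
    FailureEvent("checkout-flow", "us-east", "HTTP 503"),
    FailureEvent("homepage", "eu-west", "timeout after 30s"),
]
print(summarize(events))
```

Grouping by error and location is what turns a pile of individual alerts into the “size, scope and source” picture these reports are for.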

The Bottom Line

Are failure analysis reports as gripping and captivating as Apollo 13? No. Are they vital to website performance and business success? Yes. Because while website failure is unfortunately an occasional reality, it absolutely cannot become a regular habit.

At AlertBot, we provide our customers with all of these failure analysis reports (and more) so they can get ahead of problems and avoid catastrophes. Start a free trial today.

 

 

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
3 Reasons Why It’s a Bad Idea to Buy Site Monitoring from Your Web Host https://www.alertbot.com/blog/index.php/2020/08/18/3-reasons-why-its-a-bad-idea-to-buy-site-monitoring-from-your-web-host/ Tue, 18 Aug 2020 17:16:21 +0000 https://alertbot.wordpress.com/?p=701 A image of multiple server racks on either side of a laptop in the foreground. The laptop screen shows a cloud graphic with an "X" over it. Text on the image reads "3 Reasons Why It’s a Bad Idea to Buy Site Monitoring from Your Web Host"

3 Reasons Why It’s a Bad Idea to Buy Site Monitoring from Your Web Host

by Louis Kingston

For baseball pitchers, the two most glorious words in the English language are “perfect game.” For actors, it’s “Oscar win” (forget all that nonsense about how “it’s an honor just to be nominated.”). For school-aged kids, it’s “snow day.” And for businesses, of course, it’s “captive audience.”

Indeed, it doesn’t matter how compelling or clever a marketing and advertising campaign might be. If audiences don’t take notice and pay attention, it may as well not exist. And if you doubt this, think of the last time you sat through 20 minutes of movie trailers — not because you wanted to, but because there was nowhere else to go (at least, not without saying “excuse me…” 10 times as you painfully twisted and squirmed your way past annoyed fellow moviegoers).

Why does this matter? It’s because your web host is singing from the captive audience songbook when it repeatedly urges you to add site monitoring to your existing hosting package. At first glance, this may seem like a good idea. After all, you know that site monitoring is important. Why not just grab it from your web host, the same way you grab a side order of fries from a fast food restaurant? Well here’s why not:

  1. Lack of Specialization

Your web host doesn’t specialize in site monitoring, which means they’re unlikely to be using the latest monitoring technology or hiring the most qualified specialists. Just as you wouldn’t want your doctor to sell you a timeshare during an exam (“You know what might help that bronchitis? Two weeks a year in a sunny and warm Florida condo, as you can see from this lovely brochure”), you don’t want your site monitoring company to do anything but site monitoring. It’s not something anyone should be dabbling in.

  2. Limited Service Offering

When web hosts offer site monitoring, they typically focus on uptime. But site monitoring isn’t just about letting you know when your site goes dark. It’s also about making sure that your site is performing the way it’s supposed to — which means that all elements are functional (e.g. buttons, forms, multi-step processes, etc.), and all pages are loading rapidly. Without this critical information, you may believe that everything with your site is fine and all lights are green; that is, until you begin hearing from irate customers and start losing sales.

  3. Potential Conflict of Interest

Last but not least, your site host is supposed to meet an uptime standard as part of their service commitment. But if that same host is also monitoring your site performance, they may be less inclined to be completely transparent if they fall below this standard. And if they did fudge some of the numbers, how would you even know? With this in mind, are we saying that all hosts that offer site monitoring are unethical? Absolutely not. Are we saying that there is an inherent conflict of interest that should be at least concerning and troubling? You bet.


The Simple, Smart Solution

The best (and really, the only) way to solve this problem is to avoid it completely — which means not buying site monitoring from your host, and instead getting it from a proven, reputable vendor that:

  • Specializes in site monitoring — it’s all they do 24/7/365.
  • Offers both uptime monitoring and comprehensive performance monitoring — not just the former.
  • Has zero conflict of interest telling you the truth, the whole truth, and nothing but the truth regarding how your site is doing.

Ready to safeguard and strengthen your business with world-class, surprisingly affordable site monitoring? Then you’re ready for AlertBot! We check all of these boxes, and are trusted by some of the world’s biggest companies. Start your free trial now.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
How To Keep Traffic Spikes from Crashing Your Website https://www.alertbot.com/blog/index.php/2020/03/31/how-to-keep-traffic-spikes-from-crashing-your-website/ Tue, 31 Mar 2020 19:36:22 +0000 https://alertbot.wordpress.com/?p=685 A photo showing blurred lights for traffic moving super fast on three different pathways. Text on the image reads "How To Keep Traffic Spikes from Crashing Your Website"

How To Keep Traffic Spikes from Crashing Your Website

by Louis Kingston

At first glance — and probably second and third as well — having too much traffic seems like a really nice problem to have; like when billionaires struggle to decide which yacht to buy (“I say Thurston, the one with the tennis courts is quite lovely, but the one with the outdoor cinema is so charming”).

However, too much traffic really is a problem, because it causes websites to either dramatically s-l-o-w down (which is terrible) or crash (which is worse than terrible). And right now, as hundreds of millions of people are advised or obliged to stay at home, there are a bunch of e-commerce businesses around the world that are experiencing this harsh, costly reality.

The good news is that your business can — and should — take proactive steps to keep traffic spikes from impaling your website, and causing revenue losses and reputation damage.  Here is the to-do list:

  1. Use a content delivery network (CDN), which is a geographically distributed network of proxy servers and data centers. A CDN helps ensure that visitors — regardless of where they’re located — enjoy fast-loading pages, images, videos and other content, and it also leverages a network of servers to manage traffic spikes. Instead of a single server struggling to handle the load, multiple servers share the burden.
  2. Check, double-check, and while you’re at it, triple-check whether your current server is capable of handling a traffic surge (pay particular attention to any data caps). This assessment is especially important if your business’s website has expanded over the years, but your server capacity has remained the same since day one.
  3. Make sure that all of your software is up-to-date. In addition to patching vulnerabilities, updates can help lower the risk of a virtual traffic jam.
  4. Run a daily backup. No, this won’t prevent traffic-induced website crashes. But yes, it’s a lifeline back to normalcy if a crash occurs.
  5. Use a reliable 24/7 website uptime monitoring solution like AlertBot, which proactively and immediately informs designated individuals (e.g. system administrators, CTO, CSO, etc.) if your website goes down, and can also check to make sure specific scripts and pages are working correctly. What’s more, if required you can use AlertBot’s logs as evidence to inform your host that they need to do a much better job of keeping your website online — or else you’ll head elsewhere.
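Under the hood, the monitoring step above boils down to “fetch the page, then decide whether to alert.” Here’s a minimal, hypothetical sketch of that decision logic (the thresholds and messages are made up for illustration, not AlertBot’s actual rules):

```python
def should_alert(status_code, elapsed_s, slow_threshold=5.0):
    """Return an alert reason string, or None if the check passed.

    A check fails when the site is erroring (4xx/5xx) or responding
    slower than the chosen threshold -- the traffic-spike symptom.
    """
    if status_code >= 500:
        return f"server error {status_code}"
    if status_code >= 400:
        return f"client-facing error {status_code}"
    if elapsed_s > slow_threshold:
        return f"slow response: {elapsed_s:.1f}s"
    return None

print(should_alert(200, 1.2))   # healthy check
print(should_alert(503, 0.4))   # site overloaded or in maintenance
print(should_alert(200, 9.8))   # up, but crawling under load
```

A real monitoring service layers retries, multiple geographic stations, and escalation rules on top of a check like this, but the core pass/fail decision is this simple.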

The Bottom Line
More potential customers than ever before are using the web to find products and services — everything from digital gadgets to financial advisors to home repairs, and the list goes on. When the surge reaches your virtual address, you want to definitively know — and not just hope — that your website is ready, willing and able to handle the traffic.

Give AlertBot a try for FREE. There’s no billing information, no installation, and you’ll be set up within minutes. Click here

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
The (Not-So-Magnificent) 7 HTTPS Errors that Infuriate Customers and Ruin Reputations https://www.alertbot.com/blog/index.php/2019/11/19/the-not-so-magnificent-7-https-errors-that-infuriate-customers-and-ruin-reputations/ Tue, 19 Nov 2019 19:13:39 +0000 https://alertbot.wordpress.com/?p=650 A graphic with a bright orange background and a cartoonish illustration of a man with glasses sitting at his desk facing his computer looking angry. Next to this graphic are the numbers "404" and text "Oops... page not found." Article title above it reads "The (Not-So-Magnificent) 7 HTTPS Errors that Infuriate Customers and Ruin Reputations"

The (Not-So-Magnificent) 7 HTTPS Errors that Infuriate Customers and Ruin Reputations

by Louis Kingston

In the classic flick The Magnificent Seven, a pack of essentially decent but “don’t you dare park your horse in my spot or else you’ll get your spurs blasted” gunslingers come together to rid a village of some nasty bandits. There’s action. There’s drama. There’s tragedy. There’s humor. There’s romance. There’s Steve freakin’ McQueen. What’s not to love?

Well, on the dusty and dangerous internet landscape, instead of a magnificent seven to save the day, there exist seven not-so-magnificent HTTP errors that are impossible to like, let alone love. Why? Because they stand between visitors and the websites they’re trying to reach — which leads to lost customers and wrecked reputations.

Here’s a look at the reprehensible HTTP errors that have their picture on Most Wanted Lists in every post office from Tombstone to Dodge City:

403 Forbidden: The 403 Forbidden error means that the server is absolutely refusing — no ifs, ands or buts — to grant permission to access a resource, despite the fact that a request is valid. Common causes include missing index files, and incorrect .htaccess configuration.

404 Not Found: The 404 Not Found error means that a web page or other resource can’t be found because it simply doesn’t exist at the requested location. Common reasons for this include a broken link, a mistyped URL, or that someone moved or deleted a page and didn’t update the server (which happens a lot).

408 Request Time Out: The 408 Request Time Out error means that the server got tired of waiting for the client to finish sending its request, and after a while, just throws in the towel. Often, this is because of a slow connection or an overloaded server.

410 Gone: Whereas (as noted above) a 404 error implies that there might be some hope — i.e. the target file might be somewhere, just not where it’s supposed to be — the 410 Gone error snuffs out any possible optimism. It’s totally, completely and permanently gone.

500 Internal Server Error: The 500 Internal Server Error means that the server cannot process a request for any number of reasons, such as missing packages, misconfiguration, and overload.

503 Service Unavailable: The 503 Service Unavailable error means that the server is either down because of maintenance, or because it’s overloaded. Either way, the server is conjuring up its inner Gandalf and screaming: “YOU SHALL NOT PASS!”

504 Gateway Time-Out: The 504 Gateway Time-Out error means that a server acting as a gateway or proxy didn’t receive a timely response from the upstream server it depends on. After a while, the downstream server gets the message that it’s not wanted, and says “Oh yeah? Well, I don’t need you either!”
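For teams that script their own spot checks, the seven errors above condense neatly into a lookup table. A minimal, illustrative sketch (the diagnosis wording is ours, not any standard):

```python
# Hypothetical lookup table mapping the status codes above to short diagnoses.
DIAGNOSES = {
    403: "permission refused -- check index files and .htaccess",
    404: "resource missing -- check links, URLs, and moved pages",
    408: "server gave up waiting for the request -- check server load",
    410: "resource permanently gone",
    500: "server could not process the request -- check logs and config",
    503: "server down for maintenance or overloaded",
    504: "upstream server did not respond in time",
}

def diagnose(status_code):
    """Return a human-readable first guess for a failing status code."""
    return DIAGNOSES.get(status_code, f"unexpected status {status_code}")

print(diagnose(404))
print(diagnose(418))
```

A table like this doesn’t fix anything by itself, but it turns a raw status code into an actionable first step for whoever gets paged.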

Calling in the Marshal
The bad news is that these reprehensible HTTP errors, if left unchecked, can cause a lot of damage. Indeed, few things irk and offend website visitors more than seeing an error code. But the good news is that you can call in the Marshal (a.k.a. AlertBot) to restore law and order.

AlertBot constantly scans your site’s pages to watch out for these and other HTTP errors. If and when they are detected, authorized employees (e.g. webmasters, sysadmins, etc.) are proactively notified so they can take swift action and fix the problem.

It’s lightning fast, always reliable, and as smooth as Steve McQueen. Dastardly, good-fer-nuthin’ HTTP errors don’t stand a chance!

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
AlertBot Showdown: Dunkin Donuts vs Starbucks https://www.alertbot.com/blog/index.php/2019/01/29/alertbot-showdown-dunkin-donuts-vs-starbucks/ Tue, 29 Jan 2019 21:17:26 +0000 https://alertbot.wordpress.com/?p=598 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying travel cups of coffee. Text reads "AlertBot Showdown: Dunkin Donuts vs Starbucks Coffee" with the word SHOWDOWN very large at the bottom.

If there’s one snack shop you’re likely to find on any given street corner in your city, there’s a good chance it’s either a Dunkin Donuts or a Starbucks (and in some cases, they sit on opposite sides of the street from each other). Both chains serve up steaming hot caffeinated goodness – at varying price points – as well as other sweet treats. And while different areas of the globe may have more common chains than these two, we East Coast natives have regular access to the fresh beans of these coffee connoisseurs.

It’s no secret that those who rely on a warm, fresh cup of java to get their day started also know these bean beverages affect their daily performance. So we wanted to pose the question – what about the web performance of these respective coffee shops?

To test their website performance, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both DunkinDonuts.com and Starbucks.com from December 1st through Christmas Day, 2018. Given the notoriety of both establishments, we expected their performance to be as strong as their brews, and we weren’t disappointed.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both Dunkin Donuts and Starbucks’ sites performed quite well. Neither saw significant downtime, but each one experienced some sluggish speeds and even load time timeouts on a couple rare occasions.

DunkinDonuts.com experienced 99.96% uptime, with just a few errors recorded due to slow load times. None of these events lasted longer than a couple minutes, and none amounted to any significant downtime. Because of this, we still consider their performance to be quite solid.  (DunkinDonuts.com 8.5/10)

Starbucks.com performed similarly with 99.87% uptime and similar slow page load errors that didn’t amount to significant downtime but at least put a wrinkle in their performance. They experienced four times as many of these errors as Dunkin, so we have to take that into consideration with our rating. (Starbucks.com 8/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
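The per-day figures quoted below are simple averages over repeated test runs. As a rough illustration of how such an average is computed (the numbers here are made up, not AlertBot’s actual data):

```python
from statistics import mean

def average_load_time(samples_s):
    """Average full-page load time over repeated test runs, in seconds.

    Rounded to one decimal place, matching how results are quoted below.
    """
    return round(mean(samples_s), 1)

# Hypothetical runs for one site on one day
runs = [4.2, 5.1, 4.9, 5.0, 4.8]
print(average_load_time(runs))
```

Real browser-based monitoring collects each sample from an actual page render, but the day-level summary is just this arithmetic mean.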

The speed for both sites were relatively close to each other. DunkinDonuts.com’s best speed, on average, was seen on Sunday, Dec. 2 at 4.8 seconds, which isn’t stellar by any means, but not the worst either. Their best time of day, however, was on Wednesday, Dec. 19th at 4am with 2.1 seconds. It’s considerably better, but 4am isn’t exactly prime web traffic time. Dunkin’s worst averaged day was Monday, Dec. 17th at 6.2 seconds. However, their worst time was on Saturday Dec. 22 at 9am with a crawling 10.5 seconds. The site’s overall average speed across the entire test period was 5.6 seconds.  (DunkinDonuts.com 7.5/10)

Starbucks.com didn’t fare too much better in comparison. Their best day on average was Saturday, Dec. 1st with 5.2 seconds. Their best response time was at 7am on Monday, Dec. 17 with 2 seconds. (It’s interesting that their best average time was on Dunkin’s worst averaged day.) Starbucks’ worst day on average was the previous day, Dec. 16, with 6.9 seconds, with their worst response time on average being at 9pm on Friday, Dec. 7th with a slightly-slower-than-Dunkin’s-speed of 10.7 seconds. But, as you can see, both sites performed pretty close to one another. Starbucks.com’s overall average speed during the entire test period was a tad slower, at 6.3 seconds.   (Starbucks.com 7/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

If you’ve been following these competitions at all, you’ll know that no one beats California in website load time speed. However, lately, we’ve been seeing more variety when it comes to which state in the U.S. has the faster speeds. This time around, Nevada wins (for both sites), with DunkinDonuts.com moving at a swift 1.79 seconds in The Silver State. Oregon came in second at 1.8 seconds, with Ohio at 2 seconds. Comparatively, Washington state saw the slowest speed, coming in at 10.8 seconds, with Colorado in second at 9.2 seconds and Texas in third at 9.1 seconds. (DunkinDonuts.com 8/10)

Starbucks.com loaded at 1.4 seconds in Nevada, which was faster than Dunkin’s best time. Their second fastest was 1.5 seconds in Oregon, then 1.7 seconds in Ohio – all better than Dunkin’s best (1.79 seconds). However, Starbucks saw significantly slower load times than Dunkin at the other end, with all of their slowest load times being worse than Dunkin’s slowest: Washington came in at 12.5 seconds, then Colorado at 11.6 seconds, and Texas at 11.4 seconds. So while Starbucks was a little faster than Dunkin at its best, it was also considerably slower at its worst, which is unfortunate.  (Starbucks.com 7.5/10)

Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find their rewards program and get ready to sign up for it. (And we’re writing about it as we’re performing the test.)

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.DunkinDonuts.com into our Chrome browser, it took 15 seconds and 1 click to find the signup page for their rewards program. (OK, maybe this is too easy?)

For www.Starbucks.com, it took one click and 10 seconds to get to the rewards signup page.

For these tests, we attempt to go into them without much prior knowledge of the site’s user side functionality to give it an unbiased test, but this one probably calls for a retest with a different approach.

Let’s try navigating their respective menus and trying to find out about their coffee items.

With this in mind, from the point of typing in DunkinDonuts.com and navigating through their menu to their coffee options, it took 4 clicks and 23 seconds to get to the page with their regular drip coffee and its nutrition info. It’s a nice website and an enjoyable one to navigate.

With the same goal in mind, for Starbucks.com, it took 5 clicks and over 35 seconds to find the brewed coffee, but the confusing menu setup made it tough to find just plain, hot drip coffee. The Dunkin menu has images for all their options, but Starbucks drops most of the images once you get to the menu, so we ended up on the cold brew menu instead. (As it turns out, it was the fifth option, “Freshly Brewed Coffee” that we actually were looking for… you’d think it’d be one of the first options, though… right?)

Given that the first test was inconclusive, the second one was a clear one for us (albeit unexpected). DunkinDonuts.com was quicker and easier to navigate, and much more user friendly.

With that in mind, here are the Usability scores:

(DunkinDonuts.com 9.5/10)
(Starbucks.com 8/10)

 

Verdict

Both sites performed respectably, but when it comes to usability and speed, one unexpectedly outperformed the other—even if just by a little bit. So, we’re pleased to announce this Showdown champion to be…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "DunkinDonuts.com"

]]>
AlertBot Showdown: Staples vs OfficeDepot https://www.alertbot.com/blog/index.php/2018/10/23/alertbot-showdown-staples-vs-officedepot/ Tue, 23 Oct 2018 17:43:08 +0000 https://alertbot.wordpress.com/?p=573 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying office supplies. Text reads "AlertBot Showdown: Staples vs Office Depot" with the word SHOWDOWN very large at the bottom.

Even as our world continues to creep ever closer to being paper-free, trading our paper tablets for iPads, office supply stores have had to reinvent the way they do business and where their focus lies. Staples and Office Depot are two mega-chain retailers who’ve long been in the fight, regularly providing printing services as well as day-to-day workplace necessities like pens, calendars, computer accessories, and so much more. And with all-in-one e-commerce solutions monopolizing the public’s needs (we’re looking at you, Amazon), the desire to shop at these niche market leaders — who typically charge more for the same products — is becoming less and less.

So, for our latest Showdown, we looked at these two office supply bigwigs and used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a few weeks, spanning from August 26 to September 16, 2018. After engaging in this different kind of “Office Olympics,” we were expecting the usual quiet response from two reliable websites (i.e. good performance), but instead found what was equivalent to, well, a fun office chair race gone horribly wrong.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both Staples and OfficeDepot’s sites seemed to perform satisfactorily, with neither site ever really seeing significant downtime, but one of them really seemed to struggle with its load time.

AlertBot ended up returning over 800 alerts from Staples.com in the evaluated time span, with half of them being slow files bogging down the page, and the other half being page load timeouts. This doesn’t necessarily mean the site crashed, just that it took unusually long to load. Their site regularly displayed a pop-up window during this time period promoting signing up for their email list, which seemed to play a part in disrupting the site’s load time and process.  (Staples.com 5/10)

On the flip side, OfficeDepot.com performed much better (despite also having a pop-up on its page). While it saw problems less often, it did experience two failure events, finishing with 98% uptime (compared to Staples’ 100%). The majority of the errors OfficeDepot experienced were slow files or long load times. Even so, its worst times came in the middle of the night (a frequent site maintenance window), which is common for most sites. (OfficeDepot.com 7/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Staples.com’s speed tests proved that load times were a regular issue. Its best day, on average, was Monday, September 17th with 7.9 seconds. It’s not the worst load time, but given that most sites are expected to load in 2 to 3 seconds these days, it’s almost three times that. Their best time of day was on Thursday, September 6 at 10am with 3.3 seconds. The worst day, on average, was Friday, September 7th with 10.3 seconds, while the worst time of day was at 1am on Sunday, September 9th with a sluggish 13.8 seconds.  (Staples.com 7/10)

OfficeDepot.com actually fared worse, comparatively. Their best day proved to be Thursday, September 6 with 9.9 seconds for the page to load. Their best time of day was at 6pm on Wednesday, September 5th at 6.4 seconds. Their worst is significantly worse, with Monday, August 27th seeing an average of 12.5 seconds, and the worst time of day being on the same day at 3am with 16.8 seconds! (OfficeDepot.com 6/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

Typically, for the geographic tests, California is king, always turning in the fastest response time. For Staples, though, it was actually North Carolina, which saw an average of 3.7 seconds of page load time. Washington, DC was second at 4.7 seconds, and New York third at 5.2 seconds. The slowest results came from Missouri at 15.1 seconds, followed by New Jersey at 15 seconds. Oddly enough, California, Florida, Colorado and Virginia all averaged 15 seconds as well — which is unusual. (Staples.com 6.5/10)

Things were the norm for OfficeDepot, however. They saw their fastest speeds in California, at 7.5 seconds, with Virginia being second fastest at 7.7 seconds. Their slowest performance was Missouri with a crawl of 19.9 seconds, and Utah followed it up at 15.6 seconds. (OfficeDepot.com 6/10)

These aren’t the worst website load times we’ve seen, but they also weren’t anything to brag about either.

Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find an office executive chair and add it to our shopping cart.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.staples.com into our Chrome browser, it took 30 seconds and 5 clicks to search for “office executive chair,” click on one to view its product page, add it to the cart, and click “checkout.” (It had us thinking “That was easy!”)

For OfficeDepot.com, it took about 40 seconds and 6 clicks to get to the checkout process. OfficeDepot had a pop-up as soon as we got to the site which added one click, and then clicking on the cart and going to the checkout seemed to be a clunkier experience.

It’s a tough call for usability, but we did find the Staples checkout process to be a tad smoother.

All things considered, here are the Usability scores:

(Staples.com 9/10)
(OfficeDepot.com 8/10)

 

Verdict

It’s surprising how closely these two office supply giants performed – and how disappointingly, as well. Still, neither was so bad as to experience many full-on website failures, but both could benefit from some serious attention paid to increasing their website speed. Neither site really stands out above the other, because the good and the bad often balanced each other out, but when it comes down to sheer usability as a tie breaker, we feel the verdict is…

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Staples.com"

]]>
3 Reasons Why Website Speed is More Important than Ever https://www.alertbot.com/blog/index.php/2018/10/15/3-reasons-why-website-speed-is-more-important-than-ever/ Mon, 15 Oct 2018 18:01:36 +0000 https://alertbot.wordpress.com/?p=569 A graphic showing two loading bars. The top one, which is loading faster has the silhouette of a cheetah running. The bottom bar, which is much slower, has the silhouette of a snail.

3 Reasons Why Website Speed is More Important than Ever

by Louis Kingston

Today’s business environment is relentlessly fast-paced. Today’s startups blast into tomorrow’s enterprises. And just as rapidly, today’s unicorns take a one-way journey into “hey, whatever happened to…” country. However, there’s another critical piece of the velocity puzzle that many businesses are missing, and it’s costing them customers and profits: the speed of their website.

Speed Kills… and Speed Saves

Nearly 50 years ago, the government introduced the phrase “speed kills” to warn drivers that going too fast from point A to point B could result in a detour to point C (the police station), point D (the hospital) or point E (the morgue). It was good advice then, and it’s still good advice now.

But when the scene shifts from the asphalt freeway to the information superhighway, speed doesn’t kill anything. On the contrary, it keeps websites alive as far as visitors are concerned. Here are the 3 reasons why:

  1. Speed Makes Websites Sticky

The word “bouncy” has a happy and positive feel to it, while the word “sticky”…well, it doesn’t. Nobody shows up to a birthday party excited to jump around in the sticky castle, and swimming pool diving boards wouldn’t be doing their job if people stuck to them (although it would be kind of hilarious).

But when it comes to websites, sticky is glorious and bouncy is dreadful — and that’s where speed makes a massive difference. A study by Kissmetrics found that a one second delay in load time can send conversion rates plunging by seven percent! Think about that. Actually, don’t think about that. Just read this sentence. That took a whopping two (!) seconds.
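To make that concrete, here is a quick back-of-the-envelope calculation in Python. The baseline revenue figure is invented purely for illustration; the 7%-per-second drop comes from the Kissmetrics finding above.

```python
# Hypothetical illustration of the Kissmetrics finding: conversions fall
# about 7% for each extra second of load time. The revenue figure below
# is made up for the sake of the arithmetic.

DROP_PER_SECOND = 0.07      # 7% relative conversion loss per second of delay
DAILY_REVENUE = 100_000     # assumed baseline revenue for an example store ($)

def daily_loss(delay_seconds: float) -> float:
    """Revenue lost per day if conversions drop 7% per second of delay."""
    remaining = (1 - DROP_PER_SECOND) ** delay_seconds
    return DAILY_REVENUE * (1 - remaining)

print(round(daily_loss(1)))        # 7000  -- one extra second, per day
print(round(daily_loss(1) * 365))  # 2555000 -- the same second, per year
```

Even with invented numbers, the shape of the result is the point: a single second of delay compounds into serious money over a year.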

  2. Speed is SEO Rocket Fuel

An old joke in the SEO world goes like this: “Where’s the best place to hide a dead body? Page two of Google.” (And in related news, an old conversation among psychologists is: “Why do SEO people make jokes about hiding dead bodies?”)

Macabre humor aside, the point is simple to understand: for most (if not all) of their keywords, businesses either need to be on page one of Google — and preferably in the top three positions — or they might as well be advertising in the Yellow Pages (ask your grandparents).

Once again, speed is a big part of the SEO story. Google – which is obsessively secretive about how its algorithm works (the first rule of Google Search Club is that you don’t talk about Google Search Club) – has actually gone ahead and formally confirmed that page speed is a significant SEO ranking factor for both mobile and desktop searches.

The moral of this story? All else being equal, a website that loads faster will rank higher than one that loads slower. And in the long run, that could mean the difference between surviving and shutting down.

  3. Speed Influences Perception

Einstein revealed that time, quite literally, is relative. But you don’t have to become a physicist or get yourself on a million memes to experience the deep truth of this in your bones. Here’s a fun little experiment:

Imagine that your favorite football team is losing a very important game. It’s late in the fourth quarter, and your beloved team is behind by six points. Although the clock is ticking down one second at a time, in your view the time is racing by. Surely, the clock must be rigged!

Now, imagine that your team is ahead vs. behind. The clock is still ticking down one second a time, but to you it’s not racing — it’s grinding slowly and painfully. Yet again: the clock must be rigged!

What this simple example demonstrates is what psychologists dub the perception of speed. Essentially, this means that our emotions influence how we grasp the velocity of passing time. Just a few seconds can seem like the “blink of an eye,” or a tedious wait — as we all know from toiling at the (not-so) express line in the grocery store.

The direct link to website speed here is unmistakable: visitors dislike waiting for websites to load. Actually, they hate it. Each extra second compounds their unhappiness and makes it more likely that they’ll exact revenge by smacking the back button on their browser – never to return.

No, this doesn’t mean that websites must load instantaneously, like flipping channels on a TV. Technology isn’t there yet, and visitors aren’t unreasonable or unrealistic. But yes, it does mean that speed is connected to UX and, ultimately, to brand: fast loading times create positive experiences and emotions that get associated with the brand, while slow loading times do the opposite.


The Bottom Line

Website speed has always been important. But these days, it’s crucial — and in many cases, it’s THE MOST IMPORTANT factor. After all, it really doesn’t matter how amazing a website is and what it offers, if visitors never get there in the first place.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

 

]]>
AlertBot Showdown: Moviepass vs Sinemia https://www.alertbot.com/blog/index.php/2018/08/21/alertbot-showdown-moviepass-vs-sinemia/ Tue, 21 Aug 2018 18:29:00 +0000 https://alertbot.wordpress.com/?p=542 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying membership cards and ticket stubs. Text reads "AlertBot Showdown: moviepass vs sinemia" with the word SHOWDOWN very large at the bottom.
With streaming services like Spotify, Apple Music and Amazon redefining how we consume music, and Netflix, YouTube and Hulu changing how we consume movies and TV at home and on the go, it probably should be no surprise that the subscription service concept would make its way to the cinema. MoviePass has long been a leader when it comes to theater-going subscriptions, but Sinemia is a rising competitor that has thrown its hat into the ring to fight for a share of the movie-going, popcorn-munching theater ticket buyers. Both services allow movie fans to pay a set monthly (or annual) fee to see movies on the big screen at a discounted price.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from July 1 to July 22, 2018. As both sites and services are continuing to grow and change (Heaven knows MoviePass will probably change their rules and operations again before you finish reading this sentence), we weren’t surprised to see how similar the sites for each service performed.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.
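AlertBot’s production monitoring drives real browsers, but the core idea of a failure-event check can be sketched in plain Python. Everything below – the threshold, the labels, and the `check_once` helper – is our own illustrative invention, not AlertBot’s actual implementation or API.

```python
import time
import urllib.request
from typing import Optional

# A minimal sketch of a synthetic availability probe: fetch the page,
# time it, and classify the result. The threshold and labels are
# assumptions for this example, not AlertBot's real categories.

SLOW_THRESHOLD = 5.0  # seconds; assumed cutoff for a "slow page" warning

def classify(status: Optional[int], elapsed: float,
             threshold: float = SLOW_THRESHOLD) -> str:
    """Label one probe result the way a reliability report might."""
    if status is None or status >= 500:
        return "failure"       # no response, timeout, or server error
    if elapsed > threshold:
        return "slow page"     # loaded, but slowly enough to warn about
    return "ok"

def check_once(url: str, timeout: float = 30.0) -> str:
    """Fetch a URL once, time it, and classify the outcome."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
    except Exception:          # DNS failure, timeout, HTTP error, etc.
        status = None
    return classify(status, time.monotonic() - start)
```

A real monitor would run something like `check_once` on a schedule from multiple locations and log each non-"ok" result as a potential failure event.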

Both MoviePass and Sinemia performed well here, but one did seem to struggle a little more than the other.

MoviePass.com averaged 98.2% uptime, due to several days where the site performed slower than usual, causing pages to not load fully – even triggering a strange account lookup error on the front page for several hours on July 14th. This resulted in 18 failure events cataloged by AlertBot, with an average failure duration of 32 minutes. This doesn’t all mean hard downtime, per se, but the details did show that the site was struggling with its speed and load times. (MoviePass.com 7/10)
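Those numbers hang together, as a quick sanity check shows: the 1.8% of a 22-day window that 98.2% uptime leaves on the table is roughly the same amount of time as 18 events of about 32 minutes each.

```python
# Cross-checking MoviePass.com's figures from the monitoring period above:
# 98.2% uptime over the July 1-22 window vs. 18 failure events averaging
# 32 minutes each.

WINDOW_DAYS = 22                                 # July 1 through July 22
total_minutes = WINDOW_DAYS * 24 * 60            # 31,680 minutes monitored

implied_downtime = total_minutes * (1 - 0.982)   # minutes implied by 98.2% uptime
logged_downtime = 18 * 32                        # minutes from the logged events

print(round(implied_downtime))   # 570 (about 9.5 hours)
print(logged_downtime)           # 576 (about 9.6 hours)
```

The two figures agree to within a few minutes, which is what you would hope to see from a consistent monitoring report.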

Comparatively, Sinemia.com saw 99.98% uptime with 1 failure event, although it wasn’t anything that spelled major downtime. At worst, it appeared to be a slow page / busy error that didn’t last long enough to qualify as site downtime. Overall, Sinemia proved to be pretty reliable. (Sinemia.com 9/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

MoviePass.com saw acceptable page load speeds overall, with their best average day being Wednesday, July 4th with 3.9 seconds. The best time of day was 1am on Friday, July 20th (which isn’t a popular time to even be using a site like theirs) at an average of just 1.6 seconds. On the other side of the proverbial coin, the slowest day was Saturday, July 14 with an average time of 8.9 seconds, and the worst time of day was also on the same day at noon (yikes!) with an embarrassing 14.1 seconds.  (MoviePass.com 7.5/10)

Sinemia didn’t perform dramatically better, earning its edge on consistency rather than raw speed: its best average speed for a single day was 5.4 seconds on Saturday, July 21, and its best time of day was Wednesday, July 4th at 5pm with 2.7 seconds. Its slowest day was Monday, July 23rd with 7.3 seconds, with the slowest time being on July 2nd at 10pm with 10.2 seconds. (Sinemia.com 8/10)
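For the curious, "best average day" and "worst average day" figures like these can be derived from raw load-time samples with a simple grouping step. This sketch uses invented sample data and `(day, hour)` pairs for simplicity; it is not AlertBot’s reporting code.

```python
from collections import defaultdict
from statistics import mean

# Invented load-time samples: (day, hour-of-day, seconds to load).
samples = [
    ("2018-07-04", 1, 2.1), ("2018-07-04", 13, 4.0),
    ("2018-07-14", 12, 14.1), ("2018-07-14", 18, 9.0),
    ("2018-07-20", 1, 1.6), ("2018-07-20", 9, 3.8),
]

by_day = defaultdict(list)
by_hour = defaultdict(list)
for day, hour, seconds in samples:
    by_day[day].append(seconds)
    by_hour[(day, hour)].append(seconds)

day_avgs = {d: mean(v) for d, v in by_day.items()}
hour_avgs = {h: mean(v) for h, v in by_hour.items()}

best_day = min(day_avgs, key=day_avgs.get)
worst_day = max(day_avgs, key=day_avgs.get)
best_hour = min(hour_avgs, key=hour_avgs.get)

print(best_day, round(day_avgs[best_day], 2))    # 2018-07-20 2.7
print(worst_day, round(day_avgs[worst_day], 2))  # 2018-07-14 11.55
print(best_hour, round(hour_avgs[best_hour], 2)) # ('2018-07-20', 1) 1.6
```

The same grouping works for any bucket (state, browser, and so on), which is how the Geographic comparisons further down can be produced from the same raw samples.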

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

MoviePass.com performed the fastest in California with 1.8 seconds, with Florida coming in second at 2.4 seconds. The site performed slowest in Missouri with a sluggish 10.2 seconds, with Utah coming in second at 8.5 seconds. (MoviePass.com 8/10)

For Sinemia.com, California was also the fastest at 2.9 seconds, and Virginia was second fastest at 3.5 seconds. Missouri was also the slowest, at 11.3 seconds, with Utah being second slowest at 9.1 seconds. (Sinemia.com 7.5/10)

Neither site was all that impressive in terms of speed – which is interesting, considering there isn’t a whole lot of content on either site to slow it down.

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to start the service signup process (but not complete any forms).

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.moviepass.com into our Chrome browser, it took a mere 18 seconds and 2 clicks to see their plans and get to the signup form. It was a piece of cake.

For Sinemia.com, it was actually just as smooth. In 17 seconds and 2 clicks, we were able to select a plan and get to the signup page.

It’s a tough call for usability. They’re simple processes, but they get the job done and we have no complaints.

All things considered, here are the Usability scores:

(MoviePass.com 10/10)
(Sinemia.com 10/10)

 

Verdict

Usability is rarely this much of a dead heat between both sites, so it leaves us looking almost exclusively at the other categories to draw a conclusion.

Even allowing that MoviePass’s speed hiccups may stem from a greater volume of traffic, Sinemia.com seems the clearer choice for reliability as a whole, though the sites are quite close. That bad day on July 14 really hurt MoviePass’s performance during this evaluation period, and it can’t be ignored. So, with that said, we believe the verdict is…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Sinemia.com"

]]>
AlertBot Showdown: Playstation vs Xbox https://www.alertbot.com/blog/index.php/2018/04/06/alertbot-showdown-playstation-vs-xbox/ Fri, 06 Apr 2018 19:30:53 +0000 https://alertbot.wordpress.com/?p=517 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying video game system controllers. Text reads "AlertBot Showdown: Playstation vs XBox" with the word SHOWDOWN very large at the bottom.

It may have been squashing a goomba while punching a coin out of a brick, dodging barrels being thrown by a grumpy gorilla, sorting oddly shaped falling blocks into interlocking patterns or simply catapulting miffed fowl at a group of defenseless pigs on your mobile phone, but chances are high that everyone has played a video game at one point in their life.

Poor web performance is no game any self-respecting owner of a website should play. We recently aimed our sights at the gaming industry and picked out two heavy hitters to evaluate: Xbox and Playstation. While their websites may not be the main point of interest for gamers, they’re relied upon for information, updates and even online digital game sales. Their online gaming servers may be the most important thing to keep running smoothly in gamers’ minds, but these top players in the industry will want to make sure their website stays up and always accessible.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from February 4, 2018 to February 25, 2018. Both sites performed well—as can be expected from parent companies Microsoft (Xbox) and Sony (PlayStation)—but, as usual, one performed just slightly ahead of the other, even if not by much.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both websites experienced 100% uptime, but both sites encountered minor errors that served as a few speedbumps along the way. Still, it wasn’t enough to qualify as downtime.

Xbox.com, despite its 100% uptime, experienced around 50 “slow page” warnings and over 20 page load timeouts (where something on the page takes a bit longer to load, slowing the page’s overall performance down). Xbox.com also returned an SSL Certificate expiration notice. However, none of these qualified as significant outages, and for that we still have to give them props. (Xbox 9/10)

Playstation matched that 100% uptime and fared a lot better when it came to the little errors. They registered only 7 timeouts and 5 slow page loads, and for that we give them slightly higher marks.  (Playstation 9.5/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

Speed is crucial to the gamer – be it game load times (who else hates waiting for spinning icons to finish to get us past a cut scene or moving on to a new map in a game?) or server responsiveness – so a speedy game company website is key. Xbox.com experienced pretty quick load times, with its best day being February 24th with an average of 4.6 seconds. Its best response time, however, was on February 23rd at noon with 2.2 seconds. On the flipside, its worst day was February 12 with 6.7 seconds (which isn’t all that bad), but their worst hour proved to be on February 11th at 11pm with a sluggish 13.1 seconds. (Xbox 8.5/10)

Surprisingly, Playstation turned out to be just a little bit slower, with their best day average being 6 seconds on February 22nd. Their best time by the hour was on the same day at noon with 2.3 seconds, just a hair slower than Xbox’s best time. Their worst day was a full second longer on February 11th with 11.7 seconds, and their worst time by the hour was also 13.1 seconds, but on February 10th at 7am. (Playstation 8/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California seems to win out most of the time as the fastest location for load times and for Xbox.com, it was no different. California saw load speeds of 2.1 seconds on average, with Florida coming in second at 2.2 seconds. Georgia, however, saw an average worst time of 10.3 seconds with Missouri coming in second at 9.2 seconds. (Xbox 8.5/10)

Playstation.com actually turned in slightly more sluggish results geographically, too. Their best location was California, as well, but it was 2.5 seconds, and Florida was a close second at 2.7 seconds. Playstation’s slowest spots were also in Georgia and Missouri, at 12.6 seconds and 11.2 seconds, respectively. It’s not the worst we’ve seen, but Xbox clearly performed better. (Playstation 7.5/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to add a digital download of a popular video game to the shopping cart and start the checkout process.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.xbox.com into our Chrome browser and clicking around to find the Xbox One games, choosing the featured one (which, in this case was Dragonball FighterZ), clicking “Buy Now” and getting to the account login screen, it took 1 minute and 10 seconds. From the homepage, it took 7 clicks to get to the checkout process. It’s been a while since we’ve last visited their site, so our experience was fresh, but we encountered some significant slow loading times when getting to the product page. We actually added an additional click to the process because the “Buy Now” button didn’t load properly at first (and did nothing upon its first click). Overall, we got to do what we set out to do, but the process could have gone a lot smoother.

We were hoping for a better experience from Playstation, and we got one. From the point of typing www.playstation.com into our Chrome browser, it took 4 mouse clicks and 35 seconds to find a featured video game (in this case, Bravo Team), and get to the checkout stage (which was also an account login screen). There was some delay on first clicking on the game title, but it still loaded quickly and allowed us to get to the end of the process fast.

Both sites allowed us to get the job done in a rather speedy manner, but Playstation’s site gave us a much more positive experience.

With that said, here are the Usability scores:

(Xbox 8/10)          (Playstation 9.5/10)

 

Verdict

Both sites performed very well, but that positive user experience helped push one over the other, albeit only slightly. So while it was a tough call to make, we have come to a conclusion —

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Playstation.com"

]]>
AlertBot Showdown: Reebok vs Nike https://www.alertbot.com/blog/index.php/2018/01/09/alertbot-showdown-reebok-vs-nike/ Tue, 09 Jan 2018 20:00:53 +0000 https://alertbot.wordpress.com/?p=480 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are wearing athletic brand headwear. Text reads "AlertBot Showdown: Reebok vs Nike" with the word SHOWDOWN very large at the bottom.

Whether you’re hitting the gym or the trails, you’re likely to be lacing up with some active footwear that helps you burn calories and exercise in comfort and style. When it comes to activewear, there are many companies these days who contribute their accessories and gear to our daily workout regiments, however, two major players come to the front of our minds when it comes to popular footwear brands.

For our latest AlertBot Showdown, we picked frontrunners Nike and Reebok to evaluate the website performance of each athletic-wear giant’s online presence.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from October 1, 2017 to October 22, 2017. While both sporty sites performed well, it became pretty clear after a significant trip-up that one site left the other in the dust.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

For the first time in our experience of tracking sites for a Showdown, one of the sites in the running went down while we were actually in the office. That gave us the ability to watch the event as it unfolded while AlertBot performed its tests against the failing site. Reebok.com hit a snag on October 13 around 3:30pm EST, and it took nearly a full hour for their site to recover. We manually checked their site from our desks at 4pm, and the site was still down. We checked again at 4:15 and the site was back up; however, only text was loading – no images. By 4:30pm, when we checked one more time, Reebok.com was back up in its entirety. It was the only failure event that Reebok.com encountered during the weeks it was tested for this Showdown, but it was definitely a doozy. During this time period, their average uptime was still 99.85% – proof that a “99% uptime” figure can still contain an hour of critical downtime. And for a retail site, that could truly prove costly. (Reebok 7/10)
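That last point is worth quantifying: an uptime percentage puts a ceiling on outage time, but even impressive-sounding figures leave room for painful outages. A quick sketch of the arithmetic (the 30-day window is just an example):

```python
# How much downtime each uptime level actually permits over a 30-day
# window. This quantifies the point that an hour-long outage can hide
# inside a healthy-looking uptime percentage.

MINUTES_PER_30_DAYS = 30 * 24 * 60  # 43,200 minutes

for uptime in (0.99, 0.999, 0.9999):
    allowed = MINUTES_PER_30_DAYS * (1 - uptime)
    print(f"{uptime:.2%} uptime -> {allowed:.1f} minutes of downtime per 30 days")
```

At 99% uptime that is over seven hours of permitted downtime a month; Reebok’s hour-long outage fits comfortably inside its 99.85% figure.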

On the other hand, Nike.com experienced no significant failure events and only occasionally experienced minor issues like a slow page file or a “timed out” error. From the starting line, Nike is already on the fast track to success between the two brands. (Nike 9.5/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

Speed is everything for the image of brands like these, which makes it a bit ironic that both sites seem to struggle a little in this area. Reebok’s fastest average speed was on October 4th with 6.4 seconds load time. Their worst average speed was October 23 at 7.9 seconds. They’re not drastically different, but that’s not an impressive load time.  (Reebok 7/10)

At this point, one might expect Nike to sprint past Reebok in the load time category, but Nike didn’t fare much better, with 6.3 seconds being their fastest average speed on October 23 (which is coincidentally the day of Reebok’s slowest average), and Nike’s slowest average speed was 7.5 seconds. Again, they’re not great speeds, but in this case, Nike edges out Reebok, even if only by a slight skip rather than a jump. (Nike 7/10)

 

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

 

 

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

Looking at site response time geographically tells a different story. First off, Reebok shows that they had the fastest load time in Texas with an average of 3.7 seconds. Their second fastest time was in New Jersey at 4.8 seconds. Virginia produced the slowest return, with an average of 6.9 seconds. (Reebok 7.5/10)

Yet again, Nike only performed slightly better, with California showing the fastest average speed of 3.2 seconds and Texas showing the second fastest at 4.5 seconds. However, Nike performed worse than Reebok when it came to slowest location, with Illinois taking the cake for worst average speed, at 9.7 seconds! (Nike 7/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater, or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to add their latest running shoe to the shopping cart and start the checkout process.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.reebok.com into our Chrome browser and clicking around to find a Men’s Running Shoe, choosing the first one, choosing a size, adding it to the cart and clicking “checkout,” it took 36 seconds. From the homepage, it took 5 clicks to get to the checkout process. At first glance at the homepage of the site, it seemed like it might be a challenge to actually find what we’re looking for, but it was a pretty easy shopping experience.

From the point of typing www.nike.com into our Chrome browser, it took 8 mouse clicks and 48 seconds to find a men’s running shoe and get to the checkout stage. Upon first visiting the site, the visitor is hit with an ultra closeup of a bunch of kids in gray Nike hoodies and it takes most of the page hostage. We scrolled down to the first running shoe advertised and clicked on it, only to find that it was only a women’s shoe (which is not mentioned on the image on the homepage). We then had to click around to the men’s department, for this task’s purpose, in order to find a shoe and continue the process. Both sites get the job done, but Reebok was a more pleasant shopping experience.

With that said, here are the Usability scores:

(Reebok 9/10)        (Nike 8/10)

 

Verdict

Both sites performed respectably, but we can’t ignore that failure that Reebok experienced on the 13th. Other than that, the sites performed quite similarly (and we actually preferred Reebok’s shopping experience a little more than Nike’s). Still, since we’re really weighing in here on web performance, the winner is rather clear —

Graphic rendering of a robot with a triangular head and circle eye holding up a sign that reads "Nike.com"

]]>
Black Friday / Cyber Monday Showdown: Amazon vs Walmart vs Target https://www.alertbot.com/blog/index.php/2017/11/29/black-friday-cyber-monday-showdown-amazon-vs-walmart-vs-target/ Wed, 29 Nov 2017 00:34:35 +0000 https://alertbot.wordpress.com/?p=465 A graphic with a yellow starburst in the center and two robots charging towards a third robot. The two on the left are carrying shopping bags. The one on the right is carrying a box. The text reads "Black Friday - AlertBot Showdown: Target vs Walmart vs Amazon" with the word SHOWDOWN very large at the bottom.

It’s that time of year again, where sales-conscious bargain chasers brave the throngs of other sale hunters in the frigid November early morning air on that most dreaded of retail shopping days: BLACK FRIDAY. Just hours earlier, many of these same credit-card-wielding warriors were huddled around a table with family, giving thanks once again while stuffing themselves to their waistline’s discontent with mashed potatoes, roasted turkey and homemade pie. The juxtaposition of these two contradicting practices is staggering, but it’s no less the holiday tradition year after year.

As we approach another Christmas holiday, the world of ecommerce continues to ramp up the way they approach Black Friday–and its younger electronic sibling, Cyber Monday–with many now starting their sales right after Halloween. Accordingly, we decided to do something special for our next Website Showdown: a Black Friday / Cyber Monday edition that pits the ecommerce colossus Amazon against the websites for brick-and-mortar retail mega-stores Walmart and Target. It’s a truly epic battle royale to see how each site performs during the biggest shopping days of the year.

So, as usual, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor all three sites from Thanksgiving through Black Friday and Cyber Monday, spanning from November 23, 2017 to November 27, 2017. We expected strong, reliable performance during the entire run and we were not disappointed. The results were nothing short of impressive.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Usually for this section, we evaluate each site’s performance in detail, drilling down to the specific errors each one faced and the different types we usually see (like Slow Page Files, Timeouts, etc). It’s rare for neither site in a two-site Showdown to return a single error, much less all three sites in a three-site Showdown. Yet in this special evaluation, not one single, solitary error was found among them. All three sites avoided any kind of failure event or significant error. With the stakes so high for three of the biggest retailers on the most significant sale days of the year, one would expect nothing less. So, with that said, each site earns a perfect score for Reliability.

(Amazon 10/10)
(Walmart 10/10)
(Target 10/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

Sites like Amazon, Walmart and Target boast very graphics-driven designs, and especially with monstrous sale event days like these, the graphics are often big, bold, and frequently changing.

With that said, over Amazon.com’s 5-day run, their fastest day on average was Sunday, November 26th at 4.3 seconds. It’s not the slickest speed a site can have, but it’s certainly not bad. On their slowest day, Cyber Monday, Amazon still clocked in at an average of 5 seconds, which is still not too shabby. Looking at specific times of day, the best hour was 5AM on Sunday with an impressive 3.4 seconds, while Cyber Monday also saw the slowest hour, 7AM, at 6.7 seconds.
(Amazon 9/10)

Walmart.com held their own surprisingly well during this time, too. Their best average day was Thanksgiving Day, November 23rd at 4.2 seconds, just barely edging ahead of Amazon. Their worst day on average was Saturday, November 25th, also at 5 seconds. Finally, their best hour on average was on Thanksgiving at a remarkable 2.7 seconds at 5PM. Their worst time on average was 6.4 seconds at 2AM on Sunday, November 26.
(Walmart 9.5/10)

Last, but certainly not least, Target.com didn't perform quite as well as the other two, but they still performed respectably, especially considering that their site, too, avoided any failure events. Their best day for speed, on average, was Thanksgiving Day at 5.2 seconds, which is worse than both Amazon and Walmart's worst days. Target's slowest day on average was Sunday, November 26 at 5.4 seconds, which, at the very least, shows great consistency in the performance of the retail chain's online presence. Their fastest hour turned out to be 9AM on Black Friday at 3.9 seconds, with their slowest being 4AM on Cyber Monday at 7.6 seconds.
(Target 8.5/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California tends to see the fastest web transaction speeds in the country, and in this test scenario, it once again comes out on top for each website. For Amazon.com, the titan of ecommerce saw average load times of 2 seconds in The Golden State, with its next-fastest location being Texas at 3.2 seconds. As for its slowest locations, Illinois came in at the bottom with 6.6 seconds, with Georgia just above at 6.3 seconds.
(Amazon 9/10)

Walmart.com was only a tenth of a second faster, seeing an average load time of 1.9 seconds in California, and also came in faster in Texas at 2.7 seconds. But Walmart saw its slowest states swap places, with Georgia coming in at the bottom at 6.6 seconds and Illinois right above at 6.5 seconds.
(Walmart 9.5/10)

Target loaded on average at 2.7 seconds in California, with Texas coming in next at 3.5 seconds. Again, Target’s fastest speeds proved to be slower than their competitors. The slowest average speed that Target saw in the U.S. was also Georgia, at 7.2 seconds, but Washington stepped in as their second slowest, at 7 seconds flat.
(Target 8.5/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we always select a common task a user might typically try to accomplish when visiting the sites we're testing and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater. As with our most recent Showdown, Lowe's vs. Home Depot, we decided to see what the experience would be like to use these three different websites to add a common product to the shopping cart.

For each of these processes, we set out to add the PS4 version of the new video game Star Wars: Battlefront II to our shopping cart. To begin each process, we opened a new tab in Google Chrome and typed in the site's URL.

From the point of typing www.amazon.com into our Chrome browser, typing “Star Wars Battlefront 2” into the store’s search box and adding it to the cart, it took 30 seconds. From the front page, it took about 5 clicks (including selecting the autocomplete suggestion in the search bar) to get to the final “Place Order” window.

From the point of typing www.walmart.com into our Chrome browser, it took about 4 clicks and 35 seconds to get to the Cart Checkout window. The autocomplete was a little clumsy to deal with (it was tough to tell if the browser was really proceeding to load the site), but overall, it was a decent experience.

From the point of typing www.target.com into our Chrome browser, it took about 5 clicks and 27 seconds to get to the Cart Checkout window.

All three sites were good experiences, although each one has a very different feel. It's a tough call to say which user experience we found to be better, so we decided to run a second test. This time, we chose a different product, Wonder Woman on Blu-ray, and used Mozilla Firefox instead.

The process of finding the Blu-ray and getting to the checkout process on Amazon took about 4 clicks and 25 seconds. The process on Walmart.com took 26 seconds and 5 clicks. On Target.com, it took roughly 24 seconds and 4 clicks. This time, we noticed that Target's search results include a convenient "Add to cart" option next to each item, which neither Walmart nor Amazon offers. This definitely gives Target a slight edge over their competitors. And with that being the only real significant difference, outside of its slightly faster completion time, we'll have to say Target wins the Usability portion of this Showdown.

(Amazon 9.5/10)
(Walmart 9.5/10)
(Target 10/10)

 

Verdict

With stakes this high, you would only expect the best from the leaders in the retail industry, so it comes as no surprise that the results were so good and so close. This may be the toughest Showdown we’ve had to score yet, especially with three hats in the ring this time around.

But, with all things accounted for – reliability, speed, geographical performance, and the site’s usability – we’ve reached our verdict:

WINNER:

 Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Walmart.com"

 

 

]]>
Website Monitoring Leader AlertBot Adds Mac Support for Web Recorder & Enhances SSL Testing Functionality https://www.alertbot.com/blog/index.php/2017/10/25/website-monitoring-leader-alertbot-adds-mac-support-for-web-recorder-enhances-ssl-testing-functionality/ Wed, 25 Oct 2017 12:04:52 +0000 https://alertbot.wordpress.com/?p=460

AlertBot logo with triangle logo image

 

Website Monitoring Leader AlertBot Adds Mac Support for Web Recorder & Enhances SSL Testing Functionality

AlertBot’s multi-step web recorder, which has been available to Windows users for several years and now supports Mac users, is a fast, easy and reliable way to verify that all interactions on a website are working properly.

 

ALLENTOWN, PA (October 25, 2017) – AlertBot announced today that a new update adds Mac support to its acclaimed multi-step web recorder, along with several other security and usability improvements.

AlertBot’s multi-step web recorder is a fast, easy and reliable way to verify that all interactions on a website are working properly. Customers simply click record, interact with their website as desired (e.g. perform a search, put items in a cart, and so on), and upload their finished script to AlertBot, which then automatically performs these pre-set actions at regularly scheduled intervals. Any variations or concerns are immediately sent to customers for investigation and resolution.
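Conceptually, a recorded multi-step script is an ordered list of actions, each with an expected outcome, that a monitoring agent replays on a schedule and flags at the first deviation. A minimal sketch of that idea (the script structure, step names, and stub browser here are hypothetical, not AlertBot's actual format):

```python
# A hypothetical multi-step script: each step names an action and the text
# that indicates success. The replay loop reports the first failing step.

STEPS = [
    {"action": "open", "target": "https://example.shop/", "expect": "Home"},
    {"action": "search", "target": "blue widget", "expect": "Results"},
    {"action": "add_to_cart", "target": "item-42", "expect": "Cart (1)"},
]

def replay(steps, perform):
    """Run each step via perform(step) -> page text; return (ok, failed_index)."""
    for i, step in enumerate(steps):
        page = perform(step)
        if step["expect"] not in page:
            return False, i  # first deviation would trigger an alert
    return True, None

# Stub standing in for a real browser session.
def fake_browser(step):
    return {"open": "Home page", "search": "Results: 3 items",
            "add_to_cart": "Cart (1) updated"}[step["action"]]

ok, failed = replay(STEPS, fake_browser)
```

In a real service the `perform` hook would drive an actual browser, and the replay would run at regularly scheduled intervals from multiple locations.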

Customers can also re-record their script at any time through AlertBot’s desktop dashboard, or through the re-designed viewer for smartphones and tablets, which per the update is now faster and easier to use.

“We are excited to bring our multi-step web recorder to our Mac customers, which allows them to change their multi-step testing scripts more easily,” commented Pedro Pequeno, President of InfoGenius.com, Inc. which owns and operates AlertBot. “Mac users are an important and valued part of our user base, and we want to make sure they continue to have the best tools available.”

Also featured in the update are new advanced SSL error-handling and TLS features, which give customers greater control over site diagnostics and help them meet PCI compliance standards. For example, customers can now choose how to handle SSL certificate expiration dates, domain mismatches, and other common certificate issues, as well as specify which Transport Layer Security (TLS) versions to allow.
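For context, these checks resemble what Python's standard `ssl` module exposes: a minimum TLS version can be enforced on a connection context, and a certificate's `notAfter` field can be parsed to see how close it is to expiring. A rough sketch of both ideas (not AlertBot's implementation):

```python
import ssl
import time

# Enforce a minimum TLS version on outbound checks; PCI DSS disallows
# early TLS (1.0/1.1), so pin the floor at TLS 1.2.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

def days_until_expiry(not_after, now=None):
    """Days remaining before a cert's notAfter date, as found in a peer
    certificate dict (format like 'Jan  1 00:00:00 2030 GMT')."""
    expires = ssl.cert_time_to_seconds(not_after)
    reference = now if now is not None else time.time()
    return (expires - reference) / 86400.0
```

A monitoring check could then alert when `days_until_expiry` drops below a chosen threshold, or when a handshake only succeeds at a disallowed TLS version.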

Other key usability improvements include:

  • New “Set Active” and “Pause” buttons that enable customers to select and change the status of a batch of monitors in a single operation.
  • The ability for customers to run a summary report for any monitor from the main menu.
  • Alert scheduling that is now more intuitive and easier to set up.

Added Mr. Pequeno: “With the surge in data breaches, PCI compliance standards are more important than ever. AlertBot’s enhanced monitoring capabilities help our customers ensure that the SSL aspects of this compliance commitment are always being met.”

About AlertBot

Founded in 2006, AlertBot, through its industry-leading TrueBrowser® solution, enables businesses to continuously monitor the availability and performance of their mission-critical public Internet services from across the country and around the world. When AlertBot detects an issue with a website or server, it analyzes the problem within seconds from multiple geographic locations and delivers real-time alerts to business leaders and system administrators on their smartphones and other mobile devices. Thousands of companies trust AlertBot to help them deliver the uptime and performance they expect, and their customers demand. Learn more at http://www.AlertBot.com.

About InfoGenius.com, Inc.

Founded in 1999 by a group of engineers, InfoGenius prides itself on building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Learn more at http://www.infogenius.com.

]]>
AlertBot Showdown: FedEx vs UPS https://www.alertbot.com/blog/index.php/2017/05/23/alertbot-showdown-fedex-vs-ups/ Tue, 23 May 2017 18:51:27 +0000 https://alertbot.wordpress.com/?p=414


A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying rectangular shipping boxes. Text reads "AlertBot Showdown: FedEx vs UPS" with the word SHOWDOWN very large at the bottom.

 

One of the most appealing things about ordering items online is receiving packages in the mail. Not only is it convenient for the fruits of your shopping toils to be brought directly to your door, but you can do your shopping from anywhere at any time of the day or night (and in your pajamas if you so desire). Two well-known, worldwide services that nearly everyone who has sent or received a parcel has used are UPS and FedEx. Both services are easily accessible for sending packages, and both are frequently used for receiving them. Both services also have websites that enable users to track their packages (if they’ve been given a tracking number), while also helping to provide resources for sending them out.

For our third Showdown, we set out to track the performance of these two services, trucking along until we could wrap up the results for delivery to you.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both parcel service sites for three weeks, spanning from March 27, 2017 to April 17, 2017. Not surprisingly, the performance proved to be reliable for both sites. Neither service’s site went down, but one did prove to perform a little faster than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

FedEx’s website experienced not a single, solitary failure event. At the very worst, it may have experienced some slight slowness for a short period of time, but it didn’t affect their overall reliability results. (FedEx 10/10)

UPS’s website was a slightly different story, though it too saw no failure events or periods of actual downtime. The most UPS’s site saw was a handful of warnings that it was performing a little slower than usual, and a little slower than the average expected load time. These periods of minor slowness lasted only about 3 to 5 minutes each. (UPS 9.5/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Both websites have pretty basic homepages, so the load times for customers should be fairly quick (even on a slow internet connection) if the sites aren’t experiencing any server issues.

FedEx’s site speed is fantastic, averaging less than 1 second on most occasions. Its fastest response time was recorded on Wednesday, April 5, 2017 at 0.5 seconds, while its slowest response time was on Monday, April 17, 2017 at just over 2 seconds (which is still very good). (FedEx 10/10)

UPS was also pretty good, but their best response time was about the same as FedEx’s worst response time. UPS’s best response time was 2 seconds on Tuesday, April 11, while their worst was on Monday, April 10th with just a hair under 6 seconds. The standard used to be 7 seconds, but these days, users expect sites to load in roughly 2 seconds.  (UPS 8/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart


Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

It’s also interesting to note that in most of these tests we’ve done for these Showdowns so far, California seems to frequently come out on top when it comes to website speed. With that said, FedEx seemed to perform best in California – at just under half a second, and performed the “worst” in Virginia, which still averaged around an impressive 1.1 seconds.  (FedEx 10/10)

UPS also saw its best results in California, but clocked in at around 1.4 seconds there. Texas returned the slowest results, however, averaging around 5.6 seconds. (UPS 8/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we went through the motions of ordering tickets for a recent movie on MovieTickets.com and Fandango.com. For this evaluation of FedEx and UPS, we’ll see how the experience of tracking a real package goes.

For each package tracking process, we started with having the tracking number copied onto our clipboard and then typed the URL of the test site into our browser.

From the point of typing www.FedEx.com into our Firefox browser, selecting the tracking tab at the top, pasting the tracking number into the search field on the left sidebar and clicking “Track,” it took only 15 seconds to get to the tracking results. That’s really fast! We then tried the same process in the Google Chrome browser, where a “region” had to be selected first this time; even so, it took only a second longer to complete!

Now, from the point of typing www.UPS.com into our Firefox browser, selecting the region, pasting the tracking number into the search field on the left sidebar and clicking “Track,” it took roughly 22 seconds to get to the tracking results. That’s not bad, but it’s clearly slower than our FedEx experience. We then tried the same process again using the Google Chrome browser and it took an impressive 12 seconds to complete!

So, with all things considered, with the goal being to track a package as quickly as possible, here are the Usability scores:

(FedEx 10/10)        (UPS 9/10)


Final Verdict

It’s a close match, to be honest, but we’d have to say that FedEx.com still outperformed UPS.com in the speed factor, delivering more than just highly anticipated parcels to its customers, but reliable website performance swiftly as well.

So, for the third AlertBot Showdown, the site that gets to join the ranks of previous winners Apple and Fandango is…

 

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "FedEx.com"

]]> Why is Website Performance Monitoring Necessary? https://www.alertbot.com/blog/index.php/2017/01/17/why-is-website-performance-monitoring-necessary/ Tue, 17 Jan 2017 11:00:04 +0000 https://alertbot.wordpress.com/?p=343 Photo of two hands holding a tablet horizontally with illustrations of graphs and icons floating off the face of the tablet.

Given that we live in a highly digitized world today, websites, blogs and web stores are now an essential component of any business and brand. While waiting for a site’s content to load can be annoying for a user, it can also be potentially disastrous for business.

That, however, is only one reason to monitor the performance of your website. Here are four more:

1.     Loss of Sales and Web-Traffic

First and foremost, businesses maintain websites and have web-stores to promote commercial growth. Now, imagine a situation where you’ve gone to a store and the service is impossibly slow. The salesmen and women are hardly making an effort to engage or help you and you just decide to take your business elsewhere. The same happens to a shopper when they visit a website that takes ages to load. Instead of making a sale, you lose web-traffic and potential customers. You can prevent this by monitoring how your website is performing.

2.     Potential Damage to Brand Image

Customers talk, and they are interested in what others like them have to say. While most brands depend on marketing ploys to promote sales, the importance of word-of-mouth advertisement cannot be discounted. If you leave a bad impression on one customer, chances are that word will spread about it, tainting, if not tarnishing, your hard-earned reputation and brand image. Who wants that?

3.     Error Detection

Website performance monitoring is the best way to prevent errors. It’s all too common for ecommerce sites to hit a snag and run into trouble.  If your site is regularly maintained and monitored, you’ll not only be able to fix a problem sooner; you might even be able to detect it beforehand and prevent it completely.

4.     Quality Maintenance

Just as quality assurance is essential for a physical store, it’s equally important for a website and web store. By using a performance testing and maintenance tool, software or application, you will be able to standardize and retain the quality of your website. Not only will that help preserve the website’s ranking on Google, it will also help drive online traffic. As it is, Google rankings are affected by even the smallest changes in website speed and downtime, which is a big part of why websites are search engine optimized in the first place.

So, if you’re even partially convinced that your website needs performance monitoring, why not start the AlertBot 14-day free trial, today?

]]>