web performance – The Official Blog
https://www.alertbot.com/blog/

Synthetic Monitoring: Frequently Asked Questions
https://www.alertbot.com/blog/index.php/2025/03/05/synthetic-monitoring-frequently-asked-questions/
Wed, 05 Mar 2025 18:32:56 +0000

[Title graphic: a laptop and scientists checking graphs and charts]

Synthetic Monitoring: Frequently Asked Questions

One of the most important features of a comprehensive, enterprise-grade web monitoring solution is synthetic monitoring. Below, we answer some frequently asked questions so you can clearly understand what it is, how it works, and why it’s essential rather than optional.

 

What is synthetic monitoring?

Simply put, synthetic monitoring is a method of simulating the journeys that visitors take on a website and then evaluating performance. The main purpose is to proactively identify errors or bottlenecks (so your team can fix them), including hard-to-find flaws that may be associated with variables such as browsers, devices, geography, or network. Synthetic monitoring is also ideal for improving and optimizing performance (i.e., making transactions and workflows faster and simpler).

 

What are some critical insights that synthetic monitoring can reveal?

Synthetic monitoring can provide answers to core questions such as:

  • How fast is our website response time at the moment?
  • Are all our complex transactions (e.g., filling out forms, adding items to carts, etc.) functioning correctly and optimally?
  • What areas of our website receive a limited amount of traffic, and is this normal and expected or a potential problem?
  • If we are experiencing a failure or slowdown, where exactly is it?

 

How does synthetic monitoring work?

Essentially, there are three steps to setting up and implementing synthetic monitoring:

  1. Create scripts that simulate visitor interaction and behavior.
  2. Collect data gleaned from scripts.
  3. Analyze collected information so you can fix identified problems and optimize performance.
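The three steps above can be sketched in code. The following is a minimal, illustrative Python sketch (not AlertBot’s implementation, and all names are hypothetical): each journey step is a named callable standing in for a real browser interaction, timings and outcomes are collected, and the results are analyzed against a slowness threshold.

```python
import time

# Step 1: a scripted "journey" is an ordered list of named actions.
# Each action below stands in for a real browser interaction
# (load a page, fill a form, submit an order) in an actual tool.
def run_journey(steps):
    """Run each step and collect timing/outcome data (step 2)."""
    results = []
    for name, action in steps:
        start = time.perf_counter()
        try:
            action()
            ok = True
        except Exception:
            ok = False
        results.append({"step": name, "ok": ok,
                        "seconds": time.perf_counter() - start})
    return results

# Step 3: analyze the collected data to surface failures and slowness.
def find_problems(results, slow_after=2.0):
    return [r["step"] for r in results
            if not r["ok"] or r["seconds"] > slow_after]
```

Feeding `run_journey` a list such as `[("load home page", fn), ("add to cart", fn)]` yields per-step timings, and `find_problems` flags any step that failed or exceeded the threshold.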

 

What is the difference between synthetic monitoring and journey monitoring?

They are the same thing, although generally the term synthetic monitoring is more common across leading web monitoring solutions.

 

How can synthetic monitoring help improve competitive advantage?

Synthetic monitoring is ideal for benchmarking performance against competitors. You can also use it to simulate traffic from different geographic locations to track APIs, SaaS products, etc. This can be especially helpful for identifying peak markets. And synthetic monitoring can help safeguard your business during times of anticipated traffic spikes (e.g., Black Friday, Cyber Monday, etc.), by alerting you of any problems right away so you can make sure your website doesn’t miss a beat.

 

How can synthetic monitoring help improve third-party vendor compliance and performance?

You can use synthetic monitoring to help ensure that SaaS vendors are meeting their Service Level Agreement (SLA) commitments.
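As a rough illustration of SLA verification, the sketch below (hypothetical helper names, not an AlertBot API) computes measured availability from a series of synthetic up/down checks and compares it against a vendor’s SLA target:

```python
def availability(checks):
    """Percentage of successful synthetic checks (True = service was up)."""
    if not checks:
        raise ValueError("no checks recorded")
    return 100.0 * sum(checks) / len(checks)

def meets_sla(checks, target_pct=99.9):
    """True if measured availability meets the vendor's SLA target."""
    return availability(checks) >= target_pct
```

For example, one failed check out of 1,000 yields 99.9% availability, which just meets a 99.9% SLA commitment.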

 

We are using, or thinking of using, real user monitoring. Is this sufficient?

In the distant past this was probably fine, but these days synthetic monitoring is far superior and widely recommended by experts. Here is why: real user monitoring (RUM) uses data collected from real users, rather than simulated interactions, to evaluate website performance in the moment and over time.

In theory, this is good. But in practice, it’s problematic because there can be use cases and workflows that visitors may not trigger, but nevertheless represent performance pitfalls and other vulnerabilities. The scope of synthetic monitoring is much wider and deeper, and it’s not limited to what visitors may or may not have done in the past, or are doing at the current time. RUM is a lake, while synthetic monitoring is an ocean.

 

How can we learn more about synthetic monitoring?

Easy! Just sign up for a FREE TRIAL of AlertBot. There is no billing information required, no installation, and you’ll be set up in minutes. And there is even better news!

AlertBot’s celebrated multi-step synthetic monitoring script recorder is simple and easy to use. Just click record, interact with your website (e.g., fill out forms, add items to your cart, etc.), and then upload your completed script to dive deep into the granular workflow details. You will clearly see what’s working and what isn’t, as well as what should be improved to optimize visitor experience. There is NO programming required!

 

Start your FREE TRIAL of AlertBot now: click here.

7 Deadly e-Commerce Checkout Sins
https://www.alertbot.com/blog/index.php/2023/07/11/7-deadly-e-commerce-checkout-sins/
Tue, 11 Jul 2023 19:25:24 +0000

[Image: a distressed woman sitting at a laptop, a tablet in one hand and her face in the palm of the other]

7 Deadly e-Commerce Checkout Sins

Back in the 1970s when bell bottoms roamed the world and 8-tracks reigned supreme, the Eagles warned us that Hotel California was a place where you could “checkout anytime you like, but you can never leave.”

Well, on the 21st century e-commerce landscape there is a similar dilemma facing customers who want to buy everything from gardening equipment to a new car: they can try to checkout anytime they like, but they can never buy.

Below, we highlight seven deadly e-commerce checkout sins that lead to lost sales and reputation damage:

 

  1. It takes too long to buy stuff.

Patience may be a virtue, but most customers aren’t in the mood to refine this noble characteristic when they’re ready to buy stuff. After all, they’ve already invested their valuable time choosing item(s). They want to cross this task off their to-do list right away. In fact, 70% of customers say that page speed impacts their willingness to buy from an online retailer.

  2. The process is too complex.

People who buy things online are intelligent and savvy. But that doesn’t mean they want to feel as if they’re putting together IKEA furniture when going through the checkout process. They want the experience to be straightforward and simple.  They just want to provide the required information — and nothing more. Less is definitely more.

  3. Not displaying a progress bar.

A progress bar tells customers where they are in the checkout process (e.g. cart summary, sign-in, address, shipping, payment), so they know that things are headed toward a satisfying, successful conclusion. Without this information, they can get irritated if they expect the next screen to say “thank you for your purchase,” but it’s yet another form to fill out.

  4. No ongoing form validation.

This one is tricky. Waiting for customers to get to the end of a form before telling them that they need to fix one or multiple fields can lead to an “I can’t be bothered with this, I’m getting out of here” reaction.

The best practice here is to configure form validation to scan and report as customers move from one field to another, or possibly one section to another (e.g. shipping address to payment information). Admittedly, some customers will still be irked by these “please fix the error” messages. But sending small notes as they move through the form/section is still better than forcing them to back up after they’ve reached the finish line.
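The per-field approach can be sketched as follows. This is an illustrative sketch only: the field names and rules are hypothetical, and a real checkout form would run equivalent logic in the browser as the customer leaves each field.

```python
import re

# Hypothetical per-field rules; a real checkout form defines its own.
FIELD_RULES = {
    "email": (re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
              "Please enter a valid email address."),
    "zip":   (re.compile(r"^\d{5}$"),
              "ZIP code must be 5 digits."),
}

def validate_field(name, value):
    """Validate one field as soon as the customer leaves it, instead of
    waiting for the whole form to be submitted. Returns an error message,
    or None if the field is fine (or has no rule)."""
    rule = FIELD_RULES.get(name)
    if rule is None:
        return None
    pattern, message = rule
    return None if pattern.match(value) else message
```

The point of the design is timing: the same rules could run at submit time, but reporting per field keeps each correction small.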

  5. No guest checkout.

For businesses, granular customer data can be far more valuable than an actual purchase. However, online sellers need to resist the temptation to force all customers to create an account before they can checkout. Otherwise, they are going to lose customers; not necessarily because those customers are reluctant to share their data, but because they just aren’t in the mood to pick a username and password, and then validate their email address.

With this in mind, sellers should provide incentives for customers to create an account by, for example, informing them that doing so will enable them to track order fulfillment, save time in the future, etc.

  6. Failing to EXPLICITLY mention all costs.

Customers hate discovering surprise costs at checkout. Ideally, sellers can avoid this problem entirely by having zero extra costs of any kind. But realistically, most sellers need to charge shipping/handling (at least until a threshold is met), and potentially other fees based on the item(s) being purchased, the location of the customer, and other factors.

The best way for sellers to deal with this is to make potential/inevitable extra costs explicit. Burying these details at the bottom of a page, and in font so tiny that customers need a telescope to read them, is more than worthy of a pair of Bad Idea Jeans.

  7. Dysfunctional buttons, fields and other elements.

Nothing screams “please don’t buy from us” louder than a checkout process where buttons, fields and other elements don’t work, or when customers are presented with a dreaded 404 Page Not Found (ironically, the funnier or more creative this page might be, the more incensed customers can get — as if the seller is shrugging off their pain and suffering).  Using a solution like AlertBot to automatically and continuously test page integrity — and proactively send alerts when something goes wrong or doesn’t work — is an absolute must.

The Bottom Line

The e-commerce landscape is fiercely competitive, and it typically takes much less for online customers to head for the virtual exits than it does for in-store customers to head for the physical exits. Online sellers need to ensure that they aren’t committing any of the seven deadly — and wholly preventable — e-commerce sins described above. Otherwise, instead of fostering engaged customers, they will trigger outraged ones.

A Closer Look at AlertBot’s Failure Reporting Feature
https://www.alertbot.com/blog/index.php/2023/02/21/a-closer-look-at-alertbots-failure-reporting-feature/
Tue, 21 Feb 2023 20:32:09 +0000

[Image: a man wearing a headset at a computer, in front of a screen resembling a NASA mission control terminal]

The year was 1995. Michael Jordan returned to the NBA. Amazon sold its first book. Windows 95 unleashed the era of taskbars, long filenames, and the recycle bin. And when people weren’t dancing the Macarena, they were flocking to see Apollo 13 and hear Tom Hanks utter the phrase that would launch millions of (mostly annoying) impersonations: “Houston, we have a problem.”

Thankfully, the eggheads in space and the eggheads on the ground worked tirelessly (and apparently smoked a whole lot of cigarettes) to get the crew home. But it was the pivotal moment when the failure was first reported that triggered the spectacular problem-solving process. If it happened an hour — or maybe even a few minutes — later, then the outcome could have been tragic instead of triumphant.

Admittedly, the brave, intrepid professionals in charge of keeping their organization’s website online and functional DON’T have to deal with life-and-death scenarios. But they DO need to deal with problems that, if left unsolved, will significantly damage competitive advantage, brand reputation and sales (immediately if we’re talking e-commerce, and eventually if we aren’t). And that’s where AlertBot’s failure alerting feature enters the picture.

What is Failure Alerting?

Failure alerting is when designated individuals — such as a SysAdmin, CTO, CIO, CEO, and so on — are proactively notified when something goes wrong with a website, such as downtime, errors, slowness, or unresponsive behavior.

As a result, just like in Apollo 13, the right people can take swift, intelligent action to fix things before visitors/customers sound the alarm bell, or worse, head out the (virtual) door and go straight to a competitor without looking back.

Notification Methods

AlertBot customers can choose any or all of the following methods to notify team members of a website failure event:

  • Email
  • Text Message
  • Phone Call

For example, a SysAdmin could receive an email, a text message, and a phone call the moment something goes wrong.

Automatic Escalation

Now, if we were in NASA Mission Control circa 1970, someone wearing really thick horn-rimmed glasses would rise above the cigarette smoke and ask: What happens if the SysAdmin doesn’t receive the email, text message, and phone call? It’s a good question, and there is an even better answer: don’t worry about it.

AlertBot’s failure reporting feature can be configured to escalate the website failure warning if certain individuals don’t respond within a specific timeframe. For example, if a SysAdmin is indisposed for any reason (driving, sleeping, etc.), then after two minutes the alert can be pushed to another designated team member such as the CTO. And if the CTO doesn’t respond within two minutes, then the alert can be pushed to the CIO, and so on.

Ideally, the individual (or individuals) who are sent the first alert receive it immediately and take rapid action. But if they don’t or can’t, then the alert is escalated accordingly. It is important to note that all of this happens automatically, so there is no possibility of human error.
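The escalation logic described above can be sketched as a simple function. This is an illustration only (the contact names and the two-minute interval are examples, and a real system would be event-driven rather than time-polled):

```python
def who_to_alert(chain, minutes_since_failure, escalate_after=2):
    """Return the contact who should currently hold the alert.

    chain: ordered contacts, e.g. ["sysadmin", "cto", "cio"].
    The alert moves one step down the chain every `escalate_after`
    minutes without an acknowledgment, then stays with the last contact.
    """
    if not chain:
        raise ValueError("empty escalation chain")
    hop = int(minutes_since_failure // escalate_after)
    return chain[min(hop, len(chain) - 1)]
```

With the example chain, the SysAdmin is alerted at minute zero, the CTO at minute two, and the CIO at minute four.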

Granted, none of this is as entertaining as watching Apollo 13. There’s no rousing soundtrack or Tom Hanks. Heck, there’s not even Kevin Bacon.

But when it comes to fixing website problems as quickly as possible, organizations know that the less drama, the better. That’s precisely what AlertBot’s multi-channel, auto-escalating failure reporting feature delivers. We don’t need an Oscar. We just need extremely satisfied customers — and we have a lot of those.

 

Next Up: Reviewing Failure Events Online

In our next blog, we’ll explore reviewing failure events online to pinpoint issues and detect problems. Stay tuned!

Launch a free trial of AlertBot’s acclaimed site uptime monitoring solution. No credit card. Nothing to download. Get started in minutes. And if you decide to purchase our solution, there are NO setup fees!

Just How Bad is a Down, Slow, or Dysfunctional Website? It’s Worse than You Think!
https://www.alertbot.com/blog/index.php/2022/09/22/just-how-bad-is-a-down-slow-or-dysfunctional-website-its-worse-than-you-think/
Thu, 22 Sep 2022 17:31:50 +0000

[Image: aerial view of a man with his hand on a laptop keyboard, the word “Waiting” and an hourglass on the screen]

Just How Bad is a Down, Slow, or Dysfunctional Website? It’s Worse than You Think!

Have you ever watched a movie (*cough* Godfather III) and said to yourself: “wow, this is so incredibly bad — I don’t think this can get worse!” But then it does. Much, much worse.

Well, having a down, slow, or dysfunctional website is similarly nightmarish — just when you think the reputation devastation is finally over, there’s more on the horizon. With apologies to Shakespeare: hell hath no fury like a customer scorned.

Not convinced? Here’s what happens to companies that get on the wrong side of their customers:

  • 61% of customers say they will switch to a new brand after one bad experience. (Zendesk)
  • 13% of customers who have a negative experience will tell 15+ people. (Esteban Kolsky)
  • $289 is the average value of every lost business relationship in the U.S. per year. (Neil Patel)

Scary stuff, huh? “But wait — there’s more!”

These days, many unhappy customers publish reviews to punish companies that fail to meet their expectations. But guess what? These eviscerating appraisals are not just seen by other potential customers (many of whom quickly decide against becoming actual customers). They are also seen by potential job candidates who are not enthusiastic about joining an organization that is used as target practice by denizens of the interwebs (everyone from THE ALL CAPS BRIGADE to the “tl;dr” force to the League of Extraordinary Grammarians).

However, just as all nightmares eventually come to an end (hey, even Godfather III mercifully rolls credits at the 2-hour-42-minute mark), there is something that companies can do to dial back — or better yet, eliminate — customer outrage caused by a down, slow, or dysfunctional website: get AlertBot.

AlertBot’s fully integrated monitoring platform monitors all your websites, web applications, mobile sites and services — all in one place. Unlike many other products in the marketplace, AlertBot doesn’t merely monitor a URL’s basic availability. It dives much deeper and monitors full page functionality using real web browsers in order to verify every page element, script, and interactive feature. As a result, you can proactively scan for errors, track and optimize load times, pinpoint issues, and get alerted to problems and failures.

The bottom line? A down, slow, or dysfunctional website can be so catastrophic that it makes Godfather III look like, well, Godfather I or Godfather II. Don’t hope for an Oscar just to win a Razzie. Get AlertBot and inspire your target audience to cheer vs. churn.

Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

How Site Uptime Impacts SEO (Hint: It’s a REALLY Big Deal)
https://www.alertbot.com/blog/index.php/2022/08/11/how-site-uptime-impacts-seo-hint-its-a-really-big-deal/
Thu, 11 Aug 2022 18:55:16 +0000

[Image: a woman’s hands pointing at Google in a web browser on an iPad]

How Site Uptime Impacts SEO (Hint: It’s a REALLY Big Deal)

It is arguably the most important 3-letter acronym on the digital marketing landscape. No, it’s not ROI. It’s SEO.

Clearly, effective SEO is extremely important. And for many businesses — especially smaller companies that are competing against big, established enterprises — it’s a matter of survival. However, for some decision-makers outside of the digital marketing world, the link between SEO and site uptime is less clear. Let’s fix that.

For Search Engines, it’s All About Relevance

 Realtors like to point out that the three most important factors in evaluating a property are: location, location, and location. Well, the big brains behind search engines like Google and (to a lesser extent) Bing and Yahoo are obsessed with: relevance, relevance, and relevance.

What this means is that when responding to a search query — anything from “tennis rackets” to “what’s this itchy red bump on my foot?” — search engines strive to produce results that searchers will see as relevant. Otherwise, searchers will eventually switch search engine brands (e.g. leaving Google and using Bing). Relevance is the glue that keeps the relationship sticky. And unlike with those glorious model airplanes that many of us failed to create when we were kids, in this case, the more glue the better.

Downtime Damages Relevance

 Since search engines strive to deliver relevant search results (and therefore positive user experience), it makes sense that downtime — which can be defined as a site being inaccessible or outright disappearing — is the enemy.

After all, if a searcher looking to buy a tennis racket clicks a site and discovers that it’s unavailable, then they won’t just punish the company that they hoped to engage: they will, in time, punish the search engine that pointed them in that direction. That fear keeps search engine folks awake at night (including mighty Google which commands more than 90% of the desktop and mobile search marketplace), and it explains why downtime is such a threat: it damages relevance.

Is 100% Uptime Absolutely Vital?

This warning about downtime raises an important question: do companies that want to stay far, far away from Google’s, Bing’s, and Yahoo’s penalty box have to ensure 100% uptime? Not necessarily. While uninterrupted availability is ideal, it is not realistic. Occasionally, a site will go down for a few seconds or perhaps even longer. There are a variety of reasons for this, such as problems with a web host, an unexpected spike in traffic, and ol’ fashioned human error (hey, we all make mistaks…er, mistakes).

However, the top priority for all businesses that want to win the SEO game must be to minimize site downtime in terms of both frequency and duration. They also need to know why site downtime occurs, in order to proactively address issues and keep them from recurring. And that is where site uptime monitoring enters the picture.

What to Look for in Site Uptime Monitoring

 There are many site uptime monitoring products in the marketplace, ranging from superficial (and usually free — hey, we get what we pay for), to robust and reliable. Obviously, organizations need to choose from among the latter and avoid the former. To that end, here is what to look for in a site uptime monitoring solution:

  • Full site monitoring. It is not enough to monitor a site’s basic availability — because that only confirms that it exists, not that it is actually functional. It is also necessary to verify each page element, script, and interactive feature, as well as scan for errors, track load times, and pinpoint problems.
  • Automated alerts delivered by phone, email and/or texts to specified individuals (e.g. CTO, IT Director, etc.), in the event of site downtime — so that the right people can take immediate action.
  • The ability to record and monitor multi-step web processes with real browsers (across desktop and mobile devices), including mouse clicks and keyboard interactions.
  • The ability to monitor any port on any server or device to ensure that it is up and running, including: ICMP, TCP, SMTP, POP3, IMAP, DNS, FTP, Telnet, and custom ports.
  • A range of performance reports including website load times, geographic performance, failure analysis, and more.
  • Ongoing monitoring from a large group of locations around the world, which is essential for avoiding false alarms and verifying that a site is truly down/unavailable.
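The last point, confirming an outage across locations before raising the alarm, can be sketched as a quorum check. This is a simplified illustration under assumed names, not how any particular product implements it:

```python
def confirmed_down(location_reports, quorum=0.5):
    """Treat the site as truly down only when more than `quorum` of
    monitoring locations agree (True = that location saw a failure),
    which filters out single-location false alarms."""
    if not location_reports:
        raise ValueError("no location reports")
    failing = sum(location_reports.values())
    return failing / len(location_reports) > quorum
```

If only one of three locations reports a failure, the site is not flagged as down; the failure is more likely a network issue local to that vantage point.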

And it goes without saying: a legitimate and reliable site uptime monitoring solution must be backed by a responsive team of experts who will immediately take ownership of an issue and see it through to resolution. This cannot be emphasized enough, because the only thing worse than site downtime is trying to get help from people who don’t know what they’re doing. It gets ugly in a hurry.

SEO is Here to Stay

The rules of SEO will change — this much is certain (Google tinkers with its algorithm hundreds of times a year). But what isn’t going to change for search engines is the supreme importance of delivering relevant results. This means effective site uptime monitoring is not an option. It is essential, and companies that fail to heed this wisdom will soon be expressing another 3-letter acronym: SOS.

AlertBot is a leading site uptime monitoring solution that offers ALL of the features and functions above, which is why it’s trusted by some of the world’s biggest brands. Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

Why Website UX “Edge Cases” Lead to Visitor Frustration — and What to Do About It
https://www.alertbot.com/blog/index.php/2022/02/21/why-website-ux-edge-cases-lead-to-visitor-frustration-and-what-to-do-about-it/
Mon, 21 Feb 2022 21:27:54 +0000

[Image: a mountain climber silhouetted against a deep blue sky, hanging off a cliff by one hand, a tether hanging from his belt]

Why Website UX “Edge Cases” Lead to Visitor Frustration — and What to Do About It

The year was 1993. Beanie Babies invaded the planet. Dinosaurs dominated cinemas worldwide when they escaped from Jurassic Park. Seinfeld won the Emmy for Outstanding Comedy Series (you might say that Jerry & co. were masters of their domain). And righteous rockers Aerosmith extolled the virtues of “living on the edge.”

A lot — and we are talking A LOT — has changed since 1993; especially that advice about living on the edge. Frankly, the last thing that companies want is for their website visitors to go anywhere near the edge, because they may fall off.

Edge Cases
What we are talking about here are “edge cases,” which refer to website UX pitfalls that are unlikely — but nevertheless possible. And when visitors experience one of these edge cases, it is not a matter of whether they will get mad: it is a question of how enraged they will become. Hell hath no fury like visitors thrust into a nasty edge case. Here are some examples:

  • A visitor incorrectly inputs their credit card data into a form, which causes the form to crash.
  • A visitor clicks or taps the search function, but without putting anything in the search field, which causes the website to hang.
  • A new website is launched and everything seems fine (there are a lot of fist bumps and “WE DID IT!” cheers among the development team), but there are sections of bad code that manifest hours, days, weeks, or even months down the road.

As a result of these negative experiences, visitors cannot move forward as both they and the company desire — or, to use a term from the UX world, their momentum on “The Happy Path” is thwarted. Fortunately, that is where synthetic monitoring enters the picture.

The Role of Synthetic Monitoring
Synthetic monitoring (sometimes referred to as journey monitoring) is a method of simulating and evaluating the various journeys that visitors take on a website: where they go, what they do, what buttons they press, what forms they fill out, and so on.

With synthetic monitoring, companies can proactively identify and address edge case scenarios, but without having to rely on excessive manual testing or live user monitoring. This is not only more efficient, but it exposes edge cases that would otherwise go undetected.

Ideally, addressing edge case scenarios means eliminating them entirely — such as fixing bad code. But at the very least, companies can put up signposts that point visitors in the right direction. For example, since there is no way to 100% guarantee that every visitor will correctly input their credit card number, a form can be modified to tell visitors when an input error has occurred.


AlertBot: Avoiding the Edge
AlertBot supports advanced and easy-to-use synthetic monitoring that helps companies run and evaluate various UX scenarios before their visitors do — and ultimately reduce edge cases. Hey, Aerosmith is welcome to live on the edge (who are we to criticize the group that brought us Guitar Hero?). But companies that want to drive visitor engagement — and prevent frustration — should live as far away from the edge as possible.

Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

“Frodo, We Aren’t in the Shire Anymore”: The Importance of a Customer Journey & How to Avoid Wrecking It
https://www.alertbot.com/blog/index.php/2021/07/29/frodo-we-arent-in-the-shire-anymore-the-importance-of-a-customer-journey-how-to-avoid-wrecking-it/
Thu, 29 Jul 2021 20:35:38 +0000

[Image: the Shire film set from Lord of the Rings, showing the front door of Bilbo’s home]

“Frodo, We Aren’t in the Shire Anymore”: The Importance of a Customer Journey & How to Avoid Wrecking It

by Louis Kingston

Fans of Lord of the Rings — otherwise known as “Ringers” — never grow weary of reading or watching Frodo and his fellow Hobbits journey through Middle Earth on an epic quest to Mordor (where rumor has it there now exists a very stylish Starbucks at the base of Mount Doom).

Well, customers who visit a website are on an important journey as well. Granted, it doesn’t involve saving the world from evil entities that never sleep.  But it does involve achieving objectives that, ultimately, culminate in a sale — whether that happens on the same visit (e-commerce) or weeks down the road (B2B). And that brings us to the customer journey map.

The customer journey map is a visual tool that enables businesses to identify where, when and how customers engage their brand — and make the trek from curious prospects to enthusiastic brand ambassadors. There are five phases on the journey:

  1. Awareness: customers in this phase know that they have a problem (although they may not yet know the full scope or severity of the problem), and they are in the process of conducting research. It is possible for visitors to go straight into a conversation via phone, video or chat. However, research has found that, on average, customers access 3-5 pieces of content before speaking (or texting) with a sales rep. This content can include videos, articles, ebooks, white papers, checklists, and so on.
  2. Evaluation: customers in this phase are aware of their problem and are comparing different options. Content such as case studies, testimonials and demos are highly influential here. However, nothing beats a free trial.
  3. Decision: customers in this phase have put together a shortlist (in a formal manner as is usually the case with B2B sales, or more casually as is often the case with B2C/e-commerce sales) and are ready to make a decision.
  4. Retention: customers in this phase have made a purchase, and now the goal is to keep them on the roster. The importance of retention cannot be overstated. It costs 5x more to acquire a new customer than it does to nurture an existing customer, and increasing retention by just 5% can boost profits by 25-95%. Tactics that can help foster a strong, loyal relationship include special offers, follow-up outreach (via email, app, phone, video, or in-person for B2B), customer satisfaction surveys, etc.
  5. Advocacy: customers in this phase are happy and willing to recommend a company to their professional and personal network, as well as members of their various social media communities (e.g. Facebook and LinkedIn groups, etc.). Research has found that 83% of B2C customers and 99% of B2B customers are influenced by various types of word-of-mouth marketing.

In theory, the customer journey is straightforward. However, in practice — and just as Frodo & Co. discovered — the quest can have many twists and turns. No, there aren’t any orcs, hobgoblins or balrogs along the way, but there are some dangerous foes that include:

The bad news? Any one of these is enough to send customers heading straight for the exit, never to return. The good news? AlertBot’s leading solution continuously monitors for ALL of these from multiple locations around the world — and proactively notifies key individuals (e.g. CIOs, CTOs, SysAdmins, etc.) when a problem occurs.

You could say that AlertBot is leading-edge technology worthy of Gandalf, and yet so intuitive and easy-to-use that Pippin Took could manage everything (even after having a few pints at the Prancing Pony).

See for yourself by starting your free trial of AlertBot now.   

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

3 Reasons Why It’s a Bad Idea to Buy Site Monitoring from Your Web Host
https://www.alertbot.com/blog/index.php/2020/08/18/3-reasons-why-its-a-bad-idea-to-buy-site-monitoring-from-your-web-host/
Tue, 18 Aug 2020 17:16:21 +0000

[Image: server racks flanking a laptop whose screen shows a cloud graphic with an “X” over it]

3 Reasons Why It’s a Bad Idea to Buy Site Monitoring from Your Web Host

by Louis Kingston

For baseball pitchers, the two most glorious words in the English language are “perfect game.” For actors, it’s “Oscar win” (forget all that nonsense about how “it’s an honor just to be nominated.”). For school-aged kids, it’s “snow day.” And for businesses, of course, it’s “captive audience.”

Indeed, it doesn’t matter how compelling or clever a marketing and advertising campaign might be. If audiences don’t take notice and pay attention, it may as well not exist. And if you doubt this, think of the last time you sat through 20 minutes of movie trailers — not because you wanted to, but because there was nowhere else to go (at least, not without saying “excuse me…” 10 times as you painfully twisted and squirmed your way past annoyed fellow moviegoers).

Why does this matter? It’s because your web host is singing from the captive audience songbook when it repeatedly urges you to add site monitoring to your existing hosting package. At first glance, this may seem like a good idea. After all, you know that site monitoring is important. Why not just grab it from your web host, the same way you grab a side order of fries from a fast food restaurant? Well, here’s why not:

  1. Lack of Specialization

Your web host doesn’t specialize in site monitoring, which means they aren’t using the latest technology or hiring the most qualified professionals. Just as you wouldn’t want your doctor to sell you a timeshare during an exam (“You know what might help that bronchitis? Two weeks a year in a sunny and warm Florida condo, as you can see from this lovely brochure”), you don’t want your site monitoring company to do anything but site monitoring. It’s not something anyone should be dabbling in.

  2. Lack of Service Offering

When web hosts offer site monitoring, they typically focus on uptime. But site monitoring isn’t just about letting you know when your site goes dark. It’s also about making sure that your site is performing the way it’s supposed to — which means that all elements are functional (e.g. buttons, forms, multi-step processes, etc.), and all pages are loading rapidly. Without this critical information, you may believe that everything with your site is fine and all lights are green; that is, until you begin hearing from irate customers and start losing sales.

  3. Potential Conflict of Interest

Last but not least, your site host is supposed to meet an uptime standard as part of their service commitment. But if that same host is also monitoring your site performance, they may be less inclined to be completely transparent if they fall below this standard. And if they did fudge some of the numbers, how would you even know? With this in mind, are we saying that all hosts that offer site monitoring are unethical? Absolutely not. Are we saying that there is an inherent conflict of interest that should be at least a little troubling? You bet.


The Simple, Smart Solution

The best (and really, the only) way to solve this problem is to avoid it completely — which means not buying site monitoring from your host, and instead getting it from a proven, reputable vendor that:

  • Specializes in site monitoring — it’s all they do 24/7/365.
  • Offers both uptime monitoring and comprehensive performance monitoring — not just the former.
  • Has zero conflict of interest telling you the truth, the whole truth, and nothing but the truth regarding how your site is doing.

Ready to safeguard and strengthen your business with world-class, surprisingly affordable site monitoring? Then you’re ready for AlertBot! We check all of these boxes, and are trusted by some of the world’s biggest companies. Start your free trial now.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
The (Not-So-Magnificent) 7 HTTPS Errors that Infuriate Customers and Ruin Reputations https://www.alertbot.com/blog/index.php/2019/11/19/the-not-so-magnificent-7-https-errors-that-infuriate-customers-and-ruin-reputations/ Tue, 19 Nov 2019 19:13:39 +0000 https://alertbot.wordpress.com/?p=650 A graphic with a bright orange background and a cartoonish illustration of a man with glasses sitting at his desk facing his computer looking angry. Next to this graphic are the numbers "404" and text "Oops... page not found." Article title above it reads "The (Not-So-Magnificent) 7 HTTPS Errors that Infuriate Customers and Ruin Reputations"

The (Not-So-Magnificent) 7 HTTPS Errors that Infuriate Customers and Ruin Reputations

by Louis Kingston

In the classic flick The Magnificent Seven, a pack of essentially decent but “don’t you dare park your horse in my spot or else you’ll get your spurs blasted” gunslingers come together to rid a village of some nasty bandits. There’s action. There’s drama. There’s tragedy. There’s humor. There’s romance. There’s Steve freakin’ McQueen. What’s not to love?

Well, on the dusty and dangerous internet landscape, instead of a magnificent seven to save the day, there exist seven not-so-magnificent HTTP errors that are impossible to like, let alone love. Why? Because they block visitors from reaching websites — which leads to lost customers and wrecked reputations.

Here’s a look at the reprehensible HTTP errors that have their picture on Most Wanted Lists in every post office from Tombstone to Dodge City:

403 Forbidden: The 403 Forbidden error means that the server is absolutely refusing — no ifs, ands or buts — to grant permission to access a resource, even though the request itself is valid. Common causes include missing index files and incorrect .htaccess configuration.

404 Not Found: The 404 Not Found error means that a web page or other resource can’t be found because it simply doesn’t exist at that address. Common reasons for this include a broken link, a mistyped URL, or that someone moved or deleted a page and didn’t set up a redirect (which happens a lot).

408 Request Time Out: The 408 Request Time Out error means that the server got tired of waiting for the browser to finish sending its request, and after a while, just throws in the towel. Often, this is because the connection is slow or the server is overloaded.

410 Gone: Whereas (as noted above) a 404 error implies that there might be some hope — i.e. the target file might be somewhere, just not where it’s supposed to be — the 410 Gone error snuffs out any possible optimism. It’s totally, completely and permanently gone.

500 Internal Server Error: The 500 Internal Server Error means that the server cannot process a request for any number of reasons, such as missing packages, misconfiguration, and overload.

503 Service Unavailable: The 503 Service Unavailable error means that the server is either down because of maintenance, or because it’s overloaded. Either way, the server is conjuring up its inner Gandalf and screaming: “YOU SHALL NOT PASS!”

504 Gateway Time-Out: The 504 Gateway Time-Out error means that a server acting as a gateway or proxy didn’t get a timely response from the upstream server it depends on. After a while, the gateway gets the message that no answer is coming, and says “Oh yeah? Well, I don’t need you either!”
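As a rough illustration of how a monitoring script might treat the seven status codes above, here is a minimal Python sketch. The function name, messages, and logic are our own illustration, not AlertBot’s actual implementation:

```python
# Hypothetical sketch: mapping the seven "not-so-magnificent" status codes
# to the plain-English meanings described in this article.
HTTP_ERROR_MEANINGS = {
    403: "Forbidden: valid request, but the server refuses access",
    404: "Not Found: the resource does not exist at this URL",
    408: "Request Timeout: the server gave up waiting for the request",
    410: "Gone: the resource was removed permanently",
    500: "Internal Server Error: the server hit an unexpected failure",
    503: "Service Unavailable: down for maintenance or overloaded",
    504: "Gateway Timeout: an upstream server did not respond in time",
}

def classify(status_code: int) -> str:
    """Return a human-readable alert message for an HTTP status code."""
    if status_code < 400:
        return "OK: no error"
    return HTTP_ERROR_MEANINGS.get(status_code, f"Unhandled error {status_code}")
```

A real monitor would fetch each page on a schedule, run the response code through a classifier like this, and trigger a notification for anything at or above 400.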

Calling in the Marshal
The bad news is that these reprehensible HTTP errors, if left unchecked, can cause a lot of damage. Indeed, few things irk and offend website visitors more than seeing an error code. But the good news is that you can call in the Marshal — a.k.a. AlertBot — to restore law and order.

AlertBot constantly scans your site’s pages to watch out for these and other HTTP errors. If and when they are detected, authorized employees (e.g. webmasters, sysadmins, etc.) are proactively notified so they can take swift action and fix the problem.

It’s lightning fast, always reliable, and as smooth as Steve McQueen. Dastardly, good-fer-nuthin’ HTTP errors don’t stand a chance!

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
AlertBot Showdown: Aeropostale vs GAP (The Final Showdown) https://www.alertbot.com/blog/index.php/2019/10/08/alertbot-showdown-aeropostale-vs-gap/ Tue, 08 Oct 2019 18:54:23 +0000 https://alertbot.wordpress.com/?p=636 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying shopping bags. Text reads "AlertBot Showdown: Aeropostale vs GAP" with the word SHOWDOWN very large at the bottom.
When you think of trendy, casual clothes, names like GAP, Aeropostale and Abercrombie are likely to be among the retailers that come to mind. While many of the brands we’ve come to know and trust over the years still maintain brick and mortar stores, all of them have had to make the transition to having a presence online in the wonderful digital world we call “ecommerce.”

Shopping for clothes in person is an entirely different experience than shopping online (where shoppers can only guesstimate how a purchase may look or fit in real life), but we wanted to evaluate the online shopping reliability of two of these brands when it comes to the world wide web and their individual website performance.

To test their web performance quality, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both GAP.com and Aeropostale.com from August 4th through August 18th, 2019. (We originally planned to evaluate Abercrombie.com instead of Aero, but the site produced so many errors that we decided to monitor a different company’s site.)

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both Aero’s and GAP’s sites achieved over 99% uptime. Neither saw significant downtime, which is expected, but each one experienced some sluggish speeds and even load time timeouts on a couple of occasions.

Aeropostale.com experienced 99.64% uptime, with over 20 errors recorded due to slow load times or brief periods of unresponsiveness. None of these events lasted longer than a couple minutes, however, and none of them amounted to any significant downtime. Because of this, we still consider their performance to be pretty good.  (Aeropostale.com 8/10)

GAP.com experienced fewer issues, but struggled with some significant slowness on August 9th, resulting in 99.50% uptime. Otherwise, they would have had an overall stronger performance than Aero during this time period. (GAP.com 8/10)
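For perspective, an uptime percentage maps directly to minutes of downtime over the monitoring window. A quick back-of-the-envelope sketch (treating the test period as roughly 14 days is our own assumption based on the dates above):

```python
# Back-of-the-envelope sketch: converting an uptime percentage into the
# minutes of downtime it implies over a monitoring window.
def downtime_minutes(uptime_percent: float, window_days: float) -> float:
    """Minutes of downtime implied by an uptime percentage over a window."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1 - uptime_percent / 100)

# Aeropostale's 99.64% over ~14 days works out to roughly 73 minutes down;
# GAP's 99.50% works out to roughly 101 minutes.
```

This is why a fraction of a percent of uptime matters more than it looks: over two weeks, the half-point gap between 99.98% and 99.50% is well over an hour of unavailability.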

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring. We calculate the speed as an overall average across all locations during the time span selected for this Showdown.
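The averaging described above can be sketched in a few lines. The location names and timings below are made-up illustration values, not actual Showdown data:

```python
# Simplified sketch of the speed calculation: pool per-location load-time
# samples (in seconds) and report one overall mean across all locations.
def overall_average(samples_by_location: dict[str, list[float]]) -> float:
    """Mean load time across every sample from every location."""
    all_samples = [t for times in samples_by_location.values() for t in times]
    return sum(all_samples) / len(all_samples)

# Hypothetical samples from two monitoring locations:
samples = {
    "Nevada": [2.4, 2.6],
    "Missouri": [6.1, 6.3],
}
print(round(overall_average(samples), 2))  # 4.35
```

Note that one chronically slow location can drag the overall average well above what most visitors experience, which is why the per-region breakdown later in this article is worth reading alongside the single headline number.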

When it comes to page load times, Aeropostale performed respectably, but with about twice the load time of GAP’s site. Their best day, on average, was Monday, August 5th with 6.1 seconds. Their worst day, on average, was Thursday, August 15th, with 6.8 seconds. The site’s overall average speed across the entire test period was 6.97 seconds, which isn’t terrible, but it also isn’t much to brag about. However, one thing certainly gleaned from these results is that Aero’s site is relatively consistent across the board with regard to its speed. (Aeropostale.com 7/10)

As teased above, GAP.com performed about twice as fast as Aeropostale.com did. Their best day, on average, was Sunday, August 4th with 2.4 seconds. That’s a pretty decent load time. GAP.com’s worst averaged day was Friday, August 9th, at 3.35 seconds, which is still almost half the time of Aero’s best day. The site’s overall average speed across the entire test period was 2.8 seconds, which is rather impressive. (GAP.com 8.5/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. For this portion of the test, we compare the overall average speeds of each individual location captured during the selected period of time for this Showdown.

When it comes to geographic performance, it seems safe to say that Aero’s site is all over the map. They performed best in North Carolina at an average of 2.6 seconds, with Nevada in second at 3 seconds and Oregon third at 3.1 seconds. Those times are not bad at all. However, their slowest time was a dismal 13.3 seconds (ouch!) in Missouri, followed by 13 seconds in California, and Washington DC in third place at 12.1 seconds. (Aeropostale.com 7/10)

GAP.com also saw some drastic differences on either side of the scale, but not nearly as substantial a difference as Aero’s. Their fastest average performance was seen in Nevada, at 1.7 seconds. Oregon came in second at 1.7 seconds, and Virginia was third at 1.8 seconds. Missouri was once again at the bottom of the proverbial bargain bin with 6.3 seconds, followed by Colorado at 5.21 seconds and Texas at 5.17 seconds. Still, GAP’s geographically slowest times look like Aero’s overall fastest times, which is rather disappointing. (GAP.com 8.5/10)

Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to see if we can find a nice sweater (since we’d love to cozy up in this fall weather) and add it to our cart.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.GAP.com into our Chrome browser, it took 39.10 seconds and 8 clicks to get a sweater into the shopping cart and begin the checkout process. GAP had two pop-ups about coupons and joining their mailing list, and it took a few clicks to get around those. Then we navigated to the Men’s section, selected the first long sleeve crewneck we found and added it to the cart. (And hey, it’s 40% off, too. Woohoo!)

For www.aeropostale.com, it took 6 clicks and 35 seconds to browse their fall collection, snag a thermal hoodie tee, add it to the cart, and click checkout (and hey, the price was about half-off, too!).

Honestly, both sites are pretty nice, easy to use, and straightforward. The pop-ups on GAP.com were a bit annoying, especially with there being two of them, but it’s tough to gripe about getting offered coupons to save money when you’re shopping. Aero’s site felt just a smidge more inviting, like you’re browsing a tangible catalog, and it seemed to offer quite a few options up front.

All things considered, our Usability scores are:

(Aeropostale.com 9/10)
(GAP.com 9/10)

 

Verdict

Both sites performed respectably, but when it comes to speed, one definitely outperformed the other—and the positive usability experience is just gravy. So, we’re pleased to announce this Showdown champion to be…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Gap.com"

]]>
4 Common Causes of Cart Abandonment — and How to Solve Them https://www.alertbot.com/blog/index.php/2019/09/05/4-common-causes-of-cart-abandonment-and-how-to-solve-them/ Thu, 05 Sep 2019 21:27:52 +0000 https://alertbot.wordpress.com/?p=626

Image of a shopping cart with green trim set against a white wall. Text on the image reads "4 Common Causes of Cart Abandonment — and How to Solve Them"

4 Common Causes of Cart Abandonment — and How to Solve Them

by Louis Kingston

It’s a sad story that has become so common that it just kind of blends into the background — like that awful elevator jazz that some coffee shops play (Thelonious Monk would NOT approve), or economy class in-flight meals (there’s less sodium on a salt lick, and you don’t get rammed in the ankle by a cabin trolley). Alas, we’re talking about the cart abandonment epidemic.

And epidemic is indeed the right word, because this problem is not local or limited. Forrester Research pegs the number of customers who bid adios to their cart at 87%, with 70% of them choosing to do so just before checkout. Overall, $18 billion worth of products each year are left to languish in digital trolleys.

Here are four common and costly cart-based reasons why customers flee the sales funnel, rather than triumphantly complete the buyer’s journey:

  1. Unexpected costs.

Customers don’t merely dislike unexpected costs like shipping, or nebulous “handling” fees (what, are people buying plutonium or something?). They absolutely hate them. There might even be a clinical psychological aversion to this called “unexpectedcostphobia.”

The solution: be transparent about all automatic or potential costs by advertising a clear and realistic estimate, providing a delivery calculator on the home page (not buried at the end of the checkout process), and if possible, offering free shipping for a minimum purchase.

  2. Obliging customers to create an account.

A decade or two ago, customers didn’t mind creating an account to purchase something online, simply because they didn’t know there was any other way. It was part of the deal, like the turning of the earth or standing in line for longer than you should at the post office. It’s going to happen.

But now, customers have enjoyed a taste of the guest checkout experience — and many of them love it; especially if they’re suffering from security fatigue and wince at the idea of remembering more login credentials. Naturally, e-commerce sites that fail to cater to this preference set themselves up for plenty of cart abandonment.

The solution: if creating an account is mandatory, make the process as simple and fast as possible (and then make it even simpler and faster). In addition, give customers an incentive to create an account such as a discount offer, special gift, or anything else that has value and isn’t going to lead to a bankruptcy filing.

  3. Long and winding checkout process.

In 1970, The Beatles sang about the “Long and Winding Road” and scored yet another U.S. Billboard #1 hit. However, e-commerce sites that have a long and winding checkout process aren’t going to be certified platinum. They’re going to be certified terrified, because cart abandonment rates will be far higher than their competition.

The solution: ruthlessly streamline the checkout process to the bare minimum, and use as few fields as possible. Yes, collecting as much glorious customer data as possible is important — but it’s not as important as getting customers on the roster in the first place.

  4. Bugs, bugs and more bugs.

Even entomologists don’t like website bugs and other completely preventable technical errors that make online shopping irritating instead of enjoyable. Even one of these bugs is enough to trigger cart (and brand) abandonment — let alone a bunch of them.

The solution: use a reputable third-party platform to constantly monitor all important web pages and multi-step processes — such as login, signup, checkout and so on — to proactively detect and destroy bugs, or anything else that makes customers miserable like slow page loading. Learn more about this here.


The Bottom Line
Completely eliminating cart abandonment isn’t possible, because there will always be customers who pause or stop the purchase process. But solving all of the problems described above significantly increases the chances that both carts and customers will get to the finish line, and be inspired to come back for more. And isn’t that the whole point?

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes https://www.alertbot.com/blog/index.php/2019/07/22/if-you-build-it-they-wont-come-5-big-scary-and-costly-e-commerce-site-mistakes/ Mon, 22 Jul 2019 06:55:52 +0000 https://alertbot.wordpress.com/?p=623 Photograph of a corn field set against a bright blue sky. Test on it reads "If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes"

If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes

by Louis Kingston

In the 1989 flick Field of Dreams, Kevin Costner turns his Iowa cornfield into a baseball field because a voice tells him: if you build it, he will come. The “he” in question is his late father, and the movie has a magical, uplifting ending that makes us want to dream again (and possibly, play baseball or eat some corn).

Well, many folks who launch e-commerce sites also believe that: if I build it, they will come. This time, “they” means throngs of happy, profitable customers. Except…they don’t. And before long, the site is forced to scale down or shut down. Even writing to Kevin Costner doesn’t help — even if you promise to watch a double feature of The Postman and Waterworld (not recommended without a physician’s approval).

The bad news is that this kind of misery happens all the time. The good news — actually, make that the amazing, glorious, Field-of-Dreams-ending-like news — is that preventing this doom and gloom is largely a matter of avoiding these five big, scary and costly e-commerce site mistakes:

  1. Lousy UX

Tiny buttons that are impossible to click on a mobile device without a magnifying glass and hands the size of a Ken doll. Search functions that neither search nor function. Elusive top level categories. Gigantic banners that pop open and chase customers around from page to page, like a kind of online shopping Terminator (“I’ll be baaaaaack!”). These are just some of the many ways that lousy UX destroys e-commerce sites.

The remedy? Monitor all pages and multi-step processes (e.g. login areas, signups, checkout, etc.), to identify bottlenecks where customers routinely encounter errors or unresponsive behavior, and fix any gaps and leaks right away. Learn more about doing this here.

  2. S…l…o…w…n…e…s…s

Just how vital is speed? Behold these grisly statistics:

  • A one-second delay in load time can send conversion rates plunging by seven percent. (Source: Kissmetrics)
  • 70% of customers say that a website’s loading time affects their willingness to purchase. (Source: Unbounce)
  • As page load time increases from 1 second to 3 seconds, the probability of bounce increases by 32%; from 1 second to 5 seconds, by 90%; and from 1 second to 10 seconds, by 123%. (Source: Google)
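The statistics above can be turned into a rough revenue-impact estimate. This is a hypothetical sketch: the compounding model and the dollar amounts are illustration assumptions, not figures from the sources cited above — only the 7%-per-second conversion drop comes from the Kissmetrics statistic:

```python
# Rough revenue-impact sketch: compound a 7% relative conversion drop
# (the Kissmetrics figure) for each second of added page-load delay.
def revenue_after_delay(baseline_revenue: float, delay_seconds: float,
                        drop_per_second: float = 0.07) -> float:
    """Revenue remaining after compounding a per-second conversion drop."""
    return baseline_revenue * (1 - drop_per_second) ** delay_seconds

# e.g. a hypothetical site earning $100,000/month that slows by 2 seconds:
print(round(revenue_after_delay(100_000, 2)))  # 86490
```

In other words, under these assumptions, two extra seconds of load time would cost a $100,000/month store roughly $13,500 a month.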

The remedy? Be ruthless about making your e-commerce site as fast as possible (and then make it even faster). Here are the usual suspects: bloated HTML, ad network code, unoptimized images, and transmitting private data over public networks. There are other culprits, but look here first — you’ll be amazed at how much speed you unleash.

  3. Not Focusing on SEO — or Focusing too Much on SEO

Let’s talk about health. Some people have poor health because they don’t exercise at all. Their daily calisthenic routine involves digging in the couch for the remote. And then on the other end of the spectrum, there are people who work out too much — we’re talking extremely unhealthy levels. You know the type.

The same phenomenon occurs in the e-commerce world when it comes to SEO. Some sites don’t focus on SEO, which means they aren’t going to get found by the 35% of customers who start their buyer’s journey from Google. And some focus so much on SEO that they neglect other channels and tactics — including good, old-fashioned pure promotion.

The remedy? Definitely make SEO part of the visibility strategy. But don’t make it the end-all-and-be-all of online existence. It’s important, but it’s not everything.

  4. Bad Customer Service

Customer service is as important in the online world as it is in the brick-and-mortar world, and in some cases it’s even more important, because exiting the buyer’s journey is so simple — as is writing a scathing zero-star review that would have made Roger Ebert wince. Unfortunately, many e-commerce sites treat customer service as an afterthought or a necessary evil, rather than an asset that should be leveraged to optimize customer experience and generate loyalty.

The remedy? Make customer service — characterized by the ease, speed, and quality of responsiveness and resolution — a big part of the plan. It’s not an expense, but an investment.

  5. Lack of Original, Compelling Content

E-commerce sites aren’t vending machines, yet many of them seem to take their inspiration from these handy contraptions that dispense candy and soda in exchange for money and the push of a button (be careful you don’t press the wrong one — you might end up with that oatmeal cookie that has been there since 2007, and not the Snickers bar that you’re craving).

However, most customers — even those who are very focused on getting a specific item, like a pair of sneakers, a smartphone, or a hotel room — want and expect to access relevant information to help them make a safer, smarter purchase decision. This could be videos, infographics, social proof (e.g. testimonials, reviews, case studies, etc.), articles, blog posts, and downloadable assets like ebooks, checklists, and so on.

The remedy? Don’t skimp on creating original, compelling content. As a bonus, this will help with SEO and can connect you with profitable customers who are not in your primary target market.

The Bottom Line

Competition on the e-commerce landscape for the hearts, minds, and indeed, wallets of customers is ferocious. Avoiding these mistakes will go a long, long way to helping your e-commerce site survive and thrive.

You may even make enough profit to retire early, buy a cornfield in Iowa, and then turn it into a baseball field that inspires the feel-good movie of the year. Hey, it worked once before, right?

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
AlertBot Showdown: VIVE vs Oculus https://www.alertbot.com/blog/index.php/2019/06/27/alertbot-showdown-vive-vs-oculus/ Thu, 27 Jun 2019 19:48:56 +0000 https://alertbot.wordpress.com/?p=611 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are wearing Virtual Reality head sets and holding the controls. Text reads "AlertBot Showdown: Oculus vs Vive" with the word SHOWDOWN very large at the bottom.

As technology continues to morph and change with the times, the virtual reality experience keeps becoming more widespread and immersive. Two of the leading brands in the VR game are unmistakably VIVE (HTC) and Oculus. Both companies are leaders in the ever-expanding digital world of virtual reality, with both having released or having plans to release new headset models this summer.

While these brands may corner the market on connecting to the virtual realm, we wondered how they stack up when it comes to the world wide web and their own individual website performance.

To test their web performance quality, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both VIVE.com and Oculus.com from May 1st through May 22, 2019. Given the high regard in which these companies are held because of their products, we expected their web performance to be strong.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both VIVE’s and Oculus’s sites did perform quite well. Neither saw significant downtime, but each one experienced some sluggish speeds and even load time timeouts on a couple rare occasions.

VIVE.com experienced 99.91% uptime, with just a few errors recorded due to slow load times. None of these events lasted longer than a couple minutes, and none of them amounted to any significant downtime. Because of this, we still consider their performance to be quite solid.  (VIVE.com 8/10)

Oculus.com performed similarly with 99.98% uptime and similar slow page load errors that didn’t amount to significant downtime but at least put a minor hiccup in their performance. They experienced about a quarter as many of these errors as VIVE, so they came out just a tiny bit more on top. (Oculus.com 8.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring. We calculate the speed as an overall average across all locations during the time span selected for this Showdown.

The speeds for both websites were also relatively close to each other. VIVE.com’s best day, on average, was Monday, May 13 at 3.2 seconds, which isn’t bad. Their best time of day, however, was on Tuesday, May 21 at 5am with 1.6 seconds. That’s definitely better, although it’s doubtful that they see a high volume of traffic at that hour of the morning. VIVE.com’s worst averaged day was Thursday, May 23rd at 5.1 seconds. However, their worst time was on Wednesday, May 22nd at 2pm with a much less admirable 8.8 seconds. The site’s overall average speed across the entire test period was 3.78 seconds. (VIVE.com 8/10)

Oculus.com performed very similarly. Their best day on average was Thursday, May 2nd with 3.7 seconds. Their best response time was at 9am on Wednesday, May 15 with 2.05 seconds. Oculus.com’s worst averaged day was also (like VIVE’s) Thursday, May 23rd at 4.37 seconds (although that’s slightly better than VIVE’s worst). However, their worst time of day was on Wednesday, May 1st at 6am with 7.49 seconds (making their slowest time a full second faster than VIVE’s slowest). The site’s overall average speed across the entire test period was 3.96 seconds (just a smidge slower than VIVE’s). (Oculus.com 8/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. For this portion of the test, we compare the overall average speeds of each individual location captured during the selected period of time for this Showdown.

Previously, California had reigned supreme as the fastest state in the U.S. But lately, other states have been stepping up, dethroning The Golden State. This time, North Carolina wins (for both sites), with VIVE.com moving at a breezy 1.69 seconds in The Old North State. Oregon came in second at 1.8 seconds, with Arizona at 2 seconds. Comparatively, Washington state saw the slowest speed, coming in at a shameful 10.9 seconds, with Washington DC in second at 7.55 seconds and Texas in third at 7.43 seconds. (VIVE.com 8/10)

Oculus.com was also under two seconds with 1.9 seconds in North Carolina. Their second and third fastest were Nevada at 2.2 seconds and Oregon at 2.3 seconds. Overall, they were pretty close to VIVE. However, while Oculus’s slowest location fared better than VIVE’s, its second and third slowest were a little worse: Washington, DC came in at 8.66 seconds, then Washington state at 8.65 seconds, and Texas at 8.55 seconds. For the most part, though, the sites performed rather closely.  (Oculus.com 8/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to see if we can order their latest VR headset.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.VIVE.com into our Chrome browser, it took 1 minute and 36 seconds (and a wealth of clicks) to conclude that you cannot order anything from their website (at least not easily, even though there’s a shopping cart icon on their menu bar), and that viewing a map to “Try VIVE Today” tells us we’d have to be in Livingston, UK to visit a store.

For www.Oculus.com, it took 3 clicks and 16 seconds to add the Oculus Quest 64 GB headset to our cart and be ready to checkout.

For these tests, we go in without much prior knowledge of each site’s user-side functionality to keep the test unbiased, so we were pretty surprised at how drastically different the user experiences were here. To give VIVE a fighting chance – even before trying Oculus’s site – we tried choosing a different headset in case the most recent one wasn’t available yet, and it still didn’t help. Perhaps the problem is that we performed the test from the US while VIVE’s parent company, HTC, appears to be UK-based. After further investigation, however, it appears that the only way to reach a purchasing option on VIVE’s site is through the “comparison” portion of the products page. Still, it seems odd that they wouldn’t make ordering their products easier and clearer. (Also, the webpage appears to end as you scroll, but it’s actually changing the panel you’re “stopped” on before moving you down to the next panel. It’s a neat design, perhaps, but no doubt a little confusing at first.)

With that in mind, here are the Usability scores:

(VIVE.com 5.5/10)
(Oculus.com 9/10)

 

Verdict

Both sites performed respectably, but when it comes to usability and speed, one unexpectedly outperformed the other—especially when it came to usability. So, we’re pleased to announce this Showdown champion to be…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Oculus.com"

]]>
AlertBot Showdown: Dunkin Donuts vs Starbucks https://www.alertbot.com/blog/index.php/2019/01/29/alertbot-showdown-dunkin-donuts-vs-starbucks/ Tue, 29 Jan 2019 21:17:26 +0000 https://alertbot.wordpress.com/?p=598 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying travel cups of coffee. Text reads "AlertBot Showdown: Dunkin Donuts vs Starbucks Coffee" with the word SHOWDOWN very large at the bottom.

If there’s one snack shop you’re likely to find on any given street corner in your city, there’s a good chance it’s either a Dunkin Donuts or a Starbucks (and in some cases, they’re on opposite sides of the street from each other). Both chains serve up steaming hot caffeinated goodness – at varying price points – as well as other sweet treats. And while different areas of the globe may have more common chains than these two, we East Coast natives have regular access to the fresh beans of these common coffee connoisseurs.

It’s no secret that those who rely on a warm, fresh cup of java to get their day started also know these bean beverages affect their daily performance. So we wanted to pose the question – what about the web performance of these respective coffee shops?

To test their website performance, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both DunkinDonuts.com and Starbucks.com from December 1st through Christmas Day, 2018. Given the popularity of both establishments, we expected their performance to be as strong as their brews, and we weren’t disappointed.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both Dunkin Donuts and Starbucks’ sites performed quite well. Neither saw significant downtime, but each one experienced some sluggish speeds and even load time timeouts on a couple rare occasions.

DunkinDonuts.com experienced 99.96% uptime, with just a few errors recorded due to slow load times. None of these events lasted longer than a couple minutes, and none amounted to any significant downtime. Because of this, we still consider their performance to be quite solid.  (DunkinDonuts.com 8.5/10)

Starbucks.com performed similarly with 99.87% uptime and similar slow page load errors that didn’t amount to significant downtime but at least put a wrinkle in their performance. They experienced four times as many of these errors as Dunkin, so we have to take that into consideration with our rating. (Starbucks.com 8/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

The speeds for both sites were relatively close to each other. DunkinDonuts.com’s best speed, on average, was seen on Sunday, Dec. 2 at 4.8 seconds, which isn’t stellar by any means, but not the worst either. Their best time of day, however, was on Wednesday, Dec. 19th at 4am with 2.1 seconds. It’s considerably better, but 4am isn’t exactly prime web traffic time. Dunkin’s worst averaged day was Monday, Dec. 17th at 6.2 seconds. However, their worst time was on Saturday, Dec. 22 at 9am with a crawling 10.5 seconds. The site’s overall average speed across the entire test period was 5.6 seconds.  (DunkinDonuts.com 7.5/10)

Starbucks.com didn’t fare too much better in comparison. Their best day on average was Saturday, Dec. 1st with 5.2 seconds. Their best response time was at 7am on Monday, Dec. 17 with 2 seconds. (It’s interesting that their best average time was on Dunkin’s worst averaged day.) Starbucks’ worst day on average was the previous day, Dec. 16, with 6.9 seconds, with their worst response time on average being at 9pm on Friday, Dec. 7th with a slightly-slower-than-Dunkin’s-speed of 10.7 seconds. But, as you can see, both sites performed pretty close to one another. Starbucks.com’s overall average speed during the entire test period was a tad slower, at 6.3 seconds.   (Starbucks.com 7/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

If you’ve been following these competitions at all, you’ll know that no one beats California in website load time speed. However, lately, we’ve been seeing more variety when it comes to which state in the U.S. has the faster speeds. This time around, Nevada wins (for both sites), with DunkinDonuts.com moving at a swift 1.79 seconds in The Silver State. Oregon came in second at 1.8 seconds, with Ohio at 2 seconds. Comparatively, Washington state saw the slowest speed, coming in at 10.8 seconds, with Colorado in second at 9.2 seconds and Texas in third at 9.1 seconds. (DunkinDonuts.com 8/10)

Starbucks.com loaded at 1.4 seconds in Nevada, which was faster than Dunkin’s best time. Their second fastest was 1.5 seconds in Oregon and 1.7 seconds in Ohio – all better than Dunkin’s best (1.79 seconds). However, Starbucks saw significantly slower worst-case load times than Dunkin, with all of their slowest locations being worse than Dunkin’s slowest: Washington came in at 12.5 seconds, then Colorado at 11.6 seconds, and Texas at 11.4 seconds. So while their fastest locations edged out DunkinDonuts.com’s, their slowest were considerably slower, which is unfortunate.  (Starbucks.com 7.5/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find their rewards program and get ready to sign up for it. (And we’re writing about it as we’re performing the test.)

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.DunkinDonuts.com into our Chrome browser, it took 15 seconds and 1 click to find the signup page for their rewards program. (OK, maybe this is too easy?)

For www.Starbucks.com, it took one click and 10 seconds to get to the rewards signup page.

For these tests, we attempt to go into them without much prior knowledge of the site’s user side functionality to give it an unbiased test, but this one probably calls for a retest with a different approach.

Let’s try navigating their respective menus and trying to find out about their coffee items.

With this in mind, from the point of typing in DunkinDonuts.com and navigating through their menu to their coffee options, it took 4 clicks and 23 seconds to get to the page with their regular drip coffee and its nutrition info. It’s a nice website and an enjoyable one to navigate.

With the same goal in mind, for Starbucks.com, it took 5 clicks and over 35 seconds to find the brewed coffee, but the confusing menu setup made it tough to find just plain, hot drip coffee. The Dunkin menu has images for all their options, but Starbucks drops most of the images once you get to the menu, so we ended up on the cold brew menu instead. (As it turns out, it was the fifth option, “Freshly Brewed Coffee” that we actually were looking for… you’d think it’d be one of the first options, though… right?)

Given that the first test was inconclusive, the second one was a clear one for us (albeit unexpected). DunkinDonuts.com was quicker and easier to navigate, and much more user friendly.

With that in mind, here are the Usability scores:

(DunkinDonuts.com 9.5/10)
(Starbucks.com 8/10)

 

Verdict

Both sites performed respectably, but when it comes to usability and speed, one unexpectedly outperformed the other—even if just by a little bit. So, we’re pleased to announce this Showdown champion to be…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "DunkinDonuts.com"

]]>
Black Friday / Cyber Monday 2018 Showdown: Amazon vs Walmart vs Target https://www.alertbot.com/blog/index.php/2018/11/29/black-friday-cyber-monday-2018-showdown-amazon-vs-walmart-vs-target/ Thu, 29 Nov 2018 19:13:25 +0000 https://alertbot.wordpress.com/?p=584 A graphic with a yellow starburst in the center and two robots charging towards a third robot. The two on the left are carrying shopping bags. The one on the right is carrying a box. The text reads "Cyber Week 2018 - AlertBot Showdown: Target vs Walmart vs Amazon" with the word SHOWDOWN very large at the bottom.
Last year, we stepped outside the usual format of our Website Showdown blogs to not only tackle Black Friday and Cyber Monday, but to cover three of the biggest retailers in the process. It was a battle royale for the ages: Walmart vs Target vs Amazon: three web retailer giants duking it out for kingship in the ecommerce realm. Walmart.com edged out its competitors just a bit in 2017, so we were especially curious to see who might reign supreme in 2018. Would Walmart keep the title, or has Target or Amazon stepped up their game?

While we’re still recovering from full bellies and empty wallets from the Thanksgiving celebratory weekend, we pored over the performance results for each site to see how they compared to last year’s event.

As usual, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor all three sites from Thanksgiving Day through Black Friday and Cyber Monday, spanning from November 22, 2018 to November 26, 2018. We expected strong, reliable performance again during the entire run, and we were not disappointed. The results were nothing short of impressive; in fact, we mostly saw improvement this year over last.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Last year, in an unusual feat, none of the sites experienced a single error or failure event. The same mostly held true for 2018: both Walmart.com and Target.com struggled with a few slow file load times (which can slow a page’s overall load), but it was never enough to cause any actual site downtime. With that in mind, we think it’s still fine to award 10’s across the board.

(Amazon 10/10)
(Walmart 10/10)
(Target 10/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart


Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Last year was the first time we ran this event, so it was interesting to be able to compare last year’s results with this year’s. Ecommerce sites tend to have very graphics-heavy designs, and especially with sale events like these, the graphics are often big, frequently changing, and sometimes even animated or video-driven. (Amazon even had live video streaming at one point throughout the purchasing frenzy!)

With that said, through Amazon.com’s 5-day run, they saw the fastest day, on average, to be Sunday, November 25th with 4.2 seconds—which is almost exactly what last year was (Their fastest was also a Sunday at 4.3 seconds). Their slowest day, on average, was actually on Black Friday itself at 4.5 seconds, which, admittedly, still isn’t too bad. When looking at specific times of day for performance, the best hour was 7AM on Sunday with an impressive 2.6 seconds (an improvement over last year by almost a full second), while the day before saw the slowest hour at noon with a dismal 9.3 seconds (which was significantly worse than last year).
(Amazon 9/10)

Walmart.com was the fastest last year and proved not only to hold that title again this year, but they also showed improvement! Their best average day was Cyber Monday, November 26th at 3.8 seconds. Their worst day on average was Sunday, November 25th,  at 4.1 seconds (Coincidentally, it was also Nov. 25th last year, but this year it was almost a full second faster). Finally, their best hour on average was on Cyber Monday at an impressive 1.8 seconds at 6PM. Their worst time on average was 6.9 seconds at 5PM on Black Friday, which is not when you want to be experiencing your slowest web speed.
(Walmart 9.5/10)

Last, but certainly not least, Target.com performed respectably, but once again underperformed in comparison to the other two. Their best day for speed, on average, was Black Friday at 5.4 seconds, which is not only worse than both Amazon and Walmart’s worst days, but it’s .2 seconds slower than their performance last year. Target’s slowest day on average was Cyber Monday, November 26 at 6.3 seconds, almost a full second slower than last year. Their fastest hour turned out to be on Black Friday at 5AM with 3.1 seconds, which is a slight improvement, with their slowest time being on Monday at 3PM with 8.9 seconds, over a second longer than last year, and sadly during mid-day on Cyber Monday.
(Target 8.5/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California has almost always come out on top as the fastest state, but this year it was consistently dethroned by none other than Oregon! For Amazon.com, the ecommerce mega-site saw average load times of 1.4 seconds in The Beaver State, with their next-fastest locations being Ohio at 1.6 seconds and Nevada at 1.8 seconds. When it came to their slowest locations, Washington, D.C. took the prize at a sluggish 7.5 seconds, with Washington state clocking in at 7.3 seconds.
(Amazon 9/10)

Just like in 2017, Walmart.com was faster, but by a mere tenth of a second, seeing an average load time of 1.3 seconds in Oregon. Nevada and Ohio followed at 1.4 seconds, matching Amazon’s fastest time. Washington state saw the site’s slowest load time at 6.8 seconds, with Colorado coming in at 6.5 seconds and Texas at 6.3 seconds – all of them faster than Amazon’s worst locations.
(Walmart 9.5/10)

Target actually saw some improvement this year, with its average load time being fastest in Nevada at 2.3 seconds (last year’s best was 2.7 seconds in California), while Oregon came in second at 2.5 seconds and Ohio third at 2.7 seconds. And like last year, Target’s fastest speeds proved to be slower than their competitors’. The slowest average speed that Target saw in the U.S. was sadly worse than last year’s: Washington state clocked in at a truly dismal 10.7-second average load time, with Colorado a second faster at 9.6 seconds, and Texas at 9.3 seconds. It’s unfortunate that Target continues to miss the mark for website speed.
(Target 8.5/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we always select a common task a user might typically try to accomplish when visiting the sites we’re testing and replicate it. For last year’s Showdown, we decided to see what the experience would be like to use these three different websites to add a common product to the shopping cart. To do this, we selected one item to search for and add to our cart, and this year we decided to do the same again.

For each of these processes, we picked an easy item to search for, and sought to add a Blu-Ray copy of Disney and Pixar’s Incredibles 2 to our shopping cart. To begin each process, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.amazon.com into our Chrome browser, typing “Incredibles 2 blu-ray” into the store’s search box, and adding it to the cart, it took 34 seconds. From the front page, it took about 5 clicks (including having to log in to get to the final checkout) to get to the “Place your order” window.

From the point of typing www.walmart.com into Chrome and going through the same process, it took about 6 clicks and 32 seconds to log in and get to the final cart checkout page.

And from the point of typing www.target.com into our Chrome browser, it also took about 6 clicks and 32 seconds to log in and get to the checkout window.

Each site was a good experience to use, although each one has a different feel and approach. It’s a tough call to say which user experience was better; each one was straightforward and easy to use. If we judge the sites on search results, Amazon first suggested a few things unrelated to the specific “blu-ray” search (like a Jurassic Park daily deal and a preorder for Venom), while both Target and Walmart returned more direct and accurate results (even though Walmart suggested the DVD and 4K versions before the actual blu-ray). In that case, we’d have to give Walmart and Target a little more props for accuracy in their product search.

(Amazon 9.5/10)
(Walmart 9.5/10)
(Target 10/10)

 

Verdict

With stakes this high once again, you would only expect the best from the leaders in ecommerce, so it comes as no surprise that the results were so good and so close.

With all things accounted for – reliability, speed, geographical performance, and the site’s usability – we’ve reached our verdict, and it surprises even us for a second year in a row:

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Walmart.com"

]]>
AlertBot Showdown: Moviepass vs Sinemia https://www.alertbot.com/blog/index.php/2018/08/21/alertbot-showdown-moviepass-vs-sinemia/ Tue, 21 Aug 2018 18:29:00 +0000 https://alertbot.wordpress.com/?p=542 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying membership cards and ticket stubs. Text reads "AlertBot Showdown: moviepass vs sinemia" with the word SHOWDOWN very large at the bottom.
With streaming services like Spotify, Apple Music and Amazon redefining how we consume music, and Netflix, YouTube and Hulu changing how we consume movies and TV at home and on the go, it should probably be no surprise that the subscription service concept would make its way to the cinema. MoviePass has long been a leader when it comes to theater-going subscriptions, but Sinemia is a rising competitor that has thrown its hat into the ring to fight for a share of the movie-going, popcorn-munching theater ticket buyers. Both services allow movie fans to pay a specific monthly (or annual) fee to see movies on the big screen at a discounted price.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a few weeks, spanning from July 1 to July 22, 2018. As both sites and services are continuing to grow and change (Heaven knows MoviePass will probably change their rules and operations again before you finish reading this sentence), we weren’t surprised to see how similarly the sites for each service performed.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both MoviePass and Sinemia performed well here, but one did seem to struggle a little more than the other.

MoviePass.com experienced a 98.2% average uptime due to several days where the site performed slower than usual, causing pages to not load fully – even triggering a strange account lookup error on the front page for several hours on July 14th. This resulted in 18 failure events cataloged by AlertBot, with an average failure duration of 32 minutes. This doesn’t necessarily mean downtime, per se, but the details did show that the site was struggling with its speed and load times. (MoviePass.com 7/10)
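As a sanity check, those numbers line up: 18 failure events averaging 32 minutes over the roughly 22-day monitoring window works out to about 98.2% uptime. This is a back-of-the-envelope sketch, not AlertBot's internal formula:

```python
# Rough uptime arithmetic for the July 1-22 monitoring window.
# Illustrative only; AlertBot's exact accounting may differ.
events = 18
avg_failure_minutes = 32
window_minutes = 22 * 24 * 60               # 22 days of monitoring

failure_minutes = events * avg_failure_minutes       # 576 minutes
uptime_pct = 100 * (1 - failure_minutes / window_minutes)
print(f"{uptime_pct:.1f}%")                 # roughly 98.2%
```
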

Comparatively, Sinemia.com saw 99.98% uptime with 1 failure event, although it wasn’t anything that spelled major downtime. At worst, it appeared to be a slow page / busy error that didn’t last long enough to qualify as site downtime. Overall, Sinemia proved to be pretty reliable. (Sinemia.com 9/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

MoviePass.com saw acceptable page load speeds overall, with their best average day being Wednesday, July 4th with 3.9 seconds. The best time of day was 1am on Friday, July 20th (which isn’t a popular time to even be using a site like theirs) at an average of just 1.6 seconds. On the other side of the proverbial coin, the slowest day was Saturday, July 14 with an average time of 8.9 seconds, and the worst time of day was also on the same day at noon (yikes!) with an embarrassing 14.1 seconds.  (MoviePass.com 7.5/10)

Sinemia actually didn’t perform too much better, with their best average speed for a single day being Saturday, July 21 with 5.4 seconds and their best time of day being Wednesday, July 4th at 5pm with 2.7 seconds. Their slowest day was Monday, July 23rd with 7.3 seconds, with the slowest time being on July 2nd at 10pm with 10.2 seconds. (Sinemia.com 8/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

MoviePass.com performed the fastest in California with 1.8 seconds, with Florida coming in second at 2.4 seconds. The site performed slowest in Missouri with a sluggish 10.2 seconds, with Utah coming in second at 8.5 seconds. (MoviePass.com 8/10)

For Sinemia.com, California was also the fastest at 2.9 seconds, and Virginia was second fastest at 3.5 seconds. Missouri was also the slowest, at 11.3 seconds, with Utah being second slowest at 9.1 seconds. (Sinemia.com 7.5/10)

Neither site was all that impressive in terms of speed – which is interesting, considering there isn’t a whole lot of content on their websites to slow them down.

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to start the service signup process (but not complete any forms).

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.moviepass.com into our Chrome browser, it took a mere 18 seconds and 2 clicks to see their plans and get to the signup form. It was a piece of cake.

For Sinemia.com, it was actually just as smooth. In 17 seconds and 2 clicks, we were able to select a plan and get to the signup page.

It’s a tough call for usability. They’re simple processes, but they get the job done and we have no complaints.

All things considered, here are the Usability scores:

(MoviePass.com 10/10)
(Sinemia.com 10/10)

 

Verdict

The usability usually isn’t this straightforward and clear for both sites, so it leaves us to look almost exclusively at the other categories to draw a conclusion.

Setting aside the possibility that MoviePass’s speed hiccups stem from a greater amount of traffic, Sinemia.com seems to be the clearer choice for reliability as a whole, though the sites are quite close. That bad day on July 14 really hurt MoviePass’s performance during this evaluation period, and it can’t be ignored. So, with that said, we believe the verdict is…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Sinemia.com"

]]>
AlertBot Showdown: Michaels vs A.C. Moore https://www.alertbot.com/blog/index.php/2018/05/31/alertbot-showdown-michaels-vs-a-c-moore/ Thu, 31 May 2018 20:04:33 +0000 https://alertbot.wordpress.com/?p=527 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying arts and crafts supplies, like paint brushes and plants. Text reads "AlertBot Showdown: Michaels vs A.C. Moore" with the word SHOWDOWN very large at the bottom.

Whether it’s designing a centerpiece for home or an event, perusing the aisles for tools for a school project, or locating a frame for that beloved photograph, it’s likely you’ve found yourself inside an arts and crafts store at some point. From cloth patterns to drawing pencils to blank canvases and custom framing, these craft supply stores are just what creative people look for in a retailer.

With the rise of ecommerce, arts and crafts stores are just as accessible from the comfort of your computer or mobile device. For artists and crafters, something is undoubtedly lost when shopping online for these kinds of supplies, but the ease of online shopping is undeniable. Two of the biggest players in the market are Michaels and A.C. Moore, so for this, our ninth Showdown, we’ve pit the web performance of these two leading crafty retailers against each other.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from March 25, 2018 to April 8, 2018. As expected, both sites performed quite well, but as in most cases like this, one site saw better results than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both websites did really well here, with neither site seeing any significant, true downtime.

Michaels.com experienced 99.9% average uptime due to 2 page load timeout failure events (where something on the page takes a bit longer to load, slowing the page’s overall performance down). When drilling down to see what errors Michaels.com returned, it signaled 17 instances where the page took longer to load than expected, and 15 times where something on the page took too long to load and slowed the page down. Still, despite the 2 timeouts, Michaels did well overall. (Michaels.com 8.5/10)

Comparatively, ACMoore.com saw 100% uptime with no significant failure events. However, there were still 4 recorded moments where there was a slow file and 4 occurrences of when the page itself took longer to load than expected. Still, ACMoore.com never actually went down, so we have to give them high marks for that.
(ACMoore.com 9.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.
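The daily and hourly averages quoted throughout these Showdowns boil down to a simple group-and-average over raw timing samples. Here is a minimal Python sketch of that aggregation; the sample timings are made up for illustration and are not AlertBot’s actual data or implementation:

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean


def best_and_worst(samples):
    """Group (timestamp, seconds) samples by calendar day and return
    the (day, average) pairs with the fastest and slowest averages."""
    by_day = defaultdict(list)
    for ts, seconds in samples:
        by_day[ts.date()].append(seconds)
    averages = {day: mean(vals) for day, vals in by_day.items()}
    best = min(averages.items(), key=lambda kv: kv[1])
    worst = max(averages.items(), key=lambda kv: kv[1])
    return best, worst


# Illustrative samples only: two probes on a fast day, two on a slow day.
samples = [
    (datetime(2018, 4, 4, 18, 0), 3.2),
    (datetime(2018, 4, 4, 20, 0), 3.8),
    (datetime(2018, 3, 25, 9, 0), 6.5),
    (datetime(2018, 3, 25, 15, 0), 7.1),
]
best, worst = best_and_worst(samples)
# best day averages about 3.5s; worst day averages about 6.8s
```

Swapping `ts.date()` for `ts.hour` gives the by-hour view used for the “best/worst time of day” figures.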

Michaels.com saw pretty decent page load speeds overall, with their best average day being Wednesday, April 4th with 3.5 seconds. The best time of day was 6pm on Friday, April 6th at an average of just 2.1 seconds. On the flip side, the slowest day was Sunday, March 25 with an average time of 6.8 seconds, and the worst time of day was Sunday, April 8 at 8pm with 6.7 seconds. (Michaels.com 8.5/10)

ACMoore.com was truly impressive with their load time. Their best day, Tuesday, March 27, averaged just 1.5 seconds! A.C. Moore’s best time of day was even faster: Wednesday, April 4th at 10pm saw a load time of just 1.2 seconds. Even more amazing, ACMoore.com’s worst day, Thursday, March 29, still averaged only 1.8 seconds! Their worst time of day, however, was significantly longer (in comparison) at 3.8 seconds on Thursday, April 5 at 3pm. (It’s interesting that both slower marks fell on a Thursday.) It was a rarity for ACMoore.com to go over 2 seconds in load time, and for that, we have to applaud their excellent web performance. (ACMoore.com 10/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California continues to reign supreme as the leading location in speed. Michaels.com loaded within 2 seconds (on average) in California, with Florida seeing the second fastest speed of 2.5 seconds. Missouri turned out to have the slowest load time of 7.1 seconds, while Utah came in second-to-last at 4.9 seconds. (Michaels.com 8.5/10)

For ACMoore.com, California is the fastest once again, at an average of just 1.9 seconds. The second fastest, again, is Florida with 2.4 seconds. The slowest time is also seen in Missouri at an average of 8.2 seconds, with New Jersey coming in second-to-last at 5.5 seconds. It’s interesting to note that ACMoore.com proved to have both faster speeds than Michaels and slower ones, depending on the location. (ACMoore.com 8.5/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find some paint brushes, add them to the shopping cart and start the checkout process.

For each of these processes, we started by opening a new tab in Mozilla Firefox and typing in the site’s URL.

From the point of typing www.michaels.com into our Firefox browser and searching “paintbrushes” in the product search box, it took 30 seconds and 4 clicks to select a pack of brushes, add them to the cart and view the cart. It was definitely a smooth experience.

ACMoore.com was, unfortunately, a far more frustrating experience. Upon visiting the site, we were hit with a pop-up asking us to sign up for their email list to get a coupon. Worse, their sign-up box sits at the top of the page where a site search would normally go, so it’s easy to mix them up (despite the “Sign Up for Offers” label next to it). It didn’t take long to discover that their site also doesn’t seem to specialize in craft materials, as a search for something as basic as “paintbrushes” returned nothing. We tried altering the wording of our search a bit but gave up after a minute and a half.

To be fair, we decided to run the usability test again with different search criteria. ACMoore.com seems organized around craft project ideas, without any real discernible products you can purchase from the site (and yet, it has a shopping cart), which makes the two sites quite different from each other (and gives Michaels.com an edge in sheer product availability and variety). In the end, while the brick-and-mortar stores are very similar, their online presences are not. So we ran the test again to see how quickly we could get to, and briefly browse, each site’s Weekly Ad.

For Michaels.com, it took about 2 clicks and roughly 10 seconds to get to the Weekly Ad for May 6 and start clicking around. It offered two choices for ads, but we chose the basic ad for the week to browse. It was a very easy experience.

For ACMoore.com, it took 20 seconds, 3 clicks and typing in our zip code to get to our local A.C. Moore store’s ad before we could start clicking around. The ad isn’t nearly as thorough or as nice as Michaels’, either.

All things considered, here are the Usability scores:

(Michaels.com 10/10)
(ACMoore.com 3/10)

 

Verdict

When it comes to speed, ACMoore.com bested its competitor, Michaels.com, but given the lack of substance and actual storefront on ACMoore.com, the comparison may not be entirely fair. However, a quick lap through the aisles of each brand’s brick-and-mortar stores shows just how similar they are. So, taking everything into consideration, and with both sites performing very well on reliability, it’s hard not to give weight to the user experience when making the final conclusion…

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Michaels.com"

]]>
AlertBot Showdown: Playstation vs Xbox https://www.alertbot.com/blog/index.php/2018/04/06/alertbot-showdown-playstation-vs-xbox/ Fri, 06 Apr 2018 19:30:53 +0000 https://alertbot.wordpress.com/?p=517 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying video game system controllers. Text reads "AlertBot Showdown: Playstation vs XBox" with the word SHOWDOWN very large at the bottom.

It may have been squashing a goomba while punching a coin out of a brick, dodging barrels being thrown by a grumpy gorilla, sorting oddly shaped falling blocks into interlocking patterns or simply catapulting miffed fowl at a group of defenseless pigs on your mobile phone, but chances are high that everyone has played a video game at one point in their life.

Poor web performance is no game any self-respecting owner of a website should play. We recently aimed our sights at the gaming industry and picked out two heavy hitters to evaluate: Xbox and Playstation. While their websites may not be the main point of interest for gamers, they’re relied upon for information, updates and even online digital game sales. Their online gaming servers may be the most important thing to keep running smoothly in gamers’ minds, but these top players in the industry will want to make sure their website stays up and always accessible.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for three weeks, spanning from February 4, 2018 to February 25, 2018. Both sites performed well—as can be expected from parent companies Microsoft (Xbox) and Sony (PlayStation)—but, as usual, one performed just slightly ahead of the other, even if not by much.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both websites experienced 100% uptime, but both sites encountered minor errors that served as a few speedbumps along the way. Still, it wasn’t enough to qualify as downtime.

Xbox.com, despite its 100% uptime, experienced around 50 “slow page” warnings and over 20 page load timeouts (where something on the page takes a bit longer to load, slowing the page’s overall performance down). Xbox.com also returned an SSL Certificate expiration notice. However, none of these qualified as significant outages, and for that we still have to give them props. (Xbox 9/10)

Playstation matched Xbox’s 100% uptime and fared a lot better when it came to the little errors. They registered only 7 timeouts and 5 slow page loads, and for that we give them slightly higher marks. (Playstation 9.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Speed is crucial to the gamer – be it game load times (who else hates waiting for spinning icons to finish to get us past a cut scene or moving on to a new map in a game?) or server responsiveness – so a speedy game company website is key. Xbox.com experienced pretty quick load times, with its best day being February 24th with an average of 4.6 seconds. Its best response time, however, was on February 23rd at noon with 2.2 seconds. On the flipside, its worst day was February 12 with 6.7 seconds (which isn’t all that bad), but their worst hour proved to be on February 11th at 11pm with a sluggish 13.1 seconds. (Xbox 8.5/10)

Surprisingly, Playstation turned out to be just a little bit slower, with their best day averaging 6 seconds on February 22nd. Their best time by the hour was on the same day at noon with 2.3 seconds, just a hair slower than Xbox’s best time. Their worst day was February 11th with 11.7 seconds, a full five seconds slower than Xbox’s worst day, and their worst time by the hour was also 13.1 seconds, but on February 10th at 7am. (Playstation 8/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California seems to win out most of the time as the fastest location for load times, and Xbox.com was no different. California saw load speeds of 2.1 seconds on average, with Florida coming in second at 2.2 seconds. Georgia, however, posted the slowest average time at 10.3 seconds, with Missouri coming in second-to-last at 9.2 seconds. (Xbox 8.5/10)

Playstation.com actually turned in slightly more sluggish results geographically, too. Their best location was California, as well, but it was 2.5 seconds, and Florida was a close second at 2.7 seconds. Playstation’s slowest spots were also in Georgia and Missouri, at 12.6 seconds and 11.2 seconds, respectively. It’s not the worst we’ve seen, but Xbox clearly performed better. (Playstation 7.5/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to add a digital download of a popular video game to the shopping cart and start the checkout process.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.xbox.com into our Chrome browser and clicking around to find the Xbox One games, choosing the featured one (which, in this case, was Dragonball FighterZ), clicking “Buy Now” and getting to the account login screen, it took 1 minute and 10 seconds. From the homepage, it took 7 clicks to get to the checkout process. It’s been a while since we last visited their site, so our experience was fresh, but we encountered some significantly slow loading times when getting to the product page. We actually added an additional click to the process because the “Buy Now” button didn’t load properly at first (and did nothing upon its first click). Overall, we got to do what we set out to do, but the process could have gone a lot smoother.

We were hoping for a better experience from Playstation, and we got one. From the point of typing www.playstation.com into our Chrome browser, it took 4 mouse clicks and 35 seconds to find a featured video game (in this case, Bravo Team), and get to the checkout stage (which was also an account login screen). There was some delay on first clicking on the game title, but it still loaded quickly and allowed us to get to the end of the process fast.

Both sites allowed us to get the job done in a rather speedy manner, but Playstation’s site gave us a much more positive experience.

With that said, here are the Usability scores:

(Xbox 8/10)          (Playstation 9.5/10)

 

Verdict

Both sites performed very well, but that positive user experience helped push one over the other, albeit only slightly. So while it was a tough call to make, we have come to a conclusion…

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Playstation.com"

]]>
10 Ways to Optimize Images to Improve Your Website Performance https://www.alertbot.com/blog/index.php/2017/11/07/10-ways-to-optimize-images-to-improve-your-website-performance/ Tue, 07 Nov 2017 19:07:09 +0000 https://alertbot.wordpress.com/?p=462 A graphic showing a desktop monitor, a laptop screen, a tablet screen and a mobile phone screen and all of them are displaying various kinds of icons - like a magnifying glass, wifi symbol, shopping cart, video game controller, etc.

10 Ways to Optimize Images to Improve Your Website Performance

by Louis Kingston

“Visuals express ideas in a snackable manner.” – Kim Garst, CEO of Boom Social

Visual imagery on websites is a powerful tool to grab the user’s attention, keeping them curious, engaged and interacting with your webpage. Humans are a visual species. Our brains can process an image within 13 milliseconds, with over half of the brain devoted to processing the visual information it receives. Our memory for pictures is far better than our memory for text, and over 65% of the population are visual learners. What this means is that our websites must contain a healthy dose of visual imagery to keep a visitor engaged. Whether on our homepage, service pages, blog articles or e-commerce sites, images are essential to driving sales, conversions and, ultimately, company growth.

Are Images Slowing Down Your Load Speed?

However, the images used must be optimized so that they don’t hamper your website’s performance. If they are too large, they are going to slow down your website’s loading speed, and the Google algorithm doesn’t like that. Take more than seven seconds to load and Google’s going to ignore you, and you won’t make it to page one of the SERPs (search engine results pages). The search engine’s focus is on organically promoting businesses that offer a great user experience; slow load speed will just have potential visitors clicking away.

Google loves text, and when it crawls your site, it can’t ‘read’ your images unless you have created file names, alt tags, and captions to describe the image. You are losing out on a perfect SEO opportunity if you don’t optimize your images.

Let’s investigate ten ways you can achieve image optimization for your website…

  1. Use keywords in the image file name. The file name affords a perfect opportunity to include your primary keywords, as well as giving Google enough text so it knows what it is “looking at” on your webpage. But make sure you never keyword stuff these descriptions. You don’t have to use descriptions for decorative images (that would be overkill, and Google might penalize you).

 

  2. Scale images to the size at which they will be displayed on the site. The mistake many people make is thinking that once they place a large image into a small display area, it will no longer take up so much ‘space.’ But the file size is still enormous and will continue to take a long time to load. Scale the image first to the size you want it displayed. You can also remove any pictures that are no longer serving your website, which will further improve overall load speed.

 

  3. Always reduce the image file to the lowest possible size without compromising too much quality. Many online tools can help you reduce your file size, like JpegMini, ImageOptim, etc. Aim to keep your image file size below 70kb (if possible).

 

  4. Use responsive images for a better mobile experience. Responsive-image plugins that apply the srcset attribute let your pictures display at a size appropriate to each device’s screen width. If you are using WordPress, this is handled automatically.

 

  5. Add customer-centric captions. According to KissMetrics, captions under images are read 300% more than the body content. Visitors scan web pages, and a well-captioned image can provide a wealth of info at a glance. Remember that images should always be relevant to the content.

 

  6. Always be visible with alt tags. Providing alt text ensures that your images can always be ‘seen.’ If a user is unable to download images, or is using a screen reader due to visual impairment, the alt text will describe the image.

 

  7. Make sure to add image tags to your XML sitemaps. This helps Google index the images on your site. If you are using JavaScript galleries or other flashy pop-ups, tell Google what they are and where they are located in your sitemap so it can crawl those images in the web pages’ source code.

 

  8. Remove metadata from raster images. Raster files often carry unnecessary information, like geolocation data and details about the camera used, that only takes up space. Stripping this extra metadata makes the overall file size much smaller.

 

  9. Where possible, use vector images. The format is ideal for multi-device use at high resolution. Raster formats should only be used for complex scenes with loads of detail and irregular shapes; in those cases, GIF, PNG, JPEG, JPEG-XR or WebP is the right choice. Experiment with the quality settings to free up more bytes.

 

  10. Minify and compress Scalable Vector Graphics (SVG). Because SVG is plain XML text, minifying it reduces its size, and GZIP can compress it further.
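The SVG minify-and-compress step above needs nothing more exotic than a text pass and standard GZIP. Here is a hedged Python sketch; the whitespace-collapsing “minifier” is deliberately naive (real tools such as SVGO also shorten path data and strip comments and metadata), and the sample SVG is illustrative:

```python
import gzip
import re


def minify_svg(svg_text: str) -> str:
    """Naive SVG minification: drop whitespace between adjacent tags,
    then collapse remaining runs of whitespace. Note this would also
    mangle meaningful whitespace inside <text> elements."""
    svg_text = re.sub(r">\s+<", "><", svg_text)
    return re.sub(r"\s+", " ", svg_text).strip()


svg = """
<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
    <rect x="10"  y="10"
          width="80" height="80" fill="teal"/>
</svg>
"""

minified = minify_svg(svg)
compressed = gzip.compress(minified.encode("utf-8"))
print(len(svg), len(minified), len(compressed))
```

A web server would typically apply the GZIP step for you (e.g. via `Content-Encoding: gzip`); the point is that minification and compression stack, since GZIP works better on already-compact text.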

How is Your Website Performing at the Moment?

Of course, these are just ten basic image optimization pointers. You can drill down even further on image optimization to enhance your website performance. If you would like to find out more about your website’s performance, AlertBot can show you what elements are slowing down your site or what bottlenecks are causing user traffic to click away. We also offer a Free 14-day trial (without collecting any billing info). Give us a try!

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
AlertBot Showdown: HomeDepot vs Lowes https://www.alertbot.com/blog/index.php/2017/10/11/alertbot-showdown-homedepot-vs-lowes/ Wed, 11 Oct 2017 09:57:06 +0000 https://alertbot.wordpress.com/?p=449 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying planks of wood. Text reads "AlertBot Showdown: The Home Depot vs Lowe's" with the word SHOWDOWN very large at the bottom. Tiny hardware nails are sprinkled around the image.

Living in an age where nearly every industry is driven by ecommerce, it should come as no surprise that this includes the home improvement world. Home Depot and Lowes are titans in their industry, and both have a strong online presence. But when it comes to who may have the better performing site, we set out to nail down one true winner.

For our fifth website Showdown, the AlertBot team got out their proverbial measuring tape and slipped on a stylish apron to dig in to the performance of HomeDepot.com vs Lowes.com.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for three weeks, spanning from August 11, 2017 to August 31, 2017. Not surprisingly, performance for these heavy lifters proved rather resilient on both sites. Neither site experienced significant downtime, but as usual, one did prove to perform a little better than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

HomeDepot.com performed quite well over the tested time period, experiencing no failure events. At most, it had a couple hiccups, like a short-lived Timed Out error or a Slow Page File notice, but none of these occurrences caused any amount of significant downtime. (HomeDepot 9/10)

On the other hand, Lowes’ site experienced one failure event on August 21st, when the site was not responding for roughly three minutes around 12:21 in the afternoon. When errors like these occur, AlertBot retests from a second location to confirm whether the error is widespread or just a brief localized outage. In this instance, the error persisted across tests from different locations, qualifying as actual site downtime, before the issue resolved. (Lowes 8/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

HomeDepot.com has a great deal of graphics on the front page, which typically slows sites down considerably. However, it didn’t seem to slow this site down much. HomeDepot.com’s best day, on average, was Tuesday, August 29th with an impressive load time of 1.1 seconds. The “worst” day average was still an impressive 1.9 seconds. When evaluating the site’s speed by hour, the site loaded in just 0.8 seconds at 1AM on Sunday, August 20th. The worst hour was also on August 20th, at 2PM with 5.1 seconds. Overall, HomeDepot.com’s speed is quite good. (HomeDepot 9.5/10)

Lowes.com has drastically less content on its front page, but it performed considerably slower than HomeDepot.com. Sadly, Lowes’ best day was actually slower than HomeDepot’s worst, with an average of 6 seconds on Sunday, August 13th. Lowes.com’s worst day was Monday, August 26th with 7.1 seconds. That’s not horrendous, but with sites expected to perform faster and faster these days, a respected retail giant like Lowes needs to up their speed game. On an hourly average basis, their best time was 11PM on Wednesday, August 23rd with 7.1 seconds (again, their fastest time is slower than HomeDepot’s slowest). Their worst load time by hour was Sunday, August 27th at 1PM with a sluggish 10.1 seconds. (Lowes 8/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

Usually when we look at site speeds across the United States, sites tend to perform better in California than anywhere else. That isn’t the case for HomeDepot.com, however. Home Depot’s fastest web transactions were in Florida (less than one second), while its slowest test was in California (but still only 2.3 seconds). After Florida, its next fastest web transactions were in New Jersey and North Carolina (both at 1 second). (HomeDepot 9/10)

Lowes.com had the fastest web transaction in California at 3 seconds. The next fastest was North Carolina, already up to 4.3 seconds. The slowest performance occurred in New York at a whopping 9.4 seconds (with the second-slowest being Georgia with 9.3 seconds). (Lowes 7.5/10)

 

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater. For this Showdown, we’ll see what the experience is like to use their respective websites to add a common product to the shopping cart.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.homedepot.com into our Chrome browser and entering “leather gloves” into the search box, choosing one and adding it to the cart, it took 25 seconds. From the front page, it took 5 clicks to get to the “Checkout now” process. It wasn’t bad, but we found the Lowes process just a bit smoother.

From the point of typing www.lowes.com into our Chrome browser, it took 4 mouse clicks and 20 seconds to get the gloves into the shopping cart and view the cart. The “Add to cart” button is much more obvious and visible on Lowes’ site, where it took a moment to locate it on Home Depot’s site. And while both sites offer a “compare” option so you can look at product features side by side, it wasn’t very noticeable on HomeDepot’s site, while it was more prominent on Lowes.com.

The aesthetic of both websites isn’t bad, but Lowes has a crisper and more streamlined appearance and functionality. Both sites get the job done pretty quickly, but we had a slightly smoother experience with Lowes. With that said, here are the Usability scores:

(HomeDepot 9/10)       (Lowes 10/10)

 

Final Verdict

Both sites performed respectably, but HomeDepot.com clearly performed faster and was more reliable than Lowes.com. Despite the fact that we may have preferred the shopping experience on Lowes.com just a little bit more, one cannot ignore the slower site performance.

So, for the fifth AlertBot Showdown, the site that gets to join the ranks of previous winners Apple, FedEx, and Burger King is…

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "HomeDepot.com"

]]>
Tortoise, Dinosaur or Ostrich? Proactive vs Reactive Web Monitoring https://www.alertbot.com/blog/index.php/2017/07/19/tortoise-dinosaur-or-ostrich-proactive-vs-reactive-web-monitoring/ Wed, 19 Jul 2017 11:15:31 +0000 https://alertbot.wordpress.com/?p=425 A set of three photographs: The first shows a tortoise, the middle one shows the bones of a t-rex, and the third is an ostrich with its head buried in desert sand.

Tortoise, Dinosaur or Ostrich?
Proactive vs Reactive Web Monitoring – 3 Metaphors From the Animal Kingdom
by Penny Hoelscher

In February 2017, Amazon Web Services’ (AWS) S3 web-based storage service suffered an outage that led to half of the internet “melting down” and cost businesses millions. It was caused by an operator’s typing error while issuing a routine command to take a few S3 servers offline.

What has this got to do with you?

Although the entire outage lasted only 4 hours and 17 minutes, Amazon came under attack from experts and customers in toe-curling global headline news. AppleInsider reported that even Apple was affected, with a variety of cloud services experiencing outages and slowdowns; Apple relies on Amazon for portions of its cloud infrastructure. And though not as a result of the meltdown, the company is rumored to be gradually shifting away from its dependence on Amazon.

Perhaps you’re not an Amazon or an Apple, but you too may be vulnerable. It all boils down to reliability, which has a direct effect on your revenue stream. If your web application or site delivers poor performance, your customers will go to your faster, more modern, more customer-centric competitors, where they experience less downtime, fewer outages, faster page loading times and better service. The result: you will lose sales, money and even your reputation.

How can you tell that it’s time to upgrade your website monitoring tool and get expert assistance? Well, you’ve already dropped the ball when you start noticing a decline in visitors; when once a waterfall, the stream of traffic to your website has slowed to a trickle. An external website monitoring tool like AlertBot can alert you to potential signs of trouble, like:

  • Degraded performance, e.g. due to page bloating, inefficient scripts, backend services.
  • Cyber attacks, e.g. website defacing and file changes.
  • Incompatible website and addons, e.g. by loading and testing site scripts.
  • Software and database issues, e.g. overloaded application servers and database bottlenecks.
  • Server failures, e.g. SSL, DNS, HTTP, and Ping.
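As a concrete (and heavily simplified) illustration of how an external monitor turns raw check results into the uptime and slow-page figures cited in our Showdowns, here is a hedged Python sketch. The thresholds, the `Check` shape and the sample data are assumptions for illustration only, not AlertBot’s actual rules or implementation:

```python
from dataclasses import dataclass

SLOW_PAGE_SECS = 5.0   # assumed "slow page" warning threshold
TIMEOUT_SECS = 30.0    # assumed page-load timeout (failure) threshold


@dataclass
class Check:
    seconds: float     # total load time measured by the probe
    responded: bool    # did the server answer at all?


def classify(check: Check) -> str:
    """Label one probe result as ok, slow, or failure."""
    if not check.responded or check.seconds >= TIMEOUT_SECS:
        return "failure"
    if check.seconds >= SLOW_PAGE_SECS:
        return "slow"
    return "ok"


def uptime_percent(checks: list[Check]) -> float:
    """Uptime = share of checks that did not fail outright."""
    up = sum(1 for c in checks if classify(c) != "failure")
    return 100.0 * up / len(checks)


# Illustrative sample: two fast loads, one slow page, one timeout.
checks = [Check(1.8, True), Check(6.2, True), Check(31.0, True), Check(2.4, True)]
labels = [classify(c) for c in checks]
print(labels, uptime_percent(checks))
```

A real service layers much more on top (retries from a second location before declaring downtime, per-element timing, alerting), but the core accounting is this kind of threshold classification over a stream of probes.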

In a nutshell, if your website persona resembles one of the following – tortoise, dinosaur or ostrich – you’re in trouble:


TORTOISE: Outages, high down-times and slow loading times

The internet is not like your local shopping mall, a convenient one-stop shop for all your household needs. These days, “I want it and I want it now” customers have far more options, and if you’re closed for business, they’re not going to go have a cup of coffee and wait for your door to open again; they’re simply going to mosey over to your competitors. One thing hasn’t changed in the digital sphere: some old adages still hold true. But customer loyalty is a fair-weather friend in an online environment, and when it comes to affiliate loyalty, frankly, time is money.

Website monitoring tools not only report on outages and high down-times, they help you to identify where (e.g. a particular geographic location), when (e.g. peak hours) and why (e.g. network issues) these are occurring. You may find it is your business model that is at fault, not slow servers or bloated software; for instance, perhaps you’re doing maintenance and performing upgrades at the wrong time in a different time zone to that of your head office.

In addition, page loading speed is one of the ways Google ranks your web pages. This matters because when searching for products and services, customers will click on the matching businesses Google serves first.


DINOSAUR – Being behind the times

Google lowers mobile page rankings for companies who do not have a mobile responsive web design. New website design trends have changed the face of online businesses and today’s tech-savvy generation can spot an old-fashioned, un-cool design in a heartbeat. But, keeping up with new design technologies can have an impact on your website’s performance. Page bloat is much like a beer belly; extraneous code, affiliate advertising and toxic data (storage of unnecessary and dated information) creeps up sneakily but has a huge impact.

One of the main benefits of a professional website monitoring service is automated intelligence that can manage big data and learn from the information it receives. You don’t have to wait for users to complain or continuously test the site yourself, and, because your business is constantly evolving, the service adapts in tandem. These sophisticated technologies not only gather and analyze the data you need to make informed decisions about performance, they point you toward solutions.

Cyber attacks are a 21st century bane to which all online businesses – big and small – are vulnerable. Of increasing concern is that at many companies, it can take months before a data breach is detected, giving cyber criminals plenty of time to ravage their victims’ systems. AlertBot can’t prevent a data breach but it can alert you when you’re attacked, e.g. by notifying you that files have been changed or your site has inexplicably gone down.


OSTRICH: Customer complaints

Negative social media posts can be harsh on a business’s reputation, and it can often seem unfair, especially when the trolls join the battle to bring you down. Sure, you need a team to monitor social media channels and publicly appease customers (including the trolls) who have issues, but that’s not enough. An external website monitoring service can give you advance warning of problems with your system.

Customer Experience (CX) is not just about the latest trends – mobile first, conversational brands, emotional engagement, predictive analytics and personalization, etc.; CX is about serving customer needs and wants (read: demands) BEFORE they start complaining. Once your website starts exhibiting dinosaur or tortoise characteristics because you’ve been acting like an ostrich with its head in the sand, it is too late; all you will have is reminders of your ex-customers’ public vents still floating around on complaints forums and social media channels.


Conclusion

The Amazon debacle should be a wake-up call for businesses to be more proactive about monitoring the uptime and infrastructure of their systems. Imagine how red your company’s face would be if you didn’t notice a crisis before your users did and had to be informed by their irate calls and emails.

A monitoring tool like AlertBot simulates actual user behaviors and interactions, and runs tests in real time using popular web browsers like Chrome and Firefox. It’s easy to set up (no installation necessary) and allows you to create scripts for different user experiences across multiple devices, using multiple features and functions, enabling you to be proactive at the best of times and promptly reactive at the worst (after all, accidents do happen).
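For the technically curious, the heart of such a scripted user journey can be sketched in a few lines of plain Python. Everything below (the step names, the slow-page threshold, the dummy actions) is a made-up illustration, not AlertBot’s actual scripting API:

```python
import time

def run_journey(steps, slow_threshold_s=2.0):
    """Run a scripted user journey: a list of (name, action) steps.

    Each step is timed, and any step slower than the threshold is
    flagged, mimicking a "Slow Page" warning.
    """
    report = []
    for name, action in steps:
        start = time.perf_counter()
        action()  # in a real script: load a page, fill a form, click a button
        elapsed = time.perf_counter() - start
        report.append({"step": name,
                       "seconds": round(elapsed, 3),
                       "slow": elapsed > slow_threshold_s})
    return report

# Dummy actions stand in for real browser interactions.
journey = [
    ("load homepage", lambda: time.sleep(0.01)),
    ("fill tracking form", lambda: time.sleep(0.01)),
    ("submit and view results", lambda: time.sleep(0.01)),
]
result = run_journey(journey)
```

In a production monitoring setup, each dummy action would be a real browser interaction and the report would feed an alerting pipeline.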

]]>
AlertBot Showdown: FedEx vs UPS https://www.alertbot.com/blog/index.php/2017/05/23/alertbot-showdown-fedex-vs-ups/ Tue, 23 May 2017 18:51:27 +0000 https://alertbot.wordpress.com/?p=414


A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying rectangular shipping boxes. Text reads "AlertBot Showdown: FedEx vs UPS" with the word SHOWDOWN very large at the bottom.

 

One of the most appealing things about ordering items online is receiving packages in the mail. Not only is it convenient for the fruits of your shopping toils to be brought directly to your door, but you can do your shopping from anywhere at any time of the day or night (and in your pajamas if you so desire). Two well-known, worldwide services that nearly everyone who has sent or received a parcel has used are UPS and FedEx. Both services are easily accessible for sending packages, and both are frequently used for receiving them. Both services also have websites that enable users to track their packages (if they’ve been given a tracking number), while also helping to provide resources for sending them out.

For our third Showdown, we set out to track the performance of these two services, trucking along until we could wrap up the results for delivery to you.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both parcel service sites for three weeks, spanning from March 27, 2017 to April 17, 2017. Not surprisingly, the performance proved to be reliable for both sites. Neither service’s site went down, but one did prove to perform a little faster than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

FedEx’s website experienced not a single, solitary failure event. At the very worst, it may have experienced some slight slowness for a short period of time, but it didn’t affect their overall reliability results. (FedEx 10/10)

UPS’s website was a slightly different story, though there were no failure events or periods of actual downtime either. The most UPS’s site saw were a handful of warnings that the site was performing a little slower than usual, and a little slower than the average expected load time. These periods of minor slowness only lasted for about 3 to 5 minutes each. (UPS 9.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Both websites have pretty basic homepages, so the load times for customers should be fairly quick (even on a slow internet connection) if the sites aren’t experiencing any server issues.
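For the curious, the summary figures quoted in these Showdowns (average, fastest and slowest load times) boil down to simple statistics over the raw samples. Here’s a minimal sketch in Python, using made-up sample values rather than our actual measurement data:

```python
from statistics import mean

def summarize_load_times(samples_seconds):
    """Boil raw homepage load-time samples (in seconds) down to the
    three figures reported in a Showdown: average, fastest, slowest."""
    return {
        "average": round(mean(samples_seconds), 2),
        "fastest": min(samples_seconds),
        "slowest": max(samples_seconds),
    }

# Hypothetical samples, purely for illustration:
summary = summarize_load_times([0.5, 0.7, 0.9, 2.1])
```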

FedEx’s site speed is fantastic, averaging less than 1 second on most occasions. Its fastest response time was recorded on Wednesday, April 5, 2017 at 0.5 seconds, while its slowest response time was on Monday, April 17, 2017 at just over 2 seconds (which is still very good). (FedEx 10/10)

UPS was also pretty good, but their best response time was about the same as FedEx’s worst response time. UPS’s best response time was 2 seconds on Tuesday, April 11, while their worst was on Monday, April 10th with just a hair under 6 seconds. The standard used to be 7 seconds, but these days, users expect sites to load in roughly 2 seconds.  (UPS 8/10)

Alertbot speed test green performance bar chart


Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

It’s also interesting to note that in most of these tests we’ve done for these Showdowns so far, California seems to frequently come out on top when it comes to website speed. With that said, FedEx seemed to perform best in California – at just under half a second, and performed the “worst” in Virginia, which still averaged around an impressive 1.1 seconds.  (FedEx 10/10)

UPS also saw its best results in California, but clocked in at around 1.4 seconds there. Texas returned the slowest results, however, averaging around 5.6 seconds. (UPS 8/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we went through the motions of ordering tickets for a recent movie on MovieTickets.com and Fandango.com. For this evaluation of FedEx and UPS, we’ll see how the experience of tracking a real package goes.

For each package tracking process, we started with having the tracking number copied onto our clipboard and then typed the URL of the test site into our browser.

From the point of typing www.FedEx.com into our Firefox browser, selecting the tracking tab at the top, pasting the tracking number into the search field on the left sidebar and clicking “Track,” it took only 15 seconds to get to the tracking results. That’s really fast! We then tried the same process again using the Google Chrome browser; this time the “region” needed to be selected first, and it took only a second longer to complete!

Now, from the point of typing www.UPS.com into our Firefox browser, selecting the region, pasting the tracking number into the search field on the left sidebar and clicking “Track,” it took roughly 22 seconds to get to the tracking results. That’s not bad, but it’s clearly slower than our FedEx experience. We then tried the same process again using the Google Chrome browser and it took an impressive 12 seconds to complete!

So, with all things considered, with the goal being to track a package as quickly as possible, here are the Usability scores:

(FedEx 10/10)        (UPS 9/10)


Final Verdict

It’s a close match, to be honest, but we’d have to say that FedEx.com still outperformed UPS.com on the speed factor, delivering not just highly anticipated parcels to its customers, but swift and reliable website performance as well.

So, for the third AlertBot Showdown, the site that gets to join the ranks of previous winners Apple and Fandango is…

 

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "FedEx.com"

]]>
When Does Most Website Downtime Occur? https://www.alertbot.com/blog/index.php/2017/03/27/when-does-most-website-downtime-occur/ Mon, 27 Mar 2017 17:57:01 +0000 https://alertbot.wordpress.com/?p=364 Photograph of a man looking distressed with six arms coming off of him, each holding a different item. The items include a planner book, a calculator, a magnifying glass, a laptop, an abacus, and a marker.

When Does Most Website Downtime Occur?

To become competitive in the global market, it’s crucial for your business to have a strong online presence. One of the best ways to ensure this is to have a user-friendly business website that is accessible ’round the clock. And if your customers rely heavily on your website, you know that any amount of time your site is down could be rather costly.

Frankly, website downtime is inevitable. Even the big online giants like Microsoft, Google, Facebook, eBay, YouTube, Amazon and CNN have experienced website downtime at some point.  However, the good news is that you can mitigate the risk and lower the length of time your site remains inactive if you are familiar with some of the likely causes of website downtime.

Let’s dig a little deeper to find out the common causes of site downtime:

§  Server Overload

Server overloads occur when a big wave of online traffic overwhelms a server. Now, there are two situations when this happens. First, it happens if your site is being hosted on a shared server. Resources on shared servers are limited and they have to be stretched to support high volumes of traffic and site-processing needs, which can cause server overload. As a result, your site may be inaccessible to users for hours.

Second, server overloads may also happen on major online shopping days, like Black Friday and Cyber Monday, or any other occasion for that matter, when you have significant discount deals and special sales running on your website. Such deals draw in heavy traffic, thus increasing the chances of server overload and site downtime.

§  Hardware Failures

Server and network failures can bring a website to a screeching halt in no time flat. This could be caused by things like hard drive failures, power supply failures, circuit board failures, or cabling failures. It can also be caused by more troubling failures like data center infrastructure failures or network peering failures.

§  Webmaster Errors

Your business may experience downtime because of errors caused by the site’s webmaster. For example, your site may not be accessible to your audience if your webmaster forgets to renew the site’s hosting contract or domain name.

§  Coding Errors

Some common coding errors are incorrect syntax, infinite loops and typos. All of these can exhaust the server’s resources and yield 500 (Internal Server Error) responses, resulting in website downtime.

§  Cyber Attack

With the surge in cyber crime, you need to make sure that your website is well-protected from cybercriminals, hackers and viral infections. Cybercriminals know how to hijack websites and redirect your site visitors to other websites or expose them to malicious content.

All of this can result in lengthy website downtime, which can be detrimental to your business sales, profits and reputation. And that is definitely something that no business owner wants! One way to help prevent cyber attacks is to keep your IT team, and those directly responsible for the health of your website and server, in the know about the latest cyber threats.

§  Distributed Denial of Service Attacks (DDoS)

Distributed Denial of Service (DDoS) attacks can also bring your online business to a standstill. DDoS attacks are planned: heavy traffic is deliberately directed from many different sources to overload servers and, in some cases, crash them entirely.

§  Natural Disasters

Website downtime may also occur when your data center is hit by a natural disaster such as a flood, hurricane, earthquake or fire.

§  Planned Downtime or Server Maintenance

Lastly, if you have a dedicated server, you may need to go offline for server maintenance. This usually involves upgrading hardware components, drivers, operating systems, firmware, and even software applications. With these planned occurrences, you can alert customers ahead of time to the planned outage, which can help combat and minimize the effect it may have on your business.

Knowing the reasons for, and causes of, website downtime is crucial as it will help you devise and implement the right mix of strategies to overcome and avoid it.
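To make one of those strategies concrete, here is a minimal sketch (in Python, with hypothetical thresholds rather than any official standard) of how a single monitoring probe of a page might be classified as healthy, slow or failed:

```python
def classify_probe(status_code, elapsed_seconds,
                   slow_threshold=2.0, timeout=30.0):
    """Classify one check of a page using hypothetical thresholds:
    no response (or one slower than `timeout`) is a failure, HTTP
    4xx/5xx responses are failures, anything over `slow_threshold`
    seconds draws a slow-page warning, and the rest is healthy."""
    if status_code is None or elapsed_seconds >= timeout:
        return "FAILURE (timeout)"
    if status_code >= 400:
        return f"FAILURE (HTTP {status_code})"
    if elapsed_seconds > slow_threshold:
        return "WARNING (slow page)"
    return "OK"

# A healthy response, a sluggish one, and an overloaded server:
checks = [classify_probe(200, 0.8),
          classify_probe(200, 5.6),
          classify_probe(503, 1.2)]
```

Real monitoring services layer scheduling, retries and alerting on top of a check like this; the classification step itself is this simple.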

AlertBot’s external website monitoring service exists to help businesses like yours identify and fix website errors when they happen and hopefully prevent future downtime. Visit www.AlertBot.com for more information and to sign up for a free, no-risk trial.

]]>
AlertBot Showdown: Fandango.com vs MovieTickets.com https://www.alertbot.com/blog/index.php/2017/03/06/alertbot-showdown-fandango-com-vs-movietickets-com/ Mon, 06 Mar 2017 11:37:00 +0000 https://alertbot.wordpress.com/?p=371 A graphic with a yellow starburst in the center and two robots charging towards each other. Text reads "AlertBot Showdown: Fandango vs MovieTickets.com" with the word SHOWDOWN very large at the bottom. Kernels of popcorn are scattered around the image.
Last fall, AlertBot debuted its first website “Showdown,” pitting cell phone giants Apple and Samsung against each other in an epic web performance death match (of sorts) to see whose website performed the best. (For those results – and to see who won – check out that blog here.)

For our second Showdown, we decided to grab an oversized bucket of popcorn, an unreasonably large cup of soda, and a pair of cheap, plastic 3D glasses and plopped down into the comfiest of chairs to evaluate two of the premiere movie ticket buying sites: Fandango and MovieTickets.com.

If you’re a movie buff who loves a night out reclining in front of ceiling-high silver screens to watch the latest Hollywood has to offer, chances are you’ve purchased tickets online before. And what would be more frustrating than website failure while you’re trying to combat the masses to secure your entry into an anticipated film’s opening night?

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites from December 26, 2016 to January 16, 2017. Not surprisingly, the performance proved to be pretty good overall, although one of the sites experienced some pretty significant issues on one of the days. Both sites saw some minor “Slow Page” warnings, but MovieTickets.com took a hit right after Christmas with a dreaded “Server Too Busy” error, meaning their website couldn’t withstand the weight of the traffic it was getting.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for the causes of those failures.

Fandango’s website experienced not one single failure event. The worst it got for Fandango in this time period was a handful of minor “Slow Page” warnings on Dec. 29th and after the new year on the 2nd and 5th. (Fandango 10/10)

Meanwhile, MovieTickets.com experienced what AlertBot considered to be 13 failure events. While most were brief, 3- to 5-minute slow-page events, on Dec. 27th (a Tuesday) the site saw a significant outage, reporting a “Server Too Busy” error for nearly 6 hours! Users visiting during that stretch were likely met with the dreaded error message in their web browser instead of the actual site, which would have been super frustrating (especially if you’re trying to order tickets in a jiffy). (MovieTickets.com 7/10)

 

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Both websites have pretty busy front pages, but both tend to change often and feature videos or Flash-driven ads and some graphic-heavy content – all of which can really compromise a website’s load time.

Fandango’s speed is solid, averaging less than 2 seconds. Its fastest day was Friday, Jan. 13, 2017 at 1.7 seconds and its slowest was Thursday, Dec. 29, 2016 at just over 2 seconds. (Fandango 9/10)

MovieTickets.com didn’t fare quite as well, unfortunately. On its best day, Tuesday Jan. 3, the front page took almost 4 seconds to load. On its worst day, Dec. 27 (also a Tuesday), it took almost 7 seconds to load – well short of the online industry’s current load-time standards. (MovieTickets.com 7/10)

 

Alertbot speed test green performance bar chart


Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. Fandango saw the fastest load times in California (I suppose that makes sense, given the movie industry being centered there), with the slowest happening in Texas. Even the slower states held up well, with Washington, Virginia and Texas all seeing load times in the 2-to-3-second range. (Fandango 10/10)

For MovieTickets.com, it’s a different story. We already know they struggled with speed, but the question here is – where? California is also the best location for MovieTickets.com, with load times around 2.7 seconds. The worst, again, is Texas, with almost 6.5 seconds. Florida and North Carolina also performed well, while Washington joined Texas as one of the slower locations. (MovieTickets.com 8/10)

Alertbot performance by region green bar chart


Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we chose the task of ordering the latest cellphone from the respective sites of Apple and Samsung. For these sites, we’ll see how the experience of ordering movie tickets compares to one another.

Starting with selecting a movie to buy tickets for, we approached each site with the goal of ordering two tickets for the recently released The LEGO Batman Movie.

The LEGO Batman Movie theatrical poster showing LEGO Batman characters running toward the screen.

By selecting a brand new film, it was easy to find the film on the homepage of Fandango.com and start clicking through to order tickets. From the point of typing www.Fandango.com into our Firefox browser, clicking on the LEGO Batman Movie poster, putting in our zip code, selecting the next available time and number of tickets, it took roughly 40 seconds to get to the Fandango checkout. That’s not bad. If this were for real, we would have probably spent extra time checking our show time options a bit more, choosing 2D over 3D, etc. But for this task, we figured it’s best to keep it simple. The whole process took about 4 clicks of the mouse with a little typing to put our zip code in.

For MovieTickets.com, we found the experience to be mostly the same. Except, when we put our zip code in, it seemed like MovieTickets.com gave more options right off the bat. Fandango suggests the closest theater for your zip code and the first batch of showings it finds (in this case, it’s a 3D showing), while MovieTickets gives you the full list of showings and format options. Our experience felt more thorough with MovieTickets, getting more choices right away. But we also feel like more options delayed our browsing experience because we had to read and think more. Still, the browsing time for MovieTickets.com – to complete the same process – was the same 4 clicks and around the same 40 seconds.

We tried both again from Google Chrome, and, even accounting for our newfound familiarity with the process, we found both to take just 30 seconds each this time. They’re easy sites to navigate and their load times were swift.

So, with all things considered, with the goal being to get tickets for one of the most recent films released ordered, here are the Usability scores:

(Fandango 9/10)      (MovieTickets.com 10/10)


Final Verdict

We’d say Fandango won by quite a bit, given its better web performance over MovieTickets.com, but we enjoyed the usability of MovieTickets.com over Fandango’s. The fact that Fandango doesn’t present show time options upfront is a little unfortunate.

Still, one cannot ignore good web performance, and we have to hand it to Fandango for achieving impressive site speed and reliability. So, with that said, the result of the second AlertBot Showdown is…

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Fandango"

]]>
How Much Impact Does an Hour of Website Downtime Have on a Business? https://www.alertbot.com/blog/index.php/2017/02/27/how-much-impact-does-an-hour-of-website-downtime-have-on-a-business/ Mon, 27 Feb 2017 11:00:27 +0000 https://alertbot.wordpress.com/?p=353 An illustration of a business man with a briefcase running away from a shadowed monster with red eyes and red graph arrows coming from its head and mouth that are pointing downward. The background is a yellow grid with a couple money symbols.

How Much Impact Does an Hour of Website Downtime Have on a Business?

So, your business website is offline again and your IT team has sprung into action, trying to pinpoint the issue and fix it as soon as possible. Sure, it’s good that your IT experts are handling the problem responsibly, but do you know how much money your business may have lost during your website’s downtime? Well, if you are a major player in the ecommerce industry, chances are you could have lost millions of dollars by now. And that is not an overstatement.

Like it or not, even an hour of downtime can do a great deal of damage to your online business. Did you know that in 2014, Google experienced an hour of downtime, reportedly caused by a virus, that affected Gmail, Google+ and Google Drive? That single hour drove Google’s stock down 2.4 percent.

But that’s not all! Amazon, the e-shopping giant, once experienced 2 hours of downtime, presenting site visitors with cryptic HTTP error messages. In just those 2 hours, Amazon lost an estimated $3.48 million. That’s huge!

So, if you wish to estimate the true cost an hour of website downtime has to your business, you’ve come to the right place. Here are some of the more important variables to consider when calculating this cost:

§  Impact on Business Sales

To figure out exactly how much an episode of website downtime costs in terms of sales lost, you’d need to determine what your average profits per minute are during the time period the downtime occurred. You can then multiply that average profit per minute times the number of downtime minutes to determine your total lost sales profits. If the downtime occurs at 2 in the afternoon, for example, it is most likely going to cost your business more sales than if the outage had happened at, say, 2 in the morning, when web traffic is typically much lighter.
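Put as a formula, it’s simple multiplication. The sketch below uses made-up dollar figures purely for illustration:

```python
def lost_sales(avg_profit_per_minute, downtime_minutes):
    """Estimated profit lost to an outage: the average profit per
    minute (for the time of day the outage occurred) multiplied by
    the number of minutes the site was down."""
    return avg_profit_per_minute * downtime_minutes

# A one-hour afternoon outage at a hypothetical $250/minute:
afternoon_loss = lost_sales(250.0, 60)   # 15000.0
# The same hour at a quieter 2 a.m. rate of $20/minute:
overnight_loss = lost_sales(20.0, 60)    # 1200.0
```

The time-of-day rate is the key input: the same sixty minutes of downtime costs wildly different amounts depending on when it happens.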

§  Damage Done to Your Business Reputation

Downtime (especially if it’s frequent or at a crucial time) can scar your business’s reputation, losing the trust and loyalty of customers in your brand. Just like many businesses, you too have invested good money and a great deal of time in brand building. Your time and money can go to waste if you experience downtime—even if it is for just an hour. When considering the true cost of your site’s downtime, it is important that you keep in mind the resources you’ll need to spend to repair your tainted brand image going forward.

§  Money Wasted in Marketing Campaigns

Another factor to consider when determining the cost is the money you have invested in your marketing efforts, like PPC (pay-per-click) campaigns. You need to figure out the amount of money that was spent on marketing while your site was experiencing downtime. This is important to calculate, because let’s face it – you literally didn’t reap any benefits from the invested money, because your site was inaccessible when prospects clicked on the PPC link or advertisement.

Prevention is Always Best!

Calculating the cost you might have incurred due to an hour of website downtime is essential, but there are precautions you can take to avoid unplanned downtime and keep your business up and running ’round the clock (and be a hero!). AlertBot is an intuitive web-based website monitoring service that can alert your team about website errors and slowness within seconds, and also help you keep track of your site performance. All of this is much needed to mitigate downtime issues significantly. Start the AlertBot 14-day free trial today!

]]>
Why is Website Performance Monitoring Necessary? https://www.alertbot.com/blog/index.php/2017/01/17/why-is-website-performance-monitoring-necessary/ Tue, 17 Jan 2017 11:00:04 +0000 https://alertbot.wordpress.com/?p=343 Photo of two hands holding a tablet horizontally with illustrations of graphs and icons floating off the face of the tablet.

Given that we live in a highly digitized world, websites, blogs and web-stores are now an essential component of any business and brand. While waiting for a site’s content to load can be annoying for a user, it can also be potentially disastrous for business.

That, however, is only one reason to monitor the performance of your website. Here are four more:

1.     Loss of Sales and Web-Traffic

First and foremost, businesses maintain websites and have web-stores to promote commercial growth. Now, imagine a situation where you’ve gone to a store and the service is impossibly slow. The salesmen and women are hardly making an effort to engage or help you and you just decide to take your business elsewhere. The same happens to a shopper when they visit a website that takes ages to load. Instead of making a sale, you lose web-traffic and potential customers. You can prevent this by monitoring how your website is performing.

2.     Potential Damage to Brand Image

Customers talk, and they are interested in what others like them have to say. While most brands depend on marketing ploys to promote sales, the importance of word-of-mouth advertising cannot be discounted. If you leave a bad impression on one customer, chances are that word will spread about it, tainting, if not tarnishing, your hard-earned reputation and brand image. Who wants that?

3.     Error Detection

Website performance monitoring is the best way to prevent errors. It’s all too common for ecommerce sites to hit a snag and run into trouble.  If your site is regularly maintained and monitored, you’ll not only be able to fix a problem sooner; you might even be able to detect it beforehand and prevent it completely.

4.     Quality Maintenance

Just as quality assurance is essential for a physical store, it’s equally important for a website and web-store. By using a performance testing and maintenance tool, software or application, you will be able to standardize and retain the quality of your website. Not only will that help preserve the website’s ranking on Google, it will also help drive online traffic. Google rankings are sensitive to even small changes in website speed and downtime, which is a big part of why websites are search-engine optimized in the first place.

So, if you’re even partially convinced that your website needs performance monitoring, why not start the AlertBot 14-day free trial, today?

]]>
Press Release: AlertBot Launches New Blog Series ‘Website Showdowns’ https://www.alertbot.com/blog/index.php/2016/12/13/alertbot-launches-new-blog-series-website-showdowns/ Tue, 13 Dec 2016 19:54:00 +0000 https://alertbot.wordpress.com/?p=339 AlertBot Launches New Blog Series: “Website Showdowns”

Allentown, PA / December 13, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading TrueBrowser®-based web application monitoring solution, AlertBot, is pleased to announce the launch of a new series of AlertBot blogs the team has dubbed ‘Website Showdowns.’ AlertBot’s Showdown blogs will feature monitoring results from competing websites, showcasing AlertBot’s TrueBrowser® technology at work, which combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions.

The AlertBot Showdown blogs will evaluate each website’s performance based on four categories – reliability, speed, geographical performance and usability – complete with time-based trends and detailed analytics.

This month’s scrimmage pits rivals Apple.com against Samsung.com. With two titans of industry like these going head to head, the results were, for the most part, not unexpected.  Read the full report here.

AlertBot continues to remain on the cutting edge of website performance. With 85 Global Test Locations operating over 7 Internet Backbones developed during the past decade, AlertBot has established their reputation in real-world private industry applications. AlertBot serves over 10,000 users spanning 6 continents worldwide with 200 million website checks per month. Their Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine.

About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.


About InfoGenius.com, Inc.:

Founded in 1999 by a group of engineers, InfoGenius prides itself on building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.

]]>
AlertBot Now SAM Certified for Government Website Monitoring https://www.alertbot.com/blog/index.php/2016/09/22/alertbot-now-sam-certified-for-government-website-monitoring/ Thu, 22 Sep 2016 17:21:16 +0000 https://alertbot.wordpress.com/?p=255 AlertBot Now SAM Certified for Government Website Monitoring

Allentown, PA / September 21, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading TrueBrowser®-based web application monitoring solution, AlertBot, is pleased to announce that it is now SAM certified and looking to grow its relationships with Federal and State Governments. TrueBrowser® technology combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions. Downtime of any length can be costly for any website or online business; AlertBot’s Website Monitoring Service uses TrueBrowser® technology to launch real web browsers and test websites inside those browsers, including financial transactions conducted on government websites, login pages and other mission-critical pages. Learn More about Trusted Government Website Monitoring.

“With 85 Global Test Locations operating over 7 Internet Backbones developed during the past decade, AlertBot has established their reputation in real-world private industry applications; this level of website testing and monitoring is both proven and ready for Public Service Deployment as Federal and State Agencies rollout ‘Next Gen’ consumer style, interactive websites,” states Pedro Pequeno, President of InfoGenius.com, Inc. He continues: “We’re looking forward to showcasing AlertBot’s TrueBrowser® technology and capabilities to Governmental Agencies throughout the country and help them validate their client usage.”

AlertBot serves over 10,000 users spanning 6 continents worldwide with 200 million website checks per month. Its Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine. This data gives government organizations, including the U.S. Department of Energy, Virginia state government, NOAA, U.S. Marine Corps, and Smithsonian Institution, the information they need to ensure their applications are always running error-free and providing a quality user experience. AlertBot has registered to do business with Federal and State Agencies using the following registrations: DUNS: 624818493; CAGE: 6QP16; NAICS: 518210, 454111, & 334290.

About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.

About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.

###

]]>
AlertBot Celebrates 10th Year of Website Monitoring https://www.alertbot.com/blog/index.php/2016/04/11/alertbot-celebrates-10th-year-of-website-monitoring/ Mon, 11 Apr 2016 10:50:35 +0000 https://alertbot.wordpress.com/?p=185 AlertBot Logo

Allentown, PA / April 11, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading real-time web application monitoring solution, AlertBot, celebrates a decade of website and server monitoring. Downtime of any length can be costly for any website or online retailer; AlertBot’s Website Monitoring Service provides best-in-class site monitoring using its TrueBrowser® technology to launch real web browsers and test websites inside those browsers, including mission-critical financial transactions conducted on e-commerce-driven websites, login pages and other mission-critical pages. AlertBot serves over 10,000 users with 200 million website checks per month using its network of over 100 locations, spanning 6 continents worldwide.

“AlertBot measures every facet of a website to help our clients improve the user experience; our testing helps clients make adjustments that result in measurable gains – for instance, a major e-commerce player measured gains of $1.4 million for every second of response time their platform improved – that small improvement netted them $18 million in revenue!” states Pedro Pequeno, President of InfoGenius.com, Inc. He continues: “Over the past 10-years, AlertBot has been deployed and proven in countless real-world applications by some of the leading names in the e-commerce space.”

AlertBot’s Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine. This data gives businesses including Blue Cross/Blue Shield, Chrysler, Mutual of Omaha, Sony, Microsoft & Dell Computing the information they need to ensure their applications are always running error-free and providing a quality user experience.

An illustration showing a robot with a party hat and holding a birthday cake. Text reads "AlertBot Celebrates 10 Years"

About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.

About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.

]]>
Don’t Let Third-Party Code Wreck Your Website! https://www.alertbot.com/blog/index.php/2016/03/31/dont-let-third-party-code-wreck-your-website/ Thu, 31 Mar 2016 17:44:44 +0000 https://alertbot.wordpress.com/?p=182 When you’re evaluating a website’s performance, you may find that several culprits could come into play that can bog down your website’s load time. Today, we’re going to take a look at one of the biggest – if not the biggest – causes of web performance problems: third party code.

If you’re not quite sure what that is, third-party code is any code provided by another company or website that you plug in or embed to add a service to your website. For example, you may have a web stats tracking code, a banner ad rotator, or a few lines of code that drop your Twitter or Instagram feed onto your website. These pieces of code are considered third-party code since they’re provided by another source.


Some of the problems that this kind of code can cause may be:

  • Slow page load times
  • SSL errors (there could be a non-secure component in the code)
  • Unexpected JavaScript errors of various kinds
  • Failure to load some of your website content
  • Inaccurate stats tracking

The case of inaccurate stats is a particularly interesting one that most people don’t consider. Problems with third-party code can render your website’s stats unreliable if the stats code is not fully loading. When this happens, you may be getting only partial information about your visitors, or no information at all. If you make business decisions based on those stats, you may be acting on misinformation.

When third-party code causes slow page load times or loading errors, it affects your visitors’ experience on your website. Unhappy visitors may choose not to buy from you and oftentimes won’t ever return to your website.

So what can you do in this situation? First off, you’ll want to diagnose the problem to make sure it is indeed the third-party code causing the trouble. AlertBot is an excellent service for finding out what is causing a bottleneck in your load time.
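To make the diagnosis step concrete, here is a minimal sketch of the kind of check a waterfall chart automates: scanning per-resource timings and flagging slow third-party entries. The entry format, the hypothetical domain `www.example-store.com`, and the 1-second threshold are illustrative assumptions, not AlertBot’s actual data model.

```python
# Sketch: flag slow third-party resources in waterfall-style timing data.
# The data layout and threshold below are assumptions for illustration.

FIRST_PARTY = "www.example-store.com"  # hypothetical site being monitored

def slow_third_party(entries, threshold_ms=1000):
    """Return third-party resources whose load time exceeds threshold_ms."""
    return [
        e for e in entries
        if FIRST_PARTY not in e["url"] and e["duration_ms"] > threshold_ms
    ]

# Hypothetical waterfall data: one first-party page plus two embedded
# third-party scripts (an ad network and a stats tracker).
waterfall = [
    {"url": "https://www.example-store.com/index.html", "duration_ms": 420},
    {"url": "https://cdn.adnetwork.example/banner.js",  "duration_ms": 2300},
    {"url": "https://stats.tracker.example/t.js",       "duration_ms": 150},
]

for e in slow_third_party(waterfall):
    print(f"Slow third-party resource: {e['url']} ({e['duration_ms']} ms)")
```

In this toy data, only the ad network script exceeds the threshold, which is exactly the kind of culprit a real waterfall chart surfaces at a glance.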

Once you know for sure that it is the third party code creating the issue, here are a few things you can do to resolve problems with third-party code:

  1. Ask the third-party provider to resolve the problem – The solution may be as simple as contacting the third party, informing them of the issue(s) you’re having and asking them to fix it. It’s possible that they’re not even aware there’s a problem.
  2. Remove the third-party code altogether – This may be the quickest and easiest solution, but obviously it doesn’t solve the problem if you really need the code on your site.
  3. Look for other third-party code providers – This may be your best course of action. While it can be time consuming to search for viable solutions, if you need the code, trying something else out could be the most sensible option. And if you can find reviews on the solution from other users who have tried it out, that’s even better.
  4. Move to a purchasable/installable application – Free third-party code is great, and just dropping in a piece of third-party code is a nice time saver, but sometimes taking the high road and paying for an installable solution (with support) could be the best option for your business, especially when your own customers or clients are involved.
  5. Ask a web developer to look at it – This might not be possible for every site owner, but it’s an especially good option if your company has a programming department. There’s a good chance that just moving the code to a different location in your page’s HTML (or onto a different page altogether) could drastically improve the situation.

So, as you can see, third-party code can greatly impact your website. And if you’re experiencing web performance issues and you’re utilizing third-party code, there’s a pretty good chance that code may be the catalyst for those problems.

Sign up for a risk-free trial of AlertBot today and start down the path to better performance for your website. AlertBot can track the performance of all your third-party code and will let you know when it’s causing problems.

]]>
Lessons from the Dentist for Your Website’s Performance https://www.alertbot.com/blog/index.php/2016/02/25/lessons-from-the-dentist-for-your-websites-performance/ Thu, 25 Feb 2016 14:10:47 +0000 https://alertbot.wordpress.com/?p=173 Every successful website needs to undergo tests and retests (and re-retests) to ensure that the site performs well for all visitors. It’s one thing to test how your site looks in different web browsers, but what about how it actually performs?

Running an AlertBot waterfall chart is just one example of something IT and website managers can do to see how the site is performing with load times. Before you release your site’s new design, complete overhaul or its grand debut, it’s wise to give your site a thorough testing first. A simple test with AlertBot’s waterfall chart can reveal images or third party code that might be clogging up your load time — or many other possible hang-ups. Your site might look fine and dandy, but if the page is taking too long to load in this fickle web-browsing age we’re living in, it could be very costly for your business.

(Above is a real, abbreviated example of AlertBot’s waterfall charts)

In the end, it’s really like a visit to the dentist; they often give you tips and guides on how to prevent cavities and other oral problems, while helping you maintain good oral hygiene. Maybe using mouthwash or flossing daily will help keep your gums healthy and your teeth strong. Likewise, with the right web performance tools and tests, you can ensure quality conversions and hopefully prevent any possible decay in your site’s performance.

See for yourself with AlertBot’s completely free 14 day trial!

]]>
Black Friday 2015 Web Performance Report https://www.alertbot.com/blog/index.php/2015/11/30/black-friday-2015-web-performance-report/ Mon, 30 Nov 2015 21:39:02 +0000 https://alertbot.wordpress.com/?p=150 Today, Cyber Monday, is essentially a glorified online extension of the annual brick-and-mortar post-Turkey Day national shopping binge, so let’s take a look at how some of the top online retailers performed over the holiday weekend.

While websites like Walmart, Fanatics and QVC experienced a couple of several-minute outages on Thanksgiving Day, one of the sites that seemed to struggle the most on Black Friday this year was the online destination for department store Neiman Marcus. The site even experienced a two-hour outage in the morning.

Black and white graphic of a twisted, bent shopping cart in white on a black background. Text reads "Black Friday 2015"

Second only to NeimanMarcus.com, however, was online tech retailer Newegg.com, which experienced some slow page load times, no doubt due to the heightened traffic. Finally, Staples.com also experienced some short outages, but nothing more than a few minutes each.

Through Saturday and Sunday, it was more of the same for Neiman Marcus, Staples and Newegg, with Walmart seeing a few hiccups and Shutterfly.com experiencing a 45-minute outage due to its servers being overloaded with traffic. Sony’s PlayStation Network also experienced some significant downtime on Saturday, which affected its online store as well.

Downtime of any length can be costly for any online retailer. According to this article by Evolven.com, “The average cost of data center downtime across industries was $5,600 per minute.” Clearly, that adds up quickly – especially on a major shopping day.
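To put that per-minute figure in perspective, here is a rough back-of-the-envelope calculation using the industry-average number quoted above. It is purely illustrative: actual downtime cost varies enormously by business, and the two-hour figure refers to the Neiman Marcus outage described earlier.

```python
# Back-of-the-envelope downtime cost using the industry-average figure
# quoted above ($5,600 per minute). Illustrative only; real costs vary
# widely by business and by time of year.

COST_PER_MINUTE = 5_600  # USD, average across industries

def downtime_cost(minutes):
    """Estimated cost of an outage lasting the given number of minutes."""
    return minutes * COST_PER_MINUTE

# A two-hour outage, like the one Neiman Marcus saw on Black Friday morning:
print(f"${downtime_cost(120):,}")  # -> $672,000
```

By this average, even a two-hour outage runs well into six figures, before accounting for Black Friday traffic levels.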

With AlertBot’s monitoring services, not only can you be alerted the moment your site experiences an outage or slow load times, but you’ll be able to use the AlertBot charts and reports to find potential hang-ups and future problems that will result in unnecessary downtime.

Give AlertBot a try with our totally free trial period and start seeing how AlertBot can look out for your business to help you prevent serious financial loss and online disasters.

]]>
Get Your Website Ready For Holiday Traffic https://www.alertbot.com/blog/index.php/2015/09/17/get-your-website-ready-for-holiday-traffic/ Thu, 17 Sep 2015 16:52:36 +0000 https://alertbot.wordpress.com/?p=140 A graphic showing a computer monitor with a cracked screen with fragments flying around. Text reads "Black Friday"

Get Your Website Ready For Holiday Traffic

It’s that time of year again. As we say farewell to summer and prepare for the coming of autumn next week, online retailers are faced with one harsh reality: Black Friday is a mere two months away. And while that may seem like a long way off to some, now is really the time to prepare. Just as any brick-and-mortar retailer needs to have its store ready to go with employees on hand to wrangle the shopping masses, website owners need to make sure their sites are tuned up and ready for an influx of traffic.

If you’re feeling pretty confident that you’re ready and that this warning may seem premature or unnecessary altogether, let’s take a moment to spotlight last year’s Black Friday festivities and pitfalls.

The biggest name to have experienced major website failures last November was electronics retail chain Best Buy. Issues were recorded and reported throughout the day on Black Friday, sending social media abuzz with chatter and complaints about the site’s performance—or lack thereof.

Best Buy error page with an illustration of a wreath with a bow

Best Buy wasn’t the only one affected, however. Computer company HP’s webstore also experienced failure, while in the UK, online stores Currys (electronics), Argos (department store) and Tesco (groceries) all went down as well.

So what can we glean from this?

If you’re an online retailer, you’re probably already thinking about the holidays and getting prepared, but now is the most crucial time to not only make sure you have reliable website monitoring, but to evaluate your website’s performance so you can make improvements before the big online sale days. And you’re in luck – AlertBot can assist with your performance evaluation and help you rest assured that your site will perform better in time for the holidays. Try it out for free with our 14-day trial.

]]>
Use AlertBot To Monitor The Competition https://www.alertbot.com/blog/index.php/2015/08/25/use-alertbot-to-monitor-the-competition/ Tue, 25 Aug 2015 18:23:45 +0000 https://alertbot.wordpress.com/?p=137

Use AlertBot To Monitor The Competition

When most of us think of “website monitoring,” we usually think about how it applies to our own websites. However, website monitoring really has more uses than we may realize or consider.

Truth be told, while we typically use AlertBot to keep an eye on our own websites and pinpoint problems that need fixing, we can set up monitors for any site—not just our own. This means we can actually monitor the competition as well.

The upside to monitoring the competition is that you can get an idea of how a competing website is performing from around the world, and gauge whether your website is competing as well in those regions. Furthermore, you can see how long their page load times are and find out which features on their website may be slowing them down. It could help you figure out what to avoid in your own design, for example, or where to focus to do better in your market.

Photograph of rooftop spyglass

You can test-drive this concept with our free, risk-free 14-day trial. Try it out today and start gathering actionable data on your website – and your competition’s!

]]>
Are You Testing Your Site’s Performance With Different Browsers? https://www.alertbot.com/blog/index.php/2015/07/24/are-you-testing-your-sites-performance-with-different-browsers/ Fri, 24 Jul 2015 17:45:20 +0000 https://alertbot.wordpress.com/?p=128 Are You Testing Your Site’s Performance With Different Browsers?

Web developers know browser compatibility can be a real headache; however, it doesn’t just affect web developers. Recently, one AlertBot customer received an alert that their site had failed. When investigating the failure, they found that their site was not actually completely down; rather, AlertBot had discovered that the site had stopped working in just one browser. Their website was working fine in Chrome, Internet Explorer (IE), Safari, etc., but had stopped loading in Firefox. Thanks to AlertBot’s TrueBrowser™ Monitoring options, which allowed them to test their website in multiple browsers, they were able to quickly identify and fix the problem with that one browser.

For web developers, it’s easy to simply open your site in each of the popular web browsers to check it for compatibility, find that it’s working smoothly, and then never follow up on it again. However, websites, servers and backend resources change often. AlertBot’s TrueBrowser™ Monitors can be set up to check your site regularly with each of the popular web browsers and make sure nothing has changed. So, for example, with AlertBot, you can set up one Test Scenario to check your website with Chrome, another to check it with Firefox, another with IE, and so on. This way, you’ll know the very instant your site stops functioning in one of these popular browsers.

Browser logos circled around the AlertBot logo

It’s also just an easy way to worry less about browser compatibility. Think about it: these days, web browsers are constantly auto-updating to new versions and webmasters are constantly updating their websites. That’s a lot to keep up with (testing your site’s performance in each browser every time something changes), so having something as simple as an automatic browser monitor frequently testing your site’s reliability is one less worry for website owners.

Take the AlertBot TrueBrowser™ Monitor for a spin with a completely free trial and let us start watching your back for you!

]]>
What’s Google Up To With Recently Spotted “Slow” Icon? https://www.alertbot.com/blog/index.php/2015/02/27/whats-google-up-to-with-recently-spotted-slow-icon/ Fri, 27 Feb 2015 22:50:22 +0000 https://alertbot.wordpress.com/?p=99 Earlier this week, on Tuesday, Google+ user K Neeraj Kayastha discovered a new feature Google’s search engine may be getting ready to implement abroad: warning mobile users of potentially sluggish links before they click.

Neeraj posted screenshots from his personal Android browser showing a new red “SLOW” icon displayed next to links for YouTube and even a Google search result (scholar.google.co.in, to be exact). Today, we tried to replicate the same result on an iPhone, but were unable to bring up any “Slow” icons in our search results. (Comments on Neeraj’s report page seemed to reflect similar experiences.)

Screenshot of a mobile screen with Google search results

So what does this mean? It’s possible that Neeraj happened to stumble on a brief Google test of an upcoming search result feature, and if this is indeed on the horizon, website owners may want to do all they can to avoid that dreaded little scarlet label.

Should this feature roll out soon, now would be the ideal time to find a website monitoring solution for your business’s website to ensure visitors and new clients aren’t deterred by Google’s little warning.

Click here for a list of solutions and more info on how AlertBot can help.

]]>