AlertBot – The Official Blog
https://www.alertbot.com/blog/

Why Your Website Monitoring Solution Needs a Do-Not-Disturb Feature
Mon, 27 Mar 2023
https://www.alertbot.com/blog/index.php/2023/03/27/why-your-website-monitoring-solution-needs-a-do-not-disturb-feature/
[Header image: a "Do Not Disturb" sign hanging on a hotel-room doorknob, the room out of focus behind it; AlertBot logo in the lower right.]

It is so low-tech that Gen Z’ers and other digital natives may faint (or perhaps the avatars in their VR games will faint on their behalf) to learn that one of the greatest inventions in the history of our species is the humble do-not-disturb sign. Indeed, this magical placard is like having your very own private Gandalf shouting: YOU SHALL NOT PASS!

However, the glory of do-not-disturb is not limited to hotels, motels, and teenagers’ bedrooms. It is also a must-have feature in website monitoring solutions.

Why is a Do-Not-Disturb Feature So Important?

It does not take a Jeopardy! champion to know that do-not-disturb means (…wait for it…) “do-not-disturb” — which seems like the very last thing that organizations would want if there are site performance issues. On the contrary, the alarm bells via SMS, email and/or phone call should ring loud and clear. Or…maybe not.

In some cases, it makes perfect sense to pull individuals or teams off the notification list. For example:

  • An individual such as a CTO or SysAdmin is on leave (vacation, illness, personal, etc.).
  • A site is being tested, migrated or updated, and as such it will go offline for a specific and predictable period of time.
  • Per policy, certain individuals/teams do not get alerted to website performance issues during specific times (e.g. after hours, weekends, holidays, etc.).

What to Look For

A do-not-disturb feature is essential. But this does not mean that all website monitoring solutions that claim to offer this are in the same class. Here is what to look for:

  • An extremely easy-to-use dashboard. Advanced technology is dizzyingly complex. But if you need a PhD in computer science and a wall full of industry certificates to use the do-not-disturb feature, then keep looking. Easy does it.
  • The ability to create as few or as many do-not-disturb events as required, and to see all of them at-a-glance through the extremely easy-to-use dashboard.
  • The option to add notes so that everyone understands why a do-not-disturb schedule was created. For example: “we are migrating to new site on 2/15 and as a result the site will be down from 12:00am to 4:00am”.
  • The option to select individuals or teams who will not get notified.
  • The option to schedule the do-not-disturb as a one-time or ongoing event.
  • The option to select what sites the do-not-disturb applies to. For example: the Dev Team will not be alerted to uptime issues for, say, the “ACME Anti-Roadrunner Anvils” site from 12:00am to 5:00am on 2/15, but they will be alerted to performance issues for all other sites that are being monitored.
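To make the scheduling logic concrete, here is a minimal, hypothetical sketch — not AlertBot’s actual implementation — of how a monitoring system might decide whether to suppress an alert for a given team, site, and time. All names here (`DoNotDisturbWindow`, `should_alert`) are illustrative:

```python
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class DoNotDisturbWindow:
    """One DND rule: suppress alerts for these teams/sites in a time window."""
    teams: set
    sites: set                 # empty set = applies to all monitored sites
    start: time
    end: time
    note: str = ""             # e.g. "site migration, down 12:00am-4:00am"
    recurring: bool = False    # ongoing schedule vs. one-time event
    date: datetime = None      # required for one-time windows (date portion)

    def suppresses(self, team: str, site: str, when: datetime) -> bool:
        if team not in self.teams:
            return False
        if self.sites and site not in self.sites:
            return False
        if not self.recurring and self.date is not None \
                and when.date() != self.date.date():
            return False
        return self.start <= when.time() < self.end

def should_alert(windows, team, site, when):
    """Alert unless some DND window suppresses this team for this site now."""
    return not any(w.suppresses(team, site, when) for w in windows)
```

With a one-time window for the Dev Team on the "acme-anvils" site from 12:00am to 5:00am on 2/15, an outage at 1:00am that day stays quiet for that site only; alerts for every other site, and for every other day, still go out.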

The Bottom Line

 Without a versatile do-not-disturb feature, members of your organization will be very disturbed — because at certain times, they will be alerted to website performance issues that they cannot and should not do anything about. This is a waste of time and resources, and can trigger confusion and chaos (and, let’s face it, it’s not great for blood pressure levels, either).

AlertBot’s website monitoring solution has a built-in do-not-disturb feature that checks ALL of the boxes described above. Learn more with a free trial. There is nothing to download or install, no billing information required, and you will be 100% set up in minutes. Get started now.

A Closer Look at AlertBot’s Failure Reporting Feature
Tue, 21 Feb 2023
https://www.alertbot.com/blog/index.php/2023/02/21/a-closer-look-at-alertbots-failure-reporting-feature/
[Header image: a man wearing a headset at a computer, in front of a screen resembling a NASA mission-control terminal.]

The year was 1995. Michael Jordan returned to the NBA. Amazon sold its first book. Windows 95 unleashed the era of taskbars, long filenames, and the recycle bin. And when people weren’t dancing the Macarena, they were flocking to see Apollo 13 and hear Tom Hanks utter the phrase that would launch millions of (mostly annoying) impersonations: “Houston, we have a problem.”

Thankfully, the eggheads in space and the eggheads on the ground worked tirelessly (and apparently smoked a whole lot of cigarettes) to get the crew home. But it was the pivotal moment when the failure was first reported that triggered the spectacular problem-solving process. If it happened an hour — or maybe even a few minutes — later, then the outcome could have been tragic instead of triumphant.

Admittedly, the brave, intrepid professionals in charge of keeping their organization’s website online and functional DON’T have to deal with life-and-death scenarios. But they DO need to deal with problems that, if left unsolved, will significantly damage competitive advantage, brand reputation and sales (immediately if we’re talking e-commerce, and eventually if we aren’t). And that’s where AlertBot’s failure alerting feature enters the picture.

What is Failure Alerting?

Failure alerting is when designated individuals — such as a SysAdmin, CTO, CIO, CEO, and so on — are proactively notified when something goes wrong with a website, such as downtime, errors, slowness, or unresponsive behavior.

As a result, just like in Apollo 13, the right people can take swift, intelligent action to fix things before visitors/customers sound the alarm bell, or worse, head out the (virtual) door and go straight to a competitor without looking back.

Notification Methods

AlertBot customers can choose any or all of the following methods to notify team members of a website failure event:

  • Email
  • Text Message
  • Phone Call

For example, a SysAdmin could receive an email, a text message, and a phone call the moment something goes wrong.

Automatic Escalation

Now, if we were in NASA Mission Control circa 1970, someone wearing really thick horn-rimmed glasses would rise above the cigarette smoke and ask: What happens if the SysAdmin doesn’t receive the email, text message, and phone call? It’s a good question, and there is an even better answer: don’t worry about it.

AlertBot’s failure reporting feature can be configured to escalate the website failure warning if certain individuals don’t respond within a specific timeframe. For example, if a SysAdmin is indisposed for any reason (driving, sleeping, etc.), then after two minutes the alert can be pushed to another designated team member such as the CTO. And if the CTO doesn’t respond within two minutes, then the alert can be pushed to the CIO, and so on.

Ideally, the individual (or individuals) who receive the first alert see it immediately and take rapid action. But if they don’t or can’t, the alert is escalated accordingly. It is important to note that all of this happens automatically, which takes human error out of the escalation process.
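The escalation chain described above can be sketched in a few lines. This is a hypothetical illustration of the general pattern, not AlertBot’s internal logic; the contact names and the `escalate`/`acknowledged` interfaces are invented for the example:

```python
def escalate(chain, acknowledged):
    """Walk an ordered contact chain, notifying each contact in turn.

    `chain` is an ordered list of contacts (e.g. ["sysadmin", "cto", "cio"]).
    `acknowledged(contact)` reports whether that contact responded within the
    configured timeout (e.g. two minutes). Returns the list of contacts who
    were actually notified before someone acknowledged (or the chain ran out).
    """
    notified = []
    for contact in chain:
        notified.append(contact)   # send the email / SMS / phone call here
        if acknowledged(contact):
            break                  # someone responded; stop escalating
        # no response within the timeout: fall through to the next contact
    return notified
```

So if the SysAdmin is asleep but the CTO picks up, only those two are notified; if nobody responds, the alert walks the entire chain.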

Granted, none of this is as entertaining as watching Apollo 13. There’s no rousing soundtrack or Tom Hanks. Heck, there’s not even Kevin Bacon.

But when it comes to fixing website problems as quickly as possible, organizations know that the less drama, the better. That’s precisely what AlertBot’s multi-channel, auto-escalating failure reporting feature delivers. We don’t need an Oscar. We just need extremely satisfied customers — and we have a lot of those.

 

Next Up: Reviewing Failure Events Online

In our next blog, we’ll explore how to review failure events online to pinpoint and diagnose problems. Stay tuned!

Launch a free trial of AlertBot’s acclaimed site uptime monitoring solution. No credit card. Nothing to download. Get started in minutes. And if you decide to purchase our solution, there are NO setup fees!

AlertBot’s BattleBots World Championship VII Las Vegas Set Visit
Mon, 16 Jan 2023
https://www.alertbot.com/blog/index.php/2023/01/16/alertbots-battlebots-world-championship-vii-las-vegas-set-visit/
[Header image: five people posing for the camera at a robot workshop booth.]

Sometimes it just makes too much sense. When the opportunity arose for AlertBot to sponsor one of the highly talented teams in the BattleBots tournament, it just seemed like a no-brainer. (I mean, come on — we’re AlertBOT… it’s a match made in robotic heaven!) In this case, we were able to be among the select sponsors for team Whiplash, a much-celebrated family-run team that regularly competes in BattleBots. As part of the sponsorship, the Whiplash gang invited us to witness the filming of the latest BattleBots season in Las Vegas, Nevada, and it didn’t take much convincing for us to start booking our trip to Sin City.

A pair of us from the AlertBot team flew out to Vegas to meet the Vasquez family – collectively known as Whiplash – on Monday, October 17th, 2022, to get a personal tour of the facilities. We met with Debbie Vasquez (Whiplash Team Manager), who graciously showed us around the BattleBots pit area and was an absolute delight to talk to. She even introduced us to other teams so we could speak with them and see their bots prior to the fights. We met teams that traveled from as far as Australia (DeathRoll) to be there for the filming of the show. We enjoyed meeting the entire Whiplash team, which included Matthew Vasquez (Team Captain, Designer, Builder and Driver), Jason Vasquez (Builder, Auxiliary Weapons Operator, Pit Crew), Jeff Vasquez (Builder, Pit Crew), Debbie Vasquez (Team Manager) and others. They were all just like you see them on TV and a pleasure to be around.

2021 marked the first year that a new BattleBots arena building was set up to be a permanent hub for BattleBots tournaments. Next to the main arena building is a small collection of tents for various specialties dedicated to the needs of the BattleBots teams. Right alongside the arena is a designated welding area, where Lincoln Electric is set up to assist the teams in working on — or fixing — their respective bots. On the other side of these small tents is the main pit area tent, where one would find every single team set up inside with individual workstations for each team. It looked very much like a tradeshow with tables promoting the teams or selling merch. However, these are quite literally stations where the teams feverishly work on their bots — whether setting them up for their first fight or rebuilding them after a particularly violent encounter. Each team’s work area was also graced with a widescreen TV so they could watch the fights live while working, keeping the builders in the loop as to the progress of the new season. The hope and excitement in that pit area on the eve of the first day of filming the new season was palpable. Sadly, while each match would result in a winner, there must also be a loser.

We were amazed by the goodwill between the teams, too. You might expect there to be a cutthroat competitive nature between them, but instead, there was a shocking amount of love and admiration shared among the teams. By the way they behaved, you would think they were all on the same team together. It was hard to imagine these teams remaining friends after one might totally debilitate or demolish the bot of another. But somehow, they do. Still, it was impossible not to notice the passion, detail, and effort that went into each bot. Each team had immense hope of success with their bots, and you almost couldn’t imagine their hard work resulting in utter heartbreak.

The following day, we arrived early to make it through the front gate check-in and join the VIPs in finding a place to sit on the arena bleachers. Each taping session is 4 hours long, and each day includes 2 recording sessions with a 2-hour break between them. Fans can buy tickets online to either session (pending availability, of course), or to both if they desire. We attended both the morning and afternoon sessions that first day, with a set number of fights occurring in each session and extras squeezed in when possible.

Fans were expected to be very impassioned and involved in each taping session and were often instructed to cheer at specific times. Granted, you don’t have to tell these fans to be excited; they just naturally were. But for taping reasons, there needed to be specific moments of cheering and reactions from the fans to make the event appear smooth for the episodes that would air.

Everyone you’d expect to be in attendance at a BattleBots taping was indeed there. Announcers Chris Rose and Kenny Florian were there to offer their pre- and post-fight commentary, and Faruq Tauheed was there to announce each fight (or, in some cases, re-announce it, if he or the producers needed a different take). The judges, who would decide any close-call fights, were on the other side of the arena cage, and we’d learn their final verdict when Faruq made his official announcement.

For the audience, comedian Bill Dwyer, who was the host of the show during its first iteration in 2000 and 2001, played hype man to the audience, and was just a lot of fun. He interacted with us on a personal level, as well as getting the younger fans engaged (and often rewarding them with free t-shirts and such). He would fill in the downtime between fights, which helped some of the slower moments pass by more quickly.

Members of the individual bot teams also would frequently run over to the stands and hand out signs or stickers to fans to enjoy or hold up during their fight to cheer them on. It was a neat little bonus for being there in person.

A given fight would start with Faruq’s announcement, the teams walking out (and posing), and their bots being wheeled into the arena “battlebox” on hydraulic carts. After setup, the countdown would begin, and the bots would go at each other for the win. Each fight is given 3 minutes total to play out — easily the most exciting minutes of the day — but some fights didn’t last even half that time. A fight would end early if one bot rendered the other undriveable; other fights would last the full three minutes and then go to the judges to make the final call as to the winner. In most of those cases, the winner would still be chosen “unanimously” across all the judges.

The fights were all pretty exciting. One match ended after about 20 or 30 seconds with a super quick KO, while a couple of others needed the full time to complete. One particular fight ended with a bot catching fire, and it took some time for the arena to be cleared and readied for the next fight. In the second session, a pair of bots got stuck together after less than 30 seconds of fighting, and after quite some time trying to get them apart, they were cut apart and taken out of the arena so the next fight could commence. There was definitely no shortage of memorable moments during a full day of filming!

When we left Vegas for home, we took with us a new appreciation for BattleBots and its talented teams. It’s a sport that appreciates its fans and has a surprising amount of heart on and off camera (especially off camera). We only witnessed a handful of the fights that will be televised next year, but you can be sure we’ll be tuning in to watch these teams go head-to-head for the championship! Fans can tune in on Thursdays at 8pm (check your local listings) to see the new season of BattleBots on The Discovery Channel. Go, Whiplash!

[Photo: the BattleBots cage with a monitor in front showing the BattleBots logo.]

[Photo: two men standing with arms folded in the BattleBots arena archway; banners of previous winners hang above them.]

[Photo: the BattleBots stage with a production crane off to the right; "BattleBots" in large lettering above the set backdrop.]

[Photo: a view into the BattleBots caged arena from outside, with a monitor displaying the BattleBots logo above it.]

[Photo: two men with arms folded in the BattleBots arena archway, with the arena cage in the background.]

[Photo: the caged arena viewed from the stands; a monitor with the BattleBots logo hangs from the ceiling in front of the cage.]

[Photo: an outside view of the BattleBots building in Las Vegas with a hotel in the background; a metal fence in the foreground with the parking lot behind it.]

Connect with AlertBot on Social Media!
Thu, 15 Dec 2022
https://www.alertbot.com/blog/index.php/2022/12/15/connect-with-alertbot-on-social-media/
[Header image: a tabletop with a coffee cup, grid paper, keyboard, and pen; a mouse cord spells out "follow us" next to the AlertBot logo and the logos of Facebook, Instagram, YouTube, Twitter, and LinkedIn.]

Hey friends! You can follow AlertBot on social media channels at the following links:

X (Twitter): https://www.twitter.com/AlertBot
Facebook: https://www.facebook.com/AlertBot/
Instagram: https://www.instagram.com/alertbot/
Threads: https://threads.net/alertbot
YouTube: https://www.youtube.com/AlertBot/
LinkedIn: https://www.linkedin.com/company/alertbot/

What is Proactive ScriptAssist and Why is it a Game-Changer?
Tue, 06 Dec 2022
https://www.alertbot.com/blog/index.php/2022/12/06/what-is-proactive-scriptassist-and-why-is-it-a-game-changer/
[Header image: a brown-haired woman in a white t-shirt and plaid button-down shirt hiking, reaching up to grab the hands of someone pulling her up.]

Sometimes — not often, but every now and then — we come across an invention so remarkably useful that we wonder: how did I survive without this?

High speed internet comes to mind. So do GPS devices. And who wants to imagine a world without the cronut?

Well, it’s time to add one more invention to the list: Proactive ScriptAssist.

The Back Story
Websites are not static things. They change over time — sometimes in minor ways, other times in major ways. (For fun, check out the Internet Archive’s Wayback Machine to see what some of your favorite websites looked like in the past — like Apple’s home page from 1996, which invites folks to learn about “the future of the Macintosh”.)

Now, for visitors, the fact that websites constantly change is not a problem. In fact, it’s often a good thing because the change is an update, addition, or improvement of some kind.

But for IT and InfoSec professionals who are in charge of (among other things) website monitoring in their company, these changes can — and often do — trigger all kinds of bugs and errors. Fields and forms stop working, elements stop loading (or they load v..e..r..y….s..l..o..w..l..y), and there can be security vulnerabilities as well.

Multi-Step Monitoring
Thankfully, there is a way to verify that everything is working before site visitors start sounding the alarm bells — or worse, disappearing never to return.

The method is to use an easy-to-use web recorder to create scripts of what site visitors actually (and typically) do on various web pages, and then replay those scripts to make sure everything is working properly. This is highly effective. That’s the good news.

The not-so-good news is that when changes occur — even fairly small ones — re-scripting monitors can be a complex process that, in some scenarios, may require a level of expertise and experience that some IT/InfoSec professionals don’t have.

What’s the solution to this obstacle? Let’s all say it together: Proactive ScriptAssist!

About Proactive ScriptAssist
Available EXCLUSIVELY from AlertBot, Proactive ScriptAssist is an optional plan that includes the following:

  • Our team watches over an account, and proactively re-scripts any monitors that fail. We do all of the work, and our team has years of experience. After all, we created the technology, and we know how it works!
  • Failing monitors are evaluated within 3 hours, and the customer is notified of the situation.
  • Failing monitors are re-scripted within 3 to 24 hours (our response time is rapid, but the actual duration depends on the complexity — some re-scripting efforts take longer than others).
  • Customers get unlimited re-scripting and configuration updates from our team year-round.

Plus, if needed, our team offers advanced support over remote desktop sessions (join.me sessions). This is not always necessary, but it is another layer of help just in case.

The Bottom Line
Inventions that changed our lives: High speed internet. GPS. Cronuts. And now, AlertBot’s Proactive ScriptAssist. It’s an elite list, and one that we’re honored to join.

Learn More
Ready to make your IT/InfoSec teams weep with joy (which is nothing like the weeping they did that time the intern wiped out the backup)?

If you’re a current AlertBot customer, then contact your Account Manager today.

If you haven’t yet experienced AlertBot, then start your free trial today. You’ll be set up in minutes. No billing information, nothing to install, and no hassle.

Now, if you’ll excuse us, we’re going to read about the future of the Macintosh while enjoying a cronut or two (or 5).

Just How Bad is a Down, Slow, or Dysfunctional Website? It’s Worse than You Think!
Thu, 22 Sep 2022
https://www.alertbot.com/blog/index.php/2022/09/22/just-how-bad-is-a-down-slow-or-dysfunctional-website-its-worse-than-you-think/
[Header image: an aerial view of a man with his hand on a laptop keyboard, the word "Waiting" and an hourglass on the screen.]

Have you ever watched a movie (*cough* Godfather III) and said to yourself: “wow, this is so incredibly bad — I don’t think this can get worse!” But then it does. Much, much worse.

Well, having a down, slow, or dysfunctional website is similarly nightmarish — just when you think the reputation devastation is finally over, there’s more on the horizon. With apologies to William Congreve: hell hath no fury like a customer scorned.

Not convinced? Here’s what happens to companies that get on the wrong side of their customers:

  • 61% of customers say they will switch to a new brand after one bad experience. (Zendesk)
  • 13% of customers who have a negative experience will tell 15+ people. (Esteban Kolsky)
  • $289 is the average value of every lost business relationship in the U.S. per year. (Neil Patel)

Scary stuff, huh? “But wait — there’s more!”

These days, many unhappy customers publish reviews to punish companies that fail to meet their expectations. But guess what? These eviscerating appraisals are not just seen by other potential customers (many of whom quickly decide not to become actual customers). They are also seen by potential job candidates who are not enthusiastic about joining an organization that is used as target practice by denizens of the interwebs (everyone from THE ALL CAPS BRIGADE to the “tl;dr” force to the League of Extraordinary Grammarians).

However, just as all nightmares eventually come to an end (hey, even Godfather III mercifully rolls credits at the 2-hour-42-minute mark), there is something that companies can do to dial back — or better yet, eliminate — customer outrage caused by a down, slow, or dysfunctional website: get AlertBot.

AlertBot’s fully integrated monitoring platform monitors all your websites, web applications, mobile sites and services — all in one place. Unlike many other products in the marketplace, AlertBot doesn’t merely monitor a URL’s basic availability. It dives much deeper and monitors full page functionality using real web browsers in order to verify every page element, script, and interactive feature. As a result, you can proactively scan for errors, track and optimize load times, pinpoint issues, and get alerted to problems and failures.

The bottom line? A down, slow, or dysfunctional website can be so catastrophic that it makes Godfather III look like, well, Godfather I or Godfather II. Don’t hope for an Oscar just to win a Razzie. Get AlertBot and inspire your target audience to cheer vs. churn.

Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

How Site Uptime Impacts SEO (Hint: It’s a REALLY Big Deal)
Thu, 11 Aug 2022
https://www.alertbot.com/blog/index.php/2022/08/11/how-site-uptime-impacts-seo-hint-its-a-really-big-deal/
[Header image: a woman’s hands pointing at a web browser showing Google on an iPad.]

It is arguably the most important 3-letter acronym on the digital marketing landscape. No, it’s not ROI. It’s SEO.

Clearly, effective SEO is extremely important. And for many businesses — especially smaller companies that are competing against big, established enterprises — it’s a matter of survival. However, for some decision-makers outside of the digital marketing world, the link between SEO and site uptime is less clear. Let’s fix that.

For Search Engines, it’s All About Relevance

 Realtors like to point out that the three most important factors in evaluating a property are: location, location, and location. Well, the big brains behind search engines like Google and (to a lesser extent) Bing and Yahoo are obsessed with: relevance, relevance, and relevance.

What this means is that when responding to a search query — anything from “tennis rackets” to “what’s this itchy red bump on my foot?” — search engines strive to produce results that searchers will see as relevant. Otherwise, searchers will eventually switch search engine brands (e.g. leave Google and use Bing). Relevance is the glue that keeps the relationship sticky. And unlike with those glorious model airplanes that many of us failed to build as kids, in this case, the more glue the better.

Downtime Damages Relevance

Since search engines strive to deliver relevant search results (and therefore a positive user experience), it makes sense that downtime — a site being inaccessible or outright disappearing — is the enemy.

After all, if a searcher looking to buy a tennis racket clicks a site and discovers that it’s unavailable, then they won’t just punish the company that they hoped to engage: they will, in time, punish the search engine that pointed them in that direction. That fear keeps search engine folks awake at night (including mighty Google which commands more than 90% of the desktop and mobile search marketplace), and it explains why downtime is such a threat: it damages relevance.

 Is 100% Uptime Absolutely Vital?

This warning about downtime raises an important question: do companies that want to stay far, far away from Google’s, Bing’s, and Yahoo’s penalty box have to ensure 100% uptime? Not necessarily. While uninterrupted availability is ideal, it is not realistic. Occasionally, a site will go down for a few seconds or perhaps even longer. There are a variety of reasons for this, such as problems with a web host, an unexpected spike in traffic, and ol’ fashioned human error (hey, we all make mistaks…er, mistakes).

However, the top priority for all businesses that want to win the SEO game must be to minimize site downtime in terms of both frequency and duration. They also need to know why site downtime occurs, in order to proactively address issues and keep them from recurring. And that is where site uptime monitoring enters the picture.

What to Look for in Site Uptime Monitoring

 There are many site uptime monitoring products in the marketplace, ranging from superficial (and usually free — hey, we get what we pay for), to robust and reliable. Obviously, organizations need to choose from among the latter and avoid the former. To that end, here is what to look for in a site uptime monitoring solution:

  • Full site monitoring. It is not enough to monitor a site’s basic availability — because that only confirms that it exists, not that it is actually functional. It is also necessary to verify each page element, script, and interactive feature, as well as scan for errors, track load times, and pinpoint problems.
  • Automated alerts delivered by phone, email and/or texts to specified individuals (e.g. CTO, IT Director, etc.), in the event of site downtime — so that the right people can take immediate action.
  • The ability to record and monitor multi-step web processes with real browsers (across desktop and mobile devices), mouse clicks, and keyboard interactions.
  • The ability to monitor any port on any server or device to ensure that it is up and running, including: ICMP, TCP, SMTP, POP3, IMAP, DNS, FTP, Telnet, and custom ports.
  • A range of performance reports including website load times, geographic performance, failure analysis, and more.
  • Ongoing monitoring from a large group of locations around the world, which is essential for avoiding false alarms and verifying that a site is truly down/unavailable.
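As a point of reference for the port-monitoring bullet above, the most basic building block of such a check is a TCP reachability test. Here is a minimal, hypothetical sketch (the function name is illustrative); note that this only confirms a port accepts connections — a real monitoring service layers protocol-level checks (SMTP banners, DNS answers, full browser rendering) and cross-checks from many global locations on top of it:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 5.0) -> bool:
    """Basic TCP check: can we complete a handshake to host:port?

    Returns True if the connection succeeds within the timeout,
    False on refusal, timeout, or any other socket-level error.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

For example, `port_is_open("example.com", 25)` would probe an SMTP port, while checks for ICMP, DNS, or custom protocols require protocol-specific logic beyond a plain TCP connect.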

And it goes without saying: a legitimate and reliable site uptime monitoring solution must be backed by a responsive team of experts who will immediately take ownership of an issue and see it through to resolution. This cannot be emphasized enough, because the only thing worse than site downtime is trying to get help from people who don’t know what they’re doing. It gets ugly in a hurry.

SEO is Here to Stay

The rules of SEO will change — this much is certain (Google tinkers with its algorithm hundreds of times a year). But what isn’t going to change for search engines is the supreme importance of delivering relevant results. This means effective site uptime monitoring is not an option. It is essential, and companies that fail to heed this wisdom will soon be expressing another 3-letter acronym: SOS.

AlertBot is a leading site uptime monitoring solution that delivers ALL of the features and functions described above, which is why it’s trusted by some of the world’s biggest brands. Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

Multi-Step Monitoring: Why it’s Essential and How it Works
Mon, 06 Jun 2022
https://www.alertbot.com/blog/index.php/2022/06/06/multi-step-monitoring-why-its-essential-and-how-it-works/
[Header image: a graphic of technical items: a check mark, a man pointing at squares, the word "method" circled, a microscope, a magnifying glass, and a pie chart.]

The term “essential” is thrown around pretty loosely these days. That new show about the hospital (no, not that one… not that one either… yeah, that one) is advertised as essential viewing. A newly released track by a hip hop artist describing how little they need to release new tracks in order to live much, much better than the rest of us? That’s essential listening. And how can we forget that new muffin that cannot legally be advertised as a muffin, because it is technically more of a candy. That’s essential snacking (“mmmmmm… pseudo muffin”).

But then on the other end of the hype spectrum, there are things that are legitimately essential, because going without them could lead to dire consequences — or maybe even a catastrophe. And for e-commerce companies, one tool that truly qualifies as essential is multi-step monitoring.

What is Multi-Step Monitoring?

In a break with tradition in the complex world of technology, multi-step monitoring is pretty much what it sounds like: a way to track the various steps that customers take as they move through pages on a website. This way, businesses can proactively identify and fix problems such as buttons that don’t work, forms that won’t submit, links that don’t go anywhere, pages that take too long to load, and so on.

Why is Multi-Step Monitoring Essential?

 Most customers who run into problems don’t shrug them off. They get mad. And that compels them to hit the brakes and head for the exit. In fact, a whopping 88% of online consumers are less likely to return to a site after just one bad experience. So, yeah, preventing about 9 in 10 customers from disappearing is important. One might even say that it’s… wait for it… ESSENTIAL!

How Multi-Step Monitoring Works

In AlertBot, configuring multi-step monitoring is remarkably easy, and doesn’t require an advanced degree in Hypercomplex Supergeekery, with additional specialized certifications in Megaultra Nerdology. Here is how it works (a video tutorial is also available):

  • Step 1: Log in to AlertBot.
  • Step 2: Go to “Monitors.”
  • Step 3: Set up a new monitor.
  • Step 4: Select the TrueBrowser® Multi-Step Monitor option.
  • Step 5: Download the AlertBot Recorder (available for PC currently — this step only has to be completed once).
  • Step 6: Give the monitor a name (e.g. “Amazon 1 Multi-Step Monitor”).
  • Step 7: Launch the AlertBot Recorder, input the URL of the site (e.g. Amazon.com), and record a script simply by simulating actions that a customer would take. It is also a good idea to label steps/phases (e.g. “Homepage”, “Add to Cart,” etc.), which can be helpful when analyzing reports later on.
  • Step 8: Save the script with a unique name (e.g. “Amazon test”).
  • Step 9: Upload the script into TrueBrowser® Multi-Step Monitor (which was launched in Step 4).
  • Step 10: Hit the “Test” button.

And that’s all there is to it. When the test is complete (this can take up to two minutes), a report is automatically generated that shows:

  • The duration of each phase/step in the process.
  • Whether each process was successful or unsuccessful.
  • A waterfall chart capturing a breakdown of everything that loads on each individual page (e.g. request times, file transfers, etc.).
  • Raw browser request data that reveals anything that is not working, or that could be contributing to a degraded user experience (e.g. loading large files or images that cause slowdowns).

Tests can be run at any time to verify that problems are fixed and improvements are made. It’s remarkably easy. And yes, it’s essential.
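AlertBot’s recorder builds these scripts visually, so no coding is required. Conceptually, though, a multi-step monitor is just a sequence of named, timed steps where any failure halts the journey. Here is a rough Python sketch of that idea; the step names and report fields are illustrative, not AlertBot’s actual format:

```python
import time

def run_journey(steps):
    """Run named steps in order, timing each; stop at the first failure.

    `steps` is a list of (name, callable) pairs; each callable returns
    True on success. Returns a per-step report of timing and outcome.
    """
    report = []
    for name, action in steps:
        start = time.perf_counter()
        try:
            ok = bool(action())
        except Exception:
            ok = False  # an exception mid-journey counts as a failed step
        report.append({
            "step": name,
            "success": ok,
            "duration_ms": round((time.perf_counter() - start) * 1000, 1),
        })
        if not ok:
            break  # later steps depend on this one, so stop here
    return report

# Illustrative steps; a real monitor would drive an actual browser here.
demo = run_journey([
    ("Homepage", lambda: True),
    ("Add to Cart", lambda: True),
    ("Checkout", lambda: False),   # simulated failure
])
```

A real monitor would drive an actual browser at each step and feed the timings into a waterfall report like the one described above.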


Learn More

Discover the benefits of multi-step monitoring. Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

 

]]>
The 3-Step Communication Game Plan for a Site Outage (One of Our LEAST Favorite Things) https://www.alertbot.com/blog/index.php/2022/04/21/the-3-step-communication-game-plan-for-a-site-outage-one-of-our-least-favorite-things/ Thu, 21 Apr 2022 17:00:37 +0000 https://alertbot.wordpress.com/?p=844 A hand holding chalk writes football plays on a blackboard, using x's, arrows and o's.

The 3-Step Communication Game Plan for a Site Outage (One of Our LEAST Favorite Things)

If those von Trapp Family singers from The Sound of Music collectively woke up in a really, really bad mood and decided to write a song about their least favorite things, then it’s a safe bet that not being able to connect to a website would make the list (alongside airline passengers who tilt their seat back, and clam shell plastic packaging).

Indeed, the level of rage that many people experience when their browser presents them with a “cannot connect to that website” message is enough to trigger a blood pressure monitoring app alarm on a smartwatch. It’s the equivalent of going to a store, only to find out that the door is locked. Actually, it may be worse than that, because at least there could be some therapeutic comfort in commiserating with other disappointed customers. But in the virtual world, the journey is usually solo — and so is the misery.

The bad news is that there is no way to absolutely, completely, and ultimately prevent site outages from happening. However, the good news is that companies can — actually, scratch that: they must — be proactive to mitigate the pain and suffering, both for their site visitors and for themselves. To that end, here is a three-step communication game plan:


Step 1: Tell the story.

Without delay (not even for lunch), companies should leap into their operational digital properties — e.g. social media, email, SMS, chat, widget, etc. — and clearly describe:

  • What’s going on and why the site is down.
  • When the outage started.
  • What is being done.
  • When the outage is likely to end based on all available information and best estimates.
  • Options and workarounds (if they exist).
  • Relevant policy or process changes (e.g. “due to this unforeseen event we are extending our normal 30-day return window, to ensure that customers who are affected can return items without any inconvenience”).


Step 2: Update the status page.

All of the information shared through social media and other channels should be published to a dedicated status page, which — as the name suggests — exists for one purpose only: to highlight and describe the status of a website (or possibly multiple websites that are part of the same brand or portfolio). It is vital to keep the status page updated to reflect the current phase: investigating, fixing, resolving, and resolved.

In addition, the status page should invite visitors to subscribe, so that they can receive real-time notifications when things change — and ultimately, when they get back to normal.
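The phase progression above is essentially a small state machine: the status only moves forward, and every transition should reach subscribers. As a rough sketch of that idea (not any particular status-page product’s API):

```python
# Illustrative status-page phases, in order; names taken from the post above.
PHASES = ["investigating", "fixing", "resolving", "resolved"]

class StatusPage:
    def __init__(self):
        self.phase = PHASES[0]   # a new incident starts as "investigating"
        self.subscribers = []
        self.log = []            # history of (phase, note, who was notified)

    def subscribe(self, contact):
        """Visitors opt in to real-time notifications."""
        self.subscribers.append(contact)

    def advance(self, new_phase, note=""):
        """Move to a later phase; an incident never moves backward."""
        if PHASES.index(new_phase) <= PHASES.index(self.phase):
            raise ValueError("status can only move forward")
        self.phase = new_phase
        # In a real system this would send email/SMS; here we just record it.
        self.log.append((new_phase, note, list(self.subscribers)))
```

Skipping a phase (say, straight from "fixing" to "resolved") is allowed in this sketch, since real incidents don’t always pass through every stage.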


Step 3: Conduct a postmortem and share the findings.

Once the outage is history, companies should figure out precisely what went wrong. Using a top-rated site uptime monitoring tool, like AlertBot, can provide helpful clues, and just as valuably, ensure that there isn’t a repeat performance. This information should be shared with the customer community and all other stakeholders, such as suppliers and strategic partners.

Typically, this information is shared through a blog post, to which all of the company’s social media accounts (Facebook, Twitter, Instagram, etc.) point. Even if the company is not technically at fault (after all, nobody wants to be assailed by a DDoS attack), the fact remains that visitors were inconvenienced. An authentic apology goes a long way toward easing frayed nerves and restoring trust.


The Bottom Line

Site outages are dreadful. Yet, they happen, and companies need to have a communication game plan to minimize the frustration for visitors, and the adverse impact on their reputation. The von Trapp Family singers would approve (and probably turn it into a song that you can’t get out of your head, no matter how hard you try).

]]>
Why Website UX “Edge Cases” Lead to Visitor Frustration — and What to Do About It https://www.alertbot.com/blog/index.php/2022/02/21/why-website-ux-edge-cases-lead-to-visitor-frustration-and-what-to-do-about-it/ Mon, 21 Feb 2022 21:27:54 +0000 https://alertbot.wordpress.com/?p=837 A mountain climber is silhouetted on a deep blue sky background as he hangs off a cliff by one hand, and a tether hangs off his belt.

Why Website UX “Edge Cases” Lead to Visitor Frustration — and What to Do About It

The year was 1993. Beanie Babies invaded the planet. Dinosaurs dominated cinemas worldwide when they escaped from Jurassic Park. Seinfeld won the Emmy for Outstanding Comedy Series (you might say that Jerry & co. were masters of their domain). And righteous rockers Aerosmith extolled the virtues of “living on the edge.”

A lot — and we are talking A LOT — has changed since 1993; especially that advice about living on the edge. Frankly, the last thing that companies want is for their website visitors to go anywhere near the edge, because they may fall off.

Edge Cases
What we are talking about here are “edge cases,” which refer to website UX pitfalls that are unlikely — but nevertheless possible. And when visitors experience one of these edge cases, it is not a matter of whether they will get mad: it is a question of how enraged they will become. Hell hath no fury like visitors thrust into a nasty edge case. Here are some examples:

  • A visitor incorrectly inputs their credit card data into a form, which causes the form to crash.
  • A visitor clicks or taps the search function, but without putting anything in the search field, which causes the website to hang.
  • A new website is launched and everything seems fine (there are a lot of fist bumps and “WE DID IT!” cheers among the development team), but there are sections of bad code that manifest hours, days, weeks, or even months down the road.

As a result of these negative experiences, visitors cannot move forward as both they and the company desire — or, to use a term from the UX world, their momentum on “The Happy Path” is thwarted. Fortunately, that is where synthetic monitoring enters the picture.

The Role of Synthetic Monitoring
Synthetic monitoring (sometimes referred to as journey monitoring) is a method of simulating and evaluating the various journeys that visitors take on a website: where they go, what they do, what buttons they press, what forms they fill out, and so on.

With synthetic monitoring, companies can proactively identify and address edge case scenarios, but without having to rely on excessive manual testing or live user monitoring. This is not only more efficient, but it exposes edge cases that would otherwise go undetected.

Ideally, addressing edge case scenarios means eliminating them entirely — such as fixing bad code. But at the very least, companies can put up signposts that point visitors in the right direction. For example, since there is no way to 100% guarantee that every visitor will correctly input their credit card number, a form can be modified to tell visitors when an input error has occurred.
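For the credit card example, the usual signpost is validation that catches a bad number and explains the problem instead of crashing. As a hedged illustration (not an AlertBot feature), a checkout backend might run the standard Luhn checksum and return a friendly message:

```python
def luhn_valid(card_number: str) -> bool:
    """Luhn checksum: the standard sanity check for card numbers."""
    digits = [int(c) for c in card_number if c.isdigit()]
    # Reject anything too short or containing non-digit junk (spaces allowed).
    if len(digits) < 13 or len(digits) != len(card_number.replace(" ", "")):
        return False
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def validate_payment_form(fields: dict) -> dict:
    """Return {} when the form is valid, or a field -> error-message map."""
    errors = {}
    if not luhn_valid(fields.get("card_number", "")):
        errors["card_number"] = "Please check your card number and try again."
    return errors
```

The point is the pattern, not the specific check: every input an edge case can mangle gets a validator and a human-readable error, so the form degrades gracefully instead of crashing.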


AlertBot: Avoiding the Edge
AlertBot supports advanced and easy-to-use synthetic monitoring that helps companies run and evaluate various UX scenarios before their visitors do — and ultimately reduce edge cases. Hey, Aerosmith is welcome to live on the edge (who are we to criticize the group that brought us Guitar Hero?). But companies that want to drive visitor engagement — and prevent frustration — should live as far away from the edge as possible.

Start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

]]>
How a Superior Site Uptime Monitoring Solution Could Save Your Organization $1.85 Million https://www.alertbot.com/blog/index.php/2021/12/17/how-a-superior-site-uptime-monitoring-solution-could-save-your-organization-1-85-million/ Fri, 17 Dec 2021 00:10:04 +0000 https://alertbot.wordpress.com/?p=831 Graphic showing two computer monitors with hands coming out of the sides of them. The screen on the left shows a lock symbol with a hand reaching out to offer money to the other hand, which is coming out of a laptop with a skull and crossbones graphic on the screen. The hand receiving the money is holding out a key to the other hand in exchange for the money.

We all know the pleasure we feel when we dig into an old pair of jeans and pull out a crumpled $5 bill, or when we finally get around to vacuuming our car (“Hey, I don’t remember eating onion rings in here”) and find a few bucks in loose change. It’s as if the universe has taken a moment to smile on us.

Now imagine that, instead of finding enough money to buy some more onion rings (“Oh yeah, I remember when I ate onion rings in here — wow, that was a long time ago”), you get your hands on a cool $1.85 million. Pleasure isn’t the word for that. Euphoria is.

Well, in a sense, that is what owners, investors, and anyone else who has a financial stake in your organization could feel if you choose a superior site uptime monitoring solution. Why? Because new research has revealed that $1.85 million is the average price tag that organizations pay to recover from a ransomware attack — a figure that has more than doubled in the last year. Let’s unpack this by taking a look at ransomware, and then explaining the link to site uptime monitoring.

What is Ransomware?

Essentially, ransomware is a type of malware that infects a computer, and blocks access to it unless victims pay a fee (a.k.a. a ransom). And if that was not nefarious enough, there are two other things about ransomware that need to be part of the story.

The first is that victims are given a very limited amount of time to pay up. If they fail to do so, then the threat — which is often carried out — is they will permanently lose access to their data, or their data will end up being disclosed on the dark web or elsewhere. The second is that even after they pay the ransom in full, only 8% of victims get 100% of their data back, and 29% get up to 50% of their data back. In the legitimate business world, this kind of chronic non-fulfilment would lead to excessive customer loss, and probably lawsuits and investigations. But on the cybercrime landscape, it’s standard operating procedure. There is no complaints department or review site (“We were very disappointed in this hacker who failed to return all of our data, but we are adding a star because communication was prompt”).

Where a Site Uptime Monitoring Solution Enters the Picture

A superior site uptime monitoring solution cannot block ransomware attacks. For strategies and tactics on that front, we recommend this helpful article at eSecurityPlanet.com, and this site by the Cybersecurity and Infrastructure Security Agency (CISA).

However, a superior site uptime monitoring solution CAN do something that hackers earnestly hope that potential victims do not realize: immediately alert them to a ransomware attack — even if it’s at 3:00am — so they can rapidly roll out an uncorrupted back-up and carry on without disruption or (and here is the euphoric part) having to pay $1.85 million or more in ransom/recovery costs.

Then, the organization can move to fortify cybersecurity defenses and reduce the size of the attack surface (probably by deploying many of the recommendations highlighted by the sources listed above), ultimately reducing the likelihood of future ransomware attacks.

The Bottom Line

Ransomware is on the rise, with the number of reported incidents surging 183% between the first two quarters of 2021. A superior site uptime monitoring solution won’t stop these attacks or frankly even slow them down. Hackers are notorious for doing things over and over again until they stop working — and unfortunately, ransomware is quite profitable. But it can give organizations the warning and time they need to strengthen their defenses, and in the process potentially save an average of $1.85 million.

Launch a free trial of AlertBot’s superior site uptime monitoring solution. No credit card. Nothing to download. Setup in minutes.  

 

]]>
AlertBot Cyber Week Sale! https://www.alertbot.com/blog/index.php/2021/11/29/alertbot-cyber-week-sale/ Mon, 29 Nov 2021 09:00:30 +0000 https://alertbot.wordpress.com/?p=822 Graphic that features a cellphone screen and big words saying "AlertBot Cyber Week Sale." Get 20% off for the life of your account! Signup now at AlertBot.com. Promo code 2021CW20 All New Accounts Get 20% off their plan!

It’s Cyber Week! All new AlertBot signups this week get 20% off for the life of their account! Use promo code 2021CW20 when you sign up to claim this deal! https://www.AlertBot.com

]]>
AlertBot Black Friday Deal! https://www.alertbot.com/blog/index.php/2021/11/24/alertbot-black-friday-deal/ Wed, 24 Nov 2021 19:48:33 +0000 https://alertbot.wordpress.com/?p=815 Graphic displaying large text that reads "AlertBot Black Friday" with smaller text that reads "Special Offer Valid November 22 thru 29, 2021. Get 20% off the life of your account! All New Accounts Get 20% off their plan! Signup now at AlertBot.com Promo Code 2021BF20"

It’s Black Friday all week for AlertBot! All new signups this week get 20% off for the life of their account! Use promo code 2021BF20 when you sign up to claim this deal!

https://www.AlertBot.com

]]>
What Exactly is a Website Monitoring “False Alarm” and Why You Should Care About It https://www.alertbot.com/blog/index.php/2021/09/28/what-exactly-is-a-website-monitoring-false-alarm-and-why-you-should-care-about-it/ Tue, 28 Sep 2021 17:45:05 +0000 https://alertbot.wordpress.com/?p=771 A man in a rose colored polo shirt leaning over with his hand on a fire alarm, about to depress the button.

What Exactly is a Website Monitoring “False Alarm”
and Why You Should Care About It

by Louis Kingston

You know what falsehoods are. You know what false teeth are. You may even know some falsehoods about false teeth. But do you know what a website monitoring false alarm (also known as a “false positive”) is? If not, then please keep reading to find out — because it’s a very big deal.

What is a False Alarm?

Remember back in grade school, when the fire bell suddenly went off in class and you were instructed to exit single-file and march outside? As you rose from your desk, heart racing, you wondered if you’d ever see your Trapper Keeper, Real Ghostbusters lunchbox, and JanSport backpack again. But after you and your classmates were wrangled into the parking lot to stand in the brisk autumn air for what felt like an eternity, you soon learned it was just some older kid who thought it’d be funny to pull that shiny red lever on the hallway wall.

Well, that’s essentially what a false alarm is: a result that incorrectly indicates that a particular condition or attribute is present (i.e. it wasn’t a real fire consuming your place of education; it was merely a “false alarm” thanks to that jerk in the grade above yours).

What is a Website Monitoring False Alarm?

What you need and expect from a website monitoring tool is to know precisely when your website goes down. Why? Because research has found that the average cost of site downtime is $5,600 per minute. And remember, we are just talking about the average cost here. Some site downtime fiascos are much more costly. Just ask Amazon, which lost an estimated $99 million after going down for 63 minutes during Prime Day in 2018. Granted, most businesses (including yours, unless you happen to be Jeff Bezos) won’t have to shell out roughly $1.57 million a minute due to website downtime, but the basic point should be clear: site downtime is costly, and timely alerts are supposed to minimize the financial damage.

But what happens when a website monitoring downtime alarm goes off, but nothing is actually wrong? It gets chalked up to a false alarm.

Why Website Monitoring False Alarms Are So Common

Many website monitoring tools — and virtually all of the free kind — have a test server in one location. If that test server detects that a website is not available, it does the only thing it can: sound the alarm. And that seems to be the correct thing to do, right? Well, not exactly.

Let’s say that the website in this example is only down for a few seconds due to an isolated power outage. The test server has no way of knowing that the website is already back up. And so, it is going to generate a false alarm.

The Solution: Multiple Testing Server Locations

Now, imagine that there are multiple test servers spread out across the country — say, one in New York and one in Los Angeles. The test server in New York detects that a website has gone down, and triggers a red alert (this test server is a big Star Trek fan). But it doesn’t sound the alarm. Instead, 60 seconds later the test server in Los Angeles checks in. If the website is up, then it cancels the red alert. If the website is down, then it confirms the initial diagnosis by the test server in New York, and the alarm goes off.
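The confirmation logic described above can be sketched in a few lines of Python. This is a simplified illustration, not AlertBot’s actual implementation; `check(location)` stands in for whatever HTTP probe each monitoring location runs:

```python
import time

def confirmed_outage(check, locations, delay_seconds=60):
    """Return True only if every monitoring location agrees the site is down.

    `check(location)` is a stand-in for a real HTTP probe; it returns
    True when the site responds from that location.
    """
    for i, location in enumerate(locations):
        if check(location):
            return False           # any successful probe cancels the alert
        if i < len(locations) - 1:
            time.sleep(delay_seconds)  # wait before the next location re-tests
    return True                    # every location saw the site down: real outage
```

A brief, isolated blip at one location never reaches the alarm stage, because the next location’s probe succeeds and cancels the alert.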

The AlertBot Advantage

At AlertBot, we hate false alarms even more than our customers do. That’s why, unlike many other website monitoring tools — and again, virtually all of the free ones — we have test servers located across the U.S. and worldwide. We don’t guess whether our customers’ websites are down. We know.

Plus, when it is necessary to send out an alert, our system automatically and immediately contacts key people — such as network administrators, SysAdmins, CIOs, etc. — through email, SMS/text message, or phone (or any combination).

What’s more, our test servers keep checking for website availability, and provide an update (again, in the preferred method) if it goes back up. We also highlight the amount of time that the website — or a specific portion/page of the website — was down. Our customers use this information to keep an eye on overall website performance, proactively detect problems, and ensure that their web host is consistently meeting uptime standards.

Ready to bid false alarms a true farewell? Then start a FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
“Frodo, We Aren’t in the Shire Anymore”: The Importance of a Customer Journey & How to Avoid Wrecking It https://www.alertbot.com/blog/index.php/2021/07/29/frodo-we-arent-in-the-shire-anymore-the-importance-of-a-customer-journey-how-to-avoid-wrecking-it/ Thu, 29 Jul 2021 20:35:38 +0000 https://alertbot.wordpress.com/?p=761 An image of the Shire film set from Lord of the Rings that shows Bilbo's home's front door. Text on the image reads “Frodo, We Aren’t in the Shire Anymore” - The Importance of a Customer Journey & How to Avoid Wrecking It

“Frodo, We Aren’t in the Shire Anymore”: The Importance of a Customer Journey & How to Avoid Wrecking It

by Louis Kingston

Fans of Lord of the Rings — otherwise known as “Ringers” — never grow weary of reading or watching Frodo and his fellow Hobbits journey through Middle Earth on an epic quest to Mordor (where rumor has it there now exists a very stylish Starbucks at the base of Mount Doom).

Well, customers who visit a website are on an important journey as well. Granted, it doesn’t involve saving the world from evil entities that never sleep.  But it does involve achieving objectives that, ultimately, culminate in a sale — whether that happens on the same visit (e-commerce) or weeks down the road (B2B). And that brings us to the customer journey map.

The customer journey map is a visual tool that enables businesses to identify where, when and how customers engage their brand — and make the trek from curious prospects to enthusiastic brand ambassadors. There are five phases on the journey:

  1. Awareness: customers in this phase know that they have a problem (although they may not yet know the full scope or severity of the problem), and they are in the process of conducting research. It is possible for visitors to go straight into a conversation via phone, video or chat. However, research has found that, on average, customers access 3-5 pieces of content before speaking (or texting) with a sales rep. This content can include videos, articles, ebooks, white papers, checklists, and so on.
  2. Evaluation: customers in this phase are aware of their problem and are comparing different options. Content such as case studies, testimonials and demos are highly influential here. However, nothing beats a free trial.
  3. Decision: customers in this phase have put together a shortlist (in a formal manner as is usually the case with B2B sales, or more casually as is often the case with B2C/e-commerce sales) and are ready to make a decision.
  4. Retention: customers in this phase have made a purchase, and now the goal is to keep them on the roster. The importance of retention cannot be overstated. It costs 5x more to acquire a new customer than it does to nurture an existing one, and increasing retention by just 5% can boost profits by 25-95%. Tactics that can help foster a strong, loyal relationship include special offers, follow-up outreach (via email, app, phone, video, or in-person for B2B), customer satisfaction surveys, etc.
  5. Advocacy: customers in this phase are happy and willing to recommend a company to their professional and personal network, as well as members of their various social media communities (e.g. Facebook and LinkedIn groups, etc.). Research has found that 83% of B2C customers and 99% of B2B customers are influenced by various types of word-of-mouth marketing.

In theory, the customer journey is straightforward. However, in practice — and just as Frodo & Co. discovered — the quest can have many twists and turns. No, there aren’t any orcs, hobgoblins or balrogs along the way, but there are some dangerous foes: site outages, sluggish page loads, and broken links, forms, and functions.

The bad news? Any one of these is enough to send customers heading straight for the exit, never to return. The good news? AlertBot’s leading solution continuously monitors for ALL of these from multiple locations around the world — and proactively notifies key individuals (e.g. CIOs, CTOs, SysAdmins, etc.) when a problem occurs.

You could say that AlertBot is leading-edge technology worthy of Gandalf, and yet so intuitive and easy-to-use that Pippin Took could manage everything (even after having a few pints at the Prancing Pony).

See for yourself by starting your free trial of AlertBot now.   

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
Debunking 3 Website Availability Monitoring Myths https://www.alertbot.com/blog/index.php/2021/05/27/debunking-3-website-availability-monitoring-myths/ Thu, 27 May 2021 18:51:29 +0000 https://alertbot.wordpress.com/?p=749 A pretty girl with brown hair up in a bun looks intently at a laptop with her hand to her chin in a pensive manner.

Debunking 3 Website Availability Monitoring Myths

by Louis Kingston

Some myths in life are harmless, or even helpful. For example, Santa Claus has come in very, very handy for parents who want to nudge their kids from the naughty list to the nice one. And let’s give a round of applause to the Tooth Fairy, whose promise of nominal financial compensation has turned the prospect of losing a tooth from a meltdown trigger into a motivational factor.

However, other myths are on the opposite end of the spectrum: they lead to stress and costs. The bad news is that there are some rather notorious website availability monitoring myths out there. But the good news is that debunking them is simple. Here we go:

Myth #1: Free website monitoring tools are just as good as paid versions.

The Truth: So-called free website monitoring tools are riddled with gaps and vulnerabilities — simply because they’re free, and the folks who make them aren’t trying to provide a public service or earn some good karma. They’re in business, and that means there’s always (always!) a hook. Here are some of the drawbacks: zero technical support, excessive false positives, reduced test frequencies, limited testing locations, and s-l-o-w product updates. For a deeper dive into these pitfalls, read our article here.

Myth #2: Buying website availability monitoring from your host is a smart idea.

The Truth: Your web host probably offers website availability monitoring, and keeps pestering you to buy it. What’s the harm? Well, here’s the harm: your web host is a web host. That’s their jam. They don’t specialize in website monitoring, which means that customers like you are going to pay for their lack of competence and capacity. And on top of this, your web host has an inherent conflict of interest when it comes to giving you the full picture — because your hosting agreement includes uptime standards. As such, they may be less inclined to be fully transparent if they fall below this standard. Or to put it bluntly: they might lie, and you’ll have a really hard (if not impossible) time trying to detect and prove it. For more insights on why it’s a bad idea to buy website monitoring from your host, read our article here.

Myth #3: Website availability monitoring is just about website availability monitoring.

The Truth: This last myth is especially tricky. Yes, website availability monitoring is about website availability monitoring. But that’s not where it ends. Comprehensive (i.e. the kind your business needs) website monitoring also analyzes key aspects such as website usability, speed and performance — because there are situations where a website can be available, but not accessible or optimized. To learn more about why comprehensive website availability is not just a technical necessity but also a customer experience requirement, read our article here.

The Bottom Line 

Does your kid have a toothache, threatening to go to DEFCON 1? Do a myth tag team of the Tooth Fairy + Santa to avert a meltdown (and hey, you might even enjoy some extras out of the deal like getting them to clear the dishes after dinner or clean out the cat litter — kids are tough negotiators, but see what you can get).

But if you want to keep your business safe and strong, then steer clear of all myths, and equip yourself with the clarifying truths revealed above.

And speaking of clarifying truths: AlertBot TRULY offers world-class, surprisingly affordable and end-to-end comprehensive website availability monitoring — which is why it’s trusted by some of the world’s biggest companies. See for yourself by starting your free trial now.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
How to Solve 6 Common Browser Incompatibility Issues https://www.alertbot.com/blog/index.php/2021/04/06/how-to-solve-6-common-browser-incompatibility-issues/ Tue, 06 Apr 2021 09:15:30 +0000 https://alertbot.wordpress.com/?p=741 A graphic with an orange background and graphics of gears, charts and browser screens in the foreground.

How to Solve 6 Common Browser Incompatibility Issues

by Louis Kingston

You have spent a small — or perhaps a large — fortune on your website, and now you’re ready to reap the rewards. You can picture it now: delighted visitors gushing about speed, performance, features, and functions.

Except…that’s not happening. Instead, visitors are running into browser compatibility issues — which means instead of moving forward on the buyer’s journey, they are heading straight to a competitor. That’s the bad news.

The good news is that you can (and frankly, you must) fix browser compatibility issues ASAP. Here are six of the most common problems, along with their associated solutions:

Problem: Various browsers render CSS styles differently.
Solution: Force all browsers to start from the same baseline by using CSS reset style sheets, such as Normalize.css (hosted on GitHub), HTML5Reset, or Eric Meyer’s CSS Reset.

Problem: When a page lacks a valid document type declaration, browsers fall back to “Quirks Mode,” which results in unresponsive tags and flawed rendering.
Solution: Add the declaration <!DOCTYPE html> as the very first line of every HTML document, which forces browsers to render in Standards Mode instead of Quirks Mode.

Problem: Outdated JavaScript browser-detection code (user-agent sniffing) misidentifies browsers, especially newer or less common ones.
Solution: Eliminate browser detection and replace it with Modernizr, a library that runs a battery of quick tests to report which features the current browser actually supports.
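The principle behind Modernizr, testing for the capability itself instead of guessing from the user-agent string, can be sketched in a few lines of plain JavaScript (a hypothetical helper for illustration, not Modernizr's actual API):

```javascript
// Hypothetical feature-detection helper in the spirit of Modernizr:
// probe for each capability directly rather than sniffing the browser name.
function detectFeatures() {
  return {
    // Does the runtime provide native Promises?
    promises: typeof Promise !== 'undefined',
    // Does it expose the structured-clone API?
    structuredClone: typeof structuredClone === 'function',
    // Does it expose the internationalization API?
    intl: typeof Intl !== 'undefined',
  };
}

// Code can then branch on capabilities, never on browser names:
const features = detectFeatures();
console.log(features.promises ? 'native promises' : 'polyfill needed');
```

The payoff of this approach is that a brand-new browser you have never heard of still gets the right code path, because the test asks "can you do X?" rather than "who are you?"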

Problem: Unvalidated HTML/CSS leads to coding errors that some browsers do not auto-correct.
Solution: Use tools like W3C HTML validator and Jigsaw CSS validator to catch and fix errors, including the really tiny ones that can lead to major incompatibility headaches.

Problem: Certain functions designed to run on specific browsers are instead running on multiple browsers that cannot handle the request.
Solution: Add common vendor prefixes to the code, such as -webkit- (Chrome, Safari, newer versions of Opera, most iOS browsers, and other WebKit-based browsers), -moz- (Firefox), -o- (pre-WebKit versions of Opera), and -ms- (Internet Explorer and legacy Microsoft Edge).
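Rather than hand-writing every prefixed variant, the expansion can be automated. Here is a minimal sketch (real build pipelines should use a dedicated tool such as Autoprefixer; this helper is hypothetical):

```javascript
// Expand one CSS declaration into vendor-prefixed variants.
// A sketch only; tools like Autoprefixer do this properly per-property.
function withVendorPrefixes(property, value) {
  const prefixes = ['-webkit-', '-moz-', '-o-', '-ms-', ''];
  return prefixes.map((p) => `${p}${property}: ${value};`).join('\n');
}

// Emits -webkit-, -moz-, -o-, -ms-, and unprefixed declarations:
console.log(withVendorPrefixes('transition', 'opacity 0.3s'));
```

Note that browsers simply ignore prefixed declarations they do not recognize, which is why listing all of them (plus the unprefixed form last) is safe.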

Problem: Third party libraries aren’t loading and working properly.
Solution: Use trusted frameworks and libraries that are cross-browser friendly, such as Angular and React (web application frameworks), Bootstrap and Animate.css (CSS libraries), and jQuery (scripting library).

How AlertBot Can Help
AlertBot monitors your website with real web browsers — not simulations! — to capture the most authentic end-user experience, and identify problems that others miss. Your development team can use this reliable information to solve problems, and ensure that all visitors enjoy a flawless experience.

Start your FREE TRIAL of AlertBot now. There’s no billing information required, no installation, and you’ll be set up within minutes. Click here.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
4 Essential Failure Analysis Reports for Monitoring Website Performance & Uptime https://www.alertbot.com/blog/index.php/2021/02/09/4-essential-failure-analysis-reports-for-monitoring-website-performance-uptime/ Tue, 09 Feb 2021 17:47:22 +0000 https://alertbot.wordpress.com/?p=733 An asian man with spiky hair leans over a reflective table holding a tablet in his hand and is touching the screen with his right hand.

4 Essential Failure Analysis Reports for Monitoring Website Performance & Uptime

by Louis Kingston

In Apollo 13, buzz-cut flight director Gene Kranz famously declares that “failure is not an option.” It would be nice if the same commandment held for websites. However, even an infinity of buzz cuts cannot change the fact that, alas, sometimes websites fail. And so, the question then becomes: how do you minimize the likelihood, duration and severity of website failure?

The answer probably isn’t enough to inspire a movie. But it’s more than enough to help businesses detect and remedy underlying problems with their website before they become full-blown catastrophes: use failure analysis reports.

There are four types of failure analysis reports that every business should be generating on a regular basis: Waterfall Reports, Web Page Failure Reports, Downtime Tracking, and Failure Event Logs.

  • Waterfall Reports

Waterfall Reports enable businesses to analyze the performance of every object that loads on their web pages (e.g. scripts, stylesheets, images, etc.), in order to identify common sources of bottlenecks, errors and failures. Waterfall Reports also display HTTP response headers, which help track down the source of slowdowns and breakdowns.
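Conceptually, a waterfall report boils down to per-object timing data. A toy summary over hypothetical timings shows the kind of question such a report answers:

```javascript
// Toy waterfall summary over hypothetical per-object load timings (ms).
const objects = [
  { name: 'index.html', ms: 120 },
  { name: 'styles.css', ms: 85 },
  { name: 'hero.jpg', ms: 940 },
  { name: 'app.js', ms: 310 },
];

// The slowest object is the first place to look for a bottleneck.
const slowest = objects.reduce((a, b) => (b.ms > a.ms ? b : a));
const totalMs = objects.reduce((sum, o) => sum + o.ms, 0);

// prints: slowest: hero.jpg (940 ms), total: 1455 ms
console.log(`slowest: ${slowest.name} (${slowest.ms} ms), total: ${totalMs} ms`);
```

In this made-up example, one oversized image accounts for most of the page's load time, which is exactly the sort of finding a real waterfall view makes obvious at a glance.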

  • Web Page Failure Reports

Many business websites have dozens of pages, and e-commerce websites can easily have more than 50, 100, or even 1,000. Manually hunting for problems can be tedious and futile. That’s where Web Page Failure Reports come to the rescue: they log each failure event and typically capture a screenshot of what the page displayed when it failed. This information can then be used to fix issues before they trigger visitor/customer rage.

  • Downtime Tracking

No, Downtime Tracking isn’t the name of one of those bands that never smile when they sing. Rather, it’s a type of report that contains statistics on website and server downtime. Understanding the size, scope and source of downtime issues is critical to resolving them.  
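As a quick illustration of why downtime statistics matter: even an impressive-sounding uptime percentage permits a measurable outage budget each month. A small calculator (assuming a 30-day, 43,200-minute month) makes the budget concrete:

```javascript
// How many minutes of downtime does a given uptime percentage allow
// in a 30-day (43,200-minute) month?
function allowedDowntimeMinutes(uptimePercent, minutesInMonth = 43200) {
  return minutesInMonth * (1 - uptimePercent / 100);
}

console.log(allowedDowntimeMinutes(99.9).toFixed(1));  // prints 43.2
console.log(allowedDowntimeMinutes(99.99).toFixed(1)); // prints 4.3
```

So "three nines" quietly allows about three quarters of an hour of outage per month, while "four nines" allows only a few minutes — a gap that downtime tracking reports make visible.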

  • Failure Event Logs

Knowing that a web page — or element(s) within a web page — is failing is important, but it’s not the full story. Failure Event Logs fill in the gaps by providing detailed information about which tests were performed, the geographical locations affected, and the errors identified.

The Bottom Line

Are failure analysis reports as gripping and captivating as Apollo 13? No. Are they vital to website performance and business success? Yes. Because while website failure is unfortunately an occasional reality, it absolutely cannot become a regular habit.

At AlertBot, we provide our customers with all of these failure analysis reports (and more) so they can get ahead of problems and avoid catastrophes. Start a free trial today.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
Why Your Website Host’s “100% Guaranteed Uptime” Promise is Bogus — and What to Do About It https://www.alertbot.com/blog/index.php/2021/01/14/why-your-website-hosts-100-guaranteed-uptime-promise-is-bogus-and-what-to-do-about-it/ Thu, 14 Jan 2021 19:32:25 +0000 https://alertbot.wordpress.com/?p=725

Image of a man with his hands on his head, looking down in a distressed manner with his eyes closed and a laptop sitting in front of him with the lid mostly closed.

Why Your Website Host’s “100% Guaranteed Uptime” Promise is Bogus — and What to Do About It

by Louis Kingston

It’s been said that the devil is in the details. Well, along the same lines — and as we all know from miserable experience — when it comes to guarantees, the devil is in the small print. And there’s no better (or worse) example of this than with respect to the gleaming, confidence-inspiring claim by web hosts that they deliver 100% guaranteed uptime. Except, well, they don’t.

Here’s the thing: what you, everyone you know, and even random strangers in the street define as uptime — i.e. a website being online, operational and accessible — is not how web hosts define uptime. Confused? Of course you are. To make sense of this, you need to think like a web host.

Multiple Pieces of the Uptime Puzzle

There are multiple pieces of the uptime puzzle: the server on which your website lives, the data center that physically houses multiple servers, the ISP that connects that data center to the internet, and the carrier that links traffic between multiple ISPs. The uptime guarantee offered by web hosts begins and ends with the server and, if they own it, the data center. It does not cover issues or problems with the ISP or carrier. As such, if either of those components fails and your website goes down, your host will technically still be meeting its promise. You’ve heard of a non-apology apology? Well, this is a non-guarantee guarantee — and it’s just as lousy.

Less than 100% Uptime = the Same Story

Now, you may have a website host that doesn’t sing from the 100% uptime/zero downtime songbook. It may, for example, promise 99.99% guaranteed uptime, or pledge some other Ivory soap-inspired technical cleanliness standard. Yet again, the same murky logic described above applies: as long as the host’s servers and (if owned) data center are humming along, then it’s an uptime guarantee party and everyone’s invited.

The Real Guarantee

At this point, you may be wondering — and not in a curious, childlike way, but in an agitated “what on earth is going on here!?” way — about what recourse you have available if and when your host does, indeed, bear responsibility for your website going down. That’s where the Service Level Agreement (SLA) kicks in.

Basically, in most cases, the SLA between you and your web host will entitle you to a prorated rebate based on downtime that meets two conditions: 1) the downtime is the responsibility or fault of the web host, and not the ISP, the carrier, the power company, hackers, natural disasters, wizard spells, alien invasion (or just alien visitation), or any other factor that is beyond its control; 2) the downtime can be proven.

So, for example, if your business pays $100/month for managed web hosting, your site goes down for half a day, and both of these conditions are met, then you’ll get around $1.67 (half a day’s share of a 30-day month), most likely as a credit applied to your next bill. Quite the luxurious guarantee, isn’t it?
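The prorated arithmetic is easy to sketch in code (a hypothetical calculator assuming a 720-hour, 30-day billing month; real SLA formulas vary by host):

```javascript
// Prorated SLA credit: the slice of the monthly fee corresponding to
// the qualifying downtime. Assumes a 720-hour (30-day) billing month.
function slaCredit(monthlyFee, downtimeHours, hoursInMonth = 720) {
  return monthlyFee * (downtimeHours / hoursInMonth);
}

// A half-day (12-hour) qualifying outage on a $100/month plan:
console.log(slaCredit(100, 12).toFixed(2)); // prints 1.67
```

Even a painful 12-hour outage is worth less than a cup of coffee under a typical prorated scheme, which is precisely why the "guarantee" offers so little real protection.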

What You Can Do About It

The bad news is that you can’t force your website host’s 100% uptime guarantee to actually be a 100% uptime guarantee as you, and pretty much everyone else, would define it. Unless the FCC and FTC decide that this is false advertising (and they haven’t done that… yet), the splashy promise will remain, and so will the legalese fine print.

But the good news is that you can equip yourself with a globally trusted advanced website monitoring solution like AlertBot, so that you instantly know exactly when your site goes down, why it went down, and for how long. You can then use this data to pinpoint problems and fix issues immediately. AlertBot’s popular health map reports deliver crucial performance metrics directly to your inbox to ensure you stay on top of your sites. This data will also help you determine whether you should switch to a host that is better at keeping its promises.

Give AlertBot’s FREE trial a try today. There’s no billing information required, no installation, and you’ll be set up within minutes. Click here.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
7 Tips to Help Remote Workers Secure Their Home Wi-Fi https://www.alertbot.com/blog/index.php/2020/12/01/7-tips-to-help-remote-workers-secure-their-home-wi-fi/ Tue, 01 Dec 2020 21:32:48 +0000 https://alertbot.wordpress.com/?p=714 A laptop sitting on a wooden desk with two hands resting on the keyboard. Two browser windows are shown on the screen at an angle, with the graphic of a cloud and a lock floating in front of the screen.

7 Tips to Help Remote Workers Secure Their Home Wi-Fi

by Louis Kingston

Do you remember that old song called “Dem Bones” that goes: “ankle bone is connected to the shin bone, shin bone is connected to the knee bone, knee bone is connected to the thigh bone…” and so on? (It’s in your head now, isn’t it?)

Well, legend has it that hackers sing a similar song to their kids that goes: “remote worker’s wi-fi connected to the corporate network, corporate network connected to the privileged accounts, privileged accounts connected to the confidential data.”

True, it’s not as catchy, but hackers have never been about style points. They’ve been about doing what works over and over again until it stops working. And unfortunately, they’re having a ridiculously easy time these days hacking remote worker wi-fi setups, and establishing a foothold from which they launch into corporate networks — often with the goal of deploying malware to harvest confidential data (e.g. customer credit card numbers). 

The solution to this problem? Ensure that remote workers fortify their home wi-fi setup, because it is definitely not in full security mode out-of-the-box. The problem with this solution? Maybe remote workers — especially non-technical types — don’t know what to do, and are afraid that if they tinker with their router, they won’t just be banished from the land of Zoom conferences and Slack chats with colleagues; they also won’t be able to surf bizarre Reddit subs at 3:00am or watch Minecraft videos on YouTube. What kind of existence is that?

Fortunately, going from Wi-Fi security zero to hero doesn’t require a PhD in Geekology. Here are seven things that remote workers can and should do right now (if they haven’t wisely done so already) to protect themselves and their organization:

  1. Use WPA2-Personal (Wi-Fi Protected Access 2) instead of WEP (Wired Equivalent Privacy).
  2. Change the default pre-shared key (PSK) to something that is: 100% unique; at least 15 characters in length; uses a mix of numbers, letters and symbols; does not use memorable keyboard paths (e.g. “asdfg”); does not use common substitutions (e.g. “passw0rd”); does not use any dictionary words; and does not use any personally identifying information (e.g. a pet’s name).
  3. Change the Wi-Fi router’s default administrative credentials. To do this, remote workers simply head to the router’s admin page, which is usually located at http://192.168.1.1 or http://192.168.0.1, log in with the default password (typically “admin”), and change the password accordingly. The new password should check all the boxes listed in #2.
  4. Disable Wi-Fi Protected Setup (WPS). WPS was designed to make life easier by allowing people to connect devices to their home network (e.g. laptop, tablet, smartphone, etc.) simply by pressing a button on the router — no password verification needed. However, while this is convenient, the protocol between the router and devices could be vulnerable to brute-force hacking techniques.
  5. Keep the router’s firmware updated. For instructions on how to do this for a variety of router types, check out this handy step-by-step guide (including screenshots).
  6. Change the default network name (SSID). The network name is used by the router’s encryption algorithm (along with the password) to secure communications, yet widely available password cracking dictionaries (a.k.a. rainbow tables) include common network names. The best advice is to pick something short and boring — e.g. “QT24L” and not “Can’t Hack This.”
  7. Turn off network name broadcasting. Yes, it’s convenient to see the network name when connecting devices. But it’s also convenient for hackers, and for neighbors who may not necessarily want to steal data, but have no qualms about stealing bandwidth to torrent files, stream movies, and so on.
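The pre-shared-key rules above can be turned into a quick sanity check. The helper below is hypothetical and deliberately incomplete (it samples only a few keyboard runs and substitutions, and does no dictionary lookup), but it captures the spirit:

```javascript
// Quick sanity check for a Wi-Fi pre-shared key, following the rules above.
// A hypothetical sketch; a real audit would also test full cracking
// dictionaries and personal-information leakage.
function isReasonablePsk(psk) {
  const longEnough = psk.length >= 15;
  const hasMix =
    /[0-9]/.test(psk) && /[a-zA-Z]/.test(psk) && /[^a-zA-Z0-9]/.test(psk);
  // A tiny sample of keyboard runs and common substitutions to reject.
  const badPatterns = /asdfg|qwerty|12345|passw0rd|password/i;
  return longEnough && hasMix && !badPatterns.test(psk);
}

console.log(isReasonablePsk('passw0rd'));             // prints false
console.log(isReasonablePsk('T4k!ng-Gr4pes-2-Mars')); // prints true
```

Note that the second example only passes these basic checks; per rule #2, an ideal key would avoid recognizable dictionary words entirely.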

The Bottom Line

Will implementing all seven of these recommendations make a home Wi-Fi network impenetrable? No. As long as there is going to be Wi-Fi, there is going to be risk. However, doing all of the above will certainly make it tougher for hackers, and like home burglars, most of them target low hanging fruit. If a Wi-Fi connection puts up a fight, they’ll usually just move on to the next victim until they find one who hasn’t followed the advice in this article.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
6 Tips to Prepare Your E-Commerce Site for the Biggest Holiday Traffic Surge Ever https://www.alertbot.com/blog/index.php/2020/10/06/6-tips-to-prepare-your-e-commerce-site-for-the-biggest-holiday-traffic-surge-ever/ Tue, 06 Oct 2020 21:51:25 +0000 https://alertbot.wordpress.com/?p=709 A beautiful woman with brow n hair holding and looking at her cellphone, with five shopping bags also in her hands. Text on the image reads "6 Tips to Prepare Your E-Commerce Site for the Biggest Holiday Traffic Surge Ever"

6 Tips to Prepare Your E-Commerce Site for the Biggest Holiday Traffic Surge Ever

by Louis Kingston

So it begins.

No, we are not talking about the school year, the football season, or a dizzying array of television shows about zombies, detectives, and of course: zombie detectives (seriously, it’s a thing).

Rather, we are talking about the beginning of what for most ecommerce businesses is the make-or-break race to the end of the year called “gift buying season.” Except this year, things are going to be different.

To understand why, let’s zoom in on what, for most ecommerce businesses, is the most critical period of the gift buying season — Cyber Week — which starts on Thanksgiving, and runs through to Cyber Monday. According to research by BigCommerce.com, during Cyber Week 2019 same-store sales across all verticals increased by a whopping 21% compared to Cyber Week 2018, and the average order value jumped by 10%.

And now, we come to 2020, a year in which billions of people are either obligated or advised to stay at home. These folks aren’t going to even consider hopping into their car to navigate the mall jungle. Instead, they’re going to pause Fortnite, minimize Reddit, crack their knuckles, replace the battery in their mouse, and BUY all kinds of stuff online: from toaster ovens to 60” 4K TVs to luxury sneakers to mounted singing bass fish (remember those?).

Simply put: 2020 is not just going to break e-commerce sales records, but it is going to obliterate them. In fact, when it comes to how many people buy online and how much they spend, there may never be another year with such year-over-year surges in volume and value.

For e-commerce businesses, this makes the 2020 gift buying season absolutely critical — which in turn means that crashed or slow websites are NOT OK. In fact, the mere idea of their possible existence is horrifying and just plain unacceptable, like a floating island of fire ants (which, unfortunately, is also a thing).

To prevent a catastrophe worse than anything the Griswold family might experience, here are six essential things to do:

  1. Anticipate how much traffic is likely to arrive — keeping in mind that it’s probably going to be much higher than in past years during Cyber Week and throughout the remainder of 2020 — and proactively ensure that the host can handle the load. For more insight and advice, read our article on how to prevent traffic spikes from crashing your website.
  2. Assess site speed, and if necessary dial up the velocity (and then dial it up some more). If you have any doubts about this, read our article on the 3 reasons why website speed is more important than ever.
  3. Eliminate the 4 common causes of cart abandonment: unexpected costs, forcing customers to create an account, a long and winding checkout process, and bugs.
  4. Optimize the mobile shopping and purchasing experience. During Cyber Week in 2019, mobile orders were responsible for 49% of all online sales.
  5. Analyze and address security vulnerabilities. Unfortunately, shoppers aren’t the only people who love e-commerce sites — hackers do too, and they demonstrate their affection by deploying malware and launching DDoS attacks.
  6. Use a reputable and proven site uptime and availability monitoring solution like AlertBot, which will instantly notify authorized individuals (e.g. webmasters, sysadmins, etc.) if something goes wrong.

The Bottom Line

There aren’t many things that can be said with certainty about 2020. However, two things make the list: we will hear the phrase “new normal” at least a thousand more times before the year is up, and the gift buying season for e-commerce businesses is going to be colossal.

Whether that is colossal good (think Avengers) or colossal bad (think the Death Star) will largely be determined by the six essential factors described above. Which epic story do you want your e-commerce business to tell in the months ahead?

Try AlertBot today and see why it’s trusted and recommended by some of the world’s biggest enterprises. There’s no billing information to provide, nothing to download, and you’ll be completely set up in minutes — click here.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
3 Reasons Why It’s a Bad Idea to Buy Site Monitoring from Your Web Host https://www.alertbot.com/blog/index.php/2020/08/18/3-reasons-why-its-a-bad-idea-to-buy-site-monitoring-from-your-web-host/ Tue, 18 Aug 2020 17:16:21 +0000 https://alertbot.wordpress.com/?p=701 A image of multiple server racks on either side of a laptop in the foreground. The laptop screen shows a cloud graphic with an "X" over it. Text on the image reads "3 Reasons Why It’s a Bad Idea to Buy Site Monitoring from Your Web Host"

3 Reasons Why It’s a Bad Idea to Buy Site Monitoring from Your Web Host

by Louis Kingston

For baseball pitchers, the two most glorious words in the English language are “perfect game.” For actors, it’s “Oscar win” (forget all that nonsense about how “it’s an honor just to be nominated.”). For school-aged kids, it’s “snow day.” And for businesses, of course, it’s “captive audience.”

Indeed, it doesn’t matter how compelling or clever a marketing and advertising campaign might be. If audiences don’t take notice and pay attention, it may as well not exist. And if you doubt this, think of the last time you sat through 20 minutes of movie trailers — not because you wanted to, but because there was nowhere else to go (at least, not without saying “excuse me…” 10 times as you painfully twisted and squirmed your way past annoyed fellow moviegoers).

Why does this matter? It’s because your web host is singing from the captive audience songbook when it repeatedly urges you to add site monitoring to your existing hosting package. At first glance, this may seem like a good idea. After all, you know that site monitoring is important. Why not just grab it from your web host, the same way you grab a side order of fries from a fast food restaurant? Well here’s why not:

  1. Lack of Specialization

Your web host doesn’t specialize in site monitoring, which means they likely aren’t using the latest monitoring technology or hiring the most qualified monitoring professionals. Just as you wouldn’t want your doctor to sell you a timeshare during an exam (“You know what might help that bronchitis? Two weeks a year in a sunny and warm Florida condo, as you can see from this lovely brochure”), you don’t want your site monitoring company to do anything but site monitoring. It’s not something anyone should be dabbling in.

  2. Lack of Service Offering

When web hosts offer site monitoring, they typically focus on uptime. But site monitoring isn’t just about letting you know when your site goes dark. It’s also about making sure that your site is performing the way it’s supposed to — which means that all elements are functional (e.g. buttons, forms, multi-step processes, etc.), and all pages are loading rapidly. Without this critical information, you may believe that everything with your site is fine and all lights are green; that is, until you begin hearing from irate customers and start losing sales.

  3. Potential Conflict of Interest

Last but not least, your site host is supposed to meet an uptime standard as part of their service commitment. But if that same host is also monitoring your site performance, they may be less inclined to be completely transparent if they fall below this standard. And if they did fudge some of the numbers, how would you even know? With this in mind, are we saying that all hosts that offer site monitoring are unethical? Absolutely not. Are we saying that there is an inherent conflict of interest that should at least give you pause? You bet.

The Simple, Smart Solution

The best (and really, the only) way to solve this problem is to avoid it completely — which means not buying site monitoring from your host, and instead getting it from a proven, reputable vendor that:

  • Specializes in site monitoring — it’s all they do 24/7/365.
  • Offers both uptime monitoring and comprehensive performance monitoring — not just the former.
  • Has zero conflict of interest telling you the truth, the whole truth, and nothing but the truth regarding how your site is doing.

Ready to safeguard and strengthen your business with world-class, surprisingly affordable site monitoring? Then you’re ready for AlertBot! We check all of these boxes, and are trusted by some of the world’s biggest companies. Start your free trial now.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
The Basics of DNS Monitoring: What It Is, How It Works, and Why It’s Essential for Your Business https://www.alertbot.com/blog/index.php/2020/06/19/the-basics-of-dns-monitoring-what-it-is-how-it-works-and-why-its-essential-for-your-business/ Fri, 19 Jun 2020 21:02:28 +0000 https://alertbot.wordpress.com/?p=696 A rendering of a planet in the foreground and a planet behind that with a little moon to the bottom right of it. A Star Trek starship shows small off to the top right. Text on the image reads "The Basics of DNS Monitoring: What It Is, How It Works, and Why It’s Essential for Your Business"

The Basics of DNS Monitoring: What It Is, How It Works, and Why It’s Essential for Your Business

by Louis Kingston

On Star Trek, there’s an incredibly useful device called the universal translator. As you’d expect, it allows everyone to understand each other. For example, if Captain Jean-Luc Picard bumped into a race of aliens that bore a striking resemblance to Commander Riker’s beard, then they could set a date for some Earl Grey tea (hot) thanks to the universal translator. Without it, there might be grave misunderstandings and the firing of photon torpedoes.

DNS: The Next Generation

Well, the internet has its own kind of universal translator, which is somewhat less gloriously called the Domain Name System, or DNS for short. Essentially, DNS is the protocol that establishes how computers translate human-friendly domain names into the numeric addresses machines use, on the internet as well as on private networks. Its purpose is to convert a domain name into an Internet Protocol (IP) address, so that computers can identify and communicate with each other. Without the universal language of DNS, surfing the web wouldn’t be surfing at all. It would be more like wading through quicksand, because we’d all have to keep track of hundreds, if not thousands, of IP addresses.

How DNS Works

Let’s say that you type “Google.com” into your web browser. Behind the scenes, your browser sends a request to a recursive name server to get the IP address for Google.com (if the recursive name server doesn’t have the answer cached, it works its way down the DNS hierarchy: root servers, then TLD servers, then the authoritative name server for that specific domain). Ultimately, provided that the website in question exists, the browser is handed an IP address that tells it precisely where to go.

Now, does this mean that you could type in the IP address and cut out the middleman? Technically, yes. For example, if you really wanted to, you could type 172.217.10.14 — one of Google’s IP addresses — into your browser and head straight to Google without passing a DNS (or collecting $200). But why would you want to!? DNS allows you to remember simple names instead of strings of numbers like that one.
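The name-to-address lookup described above can be caricatured in a few lines. This is a toy in-memory table standing in for a recursive resolver's cache, not a real DNS client, and the hostnames and addresses are invented for illustration (203.0.113.0/24 is a reserved documentation range):

```javascript
// Toy illustration of name resolution: an in-memory "cache" standing in
// for a recursive name server. Names and addresses are invented examples.
const cachedRecords = {
  'example-shop.test': '203.0.113.10', // addresses from the TEST-NET-3 range
  'example-blog.test': '203.0.113.20',
};

function resolveName(hostname) {
  if (hostname in cachedRecords) {
    return cachedRecords[hostname]; // cache hit: answer immediately
  }
  // A real recursive resolver would now walk the hierarchy
  // (root servers -> TLD servers -> authoritative server) before giving up.
  throw new Error('NXDOMAIN: ' + hostname);
}

console.log(resolveName('example-shop.test')); // prints 203.0.113.10
```

The "NXDOMAIN" error mirrors what real resolvers return for a nonexistent name, and it is exactly the failure mode that DNS monitoring exists to catch.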

Why DNS Monitoring is Essential: Part 1

The first reason why your business needs DNS monitoring should be self-evident: if for any reason your site name isn’t resolving, then visitors won’t be able to reach it. For all intents and purposes, it will be down. Constant and automated monitoring checks to see that everything is working and there is no need for anyone to scream “RED ALERT!”

Why DNS Monitoring is Essential: Part 2

DNS monitoring also checks that the name resolution process is swift rather than slow. Why is this so important? Consider this:

  • A study by Kissmetrics found that a one second delay in load time can send conversion rates plunging by seven percent.
  • Google — a company notorious for never confirming or denying anything to do with its ultra-secret search engine algorithm — has bucked tradition and formally verified that page speed is a significant SEO ranking factor for mobile and desktop searches (now please hold still for the memory wipe procedure).
  • Thanks (or make that “no thanks”) to a phenomenon that psychologists dub the perception of speed, visitors don’t just dislike slow websites: they hate them with a passion that borders on — and often surpasses — blinding hatred. With apologies to William Congreve: Hell hath no fury like a visitor delayed.

Why DNS Monitoring is Essential: Part 3

Hackers frequently target DNS servers to redirect visitors to sites that deliver malware. Even scarier, hackers can obtain SSL encryption certificates that allow them to intercept and decrypt email and virtual private network (VPN) credentials.

The Bottom Line

DNS Monitoring lets you know three things that are more important than not plugging in a hair dryer when the U.S.S. Enterprise goes to warp speed: that your site is up, that your DNS server has not been hijacked by hackers, and that names are resolving quickly. Without this information, the only way you will know that something is wrong is when angry customers or panicked colleagues start calling.

Boldly Go with AlertBot!

AlertBot automatically and continuously monitors your DNS servers (regardless of where they are located) to ensure that everything checks out, including A records (IPv4), AAAA records (IPv6), aliases (CNAME), SMTP mail server mappings (MX records), DNS zone delegates (NS records), SOA serial numbers, and more. And if an issue is suspected or detected, your team is immediately alerted so they can take action and solve the problem.

Start a free trial now and boldly go with AlertBot!

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
10 Reasons for Website Crashes https://www.alertbot.com/blog/index.php/2020/04/23/10-reasons-for-site-crashes/ Thu, 23 Apr 2020 21:50:01 +0000 https://alertbot.wordpress.com/?p=691 A beautiful photo of a grassy field with a mountain range in the background. Text on the image reads "10 Reasons for Website Crashes"

10 Reasons for Website Crashes

by Louis Kingston

In the classic movie The Sound of Music, the whimsical governess Maria and the Von Trapp children sing about their favorite things — like raindrops and roses and whiskers on kittens. It’s joyful, it’s inspiring, and it’s in perfect harmony backed by a full orchestra. Isn’t Austria lovely?

Well, if Maria and co. were running a website (perhaps something to do with selling lederhosen or offering hiking tours in the hills), here are 10 things that absolutely wouldn’t be among their favorite things since they cause sites to crash:

  1. Coding errors, usually introduced during maintenance or an upgrade.
  2. Bugs in the programming that, alas, should have been spotted and destroyed long ago.
  3. Incompatible plugins and extensions. This is a BIG problem with WordPress sites!
  4. Traffic surges, which may require upgrading the hosting package to get more bandwidth and server capacity and/or implementing a content delivery network (CDN).
  5. Malware attacks, which not only lead to site crashes, but can land businesses on blacklists that block legitimate emails from getting through.
  6. Hacker attacks, such as DDoS. Sometimes businesses are targeted directly by bad actors or unhappy ex-customers, and sometimes businesses are caught up in the net as part of a large scale campaign.
  7. Service provider and host errors, which are probably the most frustrating of all reasons for site crashes. There is virtually nothing that a business can do but wait for a third party to get their act together.
  8. Domain expiry. Yes, sometimes sites crash simply because the domain wasn’t renewed.
  9. Google blacklists, which happen when Google decides that a site is deceptive (note: this technically doesn’t cause a site to crash, but it effectively does the same thing since it blocks traffic).
  10. Data center shutdowns, which happen during an emergency such as a fire or flood, or sometimes even by accident. For example, back in 2017 Amazon’s web hosting service went down due to an employee taking more servers offline than he intended (wonder what that guy’s next performance review was like?).

First, the Bad News…

AlertBot’s acclaimed technology cannot prevent these dreadful things from crashing your site — although now that you know what you’re up against, you can be proactive. For example, you should test all plugins/extensions before adding them to your site; make sure that you have the right hosting package, and so on.

…now, the Good News!

AlertBot’s acclaimed technology CAN make sure that your team is immediately notified whenever your site crashes, so that you can take swift action and resolve the problem before your visitors get frustrated and head to the competition.

Try AlertBot free and discover why it will quickly become one of your business’s favorite things. Heck, you might even start singing about it in the halls.  

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
How To Keep Traffic Spikes from Crashing Your Website https://www.alertbot.com/blog/index.php/2020/03/31/how-to-keep-traffic-spikes-from-crashing-your-website/ Tue, 31 Mar 2020 19:36:22 +0000 https://alertbot.wordpress.com/?p=685 A photo showing blurred lights for traffic moving super fast on three different pathways. Text on the image reads "How To Keep Traffic Spikes from Crashing Your Website"

How To Keep Traffic Spikes from Crashing Your Website

by Louis Kingston

At first glance — and probably second and third as well — having too much traffic seems like a really nice problem to have; like when billionaires struggle to decide which yacht to buy (“I say Thurston, the one with the tennis courts is quite lovely, but the one with the outdoor cinema is so charming”).

However, too much traffic really is a problem, because it causes websites to either dramatically s-l-o-w down (which is terrible) or crash (which is worse than terrible). And right now, as hundreds of millions of people are advised or obliged to stay at home, there are a bunch of e-commerce businesses around the world that are experiencing this harsh, costly reality.

The good news is that your business can — and should — take proactive steps to keep traffic spikes from impaling your website, and causing revenue losses and reputation damage.  Here is the to-do list:

  1. Use a content delivery network (CDN), which is a geographically distributed network of proxy servers and data centers. A CDN helps ensure that visitors — regardless of where they’re located — enjoy fast-loading pages, images, videos and other content, and it also leverages a network of servers to manage traffic spikes. Instead of a single server struggling to handle the load, multiple servers share the burden.
  2. Check, double-check, and while you’re at it, triple-check whether your current server is capable of handling a traffic surge (pay particular attention to any data caps). This assessment is especially important if your business’s website has expanded over the years, but your server capacity has remained the same since day one.
  3. Make sure that all of your software is up-to-date. In addition to patching vulnerabilities, updates can help lower the risk of a virtual traffic jam.
  4. Run a daily backup. No, this won’t prevent traffic-induced website crashes. But yes, it’s a lifeline back to normalcy if a crash is on the horizon.
  5. Use a reliable 24/7 website uptime monitoring solution like AlertBot, which proactively and immediately informs designated individuals (e.g. system administrators, CTO, CSO, etc.) if your website goes down, and can also check to make sure specific scripts and pages are working correctly. What’s more, if required you can use AlertBot’s logs as evidence to inform your host that they need to do a much better job of keeping your website online — or else you’ll head elsewhere.
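The server-capacity check mentioned above can start as simple napkin math. The Python sketch below is purely illustrative, with made-up numbers: it estimates whether a server with a known comfortable capacity can absorb a spike of a given size.

```python
# Back-of-envelope capacity check (illustrative numbers, not a benchmark):
# can a server that comfortably handles `capacity_rps` requests/second
# absorb a traffic spike of `spike_factor` times normal load?

def spike_headroom(baseline_rps, spike_factor, capacity_rps):
    """Return (peak_rps, utilization); utilization > 1.0 means overload."""
    peak = baseline_rps * spike_factor
    return peak, peak / capacity_rps

peak, util = spike_headroom(baseline_rps=40, spike_factor=5, capacity_rps=150)
print(peak, round(util, 2))  # a 5x spike pushes this server past 100% capacity
```

If the utilization figure comes out above 1.0, that's your cue to upgrade the hosting package or put a CDN in front of the site before the surge arrives.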

The Bottom Line
More potential customers than ever before are using the web to find products and services — everything from digital gadgets to financial advisors to home repairs, and the list goes on. When the surge reaches your virtual address, you want to definitively know — and not just hope — that your website is ready, willing and able to handle the traffic.

Give AlertBot a try for FREE. There’s no billing information, no installation, and you’ll be set up within minutes. Click here

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
Beware These 5 Possible Dangers Lurking in Free Website Monitoring Tools https://www.alertbot.com/blog/index.php/2020/02/17/beware-these-5-possible-dangers-lurking-in-free-website-monitoring-tools/ Mon, 17 Feb 2020 21:38:36 +0000 https://alertbot.wordpress.com/?p=672 A beautiful woman with brown hair and her right hand to her forehead looking concerned. Her left hand is holding her glasses. She's looking down at her laptop. A chart with graphs is in the background. Text on the image reads "Beware These 5 Possible Dangers Lurking in Free Website Monitoring Tools"

Beware These 5 Possible Dangers Lurking in Free Website Monitoring Tools

by Louis Kingston

We’ve been told by the poets that the best things in life are free: A sunrise in spring, the scent of a flower, the coo of a baby, having a buddy who can get his hands on football tickets. It’s all so beautiful and uplifting (especially the football tickets).

But at the same time, the economists remind us that there’s no such thing as a free lunch. And of course, we know from experience that this is often the case. How many times have we taken advantage of a so-called free offer, only to end up disappointed instead of delighted? A handful? Dozens? Hundreds? (And we haven’t even brought up that notorious gym membership yet…)

And that brings us to website monitoring. You know that this is important — or make that vital — to your business’s success. Indeed, going off-the-grid for even a minute can lead to lost sales and lasting reputation damage, and ongoing downtime issues can negatively impact search engine rankings. Hell hath no fury like Google and Bing scorned.

But what you may not know is that the throng of free site monitoring tools out there may be part of the problem — not the solution. Here are five potential dangers lurking in these tools:

  1. No Technical Support

Many free site monitoring tools offer no technical support to help you pinpoint issues and identify potential vulnerabilities and weaknesses. Instead, they provide you with a FAQ (or some other similar resource), and expect you to solve your own problems. You can’t even complain about this, because there’s nobody to complain to.

  2. Excessive False Positives

When is a downtime alert not a downtime alert? When it’s a false positive. These are truly (not falsely) frustrating and terrifying, and they’re a common problem among some free site monitoring tools.

  3. Reduced Test Frequencies

In their marketing, all free site monitoring tools promise to “constantly scan your site.” That sounds comforting. But some of these tools define “constantly” differently than you would — and not in a good way. Several minutes can pass between tests, which means that if something goes wrong, you’ll be left in the dark for quite a while.

  4. Limited Testing Locations

Many free site monitoring tools test from one or two locations (which is a worst practice) instead of from multiple locations around the world (which is a best practice).

  5. Slow, Limited Product Updates

Many free site monitoring tools don’t get the latest, greatest and safest product updates — because the companies that make them can’t afford to do so. After all, someone has to pay for that stuff.
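How much does test frequency actually matter? The arithmetic is simple, as this illustrative Python snippet shows: with checks every N minutes, an outage can go unnoticed for a full N minutes in the worst case, and N/2 on average (since an outage starts at a random point between checks).

```python
# How long can an outage go unnoticed? With checks every `interval` minutes,
# the worst-case detection delay is one full interval, and the average is
# half of one (an outage starts at a random point between two checks).

def detection_delay(interval_minutes):
    return {"worst_case": interval_minutes, "average": interval_minutes / 2}

print(detection_delay(5))  # a 5-minute interval: up to 5 minutes in the dark
print(detection_delay(1))  # a 1-minute interval: at most 1 minute in the dark
```

The takeaway: a tool that checks every five or ten minutes can leave a revenue-generating site dark for that entire stretch before anyone even gets an alert.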

Why Free in the First Place?

In light of the above, you may be asking a very sensible question: with so many fundamental drawbacks and limitations, why do some companies offer free site monitoring tools in the first place?

In two words: loss leader.

In more than two words: these companies use a free site monitoring tool to get customers onto their roster, after which the upsell parade starts — and it never, ever ends. Eventually, some of these customers end up buying a premium (license/subscription) site monitoring solution at a hefty price tag. The company does a happy dance, rings a bell, updates a giant telethon-like tote board, and smokes a bunch of cigars.

OK, they don’t do any of those things (at least, we hope they don’t), but the fact remains that the free site monitoring tool was never a legitimate, functional business-grade solution in the first place. Economists 1, poets 0.

And Then, There’s AlertBot!

AlertBot isn’t free, for the simple reason that we:

  • Provide exceptional technical support
  • Filter out and prevent false positives
  • Conduct frequent testing
  • Test from multiple locations around the world
  • Regularly update our technology

At the same time, AlertBot is refreshingly affordable and makes CEOs and CFOs as happy as it makes CTOs and CSOs. So yes, the best things in life are free. But second best is getting a GREAT deal on a solution that over-delivers. That’s AlertBot. Try it now and see for yourself.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
Word (and Warning) to the Wise: Site Downtime isn’t Just a Technical Issue — it’s a Customer Experience Problem https://www.alertbot.com/blog/index.php/2020/01/13/word-and-warning-to-the-wise-site-downtime-isnt-just-a-technical-issue-its-a-customer-experience-problem/ Mon, 13 Jan 2020 22:49:31 +0000 https://alertbot.wordpress.com/?p=659 A finger resting on a yellow star with four blank stars to the right of it, meaning it's a 1 star rating. Text above this reads "Word (and Warning) to the Wise: Site Downtime isn’t Just a Technical Issue — it’s a Customer Experience Problem"

Word (and Warning) to the Wise: Site Downtime isn’t Just a Technical Issue — it’s a Customer Experience Problem

by Louis Kingston

Businesses of all sizes — from small startups to large enterprises — are spending an enormous amount of money and time to deliver outstanding customer experience (CX). For example, they’re deploying contact centers, implementing customer-friendly return and warranty policies, training their workforce to be customer-centric, and the list goes on. And now, according to research by Walker Insights, CX is poised to overtake price and product as the most influential brand differentiator. To put this another way: customers are happily willing to pay a higher price, and for a more limited selection, if they’re getting the attention, performance, respect and results they expect — and frankly, demand.

The CX Gap that is Swallowing Customers

However, despite the fact that the CX party has been going on for a while and there’s no slowdown in sight, there’s a gap that many businesses are overlooking — one that is swallowing up their current and future customers, and transporting them directly to the competition: site downtime.

Here’s the thing: traditionally, site downtime has been primarily, if not exclusively, viewed through a technical lens, similar to a car breaking down or a roof springing a leak. And there is obviously truth in this perception. But it’s not the whole story, because customers out there on the virtual landscape equate site experience with customer experience. As such, when a site goes dark, they don’t think: “This customer-centric business has a technical problem with their website, and is surely going to fix it ASAP.” Instead, they think: “Wow, if this is what their website is like, then the rest of the business must be just as dysfunctional.”

Now, is this perception fair? Frankly, no. The vast majority of businesses — let’s say 99% of them — with site downtime truly care about delivering good (if not great) CX. These are the same businesses that, as noted above, are spending plenty of money and time on CX-related investments and training. They seriously and urgently want to get CX right.

But when their website breaks down or blows a virtual tire, this legitimate, longstanding investment and CX commitment is undermined — and customers react accordingly. Here are some of the grisly numbers:

  • 50% of customers say they have abandoned a transaction or purchase due to poor customer service.
  • 51% of customers say they will never do business with a company again after one instance of poor customer service.
  • 74% of customers say they are likely to switch brands if the purchasing process is too difficult.
  • 95% of customers tell others about poor customer service.

The Bottom Line 

The takeaway here isn’t that businesses need to care more about CX — because they know this already, and (hopefully) are acting on this understanding. Rather, it’s that businesses need to see the direct, immediate link between poor CX and site downtime. It’s not just a technical issue. For current and future customers, it’s the difference between whether they move forward on the buyer’s journey and serve as a profitable brand advocate, or whether they head for the exit and never look back.

Protect Your Reputation + Impress Your Customers 

AlertBot delivers world-class, surprisingly affordable monitoring that immediately notifies you when your site is not operational. You can then take rapid, focused action and solve the problem before your customers form the wrong impression — and never give you a second chance to make it right. Launch your free trial of AlertBot today.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
The (Not-So-Magnificent) 7 HTTPS Errors that Infuriate Customers and Ruin Reputations https://www.alertbot.com/blog/index.php/2019/11/19/the-not-so-magnificent-7-https-errors-that-infuriate-customers-and-ruin-reputations/ Tue, 19 Nov 2019 19:13:39 +0000 https://alertbot.wordpress.com/?p=650 A graphic with a bright orange background and a cartoonish illustration of a man with glasses sitting at his desk facing his computer looking angry. Next to this graphic are the numbers "404" and text "Oops... page not found." Article title above it reads "The (Not-So-Magnificent) 7 HTTPS Errors that Infuriate Customers and Ruin Reputations"

The (Not-So-Magnificent) 7 HTTPS Errors that Infuriate Customers and Ruin Reputations

by Louis Kingston

In the classic flick The Magnificent Seven, a pack of essentially decent but “don’t you dare park your horse in my spot or else you’ll get your spurs blasted” gunslingers come together to rid a village of some nasty bandits. There’s action. There’s drama. There’s tragedy. There’s humor. There’s romance. There’s Steve freakin’ McQueen. What’s not to love?

Well, on the dusty and dangerous internet landscape, instead of a magnificent seven to save the day, there exist seven not-so-magnificent HTTPS errors that are impossible to like, let alone love. Why? Because they stand between visitors and the websites they’re trying to reach — which leads to lost customers and wrecked reputations.

Here’s a look at the reprehensible HTTPS errors that have their picture on Most Wanted Lists in every post office from Tombstone to Dodge City:

403 Forbidden: The 403 Forbidden error means that the server is absolutely refusing — no ifs, ands or buts — to grant permission to access a resource, despite the fact that a request is valid. Common causes include missing index files, and incorrect .htaccess configuration.

404 Not Found: The 404 Not Found error means that a web page or other resource can’t be found because it simply doesn’t exist at that address. Common reasons for this include a broken link, mistyped URL, or that someone moved or deleted a page and didn’t update the server (which happens a lot).

408 Request Time Out: The 408 Request Time Out error means that the server got tired of waiting for the client to finish sending its request and, after a while, just throws in the towel. Often, this is because the connection is slow or the server is overloaded.

410 Gone: Whereas (as noted above) a 404 error implies that there might be some hope — i.e. the target file might be somewhere, just not where it’s supposed to be — the 410 Gone error snuffs out any possible optimism. It’s totally, completely and permanently gone.

500 Internal Server Error: The 500 Internal Server Error means that the server cannot process a request for any number of reasons, such as missing packages, misconfiguration, and overload.

503 Service Unavailable: The 503 Service Unavailable error means that the server is either down because of maintenance, or because it’s overloaded. Either way, the server is conjuring up its inner Gandalf and screaming: “YOU SHALL NOT PASS!”

504 Gateway Time-Out: The 504 Gateway Time-Out error means that a higher-level upstream server isn’t working and playing well with a lower-level downstream server. After a while, the downstream server gets the message that it’s not wanted, and says “Oh yeah? Well, I don’t need you either!”
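For illustration, here's how a monitoring tool might encode the seven codes above and sort them into alert categories. This is a hypothetical Python sketch, not AlertBot's implementation:

```python
# The seven status codes above, as a monitor might classify them for alerting.
# (Illustrative mapping; real monitors track many more codes and causes.)

STATUS_MEANINGS = {
    403: ("Forbidden", "server refuses a valid request"),
    404: ("Not Found", "resource does not exist at this URL"),
    408: ("Request Time-Out", "server gave up waiting for the client's request"),
    410: ("Gone", "resource permanently removed"),
    500: ("Internal Server Error", "server failed to process the request"),
    503: ("Service Unavailable", "server down for maintenance or overloaded"),
    504: ("Gateway Time-Out", "upstream server did not respond in time"),
}

def classify(status):
    """Label a status code for alerting: client/content vs server-side failure."""
    name, _ = STATUS_MEANINGS.get(status, ("Unknown", ""))
    if 400 <= status < 500:
        return f"{status} {name}: client/content error"
    if 500 <= status < 600:
        return f"{status} {name}: server error"
    return f"{status} {name}: not an error"

print(classify(503))  # → "503 Service Unavailable: server error"
```

The 4xx/5xx split matters for triage: a 404 usually points at broken links or content, while a 5xx points at the server or an upstream dependency.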

Calling in the Marshal
The bad news is that these reprehensible HTTPS errors, if left unchecked, can cause a lot of damage. Indeed, few things irk and offend website visitors more than seeing an error code. But the good news is that you can call in the Marshal — a.k.a. AlertBot — to restore law and order.

AlertBot constantly scans your site’s pages to watch out for these and other HTTP errors. If and when they are detected, authorized employees (e.g. webmasters, sysadmins, etc.) are proactively notified so they can take swift action and fix the problem.

It’s lightning fast, always reliable, and as smooth as Steve McQueen. Dastardly, good-fer-nuthin’ HTTPS errors don’t stand a chance!

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
AlertBot Showdown: Aeropostale vs GAP (The Final Showdown) https://www.alertbot.com/blog/index.php/2019/10/08/alertbot-showdown-aeropostale-vs-gap/ Tue, 08 Oct 2019 18:54:23 +0000 https://alertbot.wordpress.com/?p=636 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying shopping bags. Text reads "AlertBot Showdown: Aeropostale vs GAP" with the word SHOWDOWN very large at the bottom.
When you think of trendy, casual clothes, names like GAP, Aeropostale and Abercrombie are likely to be among the retailers that come to mind. While many of the brands we’ve come to know and trust over the years still maintain brick and mortar stores, all of them have had to make the transition to having a presence online in the wonderful digital world we call “ecommerce.”

Shopping for clothes in person is an entirely different experience than shopping online (where you can only guesstimate how a purchase may look or fit in real life), but we wanted to evaluate the online shopping reliability of two of these brands when it comes to the world wide web and their individual website performance.

To test their web performance quality, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both GAP.com and Aeropostale.com from August 4 through August 18, 2019. (We originally planned to evaluate Abercrombie.com instead of Aero, but the site produced so many errors that we decided to choose a different company’s site to monitor.)

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both Aero’s and GAP’s sites achieved over 99% uptime. Neither saw significant downtime, which is expected, but each one experienced some sluggish speeds and even load time timeouts on a couple of occasions.

Aeropostale.com experienced 99.64% uptime, with over 20 errors recorded due to slow load times or brief periods of unresponsiveness. None of these events lasted longer than a couple minutes, however, and none of them amounted to any significant downtime. Because of this, we still consider their performance to be pretty good.  (Aeropostale.com 8/10)

GAP.com experienced fewer issues, but struggled with some significant slowness on August 9th, resulting in 99.50% uptime. Otherwise, it would have had an overall stronger performance during this time period than Aero. (GAP.com 8/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring. We calculate the speed as an overall average across all locations during the time span selected for this Showdown.

When it comes to page load times, Aeropostale performed respectably, but at about twice the load time of GAP’s site. Their best day, on average, was Monday, August 5th, at 6.1 seconds. Their worst day, on average, was Thursday, August 15th, at 6.8 seconds. The site’s overall average speed across the entire test period was 6.97 seconds, which isn’t terrible, but it also isn’t much to brag about. However, one thing certainly gleaned from these results is that Aero’s site is relatively consistent across the board in regard to speed. (Aeropostale.com 7/10)

As teased above, GAP.com performed about twice as fast as Aeropostale.com did. Their best day, on average, was Sunday, August 4th with 2.4 seconds. That’s a pretty decent load time. GAP.com’s worst averaged day was Friday, August 9th, at 3.35 seconds, which is still almost half the time of Aero’s best day. The site’s overall average speed across the entire test period was 2.8 seconds, which is rather impressive.     (GAP.com 8.5/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. For this portion of the test, we compare the overall average speeds of each individual location captured during the selected period of time for this Showdown.

When it comes to geographic performance, it seems safe to say that Aero’s site is all over the map. They performed best in North Carolina at an average of 2.6 seconds, with Nevada in second at 3 seconds and Oregon third at 3.1 seconds. Those times are not bad at all. However, their slowest time was a dismal 13.3 seconds (ouch!) in Missouri, followed by 13 seconds in California, and Washington DC in third place at 12.1 seconds. (Aeropostale.com 7/10)

GAP.com also saw some drastic differences on either side of the scale, but not nearly as substantial a difference as Aero’s. Their fastest average performance was seen in Nevada, at 1.7 seconds. Oregon came in second at 1.7 seconds, and Virginia was third at 1.8 seconds. Missouri was once again at the bottom of the proverbial bargain bin with 6.3 seconds, followed by Colorado at 5.21 seconds and Texas at 5.17 seconds. Still, GAP’s geographically slowest times look like Aero’s overall fastest times, which is rather disappointing for Aero. (GAP.com 8.5/10)

Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll use their respective websites to find a nice sweater (since we’d love to cozy up in this fall weather) and add it to our cart.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.GAP.com into our Chrome browser, it took 39.10 seconds and 8 clicks to get a sweater into the shopping cart and begin the checkout process. GAP had two pop-ups about coupons and joining their mailing list, and it took a few clicks to get around those. Then we navigated to the Men’s section, selected the first long sleeve crewneck we found and added it to the cart. (And hey, it’s 40% off, too. Woohoo!)

For www.aeropostale.com, it took 6 clicks and 35 seconds to browse their fall collection, snag a thermal hoodie tee, add it to the cart, and click checkout (and hey, the price was about half-off, too!).

Honestly, both sites are pretty nice, easy to use, and straightforward. The pop-ups on GAP.com were a bit annoying, especially with there being two of them, but it’s tough to gripe about getting offered coupons to save money when you’re shopping. Aero’s site felt just a smidge more inviting, like you’re browsing a tangible catalog, and it seemed to offer quite a few options up front.

All things considered, our Usability scores are:

(Aeropostale.com 9/10)
(GAP.com 9/10)


Verdict

Both sites performed respectably, but when it comes to speed, one definitely outperformed the other—and the positive usability experience is just gravy. So, we’re pleased to announce this Showdown champion to be…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Gap.com"

]]>
4 Common Causes of Cart Abandonment — and How to Solve Them https://www.alertbot.com/blog/index.php/2019/09/05/4-common-causes-of-cart-abandonment-and-how-to-solve-them/ Thu, 05 Sep 2019 21:27:52 +0000 https://alertbot.wordpress.com/?p=626

Image of a shopping cart with green trim set against a white wall. Text on the image reads "4 Common Causes of Cart Abandonment — and How to Solve Them"

4 Common Causes of Cart Abandonment — and How to Solve Them

by Louis Kingston

It’s a sad story that has become so common that it just kind of blends into the background — like that awful elevator jazz that some coffee shops play (Thelonious Monk would NOT approve), or economy class in-flight meals (there’s less sodium on a salt lick, and you don’t get rammed in the ankle by a cabin trolley). Alas, we’re talking about the cart abandonment epidemic.

And epidemic is indeed the right word, because this problem is not local or limited. Forrester Research pegs the number of customers who bid adios to their cart at 87%, with 70% of them choosing to do so just before checkout. Overall, $18 billion worth of products each year are left to languish in digital trolleys.

Here are four common and costly cart-based reasons why customers flee the sales funnel, rather than triumphantly complete the buyer’s journey:

  1. Unexpected costs.

Customers don’t merely dislike unexpected costs like shipping, or nebulous “handling” fees (what, are people buying plutonium or something?). They absolutely hate them. There might even be a clinical psychological aversion to this called “unexpectedcostphobia.”

The solution: be transparent about all automatic or potential costs by advertising a clear and realistic estimate, providing a delivery calculator on the home page (not buried at the end of the checkout process), and if possible, offering free shipping for a minimum purchase.

  2. Obliging customers to create an account.

A decade or two ago, customers didn’t mind creating an account to purchase something online, simply because they didn’t know there was any other way. It was part of the deal, like the turning of the earth or standing in line for longer than you should at the post office. It’s going to happen.

But now, customers have enjoyed a taste of the guest checkout experience — and many of them love it; especially if they’re suffering from security fatigue and wince at the idea of remembering more login credentials. Naturally, e-commerce sites that fail to cater to this preference set themselves up for plenty of cart abandonment.

The solution: if creating an account is mandatory, make the process as simple and fast as possible (and then make it even simpler and faster). In addition, give customers an incentive to create an account such as a discount offer, special gift, or anything else that has value and isn’t going to lead to a bankruptcy filing.

  3. Long and winding checkout process.

In 1970, The Beatles sang about the “Long and Winding Road” and scored yet another U.S. Billboard #1 hit. However, e-commerce sites that have a long and winding checkout process aren’t going to be certified platinum. They’re going to be certified terrified, because cart abandonment rates will be far higher than their competition’s.

The solution: ruthlessly streamline the checkout process to the bare minimum, and use as few fields as possible. Yes, getting as much glorious customer data as possible is important — but it’s not as important as getting customers on the roster in the first place.

  4. Bugs, bugs and more bugs.

Even entomologists don’t like website bugs and other completely preventable technical errors that make online shopping irritating instead of enjoyable. Just one of these bugs is enough to trigger cart (and brand) abandonment — let alone a bunch of them.

The solution: use a reputable third-party platform to constantly monitor all important web pages and multi-step processes — such as login, signup, checkout and so on — to proactively detect and destroy bugs, or anything else that makes customers miserable like slow page loading. Learn more about this here.
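To illustrate the idea of monitoring a multi-step process, here's a hypothetical Python sketch (not AlertBot's implementation): each step of a synthetic checkout runs in order, and the check reports the first step that fails, so an alert can name the exact point where customers are getting stuck.

```python
# Illustrative multi-step synthetic check: run checkout steps in order and
# stop at the first failure, so an alert names the exact step that broke.
# The step functions here are stubs; a real monitor would drive a browser.

def run_flow(steps):
    """steps: list of (name, zero-arg callable returning bool).
    Returns (ok, failed_step_name)."""
    for name, step in steps:
        if not step():
            return False, name
    return True, None

checkout = [
    ("load homepage", lambda: True),
    ("add item to cart", lambda: True),
    ("open checkout", lambda: False),   # simulate a broken checkout page
    ("submit payment form", lambda: True),
]
print(run_flow(checkout))  # → (False, 'open checkout')
```

Knowing *which* step failed is the whole game: "checkout page broken" is actionable in a way that a generic "site has a problem" alert never is.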


The Bottom Line
Completely eliminating cart abandonment isn’t possible, because there will always be customers who pause or stop the purchase process. But solving all of the problems described above significantly increases the chances that both carts and customers will get to the finish line, and be inspired to come back for more. And isn’t that the whole point?

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes https://www.alertbot.com/blog/index.php/2019/07/22/if-you-build-it-they-wont-come-5-big-scary-and-costly-e-commerce-site-mistakes/ Mon, 22 Jul 2019 06:55:52 +0000 https://alertbot.wordpress.com/?p=623 Photograph of a corn field set against a bright blue sky. Test on it reads "If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes"

If You Build It, They Won’t Come: 5 Big, Scary and Costly e-Commerce Site Mistakes

by Louis Kingston

In the 1989 flick Field of Dreams, Kevin Costner turns his Iowa cornfield into a baseball field because a voice tells him: if you build it, he will come. The “he” in question is his late father, and the movie has a magical, uplifting ending that makes us want to dream again (and possibly, play baseball or eat some corn).

Well, many folks who launch e-commerce sites also believe that: if I build it, they will come. This time, “they” means throngs of happy, profitable customers. Except…they don’t. And before long, the site is forced to scale down or shut down. Even writing to Kevin Costner doesn’t help — even if you promise to watch a double feature of The Postman and Waterworld (not recommended without a physician’s approval).

The bad news is that this kind of misery happens all the time. The good news — actually, make that the amazing, glorious, Field-of-Dreams-ending-like news — is that preventing this doom and gloom is largely a matter of avoiding these five big, scary and costly e-commerce site mistakes:

  1. Lousy UX

Tiny buttons that are impossible to click on a mobile device without a magnifying glass and hands the size of a Ken doll's. Search functions that neither search nor function. Elusive top-level categories. Gigantic banners that pop open and chase customers around from page to page, like a kind of online shopping Terminator ("I'll be baaaaaack!"). These are just some of the many ways that lousy UX destroys e-commerce sites.

The remedy? Monitor all pages and multi-step processes (e.g. login areas, signups, checkout, etc.) to identify bottlenecks where customers routinely encounter errors or unresponsive behavior, and fix any gaps and leaks right away. Learn more about doing this here.

  2. S…l…o…w…n…e…s…s

Just how vital is speed? Behold these grisly statistics:

  • A one-second delay in load time can send conversion rates plunging by seven percent. (Source: Kissmetrics)
  • 70% of customers say that a website’s loading time affects their willingness to purchase. (Source: Unbounce)
  • As page load time increases from 1 second to 3 seconds, the probability of bounce increases by 32%; from 1 second to 5 seconds, by 90%; and from 1 second to 10 seconds, by 123%. (Source: Google)

The remedy? Be ruthless about making your e-commerce site as fast as possible (and then make it even faster). Here are the usual suspects: bloated HTML, ad network code, unoptimized images, and transmitting private data over public networks. There are other culprits, but look here first — you'll be amazed at how much speed you unleash.
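As a back-of-the-envelope illustration of the Kissmetrics figure above, here is a quick sketch. The linear extrapolation and the revenue numbers are our own simplifying assumptions, not part of the cited study:

```python
# Rough revenue impact of load-time delay, extrapolating linearly from
# the ~7%-conversion-drop-per-second figure cited above.

def monthly_loss(monthly_revenue, delay_seconds, drop_per_second=0.07):
    """Estimated revenue lost per month to a given load-time delay."""
    lost_fraction = min(delay_seconds * drop_per_second, 1.0)  # cap at 100%
    return monthly_revenue * lost_fraction

# A hypothetical $100k/month store carrying a 2-second delay loses
# roughly $14,000 a month under this (crude) model.
loss = monthly_loss(100_000, 2)
```

Crude as it is, the model makes the point: shaving even one second off load time is worth real money.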

  3. Not Focusing on SEO — or Focusing too Much on SEO

Let’s talk about health. Some people have poor health because they don’t exercise at all. Their daily calisthenic routine involves digging in the couch for the remote. And then on the other end of the spectrum, there are people who work out too much — we’re talking extremely unhealthy levels. You know the type.

The same phenomenon occurs in the e-commerce world when it comes to SEO. Some sites don’t focus on SEO at all, which means they won’t get found by the 35% of customers who start their buyer’s journey from Google. And some focus so much on SEO that they neglect other channels and tactics — including good, old-fashioned pure promotion.

The remedy? Definitely make SEO part of the visibility strategy. But don’t make it the end-all-and-be-all of online existence. It’s important, but it’s not everything.

  4. Bad Customer Service

Customer service is as important online as in the brick-and-mortar world, and in some cases it’s even more important, because exiting the buyer’s journey is so simple — as is writing a scathing zero-star review that would have made Roger Ebert wince. Unfortunately, many e-commerce sites treat customer service as an afterthought or a necessary evil, rather than as an asset that can be leveraged to optimize customer experience and generate loyalty.

The remedy? Make customer service — characterized by the ease, speed, and quality of responsiveness and resolution — a big part of the plan. It’s not an expense, but an investment.

  5. Lack of Original, Compelling Content

E-commerce sites aren’t vending machines, yet many of them seem to take their inspiration from these handy contraptions that dispense candy and soda in exchange for money and the push of a button (be careful you don’t press the wrong one — you might end up with that oatmeal cookie that has been there since 2007, and not the Snickers bar that you’re craving).

However, most customers — even those who are very focused on getting a specific item, like a pair of sneakers, a smartphone, or a hotel room — want and expect to access relevant information that helps them make a safer, smarter purchase decision. This could include videos, infographics, social proof (e.g. testimonials, reviews, case studies, etc.), articles, blog posts, and downloadable assets like ebooks, checklists, and so on.

The remedy? Don’t skimp on creating original, compelling content. As a bonus, this will help with SEO and can connect you with profitable customers who are not in your primary target market.

The Bottom Line

Competition on the e-commerce landscape for the hearts, minds, and indeed, wallets of customers is ferocious. Avoiding these mistakes will go a long, long way to helping your e-commerce site survive and thrive.

You may even make enough profit to retire early, buy a cornfield in Iowa, and then turn it into a baseball field that inspires the feel-good movie of the year. Hey, it worked once before, right?

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
AlertBot Showdown: VIVE vs Oculus https://www.alertbot.com/blog/index.php/2019/06/27/alertbot-showdown-vive-vs-oculus/ Thu, 27 Jun 2019 19:48:56 +0000 https://alertbot.wordpress.com/?p=611 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are wearing Virtual Reality head sets and holding the controls. Text reads "AlertBot Showdown: Oculus vs Vive" with the word SHOWDOWN very large at the bottom.

As technology continues to change with the times, the virtual reality experience keeps becoming more widespread and immersive. Two of the leading brands in the VR game are unmistakably VIVE (HTC) and Oculus. Both companies are leaders in the ever-expanding digital world of virtual reality, and both have released or plan to release new headset models this summer.

While these brands may corner the market on connecting to the virtual realm, we wondered how they stack up when it comes to the world wide web and their own individual website performance.

To test their web performance quality, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both VIVE.com and Oculus.com from May 1st through May 22, 2019. Given the high regard in which these companies are held because of their products, we expected their web performance to be strong.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both VIVE’s and Oculus’s sites performed quite well. Neither saw significant downtime, but each experienced some sluggish speeds and even load-time timeouts on a couple of rare occasions.

VIVE.com experienced 99.91% uptime, with just a few errors recorded due to slow load times. None of these events lasted longer than a couple of minutes, and none amounted to any significant downtime. Because of this, we still consider their performance quite solid. (VIVE.com 8/10)

Oculus.com performed similarly, with 99.98% uptime and similar slow-page-load errors that didn’t amount to significant downtime but put a minor hiccup in their performance. They experienced about a quarter as many of these errors as VIVE, so they came out just a tiny bit on top. (Oculus.com 8.5/10)
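For perspective, an uptime percentage can be translated into implied unavailable time over the 22-day test window. This is our own back-of-the-envelope arithmetic, not AlertBot output:

```python
# Convert an uptime percentage over a test window into the implied
# minutes of unavailability across that window.

def downtime_minutes(uptime_pct, days):
    total_minutes = days * 24 * 60
    return total_minutes * (1 - uptime_pct / 100)

vive   = downtime_minutes(99.91, 22)   # ~28.5 minutes over the window
oculus = downtime_minutes(99.98, 22)   # ~6.3 minutes over the window
```

Since both sites' errors were scattered slow loads rather than a single outage, those implied minutes are spread thinly across the whole three-week run.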

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring. We calculate the speed as an overall average across all locations during the time span selected for this Showdown.

The speeds for both websites were also relatively close to each other. VIVE.com’s best day, on average, was Monday, May 13 at 3.2 seconds, which isn’t bad. Their best time of day, however, was 5am on Tuesday, May 21, at 1.6 seconds. That’s definitely better, although it’s doubtful they see much traffic at that hour. VIVE.com’s worst averaged day was Thursday, May 23rd at 5.1 seconds. However, their worst time was 2pm on Wednesday, May 22nd, with a much less admirable 8.8 seconds. The site’s overall average speed across the entire test period was 3.78 seconds. (VIVE.com 8/10)

Oculus.com performed very similarly. Their best day on average was Thursday, May 2nd with 3.7 seconds. Their best response time was at 9am on Wednesday, May 15 with 2.05 seconds. Oculus.com’s worst averaged day was also (like VIVE’s) Thursday, May 23rd at 4.37 seconds (although that’s slightly better than VIVE’s worst). However, their worst time of day was 6am on Wednesday, May 1st with 7.49 seconds (making their slowest time more than a full second faster than VIVE’s slowest). The site’s overall average speed across the entire test period was 3.96 seconds (just a smidge slower than VIVE’s). (Oculus.com 8/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. For this portion of the test, we compare the overall average speeds of each individual location captured during the selected period of time for this Showdown.

Previously, California had reigned supreme as the fastest state in the U.S. But lately, other states have been stepping up and dethroning The Golden State. This time, North Carolina wins (for both sites), with VIVE.com moving at a breezy 1.69 seconds in The Old North State. Oregon came in second at 1.8 seconds, with Arizona at 2 seconds. At the other end, Washington state saw the slowest speed, coming in at a shameful 10.9 seconds, with Washington DC second at 7.55 seconds and Texas third at 7.43 seconds. (VIVE.com 8/10)

Oculus.com was also under two seconds in North Carolina, at 1.9 seconds. Their second fastest was Nevada at 2.2 seconds, then Oregon at 2.3 seconds. Overall, they were pretty close to VIVE. However, while Oculus’s single slowest location was better than VIVE’s, its second and third slowest were a little worse: Washington, DC came in at 8.66 seconds, then Washington state at 8.65 seconds, and Texas at 8.55 seconds. For the most part, though, the sites performed rather closely. (Oculus.com 8/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to see if we can order their latest VR headset.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.VIVE.com into our Chrome browser, it took 1 minute and 36 seconds (and a wealth of clicks) to come to the conclusion that you cannot order anything from their website (at least not easily, even though there’s a shopping cart icon on their menu bar), and that viewing a map to “Try VIVE Today” tells us that we have to live in Livingston, UK if we want to visit a store.

For www.Oculus.com, it took 3 clicks and 16 seconds to add the Oculus Quest 64 GB headset to our cart and be ready to checkout.

For these tests, we go in without much prior knowledge of each site’s user-side functionality to keep the test unbiased, so we were pretty surprised at how drastically different the user experience was here. To give VIVE a fighting chance – even before trying Oculus’s site – we tried choosing a different headset, in case the most recent one simply isn’t available yet, and it still didn’t help. Perhaps the problem is that we performed the test from the US while VIVE’s store locator appears to be UK-oriented. After further investigation, however, it appears that the only way to reach a purchasing option on VIVE’s site is through the “comparison” portion of the products page. Still, it seems odd that they wouldn’t make ordering their products easier and clearer. (Also, the page appears to end as you scroll, but it’s really just swapping the panel you’re “stopped” on before moving you down to the next one. It’s a neat design, perhaps, but no doubt a little confusing at first.)

With that in mind, here are the Usability scores:

(VIVE.com 5.5/10)
(Oculus.com 9/10)

 

Verdict

Both sites performed respectably, but when it comes to usability and speed, one unexpectedly outperformed the other—especially when it came to usability. So, we’re pleased to announce this Showdown champion to be…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Oculus.com"

]]>
Choosing a Website Monitoring Firm? Ask These 5 Questions Before You Buy — Not After https://www.alertbot.com/blog/index.php/2019/03/19/choosing-a-website-monitoring-firm-ask-these-5-questions-before-you-buy-not-after/ Tue, 19 Mar 2019 10:00:49 +0000 https://alertbot.wordpress.com/?p=608 A beautiful woman with long brown, wavy hair sitting in front of her laptop wearing a blue and white striped shirt and holding out a credit card in her left hand.

Choosing a Website Monitoring Firm? Ask These 5 Questions Before You Buy — not After

by Louis Kingston

Hey brother, can you spare $5 million?

That’s about what Amazon estimates it lost in sales back in 2013, when its website went down for around 40 minutes. For the math junkies out there, that’s $125,000 a minute, or $2,083.33 a second.
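The arithmetic checks out, as a two-line sketch confirms:

```python
# $5M lost over a 40-minute outage, broken down per minute and per second.
loss_usd, outage_minutes = 5_000_000, 40
per_minute = loss_usd / outage_minutes   # 125000.0 dollars per minute
per_second = per_minute / 60             # ~2083.33 dollars per second
```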

Granted, most businesses won’t suffer this kind of hefty financial setback if their website goes down. Sometimes, it pays not to be a unicorn. However, suffice it to say that there will be a significant and wholly unwelcome cost — either lost sales (as in the case of Amazon) or lasting reputation damage. There can also be compliance issues that lead to fines and sanctions. Fortunately, that’s where website monitoring firms ride to the rescue and avert disaster, right? Well, yes and no.

Here is why: just like any other marketplace, there are good website monitoring firms out there, and there are bad ones. Obviously, your mission is to choose the former and avoid the latter. But how? All firms promise to offer “comprehensive and robust” web monitoring services. Based on this, you may believe that the only real difference between them is price — which is simply not the case. There are major categorical differences, and you do not want to discover after you sign (or affix your e-sig) on the dotted line that you’re on the wrong end of an over-promise and under-deliver arrangement.

To avoid that fate and help you filter website monitoring firms worth exploring from firms best avoided, here are five questions to ask before you buy — not after:

  1. Is your platform fully integrated?

Ensure that you get a fully integrated monitoring platform that covers all of your digital properties — including your websites, mobile websites, web apps, and cloud services (SaaS) — so that you can access all of the real-time information you need in one place. Juggling multiple tools isn’t just tedious and complicated; it can also lead to errors, oversights and disasters.

  2. How deep do you dive?

Don’t settle for just monitoring the basic availability of your URL. That’s like taking your car to the mechanic for a tune-up and declaring everything perfect as long as it starts (and you get a bill for $150). You want to dive deep and monitor full page functionality within real web browsers, verify all elements, scripts, and interactive features (like real clicks and keyboard interactions), and scan for errors to proactively detect problems. You also want the option to monitor any port on any server or device, and to track load times since, as we’ve written about, businesses with s-l-o-w websites are hanging out a virtual “Going Out of Business” sign.
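As a taste of what “any port on any server” means, here is a bare-bones TCP probe in Python — a simplistic stand-in for what a monitoring service runs continuously, from many locations (the hostname and port in the example line are placeholders):

```python
import socket

def port_is_open(host, port, timeout=3.0):
    """True if a TCP connection to host:port succeeds within `timeout`
    seconds; False on refusal, unreachable host, or timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_is_open("example.com", 443) would check that HTTPS is reachable
```

A real service layers scheduling, multi-location probes, and alert escalation on top of a primitive like this.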

  3. Is there anything to install?

Steer clear of (usually empty) promises that installation and setup is fast, easy, breezy, exciting, or any other adjective that you’d expect to hear in a shampoo commercial. You shouldn’t have to install anything whatsoever, and setup should take a matter of minutes — not hours or days.

  4. Do we have to maintain anything?

That groan you hear is the echo of countless IT professionals who have valiantly fought — but lost — the battle to maintain website monitoring tools. End the suffering and be the hero that your IT team needs by choosing a firm that handles all maintenance, including ongoing updates and innovations.

  5. Do you offer a free trial?

There may be “no such thing as a free lunch,” but there is indeed such a thing as a free trial. The firms on your shortlist should offer a full two-week trial rather than just a few days, so that you can put everything to the test in your environment. After all, you wouldn’t buy a car without a test drive, right? Except in this case, there is no salesperson sitting beside you saying, “what’s it going to take to get you to drive home in this baby?”

The Bottom Line

Choosing the right website monitoring firm — and avoiding the wrong ones — is a critically important decision that, sooner or later, will impact your bottom line: for better or for worse. Asking prospective vendors all of the above questions is a smart and practical way to ensure that your selection is rewarding vs. regrettable.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
AlertBot Showdown: Dunkin Donuts vs Starbucks https://www.alertbot.com/blog/index.php/2019/01/29/alertbot-showdown-dunkin-donuts-vs-starbucks/ Tue, 29 Jan 2019 21:17:26 +0000 https://alertbot.wordpress.com/?p=598 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying travel cups of coffee. Text reads "AlertBot Showdown: Dunkin Donuts vs Starbucks Coffee" with the word SHOWDOWN very large at the bottom.

If there’s one snack shop you’re likely to find on any given street corner in your city, there’s a good chance it’s either a Dunkin Donuts or a Starbucks (and in some cases, they sit on opposite sides of the same street). Both chains serve up steaming hot caffeinated goodness – at varying price points – as well as other sweet treats. And while different areas of the globe may have more common chains than these two, we East Coast natives have regular access to the fresh beans of these coffee connoisseurs.

It’s no secret that those who rely on a warm, fresh cup of java to get their day started also know these bean beverages affect their daily performance. So we wanted to pose the question – what about the web performance of these respective coffee shops?

To test their website performance, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both DunkinDonuts.com and Starbucks.com from December 1st through Christmas Day, 2018. Given the popularity of both establishments, we expected their performance to be as strong as their brews, and we weren’t disappointed.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both Dunkin Donuts’ and Starbucks’ sites performed quite well. Neither saw significant downtime, but each experienced some sluggish speeds and even load-time timeouts on a couple of rare occasions.

DunkinDonuts.com experienced 99.96% uptime, with just a few errors recorded due to slow load times. None of these events lasted longer than a couple minutes, and none amounted to any significant downtime. Because of this, we still consider their performance to be quite solid.  (DunkinDonuts.com 8.5/10)

Starbucks.com performed similarly with 99.87% uptime and similar slow page load errors that didn’t amount to significant downtime but at least put a wrinkle in their performance. They experienced four times as many of these errors as Dunkin, so we have to take that into consideration with our rating. (Starbucks.com 8/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

The speeds for both sites were relatively close to each other. DunkinDonuts.com’s best day, on average, was Sunday, Dec. 2 at 4.8 seconds, which isn’t stellar by any means, but not the worst either. Their best time of day, however, was 4am on Wednesday, Dec. 19th with 2.1 seconds. That’s considerably better, but 4am isn’t exactly prime web traffic time. Dunkin’s worst averaged day was Monday, Dec. 17th at 6.2 seconds. However, their worst time was 9am on Saturday, Dec. 22 with a crawling 10.5 seconds. The site’s overall average speed across the entire test period was 5.6 seconds. (DunkinDonuts.com 7.5/10)

Starbucks.com didn’t fare too much better in comparison. Their best day on average was Saturday, Dec. 1st with 5.2 seconds. Their best response time was at 7am on Monday, Dec. 17 with 2 seconds. (It’s interesting that their best average time was on Dunkin’s worst averaged day.) Starbucks’ worst day on average was the previous day, Dec. 16, with 6.9 seconds, with their worst response time on average being at 9pm on Friday, Dec. 7th with a slightly-slower-than-Dunkin’s-speed of 10.7 seconds. But, as you can see, both sites performed pretty close to one another. Starbucks.com’s overall average speed during the entire test period was a tad slower, at 6.3 seconds.   (Starbucks.com 7/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

If you’ve been following these competitions, you’ll know that California long went unbeaten in website load-time speed. Lately, however, we’ve been seeing more variety in which U.S. state posts the fastest speeds. This time around, Nevada wins (for both sites), with DunkinDonuts.com moving at a swift 1.79 seconds in The Silver State. Oregon came in second at 1.8 seconds, with Ohio at 2 seconds. At the other end, Washington state saw the slowest speed, coming in at 10.8 seconds, with Colorado second at 9.2 seconds and Texas third at 9.1 seconds. (DunkinDonuts.com 8/10)

Starbucks.com loaded at 1.4 seconds in Nevada, faster than Dunkin’s best time. Their second fastest was 1.5 seconds in Oregon, then 1.7 seconds in Ohio – all better than Dunkin’s best (1.79 seconds). However, Starbucks’ slowest load times were all worse than Dunkin’s slowest: Washington came in at 12.5 seconds, then Colorado at 11.6 seconds, and Texas at 11.4 seconds. So while Starbucks was a little faster at its best, it was also considerably slower at its worst. (Starbucks.com 7.5/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find their rewards program and get ready to sign up for it. (And we’re writing about it as we’re performing the test.)

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.DunkinDonuts.com into our Chrome browser, it took 15 seconds and 1 click to find the signup page for their rewards program. (OK, maybe this is too easy?)

For www.Starbucks.com, it took one click and 10 seconds to get to the rewards signup page.

For these tests, we attempt to go into them without much prior knowledge of the sites’ user-side functionality to keep the test unbiased, but this one probably calls for a retest with a different approach.

Let’s try navigating their respective menus and trying to find out about their coffee items.

With this in mind, from the point of typing in DunkinDonuts.com and navigating through their menu to their coffee options, it took 4 clicks and 23 seconds to get to the page with their regular drip coffee and its nutrition info. It’s a nice website and an enjoyable one to navigate.

With the same goal in mind, for Starbucks.com it took 5 clicks and over 35 seconds to find the brewed coffee; the confusing menu setup made it tough to find just plain, hot drip coffee. The Dunkin menu has images for all their options, but Starbucks drops most of the images once you get into the menu, so we ended up on the cold brew menu instead. (As it turns out, the fifth option, “Freshly Brewed Coffee,” was what we were actually looking for… you’d think it’d be one of the first options, though… right?)

Given that the first test was inconclusive, the second one was a clear one for us (albeit unexpected). DunkinDonuts.com was quicker and easier to navigate, and much more user friendly.

With that in mind, here are the Usability scores:

(DunkinDonuts.com 9.5/10)
(Starbucks.com 8/10)

 

Verdict

Both sites performed respectably, but when it comes to usability and speed, one unexpectedly outperformed the other—even if just by a little bit. So, we’re pleased to announce this Showdown champion to be…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "DunkinDonuts.com"

]]>
Black Friday / Cyber Monday 2018 Showdown: Amazon vs Walmart vs Target https://www.alertbot.com/blog/index.php/2018/11/29/black-friday-cyber-monday-2018-showdown-amazon-vs-walmart-vs-target/ Thu, 29 Nov 2018 19:13:25 +0000 https://alertbot.wordpress.com/?p=584 A graphic with a yellow starburst in the center and two robots charging towards a third robot. The two on the left are carrying shopping bags. The one on the right is carrying a box. The text reads "Cyber Week 2018 - AlertBot Showdown: Target vs Walmart vs Amazon" with the word SHOWDOWN very large at the bottom.
Last year, we stepped outside the usual format of our Website Showdown blogs to not only tackle Black Friday and Cyber Monday, but to cover three of the biggest retailers in the process. It was a battle royale for the ages: Walmart vs Target vs Amazon, three web retailer giants duking it out for kingship of the ecommerce realm. Walmart.com edged out its competitors just a bit in 2017, so we were especially curious to see who might reign supreme in 2018. Would Walmart keep the title, or has Target or Amazon stepped up their game?

While we’re still recovering from full bellies and empty wallets after the Thanksgiving celebratory weekend, we pored over the performance results for each site to see how they compared to last year’s event.

As usual, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor all three sites from Thanksgiving Day through Black Friday and Cyber Monday, spanning November 22, 2018 to November 26, 2018. We expected strong, reliable performance again during the entire run, and we were not disappointed. In fact, we mostly saw improvement this year over last year.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Last year, in an unusual feat, not one of the three sites experienced a single error or failure event. The same mostly held true for 2018: both Walmart.com and Target.com struggled with a few slow file load times (which can slow down a page’s rendering), but it was never enough to cause any actual site downtime. With that in mind, we think it’s still fine to award 10s across the board.

(Amazon 10/10)
(Walmart 10/10)
(Target 10/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart


Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Last year was the first time we ran this event, so it was interesting to be able to compare last year’s results with this year’s. Ecommerce sites tend to have very graphics-heavy designs, and especially with sale events like these, the graphics are often big, frequently changing, and sometimes even animated or video-driven. (Amazon even had live video streaming at one point throughout the purchasing frenzy!)

With that said, through Amazon.com’s 5-day run, they saw their fastest day, on average, on Sunday, November 25th with 4.2 seconds, almost exactly matching last year (their fastest was also a Sunday, at 4.3 seconds). Their slowest day, on average, was actually Black Friday itself at 4.5 seconds, which, admittedly, still isn’t too bad. When looking at specific times of day, the best hour was 7AM on Sunday with an impressive 2.6 seconds (an improvement over last year by almost a full second), while the day before saw the slowest hour at noon with a dismal 9.3 seconds (significantly worse than last year).
(Amazon 9/10)

Walmart.com was the fastest last year, and not only held that title again this year, but also showed improvement! Their best average day was Cyber Monday, November 26th at 3.8 seconds. Their worst day on average was Sunday, November 25th, at 4.1 seconds (coincidentally, it was also Nov. 25th last year, but this year it was almost a full second faster). Finally, their best hour on average was on Cyber Monday at an impressive 1.8 seconds at 6PM. Their worst time on average was 6.9 seconds at 5PM on Black Friday, which is not when you want to be experiencing your slowest web speed.
(Walmart 9.5/10)

Last, but certainly not least, Target.com performed respectably, but once again underperformed compared to the other two. Their best day for speed, on average, was Black Friday at 5.4 seconds, which is not only worse than both Amazon’s and Walmart’s worst days, but .2 seconds slower than their own performance last year. Target’s slowest day on average was Cyber Monday, November 26th at 6.3 seconds, almost a full second slower than last year. Their fastest hour turned out to be on Black Friday at 5AM with 3.1 seconds, a slight improvement. Their slowest hour, however, was Monday at 3PM with 8.9 seconds, over a second longer than last year, and sadly right in the middle of Cyber Monday.
(Target 8.5/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California has almost always come out on top as the fastest state, but this year it was consistently dethroned by none other than Oregon! For Amazon.com, the ecommerce mega-site saw average load times of 1.4 seconds in The Beaver State, with its next-fastest locations being Ohio at 1.6 seconds and Nevada at 1.8 seconds. When it came to its slowest locations, Washington, D.C. took the prize at a sluggish 7.5 seconds, with Washington state clocking in at 7.3 seconds.
(Amazon 9/10)

Just like in 2017, Walmart.com was faster, but by a mere tenth of a second, seeing an average load time of 1.3 seconds in Oregon. Nevada and Ohio followed at 1.4 seconds, matching Amazon’s fastest time. Washington state saw the site’s slowest load time at 6.8 seconds, with Colorado coming in at 6.5 seconds and Texas at 6.3 seconds – all of them faster than Amazon’s worst locations.
(Walmart 9.5/10)

Target actually saw some improvement this year, with its fastest average load time being 2.3 seconds in Nevada (last year’s was 2.7 in California), while Oregon came in second at 2.5 seconds and Ohio third at 2.7 seconds. And like last year, Target’s fastest speeds proved slower than their competitors’. The slowest average speed Target saw in the U.S. was sadly worse than last year: Washington state clocked in at a truly dismal 10.7-second average load time, with Colorado about a second behind at 9.6 seconds and Texas at 9.3 seconds. It’s unfortunate that Target continues to miss the mark for website speed.
(Target 8.5/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we always select a common task a user might typically try to accomplish when visiting the sites we’re testing and replicate it. For last year’s Showdown, we decided to see what the experience would be like to use these three different websites to add a common product to the shopping cart. To do this, we selected one item to search for and add to our cart, and this year we decided to do the same again.

For each of these processes, we picked an easy item to search for, and sought to add a Blu-ray copy of Disney and Pixar’s Incredibles 2 to our shopping cart. To begin each process, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.amazon.com into our Chrome browser, typing “Incredibles 2 blu-ray” into the store’s search box, and adding it to the cart, it took 34 seconds. From the front page, it took about 5 clicks (including having to log in to get to the final checkout) to get to the “Place your order” window.

From the point of typing www.walmart.com into Chrome and going through the same process, it took about 6 clicks and 32 seconds to log in and get to the final cart checkout page.

And from the point of typing www.target.com into our Chrome browser, it also took about 6 clicks and 32 seconds to log in and get to the checkout window.

Each site was a good experience to use, although each one has a different feel and approach. It’s a tough call to say which user experience we found better, since each was straightforward and easy to use. If we judge the sites based on search results, Amazon first suggested a few things unrelated to the specific “blu-ray” search (like a Jurassic Park daily deal and a preorder for Venom), while both Target and Walmart returned more direct and accurate results (even though Walmart listed the DVD and 4K editions before the actual Blu-ray). On that basis, we’d have to give Walmart and Target a little more props for accuracy in their product search.

(Amazon 9.5/10)
(Walmart 9.5/10)
(Target 10/10)

 

Verdict

With stakes this high once again, you would only expect the best from the leaders in ecommerce, so it comes as no surprise that the results were so good and so close.

With all things accounted for – reliability, speed, geographical performance, and the site’s usability – we’ve reached our verdict, and it surprises even us for a second year in a row:

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Walmart.com"

]]>
AlertBot Showdown: Staples vs OfficeDepot https://www.alertbot.com/blog/index.php/2018/10/23/alertbot-showdown-staples-vs-officedepot/ Tue, 23 Oct 2018 17:43:08 +0000 https://alertbot.wordpress.com/?p=573 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying office supplies. Text reads "AlertBot Showdown: Staples vs Office Depot" with the word SHOWDOWN very large at the bottom.

As our world continues to creep ever closer to being paper-free (trading our paper tablets for iPads), office supply stores have had to reinvent the way they do business and what their focus is. Staples and Office Depot are two mega-chain retailers who’ve long been in the fight, regularly providing printing services as well as day-to-day necessities for the workplace, like pens, calendars, computer accessories, and so much more. And with all-in-one ecommerce solutions monopolizing the public’s needs (we’re looking at you, Amazon), the desire to shop at these niche market leaders (who typically charge more for the same products) is becoming less and less.

So, for our latest Showdown, we looked at these two office supply bigwigs and used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from August 26 to September 16, 2018. After engaging in this different kind of “Office Olympics,” we were expecting the usual quiet response from two reliable websites (i.e. good performance), but instead found what was equivalent to, well, a fun office chair race gone horribly wrong.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both Staples’ and OfficeDepot’s sites seemed to perform satisfactorily, with neither ever seeing significant downtime, but one of them really seemed to struggle with its load time.

AlertBot ended up returning over 800 alerts from Staples.com in the evaluated time span, with half of them being slow files bogging down the page, and the other half being page load timeouts. This doesn’t necessarily mean the site crashes, just that it’s taking unusually long to load. Their site regularly had a pop-up window during this time period promoting signing up for their email list, which seemed to play a part in disrupting the site’s load time and process.  (Staples.com 5/10)

On the flip side, OfficeDepot.com performed much better (despite also having a pop-up on its page). But while it seemed to see problems less often, it did experience two failure events, resulting in 98% uptime (compared to Staples’ 100%). The majority of the errors OfficeDepot experienced were slow files or longer load times. Despite this, however, its worst times were in the middle of the night (a frequent site maintenance window), which is common for most sites. (OfficeDepot.com 7/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Staples.com’s speed tests proved that load times were a regular issue. Its best day, on average, was Monday, September 17th with 7.9 seconds. It’s not the worst load time, but given that most sites are expected to load in 2 to 3 seconds these days, it’s almost three times that. Their best time of day was on Thursday, September 6 at 10am with 3.3 seconds. The worst day, on average, was Friday, September 7th with 10.3 seconds, while the worst time of day was at 1am on Sunday, September 9th with a sluggish 13.8 seconds.  (Staples.com 7/10)

OfficeDepot.com actually fared worse, comparatively. Their best day proved to be Thursday, September 6 with 9.9 seconds for the page to load. Their best time of day was at 6pm on Wednesday, September 5th at 6.4 seconds. Their worst is significantly worse, with Monday, August 27th seeing an average of 12.5 seconds, and the worst time of day being on the same day at 3am with 16.8 seconds! (OfficeDepot.com 6/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

Typically, for the geographic tests, California is king, always turning in the fastest response time. For Staples, though, it was actually North Carolina, which saw an average page load time of 3.7 seconds. Washington, DC was second at 4.7 seconds, and New York third at 5.2 seconds. The states with the slowest results were Missouri at 15.1 seconds and New Jersey at 15 seconds. Oddly enough, California, Florida, Colorado and Virginia all averaged 15 seconds as well, which is unusual. (Staples.com 6.5/10)

Things were the norm for OfficeDepot, however. They saw their fastest speeds in California, at 7.5 seconds, with Virginia being second fastest at 7.7 seconds. Their slowest performance was Missouri with a crawl of 19.9 seconds, and Utah followed it up at 15.6 seconds. (OfficeDepot.com 6/10)

These aren’t the worst website load times we’ve seen, but they also weren’t anything to brag about either.

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For evaluating a site’s usability, we always select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find an office executive chair and add it to our shopping cart.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.staples.com into our Chrome browser, it took 30 seconds and 5 clicks to search for “office executive chair,” click on one to view its product page, add it to the cart, and click “checkout.” (It had us thinking “That was easy!”)

For OfficeDepot.com, it took about 40 seconds and 6 clicks to get to the checkout process. OfficeDepot had a pop-up as soon as we got to the site which added one click, and then clicking on the cart and going to the checkout seemed to be a clunkier experience.

It’s a tough call for usability, but we did find the Staples checkout process to be a tad smoother.

All things considered, here are the Usability scores:

(Staples.com 9/10)
(OfficeDepot.com 8/10)

 

Verdict

It’s surprising how closely these two office supply giants performed, and how disappointingly both did as well. Still, neither was so bad as to experience many full-on website failures, but both could benefit from some serious attention paid to increasing their website speed. Neither site really stands out above the other on performance, because the good and the bad often balanced each other out, but when it comes down to considering sheer usability as a tie breaker, we feel the verdict is…

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Staples.com"

]]>
3 Reasons Why Website Speed is More Important than Ever https://www.alertbot.com/blog/index.php/2018/10/15/3-reasons-why-website-speed-is-more-important-than-ever/ Mon, 15 Oct 2018 18:01:36 +0000 https://alertbot.wordpress.com/?p=569 A graphic showing two loading bars. The top one, which is loading faster has the silhouette of a cheetah running. The bottom bar, which is much slower, has the silhouette of a snail.

3 Reasons Why Website Speed is More Important than Ever

by Louis Kingston

Today’s business environment is relentlessly fast-paced. Today’s startups blast into tomorrow’s enterprises. And just as rapidly, today’s unicorns take a one-way journey into “hey, whatever happened to…” country. However, there’s another critical piece of the velocity puzzle that many businesses are missing, and it’s costing them customers and profits: the speed of their website.

Speed Kills

Nearly 50 years ago, the government introduced the phrase “speed kills” to warn drivers that going too fast from point A to point B could result in a detour to point C (the police station), point D (the hospital) or point E (the morgue). It was good advice then, and it’s still good advice now.

But when the scene shifts from the asphalt freeway to the information superhighway, speed doesn’t kill anything. On the contrary, it keeps websites alive as far as visitors are concerned. Here are the 3 reasons why:

  1. Speed Makes Websites Sticky

The word “bouncy” has a happy and positive feel to it, while the word “sticky”…well, it doesn’t. Nobody shows up to a birthday party excited to jump around in the sticky castle, and swimming pool diving boards wouldn’t be doing their job if people stuck to them (although it would be kind of hilarious).

But when it comes to websites, sticky is glorious and bouncy is dreadful — and that’s where speed makes a massive difference. A study by Kissmetrics found that a one second delay in load time can send conversion rates plunging by seven percent! Think about that. Actually, don’t think about that. Just read this sentence. That took a whopping two (!) seconds.
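To put that seven percent in perspective, here’s a hypothetical back-of-the-envelope calculation. The monthly revenue figure is made up for illustration; the 7% relative drop per second of delay is the Kissmetrics figure cited above:

```python
# Hypothetical example: what a 7% relative conversion drop per
# second of load-time delay could cost a store each month.
monthly_revenue = 100_000  # hypothetical baseline, in dollars
drop_per_second = 0.07     # 7% relative drop per second of delay

for delay in (1, 2, 3):
    remaining = monthly_revenue * (1 - drop_per_second) ** delay
    print(f"{delay}s delay -> ~${monthly_revenue - remaining:,.0f}/month lost")
```

Even one second of delay takes a noticeable bite, and the losses compound with each additional second.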

  2. Speed is SEO Rocket Fuel

An old joke in the SEO world goes like this: “Where’s the best place to hide a dead body? Page two of Google.” (And in related news, an old conversation among psychologists is: “Why do SEO people make jokes about hiding dead bodies?”)

Macabre humor aside, the point is simple to understand: for most (if not all) of their keywords, businesses either need to be on page one of Google — and preferably in the top three positions — or they might as well be advertising in the Yellow Pages (ask your grandparents).

Once again, speed is a big part of the SEO story. Google — which is obsessively secretive about how its algorithm works (the first rule of Google Search Club is that you don’t talk about Google Search Club) —has actually gone ahead and formally confirmed that page speed is a significant SEO ranking factor for mobile and desktop searches.

The moral of this story? All else being equal, a website that loads faster will rank higher than one that loads slower. And in the long run, that could mean the difference between surviving and shutting down.

  3. Speed Influences Perception

Einstein revealed that time, quite literally, is relative. But you don’t have to become a physicist or get yourself on a million memes to experience the deep truth of this in your bones. Here’s a fun little experiment:

Imagine that your favorite football team is losing a very important game. It’s late in the fourth quarter, and your beloved team is behind by six points. Although the clock is ticking down one second at a time, in your view the time is racing by. Surely, the clock must be rigged!

Now, imagine that your team is ahead instead of behind. The clock is still ticking down one second at a time, but to you it’s not racing — it’s grinding slowly and painfully. Yet again: the clock must be rigged!

What this simple example demonstrates is what psychologists dub the perception of speed. Essentially, this means that our emotions influence how we grasp the velocity of passing time. Just a few seconds can seem like the “blink of an eye,” or a tedious wait — as we all know from toiling at the (not-so) express line in the grocery store.

The direct link to website speed here is unmistakable: visitors dislike waiting for websites to load. Actually, they hate it. Each extra second exponentially adds to their unhappiness, and makes it more likely that they’ll exact revenge by smacking the back button on their browser — never to return.

No, this doesn’t mean that websites must load instantaneously, like flipping channels on a TV. Technology isn’t there yet, and visitors aren’t unreasonable or unrealistic. But yes, it does mean that speed is connected to UX and, ultimately, to brand: fast loading times create a positive experience and emotions that become associated with the brand, while slow loading times do the opposite.


The Bottom Line

Website speed has always been important. But these days, it’s crucial — and in many cases, it’s THE MOST IMPORTANT factor. After all, it really doesn’t matter how amazing a website is and what it offers, if visitors never get there in the first place.

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

 

]]>
AlertBot Participates In Lehigh Valley Zoo’s “Run Wild for Animal Conservation” 5K Race https://www.alertbot.com/blog/index.php/2018/08/31/alertbot-participates-in-lehigh-valley-zoos-run-wild-for-animal-conservation-5k-race/ Fri, 31 Aug 2018 18:58:48 +0000 https://alertbot.wordpress.com/?p=552 Several members of AlertBot’s staff joined over 900 participants in Sunday’s “Run Wild for Animal Conservation” 5K Race at the Lehigh Valley Zoo.

While on the Run Wild 5K/10K trail, we ran through the Trexler Game Preserve, an 1,100-acre animal sanctuary. The race finished inside the Lehigh Valley Zoo, leading everyone past its animal exhibits, including camels, zebras, and kangaroos. A few of us even stuck around after the race to mingle a bit with the zoo’s various furry residents.

The AlertBot team is excited to be able to contribute to such a noble cause as animal conservation, especially with thousands of species remaining endangered today. All proceeds went to benefit the Lehigh Valley Zoo’s animal conservation efforts; the race raised over $25,000 last year and nearly doubled that this year, bringing in $40,000.

Run Wild was a success, and we can’t wait for the next opportunity to strap on our sneakers and join in the efforts to make a difference in our community!

 

Runners with numbers on the back of their shirts participating in an event Runners crossing the finish line Two men with their hands up to give each other a high five A male runner crossing the finish line Three runners with numbers on their shirts posing for a photo A photo of a giraffe in its pen at the zoo A photo of a zebra in its pen at the zoo

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
AlertBot Showdown: Moviepass vs Sinemia https://www.alertbot.com/blog/index.php/2018/08/21/alertbot-showdown-moviepass-vs-sinemia/ Tue, 21 Aug 2018 18:29:00 +0000 https://alertbot.wordpress.com/?p=542 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying membership cards and ticket stubs. Text reads "AlertBot Showdown: moviepass vs sinemia" with the word SHOWDOWN very large at the bottom.
With streaming services like Spotify, Apple Music and Amazon redefining how we consume music, and Netflix, YouTube and Hulu changing how we consume movies and TV at home and on the go, it should probably be no surprise that the subscription service concept would make its way to the cinema. MoviePass has long been a leader when it comes to theater-going subscriptions, but Sinemia is a rising competitor that has thrown its hat into the ring to fight for a share of the movie-going, popcorn-munching theater ticket buyers. Both services allow movie fans to pay a specific monthly (or annual) fee to see movies on the big screen at a discounted price.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from July 1 to July 22, 2018. As both sites and services are continuing to grow and change (Heaven knows MoviePass will probably change their rules and operations again before you finish reading this sentence), we weren’t surprised to see how similar the sites for each service performed.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both MoviePass and Sinemia performed well here, but one did seem to struggle a little more than the other.

MoviePass.com experienced a 98.2% average uptime due to several days where the site seemed to perform slower than usual, causing pages to not load fully – even triggering a strange account lookup error on the front page for several hours on July 14th. This resulted in 18 failure events cataloged by AlertBot, with an average failure time of 32 minutes. This doesn’t mean downtime, per se, but the details did show the site struggling with its speed and load times. (MoviePass.com 7/10)

Comparatively, Sinemia.com saw 99.98% uptime with 1 failure event, although it wasn’t anything that spelled major downtime. At worst, it appeared to be a slow page / busy error that didn’t last long enough to qualify as site downtime. Overall, Sinemia proved to be pretty reliable. (Sinemia.com 9/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

MoviePass.com saw acceptable page load speeds overall, with their best average day being Wednesday, July 4th with 3.9 seconds. The best time of day was 1am on Friday, July 20th (which isn’t a popular time to even be using a site like theirs) at an average of just 1.6 seconds. On the other side of the proverbial coin, the slowest day was Saturday, July 14 with an average time of 8.9 seconds, and the worst time of day was also on the same day at noon (yikes!) with an embarrassing 14.1 seconds.  (MoviePass.com 7.5/10)

Sinemia actually didn’t perform too much better, with their best average speed for a single day being Saturday, July 21 with 5.4 seconds and their best time of day being Wednesday, July 4th at 5pm with 2.7 seconds. Their slowest day was Monday, July 23rd with 7.3 seconds, with the slowest time being on July 2nd at 10pm with 10.2 seconds. (Sinemia.com 8/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

MoviePass.com performed the fastest in California with 1.8 seconds, with Florida coming in second at 2.4 seconds. The site performed slowest in Missouri with a sluggish 10.2 seconds, with Utah coming in second at 8.5 seconds. (MoviePass.com 8/10)

For Sinemia.com, California was also the fastest at 2.9 seconds, and Virginia was second fastest at 3.5 seconds. Missouri was also the slowest, at 11.3 seconds, with Utah being second slowest at 9.1 seconds. (Sinemia.com 7.5/10)

Neither site was all that impressive when it comes to speed – which is interesting considering there isn’t a whole lot of content on their websites to slow them down.

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to start the service signup process (but not complete any forms).

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.moviepass.com into our Chrome browser, it took a mere 18 seconds and 2 clicks to see their plans and get to the signup form. It was a piece of cake.

For Sinemia.com, it was actually just as smooth. In 17 seconds and 2 clicks, we were able to select a plan and get to the signup page.

It’s a tough call for usability. They’re simple processes, but they get the job done and we have no complaints.

All things considered, here are the Usability scores:

(MoviePass.com 10/10)
(Sinemia.com 10/10)

 

Verdict

The usability usually isn’t this straightforward and clear for both sites, so it leaves us to look almost exclusively at the other categories to draw a conclusion.

Even without assuming MoviePass may have more hiccups in speed due to heavier traffic, Sinemia.com seems the clearer choice for reliability as a whole, though the sites are quite close. That bad day on July 14th really hurt MoviePass’s performance during this evaluation period, and it can’t be ignored. So, with that said, we believe the verdict is…

Winner:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Sinemia.com"

]]>
How To Reduce HTTP Requests To Speed Up Your Site https://www.alertbot.com/blog/index.php/2018/06/07/how-to-reduce-http-requests-to-speed-up-your-site/ Thu, 07 Jun 2018 09:04:02 +0000 https://alertbot.wordpress.com/?p=538 An extreme closeup of a website address bar on a web browser with "http://ww" visible


How To Reduce HTTP Requests To Speed Up Your Site

by Louis Kingston

Most of us are blissfully unaware of the technical feats happening in the background when we browse to a webpage. We typically only notice that there is some technical failure when the site we’re visiting takes such a long time to load that we get impatient and click refresh or the site outright displays an error message.  In actuality, there’s a lot that goes on between your web browser and the web page you’re visiting.

For digital marketers, it might seem that their expertise only needs to be focused on Search Engine Optimization, content marketing, and Pay Per Click. But their hard work will never see the light of day on page one of the Search Engine Results pages if the page’s load speed is extremely slow. Visitors will just click away to a site that loads quickly and without any errors.

In Pursuit of a Better User Experience

One of the main reasons for a slow site speed is a high number of HTTP (Hyper Text Transfer Protocol) requests.

In a nutshell, an HTTP request entails the following procedure when you decide to visit a website:

  1. Your web browser communicates with the web server that hosts the relevant site that you are trying to visit.
  2. The web browser asks the host servers to send it all the files that make up the page you’re visiting. This includes all of the text, images, and multimedia.
  3. The host then acknowledges your web browser’s request, sending over the files that make up the webpage, which are rendered as they’re received within your web browser.
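The three steps above can be sketched with nothing but Python’s standard library. This is a toy illustration, not how any real browser or server is implemented: the page and file names are hypothetical, and a real browser does far more (caching, parallel connections, rendering).

```python
# Toy sketch of the HTTP request/response cycle: a tiny "host server"
# serves a page plus the files it references, and the "browser" side
# makes one HTTP request per file.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical files that make up one web page.
RESOURCES = {
    "/index.html": b'<html><img src="logo.png"><link href="style.css"></html>',
    "/logo.png": b"fake-image-bytes",
    "/style.css": b"body { color: black; }",
}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = RESOURCES.get(self.path)
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Start the "host server" on a free local port.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

# The "browser" side: one HTTP request for the page itself, then one
# more request for each file the page needs.
fetched = {path: urlopen(base + path).read() for path in RESOURCES}
print(f"{len(fetched)} HTTP requests to render one page")
server.shutdown()
```

Three files means three round trips; on a page with a hundred images, scripts, and stylesheets, that same loop runs a hundred times, which is exactly where load time goes.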

Are You Taking Too Long to Respond?

Imagine it is your website that is being visited. Before the files can display in the visitor’s browser, a separate HTTP request will be made for every file that makes up that page (images, JavaScript, style sheets, etc). There can be many large files that take an extremely long time to download. Most modern sites carry high-definition images, which can result in slow load times. This makes the Google algorithm unhappy, and you get penalized, which can cost you your top spot on the search engine results page. On top of losing position in search results, it’s been shown that your potential visitors will not stick around to wait for a slow page to load, and you can say goodbye to conversions. In fact, 47% of visitors to a site want to see a load speed of less than 2 seconds (KISSmetrics report)! After three seconds, you can expect 40% of people to hightail it out of there to find a faster solution.

The ideal number of files that make up a single web page is 10-30 files, but these days we see the number of HTTP requests balloon to over 100 per page on some sites!

How Can You Lower Your HTTP Requests?

  1. Check your website’s current HTTP request count to see what is taking the longest to load. This can be done via AlertBot’s detailed and easy-to-read waterfall chart.
  2. Eliminate all large, unnecessary images that aren’t contributing any value.
  3. Optimize your remaining images by reducing their file size. Many servers automatically compress images for you, but you should double-check.
  4. Video and other social media integration tools could be adding seconds to your load speed, so you will need to decide whether they are essential to your site.
  5. Remove JavaScript, or at the very least load it asynchronously. By default, a browser loads an entire JavaScript file before rendering the rest of the page, which can account for a significant share of the load-time lag.
  6. Ask your developer to combine CSS files to reduce the number of requests. You can also use CSS sprites to merge all your image requests into just one.
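The CSS-combining tip can be sketched in a few lines. The helper below is a hypothetical illustration (not an AlertBot tool, and the file names are made up): it concatenates several stylesheets into one bundle so the browser issues a single request instead of one per file.

```python
import tempfile
from pathlib import Path

def combine_css(css_paths, out_path):
    """Concatenate several CSS files into one bundle, cutting one HTTP
    request per file. Order is preserved because order matters in CSS."""
    combined = "\n".join(
        f"/* --- {p} --- */\n" + Path(p).read_text() for p in css_paths
    )
    Path(out_path).write_text(combined)
    return combined

# Demo with two throwaway stylesheets written to a temp directory:
tmp = Path(tempfile.mkdtemp())
(tmp / "reset.css").write_text("body { margin: 0; }")
(tmp / "theme.css").write_text("h1 { color: teal; }")
combined = combine_css([tmp / "reset.css", tmp / "theme.css"], tmp / "bundle.css")
print("margin" in combined and "teal" in combined)  # True
```

In practice a build tool (or your developer) handles this, often with minification thrown in, but the principle is exactly this simple: fewer files, fewer requests.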

Sounds technical, right?

AlertBot can provide you with the tools you need to pinpoint performance issues and help set you on the right path to better website performance. AlertBot offers a Free 14-day trial (without collecting any billing info).  Give us a try!

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
AlertBot Showdown: Michaels vs A.C. Moore https://www.alertbot.com/blog/index.php/2018/05/31/alertbot-showdown-michaels-vs-a-c-moore/ Thu, 31 May 2018 20:04:33 +0000 https://alertbot.wordpress.com/?p=527 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying arts and crafts supplies, like paint brushes and plants. Text reads "AlertBot Showdown: Michaels vs A.C. Moore" with the word SHOWDOWN very large at the bottom.

Whether it’s designing a centerpiece for home or an event, perusing the aisles for tools for a school project, or locating a frame for that beloved photograph, it’s likely you’ve found yourself inside an arts and crafts store at some point. From cloth patterns to drawing pencils to blank canvases and custom framing, these craft supply stores are just what creative people look for in a retailer.

With the rise of ecommerce, arts and crafts stores are just as accessible from the comfort of your computer or mobile device. For artists and crafters, something is undoubtedly lost when shopping online for these kinds of supplies, but the ease of online shopping is undeniable. Two of the biggest players in the market are Michaels and A.C. Moore, so for this, our ninth Showdown, we’ve pitted the web performance of these two leading crafty retailers against each other.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a couple weeks, spanning from March 25, 2018 to April 8, 2018. As expected, both sites performed quite well, but as in most cases like this, one site saw better results than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both websites did really well here, with neither site seeing any significant, true downtime.

Michaels.com experienced 99.9% average uptime due to 2 page load timeout failure events (where something on the page takes a bit longer to load, slowing the page’s overall performance down). When drilling down to see what errors Michaels.com returned, it signaled 17 instances where the page took longer to load than expected, and 15 times where something on the page took too long to load and slowed the page down. Still, despite the 2 timeouts, Michaels did well overall. (Michaels.com 8.5/10)

Comparatively, ACMoore.com saw 100% uptime with no significant failure events. However, there were still 4 recorded moments where there was a slow file and 4 occurrences of when the page itself took longer to load than expected. Still, ACMoore.com never actually went down, so we have to give them high marks for that.
(ACMoore.com 9.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

Michaels.com saw pretty decent page load speeds overall, with their best average day being Wednesday, April 4th with 3.5 seconds. The best time of day was 6pm on Friday, April 6th at an average of just 2.1 seconds. On the flip side, the slowest day was Sunday, March 25 with an average time of 6.8 seconds, and the worst time of day was Sunday, April 8 at 8pm with 6.7 seconds.  (Michaels.com 8.5/10)

ACMoore.com was truly impressive with their load time. Their best day, Tuesday, March 27, saw an average of just 1.5 seconds! A.C. Moore’s best time was even faster, with Wednesday, April 4th, at 10pm seeing a load time of just 1.2 seconds. Even more amazing was the fact that ACMoore.com’s worst day, Thursday, March 29, saw an average load time of 1.8 seconds! Their worst time, however, was significantly longer (in comparison) at 3.8 seconds on Thursday, April 5 at 3pm. (It’s interesting that both slower speeds were on a Thursday.) It was a real rarity for ACMoore.com to go over 2 seconds in load time, and for that, we have to applaud their excellent web performance. (ACMoore.com 10/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California continues to reign supreme as the leading location in speed. Michaels.com loaded within 2 seconds (on average) in California, with Florida seeing the second fastest speed of 2.5 seconds. Missouri turned out to have the slowest load time of 7.1 seconds, while Utah came in second-to-last at 4.9 seconds. (Michaels.com 8.5/10)

For ACMoore.com, California is the fastest, once again, at an average of just 1.9 seconds. The second fastest, again, is Florida with 2.4 seconds. The slowest speed time is also seen in Missouri at an average of 8.2 seconds, with NJ coming in second-to-last at 5.5 seconds. It’s interesting to note that ACMoore.com proved to have faster speeds than Michaels, but also slower speeds (when it comes to loading in specific locations). (ACMoore.com 8.5/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to find some paint brushes, add them to the shopping cart and start the checkout process.

For each of these processes, we started by opening a new tab in Mozilla Firefox and typing in the site’s URL.

From the point of typing www.michaels.com into our Firefox browser and searching “paintbrushes” in the product search box, it took 30 seconds and 4 clicks to select a pack of brushes, add them to the cart and view the cart.  It was definitely a smooth experience.

ACMoore.com was, unfortunately, a far more frustrating experience. Upon visiting the site, we were hit with a pop-up asking us to sign up for their email list to get a coupon. Plus, their signup box at the top of the page is typically where a site search would go, so it’s easy to mix them up (despite the “Sign Up for Offers” label next to it). It didn’t take long to discover that their site also doesn’t seem to specialize in craft materials, as a search for something as basic as “paintbrushes” returned nothing. We tried altering the wording in our search a bit but gave up after a minute and a half.

To be fair, we decided to run the usability process again with different search criteria. ACMoore.com seems organized around craft project ideas, without any real discernible things you can purchase from their site (and yet, they have a shopping cart), which makes the sites quite different from each other (and gives Michaels.com an edge over ACMoore.com in sheer product availability and variety). In the end, while the brick-and-mortar stores are very similar, their online presences are not. So we decided to run the test again to see how fast we could get to, and briefly look around, their individual Weekly Ads.

For Michaels.com, it took about 2 clicks and roughly 10 seconds to get to the Weekly Ad for May 6 and start clicking around. It offered two choices for ads, but we chose the basic ad for the week to browse. It was a very easy experience.

For ACMoore.com, it took 20 seconds, 3 clicks and typing in our zip code to get to our local area A.C. Moore store’s ad before we could start clicking around. The ad isn’t nearly as thorough or as nice as Michaels’, either.

All things considered, here are the Usability scores:

(Michaels.com 10/10)
(ACMoore.com 3/10)

 

Verdict

When it comes to speed, ACMoore.com bested their competitor, Michaels.com, but given the lack of substance and actual storefront on ACMoore.com, it may not be entirely fair to compare them. However, a quick lap through the aisles of both brands’ brick-and-mortar stores shows just how similar they are. So, taking everything into consideration, and with both sites performing very well when it comes to actual site reliability, it’s hard not to give weight to the user experience when making the final conclusion…

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Michaels.com"

]]>
AlertBot Showdown: Playstation vs Xbox https://www.alertbot.com/blog/index.php/2018/04/06/alertbot-showdown-playstation-vs-xbox/ Fri, 06 Apr 2018 19:30:53 +0000 https://alertbot.wordpress.com/?p=517 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying video game system controllers. Text reads "AlertBot Showdown: Playstation vs XBox" with the word SHOWDOWN very large at the bottom.

It may have been squashing a goomba while punching a coin out of a brick, dodging barrels being thrown by a grumpy gorilla, sorting oddly shaped falling blocks into interlocking patterns or simply catapulting miffed fowl at a group of defenseless pigs on your mobile phone, but chances are high that everyone has played a video game at one point in their life.

Poor web performance is no game any self-respecting owner of a website should play. We recently aimed our sights at the gaming industry and picked out two heavy hitters to evaluate: Xbox and Playstation. While their websites may not be the main point of interest for gamers, they’re relied upon for information, updates and even online digital game sales. Their online gaming servers may be the most important thing to keep running smoothly in gamers’ minds, but these top players in the industry will want to make sure their website stays up and always accessible.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a few weeks, spanning from February 4, 2018 to February 25, 2018. Both sites performed well—as can be expected from parent companies Microsoft (Xbox) and Sony (PlayStation)—but, as usual, one performed just slightly ahead of the other, even if not by much.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both websites experienced 100% uptime, but both sites encountered minor errors that served as a few speedbumps along the way. Still, it wasn’t enough to qualify as downtime.

Xbox.com, despite its 100% uptime, experienced around 50 “slow page” warnings and over 20 page load timeouts (where something on the page takes a bit longer to load, slowing the page’s overall performance down). Xbox.com also returned an SSL Certificate expiration notice. However, none of these qualified as significant outages, and for that we still have to give them props. (Xbox 9/10)

Playstation matched Xbox’s 100% uptime and fared a lot better when it came to the little errors. They registered only 7 timeouts and 5 slow page loads, and for that we give them slightly higher marks. (Playstation 9.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

Speed is crucial to the gamer – be it game load times (who else hates waiting for spinning icons to finish to get us past a cut scene or moving on to a new map in a game?) or server responsiveness – so a speedy game company website is key. Xbox.com experienced pretty quick load times, with its best day being February 24th with an average of 4.6 seconds. Its best response time, however, was on February 23rd at noon with 2.2 seconds. On the flip side, its worst day was February 12 with 6.7 seconds (which isn’t all that bad), but its worst hour proved to be on February 11th at 11pm with a sluggish 13.1 seconds. (Xbox 8.5/10)

Surprisingly, Playstation turned out to be just a little bit slower, with their best day average being 6 seconds on February 22nd. Their best time by the hour was on the same day at noon with 2.3 seconds, just a hair slower than Xbox’s best time. Their worst day was a full second longer on February 11th with 11.7 seconds, and their worst time by the hour was also 13.1 seconds, but on February 10th at 7am. (Playstation 8/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California seems to win out most of the time as the fastest location for load times and for Xbox.com, it was no different. California saw load speeds of 2.1 seconds on average, with Florida coming in second at 2.2 seconds. Georgia, however, saw an average worst time of 10.3 seconds with Missouri coming in second at 9.2 seconds. (Xbox 8.5/10)

Playstation.com actually turned in slightly more sluggish results geographically, too. Their best location was California, as well, but it was 2.5 seconds, and Florida was a close second at 2.7 seconds. Playstation’s slowest spots were also in Georgia and Missouri, at 12.6 seconds and 11.2 seconds, respectively. It’s not the worst we’ve seen, but Xbox clearly performed better. (Playstation 7.5/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like going through the motions of ordering movie tickets from a local theater or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to add a digital download of a popular video game to the shopping cart and start the checkout process.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.xbox.com into our Chrome browser and clicking around to find the Xbox One games, choosing the featured one (which, in this case, was Dragon Ball FighterZ), clicking “Buy Now” and getting to the account login screen, it took 1 minute and 10 seconds. From the homepage, it took 7 clicks to get to the checkout process. It’s been a while since we last visited their site, so our experience was fresh, but we encountered some significant slow loading times when getting to the product page. We actually added an additional click to the process because the “Buy Now” button didn’t load properly at first (and did nothing upon its first click). Overall, we got to do what we set out to do, but the process could have gone a lot smoother.

We were hoping for a better experience from Playstation, and we got one. From the point of typing www.playstation.com into our Chrome browser, it took 4 mouse clicks and 35 seconds to find a featured video game (in this case, Bravo Team), and get to the checkout stage (which was also an account login screen). There was some delay on first clicking on the game title, but it still loaded quickly and allowed us to get to the end of the process fast.

Both sites allowed us to get the job done in a rather speedy manner, but Playstation’s site gave us a much more positive experience.

With that said, here are the Usability scores:

(Xbox 8/10)          (Playstation 9.5/10)

 

Verdict

Both sites performed very well, but that positive user experience helped push one over the other, albeit only slightly. So while it was a tough call to make, we have come to a conclusion —

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Playstation.com"

]]>
Synthetic Monitoring for SaaS: Keeping a finger on the pulse of your cloud app https://www.alertbot.com/blog/index.php/2018/02/22/synthetic-monitoring-for-saas-keeping-a-finger-on-the-pulse-of-your-cloud-app/ Thu, 22 Feb 2018 12:06:12 +0000 https://alertbot.wordpress.com/?p=511 An illustration showing cartoonish hands on a computer keyboard. The computer monitor, as well as a cell phone screen and tablet show pie charts and graphs on them. Graphics of clouds and gears are also on the image.
Synthetic Monitoring for SaaS
Keeping a finger on the pulse of your cloud app
by Penny Hoelscher

Congratulations, you have just leveraged an awesome Software as a Service (SaaS) solution for your organization. Perhaps you have implemented a popular application – like Office 365, SalesForce or Dropbox – to support your staff and enhance collaboration between teams. Now you need to ensure that your employees and / or customers are happy too.

At this point, a common misconception often arises: the belief that a SaaS application relieves businesses of all responsibility for monitoring it. It is just a matter of time before your business is rudely awakened to reality when customers start complaining about outages or poor performance on social media and overload the support desk with calls.

A negative customer experience when utilizing one of your SaaS applications can affect your bottom line. Unfortunately, you cannot totally rely on your provider to keep the system ticking; even the big guys experience outages and cyber attacks. Synthetic monitoring provides a solution, a way for you to keep your finger on the pulse of your cloud services.

Taking responsibility for SaaS applications

Effective SaaS monitoring is measured by how positive the end-user experience is. For instance, if a user cannot log in to an application to retrieve a file you sent them, they will not be happy. Can you leave it up to a SaaS provider to keep you up-to-date when they have a problem? No. In fact, it is not unusual for SaaS providers to delay making press statements when they experience problems, or to not announce them at all. Organizations are fast realizing that they must take responsibility for proactively monitoring the performance of the SaaS applications they use.

In addition, in 2016 Gartner predicted that by 2018 50 percent of enterprises with more than 1,000 users would use cloud products to monitor and manage their use of SaaS and other forms of public cloud. This reflects the growing recognition that, although clouds are usually stable, monitoring applications requires explicit effort on the part of the cloud customer.

Why do you need to monitor your SaaS applications yourself?

  • Whatever your Service Level Agreement (SLA) says, if customers cannot access your application, they are more likely to call you than your service provider, so you need to know exactly how your service is doing, rather than wait for irate customers to notify you.
  • Again, whatever your SLA says, if you are not monitoring your SaaS application, you cannot know whether the conditions of your SLA are actually being met.
  • A SaaS provider gathers generic data about all of its customers; that data may not be explicitly relevant to you, or sufficient to generate a meaningful performance analysis, let alone deliver all the benefits a synthetic monitoring solution can, as we shall see, provide.

Monitoring the customer experience (CX)

Synthetic monitoring has immense benefits for monitoring SaaS applications. It can help you keep a finger on the pulse of your SaaS application by addressing the following core issues that affect the customer experience and can affect your bottom line:

  • Identifying bottlenecks in the UX: Is your application functioning as predicted? How is it performing?
    When customer functions like logging in, using shopping carts or search fail, visitors may defect to one of your competitors. Synthetic transaction tests easily simulate these types of transactions, enabling you to be certain everything is working as it should and that recent software updates have not broken existing functionality. It is much easier to measure performance when you have a baseline. Synthetic monitoring applications, like AlertBot, have user-friendly portals that provide charts, graphs and reports to highlight deviations from the baseline. They make it that much easier to keep your finger on the pulse.
  • Error sources and reporting: If there is a problem, where is it?
    Synthetic monitoring is a powerful predictive tool. According to one leading industry professional, “Synthetic monitoring doesn’t rely on complex predictive algorithms, it doesn’t take a data scientist with a Ph.D to interpret the results, and it doesn’t require additional spending on IT infrastructure. What it does is predict, to a fair degree of accuracy, how your application will perform in which geographies and isolate the root cause of any detected bottlenecks.”
  • Downtime and outages: Is your application online and accessible?
    While synthetic monitoring is traditionally associated with customer-facing websites, it is platform-agnostic and works just as well monitoring SaaS applications. For instance, it can identify what is competing for resources on your network. Resource hogging applications can degrade the overall performance of your service.
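The baseline idea above can be made concrete with a small sketch. This is an illustrative example, not AlertBot's actual algorithm: record recent response times from your synthetic checks, then flag any new measurement that strays more than a few standard deviations from the mean. All numbers below are invented.

```python
from statistics import mean, stdev

def deviates_from_baseline(sample_ms, baseline_ms, tolerance=3.0):
    """Return True if a synthetic check's response time (ms) deviates
    from the historical baseline by more than `tolerance` standard
    deviations -- a simple, threshold-free way to spot slowdowns."""
    mu = mean(baseline_ms)
    sigma = stdev(baseline_ms)
    return abs(sample_ms - mu) > tolerance * sigma

# Hypothetical recent load times for a SaaS login transaction, in ms:
baseline = [820, 790, 805, 830, 810, 795, 815]

print(deviates_from_baseline(812, baseline))   # normal reading -> False
print(deviates_from_baseline(4200, baseline))  # sudden spike   -> True
```

A real monitoring service layers much more on top (per-region baselines, seasonality, alert escalation), but the core of "deviation from baseline" reporting is this comparison.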


5 top advantages of using synthetic monitoring for SaaS applications

  1. Quickly assess the overall impact and performance of new features on your system
  2. Monitor multiple cloud centers simultaneously as well as performance in scenarios where traffic is ramping up aggressively
  3. Validate base-line performance prior to signing off on an SLA with your provider and objectively monitor it after
  4. Utilize automatically generated diagnostics and reports that enable you to share information and collaborate with your SaaS provider
  5. Determine if a cloud service is down and exactly who is affected in what geographical locations


Synthetic monitoring for SaaS is growing in leaps and bounds

According to a MarketsandMarkets.com report, “Synthetic Monitoring Market by Monitoring Type (paywall),” the enterprise synthetic application monitoring market size is expected to grow from $919.2 million in 2016 to $2,109.7 million by 2021, at a CAGR of 18.1 percent from 2016 to 2021. The report predicts that “SaaS application monitoring is expected to gain maximum traction during the forecast period.” Don’t get left behind.

Conclusion

A 451 Research study found that the rapid growth of public cloud services and network virtualization has often outstripped management and monitoring capabilities, creating “blind spots” in network operations’ ability to maintain internal uptime and performance benchmarks. If you only recently climbed on the SaaS bandwagon, it is likely that your existing system monitoring tools are not cloud-friendly.

You may need some help from the experts to keep your finger on the pulse of your new SaaS application. Mosey along to AlertBot for more information about a holistic synthetic monitoring solution.

]]>
AlertBot Showdown: Reebok vs Nike https://www.alertbot.com/blog/index.php/2018/01/09/alertbot-showdown-reebok-vs-nike/ Tue, 09 Jan 2018 20:00:53 +0000 https://alertbot.wordpress.com/?p=480 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are wearing athletic brand headwear. Text reads "AlertBot Showdown: Reebok vs Nike" with the word SHOWDOWN very large at the bottom.

Whether you’re hitting the gym or the trails, you’re likely to be lacing up with some active footwear that helps you burn calories and exercise in comfort and style. When it comes to activewear, there are many companies these days that contribute accessories and gear to our daily workout regimens; however, two major players come to the front of our minds when it comes to popular footwear brands.

For our latest AlertBot Showdown, we picked frontrunners Nike and Reebok to evaluate the web performance of each athletic wear giant’s online persona.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for a few weeks, spanning from October 1, 2017 to October 22, 2017. While both sporty sites performed well, it became pretty clear after a significant trip-up that one site left the other in the dust.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

For the first time in our experience of tracking sites for a Showdown, one of the sites in the running went down while we were actually in the office. That gave us the ability to watch the event as it unfolded while AlertBot performed its tests against the failing site. Reebok.com hit a snag on October 13 around 3:30pm EST. It took nearly a full hour for their site to recover. We manually checked their site from our desks at 4pm, and the site was still down. We checked again at 4:15 and the site was back up, however, only text was loading – no images. By 4:30pm, when we checked one more time, Reebok.com was back up in its entirety. It was the only failure event that Reebok.com encountered during the weeks it was tested for this Showdown, but it was definitely a doozy. During this time period, their average uptime was just 99.85%, but it’s proof that “99% uptime” can still contain an hour of critical downtime. And for a retail site, this could truly prove costly. (Reebok 7/10)

On the other hand, Nike.com experienced no significant failure events and only occasionally experienced minor issues like a slow page file or a “timed out” error. From the starting line, Nike is already on the fast track to success between the two brands. (Nike 9.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

Speed is everything for the image of brands like these, which makes it a bit ironic that both sites seem to struggle a little in this area. Reebok’s fastest average speed was on October 4th with 6.4 seconds load time. Their worst average speed was October 23 at 7.9 seconds. They’re not drastically different, but that’s not an impressive load time.  (Reebok 7/10)

At this point, one might expect Nike to sprint past Reebok in the load time category, but Nike didn’t fare much better, with 6.3 seconds being their fastest average speed on October 23 (which is coincidentally the day of Reebok’s slowest average), and Nike’s slowest average speed was 7.5 seconds. Again, they’re not great speeds, but in this case, Nike edges out Reebok, even if it is only by a slight skip rather than a jump. (Nike 7/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

Looking at site response time geographically tells a different story. First off, Reebok shows that they had the fastest load time in Texas with an average of 3.7 seconds. Their second fastest time was in New Jersey at 4.8 seconds. Virginia produced the slowest return, with an average of 6.9 seconds. (Reebok 7.5/10)

Yet again, Nike only performed slightly better, with California showing the fastest average speed of 3.2 seconds and Texas showing the second fastest at 4.5 seconds. However, Nike performed worse than Reebok when it came to slowest location, with Illinois taking the cake for worst average speed, at 9.7 seconds! (Nike 7/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater, or simply adding a similar item to both sites’ shopping carts. For this Showdown, we’ll see what the experience is like to use their respective websites to add their latest running shoe to the shopping cart and start the checkout process.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.reebok.com into our Chrome browser and clicking around to find a Men’s Running Shoe, choosing the first one, choosing a size, adding it to the cart and clicking “checkout,” it took 36 seconds. From the homepage, it took 5 clicks to get to the checkout process. At first glance at the homepage of the site, it seemed like it might be a challenge to actually find what we’re looking for, but it was a pretty easy shopping experience.

From the point of typing www.nike.com into our Chrome browser, it took 8 mouse clicks and 48 seconds to find a men’s running shoe and get to the checkout stage. Upon first visiting the site, the visitor is hit with an ultra closeup of a bunch of kids in gray Nike hoodies that takes most of the page hostage. We scrolled down to the first running shoe advertised and clicked on it, only to find it was a women’s shoe (which is not mentioned in the image on the homepage). We then had to click around to the men’s department, for this task’s purpose, in order to find a shoe and continue the process. Both sites get the job done, but Reebok was a more pleasant shopping experience.

With that said, here are the Usability scores:

(Reebok 9/10)        (Nike 8/10)

 

Verdict

Both sites performed respectably, but we can’t ignore the failure that Reebok experienced on the 13th. Other than that, the sites performed quite similarly (and we actually preferred Reebok’s shopping experience a little more than Nike’s). Still, since we’re really weighing in here on web performance, the winner is rather clear —

Graphic rendering of a robot with a triangular head and circle eye holding up a sign that reads "Nike.com"

]]>
Black Friday / Cyber Monday Showdown: Amazon vs Walmart vs Target https://www.alertbot.com/blog/index.php/2017/11/29/black-friday-cyber-monday-showdown-amazon-vs-walmart-vs-target/ Wed, 29 Nov 2017 00:34:35 +0000 https://alertbot.wordpress.com/?p=465 A graphic with a yellow starburst in the center and two robots charging towards a third robot. The two on the left are carrying shopping bags. The one on the right is carrying a box. The text reads "Black Friday - AlertBot Showdown: Target vs Walmart vs Amazon" with the word SHOWDOWN very large at the bottom.

It’s that time of year again, when sales-conscious bargain chasers brave the throngs of other sale hunters in the frigid November early-morning air on that most dreaded of retail shopping days: BLACK FRIDAY. Just hours earlier, many of these same credit-card-wielding warriors were huddled around a table with family, giving thanks once again while stuffing themselves to their waistline’s discontent with mashed potatoes, roasted turkey and homemade pie. The juxtaposition of these two contradictory practices is staggering, but it’s no less the holiday tradition year after year.

As we approach another Christmas holiday, the world of ecommerce continues to ramp up how it approaches Black Friday–and its younger electronic sibling, Cyber Monday–with many retailers now starting their sales right after Halloween. Accordingly, we decided to do something special for our next Website Showdown: a Black Friday / Cyber Monday edition that pits the ecommerce colossus Amazon against the websites for brick-and-mortar retail mega-stores Walmart and Target. It’s a truly epic battle royale to see how each site performs during the biggest shopping days of the year.

So, as usual, we used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor all three sites from Thanksgiving through Black Friday and Cyber Monday, spanning from November 23, 2017 to November 27, 2017. We expected strong, reliable performance during the entire run and we were not disappointed. The results were nothing short of impressive.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Usually for this section, we evaluate each site’s performance in detail, drilling down to the specific errors each one faced and the different types of errors we usually see (like Slow Page Files, Timeouts, etc). It’s unusual for both sites in a two-site Showdown to avoid returning a single error, much less all three sites in a three-site Showdown. Yet in this special evaluation, not one single, solitary error was found between them: all three sites avoided any kind of failure event or significant error. With the stakes so high for three of the biggest retailers on the most significant sale days of the year, one would expect nothing less. So, with that said, each site earns a perfect score for Reliability.

(Amazon 10/10)
(Walmart 10/10)
(Target 10/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

Sites like Amazon, Walmart and Target boast very graphics-driven designs, and especially with monstrous sale event days like these, the graphics are often big, bold, and frequently changing.

Over Amazon.com’s five-day run, the fastest day, on average, was Sunday, November 26th at 4.3 seconds. It’s not the slickest speed a site can have, but it’s certainly not bad. On its slowest day, on average, Amazon still clocked in at 5 seconds on Cyber Monday, which is still not too shabby. When looking at specific times of day, the best hour was 5AM on Sunday with an impressive 3.4 seconds, while Cyber Monday also saw the slowest hour, 7AM, at 6.7 seconds.
(Amazon 9/10)

Walmart.com held their own surprisingly well during this time, too. Their best average day was Thanksgiving Day, November 23rd at 4.2 seconds, just barely edging ahead of Amazon. Their worst day on average was Saturday, November 25th, also at 5 seconds. Finally, their best hour on average was on Thanksgiving at a remarkable 2.7 seconds at 5PM. Their worst time on average was 6.4 seconds at 2AM on Sunday, November 26.
(Walmart 9.5/10)

Last, but certainly not least, Target.com didn’t perform quite as well as the other two, but they still performed respectably, especially considering the fact their site avoided any failure events. Their best day for speed, on average, was Thanksgiving Day at 5.2 seconds, which is worse than both Amazon and Walmart’s worst days. Target’s slowest day on average was Sunday, November 26 at 5.4 seconds, which at the very least, shows a great consistency for the performance of the retail chain’s online presence. Their fastest hour turned out to be on Black Friday at 9AM with 3.9 seconds, with their slowest being on Cyber Monday at 4AM with 7.6 seconds.
(Target 8.5/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart
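The per-day and per-hour averages reported throughout these Showdowns boil down to simple bucketing of load-time samples. Here is a minimal sketch of that aggregation in Python, with made-up sample data; this is our illustration of the idea, not AlertBot’s actual pipeline:

```python
from collections import defaultdict

# (hour bucket, observed load time in seconds) samples, as a monitor might
# collect them. The values here are made up purely for illustration.
samples = [
    ("Sun 05:00", 3.4), ("Sun 05:00", 3.6),
    ("Mon 07:00", 6.7), ("Mon 07:00", 6.5),
    ("Mon 09:00", 4.9),
]

def hourly_averages(samples):
    """Average the load-time samples that fall into each hour bucket."""
    buckets = defaultdict(list)
    for hour, seconds in samples:
        buckets[hour].append(seconds)
    return {hour: sum(vals) / len(vals) for hour, vals in buckets.items()}

averages = hourly_averages(samples)
best_hour = min(averages, key=averages.get)    # lowest average load time
worst_hour = max(averages, key=averages.get)   # highest average load time
```

The same bucketing works for best/worst days or test locations; only the key changes.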

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

California tends to see the fastest web transaction speeds in the country, and in this test scenario, it once again comes out on top for each website. For Amazon.com, the titan of ecommerce saw average load times of 2 seconds in The Golden State, with their next-fastest location being Texas at 3.2 seconds. When it came to their slowest locations, Illinois came in at the bottom with 6.6 seconds, with Georgia just above them at 6.3 seconds.
(Amazon 9/10)

Walmart.com was only a tenth of a second faster, seeing an average load time of 1.9 seconds in California, and also came in faster in Texas at 2.7 seconds. But Walmart saw a placement swap for which state was slowest, with Georgia coming in at the bottom at 6.6 seconds and Illinois right above them at 6.5 seconds.
(Walmart 9.5/10)

Target loaded on average at 2.7 seconds in California, with Texas coming in next at 3.5 seconds. Again, Target’s fastest speeds proved to be slower than their competitors. The slowest average speed that Target saw in the U.S. was also Georgia, at 7.2 seconds, but Washington stepped in as their second slowest, at 7 seconds flat.
(Target 8.5/10)

Alertbot performance by region green bar chart Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we always select a common task a user might typically try to accomplish when visiting the sites we’re testing and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater. Like with the most recent Showdown for Lowes and Home Depot, we decided to see what the experience would be like to use these three different websites to add a common product to the shopping cart.

For each of these processes, let’s see about adding the PS4 version of the new video game Star Wars: Battlefront II to our shopping cart. To begin each process, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.amazon.com into our Chrome browser, typing “Star Wars Battlefront 2” into the store’s search box and adding it to the cart, it took 30 seconds. From the front page, it took about 5 clicks (including selecting the autocomplete suggestion in the search bar) to get to the final “Place Order” window.

From the point of typing www.walmart.com into our Chrome browser, it took about 4 clicks and 35 seconds to get to the Cart Checkout window. The autocomplete was a little clumsy to deal with (it was tough to tell if the browser was really proceeding to load the site), but overall, it was a decent experience.

From the point of typing www.target.com into our Chrome browser, it took about 5 clicks and 27 seconds to get to the Cart Checkout window.

All three sites were good experiences, although each one has a very different feel. It was a tough call to say which user experience we found better, so we decided to try a second test. This time, we chose something different: Wonder Woman on Blu-Ray, and we also switched to Mozilla Firefox.

The process of finding the Blu-Ray disc and getting to the checkout process on Amazon took about 4 clicks and 25 seconds. The process on Walmart.com took 26 seconds and 5 clicks. On Target.com, it took roughly 24 seconds and 4 clicks. This time, we noticed that in the search results, there’s a convenient “Add to cart” option next to the items on Target’s site that Walmart and Amazon both DON’T have. This definitely gives Target a slight edge over their competitors. And with that being the only real significant difference, outside of its slightly faster completion time, we’ll have to say Target wins the Usability portion of this Showdown.

(Amazon 9.5/10)
(Walmart 9.5/10)
(Target 10/10)

 

Verdict

With stakes this high, you would only expect the best from the leaders in the retail industry, so it comes as no surprise that the results were so good and so close. This may be the toughest Showdown we’ve had to score yet, especially with three hats in the ring this time around.

But, with all things accounted for – reliability, speed, geographical performance, and the site’s usability – we’ve reached our verdict:

WINNER:

 Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Walmart.com"

 

 

]]>
10 Ways to Optimize Images to Improve Your Website Performance https://www.alertbot.com/blog/index.php/2017/11/07/10-ways-to-optimize-images-to-improve-your-website-performance/ Tue, 07 Nov 2017 19:07:09 +0000 https://alertbot.wordpress.com/?p=462 A graphic showing a desktop monitor, a laptop screen, a tablet screen and a mobile phone screen and all of them are displaying various kinds of icons - like a magnifying glass, wifi symbol, shopping cart, video game controller, etc.

10 Ways to Optimize Images to Improve Your Website Performance

by Louis Kingston

“Visuals express ideas in a snackable manner.” – Kim Garst, CEO of Boom Social

Visual imagery on websites is a powerful tool for grabbing users’ attention, keeping them curious, engaged and interacting with your webpage. Humans are a visual species. Our brains can process an image within 13 milliseconds, with over half of the brain devoted to processing the visual information it receives. Our memory for pictures is also far better than our memory for text, and over 65% of the population are visual learners. What this means is that our websites must contain a healthy dose of visual images to keep a visitor engaged. Whether it’s on our homepage, our service pages, in our blog articles, or on our e-commerce sites, images are essential to driving sales, conversions and, ultimately, company growth.

Are Images Slowing Down Your Load Speed?

However, the images used must be optimized so that they don’t hamper your website’s performance. If they are too large, they will slow down your website’s loading speed, and the Google algorithm doesn’t like that. Take more than seven seconds to load and Google is going to pass you over, and you won’t make it to page one of the SERPs (search engine results pages). The search engine’s focus is on organically promoting businesses that offer a great user experience; slow load speeds will just have potential visitors clicking away.

Google loves text, and when it crawls your site, it can’t ‘read’ your images unless you have created file names, alt tags, and captions to describe the image. You are losing out on a perfect SEO opportunity if you don’t optimize your images.

Let’s investigate ten ways you can achieve image optimization for your website…

  1. Use keywords in the image file name. The file name affords a perfect opportunity to include your primary keywords while giving Google enough text to know what it is ‘looking’ at on your webpage. But make sure you never keyword-stuff these descriptions. You don’t have to use descriptions for decorative images (that would be overkill, and Google might penalize you).

 

  2. Scale images to the size at which they will be displayed on the site. The mistake many people make is thinking that once they put a large image into a small display area, it will no longer take up so much ‘space.’ But the file size is still enormous and will still take a long time to load. The image should first be scaled to the size you want it displayed at. You can also remove any pictures that are no longer serving your website, which will also improve the overall load speed.

 

  3. Always reduce the image file to the lowest possible size without compromising too much quality. Many online tools can help you reduce your file size, like JpegMini, io, ImageOptim, etc. Aim to keep your image file size below 70 KB (if possible).

 

  4. Use responsive images for a better mobile experience. Responsive-image plugins that apply the srcset attribute allow your pictures to display at a size appropriate to each device screen width. If you are using WordPress, this function is automated.

 

  5. Add Customer-Centric Captions. According to KissMetric, the captions under images are read 300% more than the body content. Visitors to web pages are scanning information, and a well-captioned image can provide them with a wealth of info at a glance. Remember that the images should always be relevant to the content.

 

  6. Always be visible with alt tags. Providing alt tag text ensures that your images can always be ‘seen.’ If a user is unable to download images, or if they are using a screen reader because they are visually impaired, the alt tag will describe the image.

 

  7. Make sure to add image tags to your XML image sitemaps. This helps Google index the images on your site. If you are using JavaScript galleries or other flashy pop-ups, let Google know what they are and where they are located in your sitemap so it can crawl these images in the web pages’ source code.

 

  8. Remove metadata from raster images. Raster images often carry unnecessary information, like geo-location data and details about the camera used, which only takes up space. You will make the overall file size much smaller when you get rid of this extra metadata.

 

  9. Where possible, use vector images. This format is ideal for multi-device, high-resolution use. Raster formats should only be used for complex scenes with loads of detail and irregular shapes; then GIF, PNG, JPEG, JPEG-XR, or WebP will be the right choice. Experiment with the raster settings to reduce the quality and free up more bytes.

 

  10. Scalable Vector Graphics (SVG) should be minified and compressed. Minifying SVG files will reduce their size, and GZIP can be used to compress them.
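The minify-and-gzip tip can be sketched in a few lines of Python. This is a deliberately crude minifier for illustration only (dedicated tools like SVGO do far more), paired with the standard library’s gzip compression:

```python
import gzip
import re

def minify_svg(svg_text):
    """Crudely minify SVG markup: drop whitespace between tags, collapse the rest."""
    svg_text = re.sub(r">\s+<", "><", svg_text)
    return re.sub(r"\s+", " ", svg_text).strip()

def gzip_svg(svg_text):
    """Compress the minified markup, much as a web server's gzip module would."""
    return gzip.compress(minify_svg(svg_text).encode("utf-8"), compresslevel=9)

svg = """
<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
    <circle cx="50" cy="50" r="40" fill="green" />
</svg>
"""
compressed = gzip_svg(svg)
```

In practice you would serve the gzipped bytes with a `Content-Encoding: gzip` header, or simply let your web server compress SVG responses on the fly.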

How is Your Website Performing at the Moment?

Of course, these are just ten basic image optimization pointers. You can drill down even further on image optimization to enhance your website performance. If you would like to find out more about your website’s performance, AlertBot can show you what elements are slowing down your site or what bottlenecks are causing user traffic to click away. We also offer a Free 14-day trial (without collecting any billing info). Give us a try!

Louis is a writer, author, and avid film fan. He has been writing professionally for tech blogs and local organizations for over a decade. Louis currently resides in Allentown, PA, with his wife and their German Shepherd Einstein, where he writes articles for InfoGenius, Inc, and overthinks the mythos of his favorite fandoms.

]]>
Website Monitoring Leader AlertBot Adds Mac Support for Web Recorder & Enhances SSL Testing Functionality https://www.alertbot.com/blog/index.php/2017/10/25/website-monitoring-leader-alertbot-adds-mac-support-for-web-recorder-enhances-ssl-testing-functionality/ Wed, 25 Oct 2017 12:04:52 +0000 https://alertbot.wordpress.com/?p=460

AlertBot logo with triangle logo image

 

Website Monitoring Leader AlertBot Adds Mac Support for Web Recorder & Enhances SSL Testing Functionality

AlertBot’s multi-step web recorder, which has been available to Windows users for several years and now supports Mac users, is a fast, easy and reliable way to verify that all interactions on a website are working properly.

 

ALLENTOWN, PA (October 25, 2017) – AlertBot announced today that, in a new update, it has added Mac support to its acclaimed multi-step web recorder and made several other security and usability improvements.

AlertBot’s multi-step web recorder is a fast, easy and reliable way to verify that all interactions on a website are working properly. Customers simply click record, interact with their website as desired (e.g. perform a search, put items in a cart, and so on), and upload their finished script to AlertBot, which then automatically performs these pre-set actions at regularly scheduled intervals. Any variations or concerns are immediately sent to customers for investigation and resolution.
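The replay idea at the heart of a multi-step recorder can be sketched generically. This toy version (our illustration, not AlertBot’s implementation) runs each recorded step in order and reports the first one that deviates:

```python
def replay_script(steps, run_step):
    """Replay recorded steps in order via run_step(step) -> bool.
    Return the first failing step (so it can be alerted on), or None if all pass."""
    for step in steps:
        if not run_step(step):
            return step
    return None

# Hypothetical recorded script: the interactions a customer clicked through.
script = ["load homepage", "search for product", "add to cart", "checkout"]

# Stub runner for illustration: pretend the add-to-cart step broke.
def stub_runner(step):
    return step != "add to cart"

first_failure = replay_script(script, stub_runner)
```

A real monitor would drive an actual browser for each step and schedule this replay at regular intervals, alerting when `replay_script` returns a failing step.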

Customers can also re-record their script at any time through AlertBot’s desktop dashboard, or through the re-designed viewer for smartphones and tablets, which per the update is now faster and easier to use.

“We are excited to bring our multi-step web recorder to our Mac customers, which allows them to change their multi-step testing scripts more easily,” commented Pedro Pequeno, President of InfoGenius.com, Inc. which owns and operates AlertBot. “Mac users are an important and valued part of our user base, and we want to make sure they continue to have the best tools available.”

Also featured in the update are new advanced SSL error-ignoring and TLS features, which give customers greater control over site diagnostics and help them meet PCI compliance standards. For example, customers can now choose how to handle SSL certificate expiration dates, domain mismatches, and other common certificate issues, as well as specify which Transport Layer Security (TLS) versions to allow.
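As an illustration of the kind of checks involved (our own sketch, not AlertBot’s actual code), here is how a monitor might compute days until certificate expiry and enforce a TLS version floor using only Python’s standard library:

```python
import ssl
import time

def days_until_expiry(not_after):
    """not_after is the date string format ssl.getpeercert() reports,
    e.g. 'Jun  1 12:00:00 2030 GMT'."""
    return (ssl.cert_time_to_seconds(not_after) - time.time()) / 86400.0

# A context that refuses anything older than TLS 1.2 (Python 3.7+).
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

remaining = days_until_expiry("Jun  1 12:00:00 2030 GMT")
```

A monitoring check could alert when `remaining` drops below some threshold (say, 30 days), or when a handshake using the restricted context fails.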

Other key usability improvements include:

  • New “Set Active” and “Pause” buttons that enable customers to select and change the status of a batch of monitors in a single operation.
  • The ability for customers to run a summary report for any monitor from the main menu.
  • Alert scheduling that is more intuitive and easier to set up.

Added Mr. Pequeno: “With the surge in data breaches, PCI compliance standards are more important than ever. AlertBot’s enhanced monitoring capabilities help our customers ensure that the SSL aspects of this compliance commitment are always being met.”

About AlertBot

Founded in 2006, AlertBot, through its industry-leading TrueBrowser® solution, enables businesses to continuously monitor the availability and performance of their mission-critical public Internet services from across the country and around the world. When AlertBot detects an issue with websites or servers, it analyzes the problem within seconds from multiple geographic locations and delivers real-time alerts to business leaders and system administrators via smartphones and other mobile devices. Thousands of companies trust AlertBot to help them deliver the uptime and performance they expect, and their customers demand. Learn more at http://www.AlertBot.com.

About InfoGenius.com, Inc.

Founded in 1999 by a group of engineers, InfoGenius prides itself on building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services, including AlertBot, ELayer and UptimeSafe. Learn more at http://www.infogenius.com.

]]>
AlertBot Showdown: HomeDepot vs Lowes https://www.alertbot.com/blog/index.php/2017/10/11/alertbot-showdown-homedepot-vs-lowes/ Wed, 11 Oct 2017 09:57:06 +0000 https://alertbot.wordpress.com/?p=449 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying planks of wood. Text reads "AlertBot Showdown: The Home Depot vs Lowe's" with the word SHOWDOWN very large at the bottom. Tiny hardware nails are sprinkled around the image.

Living in an age where nearly every industry is driven by ecommerce, it should come as no surprise that this includes the home improvement world. Home Depot and Lowes are titans in their industry, and both have a strong online presence. But when it comes to who may have the better performing site, we set out to nail down one true winner.

For our fifth website Showdown, the AlertBot team got out their proverbial measuring tape and slipped on a stylish apron to dig into the performance of HomeDepot.com vs Lowes.com.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for about three weeks, spanning from August 11, 2017 to August 31, 2017. Not surprisingly, the performance for these heavy lifters proved rather resilient for both sites. Neither site experienced significant downtime, but as usual, one did prove to perform a little better than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

HomeDepot.com performed quite well over the tested time period, experiencing no failure events. At most, it had a couple hiccups, like a short-lived Timed Out error or a Slow Page File notice, but none of these occurrences caused any amount of significant downtime. (HomeDepot 9/10)

On the other hand, Lowes’ site experienced one failure event on August 21st, when the site was not responding for roughly three minutes around 12:21 in the afternoon. When errors like these occur, AlertBot tests them from a second location to confirm whether the error is widespread or just a brief localized outage. In this instance, the error persisted after a few tests in different locations, qualifying it as actual site downtime, before the issue resolved. (Lowes 8/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart
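The confirm-from-a-second-location rule described above is straightforward to express. Here is a simplified sketch of the general technique (our illustration, not AlertBot’s implementation):

```python
def classify_failure(location_results):
    """location_results maps probe location -> True if the page loaded.
    A failure counts as real downtime only when every probe location fails;
    a failure seen from just one location is treated as a localized blip."""
    failed = [loc for loc, ok in location_results.items() if not ok]
    if not failed:
        return "up"
    if len(failed) == len(location_results):
        return "downtime"
    return "localized blip"

# Example: the first probe fails, but retests from two other regions succeed.
status = classify_failure({"Virginia": False, "California": True, "Texas": True})
```

In the Lowes incident above, the retests from other locations also failed, which is exactly the all-locations-failing case this sketch classifies as downtime.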

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user. These tests are performed from the perspective of a first-time visitor with no prior cache of the website’s content. AlertBot runs the tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

HomeDepot.com has a great deal of graphics on the front page, which typically slows sites down considerably. However, it didn’t seem to slow this site down much. HomeDepot.com’s best day, on average, was Tuesday, August 29th with an impressive load time of 1.1 seconds. The “worst” day average was still an impressive 1.9 seconds. When evaluating the site’s speed by hour, the site loaded in just 0.8 seconds at 1AM on Sunday, August 20th. The worst hour was also on August 20th, at 2PM with 5.1 seconds. Overall, HomeDepot.com’s speed is quite good. (HomeDepot 9.5/10)

Lowes.com has drastically less content on its front page, but it performed considerably slower than HomeDepot.com. Sadly, Lowes’ best day was actually slower than HomeDepot’s worst, with an average of 6 seconds on Sunday, August 13th. Lowes.com’s worst day was Monday, August 26th with 7.1 seconds. That’s not horrendous, but with sites being expected to perform faster and faster these days, a respected retail giant like Lowes needs to up their speed game. On an hourly average basis, their best time was 11PM on Wednesday, August 23rd with 7.1 seconds (again, their fastest time is slower than HomeDepot’s slowest). Their worst load time by hour was Sunday, August 27th at 1PM with a sluggish 10.1 seconds. (Lowes 8/10)

Alertbot speed test green performance bar chart Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

Usually when we look at site speeds across the United States, sites tend to perform better in California than anywhere else. That isn’t the case for HomeDepot.com, however. For Home Depot, Florida saw the fastest web transactions (less than one second), while its slowest came in California (still only 2.3 seconds). After Florida, the next-fastest web transactions were in New Jersey and North Carolina (both at 1 second). (HomeDepot 9/10)

Lowes.com had the fastest web transaction in California at 3 seconds. The next fastest was North Carolina, already up to 4.3 seconds. The slowest performance occurred in New York at a whopping 9.4 seconds (with the second-slowest being Georgia with 9.3 seconds). (Lowes 7.5/10)

 

Alertbot performance by region green bar chart Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdowns, we tested things like visiting a site for nutritional information or going through the motions of ordering movie tickets from a local theater. For this Showdown, we’ll see what the experience is like to use their respective websites to add a common product to the shopping cart.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From the point of typing www.homedepot.com into our Chrome browser and entering “leather gloves” into the search box, choosing one and adding it to the cart, it took 25 seconds. From the front page, it took 5 clicks to get to the “Checkout now” process. It wasn’t bad, but we found the Lowes process just a bit smoother.

From the point of typing www.lowes.com into our Chrome browser, it took 4 mouse clicks and 20 seconds to get the gloves into the shopping cart and view the cart. The “Add to cart” button is much more obvious and visible on Lowes’ site, where it took a moment to locate it on Home Depot’s site. And while both sites offer a “compare” option so you can look at product features side by side, it wasn’t very noticeable on HomeDepot’s site, while it was more prominent on Lowes.com.

The aesthetic of both websites isn’t bad, but Lowes has a crisper and more streamlined appearance and functionality. Both sites get the job done pretty quickly, but we had a slightly smoother experience with Lowes. With that said, here are the Usability scores:

(HomeDepot 9/10)       (Lowes 10/10)

 

Final Verdict

Both sites performed respectably, but HomeDepot.com clearly performed faster and was more reliable than Lowes.com. Despite the fact that we may have preferred the shopping experience on Lowes.com just a little bit more, one cannot ignore the slower site performance.

So, for the fifth AlertBot Showdown, the site that gets to join the ranks of previous winners Apple, FedEx, and Burger King is…

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "HomeDepot.com"

]]>
AlertBot Showdown: Burger King vs McDonalds https://www.alertbot.com/blog/index.php/2017/08/28/alertbot-showdown-burger-king-vs-mcdonalds/ Mon, 28 Aug 2017 18:25:35 +0000 https://alertbot.wordpress.com/?p=436 A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying hamburgers and wearing hats. Text reads "AlertBot Showdown: Burger King vs McDonald's" with the word SHOWDOWN very large at the bottom.

Whether you’re picking up a Kids meal for your littlest picky eater or satisfying a hankering for greasy and salty French fries, chances are you’ve found yourself in line at a drive-thru for McDonald’s or Burger King at some point in your life. But these two massive burger chains also have an online presence, and while you’re not exactly going to try to order a single or double patty to be shipped to your home, you might find yourself visiting the websites for either fast food giant to look up their menus or latest promotions.

So for this, our fourth website Showdown, the AlertBot team rolled up their sleeves, grabbed a handful of ketchup packets, and sat down to take the wax paper wrap off of these two websites to see just how the sites for BK and Mickey D’s performed in comparison to one another.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites for three weeks, spanning from June 5, 2017 to June 26, 2017. Not surprisingly, the performance proved to be reliable for both sites. Neither service’s site went down, but as usual, one did prove to perform a little faster than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

Both sites performed quite well during the time period, but McDonald’s site experienced a hiccup on the first day of the test, June 5. It was a timed-out warning (meaning the site failed to load in the expected time period), but it lasted only a couple of minutes and didn’t seem to have any lasting effect. Otherwise, their site was pretty stable. (McDonald’s 9/10)

On the other hand, Burger King’s site didn’t experience any confirmed failure events at all, enjoying complete uptime during the test window. However, it did see two transient errors—one a slow-page notice and one a brief timed-out notice—each lasting less than a minute and affecting the site’s overall performance from a single location. When errors like these occur, AlertBot tests them from a second location to confirm whether the error is widespread or just a brief localized blip. In these instances, the error only occurred from one test location and didn’t qualify as a downtime event. (Burger King 9.5/10)

Alertbot Uptime green circle performance chart Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive by the user.  We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser ™ monitoring.

Both sites are quite graphics-heavy, so it doesn’t surprise us that they may experience some slowness at times.

McDonalds’ loading speeds averaged around 9.5 seconds per day, with its best time being 10 AM on Monday, June 12 at 5 seconds and its best day being Monday, June 26th with an average of 8.8 seconds. Its worst day was Monday, June 5th, when the load time crawled to an average of 12.7 seconds, and its worst time was Wednesday, June 7th at 11 PM with a pitiful 17.6 seconds. (McDonald’s 8.5/10)

Burger King performed significantly better by comparison. Overall, the site averaged 3.6 seconds for its load time, which is pretty good. Its best day was Wednesday, June 19, when it averaged 3.5 seconds, and its best load time was a speedy 1.8 seconds at 6 AM on Wednesday, June 14. Monday, June 5 was its worst day, with a 6.1-second average load time (still better than McDonald’s BEST day), and its worst time was 8.5 seconds at 10 AM on Saturday, June 17. (Burger King 9.5/10)

Alertbot speed test green performance bar chart

Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

It seems to be the norm for California to record the fastest speeds, and the same holds true for McDonald’s. Surprisingly, however, New Jersey was the next fastest state on the list. The fast food legend saw its slowest load times in Georgia and Utah. (McDonald’s 9/10)

Burger King, for the most part, saw stronger returns across the board, with California, Colorado, Virginia, Missouri, Washington and Texas all pinging at approximately 1 msec. Its slowest locations were North Carolina and Utah. (Burger King 10/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we tested out how the experience of tracking a real package might look when using two popular parcel services. For this Showdown, we’ll see what the experience is like to use their respective websites to look up the menu and nutritional information on each company’s signature burgers.

For each of these processes, we started by opening a new tab in Google Chrome and typing in the site’s URL.

From typing www.mcdonalds.com into our Chrome browser to finding the Big Mac nutritional info took 26 seconds. We were held up at first by a front-page prompt asking us to join their email list, and the browser also wanted to access our location. From closing the pop-up to finding the Big Mac info took five mouse clicks.

Now, from typing www.burgerking.com into our Chrome browser, it took four mouse clicks and 18 seconds to reach the Whopper’s nutritional info. BK’s design is much simpler, so we can see why their load times were faster.

We liked the aesthetic of both websites, but McDonald’s has a slightly more modern feel to its design. However, its graphics are larger all around and there is more going on on the page, which could be why its overall load times are slower than Burger King’s.

So, with all things considered, with the goal being able to find the nutritional info on each chain’s most popular burger, here are the Usability scores:

(McDonald’s 9/10)       (Burger King 10/10)

 

Final Verdict

Neither site dramatically outperformed the other, but it’s safe to say that Burger King edges out McDonald’s in speed and overall performance. (Just for fun, we should follow this up with a who-has-the-better-french-fries competition!)

So, for the fourth AlertBot Showdown, the site that gets to join the ranks of previous winners Apple, FedEx and Fandango is…

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "BurgerKing.com"

]]>
Tortoise, Dinosaur or Ostrich? Proactive vs Reactive Web Monitoring https://www.alertbot.com/blog/index.php/2017/07/19/tortoise-dinosaur-or-ostrich-proactive-vs-reactive-web-monitoring/ Wed, 19 Jul 2017 11:15:31 +0000 https://alertbot.wordpress.com/?p=425 A set of three photographs: The first shows a tortoise, the middle one shows the bones of a t-rex, and the third is an ostrich with its head buried in desert sand.

Tortoise, Dinosaur or Ostrich?
Proactive vs Reactive Web Monitoring – 3 Metaphors From the Animal Kingdom
by Penny Hoelscher

In February 2017, Amazon Web Services’ (AWS) S3 web-based storage service suffered an outage that led to half of the internet “melting down” and cost businesses millions. It was caused by an operator’s typing error while issuing a routine command to take a few S3 servers offline.

What has this got to do with you?

Though the entire outage lasted just 4 hours and 17 minutes, Amazon came under attack from experts and customers in toe-curling global headline news. AppleInsider reported that even Apple was affected, with a variety of cloud services experiencing outages and slowdowns; Apple relies on Amazon for portions of its cloud infrastructure. Though not as a result of the meltdown, the company is rumored to be gradually shifting away from its dependence on Amazon.

Perhaps you’re not an Amazon or an Apple, but you too may be vulnerable. It all boils down to reliability, which has a direct effect on your revenue stream. If your web application or site delivers poor performance, your customers will go to your faster, more modern, more customer-centric competitors, where they experience less downtime, fewer outages, faster page loads and better service. The result: you will lose sales, money and even your reputation.

How can you tell that it’s time to upgrade your website monitoring tool and get expert assistance? Well, you’ve already dropped the ball if you only notice trouble when you see a decline in visitors; when the stream of traffic to your website, once a waterfall, has slowed to a trickle. An external website monitoring tool like AlertBot can alert you to potential signs of trouble, like:

  • Degraded performance, e.g. due to page bloating, inefficient scripts, backend services.
  • Cyber attacks, e.g. website defacing and file changes.
  • Incompatible website and addons, e.g. by loading and testing site scripts.
  • Software and database issues, e.g. overloaded application servers and database bottlenecks.
  • Server failures, e.g. SSL, DNS, HTTP, and Ping.
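As a rough sketch of how an external check might classify a single probe into those categories of trouble (our own illustrative thresholds and labels, not AlertBot’s internals):

```python
# Hypothetical sketch of external probe classification. The thresholds
# are illustrative placeholders, not real monitoring defaults.

def classify_probe(status_code, elapsed_seconds,
                   slow_threshold=5.0, timeout=30.0):
    """Classify one HTTP probe as 'up', 'slow', or 'down'.

    status_code: HTTP status of the response (None if the request
    never completed, e.g. a DNS or connection failure).
    elapsed_seconds: total response time for the probe.
    """
    if status_code is None or status_code >= 500 or elapsed_seconds >= timeout:
        return "down"   # server error, connection failure, or timeout
    if elapsed_seconds >= slow_threshold:
        return "slow"   # reachable, but degraded performance
    return "up"
```

A real service layers much more on top (multi-location confirmation, SSL and DNS checks, script-driven transactions), but the core of every alert starts with a judgment like this one.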

In a nutshell, if your website persona resembles one of the following – tortoise, dinosaur or ostrich – you’re in trouble:


TORTOISE: Outages, high down-times and slow loading times

The internet is not like your local shopping mall, a convenient one-stop shop for all your household needs. These days, “I want it and I want it now” customers have far more options, and if you’re closed for business, they’re not going to have a cup of coffee and wait for your door to open again; they’re simply going to mosey over to your competitors. One thing hasn’t changed in the digital sphere: old adages still hold true. Customer loyalty is a fair-weather friend in an online environment, and when it comes to affiliates, frankly, time is money.

Website monitoring tools not only report on outages and high down-times, they help you to identify where (e.g. a particular geographic location), when (e.g. peak hours) and why (e.g. network issues) these are occurring. You may find it is your business model that is at fault, not slow servers or bloated software; for instance, perhaps you’re performing maintenance and upgrades at the wrong time for a time zone different from your head office’s.

In addition, page loading speed is one of the ways Google ranks your web pages. This matters because when searching for products and services, customers will click on the matching businesses Google serves first.


DINOSAUR – Being behind the times

Google lowers mobile page rankings for companies that do not have a mobile-responsive web design. New website design trends have changed the face of online businesses, and today’s tech-savvy generation can spot an old-fashioned, uncool design in a heartbeat. But keeping up with new design technologies can have an impact on your website’s performance. Page bloat is much like a beer belly: extraneous code, affiliate advertising and toxic data (storage of unnecessary and dated information) creep up sneakily but have a huge impact.

One of the main benefits of a professional website monitoring service is that it provides you with automated intelligence that can manage big data and learn from the information it receives. You don’t have to wait for users to complain or continuously test the site yourself, and, because your business is constantly evolving, the service can update its algorithms in tandem. These sophisticated technologies not only gather and analyze the data you need to make informed decisions about performance, they provide you with solutions.

Cyber attacks are a 21st-century bane to which all online businesses – big and small – are vulnerable. Of increasing concern is that at many companies it can take months before a data breach is detected, giving cyber criminals plenty of time to ravage their victims’ systems. AlertBot can’t prevent a data breach, but it can alert you when you’re attacked, e.g. by notifying you that files have been changed or your site has inexplicably gone down.


OSTRICH – Customer complaints

Negative social media posts can be harsh on a business’s reputation. Often, it may appear unfair, especially when the trolls join the battle to bring you down. Sure, you need a team to monitor social media channels and publicly appease customers (including the trolls) who have issues, but that’s not enough. An external website monitoring service can give you advance warning of problems with your system.

Customer Experience (CX) is not just about the latest trends (mobile-first, conversational brands, emotional engagement, predictive analytics, personalization and so on); CX is about serving customer needs and wants (read: demands) BEFORE they start complaining. Once your website starts exhibiting dinosaur or tortoise characteristics because you’ve been acting like an ostrich with its head in the sand, it’s too late; all you will have left are your ex-customers’ public vents still floating around on complaint forums and social media channels.


Conclusion

The Amazon debacle should be a wake-up call for businesses to be more proactive about monitoring the uptime and infrastructure of their systems. Imagine how red your company’s face would be if you didn’t notice a crisis before your users did and had to be informed by their irate calls and emails.

A monitoring tool like AlertBot simulates actual user behaviors and interactions, running tests in real time using popular web browsers like Chrome and Firefox. It’s easy to set up (no installation necessary) and allows you to create scripts for different user experiences across multiple devices, using multiple features and functions, enabling you to be proactive at the best of times and promptly reactive at the worst (after all, accidents do happen).

]]>
AlertBot Showdown: FedEx vs UPS https://www.alertbot.com/blog/index.php/2017/05/23/alertbot-showdown-fedex-vs-ups/ Tue, 23 May 2017 18:51:27 +0000 https://alertbot.wordpress.com/?p=414


A graphic with a yellow starburst in the center and two robots charging towards each other. Both are carrying rectangular shipping boxes. Text reads "AlertBot Showdown: FedEx vs UPS" with the word SHOWDOWN very large at the bottom.

 

One of the most appealing things about ordering items online is receiving packages in the mail. Not only is it convenient for the fruits of your shopping toils to be brought directly to your door, but you can do your shopping from anywhere at any time of the day or night (and in your pajamas if you so desire). Two well-known, worldwide services that nearly everyone who has sent or received a parcel has used are UPS and FedEx. Both services are easily accessible for sending packages, and both are frequently used for receiving them. Both services also have websites that enable users to track their packages (if they’ve been given a tracking number), while also helping to provide resources for sending them out.

For our third Showdown, we set out to track the performance of these two services, trucking along until we could wrap up the results for delivery to you.

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both parcel service sites for three weeks, spanning from March 27, 2017 to April 17, 2017. Not surprisingly, the performance proved to be reliable for both sites. Neither service’s site went down, but one did prove to perform a little faster than the other.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or it is completely down), and we look for what caused those failures.

FedEx’s website experienced not a single, solitary failure event. At the very worst, it may have experienced some slight slowness for a short period of time, but it didn’t affect their overall reliability results. (FedEx 10/10)

UPS’s website told a slightly different story, though it too saw no failure events or periods of actual downtime. The most UPS’s site experienced was a handful of warnings that it was performing a little slower than usual, and a little slower than the average expected load time. These periods of minor slowness lasted only about 3 to 5 minutes each. (UPS 9.5/10)

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Both websites have pretty basic homepages, so the load times for customers should be fairly quick (even on a slow internet connection) if the sites aren’t experiencing any server issues.

FedEx’s site speed is fantastic, averaging less than 1 second on most occasions. Its best response time was recorded on Wednesday, April 5, 2017 at 0.5 seconds, while its slowest response time was on Monday, April 17, 2017 at just over 2 seconds (which is still very good). (FedEx 10/10)

UPS was also pretty good, but their best response time was about the same as FedEx’s worst response time. UPS’s best response time was 2 seconds on Tuesday, April 11, while their worst was on Monday, April 10 at just a hair under 6 seconds. The standard used to be 7 seconds, but these days users expect sites to load in roughly 2 seconds. (UPS 8/10)

Alertbot speed test green performance bar chart


Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others.

It’s also interesting to note that in most of the tests we’ve done for these Showdowns so far, California seems to come out on top when it comes to website speed. That held true here: FedEx performed best in California, at just under half a second, and performed the “worst” in Virginia, which still averaged an impressive 1.1 seconds. (FedEx 10/10)

UPS also saw its best results in California, but clocked in at around 1.4 seconds there. Texas returned the slowest results, however, averaging around 5.6 seconds. (UPS 8/10)

Alertbot performance by region green bar chart

Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we went through the motions of ordering tickets for a recent movie on MovieTickets.com and Fandango.com. For this evaluation of FedEx and UPS, we’ll see how the experience of tracking a real package goes.

For each package tracking process, we started with the tracking number copied to our clipboard and then typed the URL of the test site into our browser.

From the point of typing www.FedEx.com into our Firefox browser, selecting the tracking tab at the top, pasting the tracking number into the search field on the left sidebar and clicking “Track,” it took only 15 seconds to get to the tracking results. That’s really fast! We then tried the same process using the Google Chrome browser; this time the “region” needed to be selected first, and it took only a second longer to complete!

Now, from the point of typing www.UPS.com into our Firefox browser, selecting the region, pasting the tracking number into the search field on the left sidebar and clicking “Track,” it took roughly 22 seconds to get to the tracking results. That’s not bad, but it’s clearly slower than our FedEx experience. We then tried the same process again using the Google Chrome browser and it took an impressive 12 seconds to complete!

So, with all things considered, with the goal being to track a package as quickly as possible, here are the Usability scores:

(FedEx 10/10)        (UPS 9/10)


Final Verdict

It’s a close match, to be honest, but we’d have to say that FedEx.com outperformed UPS.com on speed, swiftly delivering not just highly anticipated parcels to its customers, but reliable website performance as well.

So, for the third AlertBot Showdown, the site that gets to join the ranks of previous winners Apple and Fandango is…

 

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "FedEx.com"

]]> The Most Important Pages and Processes to Monitor on a Website https://www.alertbot.com/blog/index.php/2017/04/25/the-most-important-pages-and-processes-to-monitor-on-a-website/ Tue, 25 Apr 2017 12:25:25 +0000 https://alertbot.wordpress.com/?p=410 An illustration of an arm and hand holding a magnifying glass up to a bunch of gears and a laptop displaying a bar graph and a gear connected to the other gears.

The Most Important Pages and Processes to Monitor on a Website

Most companies take advantage of third party website monitoring services to monitor their websites 24/7 for performance issues and downtime. These services alert them immediately when problems arise, equipping them with the necessary knowledge to pinpoint the problem so their team can resolve it.

Companies rely on their website for many things.  Whether their website is used to generate leads, drive business, or keep customers engaged, essential processes and pages on their website are often the lifeblood of their business and online presence.

In the same way that a routine doctor or dentist appointment evaluates your health and checks for any potential impairments or issues that need improvement or fixing, using website monitoring to routinely check your site’s performance is crucial to the success of your company’s online presence.

Here are some important processes and webpages to evaluate and monitor on your website:

The Landing Page

Your landing page is the page that is supposed to hook your visitors, draw them in and get them interested in your product or service. Making sure these pages are always reachable by potential new customers is of utmost importance. It may seem like a no-brainer to monitor this vital page, but many small-business owners don’t think to apply website monitoring to their landing pages.

Page Loading Speed

Once users get past your landing page, they become keenly aware of your website’s speed, particularly if it’s sluggish. With competition this fierce, one of the major website processes to monitor is each page’s loading speed. You cannot afford a home page that takes 10 seconds or more to load. The new generation of internet users is not patient enough to sit through a sluggish load or stare at a spinning “loading” icon. If you have a page that takes too long to load, you may need to make some design alterations, incorporating minimalistic design that is both attractive and loads faster. Many web designers have taken this into account and adopted new techniques to make webpages load faster while retaining a fresh and respectable look. Website monitoring can help you identify whether your page load time is negatively affecting your bottom line.
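The idea of tracking each page’s loading speed against a budget can be sketched in a few lines. This is our own illustration of the concept, not a real monitoring API; the function names and the 2-second budget are assumptions:

```python
# Hypothetical sketch: aggregate load-time samples per page and flag
# pages that exceed a speed budget. (Illustrative only.)

def average_load_time(samples):
    """samples: list of measured load times in seconds for one page."""
    return sum(samples) / len(samples)

def pages_over_budget(page_samples, budget_seconds=2.0):
    """page_samples: dict mapping page URL -> list of load times.
    Returns the URLs whose average load time exceeds the budget."""
    return [url for url, samples in page_samples.items()
            if average_load_time(samples) > budget_seconds]
```

Run against a day of samples, a report like this is what tells you which pages need the design diet described above.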

Geographic Performance

Monitoring your website’s traffic and performance from different countries is extremely important. Knowing where most of your customers come from, and optimizing performance for that geographic area, can make all the difference for your business. If you cater to a certain state or province, then monitoring the specific geographic location or district that fuels your business is recommended.

Your Shopping Carts

E-commerce driven websites must monitor their shopping carts very closely. For example, if a customer placed products in a cart but did not buy them, it could mean that there are issues with the checkout process. However, if you were not monitoring your cart, you would never know about it and might just assume they lost interest. Poor shopping cart performance will directly affect your company’s sales, which makes monitoring your shopping cart processes that much more important.

Your Signup Pages

Any page on your website that prompts a customer to sign up or register for a service needs to be up and running 24/7. When a site’s signup pages are not working optimally, visitors often abandon the signup process due to a loss of confidence. Since these pages are directly involved in registering new customers or providing new services to existing customers, they are some of the most crucial to monitor on your website.

Login Pages

Customer frustrations over not being able to access members-only areas of your website can cost you not only customers, but also support hours dealing with the problem. Getting ahead of the problem by monitoring these areas can save your company a lot of time and money.

These are just some of the top areas of your website to ensure are running smoothly 24/7. Start monitoring your most crucial pages today with a no-risk, 14-day FREE trial of AlertBot and start saving your company time, money and unnecessary headaches.

]]>
Website Monitoring Leader AlertBot Launches Redesigned Website to Improve Customer Experience https://www.alertbot.com/blog/index.php/2017/03/27/website-monitoring-leader-alertbot-launches-redesigned-website-to-improve-customer-experience/ Mon, 27 Mar 2017 19:09:34 +0000 https://alertbot.wordpress.com/?p=407 Website Monitoring Leader AlertBot Launches Redesigned Website to Improve Customer Experience

Key customer-focused features of AlertBot’s new website include responsive design, improved UX, intuitive navigation, new content and more.

March 27, 2017 – AlertBot, a leading provider of enterprise-class server and website monitoring solutions, announced today that it has launched a completely redesigned website at www.alertbot.com.

“As a leader in website performance monitoring, we know how important it is to stay relevant and up-to-date with the latest technology and trends,” commented Pedro Pequeno, President of InfoGenius.com, Inc., which owns and operates AlertBot. “Our new website is the result of months of planning, development and testing. We are proud that it continues our tradition of quality and customer-focused updates that help make AlertBot so essential to our growing roster of customers worldwide.”

Key customer-focused features of AlertBot’s new and improved website include:

  • Responsive design, which enables visitors to enjoy and access the website from any device: desktop, laptop, tablet or smartphone.
  • Engaging graphics, which provide visitors with a crisp, clear and uncluttered experience.
  • An intuitive interface and ultra-fast loading speeds that help visitors easily and rapidly find the information they need.
  • New product pages that provide a clear picture of AlertBot’s advanced platform.
  • A new Blog that shares practical advice and insights.

Added Mr. Pequeno: “Since launching our new website, the feedback we have received from current and new customers has been incredibly positive. We look forward to enhancing and adding new features in the months ahead!”

A photo showing a top-down view of a laptop and its screen, a computer mouse, cell phone, and tablet. All of the screens show the AlertBot.com website. A cup of hot coffee is also on the desk with other office items.

About AlertBot

Founded in 2006, AlertBot, through its industry-leading TrueBrowser® solution, enables businesses to continuously monitor the availability and performance of their mission-critical public Internet services from across the country and around the world. When AlertBot detects an issue with a website or server, it analyzes the problem within seconds from multiple geographic locations and delivers real-time alerts to business leaders and system administrators on their smartphones and other mobile devices. Thousands of companies trust AlertBot to help them deliver the uptime and performance they expect, and their customers demand. Learn more at www.AlertBot.com.

About InfoGenius.com, Inc.

Founded in 1999 by a group of engineers, InfoGenius prides itself on building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Learn more at www.infogenius.com.

]]>
When Does Most Website Downtime Occur? https://www.alertbot.com/blog/index.php/2017/03/27/when-does-most-website-downtime-occur/ Mon, 27 Mar 2017 17:57:01 +0000 https://alertbot.wordpress.com/?p=364 Photograph of a man looking distressed with six arms coming off of him, each holding a different item. The items include a planner book, a calculator, a magnifying glass, a laptop, an abacus, and a marker.

When Does Most Website Downtime Occur?

To become competitive in the global market, it’s crucial for your business to have a strong online presence. One of the best ways to ensure this is to have a user-friendly business website that is accessible ’round the clock. And if your customers rely heavily on your website, you know that any amount of time your site is down could be rather costly.

Frankly, website downtime is inevitable. Even the big online giants like Microsoft, Google, Facebook, eBay, YouTube, Amazon and CNN have experienced website downtime at some point.  However, the good news is that you can mitigate the risk and lower the length of time your site remains inactive if you are familiar with some of the likely causes of website downtime.

Let’s dig a little deeper to find out the common causes of site downtime:

§  Server Overload

Server overloads occur when a big wave of online traffic overwhelms a server. There are two common situations in which this happens. First, if your site is hosted on a shared server: resources on shared servers are limited and must be stretched to support high volumes of traffic and site-processing needs, which can cause overload. As a result, your site may be inaccessible to users for hours.

Second, server overloads may also happen on major online shopping days, like Black Friday and Cyber Monday, or any other occasion for that matter, when you have significant discount deals and special sales running on your website. Such deals draw in heavy traffic, thus increasing the chances of server overload and site downtime.

§  Hardware Failures

Server and network failures can bring a website to a screeching halt in no time flat. This could be caused by things like hard drive failures, power supply failures, circuit board failures, or cabling failures. It can also be caused by more troubling failures like data center infrastructure failures or network peering failures.

§  Webmaster Errors

Your business may experience downtime because of errors caused by the site’s webmaster. For example, your site may not be accessible to your audience if your webmaster forgets to renew the site’s hosting contract or domain name.

§  Coding Errors

Some common coding errors are incorrect syntax, infinite loops and typos. All of these can exhaust the server’s resources and yield 500 (Internal Server Error) responses, resulting in website downtime.

§  Cyber Attack

With the surge in cyber crime, you need to make sure that your website is well-protected from cybercriminals, hackers and viral infections. Cybercriminals know how to hijack websites and redirect your site visitors to other websites or expose them to malicious content.

All of this can result in lengthy website downtime, which can be detrimental to your business sales, profits and reputation. And that is definitely something that no business owner wants! One way to help prevent cyber attacks is to keep your IT team, and those directly responsible for the health of your website and server, in the know about the latest cyber threats.

§  Distributed Denial of Service Attacks (DDoS)

A Distributed Denial of Service (DDoS) attack can also bring your online business to a standstill. DDoS attacks are planned: heavy traffic is deliberately directed from different sources to overload servers and, in some cases, crash them entirely.

§  Natural Disasters

Website downtime may also occur when your data center is hit by natural disasters like floods, hurricanes, earthquakes and fires.

§  Planned Downtime or Server Maintenance

Lastly, if you have a dedicated server, you may need to go offline for server maintenance. This usually involves upgrading hardware components, drivers, operating systems, firmware, and even software applications. With these planned occurrences, you can alert customers ahead of time to the planned outage, which can help minimize the effect it may have on your business.
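Planned maintenance is also exactly the case where you want alerts suppressed, a do-not-disturb for expected downtime. A minimal sketch of that suppression logic, assuming a hypothetical list of maintenance windows (not a real monitoring tool’s API):

```python
from datetime import datetime

# Hypothetical sketch: suppress alerts that fall inside a planned
# maintenance window. Real tools handle recurrence, time zones, and
# per-contact schedules; this shows only the core check.

def should_alert(event_time, windows):
    """windows: list of (start, end) datetime pairs for planned
    maintenance. Alert only if the event falls outside every window."""
    return not any(start <= event_time <= end for start, end in windows)
```

An outage detected at 3 AM during a 2 AM to 4 AM upgrade window stays silent; the same outage at 5 AM pages the on-call team.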

Knowing the reasons for, and causes of, website downtime is crucial as it will help you devise and implement the right mix of strategies to overcome and avoid it.

AlertBot’s external website monitoring service exists to help businesses like yours identify and fix website errors when they happen and hopefully prevent future downtime. Visit www.AlertBot.com for more information and to sign up for a free, no-risk trial.

]]>
AlertBot Showdown: Fandango.com vs MovieTickets.com https://www.alertbot.com/blog/index.php/2017/03/06/alertbot-showdown-fandango-com-vs-movietickets-com/ Mon, 06 Mar 2017 11:37:00 +0000 https://alertbot.wordpress.com/?p=371 A graphic with a yellow starburst in the center and two robots charging towards each other. Text reads "AlertBot Showdown: Fandango vs MovieTickets.com" with the word SHOWDOWN very large at the bottom. Kernels of popcorn are scattered around the image.
Last fall, AlertBot debuted its first website “Showdown,” pitting cell phone giants Apple and Samsung against each other in an epic web performance death match (of sorts) to see whose website performed the best. (For those results – and to see who won – check out that blog here.)

For our second Showdown, we decided to grab an oversized bucket of popcorn, an unreasonably large cup of soda, and a pair of cheap, plastic 3D glasses and plopped down into the comfiest of chairs to evaluate two of the premier movie-ticket-buying sites: Fandango and MovieTickets.com.

If you’re a movie buff who loves a night out reclining in front of ceiling-high silver screens to watch the latest Hollywood has to offer, chances are you’ve purchased tickets online before. And what would be more frustrating than website failure while you’re trying to combat the masses to secure your entry into an anticipated film’s opening night?

We used AlertBot’s external website monitoring system and its TrueBrowser™ technology to monitor both sites from December 26, 2016 to January 16, 2017. Not surprisingly, the performance proved to be pretty good overall, although one of the sites experienced some pretty significant issues on one of the days. Both sites saw some minor “Slow Page” warnings, but MovieTickets.com took a hit right after Christmas with a dreaded “Server Too Busy” error, meaning their website couldn’t withstand the weight of the traffic it was getting.

Reliability

For the reliability evaluation of a website, we look for failure events (like when a page doesn’t fully load or is completely down), and we look for the causes of those failures.

Fandango’s website experienced not a single failure event. The worst it got for Fandango in this time period was a handful of minor “Slow Page” warnings on Dec. 29th and, after the new year, on the 2nd and 5th. (Fandango 10/10)

Meanwhile, MovieTickets.com experienced what AlertBot considered to be 13 failure events. While most of them were brief, 3-to-5-minute slow-page-load events, on Dec. 27th (a Tuesday) the site saw a significant outage, reporting a “Server Too Busy” error for nearly 6 hours! Users visiting during that stretch were most likely met with that dreaded error message in their browsers instead of the actual site, which would have been super frustrating (especially for anyone trying to order tickets in a jiffy). (MovieTickets.com 7/10)

 

Alertbot Uptime green circle performance chart

Speed

When evaluating a website’s speed, we look at the time it takes the site’s homepage to render and load to the point of being fully interactive for the user. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Both websites have pretty busy front pages that tend to change often and feature videos, Flash-driven ads and some graphic-heavy content, all of which can really compromise a website’s load time.

Fandango’s speed is solid, averaging less than 2 seconds. Its fastest day was Friday, Jan. 13, 2017 at 1.7 seconds and its slowest was Thursday, Dec. 29, 2016 at just over 2 seconds. (Fandango 9/10)

MovieTickets.com didn’t fare quite as well, unfortunately. On its best day, Tuesday, Jan. 3, the front page took almost 4 seconds to load. On its worst day, Dec. 27 (also a Tuesday), it took almost 7 seconds, well short of current industry load-time standards. (MovieTickets.com 7/10)

 

Alertbot speed test green performance bar chart


Geographic

It’s always interesting to see how sites perform differently across the world. If we look exclusively at the United States, it’s intriguing to see which states regularly see faster or slower times than others. Fandango saw the fastest load times in California (which makes sense, given that the movie industry is centered there), with the slowest happening in Texas. Even so, its slower locations, Washington, Virginia and Texas, typically saw load times of around 2 seconds, occasionally pushing 3. (Fandango 10/10)

For MovieTickets.com, it’s a different story. We already know they struggled with speed, but the question here is: where? California is also the best location for MovieTickets.com, with load times around 2.7 seconds. The worst, again, is Texas, at almost 6.5 seconds. Florida and North Carolina also performed well, while Washington joined Texas among the slower locations. (MovieTickets.com 8/10)

Alertbot performance by region green bar chart


Usability

For usability, we select a common task a user might typically try to accomplish when visiting the sites and replicate it. For our previous Showdown, we chose the task of ordering the latest cellphone from the respective sites of Apple and Samsung. For these sites, we’ll see how the two movie-ticket-ordering experiences compare.

Starting with selecting a movie to buy tickets for, we approached each site with the goal of ordering two tickets for the recently released The LEGO Batman Movie.

The LEGO Batman Movie theatrical poster showing LEGO Batman characters running toward the screen.

Since we chose a brand-new film, it was easy to find it on the homepage of Fandango.com and start clicking through to order tickets. From the point of typing www.Fandango.com into our Firefox browser, clicking on the LEGO Batman Movie poster, putting in our zip code, and selecting the next available time and number of tickets, it took roughly 40 seconds to get to the Fandango checkout. That’s not bad. If this were for real, we would probably have spent extra time weighing our showtime options a bit more, choosing 2D over 3D, etc. But for this task, we figured it best to keep it simple. The whole process took about 4 clicks of the mouse, with a little typing to put our zip code in.

For MovieTickets.com, we found the experience to be mostly the same, except that when we put our zip code in, MovieTickets.com seemed to give more options right off the bat. Fandango suggests the closest theater for your zip code and the first batch of showings it finds (in this case, a 3D showing), while MovieTickets gives you the full list of showings and format options. Our experience felt more thorough with MovieTickets, getting more choices right away, but we also felt that the extra options slowed our browsing because we had to read and think more. Still, the process for MovieTickets.com took the same 4 clicks and around the same 40 seconds.

We tried both again from Google Chrome, and even setting aside our newfound familiarity with both sites, each process took about 30 seconds this time. They’re easy sites to navigate, and their load times were swift.

So, all things considered, with the goal of ordering tickets for one of the most recently released films, here are the Usability scores:
(Fandango 9/10)      (MovieTickets.com 10/10)


Final Verdict

We’d say Fandango won by quite a bit, given its better web performance over MovieTickets.com, though we enjoyed the usability of MovieTickets.com over Fandango’s. The fact that Fandango doesn’t present showtime options upfront is a little unfortunate.

Still, one cannot ignore good web performance, and we have to hand it to Fandango for achieving impressive site speed and reliability. So, with that said, the result of the second AlertBot Showdown is…

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Fandango"

]]>
How Much Impact Does an Hour of Website Downtime Have on a Business? https://www.alertbot.com/blog/index.php/2017/02/27/how-much-impact-does-an-hour-of-website-downtime-have-on-a-business/ Mon, 27 Feb 2017 11:00:27 +0000 https://alertbot.wordpress.com/?p=353 An illustration of a business man with a briefcase running away from a shadowed monster with red eyes and red graph arrows coming from its head and mouth that are pointing downward. The background is a yellow grid with a couple money symbols.

How Much Impact Does an Hour of Website Downtime Have on a Business?

So, your business website is offline again and your IT team has sprung into action, trying to pinpoint the issue and fix it as soon as possible. Sure, it’s good that your IT experts are handling the problem responsibly, but do you know how much money your business may have lost during your website’s downtime? Well, if you are a major player in the ecommerce industry, chances are you could have lost millions of dollars by now. And that is not an overstatement.

Like it or not, even an hour of downtime can do a great deal of damage to your online business. In 2014, Google experienced an hour of downtime, reportedly caused by a virus, that affected Gmail, Google+ and Google Drive, and Google’s stock dropped 2.4 percent as a result.

But that’s not all! Amazon, the e-shopping giant, experienced 2 hours of downtime, presenting site visitors with cryptic HTTP messages. In just 2 hours, Amazon lost an estimated total of $3.48 million. That’s huge!

So, if you wish to estimate what an hour of website downtime truly costs your business, you’ve come to the right place. Here are some of the more important variables to consider when calculating this cost:

Impact on Business Sales

To figure out exactly how much an episode of website downtime costs in terms of lost sales, determine your average profit per minute during the time period in which the downtime occurred, then multiply that average by the number of downtime minutes to get your total lost sales profit. If the downtime occurs at 2 in the afternoon, for example, it will most likely cost your business more sales than if the outage had happened at, say, 2 in the morning, when web traffic is typically much lighter.
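The calculation described above is simple enough to sketch in a few lines (the dollar figures in the example are made up for illustration, not benchmarks):

```python
def lost_sales(avg_profit_per_minute: float, downtime_minutes: float) -> float:
    # Measure the per-minute average over the same time-of-day window in
    # which the outage occurred: a 2 p.m. outage and a 2 a.m. outage have
    # very different per-minute averages.
    return avg_profit_per_minute * downtime_minutes

# Hypothetical example: $45/minute average profit, 90-minute outage.
print(lost_sales(45.0, 90))  # prints 4050.0
```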

Damage Done to Your Business Reputation

Downtime (especially if it’s frequent or at a crucial time) can scar your business’s reputation, losing the trust and loyalty of customers in your brand. Just like many businesses, you too have invested good money and a great deal of time in brand building. Your time and money can go to waste if you experience downtime—even if it is for just an hour. When considering the true cost of your site’s downtime, it is important that you keep in mind the resources you’ll need to spend to repair your tainted brand image going forward.

Money Wasted in Marketing Campaigns

Another factor to consider is the money you have invested in marketing efforts, like PPC (pay-per-click) campaigns. You need to figure out the amount spent on marketing while your site was experiencing downtime. This is important to calculate because, let’s face it, you reaped no benefit from that money: your site was inaccessible when prospects clicked on the PPC link or advertisement.
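Estimating that wasted spend is equally mechanical once you know your cost per click and the click volume during the outage; a sketch with hypothetical numbers:

```python
def wasted_ad_spend(cost_per_click: float, clicks_during_outage: int) -> float:
    # Every click during the outage was paid for but landed on an
    # unreachable page, so this spend produced no return.
    return cost_per_click * clicks_during_outage

# Hypothetical example: a $1.50-per-click campaign that received
# 200 clicks during a one-hour outage.
print(wasted_ad_spend(1.50, 200))  # prints 300.0
```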

Prevention is Always Best!

Calculating the cost you might have incurred due to an hour of website downtime is essential, but there are precautions you can take to avoid unplanned downtime and keep your business up and running ’round the clock (and be a hero!). AlertBot is an intuitive web-based website monitoring service that can alert your team to website errors and slowness within seconds and help you keep track of your site’s performance, both of which go a long way toward mitigating downtime. Start the AlertBot 14-day free trial today!

]]>
Why is Website Performance Monitoring Necessary? https://www.alertbot.com/blog/index.php/2017/01/17/why-is-website-performance-monitoring-necessary/ Tue, 17 Jan 2017 11:00:04 +0000 https://alertbot.wordpress.com/?p=343 Photo of two hands holding a tablet horizontally with illustrations of graphs and icons floating off the face of the tablet.

In today’s highly digitized world, websites, blogs and web stores are an essential component of any business and brand. While waiting for a site’s content to load can be annoying for a user, it can also be potentially disastrous for business.

That, however, is only one reason to monitor the performance of your website. Here are four more:

1. Loss of Sales and Web-Traffic

First and foremost, businesses maintain websites and web stores to promote commercial growth. Now, imagine you’ve gone to a store where the service is impossibly slow. The salespeople are hardly making an effort to engage or help you, so you decide to take your business elsewhere. The same happens when a shopper visits a website that takes ages to load. Instead of making a sale, you lose web traffic and potential customers. You can prevent this by monitoring how your website is performing.

2. Potential Damage to Brand Image

Customers talk, and they are interested in what others like them have to say. While most brands depend on marketing campaigns to promote sales, the importance of word-of-mouth advertising cannot be discounted. If you leave a bad impression on one customer, chances are word will spread, tainting, if not tarnishing, your hard-earned reputation and brand image. Who wants that?

3. Error Detection

Website performance monitoring is the best way to prevent errors. It’s all too common for ecommerce sites to hit a snag and run into trouble.  If your site is regularly maintained and monitored, you’ll not only be able to fix a problem sooner; you might even be able to detect it beforehand and prevent it completely.

4. Quality Maintenance

Just as quality assurance is essential for a physical store, it’s equally important for a website and web store. By using a performance testing and maintenance tool, you will be able to standardize and retain the quality of your website. Not only will that help preserve the website’s ranking on Google, it will also help drive online traffic. Google rankings are sensitive to even small changes in website speed and downtime, which is a big part of why websites are search engine optimized in the first place.

So, if you’re even partially convinced that your website needs performance monitoring, why not start the AlertBot 14-day free trial, today?

]]>
New BOTS Act Passes: A Win For Ticket Buyers! https://www.alertbot.com/blog/index.php/2016/12/16/new-bots-act-passes-a-win-for-ticket-buyers/ Fri, 16 Dec 2016 22:12:15 +0000 https://alertbot.wordpress.com/?p=346 New BOTS Act Passes: A Win For Ticket Buyers!

If you’ve ever tried to buy tickets online for an event, whether for a popular Broadway play, a Las Vegas event, a concert in a local city or even a popular science fiction blockbuster movie months in advance, chances are you’ve struggled to obtain tickets at regular price. Part of the problem is that many online scalpers have perfected the art of using bots to snatch up tickets as soon as they become available to the public. Fans can breathe a sigh of relief, however, because just this week, President Obama signed The Better Online Ticket Sales (BOTS) Act to help combat this.

It’s unfortunate that this online ticket scalping problem has become so rampant that it’s taken governmental action to try to put an end to it. That blows our minds. But, in our opinion, the BOTS Act should be good for the consumer (and it’s about time!). It’ll allow for more tickets to be available at face value for consumers online than ever before. Event ticket vendor Ticketmaster commented yesterday, “On behalf of artists, venues, teams, and especially fans, we applaud the BOTS Act being signed into federal law.”

The BOTS Act won’t be good for everyone, though. For consumers with fatter wallets who’ve grown accustomed to purchasing tickets at the last minute, and at inflated prices, the availability of those second-chance tickets may now be slim to none.

The BOTS Act doesn’t really impact AlertBot, but if anyone had been considering creating an automated script on our system to buy highly sought-after tickets as soon as they go on sale, that won’t be possible anymore. That’s not a use case we’d ever thought of before, but we suppose it was a possible scenario. As of this week, however, the AlertBot team will not allow any scripts of this nature on the AlertBot system (and as far as we know, there never have been any). However, if you work for a ticket sales website and have been thinking of monitoring your own ticket purchasing pages and shopping cart, you can still do that with AlertBot. That’s exactly what AlertBot is meant for, and the BOTS Act provides an exception allowing online ticketing operators to use bots to monitor and test their own systems for flaws. That’s a good thing! As we know, some of these ticket vending sites could use some improvement.

In mid-July of this year, the BOTS Act was first presented to the Senate, and on December 7th the House passed it without objection. It was then presented to the President on Monday of this week, and he signed it. This Act, the first of its kind pertaining to online bots, applies to online sales for any public event exceeding 200 people and can be viewed in its entirety here.

]]>
Press Release: AlertBot Launches New Blog Series ‘Website Showdowns’ https://www.alertbot.com/blog/index.php/2016/12/13/alertbot-launches-new-blog-series-website-showdowns/ Tue, 13 Dec 2016 19:54:00 +0000 https://alertbot.wordpress.com/?p=339 AlertBot Launches New Blog Series: “Website Showdowns”

Allentown, PA / December 13, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading TrueBrowser®-based web application monitoring solution, AlertBot, is pleased to announce the launch of a new series of AlertBot blogs the team has dubbed ‘Website Showdowns.’ AlertBot’s Showdown blogs will feature monitoring results from competing websites, showcasing AlertBot’s TrueBrowser® technology at work, which combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions.

The AlertBot Showdown blogs will evaluate each website’s performance based on four categories, including reliability, speed, geographical performance and usability, complete with time-based trends and detailed analytics.

This month’s scrimmage pits rivals Apple.com against Samsung.com. With two titans of industry like these going head to head, the results were, for the most part, not unexpected.  Read the full report here.

AlertBot remains on the cutting edge of website performance. With 85 Global Test Locations operating over 7 Internet Backbones developed during the past decade, AlertBot has established its reputation in real-world private industry applications. AlertBot serves over 10,000 users spanning 6 continents worldwide with 200 million website checks per month. Its Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine.

About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.


About InfoGenius.com, Inc.:

Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.

]]>
AlertBot Showdown: Apple vs Samsung https://www.alertbot.com/blog/index.php/2016/11/18/alertbot-showdown-apple-vs-samsung/ Fri, 18 Nov 2016 18:47:59 +0000 https://alertbot.wordpress.com/?p=258 A graphic with a yellow starburst in the center and two robots charging towards each other. Text reads "AlertBot Showdown: Apple vs Samsung" with cellphones above the brand names and the word SHOWDOWN very large at the bottom.

If website performance is important to you, then you’ll know just how vital it is to the success of your business’s website. To AlertBot, web performance is everything. This topic is of great interest to us, as we live and breathe web performance on a daily basis. It got us thinking: we all love a good head-to-head, mano-a-mano rivalry. Tyson vs Holyfield. The Hatfields vs The McCoys. The Jets vs The Sharks. Prego vs Ragu. Luke vs Vader. So we thought, what if we tracked the performance of two websites within a certain genre and pitted them against each other? Who has the better website performance? Who will come out on top?

Every fall, Apple releases a new iPhone like clockwork. But Apple isn’t the only game in town. With Apple celebrating the recent release of the iPhone 7, Samsung has its Galaxy S7 (which was released in March). So we decided it was fitting to have Apple.com go toe-to-toe with Samsung.com. The results were not unexpected. (Well… most of the results.)

When you have companies as serious about their products and innovation as these two, you’d expect their websites to perform impeccably. And, honestly, they did.

We tracked the sites and examined three weeks in September – the 1st through the 22nd – to see how these sites performed.  During this timeframe, we tested the websites around the clock from 17 different locations across the United States using AlertBot’s TrueBrowser Monitoring.  The tests were performed by loading their homepages inside real Firefox browsers and giving them a maximum of 7 seconds to render and become fully interactive.  Anything beyond 7 seconds (which is well above the average expected page load time) was considered a failure.  After compiling all the data, this is what we found:
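As a rough illustration of the pass/fail rule described above, here is a minimal standard-library Python sketch that times a plain HTTP fetch against the same 7-second budget. The function and result shape are our own invention for this example; AlertBot’s actual TrueBrowser tests load pages in real Firefox browsers and measure time-to-interactive, which a raw fetch only crudely approximates:

```python
import time
import urllib.request

LOAD_BUDGET_SECONDS = 7.0  # anything slower counts as a failure

def check_homepage(url: str) -> dict:
    """Fetch a page and report whether it loaded within the time budget."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=LOAD_BUDGET_SECONDS) as resp:
            resp.read()  # pull the full body, not just the headers
            status = resp.status
    except Exception as exc:  # timeouts, DNS failures, HTTP errors, etc.
        return {"url": url, "ok": False, "error": str(exc)}
    elapsed = time.monotonic() - start
    return {
        "url": url,
        "ok": status == 200 and elapsed <= LOAD_BUDGET_SECONDS,
        "seconds": round(elapsed, 2),
    }
```

In a real rotation, a check like this would run on a schedule from many locations and alert on consecutive failures rather than a single miss.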


Reliability

When we examine the reliability of a website, we’re looking for failure events – like when pages don’t fully load or go down completely – and try to identify the cause of the failure. Some common causes are slow third-party code used on pages, incomplete page content, actual web server failures, etc.

For Samsung, their website experienced no failure events during our test period, and achieved 100% uptime. This is definitely above the norm for website performance, but not unexpected for a company like Samsung.  We would have loved to find some juicy failure-generated data to talk about, but Samsung’s website was as clean as a whistle on this front. (Samsung Score 10/10)

Similarly, Apple.com experienced no failure events and achieved 100% uptime. While I’d expect nothing less from a juggernaut like Apple, it’s still impressive when you consider other retailers that experience frequent website issues. (Apple Score 10/10)

 

Alertbot Uptime green circle performance chart

 

Speed

When we evaluate a website’s speed, we’re looking at the time it takes the site’s homepage to render and load to the point of being fully interactive. We run these tests inside real Firefox web browsers using AlertBot’s TrueBrowser™ monitoring.

Looking specifically at speed, Samsung.com’s fastest day was Friday, Sept. 2nd, and its slowest was Saturday, Sept. 3rd. On average, the site’s homepage took around 1.7 seconds to load. That’s not bad at all! Some recent studies have shown the expected load time for ecommerce sites to be 2 seconds or less, so Samsung definitely fits the bill here. Some online studies have also estimated that if an ecommerce site is making $100,000 per day in sales, just a 1-second page delay could potentially cost the company $2.5 million in lost sales per year. (Ouch!) On its slowest day (Sept. 3rd), Samsung.com saw load times of over 7 seconds at points during the day. (Samsung Score 9/10)
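That $2.5 million figure is easy to sanity-check. The studies usually cited behind it assume a 1-second delay trims conversions by roughly 7 percent; treating that rate as an assumption (it is not AlertBot’s own measurement), the arithmetic works out:

```python
daily_revenue = 100_000            # $ per day, as in the example above
conversion_loss_per_second = 0.07  # assumed ~7% revenue loss per 1-second delay

annual_loss = daily_revenue * 365 * conversion_loss_per_second
print(f"${annual_loss:,.0f} per year")  # prints $2,555,000 per year
```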

While evaluating Apple.com’s speed, its fastest day was also a Friday, Sept. 9th, with its slowest day being Friday, Sept. 2nd (coincidentally, the same day Samsung experienced its fastest load time), when the site took 10 seconds to load at times (due to a slow page file error). On average, however, the site’s homepage took around 1.3 seconds to load. That’s a hair faster than Samsung’s, but the two are close. (Apple Score 9/10)

One major mistake a lot of websites make is using large graphic files or third-party code on their home pages, things that can really bog down a website’s speed. It’s not surprising that both Apple and Samsung avoid this mistake. While both display large, beautiful images on their front pages, they optimize their file sizes well.

 

Alertbot speed test green performance bar chart


Geographic

When we looked at Samsung.com’s performance from various locations around the United States, we found that the site consistently took longer to load in Texas, with its slowest time occurring in Washington, DC, and was fastest in Florida, North Carolina and Georgia. Samsung.com had just a handful of minor hiccups during this three-week period, and only at specific locations: AlertBot registered 5 instances of slower load times, once in New York, twice in Florida, once in Washington, DC and once in Washington state. Still, it performed more than adequately at these locations overall. It wouldn’t be uncommon for a website to experience significant trouble in certain areas of the country on a regular basis, but we expect only the best from Samsung. (Samsung Score 9/10)

When we looked at Apple.com’s performance from various locations around the U.S., we found that the site consistently took the longest to load in Utah and Texas, and was fastest in Florida and North Carolina. It’s intriguing that Florida and North Carolina saw the best load times for both websites, while Texas was among the slowest for both. AlertBot did catch two instances of slower load times and a slow JavaScript file in Illinois, but neither problem caused the site to go completely down. (Apple Score 9/10)

Alertbot performance by region green bar chart

 

Usability

For usability, we select a common task that a typical user might want to perform on sites like these. Then, using hands-on testing, we perform the same task on each website while timing how long it takes to complete and how many mouse clicks it takes to get the job done.  This time, we decided to approach each site with the intention of purchasing their latest phone.  We timed how long it would take from the point of entering the URL into the browser on through to getting the phone into the online shopping cart.

From the point of typing in “Apple.com” and clicking through their site from the phone product pages all the way to the shopping cart, it took 45 seconds (and 7 clicks of the mouse) for us to add a SIM-free 256GB “jet black” iPhone 7 to the online “shopping bag.” (There’s an additional click, however, to view the cart when you’re done adding the phone to it.)

From typing “Samsung.com” into our browser and clicking through to add a Samsung Galaxy S7 Edge 32GB “unlocked” phone to our shopping cart and viewing the virtual bag, it took a shocking 1 minute and 30 seconds (in 5 mouse clicks)! We used Google Chrome as our browser for both websites, and the Samsung site froze up twice during the process (in fact, we accidentally added TWO of the same phone to our cart because we were trying to click through to the cart while it was unresponsive). To be fair, we tried again, and it hung up yet again during the ordering process, though this time it took a little under a minute to reach the shopping bag. All of this happened on Chrome’s latest version, too.

We know web browsers can be super fickle, though, so we decided to try a third time, this time with Mozilla Firefox, and it took 20 seconds to get the same phone into the shopping cart. On Apple’s site, there are a lot more choices for the iPhone, from storage space to phone color, so it makes sense that the process might take longer there. But it is rather alarming that Samsung’s site experienced THAT much trouble while we were just trying to add its phone to the shopping cart.

To compare via Firefox, then, we re-ran the timed test for Apple.com. Granted, a re-test benefits from newfound familiarity with the site, but it took 25 seconds to add the same iPhone 7 to the shopping cart. While that’s a few seconds slower than Samsung, we also didn’t experience any problems with Apple’s site in either browser.

All things considered, here are the Usability scores:

(Samsung Score 7/10)          (Apple Score 9/10)

 

Final Verdict

The performance of both sites was very, very good and quite close. Apple’s site just barely edged out Samsung’s on speed and geographic performance, while the two matched each other on reliability. Despite their slight differences, both performed at the top of their game in online performance. However, after factoring in our usability testing, where Apple’s site performed much more consistently, the winner of the very first AlertBot Showdown is clear:

WINNER:

Graphic rendering of a robot with a triangular head and circle eye hovering above the ground and holding up a sign that reads "Apple"

]]>
AlertBot Now SAM Certified for Government Website Monitoring https://www.alertbot.com/blog/index.php/2016/09/22/alertbot-now-sam-certified-for-government-website-monitoring/ Thu, 22 Sep 2016 17:21:16 +0000 https://alertbot.wordpress.com/?p=255 AlertBot Now SAM Certified for Government Website Monitoring

Allentown, PA / September 21, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading TrueBrowser®-based web application monitoring solution, AlertBot, is pleased to announce that they are now SAM certified and are looking to grow their relationships with Federal and State Governments. TrueBrowser® technology combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions. Downtime of any length can be costly for any website or online business; AlertBot’s Website Monitoring Service uses TrueBrowser® technology to launch real web browsers and test websites inside those browsers, including financial transactions conducted on government websites, login pages and other mission-critical pages. Learn More about Trusted Government Website Monitoring.

“With 85 Global Test Locations operating over 7 Internet Backbones developed during the past decade, AlertBot has established their reputation in real-world private industry applications; this level of website testing and monitoring is both proven and ready for Public Service Deployment as Federal and State Agencies rollout ‘Next Gen’ consumer style, interactive websites,” states Pedro Pequeno, President of InfoGenius.com, Inc. He continues: “We’re looking forward to showcasing AlertBot’s TrueBrowser® technology and capabilities to Governmental Agencies throughout the country and help them validate their client usage.”

AlertBot serves over 10,000 users spanning 6 continents worldwide with 200 million website checks per month. Their Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine. This data gives government organizations, including the U.S. Department of Energy, Virginia state government, NOAA, U.S. Marine Corps, and Smithsonian Institution, the information they need to ensure their applications are always running error-free and providing a quality user experience. AlertBot has registered to do business with Federal and State Agencies using the following registrations: DUNS: 624818493; CAGE: 6QP16; NAICS: 518210, 454111, & 334290.

About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.

About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.

###

]]>
IRCE 2016 Conference Recap https://www.alertbot.com/blog/index.php/2016/06/17/irce-2016-conference-recap/ Fri, 17 Jun 2016 18:32:48 +0000 https://alertbot.wordpress.com/?p=250 A few of the AlertBot team recently returned from IRCE (Internet Retailer Conference & Exhibition) at McCormick Place in Chicago, IL. IRCE is the premier conference for ecommerce players and resources, and it’s really a great place to connect with familiar brands while learning about new ones.

We enjoyed visiting Chicago, too – for some of us it was our first time there – and took advantage of riding the “El” around town and visiting the 360 Chicago Observation Deck as well as some of the many fine eateries found there.

A photograph of a city from the window of a skyscraper. Photo by Adam Akarsoy

We had our booth set up in the exhibit hall among hundreds of other exhibitors, including recognizable names like eBay, FedEx, Shopify, UPS, USPS and many more. Many of the booths had great swag to give away. Our favorites included a gift box from Aramex, which included power banks shaped like freight trucks; Blue Acorn with their squirrel-shaped stress toys; Artifi Labs, who were giving away ice cream scoops and free ice cream sandwiches; and Classy Llama with their soft plush llamas and superhero mask and cape sets. Some of the bigger brands had some cool handouts too, like eBay with their USB reading lights and mini journals, PayPal with free sunglasses, or UPS’s cell phone rests and sanitizer spray markers.

Conference attendees who visited the AlertBot booth had the opportunity to meet some of our great staff and talk with us about what AlertBot could do to monitor their ecommerce platform. Each of our booth visitors got to spin our prize wheel for an opportunity to win brand new AlertBot water bottles, AlertBot playing cards, or grand prizes like Back To The Future Flux Capacitor USB car chargers and remote control helicopters. Last, but certainly not least, attendees could use their phones to scan a QR code (or visit win.alertbot.com) to enter an even bigger giveaway to win a DJI Phantom 3 Standard quadcopter drone!

Two smiling pretty young girls holding drones and posing next to a contest prize wheel.

Events like these are great because they allow us all to venture out from behind our computer screens to meet our customers and prospective customers face-to-face and connect on a more personal level.

If you weren’t able to attend IRCE this year, that doesn’t mean we can’t still talk! Shoot us an email. We’d love to chat with you and tell you why we think AlertBot is right for you – and why we know you’re going to love it!

]]>
AlertBot to Exhibit at Internet Retailer Conference & Exhibition (IRCE) https://www.alertbot.com/blog/index.php/2016/05/24/alertbot-to-exhibit-at-internet-retailer-conference-exhibition-irce/ Tue, 24 May 2016 18:02:37 +0000 https://alertbot.wordpress.com/?p=248 AlertBot to Exhibit at Internet Retailer Conference & Exhibition (IRCE)

Allentown, PA / May 24, 2016 / PR Newswire … InfoGenius.com, Inc., a software company and developer of the leading real-time web application monitoring solution, AlertBot, is pleased to announce that they will exhibit at the Internet Retailer Conference & Exhibition (IRCE) 2016 in Booth # 841. The conference will take place June 7-10, 2016 at McCormick Place West in Chicago, IL. At IRCE, AlertBot will be demonstrating its TrueBrowser® Web Application Monitoring solution. TrueBrowser® technology combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions. Downtime of any length can be costly for any website or online retailer; AlertBot’s Website Monitoring Service uses TrueBrowser® technology to launch real web browsers and test websites inside those browsers, including mission-critical financial transactions conducted on e-commerce-driven websites, login pages and other mission-critical pages.

“We’re looking forward to showcasing AlertBot’s TrueBrowser® technology and capabilities at the Retail Industry’s Leading E-Commerce Conference and Tradeshow (IRCE),” states Pedro Pequeno, President of InfoGenius.com, Inc. He continues: “Over the past 10 years, AlertBot has been deployed and proven in countless real-world applications by some of the leading names in the e-commerce space, and this gives us another opportunity to demonstrate our advanced technology.”

AlertBot serves over 10,000 users with 200 million website checks per month using its network of over 100 locations, spanning 6 continents worldwide. Their Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine. This data gives businesses including Blue Cross/Blue Shield, Chrysler, Mutual of Omaha, Sony, Microsoft & Dell Computing the information they need to ensure their applications are always running error-free and providing a quality user experience.

About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.

About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.

###

]]>
AlertBot Celebrates 10th Year of Website Monitoring https://www.alertbot.com/blog/index.php/2016/04/11/alertbot-celebrates-10th-year-of-website-monitoring/ Mon, 11 Apr 2016 10:50:35 +0000 https://alertbot.wordpress.com/?p=185 AlertBot Logo

Allentown, PA / April 11, 2016 / PR Newswire
InfoGenius.com, Inc., a software company and developer of the leading real-time web application monitoring solution, AlertBot, celebrates a decade of website and server monitoring. Downtime of any length can be costly for any website or online retailer; AlertBot’s Website Monitoring Service provides best-in-class site monitoring using its TrueBrowser® technology to launch real web browsers and test websites inside those browsers, including mission-critical financial transactions conducted on e-commerce-driven websites, login pages and other mission-critical pages. AlertBot serves over 10,000 users with 200 million website checks per month using its network of over 100 locations, spanning 6 continents worldwide.

“AlertBot measures every facet of a website to help our clients improve the user experience; our testing helps clients make adjustments that result in measurable gains – for instance, a major e-commerce player measured gains of $1.4 million for every second of response time their platform improved – that small improvement netted them $18 million in revenue!” states Pedro Pequeno, President of InfoGenius.com, Inc. He continues: “Over the past 10 years, AlertBot has been deployed and proven in countless real-world applications by some of the leading names in the e-commerce space.”

AlertBot’s Synthetic Monitoring is designed to detect all possible application errors and collect important performance metrics as part of its monitoring routine. This data gives businesses including Blue Cross/Blue Shield, Chrysler, Mutual of Omaha, Sony, Microsoft & Dell Computing the information they need to ensure their applications are always running error-free and providing a quality user experience.

An illustration showing a robot with a party hat and holding a birthday cake. Text reads "AlertBot Celebrates 10 Years"

About AlertBot:
Since launching in 2006, AlertBot has provided industry-leading TrueBrowser® web application monitoring. Thousands of companies trust AlertBot to continuously monitor their mission critical websites for errors and performance issues that affect user experience. Visit www.AlertBot.com for more information.

About InfoGenius.com, Inc.:
Founded in 1999 by a group of engineers, InfoGenius prides itself in building and delivering quality enterprise-class services that help businesses, both small and large, realize their greatest potential online. InfoGenius conducts its business through its network of independently branded services including AlertBot, ELayer and UptimeSafe. Visit www.infogenius.com for more information.

]]>
Don’t Let Third-Party Code Wreck Your Website! https://www.alertbot.com/blog/index.php/2016/03/31/dont-let-third-party-code-wreck-your-website/ Thu, 31 Mar 2016 17:44:44 +0000 https://alertbot.wordpress.com/?p=182 When you’re evaluating a website’s performance, you may find that several culprits could come into play that can bog down your website’s load time. Today, we’re going to take a look at one of the biggest – if not the biggest – causes of web performance problems: third party code.

If you’re not quite sure what that is, third party code is usually any code provided by another company or website to plug in / embed a service on your website. For example, you may have a web stats tracking code, a banner ad rotator, or a couple lines of code that drops your Twitter or Instagram feed onto your website. These pieces of code are considered third party code since they’re provided by another source.


Some of the problems that this kind of code can cause may be:

  • Slow page load times
  • SSL errors (there could be a non-secure component in the code)
  • Unexpected JavaScript errors of various kinds
  • Failure to load some of your website content
  • Inaccurate stats tracking

The case of inaccurate stats tracking is a particularly interesting one that most people don’t consider. Problems with third-party code could render your website’s stats unreliable if the stats code is not fully loading. When this happens, you may only be getting partial information about your visitors, or no information at all. If you make business decisions based on those stats, you may be making the wrong decisions based on misinformation.

In the case of third-party code causing slow page load times or loading errors, it affects your visitors’ experiences on your website. Unhappy visitors may choose not to buy from you and oftentimes won’t ever return to your website.

So what can you do in this situation? First off, you’ll want to diagnose the problem to make sure it is indeed the third-party code causing the problems. AlertBot is an excellent service to use for finding out what is causing a bottleneck in your load time.
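As a rough sketch of this kind of diagnosis (not AlertBot’s actual implementation; the entry format and function name here are hypothetical), the idea is simple: walk a waterfall-style list of page resources and flag anything served from a domain other than your own, slowest first.

```python
from urllib.parse import urlparse

def flag_third_party(entries, base_domain):
    """Flag waterfall entries served from outside base_domain, slowest first.

    entries: list of (url, load_ms) tuples, like rows in a waterfall chart.
    base_domain: your site's registrable domain, e.g. "example.com".
    (Simplified: a hostname merely ending in base_domain counts as
    first-party, which is close enough for a quick diagnosis.)
    """
    third_party = [
        (url, load_ms)
        for url, load_ms in entries
        if not (urlparse(url).hostname or "").endswith(base_domain)
    ]
    # Sort the worst offenders to the top so you know where to start.
    return sorted(third_party, key=lambda entry: entry[1], reverse=True)

waterfall = [
    ("https://www.example.com/index.html", 120),
    ("https://cdn.example.com/hero.jpg", 300),
    ("https://stats.trackerhost.io/t.js", 900),        # stats tracking snippet
    ("https://widgets.socialfeed.net/feed.js", 450),   # embedded social feed
]
print(flag_third_party(waterfall, "example.com"))
```

Here the tracking snippet and social-feed embed would surface immediately as the resources worth investigating (or removing) first.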

Once you know for sure that the third-party code is creating the issue, here are a few things you can do to resolve it:

  1. Ask the third-party provider to resolve the problem – The solution may be as simple as contacting the third party, informing them of the issue(s) you’re having and asking them to fix it. It’s possible that they’re not even aware there’s a problem.
  2. Remove the third-party code altogether – This may be the quickest and easiest solution, but obviously it doesn’t solve the problem if you really need the code on your site.
  3. Look for other third-party code providers – This may be your best course of action. While it can be time-consuming to search for viable solutions, if you need the code, trying something else out could be the most sensible option. And if you can find reviews on the solution from other users who have tried it out, that’s even better.
  4. Move to a purchasable / installable application – Free third-party code is great, and just dropping in a piece of third-party code is a nice time saver, but sometimes taking the high road and paying for an installable solution (with support) could be the best option for your business, especially when your own customers or clients are involved.
  5. Ask a web developer to look at it – This might not be possible for every site owner, but it’s an especially good option if your company has a programming department. There’s a good chance that just moving the code to a different location in your page’s HTML (or onto a different page altogether) could drastically improve the situation.

So, as you can see, third-party code can greatly impact your website. And if you’re experiencing some web performance issues and you’re utilizing third-party code, there’s a pretty good chance that code may be the catalyst for those problems.

Sign up for a risk-free trial of AlertBot today and start down the path to better performance for your website. AlertBot tracks the performance of all your third-party code and lets you know when it’s causing problems.

]]>
Lessons from the Dentist for Your Website’s Performance https://www.alertbot.com/blog/index.php/2016/02/25/lessons-from-the-dentist-for-your-websites-performance/ Thu, 25 Feb 2016 14:10:47 +0000 https://alertbot.wordpress.com/?p=173 Every successful website needs to undergo tests and retests (and re-retests) to ensure that the site performs well for all visitors. It’s one thing to test how your site looks in different web browsers, but what about how it actually performs?

Running an AlertBot waterfall chart is just one example of something IT and website managers can do to see how the site is performing with load times. Before you release your site’s new design, complete overhaul or its grand debut, it’s wise to give your site a thorough testing first. A simple test with AlertBot’s waterfall chart can reveal images or third party code that might be clogging up your load time — or many other possible hang-ups. Your site might look fine and dandy, but if the page is taking too long to load in this fickle web-browsing age we’re living in, it could be very costly for your business.

(Above is a real, abbreviated example of AlertBot’s waterfall charts)

In the end, it’s really like a visit to the dentist; they often give you tips and guides on how to prevent cavities and other oral problems, while helping you maintain good oral hygiene. Maybe using mouthwash or flossing daily will help keep your gums healthy and your teeth strong. Likewise, with the right web performance tools and tests, you can ensure quality conversions and hopefully prevent any possible decay in your site’s performance.

See for yourself with AlertBot’s completely free 14-day trial!

]]>
AlertBot To Exhibit At IRCE 2016 in Chicago, IL https://www.alertbot.com/blog/index.php/2016/01/18/alertbot-to-exhibit-at-irce-2016-in-chicago-il/ Mon, 18 Jan 2016 20:26:25 +0000 https://alertbot.wordpress.com/?p=165
We are proud to announce that AlertBot will exhibit at Internet Retailer Conference & Exhibition (IRCE) 2016 in Chicago, IL. The conference will take place June 7-10, 2016 at McCormick Place West, Chicago.

What: Internet Retailer Conference & Exhibition (IRCE) is the flagship event for the e-commerce industry. Learn the latest trends in the industry from experts who are implementing the latest technologies and solutions. IRCE 2016 will take place in the world-class city of Chicago, IL.

When: June 7-10, 2016

Where: McCormick Place West, Chicago, IL

At IRCE, AlertBot will be demonstrating its TrueBrowser® Web Application Monitoring solution. TrueBrowser® technology combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions.

Planning to Attend?

If you’re planning on attending, make sure to stop by the AlertBot booth to meet our team, see a demo and get some cool AlertBot swag. If you would like to sit down for a one-on-one demo during the conference, please don’t hesitate to email us ahead of time. Oh, and mention this newsletter announcement when you meet us at our booth and we’ll give you an extra spin on our prize wheel for a chance to win a second prize!

]]>
2015 Holiday Ecommerce Website Performance Report Update https://www.alertbot.com/blog/index.php/2015/12/21/2015-holiday-ecommerce-website-performance-report-update/ Mon, 21 Dec 2015 20:47:33 +0000 https://alertbot.wordpress.com/?p=159 Illustration with a tree with snow on the ground and snow falling. Text reads "Holiday Shopping 2015 Web Performance Report". Lettering is cut out showing a shopping mall in the background.

AlertBot recently featured a 2015 Black Friday ecommerce web performance report, offering an inside look into online retail website performance over Black Friday weekend this year.

The report examined several sites that had experienced minor to substantial downtime over the course of Friday through Sunday, including NeimanMarcus.com and Staples.com. Our report, however, did not take into account Cyber Monday or the sales-heavy weeks that have followed.

On Cyber Monday, Neiman Marcus seemed to right the ship, while a few sites, including NewEgg.com and Staples, appeared to experience failures of no more than about 10 minutes, and Gap’s site registered 19 minutes of being unresponsive.

Throughout this month of December, Staples has continued to struggle with short outages; Walmart has experienced some brief HTTP errors; and Footlocker, PC Mall, Crate & Barrel and Peapod have each experienced a couple of few-minute hiccups (which in some cases could simply be sluggish load times).

With online sales outperforming brick and mortar sales this Black Friday for the first time, it’s imperative now more than ever to make sure your site stays up and working, error-free. Sign up for a free 14-day trial with AlertBot.com today and start working towards a more successful ecommerce site!

]]>
Black Friday 2015 Web Performance Report https://www.alertbot.com/blog/index.php/2015/11/30/black-friday-2015-web-performance-report/ Mon, 30 Nov 2015 21:39:02 +0000 https://alertbot.wordpress.com/?p=150 Although today–Cyber Monday–is a glorified online extension of the annual brick and mortar post-Turkey Day national shopping binge, let’s take a look at how some of the top online retailers performed over the holiday weekend.

While websites like Walmart, Fanatics and QVC experienced a couple of several-minute outages on Thanksgiving Day, one of the sites that seemed to struggle the most on Black Friday this year was the online destination for department store Neiman Marcus. The site even experienced two hours of downtime in the morning.

Black and white graphic of a twisted, bent shopping cart in white on a black background. Text reads "Black Friday 2015"

Second only to NeimanMarcus.com, however, was online tech retailer Newegg.com, which experienced some slow page load times, no doubt due to the heightened traffic. Finally, Staples.com also experienced some short outages, but nothing more than a few minutes each.

Through Saturday and Sunday, it was much the same with Neiman Marcus, Staples and Newegg, with Walmart seeing a few hiccups and Shutterfly.com experiencing a 45-minute outage due to the server being overloaded with traffic. Sony’s PlayStation Network also experienced some significant downtime on Saturday, which affected their online store as well.

Downtime of any length can be costly for any online retailer. According to this article by Evolven.com, “The average cost of data center downtime across industries was $5,600 per minute.” Clearly, that would add up real quick – especially on a major shopping day.
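Using that per-minute average, the math is easy to sketch. This is a back-of-the-envelope estimate only; real costs vary enormously by business:

```python
COST_PER_MINUTE = 5_600  # Evolven's cross-industry average, in USD

def downtime_cost(minutes, cost_per_minute=COST_PER_MINUTE):
    """Rough downtime cost at a flat per-minute rate."""
    return minutes * cost_per_minute

# Shutterfly's 45-minute weekend outage, at the average rate:
print(downtime_cost(45))  # → 252000
```

By the same rough math, Neiman Marcus’s two-hour Friday outage would come to over $670,000.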

With AlertBot’s monitoring services, not only can you be alerted the moment your site experiences an outage or slow load times, but you’ll be able to use the AlertBot charts and reports to find potential hang-ups and future problems that will result in unnecessary downtime.

Give AlertBot a try with our totally free trial period and start seeing how AlertBot can look out for your business to help you prevent serious financial loss and online disasters.

]]>
Get Your Website Ready For Holiday Traffic https://www.alertbot.com/blog/index.php/2015/09/17/get-your-website-ready-for-holiday-traffic/ Thu, 17 Sep 2015 16:52:36 +0000 https://alertbot.wordpress.com/?p=140 A graphic showing a computer monitor with a cracked screen with fragments flying around. Text reads "Black Friday"

Get Your Website Ready For Holiday Traffic

It’s that time of year again. As we say farewell to summer and prepare for the coming of autumn next week, online retailers are faced with one harsh reality: Black Friday is a mere two months away. And while that may seem like a long way off to some, now is really the time for preparation. And just like any brick and mortar retailer needs to have their store ready to go with employees on hand to wrangle the shopping masses, site owners need to make sure their websites are tuned up and ready for an influx of traffic.

If you’re feeling pretty confident that you’re ready, and this warning seems premature or unnecessary altogether, let’s take a moment to spotlight last year’s Black Friday festivities and pitfalls.

The biggest name to have experienced major website failures last November was electronics retail chain Best Buy. Issues were recorded and reported throughout the day on Black Friday, sending social media abuzz with chatter and complaints about the site’s performance—or lack thereof.

Best Buy error page with an illustration of a wreath with a bow

Best Buy wasn’t the only one affected, however. Computer company HP’s webstore also experienced failure, while in the UK, online stores Currys (electronics), Argos (department store) and Tesco (groceries) all went down as well.

So what can we glean from this?

If you’re an online retailer, you’re probably already thinking about the holidays and getting prepared, but now is the most crucial time not only to make sure you have reliable website monitoring, but also to evaluate your website’s performance so you can make improvements before the big online sale days. And you’re in luck – AlertBot can assist with your performance evaluation and help you rest assured that your site will perform better in time for the holidays. Try it out for free with our 14-day trial.

]]>
Use AlertBot To Monitor The Competition https://www.alertbot.com/blog/index.php/2015/08/25/use-alertbot-to-monitor-the-competition/ Tue, 25 Aug 2015 18:23:45 +0000 https://alertbot.wordpress.com/?p=137

Use AlertBot To Monitor The Competition

When most of us think of “website monitoring,” we usually think about how it applies to our own websites. However, website monitoring really has more uses than we may realize or consider.

Truth be told, while we use AlertBot to keep an eye on our own websites and pinpoint problems that need fixing, we can actually set up monitors for any site—not just our own. This means we can monitor the competition as well.

The upside to monitoring the competition is that you can get an idea of how a competing website might be performing from around the world, and gauge whether your website is competing as well in those areas. Furthermore, you can see how long their page load times are and find out what features on their website may be slowing them down. It could help you figure out what to avoid in your own design or focus on what to do better in your market, for example.

Photograph of rooftop spyglass

You can test-drive this concept with our risk-free 14-day trial. Try it out today and start gathering actionable data on your website – and your competition’s!

]]>
Are You Testing Your Site’s Performance With Different Browsers? https://www.alertbot.com/blog/index.php/2015/07/24/are-you-testing-your-sites-performance-with-different-browsers/ Fri, 24 Jul 2015 17:45:20 +0000 https://alertbot.wordpress.com/?p=128 Are You Testing Your Site’s Performance With Different Browsers?

Web developers know browser compatibility can be a real headache; however, it doesn’t just affect web developers. Recently, one AlertBot customer received an alert that their site had failed. When investigating the failure, they found that their site was not completely down; rather, AlertBot had discovered that it had stopped working in just one browser. Their website was working fine with Chrome, Internet Explorer (IE), Safari, etc., but had stopped loading with Firefox. Thanks to AlertBot’s TrueBrowser™ Monitoring options, which allowed them to test their website in multiple browsers, they were able to quickly identify and fix the problem with that one browser.

For web developers, it’s easy to simply open your site in each of the popular web browsers to check it for compatibility, find that it’s working smoothly, and then never follow up on it again. However, websites, servers and backend resources change often. AlertBot’s TrueBrowser™ Monitors can be set up to check your site regularly with each of the popular web browsers and make sure nothing has changed. So, for example, with AlertBot, you can set up a Test Scenario to check your website with Chrome, another one to check it with Firefox, then another with IE, etc. This way, you’ll know the very instant your site stops functioning within one of these popular browsers.
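The one-scenario-per-browser idea can be sketched like this (a simplified stand-in, not AlertBot’s API; in a real setup each callable would drive an actual browser with a tool like Selenium, while here they are stubs):

```python
def run_browser_checks(checks):
    """Run one check per browser and return the names that failed.

    checks: dict mapping a browser name to a zero-argument callable
    returning True when the page loaded correctly in that browser.
    """
    return sorted(name for name, check in checks.items() if not check())

# Simulate the incident above: the site works everywhere except Firefox.
failures = run_browser_checks({
    "Chrome": lambda: True,
    "Firefox": lambda: False,
    "IE": lambda: True,
    "Safari": lambda: True,
})
print(failures)  # → ['Firefox']
```

A scheduler running this on an interval, alerting whenever the failure list is non-empty, captures the gist of a per-browser monitoring scenario.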

Browser logos circled around the AlertBot logo

It’s also just a super easy way to not have to worry about browser compatibility as often. Think about it; these days, web browsers are constantly auto-updating to new versions and webmasters are constantly updating their websites. It’s a lot to keep up with–testing your site’s performance with each browser every time this happens–so having something as simple as an automatic browser monitor frequently testing your site’s reliability is one less worry for website owners.

Take the AlertBot TrueBrowser™ Monitor for a spin with a completely free trial and let us start watching your back for you!

]]>
Velocity Conference 2015 Recap https://www.alertbot.com/blog/index.php/2015/06/04/velocity-conference-2015-recap/ Thu, 04 Jun 2015 23:17:50 +0000 https://alertbot.wordpress.com/?p=120 Three guys standing in a booth at a convention

Velocity Conference 2015 Recap

The AlertBot team just returned from last week’s Velocity Conference event in Santa Clara, California. We had a great time meeting a lot of people who share our affinity for web performance. And, despite some air turbulence during the trek that rendered more than one of us uncomfortably queasy, we enjoyed the trip from the humid weather in Eastern Pennsylvania to the crisp breezy air of California.

As a VelocityCon sponsor, we had a booth set up in the exhibit hall, which allowed Velocity-goers to peruse various tables showcasing unique and recognizable products and brands (even Netflix and Amazon were there?!) and pick up some fun swag along the way. For example, Target had these awesome little plush versions of their canine mascot to give away (which a couple of us snatched up for our little Bots back home), HP had silicone cell phone speaker amplifiers, JFrog had foam frogs and “Batfrog” superhero spoof tees, Verizon offered a pair of ping pong balls, and our booth neighbors xMatters gave away old-school handset phone receivers you can plug into your cell phone. So, yeah, there were quite a few fun things you could snag from any given booth.

Photograph of a populated showroom floor

If you visited us at the AlertBot booth, you had the opportunity to listen to us give a little talk on AlertBot’s monitoring services and then take a spin of our prize wheel (carnival style!). Many attendees walked away with a cool new remote control helicopter, while others got to grab AlertBot swag like travel mugs, highlighters or Frisbees. We even had a drawing to win a brand new Apple Watch, which we announced on the last day of the conference. The lucky winner even got to take it home that day, too (congratulations to Craig T. from Constant Contact!).

Three guys standing in a booth at a convention

Events like these are great because they allow us all to step out from behind the comfort of a desk chair and computer screen to meet our customers in person and discuss our projects face-to-face. Velocity was a nice opportunity for this.

But hey, if you weren’t at Velocity, that doesn’t mean we can’t meet! Shoot us an email. We’d love to talk to you and tell you why we think AlertBot is right for you – and why we know you’re gonna love it!

]]>
Using Monitoring Reports To Improve Website Performance https://www.alertbot.com/blog/index.php/2015/04/07/using-monitoring-reports-to-improve-website-performance/ Tue, 07 Apr 2015 20:05:50 +0000 https://alertbot.wordpress.com/?p=116 When a web user or designer thinks of website monitoring, it’s easy to just think of the basic “up” and “down” aspects of your site’s performance. Something interrupts your server – be it a coding misstep, hosting downtime, server misconfiguration, etc – and website monitoring can let you or someone on your team know via text, email, or phone call so you can fix it.

But website monitoring can do so much more. AlertBot’s monitoring service collects all kinds of data about your website that is invaluable to any website owner. For example, each time AlertBot tests your website, it analyzes the load time and performance of every piece of your page and will generate a detailed assessment of how long each component takes to load. This helps you identify potential problem areas for your website’s loading time, including every component’s size, transfer speed, load time and more. With this kind of data, you can pinpoint exactly which areas need improvement.
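To picture what you can do with that component-level data, here’s a tiny sketch (the record shape and function name are hypothetical, not AlertBot’s export format) that ranks components the way a waterfall chart highlights them:

```python
def slowest_components(components, top_n=3):
    """Rank page components by load time, worst first.

    components: list of dicts with 'name', 'bytes' and 'load_ms' keys.
    """
    ranked = sorted(components, key=lambda c: c["load_ms"], reverse=True)
    return [(c["name"], c["load_ms"]) for c in ranked[:top_n]]

page = [
    {"name": "index.html", "bytes": 14_000, "load_ms": 180},
    {"name": "hero-banner.jpg", "bytes": 2_400_000, "load_ms": 2_100},
    {"name": "stats.js", "bytes": 45_000, "load_ms": 950},
    {"name": "styles.css", "bytes": 60_000, "load_ms": 220},
]
print(slowest_components(page, top_n=2))
```

In this made-up example, the oversized hero image and the stats snippet jump straight to the top of the fix-it list.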

Example AlertBot waterfall chart

For instance, some site owners don’t realize how their graphically-intensive websites might be causing serious load delays for their users – perhaps only in a specific region or country in the world. Worse yet, if you’re using a lot of third-party code or off-site image hosting on your page, you might not be aware of how it’s affecting your site’s visitors in different parts of the globe.

So website monitoring can do a lot more for you and your business than you might realize. Give AlertBot’s risk-free 14-day trial a chance and start learning how to increase your website’s potential right away.

]]>
Protection Against Website Monitoring False Alerts https://www.alertbot.com/blog/index.php/2015/03/20/protection-against-false-alerts/ Fri, 20 Mar 2015 17:48:13 +0000 https://alertbot.wordpress.com/?p=110 Protection Against Website Monitoring False Alerts

For website owners, uptime is about as crucial as making sure the front door of a 24-hour shop isn’t locked. We need visitors and customers to be able to reach us at all times. AlertBot’s service can ensure that uptime is consistent and reliable. Of all its features, AlertBot’s alerting process is what ultimately gives us website owners peace of mind.

A robot pointing to a palm pilot with icons and a bar graph coming out of it.

AlertBot’s alerting system differs from most in the way that it works hard to avoid false alarms. No one likes getting an alert that their site is down when it really isn’t, and AlertBot combats this by testing your site’s availability from more than one location before sending you that digital elbow nudge about your site’s downtime. For example, if a test server in New York responds that your site is down (or producing an error) at the moment, it’ll test it from another location—say, California—within 60 seconds. It’s only after the failure is verified from this second location that it will deem the error legitimate and begin alerting.  You won’t just be getting an alert based on a brief outage in one isolated location.

The alerting process is versatile as well. You can be alerted via email, text message, or automated phone call, or through any combination of these options. With SMS text messaging, for example, AlertBot sends a message to your phone specifying what went down, so you can take whatever steps are needed (depending on the cause of the error) to get things running smoothly again. AlertBot continues testing your site's availability until it's back up, then notifies you via text once more, along with how long your site (or the specified page or portion of your site) was inaccessible, whether that was minutes or hours. It's a great way to stay aware of your website's performance day or night, and to pinpoint problem areas of your site so you know what to fix or improve.
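The keep-testing-until-recovery behavior described above could be modeled like this. Again, this is a hypothetical sketch rather than AlertBot's real code, with `check_site` standing in for an actual HTTP probe:

```python
import time

def watch_until_recovery(check_site, poll_interval=60):
    """Poll a site that is currently down and return the outage
    duration in seconds once it responds again."""
    down_since = time.time()
    while not check_site():
        time.sleep(poll_interval)   # keep re-testing on an interval
    outage = time.time() - down_since
    # A recovery notification would report this duration,
    # e.g. "back up after 12 minutes".
    return outage
```

The returned duration is what a recovery notification would display, converted to minutes or hours as appropriate.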

For more information about AlertBot’s alerting services and features, click here.

]]>
AlertBot to Sponsor O’Reilly Velocity Conference in Santa Clara, California https://www.alertbot.com/blog/index.php/2015/03/13/alertbot-to-sponsor-oreilly-velocity-conference-in-santa-clara-california/ Fri, 13 Mar 2015 22:07:40 +0000 https://alertbot.wordpress.com/?p=105 Photo from the outside of Santa Clara Convention Center in Santa Clara, California at night

We’re proud to announce that AlertBot is a Silver Sponsor and will exhibit at O’Reilly’s Velocity Conference in Santa Clara, CA. The conference will take place May 27-29th, 2015 at the Santa Clara Convention Center.

What: O'Reilly Velocity Web Performance and Operations Conference. O'Reilly hosts four Velocity Conferences around the world, but the Santa Clara conference is the largest, with an expected attendance of 3,000.

When: May 27-29, 2015

Where: Santa Clara Convention Center, Santa Clara, California

At Velocity, AlertBot will be demonstrating its TrueBrowser® Web Application Monitoring solution. TrueBrowser® technology combines advanced performance tracking and error detection with real web browser testing to provide customers with best-in-class website monitoring solutions.

Planning to Attend?
If you're planning to attend, we have a discount code you can use at registration for 25% off your conference passes. Make sure to stop by AlertBot booth #815 to see a demo and get some cool AlertBot swag. If you'd like to sit down for a one-on-one demo during the conference, fill out our form to reserve a time.

Photo from the outside of Santa Clara Convention Center in Santa Clara, California around a pool

]]>
What’s Google Up To With Recently Spotted “Slow” Icon? https://www.alertbot.com/blog/index.php/2015/02/27/whats-google-up-to-with-recently-spotted-slow-icon/ Fri, 27 Feb 2015 22:50:22 +0000 https://alertbot.wordpress.com/?p=99 Earlier this week, on Tuesday, Google+ user K Neeraj Kayastha discovered a new technique Google's search engine may be getting ready to roll out abroad: warning mobile users about potentially sluggish links before they click.

Neeraj posted screenshots from his personal Android browser showing a new red "SLOW" icon displayed next to links for YouTube and even a Google search result (scholar.google.co.in, to be exact). Today, we tried to replicate the same result on an iPhone, but were unable to bring up any "Slow" icons in our search results. (Comments on Neeraj's report page seemed to reflect similar experiences.)

Screenshot of a mobile screen with Google search results

So what does this mean? It's possible that Neeraj happened to stumble upon a brief Google test of an upcoming search result feature, and if this is indeed on the horizon, website owners may want to do all they can to avoid that dreaded little scarlet branding.

Should this feature roll out soon, now would be the ideal time to find a website monitoring solution for your business's website to ensure visitors and new clients aren't deterred by Google's little warning.

Click here for a list of solutions and more info on how AlertBot can help.

]]>