
8 Less-Than-Obvious Reasons To Test Your Backups

Transcript:

At Storagepipe, we’ve witnessed every kind of data disaster you could possibly imagine. And every day, we see first-hand evidence of what can happen if you don’t consistently maintain the right backup habits.

Of course, everyone knows they should test their data backups on a regular basis. But despite this, few companies actually do it. This is partly because people are too busy with other priorities, and partly because they don’t realize how common backup failures are.

Sure, there is a small chance that your backup tapes or external drives could become damaged. But that’s not the only reason why testing is important. There’s actually much more to it than that.

So in this video, I want to outline 8 important reasons why companies should thoroughly test their backup and recovery processes at least once per year.

  1. New computers are constantly being added to networks. And often, we forget to back up these new systems. This is particularly true in the case of virtual servers.
  2. New applications may be added to your computers that save data in unexpected locations on the hard drive. If you’re only backing up your desktop and “My Documents” folders, you may be leaving some important files unprotected.
    In fact, it’s very common for PC users to forget to back up their Outlook PST files. (The first sketch after this list shows one simple way to catch stray files like these.)
  3. There may be some new internal processes or external regulations which dictate how certain types of data should be handled. Failure to follow these policies could land you in trouble.
  4. Storage is now growing at an alarming rate. Many companies are reporting that their file storage is doubling in size every year.
    This rapid data growth means longer backup times, slower recovery, more difficulty finding specific files, and more chances for error.
  5. Now that we’re living in the Cloud Computing age, we’re seeing a rapid change in the network topology of most organizations.
    Employees are increasingly moving outside of the internal network, while the datacenter is being reshaped to leverage the benefits of public, private and hybrid clouds. If a backup administrator loses track of where these servers fit within the network architecture, the servers can accidentally be left out of the backup process.
  6. Some applications – like word processors or spreadsheets – can tolerate up to a day’s worth of data loss without causing too many problems. Other systems – such as e-commerce databases – have almost zero tolerance for data loss. It’s important to know what your backup priorities are in order to design an efficient and cost-effective backup process that still covers all of your data protection needs.
  7. The IT industry is known for its high turnover. Would your company be able to restore its servers if you – or another key person – were unavailable?
  8. Now that we live in an era of 24/7 business, there is much less tolerance for downtime. When IT systems become unavailable, employees become unproductive, the company stops generating revenue, and customers start seeking alternative providers.
    That’s why you need to test for speedy recovery in every possible scenario. This includes simple data file recovery, full bare-metal recovery, and even the transfer of your entire datacenter to an alternate emergency site. (The second sketch below shows a bare-bones restore-verification check.)
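
To make reasons 1 and 2 concrete, here is a minimal sketch of a coverage check: it walks a directory tree and flags Outlook PST files that fall outside the folders a backup job includes. The folder list and search root below are hypothetical placeholders; substitute the paths your own backup set actually covers.

```python
# Minimal sketch: flag PST files (or any pattern) that live outside the
# folders your backup job covers. BACKED_UP and the scan root are
# hypothetical examples, not a real backup configuration.
from pathlib import Path

# Hypothetical backup set: the folders your backup job is configured to include.
BACKED_UP = [
    Path.home() / "Desktop",
    Path.home() / "Documents",
]

def is_covered(path: Path) -> bool:
    """Return True if the file sits under one of the backed-up folders."""
    return any(folder in path.parents for folder in BACKED_UP)

def find_unprotected(root: Path, pattern: str = "*.pst"):
    """Yield files matching the pattern that no backed-up folder contains."""
    for candidate in root.rglob(pattern):
        if not is_covered(candidate):
            yield candidate

if __name__ == "__main__":
    for stray in find_unprotected(Path.home()):
        print(f"NOT BACKED UP: {stray}")
```

Run periodically, a check like this catches the files that new applications quietly scatter outside your usual backup folders.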
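
And for reason 8, here is a bare-bones restore-verification sketch. The restore step itself is tool-specific, so restore_from_backup below is only a hypothetical placeholder for whatever your backup software provides; the point is the checksum comparison that proves the recovered file actually matches the live one.

```python
# Minimal sketch of a restore test: restore a sample file from backup to a
# scratch location, then verify it matches the live copy byte for byte.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Compute the SHA-256 digest of a file in streaming fashion."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(original: Path, restored: Path) -> bool:
    """A restore only counts if the recovered file matches the original."""
    return sha256(original) == sha256(restored)

if __name__ == "__main__":
    live = Path("/data/orders.db")              # hypothetical production file
    scratch = Path("/tmp/restore-test/orders.db")
    # restore_from_backup(live, scratch)        # placeholder: your tool's restore step
    if not scratch.exists():
        raise SystemExit("Run your backup tool's restore first (see comment above).")
    if verify_restore(live, scratch):
        print("Restore verified: checksums match.")
    else:
        print("Restore FAILED verification: investigate before you need it for real.")
```

A test like this only covers single-file recovery; bare-metal and full-site failover tests need to exercise your actual recovery runbook end to end.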

When a real data disaster hits, failure and hesitation are not acceptable.

Technology changes at an incredibly rapid pace. And your backups need to be ready for tomorrow’s challenges.

If you have any other questions relating to the testing of your data backups, please visit Storagepipe.com for more information.
