The Importance of Data Backup

The main reason for data backup is to save important files if a system crash or hard drive failure occurs. Losing any amount of data can compromise your personal identity, erase your family pictures, or bankrupt your entire company.

Prefer to read? (Transcription)

TOM: Welcome to the webinar. This is – I don’t know how many Deeper Dives we’ve had. Do you know, Kindsey?

KINDSEY: A lot. I could go back and count, but a lot.

TOM: This is #1,392 in our Deeper Dive webinars. That’s where we take a deeper look at many of the topics that we talk about and things that we do and security and things like that. Today’s is all about backup – the why, what, how, where, and who that has to do with backups.

Everyone knows that I’m Tom Kirkham, Founder and CEO of IronTech, and we do these every Tuesday at 2 p.m. It’s not always me, so be sure to keep an eye on the schedule so you’ll know when not to tune in and find another webinar conductor, so it’ll be a much more pleasant and enjoyable experience.

KINDSEY: Yeah. [laughs]

TOM: Is that how it goes?

KINDSEY: Oh yeah, for sure.

TOM: Most of us have been on these before and you know we're kind of laid-back on these. I was really agonizing over this webinar because I didn't start work on it until about 11:15 today. Backup is just one of those things – and today I got it, and I knew it before – but it's always "I'm not going to" – everybody knows you've got to do backups. But when it gets into the nitty-gritty and the day-to-day of it, it just kills you. It's such a boring thing to do. You've got to check the backups, then you've got to fix the backups. If you're doing some old archaic way of doing backups that involves old, old media that we'll talk about, it's just horrible. But there are easier and better ways to do it.

Anyway, building the webinar was just about that tedious. It really was. But hopefully we made some changes to it – because Kindsey will vouch it was pretty boring. So let’s take a look here.

There are 2 types of people when it comes to backup: those that DO and those that WILL. The ones that WILL are the ones that needed a backup and didn't have one, for whatever reason. When you look at the whole backup world and how you select and how you weigh the different things I'm going to talk about – the 5 Ws – incidentally, I know those are out of order, for the journalism students. But they're in the order of the slides.

At any rate, it’s always a balance. Even if you have unlimited resources and more money than Midas, like Apple, they can’t do everything perfectly and they even have downtime. Google has downtime. Microsoft. Facebook has a lot of downtime because the only thing that affects them on downtime is the ads not getting served up. Things just break on Facebook and they don’t really care as long as the ads get through. At any rate, it’s always a function of time and money, more or less.

When you're looking at this chart, you look at the traditional backup and in most cases, I'd say up until about 15 years ago, give or take, we were doing traditional backups. If you were really cutting edge, you were doing it online and it was all automated. You may not have been able to test it, but you could've checked it just to make sure the file's there, which is better than nothing. Most of that technology is based on a program that's been around forever, like 30 years or more, and a lot of backup technologies use it. If you're using anything like Backblaze or – what's the real common consumer one? It's all unlimited backup. But it all uses an open-source program called rsync, and then they wrap it up with a nice little pretty interface and a timer and all of that.
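
For anyone curious what that rsync-style approach looks like in practice, here is a rough sketch – a small Python wrapper that calls rsync and hard-links unchanged files against the previous snapshot, so each run only stores what changed. The paths and the backup target are hypothetical placeholders, not any vendor's actual product.

#!/usr/bin/env python3
"""Minimal sketch of an rsync-style versioned backup, assuming rsync is installed.
Paths are hypothetical placeholders."""
import datetime
import pathlib
import subprocess

SOURCE = "/home/office/documents/"          # data to protect (hypothetical path)
BACKUP_ROOT = pathlib.Path("/mnt/backup")   # local or mounted backup target
LATEST = BACKUP_ROOT / "latest"             # symlink to the most recent snapshot

def run_backup() -> None:
    snapshot = BACKUP_ROOT / datetime.datetime.now().strftime("%Y-%m-%d_%H%M")
    cmd = [
        "rsync", "-a", "--delete",
        # Unchanged files are hard-linked against the previous snapshot,
        # so each run only stores what actually changed.
        f"--link-dest={LATEST}",
        SOURCE, str(snapshot),
    ]
    subprocess.run(cmd, check=True)
    # Point "latest" at the snapshot we just made.
    tmp = BACKUP_ROOT / "latest.tmp"
    tmp.symlink_to(snapshot)
    tmp.replace(LATEST)

if __name__ == "__main__":
    run_backup()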

It's really low cost, but it's also slow. For home users, that's okay, and for a real, real small office, it might – might be okay. There are other limitations besides just speed. Database architecture comes into play – especially QuickBooks. But it is low cost.

Then you step up a notch, and this is where you get into your business grade stuff. This is where you’re looking at continuity. As of, give or take, 15 years ago, business continuity became affordable to where not only can I plan that I’ve got a backup to restore – remember, 30 and 40 and 50 years ago, backups were slow. You had to restore them off of tapes or floppies. So you just accepted the fact that if it took 40 floppies to restore a backup, or tapes are real slow and you just hope they work, that’s how long it took. There was nothing you could do about it.

But once it came online, you could restore just a file in a matter of seconds or minutes. But then virtualization came onboard, and that allowed you to actually launch a copy of the exact same machine in the cloud or locally, and you were back up and running within minutes. And this is affordable. It doesn’t get you 100% there, though.

Where the Googles and Apples and Microsofts of the world want to be is high availability, which is georedundant, everything's virtualized, and everything is cloned all the time. The costs are coming down on that, and any of you on the call that use our phone services, any of the servers in the cloud, any of our Exchange servers – if you've got email through us, you're getting an Exchange server in the same data center. That data center actually has high availability because it's two data centers under one roof, and they continuously duplicate everything – servers, the whole bit – to the other side of the data center.

That’s not a backup. That’s a duplicated environment. There’s a difference between a snapshot in time and a backup. We’ll get to that when we start talking about the hows. But anyway, it’s a balance of time and money. Business continuity – the shorter period you stay down, the more money it costs. It’s that simple.

So why would you need a backup as opposed to high availability? Even though it's a clone, it's not necessarily a backup. And the key difference is you want to keep multiple backups. You might want to back up a certain file every hour, and then you want a week-old backup and a month-old backup and a 6-month-old backup and a year-old backup, and maybe you want to delete it after that.

The reason for that is you may not discover that you've got a file corruption or database corruption until a couple of days later. And we have seen this. You do a QuickBooks upgrade and you do a bunch of work – say you've got two bookkeepers in the office doing data entry for 2 days – and then all of a sudden you hit the bug and it's a fatal bug, and the only way to fix it is to roll back to a previous version of the database. If you don't have a 2-day-old backup and all you've got is a clone of yesterday's, then you've just got a clone. You don't have a backup. So you've got to be able to put your hands on multiple versions over a period of time, just in case.
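
To make the idea of tiered retention concrete, here is a minimal sketch in Python of that kind of schedule – keep recent snapshots densely and older ones sparsely, and let anything older than the last tier age out. The exact tiers are illustrative, not a setting from any particular backup product.

"""Sketch of a tiered retention policy: keep recent snapshots densely,
older ones sparsely. The tiers below are illustrative only."""
from datetime import datetime, timedelta

# (maximum age, keep one snapshot per this interval)
TIERS = [
    (timedelta(days=1),   timedelta(hours=1)),   # last day: hourly
    (timedelta(weeks=1),  timedelta(days=1)),    # last week: daily
    (timedelta(days=180), timedelta(weeks=1)),   # last 6 months: weekly
    (timedelta(days=365), timedelta(days=30)),   # last year: monthly
]

def snapshots_to_keep(snapshots: list[datetime], now: datetime) -> set[datetime]:
    keep: set[datetime] = set()
    buckets_seen: set[tuple[int, int]] = set()
    for snap in sorted(snapshots, reverse=True):        # newest first
        age = now - snap
        for i, (max_age, interval) in enumerate(TIERS):
            if age <= max_age:
                bucket = (i, int(age / interval))       # one snapshot per bucket
                if bucket not in buckets_seen:
                    buckets_seen.add(bucket)
                    keep.add(snap)
                break
    return keep     # anything not returned is eligible for deletion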

In the case of a ransomware attack, if you've got a large amount of data that's being encrypted – say a few hundred gigabytes; even as small as our company is, we have that kind of storage out there – it may be a day or two before it even lets you know that it's busy encrypting files, so we may have to go back in time 2 days to get a file that's not encrypted on the backup. Now, our backups are encryption-proof, but remember, it keeps a different version every time a file is updated, so it may well have backed up some of the already-encrypted files. We've just got to go back in time.

In ours, we can do that on a file by file basis, so you’ve got to be real careful about that. I don’t want to get too much into the nitty-gritty, but that describes the difference between the snapshot for availability versus a backup. Like I said, ransomware and viruses infect files. You’ve got to have backups of those files. But ultimately, it is for business continuity regardless of whether you’re doing an entire server backup or just your datasets. If you do just your QuickBooks data file and someone steals the PC, and it’s only on one PC, you’ve got to get a PC and then you’ve got to put QuickBooks on it before you can even use the data file.

So a data-only backup does have its drawbacks. You’re going to be down longer in that particular scenario than it would be if you had a full virtual machine online. We’ll get to that in just a minute.

I just told you why.

So, what? You’ve got to think about, if you go back to the core of the onion, the gold or the assets that you need to protect in your organization. You’ve done an assessment. You’ve determined what needs to be protected from cyberthreats and the bad guys, the hackers and all of that. And remember, your #1 threat is ransomware. I don’t think there’s anyone on here that is worried too much about intellectual property theft. There is somebody on here, at least one, that may have to worry about nation-states creating chaos. But you can do that with ransomware.

Any backup strategy you select, you’ve got to take that into account because we’ve had prospects that came to us and said, “Yeah, we had backups but those were encrypted too.” There’s a couple of things wrong with that statement besides just the fact that the backup was encrypted too.

So you’ve got to identify what’s important. Basically, it’s anything that’s created in the office – brand new Word documents, accounting files, whether you use Peachtree or QuickBooks or whatever, your billing system. If you’ve got a utility, you’ve got your whole meter system. All that’s got to be backed up. Whatever you need to run on a daily basis, for however long it may be. If you’ve got a document that you’ve spent 3 days working on, then that’s where those incremental things come in. If you only do a backup once a week, there’s a good window in there where you’re not going to have something that was created 3 days ago, so you’ve got to have timely backups as well.

Nowadays it's getting to where it is practical to do workstations, entire workstation backups. We can actually duplicate that entire environment, both locally and in the cloud. That brings down the time to get back up and running – your resilience is much, much higher with that type of backup.

You’ve got to keep backups of any portable device. Laptops, tablets, and iPhones or any mobile phone in case of loss, theft, damage, things like that. Catastrophic loss of the equipment, you’ve got to have a backup for that. And it really should be image-based. iPhones aren’t an image-based backup, but they’ve structured it where it appears that way. Very similar to Macs. But a Windows laptop, I would do an image-based backup, not a Windows backup, which means you’ll need a third party for that.

These are just examples. This is part of your assessment.

So how do you go about doing a backup? Well, like I’ve mentioned, you can do data-only – which is fine. There’s pros and cons to all of it. I actually do multiple layers. We do it for most of our clients. Multilayered approach, remember.

An image-based backup that's not to a live virtual machine gives us an entire snapshot of a server; that's typically where we do it. Then we can move that to another machine. It just depends on the environment and the price. Sometimes it's not easy to take a server that's got 4 terabytes of storage and get any kind of cost-effective VM (virtual machine) in the cloud. I'm getting in the weeds a little bit there, but an image-based backup has everything that's on the machine backed up incrementally, so it's not sending the entire image every time. It's only sending the changes up to the cloud or wherever, or on-prem.
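
As a rough illustration of how an incremental image backup avoids resending everything, here is a small Python sketch that hashes fixed-size blocks of a disk image and uploads only the blocks whose hash changed since the last run. The upload_block function is a hypothetical stand-in for the actual transfer.

"""Hash fixed-size blocks of an image; ship only the blocks that changed."""
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024   # 4 MiB blocks

def changed_blocks(image_path: str, previous_hashes: dict[int, str]) -> dict[int, str]:
    new_hashes: dict[int, str] = {}
    with open(image_path, "rb") as img:
        index = 0
        while chunk := img.read(BLOCK_SIZE):
            digest = hashlib.sha256(chunk).hexdigest()
            new_hashes[index] = digest
            if previous_hashes.get(index) != digest:
                upload_block(index, chunk)   # hypothetical: send only this block
            index += 1
    return new_hashes                        # becomes previous_hashes next run

def upload_block(index: int, chunk: bytes) -> None:
    ...  # placeholder for the cloud or on-prem transfer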

And then, like I mentioned earlier, you’ve got the virtual machine backup that actually takes your desktop and backs it up to a duplicate that exists in the cloud that you can access from any computer in the world if you need to get to it. If you’ve got a catastrophic facility loss, all you’ve got to do is find a computer somewhere and you can access that computer.

You’ve got to think about those time intervals, like I’ve mentioned a few times. It’s also a balance. Sometimes – often, actually – it’s a balancing act on how fast your internet connection is. If you’re on a DSL connection, we can’t shove nearly as much data up that very, very small pipe to the server. And if it hasn’t finished backing up the first cycle, and I’ve got it set to do it every 15 minutes, well, guess what? They start erroring out. So we have to increase that time window to where we make sure that each backup is finished before another one is started. We have to measure that against the amount of data that we’re trying to back up.
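
Here is a quick back-of-envelope check, in Python, of whether a backup interval is realistic for a given uplink – the kind of arithmetic behind stretching a 15-minute cycle out to every 4 hours on a slow connection. The DSL numbers are just an example, not a measurement from any real site.

"""Will the changed data finish uploading before the next cycle starts?"""

def interval_is_feasible(changed_mb_per_cycle: float,
                         upload_mbps: float,
                         interval_minutes: float) -> bool:
    # Megabytes to megabits, divided by the uplink speed, converted to minutes.
    transfer_minutes = (changed_mb_per_cycle * 8) / upload_mbps / 60
    return transfer_minutes < interval_minutes

# Example: ~500 MB of changes on a 1 Mbps DSL uplink takes roughly 67 minutes,
# so a 15-minute cycle would stack up and start erroring out.
print(interval_is_feasible(500, 1.0, 15))    # False
print(interval_is_feasible(500, 1.0, 240))   # True: back it off to every 4 hours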

Then finally, you’ve got to consider carefully, how long is it going to take to restore regardless of the method and strategy you use? Because you’ve still got that business continuity.

Finally, the “where” part of the scenario. I think for most of you, you’re going to know this, but primarily your two choices are on-premises – which is okay. In fact, we do on-premises. But that’s not all we do. If we do on-prem, it’s very, very rare that we don’t complement that with at least some cloud backup. So cloud backup is better than on-prem.

And what we do more often than not is both. We typically set a machine aside on the network, and that varies with the type of needs there are. That machine can do some of the storage, like a data backup or maybe an entire virtual machine backup, and it sends it to the cloud. We have the ability to only send it up to the cloud every 4 hours, but do a local backup every 15 minutes, because the local network is so much bigger, faster, and more efficient than going across most everyone's internet connection.

So an ideal situation is where you have both. You've got something local so your business continuity is really quick – you can get back up and running really fast – and then if you have to go to the cloud, we've got that too, in case the local stuff is damaged or lost or there's simply a failure. Backups on top of backups on top of backups, done properly, is a great strategy because no matter how much you check, no matter how much you test, the real usefulness of a backup only shows when you need it. And these are electronic devices. Just like there's no guaranteed 100% sure way to prevent you from getting successfully attacked, there's no 100% surefire way to know that that backup is always going to be there when you need it or that it's going to work when you need it, no matter how much testing we do.

So we prefer to have layers of backup. And we typically do different styles. We’ll do a cloud with a virtual thing and then maybe an on-prem data-only. Sometimes it’s reversed. Sometimes we go all cloud. You’ve got a real small office with 2 or 3 computers, 100% cloud. There’s really nothing wrong with that. I’d still probably put something on-prem just so you’ve got 2 copies. But at any rate, that’s what we recommend – doing both wherever possible.

And then who, finally. Who is going to check those backups? If you’ve got a break/fix guy and he’s not charging you every day or a monthly fee for monitoring the backups, well, you really don’t have backups. You’ve got a “set and forget.” It was all set up and nobody’s checking it. It may be running just fine, but who knows? Is it really going to be there? And if they’re not charging you and they say they’re monitoring it, I’d like to know the details on that. I just don’t know how that’s possible, cost effectively or smart, anyway.

You've got to check it at least daily. We've got redundant checks and automated checks and human eyes on it all the time, and then we respond immediately whenever we see there's been a backup failure. First thing we do is check the status of the backups: how long has it been failing? It's very unusual that it's longer than just a few hours – very unusual that it's even a few hours, actually. The automated stuff will let us know right away. So you've got to lay eyes on it, you've got to have a human being looking at it, monitoring it, periodically testing it, including a file restore or booting up a virtual machine.

I’m going to answer your questions in a minute, Mary. Got to look at the owl blink here for a minute.

So I’ve got a quick little demo. I believe this is Datto’s direct-to-cloud virtual machine backup.

Video: Datto Cloud Continuity for PCs provides you a new level of protection to eliminate data loss and help get your employees back up and running quickly in the event of a lost or stolen PC, ransomware attack, or other disaster. Did you know 1 in 10 corporate laptops will be lost or stolen over their 3-year lifetime? 140,000 hard drives fail in the United States each week. A new organization will fall victim to ransomware every 14 seconds. Your employees are critical to daily operations. Loss or failure of their PCs can mean lost business, loss of critical data, and work stoppage. Additionally, ransomware attacks with the potential to lock up a number of PCs exponentially increase this risk.

It is impossible to take the guesswork out of when a disaster might occur. However, with Datto Cloud Continuity, you have the confidence that you can recover and get your employees back to business without skipping a beat, exactly as they were before. Cloud Continuity is a comprehensive BCDR solution for desktops and laptops. It provides automated backup of your employees’ PCs directly to the Datto cloud. So regardless of where your employees travel and work, their PC is being protected. Cloud Continuity uses image-based backup technology. This means it provides several layers of protection to ensure your employees can get back to business quickly and easily. Files or folders accidentally lost or deleted? Cloud Continuity provides for an easy restore directly to your employee’s PC. In the event of a PC loss or failure, getting a new device up and running for them is a snap.

Not only does Cloud Continuity restore files and folders, but also all their applications and original system configuration. Their new PC will be race-ready. No learning curve or frustrations with missing files or software. And most importantly, no lost productivity or data loss. With the threat of a ransomware attack always looming, rest assured Cloud Continuity provides an easy and speedy recovery. Protected PCs can be rolled back to their last backup prior to the attack. No ransom payments, no loss of functionality.

With Datto Cloud Continuity for PCs, you have the power to protect your business from becoming a victim of the statistics. Protect critical data and keep your employees up and running with this all-in-one BCDR solution for PCs, backed by a world-class, 24/7/365 technical support team. Find out more at Datto.com/CloudContinuity.

TOM: So, how not to do backups? If you’re doing any of this stuff, then I’m willing to wager that you haven’t seriously checked into and upgraded to technology changes over the years. That’s typically what we find out. They either don’t know or that’s just the way it’s always been done and that’s the way we’re going to do it. Every once in a blue moon we still run across somebody that’s using tapes. Now, I haven’t seen anybody burning CDs lately, and hopefully you’re not doing floppies. That would be – I have not seen that in a long time. You’d have to look really hard to even find a floppy in our office, even though we’ve been around 20 years.

Physical transportation. That brings up an interesting deal. If you are still doing backups onto some sort of media – and I don't care if you're using a flash drive or external hard drive or whatever – and then periodically driving it somewhere and putting it in a safe deposit box, or having somebody just take it home so it's not in the office in case there's a fire or tornado or theft, don't do that. That is not a safe way to do a backup. It can be lost or stolen out of a car; a flash drive can fall out and get underneath the cantaloupes at Walmart. It's just not efficient. It's just too much effort, and the dollars don't even make sense when you look at how cheap online storage is.

Flash drives might hold all your stuff, but there are still better ways to do it. If they hold all your stuff, you're going to be able to do online storage for hardly anything. So that's not it.

File sync is not a backup. That's important. If you're duplicating things – like I do with iCloud. All of my stuff is duplicated on iCloud, but when I want multiple versions, when I need multiple versions back, I'm going to go to my Time Machine, which has incremental backups of everything that's on my Mac over a period of time. All the other online drive things – I'm not familiar with all of them, like Google Drive and their synchronization and backup capabilities.

But whatever – and this goes to your OneDrive as a backup, Mary – you’ve got to make sure it has a true backup component to it. Can you lay your hands on 3 revisions back? If you can’t, and all it has is the very last edit, then what you’ve got is a clone. And that’s not a true backup. You’ve got to be able to go back in time to the one that you did on August 5th or September 1st, whatever it may be. A true backup will give you access to multiple revisions of the same file.
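
To show what "multiple revisions" means in practice, here is a minimal Python sketch that stores a dated copy on every backup run instead of overwriting a single clone, so you can always reach back to a specific date. The paths are hypothetical.

"""Keep every revision rather than overwriting one clone."""
import shutil
from datetime import datetime
from pathlib import Path

def back_up_with_history(source: Path, versions_dir: Path) -> Path:
    versions_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
    dest = versions_dir / f"{source.stem}.{stamp}{source.suffix}"
    shutil.copy2(source, dest)   # keep this revision; never overwrite old ones
    return dest

def list_revisions(source: Path, versions_dir: Path) -> list[Path]:
    # A true backup lets you put your hands on the August 5th or September 1st copy.
    return sorted(versions_dir.glob(f"{source.stem}.*{source.suffix}"))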

Now, there are some exceptions to flash drives. Those are fine to use as local storage if you leave them on the machine. Don’t take them out. That’s not your way to get it off prem. You still want to use storage locally, maybe on a flash drive, and then cloud storage.

And machine-to-machine is really the same thing. That’s where you back up this PC data to this other PC or this server. Just make sure whatever you’re backing up to also leaves the building. It needs to go into the cloud.

Invariably, whenever I talk about this stuff, people go, "How do I know it's safe in the cloud?" All of these cloud backup services encrypt the data before it even leaves the building. Even they themselves cannot see the data, and criminals can't break it either. It's encrypted. Most of them are 256-bit or better. Some of the better ones are 448-bit Blowfish. Those are not crackable with the latest computers. It would take supercomputers years and years and years to crack that. And if you've got that kind of data, then you don't need to be talking to us; you've got other worries. If somebody attacks you and can afford that kind of supercomputer time – because we're talking hundreds of millions of dollars – then you're like the NSA or Microsoft or Google and you've got other problems. It's going to be something off the chart.
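
For illustration, here is a small Python sketch of client-side encryption before upload, using AES-256-GCM from the third-party cryptography package. Real backup agents handle key management for you; the key handling here is deliberately oversimplified, and none of this is the exact scheme any particular vendor uses.

"""Encrypt locally so only ciphertext ever leaves the building."""
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    # AES-256-GCM: a 32-byte key and a fresh 12-byte nonce per message.
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext

def decrypt_after_download(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)   # losing this key means losing the backup
backup_blob = encrypt_for_upload(b"quickbooks data file contents", key)
assert decrypt_after_download(backup_blob, key) == b"quickbooks data file contents"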

But at any rate, hey, I managed to do that for 30 minutes. I want to thank everybody for being on here today. It's a topic – we've been doing this so long and we know so many different strategies, and when I sat down to do this I was like, "Jeez, I can't think of a worse topic." But everybody's got to have it. You've got to plow your way through it, talk your way through to the best strategies, and then review it every year. Just 10 years ago, it wasn't practical to virtualize your workstation up to the cloud. Now it costs 15 bucks a month to have the entire workstation virtualized up into the cloud.

So you do have to periodically review to make sure there’s not a better, faster way to not only do your backups, to make them safer, better protected, but also to increase your time back into business. You’ve got to cut that productivity loss down as much as possible.

If anyone has any other questions, the floor is open to you. Am I missing any, Kindsey?

KINDSEY: No, I don’t think so.

TOM: Okay, thanks, everybody. We’ll see you next Tuesday at 2:00. What’s the topic next week?

KINDSEY: It is “Business Continuity and Resilience Made Easy.”

TOM: Very directly related. It’d almost be Backup Part 2. See you next week.

KINDSEY: Bye, guys.