RobrBaron
Gerbil In Training
Topic Author
Posts: 2
Joined: Thu Aug 25, 2011 3:07 pm

NAS Backup Best Practices with CryptoLockers in mind

Sun May 01, 2016 8:59 am

I have a small network for my business, and use a Synology 2-disk system as our file server. With the advent of more and more sophisticated Cryptolocker type viruses, I'm concerned that simply having a shared drive on a file server is going to end badly one day.

Synology sends user emails that talk about needing more than one backup, doing offsite backups, syncing, and cold storage (Amazon Glacier style) backups so that you can do a restore in the event of a Cryptolocker virus, but I'm kind of at a loss as to best practices.

Is there a good How To guide to set up something with Glacier or CrashPlan or rsync that would not only provide offsite redundancy, but also alleviate the specific problem of Cryptolockers?

My general concern is that I go through all the trouble of setting something up, only to discover that I've just allowed a virus to infect all of my backups, and provided myself zero net benefit.

Does anyone have a link to worthwhile materials that are specific enough for a casual person to implement on a Saturday?

Thanks in advance!

Rob
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: NAS Backup Best Practices with CryptoLockers in mind

Sun May 01, 2016 9:25 am

Store whatever you would need for disaster recovery off-site, either on removable media or a cloud based backup service! If your office burns down tomorrow, will you be able to recover?

Regardless of the method of backup used, the key is to maintain some sort of history, in case you don't realize things have gotten corrupted before running a backup. Never overwrite the last backup that you *know* was good!

With removable media (tape or external drives), you can have a pool of several tapes or drives which you back up to in rotation. For extra protection, periodically (e.g. monthly) pull one from the rotation and keep it in off-site storage (replace it in the rotation with a new tape or drive). You can go back and thin the historical backups later to re-use the media -- e.g. retain only end-of-quarter copies once they are more than 6 months old, and only end-of-year copies after they are more than 2 years old.
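The thinning schedule described above can be sketched as a small shell helper (the function name and the exact cut-offs are my own; assumes GNU date; adjust the policy to taste):

```shell
# Hypothetical helper implementing the thinning policy: keep everything
# under ~6 months, only end-of-quarter copies up to ~2 years, and only
# end-of-year copies beyond that. Dates are YYYY-MM-DD.
keep_backup() {
  b=$1; now=$2
  age_days=$(( ( $(date -d "$now" +%s) - $(date -d "$b" +%s) ) / 86400 ))
  month=$(date -d "$b" +%m)
  if [ "$age_days" -gt 730 ]; then
    # older than ~2 years: keep only the December (end-of-year) copy
    if [ "$month" = "12" ]; then echo yes; else echo no; fi
  elif [ "$age_days" -gt 182 ]; then
    # older than ~6 months: keep only end-of-quarter copies
    case "$month" in 03|06|09|12) echo yes ;; *) echo no ;; esac
  else
    echo yes   # recent backups are always kept
  fi
}
```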

Rsync's "--link-dest" option can be your friend. It allows you to do what amount to incremental backups of directory hierarchies. If you back up to a new directory each time (e.g. incorporate the date of backup into the target directory name), but specify the previous backup as the link destination, only files which have been changed are re-copied; unchanged files are hard-linked to the existing copy on the destination media. This gives you historical snapshots in time of the entire directory hierarchy, with only changed files taking up additional space on the backup media (other than the small amount of space for the additional directory entries). The whole historical archive can then periodically be rsynced to an external drive or remote server for off-site storage (make sure you use the "--hard-links" option to preserve the web of hard links).
Nostalgia isn't what it used to be.
 
Flatland_Spider
Graphmaster Gerbil
Posts: 1324
Joined: Mon Sep 13, 2004 8:33 pm

Re: NAS Backup Best Practices with CryptoLockers in mind

Sun May 01, 2016 1:12 pm

You're probably not going to be able to do all of this in a weekend. I mean you could just shoot from the hip, but generally stuff like this that is implemented in a weekend gets rebuilt in about six months because it doesn't work.

The best practice is to not give anyone live access to backups. It needs to be abstracted behind some automated program on a server that no one has direct access to, and it's not just because of cryptolockers, it's the users themselves. Users will do dumb things like delete directory trees because they aren't paying attention to what they are doing.

One of the first things I would do would be to keep people from having direct access to the share by implementing some file sharing service like Owncloud, Seafile, or Syncthing that supports file versioning. This is the first line of protection.

The next thing is to have some sort of backup software and server. I've used Yosemite Backup in the past, and it works well, even if it is a little odd. It's also stupidly cheap for what it offers. There is also other stuff like BackupPC, Bacula, and BareOS.

Multiple offsite and offline backups are key. Rent a storage box at a bank and put the backups there.

Most sites have a tiered backup strategy that looks like this:
* Yearly full backup
* Monthly full backup
* Weekly full backup
* Daily incremental backup

The advantage of this is that you can progressively roll back until you have a working copy.
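As a sketch, the tier for a given day could be picked like this (the schedule — yearly on Jan 1, monthly on the 1st, weekly on Sundays, daily otherwise — is just one assumed convention; real backup software schedules this for you):

```shell
# Hypothetical scheduler: map a date to the backup tier to run that day.
backup_level() {  # $1 = date as YYYY-MM-DD (GNU date)
  if [ "$(date -d "$1" +%m-%d)" = "01-01" ]; then
    echo yearly            # full backup, kept long-term
  elif [ "$(date -d "$1" +%d)" = "01" ]; then
    echo monthly           # full backup
  elif [ "$(date -d "$1" +%u)" = "7" ]; then
    echo weekly            # full backup (Sunday)
  else
    echo daily             # incremental against the last full
  fi
}
```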

Infrastructure wise this could be implemented several different ways:
# Disk-to-disk-to-tape
# Disk-to-disk-to-portable disk
# Disk-to-disk-to-cloud
# Disk-to-tape

There are hybrid versions of this, like disk-to-disk-to-tape-and-cloud. Whichever version you pick depends on how much data you need to save. Last time I looked, cloud storage (Amazon, Rackspace, etc.) was cheap up to something like 100GB, and then it gets expensive.

Try to encrypt your backups. You may be thinking, "No one wants my backups, and that's just extra work," but encryption also helps maintain the integrity of the files, in addition to keeping them confidential. Since you specifically asked about cryptolockers, the integrity portion is relevant.

The last piece is to test restores often. The care and feeding of backups takes a lot of time and energy. There is nothing worse than finding out the backups don't work, and this is more of a concern with a full Windows OS restore than a full Linux OS restore. Windows is a finicky, fragile beast, and it's best if you can keep it running.
 
BIF
Minister of Gerbil Affairs
Posts: 2458
Joined: Tue May 25, 2004 7:41 pm

Re: NAS Backup Best Practices with CryptoLockers in mind

Sun May 01, 2016 2:40 pm

In my job I work for a corporation but don't have responsibilities for disk or partition backups.

In my personal life, I have a couple of computers on my home network with data that, while I could rebuild it from installer CDs, DVDs, or HDDs, would be a really HUGE hassle to rebuild. My return-to-service needs are not urgent or emergency in nature, but my free time is very limited, so I do whatever I can to avoid ever having to re-install Windows or rebuild a disk partition from original media. Due to the sheer quantity of data (4 TB on my laptop, 5+ TB on my desktop), the use of remote backup services is infeasible due to the length of time to transfer all that data over even the fastest consumer level network.

Local backups, even if done to external USB 3.0 drives, are MUCH faster, and (after initial setup and testing), have required very little of my time to manage.

I too worry about the cryptolocker/ransomware risks, so I try to keep multiple versions of my backups. I like the idea of having weekly, monthly, and yearly copies, but this can be expensive because I need to put those backups on big drives (8 TB). Since the laptop and desktop computers each have some local partitions that are COPIES of the other machine's partitions, I still back each one up. So that data actually lives locally on two machines, each with its own set of historical backups. This provides some extra measure of recoverability for some partitions, although it won't help me if one machine's Windows partition and all of its backups were to become corrupted.

About every six months, I go out and re-research the topic, optimistic that there's some new antivirus/anti-ransomware or backup technology available. Every six months I find that there's still no single best way to eliminate exposure to ransomware. So taking consistent backups is currently our best solution.

About once per month, I go into Macrium and check that the backups really are running correctly. Then I go to the backup folders and randomly mount one or two of the backup sets to be sure that they really contain readable data, and that the data is from the drive or partition I intended.

Very critical: This step needs to be done anytime you change your partitions, install a new hard drive or SSD, or move any partitions from one SSD to another. After modifying your disk management strategy, ALWAYS go back and DOUBLE-CHECK your backups. I can't begin to tell you how many times I would have had this-or-that partition no longer getting backed up had I not double-checked my backup strategy after making a change.
 
Flatland_Spider
Graphmaster Gerbil
Posts: 1324
Joined: Mon Sep 13, 2004 8:33 pm

Re: NAS Backup Best Practices with CryptoLockers in mind

Sun May 01, 2016 4:41 pm

BIF wrote:
Due to the sheer quantity of data (4 TB on my laptop, 5+ TB on my desktop), the use of remote backup services is infeasible due to the length of time to transfer all that data over even the fastest consumer level network.


Some of the backup servers will have the ability to do data dedup. BackupPC does this, and if there is a lot of overlap in the data, it will decrease the size of the backup by quite a bit.
 
Aphasia
Grand Gerbil Poohbah
Posts: 3710
Joined: Tue Jan 01, 2002 7:00 pm
Location: Solna/Sweden
Contact:

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 7:31 am

Personally I would say the following should be sufficient, or at least I deemed it sufficient for my personal use.
I do daily incremental sync from my workstation to my fileserver.
I do weekly full bare metal disk image backup from workstation to fileserver.
I do daily incremental file syncs from my fileserver to my NAS.
I do weekly backup of my disk images from fileserver to nas.

All file syncs do versioning as well, keeping up to 3 iterations when a file is replaced. All of the operations are fully automated; the only manual work is that every third month or so I need to delete and redo the Windows image backup because of some file-lock rights issue, and I need to clean out the images manually when they grow too big. Now, while my standard account does have full write access to the fileserver share, I have nothing but read access to the NAS. I also don't have it mapped as a drive at all; I always use a UNC path if I need to read something. The same practice goes for the fileserver, where the file-syncing software is configured with a specific backup user that has write access to the NAS. No mounted shares at all; everything is done through UNC paths.

I'm only lacking a remote site backup, but I have been thinking of getting another NAS to put at a friend's place to cover that.

BIF - if you want offsite, how much new data are you actually producing at your workstation? You could always do sneakernet for the first 5 TB, then do incrementals if you have enough bandwidth.

And once you go into enterprise versions, cost starts to become a real factor. Then you have to weigh cost vs. availability as well as types of storage. As others have mentioned, deduplication becomes a factor since it directly impacts size/cost, as does file integrity depending on your type of data. And last, but certainly not least: proper DR tests, including recovery tests.

Currently, the various lockers are still client-based infections, so you absolutely need backups that are not readable or writable from clients, or from any servers where people might have external network access or use any external files. Beyond being sure that you can recover files, user education about how not to get infected is important, because users are usually the weakest link. I mean, even after something like a locker or phishing campaign has been announced at a company, people will still click links in poorly spelled email, get to a page, enter a captcha, download and run a file, or put all their info (bank account, personal information, codes) into a form because the email came from "support".
 
Aranarth
Graphmaster Gerbil
Posts: 1435
Joined: Tue Jan 17, 2006 6:56 am
Location: Big Rapids, Mich. (Est Time Zone)
Contact:

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 8:16 am

Flatland_Spider wrote:
Multiple offsite and offline backups are key. Rent a storage box at a bank and put the backups there.

The last piece is to test restores often. The care and feeding of backups takes a lot of time and energy. There is nothing worse then finding out the backups don't work, and this is more of a concern with a full Windows OS restore then a full Linux OS restore. Windows is a finicky, fragile beast, and it's best if you can keep it running.


We have a winner!!!

Backup drives should NOT be connected to your machine all the time! (lightning strikes, viruses, pebkac issues)

If you are using bare drives, be sure they go back in the anti-static bag before being moved around.
If you are using drives in enclosures or bare drives, be sure there is shock absorption; a 1/2" of styrofoam or a padded strongbox is key!
Your NAS must be backed up!

Yes put drives in the bank (business) or a trusted friend's house (personal).

Your weekly, monthly, yearly backup drives must be disconnected from the network, power, and bump protected.

The one job I hate having is being responsible for business backups. Something always goes wrong and the business does not want to spend the money to do it right.

As I've mentioned in other places, I do not recommend optical media, especially RW, for long-term storage. High-quality CD-Rs seem to last the longest, followed by DVD+R. I don't trust Blu-ray-Rs at all.
Main machine: Core I7 -2600K @ 4.0Ghz / 16 gig ram / Radeon RX 580 8gb / 500gb toshiba ssd / 5tb hd
Old machine: Core 2 quad Q6600 @ 3ghz / 8 gig ram / Radeon 7870 / 240 gb PNY ssd / 1tb HD
 
DrCR
Gerbil XP
Posts: 350
Joined: Tue May 10, 2005 7:18 am

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 8:31 am

Aphasia wrote:
And I also don't have it mapped as a drive at all, but always use an UNC path if I need to read something. The same practice goes for the fileserver, where I have a specific backup-user configured on the filesyncing software that has write access to the nas. And no mounted shares at all, everything is done through UNC paths.

Does that really matter? Just curious. I recall one poster recently stating that the proper way would be via SSH.
 
Duct Tape Dude
Gerbil Elite
Posts: 721
Joined: Thu May 02, 2013 12:37 pm

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 8:33 am

I use CrashPlan for time-based, incremental, compressed, deduped, remote backups. For backup targets, I've got a buddy from here on TR along with a dedicated server of my own. Since CrashPlan uses its own proprietary protocol instead of a basic file share, this effectively isolates me from him except via CrashPlan, so most viruses won't hit both of us.

I know most people opt for a storage box at a bank or something, but for me Crashplan was far easier to maintain than that (and it's cheaper).
 
Thrashdog
Gerbil XP
Posts: 344
Joined: Wed Nov 03, 2004 1:16 pm
Location: Kansas City

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 9:13 am

We've been hit a few times. So far, nothing's been able to nuke remote VSS shadow copies, which makes recovering from a Crypto-whatever infection a fairly straightforward process that doesn't require going to tape:

1) Identify the infected machine by checking owner permissions on the affected files, and remove it from the network.
2) Search for "help_decrypt.*" on all network drives, and sort the results by folder path (add that column to Details view in Explorer).
3) For each affected folder, restore an unaffected version from VSS and delete the "help_decrypt" files.
4) Nuke and rebuild the culprit user's machine and deliver lecture on security best practices. Hope it sinks in.

For what it's worth, most people have said that these things ride in on phishing emails, but I think all the times we've seen it here it's been delivered via browser exploits. If we didn't have line-of-business software that relied on older versions of IE (stupid accounting/timekeeping crap...) we'd probably never have had a problem. The other way to stop ransomware is application whitelisting, but where I'm at that'd be a good way for us IT folks to be burnt at the stake -- with the blessing of upper management. Too many people have become used to treating their work laptops as their own personal machines to use as they will, and telling them they can only run IT-approved programs would be apocalyptic.
 
Flying Fox
Gerbil God
Posts: 25690
Joined: Mon May 24, 2004 2:19 am
Contact:

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 9:38 am

Speaking of browser based exploits, how much does running as non-Admin user still protect against them?
The Model M is not for the faint of heart. You either like them or hate them.

Gerbils unite! Fold for UnitedGerbilNation, team 2630.
 
Thrashdog
Gerbil XP
Posts: 344
Joined: Wed Nov 03, 2004 1:16 pm
Location: Kansas City

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 10:47 am

Doesn't make any difference -- I think everything except the very first version of this crap can operate without admin access or privilege escalation. I have to give my users local admin because reasons (*grumble*), but the last guy who got infected didn't have it, because I don't trust him to operate a plastic spoon, let alone a PC connected to a network full of our clients' confidential information. Didn't make any difference -- still got infected, still hit all of the network shares.
 
SuperSpy
Minister of Gerbil Affairs
Posts: 2403
Joined: Thu Sep 12, 2002 9:34 pm
Location: TR Forums

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 12:26 pm

Either rsync like JBI noted, or ZFS with snapshots (especially on the NAS itself -- if you're lucky in this case you can roll back damages on the live server).

In my situation (small office of ~20 users), the backups are put on a machine with fairly open network shares, but they are replicated off-site by an outside machine via SSH, so any intrusion into the primary backup server cannot harm the off-site backup server; the worst it could do would be to corrupt an in-progress backup, but the historical backups would remain intact. That machine also has completely different credentials and no outward-facing network services apart from SSH.

I'm actually in the process of moving a lot of our network storage to FreeNAS/ZFS because of the added resistance to cryptolocker via ZFS snapshots. I can make snapshots fairly often (5 minutes in my case) with minimal load/wasted space so it allows me a fairly precise roll-back window in case something goes wrong.
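That frequent-snapshot scheme could be sketched as a crontab fragment like the following (the dataset name `tank/shares` and the snapshot naming are my assumptions; FreeNAS also has a built-in periodic snapshot task that does this for you):

```shell
# Hypothetical crontab entry on the NAS: snapshot the share dataset
# every 5 minutes. ZFS snapshots are read-only, so a cryptolocker
# writing through SMB cannot modify them. (% must be escaped in cron.)
*/5 * * * * root zfs snapshot tank/shares@auto-$(date +\%Y\%m\%d-\%H\%M)

# After an infection, roll the dataset back to the newest clean snapshot
# (run manually; -r destroys any snapshots newer than the target):
# zfs rollback -r tank/shares@auto-20160502-1205
```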
Desktop: i7-4790K @4.8 GHz | 32 GB | EVGA Gefore 1060 | Windows 10 x64
Laptop: MacBook Pro 2017 2.9GHz | 16 GB | Radeon Pro 560
 
Thrashdog
Gerbil XP
Posts: 344
Joined: Wed Nov 03, 2004 1:16 pm
Location: Kansas City

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 2:10 pm

Ours is a much larger office, but our NAS is actually ZFS-based with its last storage tier as an Amazon S3 instance. It presents ZFS snapshots to Windows clients as VSS copies. Both those and traditional VSS shadow copies from our old Windows file server have shown resistance to CryptoLocker in our experience.
 
Aphasia
Grand Gerbil Poohbah
Posts: 3710
Joined: Tue Jan 01, 2002 7:00 pm
Location: Solna/Sweden
Contact:

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 8:23 pm

This is a side-trip regarding cryptolockers with regards to the backup part of the OP.

The different varieties of lockers we have seen where I do my daily work have been targeted locally at certain businesses, including their language, specific parts of the organisations, etc. Most of them, if not all, are pretty much an official-looking email that leads to a landing page where you download a document, often hidden behind a captcha, so automatic link-following and scanning/sandboxing can't download the actual malware before the user does. Behind the captcha, the landing page has been anything from what amounts to the Russian version of Google Drive, to hacked webservers containing a page buried several folders deep where the original site never links. But yeah, there have also been infected ad servers spreading popups that use browser vulnerabilities... very often Flash, Java, etc. One problem with many sandboxing technologies is that they take a few minutes to analyze and block the actual malware, so they often catch attempts only a few minutes after the first one. Now, depending on your security setup (scanning, sandboxing, proxies, etc.), you can often block the callback itself, since it might be classified as malicious even without your sandbox having seen it. So even if the download has gone through, most lockers don't activate without having done a callback.

I work for a security company as a consultant, and we also offer a free DNS service that anybody can use if they want to.
https://www.mnemonic.no/news/2015/free- ... s-service/

There are also other free services from certain companies that offer some measure of protection. Running EMET is a thing I would not be without if you are using Windows, and especially IE.
Blue Coat has their free K9 client you can use as well.



DrCR wrote:
Aphasia wrote:
And I also don't have it mapped as a drive at all, but always use an UNC path if I need to read something. The same practice goes for the fileserver, where I have a specific backup-user configured on the filesyncing software that has write access to the nas. And no mounted shares at all, everything is done through UNC paths.

Does that really matter? Just curious. I recall one poster recently stating the properly right way would be via SSH.

Depends what you are after. Many programs can use UNC but not SSH, unless you get something that specifically syncs over SSH. And whether UNC helps depends on the malware. Most lockers currently start encrypting all files you can access as a user, so any mapped drives where you have write credentials are considered a risk. So yes, not having it mounted, and only having read access from your running account and any admin accounts, is a very easy step, since you shouldn't have any need to actually access your backups unless you are doing a restore or maintenance. The read-only access is the most important part, but why have it mounted at all if you don't need it?

Now, there is malware that scans the network for hosts and looks for open shares as well, but so far I haven't heard of a cryptolocker-type malware that actually extracts credentials from installed syncing software and uses them to access the shares. It usually runs under your own account, often with admin rights if you haven't limited them, as is way too common in a home setting.
As for UAC, etc., that is quite easy to bypass as long as you can throw up some form of user interaction.

What makes the most difference is the user not clicking on any stupid sh*t that turns up in their browser or mailbox. But if they do, DNS blackholing, automatic blacklisting of URLs, etc. to block callbacks are nice things to have with regards to cryptolockers. It's only a matter of time until they get around such things as well, though the lockers themselves are not without vulnerabilities that can sometimes be exploited to decrypt the files.
 
LoneWolf15
Gerbil Elite
Posts: 963
Joined: Tue Feb 17, 2004 8:36 am
Location: SW Meecheegan

Re: NAS Backup Best Practices with CryptoLockers in mind

Mon May 02, 2016 8:55 pm

I manage multiple client backups. We use Storagecraft ShadowProtect (a very decent program) and this is our setup.

1) Servers are backed up to a local NAS. Backups start with a base image, and then incrementals happen after that. You can set incrementals however you want; if one a day is all you need, fine. If you want every fifteen minutes, fine. It's all dependent on your storage, and the I/O speed of your servers (faster drives in your servers = less I/O lost to backup). Backups are full image backups of volumes; in an emergency, your server could be booted from optical or USB media and bare-metal restored. Backups are also AES encrypted; only those with the password can open the backups. Individual files can be restored by mounting the backup as a read-only drive (the read-only default method of mounting prevents it from being infected, though it is possible to mount a backup in read-write if needed). Each server being backed up onsite requires a license.
2) Incremental backups are collapsed over time into daily, weekly, and monthly files through a component called Image Manager. You can retain the non-collapsed files for a period of time, but collapsing these files de-duplicates redundant data in the chain. These collapses happen automatically at a set time every day.
3) Through the Image Manager component, the backup files can be transferred offsite via FTP. We transfer only the daily collapses offsite (which in turn are collapsed at intervals when received at the offsite location). We use an FTP server program to receive all of these into offsite storage. As the backups are encrypted and part of a chain, someone who managed to intercept a single file would be unable to open it, both because they would need the encryption password and because you need the full backup chain to mount a restore. Each offsite backup requires a software license.

Theoretically, you could even do more and offsite those collapses to cold storage somewhere like an Amazon or Rackspace-hosted FTP server running the Image Manager component, though that would require additional offsite transfer licenses. The program is extremely reliable, and support is good. Also, you have the ability to spin a backup into a VirtualBox VM if necessary for emergency operations; you just need enough horsepower on whatever machine you're running to do the job.

The primary key to stopping ransomware isn't just good backups though. It's educating your staff to the ways they can get ransomware. The typical e-mail with a .zip or .js/.jse attachment, pretending to be a resume, or a FedEx/DHL/UPS shipping notice, or an invoice. Not clicking on ads in their web browsers when at work. And that if they see anything (like files they cannot access, or that desktop picture of their pet/family suddenly going blank for no reason) or odd text files showing up they didn't create, to TELL SOMEONE ASAP. The sooner you know, the better off you are. Make it clear you'd rather they went Chicken Little on you than decided they "didn't want to bother anyone" and three days later, you find your network drives are all encrypted.

https://www.backblaze.com/blog/cryptowa ... -recovery/
i9-9900K @4.7GHz, GIGABYTE Z390 Aorus Pro WiFi, 2 x 16GB G.Skill RipJaws V PC3000
Corsair 650D, Seasonic 1Kw Platinum PSU
2x HP EX920 1TB NVMe, Samsung 850 Pro 512GB 2.5", NEC 7200 DVDRW
Gigabyte RTX 2080 Super Gaming OC, Dell S2719DGF 27" LCD
