Backup, backup, backup. Most of us rely on our computing technology for work, home, and play to keep our data easily accessible. We’ve all experienced it: we’re working and all of a sudden the computer crashes, the hard drive goes bad, or the files we need become corrupted, and guess what? There’s no backup! What to do now besides cry? We at BVA believe keeping a backup, whether daily or weekly, is very important because you never know when something will go wrong. There are all kinds of backup solutions out there for businesses, from tape to disk to the cloud. But businesses aren’t the only ones that need to back up their files; we as individuals should keep backups of our personal data too. I back up my hard drive daily so that I always have the ability to go back and grab a file, maybe because I changed something and realized I need the original, or because the disk a file was on went bad and I need another copy.

I just found a new hard drive from Western Digital called My Book Live with WD 2go. This is Western Digital’s personal cloud storage device: it gives you your own cloud hard drive so you can access your data wherever you are. When we talk about cloud storage, we generally mean a service you purchase from a company that stores your data somewhere in their datacenter. With this Western Digital device, you get your own personal 3TB of storage. The device sits on your personal network, yet you can access it from anywhere. WD 2go provides fee-free remote access from your computer to your My Book Live. Another cool feature is the set of apps you can install on your smartphone or tablet that give you access to the files stored on your My Book Live as well. So wherever you go, your files are just a click away.
Backups are extremely critical for individuals and organizations to have in place. One solution I’ve found to be not only cheap and affordable for any user, but also extremely safe and reliable, is Backblaze: a $4-per-month service that provides unlimited storage and end-to-end encryption, and integrates seamlessly with both Mac and Windows environments. Not only is this solution great for anyone, it’s also extremely easy to set up and use, accessible anywhere through an easy-to-use website, and it automatically keeps itself in sync while providing you reports of job successes and failures.
- Unlimited Storage
- External Drive Support
- Military-Grade Encryption
- Continuous Backup
- Automatically Finds Files
- Automatic Throttle
- Locate Computer
- Free Web Restore
- Restore to USB Hard Drive
- Restore to Flash Drive
- File Versioning
- 11 Languages
Starting with Windows 8, Microsoft is implementing reset and refresh features that allow you to essentially reset your PC back to factory defaults, much like resetting a router. I think this is a pretty neat feature, as it will let you refresh your machine in a timely manner. It beats the Windows 7 solution, which was to create a recovery disk upon first booting up the PC; in most cases people simply ignored those prompts.
There will be several options for this reset and refresh.
Quick: This mode just resets your computer to factory defaults and takes approximately 6 minutes.
Thorough: This mode writes random patterns to the sectors on your drive, which makes it much harder to recover personal data from the disk. Microsoft estimates that the process will take about 24 minutes without BitLocker encryption.
Refresh: This mode actually backs up your apps, personal data, and settings; it is claimed to take about 8 minutes to complete. The downside is that it only backs up and restores Metro-style apps approved in the Microsoft store. The upside is that it creates an HTML file on the desktop listing the applications that were removed.
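To make the Thorough mode concrete, here is a minimal sketch of the underlying idea: overwriting data with random patterns so the old contents are hard to recover. This is an illustration only; Windows works at the disk-sector level, while this hypothetical helper just overwrites a single file.

```python
import os

def thorough_wipe(path: str, passes: int = 1, chunk_size: int = 1024 * 1024) -> None:
    """Overwrite a file with random bytes so its old contents are hard to recover.

    Illustrative sketch of the "write random patterns" idea behind the
    Thorough reset mode; a real secure erase targets raw disk sectors,
    not individual files.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = size
            while remaining > 0:
                n = min(chunk_size, remaining)
                f.write(os.urandom(n))  # random pattern over the old data
                remaining -= n
            f.flush()
            os.fsync(f.fileno())  # force the overwrite out to disk
```

More passes raise the time cost, which is why Microsoft quotes roughly 24 minutes for Thorough versus 6 for Quick.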
Can’t wait to test…
Acronis has some issues that need to be addressed. BVA has been a serious fan of this product, but of older versions that were rock-solid; we are very disappointed with version 2012. We even worked with Acronis directly, and their most tenured technician had problems getting it to work; to speak candidly, that technician never got it set up correctly. BVA obtained a fully registered copy of Acronis True Image 2012 and installed it on a brand-new server with plenty of resources as well as a hefty storage target with about 7 TB of usable disk space for backup images. This was the first time we installed this version, and we had issues at every turn. A summary of the problems is as follows:
Once installed, Acronis started causing problems, including:
- Incredibly slow server performance: everything ran very slowly, program launches were enormously time-consuming, and things would freeze.
- Once Acronis was uninstalled during testing, everything ran as fast as before.
- The server agents refused to load properly, and once we got them to install they could not be seen by the backup server (main console).
- Acronis took several minutes to show its interface, and launching a backup often froze the server.
- BVA found the restoration process problematic and painful.
- When we started a recovery from backup, Acronis restarted the server, which is ridiculous, and we would get errors such as “Impossible to read the source. Try the destination again.”
BVA then reached out to support and chatted with an Acronis technician, described by a GM of the local office as a guru. We sat on the phone for two hours and reconfigured everything from scratch, and they still could not get the backup to work properly. The biggest issues we see with this product are as follows:
- Difficult to install and configure, needs to be a lot easier
- The indexing of the data takes too long
- The verification process takes way too long
- The process associated with multi-level backups is very tough: one job needs to complete before another starts. We tried to set up a single job that backed up the data to drives and then to tape; this simple process took almost two days for 4 TB.
Over the last five years I have seen a more passive approach to backup and disaster recovery. Organizations are letting data reliability take a back seat to system uptime and performance, which is starting to become scary. I typically ask CEOs and owners what an acceptable amount of downtime for their business would be, and they all reference about 2 to 4 hours. It always amazes me what expectations people in power have about how quickly their systems can get back up. What is never taken into account is how long it takes to build the new system, along with the time-consuming process of moving data from one location to another; this is always overlooked in normal system installations. Many businesses feel their system can be up in 4 to 5 hours, but when we review and assess a small to medium-sized business, we typically find that the average rebuild time for a single server that suffers a disaster is roughly 10 hours. Of course, the 10 hours for a single server consists of:
- server build via operating system install and patching
- application set up and configuration
- shares/drive set up
- data migration
- testing and validation
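A rough way to sanity-check a recovery-time expectation is simply to add up the steps. The per-step durations below are illustrative assumptions, not measured figures, chosen to show how quickly a single-server rebuild approaches the 10-hour mark.

```python
# Illustrative per-step estimates (in hours) for rebuilding a single server.
# The individual values are assumptions for this sketch, not measurements.
rebuild_steps = {
    "OS install and patching": 2.5,
    "application setup and configuration": 2.0,
    "shares/drive setup": 0.5,
    "data migration": 4.0,
    "testing and validation": 1.0,
}

total_hours = sum(rebuild_steps.values())
print(f"Estimated rebuild time: {total_hours} hours")
```

Even generous estimates for each step land well past the 2-to-4-hour window most owners expect.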
It is very important to build and structure a network system that can facilitate an agreed level of downtime. In other words, if management decides that the network can only be down for 4 hours, no matter what time of day it might be, that will drive a completely different backup system and methodology than if BVA is told that 12 hours is satisfactory between 8am and 5pm on weekdays. Documenting the process and timeline for bringing the system back up is critical.
Many businesses are looking to move their data into the cloud, and it is often suggested to BVA that this is a cheaper alternative to onsite backup, but I can tell you that is not the case. Moving data offsite in a reliable and consistent manner can be a bit tricky depending on the solution. For the solution to thrive, you need a reliable telco provider, such as fiber, as well as a stable power grid. Depending on the solution, data can cost roughly $4 to $12 per gigabyte (GB) depending on the compliance standard set for data retention (30 days, 12 months, 5 years, 7 years). There are several great software packages out there that can be loaded on any server and are completely hardware agnostic. This software drives the backup job and can point it to any iSCSI target. It can also move the data offsite to any destination you prefer, and the software you select will typically provide that option via several data centers. Microsoft, Google, Amazon, and even Apple are a few that have gotten into this business and will continue to grow as large backup solution providers.
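The per-GB pricing above makes the onsite-versus-cloud comparison easy to run yourself. The $4–$12 per GB range comes from the figures quoted here; which end of the range a given retention policy lands on is an assumption for this sketch.

```python
def monthly_backup_cost(gigabytes: float, price_per_gb: float) -> float:
    """Estimated cost for offsite backup at a flat per-GB rate.

    The $4-$12 per GB range is from the article; mapping a retention
    policy (30 days vs. 7 years) to a point in that range is assumed.
    """
    return gigabytes * price_per_gb

# 500 GB retained 30 days, at the low end of the range:
low = monthly_backup_cost(500, 4)
# The same 500 GB under a 7-year compliance policy, at the high end:
high = monthly_backup_cost(500, 12)
```

At these rates even a modest 500 GB data set runs into the thousands of dollars, which is why "the cloud is cheaper" rarely survives the math.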
BVA gets many inquiries about how organizations should back up local desktops. Of course, as technical professionals, BVA recommends keeping no important data on the desktop, but nevertheless it is a subject that management regularly raises. Having the ability to build and push an image to a user desktop is a great, easy way to ensure user satisfaction and lower administration time. This is becoming a common consideration when moving email into the cloud, such as Exchange Online with BPOS/Office 365: there is currently no way of backing up the mail store, which is painful. The workaround is to back up the local OST file from the desktop to local disk. Of course, for restoration purposes you really need to convert the OST to a PST, which typically requires another migration tool that is not free and must be purchased per mailbox. A great way around that is to simply image the desktop and keep that image locally on slower or cheap disk. That covers you on many levels from our perspective and leaves multiple copies of the data on different sets of hardware, something to consider.
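The OST workaround above can be sketched as a simple timestamped copy job. The paths and function name here are hypothetical examples, not part of any product; note that Outlook locks the OST while running, so it must be closed before the copy.

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_ost(ost_path: str, backup_dir: str) -> Path:
    """Copy a local Outlook OST file to a backup folder with a timestamped name.

    Hypothetical helper for illustration: real OST files live under the
    user's profile, and Outlook must be closed first since it locks the file.
    """
    src = Path(ost_path)
    dest_dir = Path(backup_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = dest_dir / f"{src.stem}-{stamp}{src.suffix}"
    shutil.copy2(src, dest)  # copy2 preserves timestamps, handy for versioning
    return dest
```

Keeping several timestamped copies on cheap local disk gives you the "many copies on different hardware" coverage described above, even though a restore still needs an OST-to-PST conversion tool.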
The popular software we are using right now is as follows:
- Acronis® Backup & Recovery™ 11 Advanced Workstation – Use the console to manage machines locally or remotely. Includes Acronis Management Server for a single point of centralized management. Group machines into static or dynamic groups, automatically include multiple machines or groups of them in a backup task, monitor backup and recovery activities on all machines from a single place, and build customizable reports.
- Symantec Backup Exec 2010 Desktop and Laptop Option – With the majority of business-critical information residing outside the data center or off corporate servers, protection for desktops and laptops is a must. The enhanced Desktop and Laptop Option delivers continuous data protection to desktops and laptops whether in the office or on the road. Beyond improving data protection and efficiency, this option enables users to restore their own files and maintains synchronization between multiple desktops and laptops so the most up-to-date file versions are available on all of a user’s computers. Because the Desktop and Laptop Option does not require a dedicated stand-alone server as competing products do, it easily integrates into existing IT infrastructure and policies, helping lower the total cost of ownership. The new push-install functionality from within Backup Exec centralizes deployment. Backup Exec 2010 includes support for Windows 7, Windows Vista, and Windows XP 64-bit, as well as delta file transfer, reducing the total amount of data being backed up. With this release, the option is integrated with Backup Exec Retrieve (available with the Backup Exec Continuous Protection Server) for even simpler file recovery.
BVA really likes vSphere and has a lot of confidence in the product and offering. On July 12 VMware launched vSphere 5, a cloud computing infrastructure suite that is essentially a one-stop virtualization shop. An all-in-one really, which is very impressive. vSphere 5, whose predecessor vSphere 4 came out about a year ago, is the largest integrated software product ever launched by VMware, adding four completely new modules: vCloud Director, vShield 5.0, vCenter SRM 5.0, and the optional vSphere Storage Appliance. VMware also announced that an iPad version of the management interface is available in the Apple App Store. Here are 10 of the most important points made at the July 12 VMware announcement event in San Francisco.
- Cloud Infrastructure Suite
- Virtualize Critical Business Applications
- Path to 100% Virtualization
- Simple Self Service
- Major Upgrade for the Infrastructure Suite
- Simplifying Adoption to vSphere
- IT Transformation Journey
- Hybrid Cloud Models
- vCloud Director
- vSphere 5
BVA has been performing many virtual implementations in recent months, and over the course of 10 years we have been involved with many different types of SANs on the market. Shared storage for small to medium-sized businesses is starting to become the norm, and with lower price points for virtual server software it is more advantageous for companies to go with that architecture. Our most common project these days is moving 6 to 15 production servers onto a three-node virtual cluster architecture with shared storage, leveraging VMware or Hyper-V. After testing and playing with a few different SANs, BVA’s favorite is the NetApp FAS 2040. It offers up to 136 disks (136 TB) of storage capacity, with FC-SAN, IP-SAN (iSCSI), and NAS (CIFS/NFS) protocol support.
It is fully SAS capable, which is more than enough to handle the typical I/O for small businesses today, and fully Fibre Channel capable, supporting all-SATA or mixed FC/SAS/SATA disk configurations. Single-controller and dual active-active controller models are available at an aggressive price point. Each controller has two 4Gb FC ports, four GbE ports, and one SAS port, which is very versatile, along with 4 GB of cache, the standard configuration in our typical installation.
The NetApp 2040
The FAS 2040 system offers unified file and block storage: one solution for CIFS, NFS, iSCSI, and FC SAN storage protocols. The unit’s Data ONTAP operating system provides storage efficiency through higher utilization of capacity via thin provisioning and Snapshot™ technology. The unit is also very scalable, with the option to add more drives to the original enclosure, which means being able to combine existing and expanded data-management resources in a fast, elegant solution.
Using the MozyHome service? You may not be aware, but your unlimited storage is, how can I say it, no longer unlimited. Mozy has decided that it is in its best interest to no longer offer unlimited plans for the MozyHome service and to go with tiered plans instead.
The way it is set up now, you can get 50 GB of storage for $5.99 for 1 computer, or 125 GB for $9.99 for 3 computers. Mozy gives 3 reasons for the change:
- We are all capturing more photos and videos at higher quality and resolution
- Multiple Machines are the Norm
- No Fine Print Backup
“In an attempt to remain viable in the ‘unlimited backup’ business, other providers have introduced measures to inhibit the growth of storage such as bandwidth throttling, excluding files over a certain size, or excluding certain file types such as photos and videos from the default backup sets. Rather than claiming to offer ‘unlimited’ backup while imposing bandwidth or file limitations, we want you, our users, to decide what you want backed up, and in return Mozy will provide you the best possible service level, with no hidden restrictions.” – From the MozyHome website.
According to Mozy, the vast majority of its customers will be happy with the 50 GB plan, as their data will fit comfortably within the 50 GB that can be purchased for a dollar more than the extinct unlimited plan.
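Working out the per-GB price of the two tiers quoted above takes only a line or two; the plan prices are from this post, and the comparison itself is just a sketch.

```python
# Per-GB monthly cost for Mozy's tiered plans, using the prices quoted above:
# $5.99 for 50 GB (1 computer) and $9.99 for 125 GB (3 computers).
plans = {
    "50 GB / 1 computer": (5.99, 50),
    "125 GB / 3 computers": (9.99, 125),
}

for name, (price_usd, capacity_gb) in plans.items():
    per_gb = price_usd / capacity_gb
    print(f"{name}: ${per_gb:.3f} per GB per month")
```

The larger tier works out noticeably cheaper per gigabyte, which fits the usual pattern of tiered pricing rewarding heavier users.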
In my personal opinion, the unlimited data plans were excellent for the consumer but not for the company as a whole. It appears Mozy is trying to get a better handle on the costs of adding storage by limiting the amount it needs at any given time. With these tiers in place they will be better able to manage and maintain current and future storage needs, and although they most likely have plenty of storage available, it helps to be able to forecast future usage. A bummer for the customer, good for Mozy.