One of the hard drives on my work PC crashed a couple of days ago. My
work PC is (or rather, was) configured with an SSD for a boot drive, and
two regular SATA drives, in a RAID 0 configuration, for a secondary
data volume. It was one of those SATA drives that failed. Since RAID 0
doesn't have any redundancy built in, that killed the volume.
The only data I had on that volume was the set of files for my VM. The way we have developer machines configured here, general productivity stuff (Office, etc.) goes on the boot volume, and all the developer stuff goes on the VM. The setup for developing for Dynamics AX is fairly complicated, so it makes sense to do it on a VM.
Unfortunately,
we don't have any facility set up for backing up our VMs. Also,
between the way AX stores source files and the way we have TFS set up,
we don't always check in code daily, nor do we have a simple way of
backing up in-progress changes that haven't been checked in. The end
result is that I lost about two days' worth of work on my current
project.
I had, at one point, written a
backup script (using PowerShell and 7-Zip) to back up the My Docs
folder on the VM to the My Docs folder on the physical machine, but I
had never set it to run on a schedule, so the backup file there was
about a week old. That meant I also lost a few SQL files, some test
spreadsheets, and one quickie VS 2010 project that I'd written to test
a web service. Oh, and I was keeping the backup script itself (plus
some other scripts) in a 'util' folder on the root of the VM's C:
drive, so those didn't get backed up either, and were lost.
So
the takeaway from all of this, of course, is that I need to do what I
can to get around the limitations of the environment I'm working in, and
set up some automated backup procedures.
In
terms of backing up the My Docs folder, I rewrote my lost PowerShell
script and set it up in Task Scheduler to run daily at 6 PM. It ran fine
last night, and I think it'll keep working fine on a continuing basis.
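For reference, here's a minimal sketch of what that kind of script can look like. The paths, share name, and 7-Zip location below are placeholders, not my actual setup:

```powershell
# Zip up the VM's My Docs folder and drop the archive on a share that lives
# on the physical machine. All paths here are placeholders.
$sevenZip = 'C:\Program Files\7-Zip\7z.exe'
$source   = Join-Path $env:USERPROFILE 'Documents'
$destDir  = '\\HOSTMACHINE\Backups'        # share exposed by the physical machine
$stamp    = Get-Date -Format 'yyyy-MM-dd'
$archive  = Join-Path $destDir "MyDocs-$stamp.7z"

# 'a' adds to an archive; '-r' recurses into subfolders
& $sevenZip a -r "$archive" "$source\*"

if ($LASTEXITCODE -ne 0) {
    Write-Error "7-Zip returned exit code $LASTEXITCODE"
}

# Scheduling is just a daily Task Scheduler job pointed at the script, e.g.:
# schtasks /create /tn "MyDocsBackup" /sc daily /st 18:00 `
#   /tr "powershell.exe -File C:\util\Backup-MyDocs.ps1"
```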
In
terms of backing up in-progress work in AX, I extended the 'startup
projects' class that I blogged about
recently to also let me
export all of my active projects. I have it exporting them to a folder
under the My Docs folder, so if I run the export at the end of the day,
prior to the file-system backup, I should always have a backup of my
current work in a format that I can re-import into AX, if need be.
There
are still some big holes in this system, including the fact that I have
to remember to run that export daily. But it's a good start. I'd like
to add some extra pieces to this process, including daily SQL backups,
and maybe a push of certain backup files to the cloud. The SQL backups
are kind of hard, since the AX test database is 70 GB. And my employer,
for some reason, likes to block access to cloud-based backup and
storage providers, so I can't just copy stuff into a Dropbox folder;
that part's a little tricky too.
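If I do get around to the SQL piece, something along these lines might work for a nightly compressed backup. The server, database, and path names are made up, and whether compression shrinks 70 GB down to something I can actually copy around is another question:

```powershell
# Hypothetical nightly backup of the AX test database, using sqlcmd and
# SQL Server's native backup compression. Server, database, and path are
# placeholders.
$server   = 'localhost'
$database = 'DynamicsAX_Test'
$stamp    = Get-Date -Format 'yyyy-MM-dd'
$bakFile  = "D:\Backups\$database-$stamp.bak"

$query = "BACKUP DATABASE [$database] TO DISK = N'$bakFile' WITH COMPRESSION, INIT"

# -E uses Windows authentication; -b makes sqlcmd exit non-zero on error
& sqlcmd -S $server -E -b -Q $query

if ($LASTEXITCODE -ne 0) {
    Write-Error "SQL backup failed with exit code $LASTEXITCODE"
}
```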
I've also
considered setting up a local Mercurial or Git repo, checking in the AX
export files every day, and pushing them up to a private Bitbucket repo.
This would give me offsite backup, with the added benefit of increased
granularity and visibility, but it would probably violate one or more
corporate policies.
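If that ever became an option, the daily check-in itself would be trivial to script; something like the following (Mercurial shown; the export folder and the Bitbucket remote are hypothetical, and the remote would be configured as the repo's default push path):

```powershell
# Hypothetical daily check-in of the AX export folder to a local Mercurial
# repo, followed by a push to a private Bitbucket remote.
$exportDir = Join-Path $env:USERPROFILE 'Documents\AXExports'
Set-Location $exportDir

# Pick up new, changed, and deleted files in one shot
hg addremove
hg commit -m ("Daily AX export {0}" -f (Get-Date -Format 'yyyy-MM-dd'))

# Push to whatever the repo's default path points at (e.g. a private Bitbucket repo)
hg push
```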
As a follow-up to this post, I'm going to write a few more posts about some of the scripts I'm using now.
Labels: hardware, software