Photographers, heck… everyone, learn from my failure… PLEASE!


For those who follow me on social networks, you may have noticed that I’ve been somewhat quiet regarding publishing new images over the past couple of months. There is a very good reason for that, and I’m here to share with you why that is, and hope that you will heed the message of lessons I learned… the hard way.

Part 1: Setting up the scenario

May 2017 was a busy and exciting month for me with two major photo shoots planned; the Planes of Fame Air Show in Chino, CA and the Massachusetts Military History Expo in Orange, MA. Both went off without a hitch and I returned home to begin the exhausting process of editing over 15,000 exposures.


One of the published teaser images from the Planes of Fame Air Show in Chino, CA

Up until now, my post-processing workflow has usually been as follows:

  1. Upon getting home, offload my RAW files from the SD or CF cards to the scratch drive on my Dell XPS 8900 workstation. I usually break out the scratch area as “Photos > Year > Event,” with the RAW files and Adobe Bridge files at the top of the event folder, a “Working” sub-folder where I store Adobe Photoshop files (.PSD) as needed, and a separate folder for the rendered JPG image files.
  2. Open the new event folder in Adobe Bridge and go through all the images, labeling the ones I want to select for editing, converting to a panorama, or converting to High Dynamic Range (HDR) Tone Mapped images.
  3. Filter the view based on what I want to edit.
  4. Begin the edit process in Adobe Camera Raw (which is done via right clicking on the image in Adobe Bridge’s interface).
  5. Once all those images are edited, I change the filter to show those shots I want to convert to panoramas and edit those.
  6. Once all those images are edited, I change the filter to show those shots I want to convert to HDR Tone Mapped TIFFs and edit those. Then I’ll edit those TIFF files in Photoshop and save the .PSD files in a “Working” sub-folder for later rendering to JPG.
  7. Render all edited images to JPG with a watermark and reduced resolution for publishing to social media outlets, placing them in a sub-folder named “JPG for FB” or the like.
  8. Render all edited images to JPG without watermarks and at full resolution into a “JPG for Print” folder. These will be uploaded to my SmugMug site and leveraged for print orders.
  9. Upload those full resolution images to SmugMug and add Metadata tags for exposing those new images to search results and add Titles and Descriptions.
  10. Make a copy of the Event folder with all edited and final files to my separate internal Archive drive.
  11. Rinse and repeat for the next event.

It’s a fairly straightforward process and has served me well for the last 12 years (and people think we photographers just offload directly from the camera to Facebook *insert maniacal laugh here*).
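The folder layout in step 1 can be scripted so every new event starts out consistent. Here is a minimal sketch in Python; the folder names mirror the ones described above, and `make_event_folders` is a hypothetical helper of my own naming, not part of any Adobe tool:

```python
from pathlib import Path

# Sub-folders used in the workflow above; names are illustrative.
SUBFOLDERS = ["RAW", "Working", "JPG for FB", "JPG for Print"]

def make_event_folders(scratch_root, year, event):
    """Create Photos > Year > Event plus the standard sub-folders."""
    event_dir = Path(scratch_root) / "Photos" / str(year) / event
    for name in SUBFOLDERS:
        (event_dir / name).mkdir(parents=True, exist_ok=True)
    return event_dir

# Example: set up the folder tree for a new air show shoot.
# make_event_folders("D:/Scratch", 2017, "Planes of Fame Air Show")
```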


An action shot teaser from the 2017 Massachusetts Military History Expo

Part 2: That fateful day of realization

Fast forward a couple of months to the end of June. I have been editing the daylights out of both events, periodically posting highlights to my Photography Facebook Page as teasers. Picture me as a man on a mission to deliver to the reenactors, vendors and staff of the events. If I had to guess, at that point I am probably over 150 hours into editing, with over 400 images edited and rendered, and many more to go before I am ready to engage in step 10 of my process (copying the files to my archive disc as a basic backup). My Photoshop files range from 100MB to 1GB each. So at this point, there are well over 1.5TB (that’s 1,500+ GB, or 1,500,000+ MB) worth of files on the scratch drive. That’s no chump change in the average person’s storage world.

It is another evening after a long day at my day job, when I sit down to get back to editing air show images. I open up Adobe Bridge and notice that the thumbnails are not showing up as quickly as they normally would. This doesn’t concern me at first as I remember that I periodically should clear the thumbnail cache per many online articles. No matter. I clear the cache and get back to editing. This should have been yellow flag number one to me.

A few nights later, I attend an Independence Day party at my friend’s house. I bring my camera gear and have fun doing an impromptu fireworks capture course with a person who is just getting into photography. Nothing makes me happier than to talk shop and exchange knowledge about photography. It is a fun evening and I go home with many shots that I am happy with. Once home, I begin the prescribed workflow and offload my RAW files from my SD card to my scratch disc. The files copy at the usual rate and I open that folder in Adobe Bridge so I can start applying labels in my review pass.

Once again, the thumbnails are not coming up. After five minutes of waiting for Bridge to build the thumbnails, I decide to go to the other folders with the MA Military History Expo shots so I can continue editing those. But those files aren’t showing up! Wait… WHAT??? That’s not right! Lumps start forming in my throat as panic starts to set in.


An action shot teaser from the 2017 Massachusetts Military History Expo

Part 3: Technical Triage

My 20+ years of IT experience kick in as I immediately go into DEFCON 3 troubleshooting mode. I start to inventory the symptoms and investigate causes. In the past, slow disc performance has usually been indicative of hard drive fragmentation. Realizing I hadn’t defragged that scratch drive since I got the computer a year and a half earlier (bad Dan… BAD), I install Smart Defrag and go to town analyzing the disc (which comes back 1.8% fragmented with no bad sectors… not too bad, all things considered) and begin an overnight defrag process. Editing for the night would be a washout. Tired, I call it a night and retire to bed.

The next morning, I stumble downstairs, unlock my computer and surprisingly see that the defrag process had only managed to get to 32% complete. In my experience, a 2TB drive like this should be done defragging and optimizing in less than 4 hours at that percentage of fragmentation. OK. Ummmmm… I’ll let it continue to run, while I go to my day job. It should be done by the time I get back. Off to work I drive with a ball of concern souring my stomach.

That evening I unlock my computer with trepidation. Sure enough, the defrag process hadn’t made much progress at all. Red flags are frantically waving and sounds of submarine dive klaxons are ringing in my head. It’s at this time that I start realizing that I hadn’t manually backed up any of my files from this year to my archive drive. CRAAAAAAAAP!!!

For the next two days, I scramble to copy whatever files I can from my D: drive to my archive drive. Transfer rates waver from 700 Mbps to 10 Kbps, and copy processes essentially hang for hours on end. While all this is going on, I leverage Google to see what else I could try to rectify the situation. Come to find out, this particular model of hard drive has an estimated 30% failure rate in the wild. Not good at all compared to other brands and models, and I’m hoping that Dell no longer carries this brand of hard drive. All the while, the likelihood of needing to engage a data recovery service is growing, with costs mounting as a worst case scenario unfolds.

An urgent call for help goes out to my Facebook network, and many helpful friends step up to the plate to offer assistance. With each suggestion, I am stubbornly met with resistance from my dying Toshiba hard drive. An old high school friend even emerges from seclusion to offer to retrieve the files at a low level using a data recovery tool called Spinrite. After three hours of struggling to get a bootable thumb-drive configured to work with my computer’s BIOS, Spinrite confirms my fears: the drive has suffered a catastrophic failure. “This partition exceeds the size of this drive as defined by the system’s bios or bios extension.” Spinrite then strongly urges discontinuing use of the product, as continuing could result in irreversible damage to the drive.

Literally tens of thousands of files: images, working files, tax files, business documentation. Hundreds of hours invested in edits and creation effectively lost… gone… dejection sets in, in a major way.


An evening bivouac teaser shot from the 2017 Massachusetts Military History Expo

Part 4: Resurrection services

As anyone who has actually looked into data recovery services has found out, these are not cheap by any stretch of the imagination. They start at $150 for the best case scenario (a Tier 1 recovery) and have been known to soar into the tens of thousands of dollars in the worst cases (a Tier 3 recovery, where the damaged hard drive usually needs to be rebuilt in a clean room and custom software needs to be written to recover the data). Up to this point, I have given my all to save what files I can. I manage to save all my tax files and business documentation. But I am unable to copy my 2016 and 2017 image folders (although luckily I had previously copied all of the 2016 folder), as well as my audio folder, where I keep songs I have written, loop files for Acid Pro and mp3 files acquired over the years.

Again, more due diligence is performed to find a reputable local service. My research leads me to Proven Data Recovery which has a branch in Hartford, CT. When you engage a data recovery service, you’ll need to create a service ticket on their site. This requires being as detailed as possible so they can give you a more accurate estimate. I provide a detailed account of what happened, the make and model of the hard drive (which was only a year and a half old), and all the troubleshooting steps I have taken.

A few days later, I receive a rough estimate: $150 – $700. *gulp* Once they have the hard drive in house and figure out what the real problem is, I’ll get a more accurate quote. That upcoming Monday, I drop off the hard drive at their Hartford drop-off point after filling out a form detailing the service request, along with my credit card information, which will be charged when the work is complete. They won’t actually charge my card until I receive the final quote, consent to the work, and they manage to recover data. Proven Data Recovery has a policy of not charging if they cannot retrieve the files.

A week later, I have the prognosis and final quote. The hard drive had a head failure, which caused bad sectors and the inability to properly index and find the files. The drive will need to be rebuilt with spare parts in a clean room, and software custom coded to recover the data. In essence, a Tier 3 recovery. Once the files are recovered, they will provide a listing for me to review to make sure everything is there. After the review, they will copy the files to a new hard drive and ship that to me. All for the princely sum of $1,470… more if I want the job done faster than 6-10 business days. Knowing the labor involved in the process, that’s more than a fair fee to ask. By this point, I have already resigned myself to this expense, with the measly assurance that it is a tax deductible expense for my business. And so I sign the consent form for them to begin.


One of the published teaser images from the Planes of Fame Air Show in Chino, CA

Part 5: Biting the bullet with a better backup plan

During the wait for the data recovery technicians to work their magic, I am forced to look at my “backup plan” and its shortcomings. It was a rudimentary plan at best, one that required the discipline to manually copy files to a hard drive sitting in the same computer chassis as the working drive I was copying from. Not a safe idea at all. What if the computer gets hit with a power surge despite my UPS with battery backup? What if my living space burns to the ground in a fire? I’d be in an even worse position. Clearly my current backup plan isn’t going to cut it.

The first thing I need to address is automation. With everything I have going on, I cannot afford to let the copying process slide for any reason. I need something that’ll handle the automatic syncing/copying. In addition, I need to copy the files to a location that is external to my computer and on a separate power source to help mitigate power related damage. Lastly, I need to have something in place to account for natural disaster related losses… a cloud storage service.
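Dedicated tools (a NAS vendor’s sync apps, rsync, Windows robocopy) do this properly, but the core of an automated mirror is simple enough to sketch. Here is a minimal one-way sync in Python, written as an illustration of the idea rather than a replacement for real backup software; it copies only files that are new or changed since the last run:

```python
import shutil
from pathlib import Path

def mirror(src, dst):
    """One-way sync: copy files from src to dst when missing or changed.
    Change detection is by size and modification time, like most sync
    tools' default mode. Deletions are NOT propagated to the backup."""
    src, dst = Path(src), Path(dst)
    copied = 0
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if (not target.exists()
                or target.stat().st_size != f.stat().st_size
                or target.stat().st_mtime < f.stat().st_mtime):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied += 1
    return copied
```

Scheduling something like this with Task Scheduler or cron gets you the “set it and forget it” behavior; a proper NAS sync app adds deletion handling, retries and versioning on top.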

For the local backup, I decide on a Network Attached Storage (NAS) box. This is a stand-alone system that can be used for multiple purposes, including backup storage, media server and email server, among other functions. There are many brands (some of the more popular being Synology, QNAP and Buffalo) and many configurations. Each of these boxes has slots that house at least one hard drive, and many hold several more. They also have an on-board CPU and memory (RAM), just like your computer. Many solutions come with a user interface and apps that facilitate various functions. So your choice of NAS needs to be driven by your needs. How do you want to use the box? How much storage do you need now, and how much will you need in another year or two (think expansion)? Go to various websites and read the reviews… good and bad. Figure out what has the best reliability, usability/feature set and cost based on your needs.

In my case, I opt for a Synology DS916+ (8GB RAM version) with four Western Digital Red SATA 6TB 5400RPM NAS drives. Running Synology’s hybrid RAID in a one-disc fault tolerance configuration leaves me with 15.7TB of available storage for backups, while allowing one of the four hard drives to fail without any data loss. This theoretically gives me the opportunity to get a replacement hard drive shipped and installed while still having the backed-up data available. In addition, Synology seems to have the better apps for what I need. Primarily, that is to actively sync folders on my computer whenever there are changes, so that as I work, my files are simultaneously backed up to the NAS. There are also apps that will sync specified files/folders to a cloud storage service.
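The usable-space math works out roughly like this: with one drive’s worth of capacity given over to fault tolerance, 4 × 6TB leaves 18 decimal terabytes, which is about 16.4 TiB before filesystem overhead; Synology reports a bit less, hence my 15.7TB. A quick sketch of the arithmetic, assuming SHR with one-disc redundancy on equal-size discs behaves like RAID 5 (which it does in this case):

```python
def usable_tib(num_drives, drive_tb, redundancy=1):
    """Approximate usable space for equal-size drives with N drives'
    worth of redundancy (SHR on equal discs acts like RAID 5 at N=1).
    Drive makers quote decimal TB; the OS reports binary TiB."""
    usable_bytes = (num_drives - redundancy) * drive_tb * 10**12
    return usable_bytes / 2**40

# Four 6TB drives, one-drive fault tolerance:
# usable_tib(4, 6)  ->  ~16.37 TiB before filesystem overhead
```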


One of the published teaser images from the Planes of Fame Air Show in Chino, CA

Speaking of cloud storage services, this is the final safety net in my new backup process, which (hopefully) will mitigate the natural disaster scenario. You may have seen commercials for these kinds of services, like iDrive and Carbonite. There are other services as well from major players like Google, Amazon and Microsoft. Like the various NAS solutions, the cloud storage solutions have various features and costs. If you’re a home user, chances are you won’t need a lot of space or bandwidth for transferring files, so a service like Carbonite will do the trick for you. But as a seasoned photographer, I have terabytes worth of files that need to be stored, so a typical consumer level service will not suffice. Downsides to many consumer services are the time it takes to upload files to the service initially (in many cases it literally takes months), disaster recovery retrieval times and cost per TB. To be fair, your Internet Service Provider is also a factor in transfer speeds. If you don’t have a plan that offers fast upload and download speeds, expect your transfer times to be much longer.

Business plans are better suited for people with high storage requirements and high availability needs for mission critical files/data. Based on recommendations and research, I am leaning toward a business plan, and Backblaze B2 is in the lead for my business at $5/TB per month for storage and $20/TB for downloads. They have a very good industry reputation and are well established. Another selling point is that they have an app for my particular NAS box that facilitates targeted syncing on a schedule I prescribe.
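At those rates, the ongoing cost is easy to estimate. A small sketch using the pricing quoted above ($5/TB per month stored, $20/TB downloaded); compare the worst-case restore figure against a $1,470 Tier 3 recovery bill:

```python
STORAGE_PER_TB_MONTH = 5.0  # USD per TB stored per month (rate quoted above)
DOWNLOAD_PER_TB = 20.0      # USD per TB retrieved (rate quoted above)

def monthly_storage_cost(tb_stored):
    """Recurring cost of keeping tb_stored terabytes in the cloud."""
    return tb_stored * STORAGE_PER_TB_MONTH

def full_restore_cost(tb_stored):
    """Worst case: pulling everything back down after a disaster."""
    return tb_stored * DOWNLOAD_PER_TB

# Keeping 2TB of RAW and edit files in the cloud:
# monthly_storage_cost(2)  ->  $10 per month
# full_restore_cost(2)     ->  $40 for a complete restore
```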

Now, given that cloud storage comes at a premium, both in storage cost and in bandwidth/time for uploading and downloading, you are not going to want to store all of your files in the cloud. Only essential files should go up there. In the case of a photographer, that means just the original RAW files and the edit files (Photoshop or Lightroom files, or Bridge .xmp sidecars). The rendered JPG files do not need to be uploaded, as you can always re-render them locally. You may want your tax files in the cloud (or not, if you’re concerned about security), but not your grocery list. Take the time to choose wisely. Only files whose permanent loss would cost you money or put you in a legal bind should go up to the cloud. Everything else should be stored on your local NAS solution.
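That triage can be encoded as a simple extension filter for the cloud-sync job. A sketch; the extension lists here are my assumptions about a typical RAW workflow, so adjust them to your own camera and tools:

```python
# Worth paying to store: originals and edit files that can't be re-made.
# These RAW extensions are assumptions; substitute your camera's format.
CLOUD_EXTENSIONS = {".cr2", ".nef", ".arw", ".dng",  # RAW formats (assumed)
                    ".psd", ".tif", ".xmp"}          # edits and sidecars

def belongs_in_cloud(filename):
    """Decide whether a file should be included in the cloud-sync set.
    Rendered JPGs and anything unrecognized stay local only."""
    if "." not in filename:
        return False
    ext = "." + filename.rsplit(".", 1)[-1].lower()
    return ext in CLOUD_EXTENSIONS
```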

Last but most certainly not least, TEST your recovery from all devices! Backups aren’t any good if you cannot retrieve the files.

Part 6: The takeaway

Putting off a consistent backup plan is a costly choice, folks. The investment in a multi-pronged backup approach is far cheaper in the long run. TRUST ME! It is more cost effective not just against the price of recovery, but against the delay in getting your files back and the anxiety of not knowing whether you’ll get everything back. And if you are doing photography as a business, either as a primary or secondary source of income, you cannot afford a month or more without your files.

Please note that I provide examples of brands and solutions as a starting reference. My decision is based on my specific needs as a hobbyist photographer who uses photography as a second income. I am a registered business owner and can therefore leverage tax deductions. This may not be an option for you if you do not earn income from print sales or rendered services, in which case a more robust NAS and/or cloud service may not be feasible for your budget. If you haven’t done so, identify your backup needs and put plenty of time into researching solutions!

The main thing I want you to leave this page with is the understanding that complacency has no place in our world. No one is immune to hardware or software failure. Don’t slack off in your data safety practices as I did, or be prepared to shell out literally thousands of dollars to recover files that would otherwise be one copy and paste away.
