Lightroom Previews – and getting Lightroom to fly


(Pipeline, Hawaii, 2015).



In this article I will talk about Lightroom previews and how they affect disk space, recovery of missing files and performance.  This might seem an obscure and esoteric topic, but there are items here of interest to everyone using Lightroom, including people on old and slow PCs.  We will discuss:

  • What are previews?
  • Saving disk space by trimming your Lightroom catalogue
  • Missing and excluded images
  • Regenerating missing images
  • Speeding processing with smart previews
  • Optimising performance in Lightroom and Photoshop


What are Previews?

Lightroom creates a variety of previews to speed up display and processing of images.

Standard previews are, by default, as many pixels wide as your screen, so you can quickly see an image filling the screen.  They apply only to the Library module (and briefly in the Develop module while the RAW file renders).

1:1 previews allow you to zoom in to 1:1 in images displayed in the Library Module (i.e. not in Develop).

There are also smart previews but we will discuss them separately later.


You can see the settings defined for your previews in the Edit/Catalog Settings/File Handling dialogue.

  • Standard previews default to your monitor resolution.
  • The setting “Preview Quality:  High” relates to thumbnails.
  • You can set to discard your 1:1 previews after a day, a week, a month or never.


You can define previews when you import files, by specifying a value for “Build Previews” under File Handling in the top right corner of the Import dialogue.


Alternatively, you can select files in Lightroom and use the command Library/Previews/Build Standard Previews or Library/Previews/Build 1:1 Previews.

If you don’t have a problem with disk space you might as well retain 1:1 previews indefinitely and create them while importing files.  This slows down import but makes Lightroom run faster.  If you don’t define 1:1 previews, Lightroom will create them on the fly with a significant impact on performance in the Library module.


Saving disk space by trimming your Lightroom Catalogue

One of the ways you can improve the performance of Lightroom is to use an SSD as the hard disk for your catalogue.  Since the storage capacity may be small, the size of your Lightroom Catalogue may become a problem.   For example, when I put my catalogue on my new 500GB hard drive (an M.2 SSD), I found I was left with less than 10% free space, too little for reliable performance.  So I had to find a way to reduce it.

There are four elements associated with the catalogue:

  • The Catalogue itself. Mine was 3.6GB.  I have seen people advocate clearing Develop history in Lightroom, but that seems to me a waste of time.  You lose functionality and only save a couple of gigabytes, which does not solve the problem.
  • Catalogue backups.  I had 22.7GB here, and you can save some space by deleting old backups, but this wasn’t enough to solve the problem.
  • The Cache. Many versions ago, Lightroom benefited from a huge cache but this is no longer required.  The default is 1GB and you can have more than that but 10GB will be plenty for most people.  Stored cache files may build up to a few GB and you can clear them with Edit/ Preferences/ File Handling/ [Purge Cache].
  • The Previews. This is where all the files were, 391.3GB in my case.  Reducing this is not as easy as one might think.


Catalogue backups offer some scope for space saving by changing the drive where you store the backups.  You can’t set this from inside Lightroom but you can change it in the dialogue that appears when Lightroom is about to make a backup.  It also makes sense from a security point of view to have the backups on a separate drive to the catalogue.

Previews is where all the action is, though, for saving space.  We saw above that you can discard 1:1 previews after a day, a week, a month or never.   I had mine set to Never, but you’re supposed to be able to remove them with the command Library/Previews/Discard 1:1 Previews.  I decided to remove all 1:1 previews for images rated at less than 3 stars (85% of my images), but it only reduced the stored previews by 0.4%.  I then read in an Adobe performance guide that this only works where the standard preview size (2560px in my case) is less than half the resolution of images from your camera (4193px for the Nikon D3s).  So I reduced the standard preview size to a low number, reopened Lightroom and tried again.  That was better, but not by much: I only reduced the stored previews by 5%.
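The rule from that Adobe guide can be expressed as a quick check.  This is a sketch of my reading of the guide’s behaviour, not an official API; the function name and example figures are purely illustrative:

```python
def can_discard_1to1(standard_preview_px: int, camera_width_px: int) -> bool:
    """Discarding 1:1 previews only reclaims space when the standard
    preview is less than half the camera's native image width; otherwise
    the 1:1 preview doubles as the standard preview and is retained."""
    return standard_preview_px < camera_width_px / 2

# With the figures above: a 2560px standard preview against a 4193px-wide file.
print(can_discard_1to1(2560, 4193))  # False - discarding has little effect
print(can_discard_1to1(1024, 4193))  # True - previews can now be discarded
```

This is why reducing the standard preview size first made the Discard command start working.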

So that left only one option – to delete the previews folder and start again.  In other words, I deleted the folder Lightroom Catalog Previews.lrdata under the folder for the Lightroom catalogue.

Whoompa!  Previews to zero!  Lightroom still works!

So what do I need previews for again?

  • Standard previews let you quickly see the image full screen in the Library module.
  • 1:1 previews let you zoom in to 100% in the Library module, for example to compare the sharpness of one image against another to determine what to delete and what to retain.  This doesn’t apply to viewing images in the Develop module or zooming in one-to-one there, because then you are accessing the actual RAW file.

My next step was to create new 1:1 previews using the Lightroom command Library/Previews/Build 1:1 Previews (see previous image).  I decided to create them for all my images with 3 stars or more, plus current working folders.  That took a long time.  More than a day and a half for 23,000 images.  After that I had a previews folder 141.4GB in size, 64% smaller than it was.  That solved my disk space problem.  Lightroom also seemed to create standard previews for all images automatically, which I wasn’t expecting.


Missing and excluded images

Do you have missing images or images unintentionally excluded from your catalogue?

On the left-hand side of the Library module, right-click on a folder and select “Synchronise folder…”.


The Synchronise Folder dialogue appears.  It may show you have photos in that directory that have not been imported into Lightroom, or photos in the catalogue that are missing on your hard drive.

If [Import New Photos] shows images in that folder that are not in Lightroom, you can click [Synchronize] to import them and see what they are.  After the import, you can select them all under “Previous Import” at top left of the Library Module and then click on their folder (left pane in Lightroom) to compare the new selected images with the ones already there.  They may be images you want to have in your catalogue or they may be images you meant to delete from the disk but instead just removed from Lightroom.

Missing photos may simply have been moved in Windows Explorer (so Lightroom doesn’t know where they are).  If you have some, you can click [Show Missing Photos] to see what they are.  You can then click on the little exclamation mark that appears at the top right corner of an image to locate them.  Alternatively, the missing images may have been deleted or lost.

However, before you remove any previews, especially 1:1 previews, you should check that you don’t really need them.  As we shall see, you may be able to use the previews to recreate missing images.



Regenerating missing images

A few years ago, I had three hard drives fail within a week, two in my main data drive (a RAID array) and one in my Drobo backup (essentially another RAID array). When the smoke cleared (metaphorically), I realised I had holes in my backups.  Whole directories of files now showed in my Lightroom catalogue as missing.  Fortunately, I left them there and did not delete the previews.  Quite recently, I realised I could regenerate those missing images, which were still in my catalogue, using Jeffrey Friedl’s Preview Extraction tool.  Because they were 1:1 previews I was able to recover them as full-sized JPEGs, good enough to print from.


You do have to copy the regenerated images to a different location, but you can copy them all into a single folder or preserve a whole folder structure.  You can also retain image metadata and Lightroom’s metadata, including star ratings and colour labels.



Speeding processing with smart previews

Normal previews and 1:1 previews speed operations in the Library module; smart previews speed processing in the Develop module.  They are actually miniature RAW files, stored in a folder under your catalogue and taking about 2% to 5% of the space of the photos themselves.  They were introduced some years ago as a way to let you keep processing while disconnected from your data when travelling.  However, they are just as useful for greatly improving response time in normal editing.
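Using that 2% to 5% figure, you can estimate how much room smart previews will need before you build them.  A rough sketch (the 3% default is my own midpoint assumption, not an Adobe figure):

```python
def smart_preview_size_gb(photos_gb: float, fraction: float = 0.03) -> float:
    """Estimated size of the Smart Previews.lrdata folder, assuming smart
    previews take roughly 2-5% of the space of the original photos."""
    return photos_gb * fraction

# e.g. for 2TB of RAW files:
print(round(smart_preview_size_gb(2000, 0.02)))  # low estimate, in GB: 40
print(round(smart_preview_size_gb(2000, 0.05)))  # high estimate, in GB: 100
```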

First you need to generate the smart previews, which you can do on import or by the command Library/ Previews/ Build Smart Previews (see third screen shot above).


In current versions of Lightroom (from CC2015.7 and 6.7), you can select a checkbox to use smart previews instead of originals for image editing.  (Edit/ Preferences/Performance).

Changes transfer instantly to the original RAW files, and stay there even if you discard the smart previews.  The only limitation concerns sharpening and noise reduction: custom sharpening done on a smart preview will not transfer well to the original file.  However, when you zoom to 1:1, Lightroom shows you the original file, so it is safe to use smart previews with custom sharpening as long as you do that at 1:1.

If you are using an earlier version of Lightroom, you need to trick it into thinking your folder is offline.  If you close Lightroom, rename the folder (say by adding “ xxx” at the end) and then reopen Lightroom, you are working with your smart previews.  To transfer the changes to your RAW files, you will need to close Lightroom and rename the folder back.

You might retain smart previews indefinitely if you’re not short of disk space.  Still, you only use smart previews when you’re processing in Develop module so you may not need to keep them for long.  Unlike the 1:1 previews, discarding smart previews works OK.



Optimising performance in Lightroom and Photoshop

The specification of your PC is one of the main factors affecting the performance of Lightroom and Photoshop.  I wrote an article about that a while ago.  A new generation of chipsets and motherboards has since come out, but everything else should be pretty much unchanged.

Adobe also provides useful guides to optimising performance in Lightroom and Photoshop:



  • To increase Lightroom performance by running it on a small fast SSD, you may need to reduce the size of your previews folder
  • You may be able to recover deleted or lost images by regenerating files from the 1:1 previews
  • Standard and 1:1 previews speed Library module operations; smart previews speed Develop module operations

Cloud Backup for Photographers


(Cloud over Iceland farmhouse)

Why Backup?

All hard disks die, it’s only a matter of time.  And then, when you need it, you may find your backup is corrupted.  So you should have at least two backups and one of these should be offsite in case your house is hit by a meteor or falls into a sinkhole.

When you travel with a laptop, you should also have at least two backups and should carry one on you.  However, here I’m primarily considering your home computer.  Cloud backup doesn’t usually apply to travelling.

You should also have a system image backup as well as a file backup.  A system image backup backs up your C drive, including your applications and hidden system files.  You restore your files from a file backup; you restore your C drive from a system image backup, so that if it dies or becomes disabled by a virus, you don’t have to rebuild it from scratch.

Windows allows you to make free file and system image backups.  There are also third-party programs, some paid and some free.  For example, I use Acronis, which you pay for (it’s fast, highly customisable and allows disk cloning and restoring an image to a different PC).  I still use Windows as my most reliable method of system image backup, though I also have incremental system image backups in Acronis.


Why cloud backup?

The advantage of cloud backup is it provides current offsite backup.  If your offsite backup method is taking hard drives to store in someone else’s house, then those backups are not likely to be very up to date.  Most Cloud backup services do not include system image backup, though, so you probably need to do that locally.

I had assumed cloud backup was not suitable for large data requirements, but I reassessed that assumption after reading Martin Bailey’s recent blog post on his workflow.

So, the questions we will need to address are:  how much does it cost, how reliable is it, how flexible is it, how fast is it, and how easy is it to use?


Why you still need local backup

It’s very slow to transfer large amounts of data to the cloud.  Getting back a few files should not be a problem, but retrieving large numbers of files is also slow, though not as slow as the upload.

Also, as with any media choice, local or Cloud, there is some risk your file may be missing or corrupt when you come to recover it.  Cloud companies have also been known to fold in the past (though hopefully not the ones we will mention).


Cloud storage vs cloud backup

Cloud backup is not to be confused with cloud storage.  Companies like Dropbox offer cloud storage: you can copy files in, extract them or share them, but there is no interface for file backup.  There are many options for free cloud storage but most of them don’t offer much space.  The largest I have found is Mega (founded by the infamous Kim Dotcom), which offers 50GB free.

The cheapest cloud storage for larger amounts appears to be Amazon Glacier or Backblaze B2 which cost 0.5 US cents per Gigabyte per month.  That corresponds to $A8 per annum for 100GB, $40pa for 1TB and $200pa for 5TB.  Glacier costs from 0.3 US cents per GB for a slow download (wait 5 to 12 hours before anything happens) to 3.6 cents/GB for a download with little delay.   Backblaze B2 costs 2 cents/GB for a download with little delay.
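To sanity-check per-gigabyte pricing against your own library size, the arithmetic is simple.  The sketch below works in US dollars only, since the Australian-dollar figures above also fold in an exchange rate (the ~0.75 USD/AUD conversion in the comment is my assumption):

```python
def annual_storage_usd(gb: float, usd_cents_per_gb_month: float) -> float:
    """Annual cost of flat per-GB-per-month cloud storage, in US dollars."""
    return gb * usd_cents_per_gb_month / 100 * 12

def download_usd(gb: float, usd_cents_per_gb: float) -> float:
    """One-off cost of downloading your data, in US dollars."""
    return gb * usd_cents_per_gb / 100

# 100GB stored at the 0.5 US cents/GB/month quoted for Glacier and B2:
print(annual_storage_usd(100, 0.5))  # 6.0 (roughly $A8 at ~0.75 USD/AUD)
# Restoring 1TB from Backblaze B2 at 2 US cents/GB:
print(download_usd(1000, 2))         # 20.0
```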


Cloud backup options

For backup to the cloud you need backup software as well as a cloud storage repository.  These are usually integrated but can come separately.  There are many alternatives out there.  Here are a few that I see as the most likely options.

If you have small amounts of data to back up – up to around 1TB, the cheapest option is Cloudberry.  This is an interface for backing up to the cloud but does not include storage.  There is a free version of Cloudberry, or one with encryption for a one-time cost of $A40.  Then you need to add storage from Amazon Glacier or BackBlaze B2 at the costs outlined just above, or any number of more expensive alternatives.  This may be a cheap option but is not an easy one.  You need to negotiate the complexities of setting up both Cloudberry and the storage choice.

If you are already using Acronis for your local backups, extending to Cloud backups may be a viable option.  Acronis is fast, easy to use and extremely configurable.  Additional cost per annum of backup to the Cloud is $A3 (50GB), $A14 (250GB), $A30 (500GB) and $A60 (1TB).  Beyond that, other options are more cost-effective.

This leaves the last two candidates, the most likely to be of use to photographers at a reasonable price.  I have eliminated IDrive, SpiderOakOne and SugarSync (too expensive and too little storage), Carbonite (limited space per computer and no external drives) and SOS (too expensive).  That leaves BackBlaze and CrashPlan.  Both offer unlimited storage at $A67pa and $A80pa respectively for a single user.  In effect, the vast majority of users, who have little data, are subsidising us photographers, who may have vast amounts of data.


Why Unlimited Storage

Photographers are likely to have large amounts of files and unlimited storage is likely to work out cheaper, especially when including future requirements.

Also, you could be backing up specific directories using multiple backups with different criteria.  This creates the possibility you may forget to define some backups for new projects so you end up with holes in your safety net.  Having an unlimited Cloud backup means you can back up everything and solves that problem.



Backblaze is cheap and simple.  Basically you just set it going and it backs everything up.  On the main screen you simply have options for [Backup Now], [Restore Options] and [Settings].  There is also a Help button at top right.

All you really have to do is click [Settings…] to select which drives to backup and let it do its thing.  However, you might want to specify a drive other than C:\ as temporary data drive because Backblaze temporarily stores copies of large files there while uploading.

On the Performance tab you can let Backblaze automatically adjust the upload overhead, or adjust it manually in various ways.  Here, I have unchecked [Automatic Throttle].  You can use the slider to increase backup speed, but if you take it too far you may slow down your whole home network.  You can also increase the number of backup threads; do that slowly, one increment per day, and observe the results.

On the Schedule tab, scheduled upload is usually continuous, but you can make it daily or on demand.

Backblaze does not back up system files and has a number of folders and file types it excludes by default.  You can specify additional folders to exclude, but you can’t do it the other way around and define folders or files to include.

Restoring files is more complex than backing them up.  You log in to the web site to request the restored files, then download them as a zip file.  Next you have to work out what to do with them.  They come inside the zip archive in a folder structure corresponding to your directory structure.  You have to work out manually where to copy them from there – and you need enough space to hold two sets of those files until you’re finished.

Alternatively, you can ask BackBlaze to send you a hard disk (up to 3.5TB; US$178).  You can get a full refund on the hard drive if you return it within 30 days and pay return postage, making it almost free.  After you make the request, it takes them 2 to 4 days per terabyte to prepare the disk, plus 3 to 4 days in the post.  So probably: 4 to 6 days for 500GB, 5 to 8 days for 1TB and 14 to 24 days for 5TB.
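Those delivery estimates follow from a simple model: 2 to 4 days per terabyte of preparation plus 3 to 4 days of postage.  As a sketch (the coefficients are my reading of the figures above, not a Backblaze guarantee):

```python
def disk_delivery_days(tb):
    """(best, worst) days to receive a restore drive, modelled as
    2-4 days per TB of preparation plus 3-4 days in the post."""
    return (round(2 * tb + 3), round(4 * tb + 4))

print(disk_delivery_days(0.5))  # (4, 6)
print(disk_delivery_days(1))    # (5, 8)
print(disk_delivery_days(5))    # (13, 24)
```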

I read a review that said that if you tweak the Settings/ Performance values, BackBlaze should run at about the same speed as CrashPlan.  If this is the case, and based on my download test for CrashPlan (though my PC and network speed may be quite different from yours), direct download may be quicker than a disk for up to somewhere between 1TB and 2TB.  However, without tweaking those settings, my current download test is running 25 times slower than my last CrashPlan download test(!).  In any case, a disk may be more practical if you’re short of disk space.  If you do go for a disk, you can still be recovering your most important directories while waiting for it.  Bear in mind that your whole download has to complete before you can access and copy any files.


Backblaze is cheap and simple but there are a few drawbacks that Crashplan does not have.

  • That restore process is a handicap for me though the hard disk option could be useful for large data restores.
    • You have to copy the files manually from the zip file to your final destination
    • You can’t access any files until the zip file has finished downloading
    • Restore seems extremely slow with default settings.
  • BackBlaze does not support backing up files from a NAS (network-attached storage – an array of disks accessed over your network, such as a Drobo).
  • It allows backing up from external drives but deletes the files if the external disk is not connected for a month.
  • Though it claims continuous backups, it may take two or more hours to notice a new or changed file.
  • It lacks a History screen to allow you to accurately determine backup durations and speeds.
  • It allows only 6 versions of files and removes versions after 30 days.



So that brings us to CrashPlan, which is my preferred option and I’m currently on the 30-day free trial.  You can see on the main screen that I have a backup running.  It shows details of how that is progressing.  You can also see the folders or directories involved in that backup.

How did I do that?  Very simple.  The first time I opened the screen, there were no files defined.  I clicked on [Change…] to define some files and away it went, backing up to the default location of CrashPlan Australia (i.e. the Cloud).

This is the Settings Tab for the backup.

  • Default is for backup to run always, but you can make it on specific days and between specific times
  • Verify selection is set by default to 3am every morning.  This assumes your PC is left on and set never to sleep (Control Panel/ Power Options).  Otherwise, you should change the time to when your computer will be on.
    • Increasing the number of days before verifying to 30 during your initial upload will help it run faster.
  • [Frequency and versions]: See the next screen and comments….
  • [Filename exclusions] allows you to exclude file types as in BackBlaze.  There is nothing specified by default here but your system files are excluded from backup anyway.
  • Leave [Advanced settings] alone.  They’ll only decrease functionality.
  • [Enable] backup sets lets you define backup sets, which are for different destinations.  Apart from the Cloud, these can be other locations on your computer’s drives, other computers in your home network, or friends’ computers.


On the Backup Frequency and Versioning Settings subscreen, the first slider sets the frequency of backups, varying from every minute, to every 15 minutes (the default), to every week.

The next four sliders determine how many versions of files to keep and how much to whittle them down as they age, ranging from keeping all versions to jettisoning them after a week.

The last slider is how often you remove deleted files.  The default setting is never and alternatives range from every day to every year.

CrashPlan determines whether the computer is in use according to keyboard/ mouse activity.  On the General Tab of the settings screen, you can define how long before CrashPlan thinks the computer is inactive, and what percent of CPU to use if you’re away and if you’re using it.   You can see it defaults to 80%/ 20%.  You could conceivably change that to 90%/ 10% if performance were an issue. Alternatively, you might try 100%/ 90% and then wind the in use setting back if the PC slows.  Setting CPU% to zero though is not what you would think; it actually tells CrashPlan to use whatever it wants.  Its default upload speeds are good, so there may not be any need to modify anything here.

You can define other computers to backup files to you (as we will see later).  The [Configure…] button allows you to change their destination.  (The default destination is a subdirectory of C:\ProgramData\).


Setup tips

What to back up?  Not system files, because they keep changing endlessly; so even though CrashPlan excludes many file types automatically, don’t select your whole C drive.  On your C drive, perhaps just your Users directory plus any directories you have created for your files.  Most user settings are stored under C:\Users, but one I can think of that isn’t is:

  • Printer profiles:  C:\Windows\System32\spool\drivers\color

Also exclude Lightroom previews and cache files from the backup.  Lightroom rebuilds them anyway and it slows the backup.  You may have the Lightroom catalogue stored somewhere else but by default on Windows 10 and for Lightroom CC these are located at:

  • Lightroom previews cache: C:\Users\username\Pictures\Lightroom\Lightroom Catalog Previews.lrdata
  • Smart previews cache: C:\Users\username\Pictures\Lightroom\Lightroom Catalog Smart Previews.lrdata
  • Adobe Camera Raw cache: C:\Users\username\AppData\Roaming\Adobe\CameraRaw\Database
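Before excluding those folders, it’s worth checking how large they actually are.  Here is a rough sketch that totals a folder tree (the paths are the defaults quoted above, with a placeholder username – substitute your own):

```python
import os

def folder_size_gb(path: str) -> float:
    """Total size of a folder tree in gigabytes - useful for checking how
    much space the Lightroom preview and cache folders occupy before you
    exclude them from a backup."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            file_path = os.path.join(root, name)
            if os.path.isfile(file_path):  # skip dangling links
                total += os.path.getsize(file_path)
    return total / 1024**3

# Windows 10 / Lightroom CC default locations, with a placeholder username:
for p in [r"C:\Users\username\Pictures\Lightroom\Lightroom Catalog Previews.lrdata",
          r"C:\Users\username\Pictures\Lightroom\Lightroom Catalog Smart Previews.lrdata"]:
    if os.path.isdir(p):
        print(p, round(folder_size_gb(p), 1), "GB")
```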

When setting a huge initial backup, you can define it in sections with your most important directories and folders first.


Restoring files

Restoring files is easy.  You can select drives, folders or files and then restore them to their original location or another place such as the default of Desktop.  It’s just going to take a long while if you have a huge mass of files, but you can restore your most important files first.

Here are some locations of important settings in C:\Users:

  • Lightroom User Print Presets:  C:\Users\[user name]\AppData\Roaming\Adobe\Lightroom\Print Templates\User Templates
  • Lightroom User Export Presets:  C:\Users\[user name]\AppData\Roaming\Adobe\Lightroom\Export Presets\User Presets
  • Lightroom User Develop Presets:  C:\Users\[user name]\AppData\Roaming\Adobe\Lightroom\Develop Presets\User Presets
  • Photoshop Actions (if you’ve saved them): C:\Users\[user name]\AppData\Roaming\Adobe\Adobe Photoshop 2017\Presets\Actions
  • Photoshop Actions (otherwise): C:\Users\[user name]\AppData\Roaming\Adobe\Adobe Photoshop 2017\Adobe Photoshop 2017 Settings\Actions Palette.psp



The history screen is very useful.  It tells you when your tasks started and finished, and the speed at which they ran.


Help and support

There is no Help button in the interface, but there are help screens available online.  Just search for the screen or the issue.  Alternatively, here is the support site.  You can send emails and there is a chat line.  Both are quick and informative; the only catch is that office hours are 12 midnight to 8am Canberra time.


Additional free capabilities

A very interesting feature of CrashPlan is that you can download it and use it for free to make backups to your computer drives, to and from other computers in your home network, and to and from the computers of friends.  You only have to pay for it if you use it to back up to the cloud, although with the free version you can only back up automatically once a day.


You might remember there was a [Backup Sets] Button at the bottom of the Settings/ Backup tab as it initially appeared.

  • If you click on that you can create additional backups to locations other than the Cloud.
  • Next, go to the Destinations screen and set up destinations as a folder on your computer, as another computer on your home network, or as the computer of a friend
  • Then, go to the Settings/ Backup tab again to define new backups.



Here I have two backup sets defined.  “Computer to Cloud” is unsurprisingly my backup to the Cloud.  “Computer to External Drive” is a local backup to a directory (Travel 16) that I have defined as a destination.

Here, the main backup screen shows that as well as the backup to the Cloud at the top, I have defined the backup to a local drive we saw above and also there are two inbound backups from other computers in my home network.  So you can also use CrashPlan free for local backup and if you have a NAS with spare capacity, other members of the household can use CrashPlan to back up to it.

You can also have two destinations to one backup set, say one locally and one to the cloud.  In this case, the local, fastest destination has priority in the backup.

Currently, when I come back from travelling, I create a Lightroom catalogue of the new files on an external hard drive and then import it into the Lightroom catalogue on my PC.  Instead I could first highlight the images in Lightroom, save all changes to sidecar files ([Ctrl][S] or Metadata/ Save Metadata to File) and then use CrashPlan to send them overnight to a folder on my PC via my home network.  When all the files are in, I would need to use Crashplan to restore the incoming backup, and then I can specify which drive the files go to.

You can even use CrashPlan free for offsite (non-Cloud) storage.  Create a backup on an external drive (fast transfer).  Have a friend do the same on his PC.  Configure your computers as CrashPlan friends. Swap hard drives.  Now you can continue the backup you have set up on your friend’s computer and they can do the same from theirs.  Cost then depends on whether you need to buy disk drives or disk caddies.


Options while travelling

From your laptop, you can log into your CrashPlan account, view the status of downloads, and download files you need from your Cloud backup (up to 250MB per selection and 500MB per session).  You receive a zip file of the files you selected.

You can also download an iOS, Android or Windows app to your phone to see and download files you have backed up to the cloud from your home computer.

Most places I travel to, I’m lucky to have an internet connection at all. However, cloud backup while travelling might be possible if you travel in Europe or North America, especially if you stay in one place for a while.  Even if you don’t get everything backed up, whatever you do is better than nothing. In that case, unless you already have a CrashPlan Family subscription, you could purchase a monthly one for your laptop ($A8.16 per month) and terminate it when you return, once all your files are on your PC and backed up.


Backup and Restore Times

Upload to the Cloud depends mainly on your network speed (which can vary) and also on the specification of your PC and what else you are running or doing on it. So the speed I get on my PC may or may not be relevant to what you get on yours and a test I make at one time may give different results at another. Here is a link to test your upload and download speed.

I made a test to see how long it would take to back up 54.1GB with the speedup settings implemented (as below). That took 12 hours 37 minutes, corresponding to 23 hours for 100GB, 10 days for 1TB and 7 weeks for 5TB. Download is usually faster, so I set a restore going. That took 5 hours 20 minutes, corresponding to 10 hours for 100GB, 4 days for 1TB and 3 weeks for 5TB.
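Those projections are linear extrapolations from the timed test, assuming the transfer rate stays constant over a long run (which it may not).  The arithmetic:

```python
def extrapolate_hours(test_gb: float, test_hours: float, target_gb: float) -> float:
    """Scale a timed transfer test up to a larger target size,
    assuming a constant transfer rate."""
    return test_hours * target_gb / test_gb

upload_test = 12 + 37 / 60  # 54.1GB uploaded in 12h37m
print(round(extrapolate_hours(54.1, upload_test, 100)))        # 23 hours for 100GB
print(round(extrapolate_hours(54.1, upload_test, 1000) / 24))  # 10 days for 1TB
```

Rerun it with your own test figures to project your own backup times.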


Speeding up the backup

There are a number of things you can do to speed up the backup, especially for a large initial backup.

  • I have already mentioned leaving the computer on and setting sleep to never (Control Panel/ Power options)
  • In Settings/ General, increase CPU% to 90% both for when user is away and for when user is present.
    • Then reduce CPU% for when user is present if it noticeably slows computer down.
  • In Settings/ Backup/ Frequency and versions: [Configure…], reduce backup frequency from default every 15 minutes to say every 8 hours
    • This reduces the time CrashPlan spends checking for new versions.
    • You could also verify selection of your backup less often.
  • You can return CPU % and Frequency settings to default values (or whatever you prefer) when your initial backup is finished.
    • However, CPU % when user is present should be at least 10%.  If you reduce it to 0%, CrashPlan actually takes that to mean “Do whatever you want!”.
  • If you have two different backup programs operating simultaneously and you notice performance issues, it may help to schedule them to run at different times.


Managing Memory

CrashPlan recommends allocating 1GB of RAM for each 1TB of files stored in the Cloud.  Default is 1GB.  (Actually, it’s really 600MB per TB storage but they allow for expansion).

So far I haven’t encountered any performance problems, though I have only uploaded 380GB of a potential 5.7TB and my computer has 32GB of RAM.  CrashPlan is in any case designed to run quietly in the background and not compete for resources.  Presumably though, people with old slow PCs and lots of data to backup are more likely to encounter issues.

If you do encounter problems, here are a couple of further things that could help:

  • The most resource-intensive activity is file verification scans. Normal backup gets file change information from the operating system and it doesn’t need to scan.  So make sure the verification scan only operates when you are not using the computer (Settings/ Backup/ Verify Selection)
  • You could set your backups to only run overnight (Settings/ Backup/ Backup will run…)
  • You could reduce the amount of data you store on the cloud.  For example, you could create “3+” subfolders and move images with 3 or more stars to them in Lightroom, then back up only images in those folders.
  • You could have different backup sets for older files and current files with different settings.  Different backup sets are usually for different destinations but you could have two for backing up to the cloud.
    • On the Settings/ Backup tab, you can set different [Frequency and Versions] settings for each backup set, but if you have two sets backing up to the cloud, only the settings from the highest priority set will apply.  So that’s of no use here.
    • However, you can set Backup times and Verify times for two sets backing up to the cloud and they will apply.
      • So one set with your old files that change very infrequently would seldom backup while the set with your current files would backup often.
      • At the extreme, you could set your old backup set to back up for one minute on Sundays (i.e. not at all) and verify it very infrequently.
      • When you finish a project and want to transfer files from the current to the old backup set, define that folder in the new backup set, click Verify [Now] for that set, and delete them from the current backup set.
  • If necessary, you can also pause all backups for a specified period by right-clicking on the CrashPlan icon in the Tray at the right of your Menu Bar and choosing [Sleep…]


Recovering from a Ransomware Attack

Cloud backup can be a valuable way to protect against ransomware attacks. Typically you introduce the ransomware to your PC by clicking on an email attachment or a link. All your files may become encrypted, and a message appears on your computer demanding a ransom. We have recently seen the explosion of the WannaCry ransomware; according to Wikipedia, over 230,000 computers in over 150 countries were infected within the first day.

Norton now protects against WannaCry and other known ransomware. Acronis’s expensive Premium subscription option also does this by preventing malicious changes to your Master Boot Record and to your backup files. Probably nothing can be 100% successful against new ransomware algorithms. Received wisdom says it is unwise to run two competing antivirus solutions, so running Acronis Premium alongside Norton (or Kaspersky or whatever) may cause problems. It is also important to keep your operating system up to date.

If you get hit by something like WannaCry, you’re probably going to need to reimage your PC. If it has spread to your home network, you may need to reimage every PC on the network that was turned on since the attack began. Then you need to restore files from backup.
If you have an online local backup, your backups are probably encrypted too (not merely the files they contain). Offline backups to hard disks may be OK but are more likely to be out of date than your Cloud backup.

The standard approach to Cloud backup with CrashPlan is that you back up data files (images, video, music, Word files, Excel files etc.) and not system files. So when you restore files from the Cloud after reimaging, the danger is not reintroducing the ransomware Trojan; it is reinstalling encrypted files. If that happens, you just need to restore files from a date earlier than the start of the attack. CrashPlan will show you the files available to restore at a given date in a directory structure, including the file names. So if you see encrypted file names, go to an earlier backup; if you don’t, the files should be all right.

If the ransomware attack means you have lost your CrashPlan password and you have the standard security level, then CrashPlan support can help you reset it and you can access your backups. However, if you are using the higher security archive key or custom key settings, you will need to know that password or you won’t get your files back. In any case, it may be as well to have your passwords on a USB stick, a disconnected hard drive or a piece of paper.


ISP Issues?

Backing up or restoring large amounts of data to the Cloud could have a hidden cost if you have a fixed-data plan with your ISP.  I have just switched to one of the new unlimited iiNet plans so that is not an issue for me.  For some, Cloud backup might require changing your plan or changing your ISP.


Options for families

CrashPlan is $A80pa for a single PC backing up to the Cloud, or $A199 for a Family plan covering 2 to 10 computers.

BackBlaze doesn’t have a Family Plan; you just have to buy more single licenses.
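At the prices above, it is easy to work out when the Family plan pays off.  A quick sketch (`cheapest_plan` is my own hypothetical helper, using the article’s prices): with two computers, two single licences ($A160) are still cheaper than the $A199 Family plan, so the Family plan only wins from the third computer onwards.

```python
SINGLE_AUD = 80   # CrashPlan single-computer plan, per year
FAMILY_AUD = 199  # CrashPlan Family plan, covers 2 to 10 computers

def cheapest_plan(computers: int) -> str:
    """Return the cheaper CrashPlan option for a given number of computers."""
    if not 1 <= computers <= 10:
        raise ValueError("Family plan covers at most 10 computers")
    singles_cost = computers * SINGLE_AUD
    if computers >= 2 and FAMILY_AUD < singles_cost:
        return "family"
    return "singles"

for n in (1, 2, 3, 10):
    print(n, cheapest_plan(n))
```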

Another alternative is IDrive, a well-featured program including file sync, file sharing and free offline backup, with unlimited computers for $A93pa but only up to 1TB of data, which is probably not very much, especially allowing for future expansion.

Cloudberry plus storage on Amazon Glacier or BackBlaze B2 would be cheaper up to about 4TB but the hassle factor would be much greater and the functionality more limited.  If you happen to have a subscription to Microsoft Office 365 Home ($A120pa), that comes with 1TB Cloud storage in OneDrive per computer for up to 5 computers.  Cloudberry supports OneDrive so I presume you could connect from Cloudberry to your Office 365 OneDrive online storage.

There is also the option of off-site backup to a friend’s computer using CrashPlan, free if you have the hard drives, but for many, the CrashPlan Family Plan might be the most appealing option.


(Sunset near Flinders Ranges, South Australia)




Photographers inevitably end up with lots of images and backups are essential. You should have at least two and one should be off-site.

These days it makes a lot of sense for your off-site backup to be the cloud.  Your initial backup will be very slow but this is not a problem because it will run in background and you should have two local backups before you start this process anyway.

I have covered some other options above but for most people, Cloud backup will be a choice between BackBlaze and CrashPlan, both of which offer unlimited storage space.  Backblaze will suit people who want a very simple choice, don’t mind slow performance if they don’t tweak the settings, and don’t mind a complex restore process.  CrashPlan will be a better choice for most people, though somewhat more expensive.  It is much better specified and more customisable with several unique features, including free backup to a friend’s computer.


Links to more information

Feel free to comment, and to offer or request information….

Computers for Photography

Unless you are shooting film and printing in a darkroom, you’re likely to need a computer to deal with and process your images.  If you’re looking at purchasing one for photography, there are several things to consider:

  • Monitor
    • Colorimeter for profiling
    • Graphics cards
  • Computer
    • RAM
    • Storage
      • SSD or conventional
    • Chip
    • Software
  • Laptops
    • External drives
  • Backup
    • External hard drive or NAS
    • Software

So let’s consider each of these in turn, both from the point of view of a cheaper alternative and what’s the best you can have.





It’s better to have at least a reasonable-quality monitor.  If your monitor is too cheap or too old, it may not be capable of showing accurate colour.  The main panel types are TN and IPS.  IPS is the better choice, because the appearance of a TN monitor changes with viewing angle and therefore may not show you an accurate picture of your image.

The best monitors are NEC and Eizo.  Having an excellent monitor makes life easier especially for printing.  My impression is that NEC is pretty much as good as Eizo at a much lower price though some Eizo users may disagree.  One example of such a monitor is the NEC PA272W.  Such monitors are not cheap.

Here is an article from ImageScience on buying monitors.

Another option, at an even higher level of expense, is a 4K monitor with a resolution of something like 3840×2160 instead of 1920×1200.  There are not many available and photographic-quality ones can be very expensive, as with the NEC PA322.  You will also need an expensive graphics card to drive it.  You will get amazing resolution and excellent colour but you may have problems with some software.  For example, the Nik software suite is unlikely to work well at 4K because Google do not appear to be updating it.
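The jump in pixel count is larger than the headline numbers suggest.  A quick illustrative calculation (`megapixels` is just my own helper for the arithmetic):

```python
def megapixels(width: int, height: int) -> float:
    """Pixel count of a display, in millions of pixels."""
    return width * height / 1_000_000

uhd = megapixels(3840, 2160)    # a 4K/UHD monitor
wuxga = megapixels(1920, 1200)  # a typical 24-inch monitor
print(f"{uhd:.1f} MP vs {wuxga:.1f} MP: {uhd / wuxga:.1f}x the pixels")
```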

If your budget is more limited, the choices are more complex: it becomes a question of how much you are willing to pay and how far you are willing to compromise, and there is a multitude of options out there.  Here, though, are two reviews of the BenQ2700PT, a much cheaper monitor than the NECs and Eizos, by Joshua Holko and Martin Bailey.



A good colorimeter is almost essential, especially for printing.  Your eyes adapt to see both daylight and tungsten light as neutral, so they are not reliable tools for adjusting a monitor; use a good colorimeter to calibrate and profile it instead.  The best colorimeter is the X-Rite i1 Display Pro.  (Online prices start from just under $300).



If you have an old computer and it works for you then it works for you.  You might get more life out of it with more RAM but then new computers are cheaper than they used to be.  If you are considering a new one:

  • Generally you would want at least 16GB RAM though you may get away with less.
  • The CPU is not so critical as long as it’s not too old and slow; you don’t really need a state-of-the-art gaming chip.
  • These days, it’s better (and faster) to boot up off an SSD rather than a spinning drive. (An SSD or Solid State Drive is like a larger version of a flash drive or an SD card).  SSDs are getting cheaper and you might even choose to go for a second SSD for your Photoshop scratch file and Lightroom catalogue.
  • Your graphics card can also be relevant: with many cards you can enable GPU processing to speed up the display and transformation of images on screen.
  • Your motherboard is relevant because it determines what generation of chip your system can support and what you can plug in.  For example, the newer M.2 generation of SSDs is much faster provided you have a board that supports them.  The current generation of architecture is based on the Skylake chip.

For ultimate performance, you may want a custom PC.  You could either build this yourself or get someone to build it for you.  In Canberra, this might be MSY (don’t expect salesmanship and demonstration from them; you need to know what you want first).

Here are some guides to a custom PC:

And here is a couple of guides if you are in the market for a Mac:

Another thing to consider is storage.  How much you need depends partly on how many images you delete and how large your camera’s image files are, but in the digital age it is common to need lots of space for image storage.  SSDs may take over in due course because they are faster and probably more reliable, but that is still some way off, so for bulk storage we still rely on spinning disks.

Larger spinning disks are now available.  You would want a 7200rpm drive rather than the 5400rpm ones, which are more suited to backup, and Western Digital Black drives now go up to 6TB.  If that is not enough storage you could combine several drives in a RAID array, which can both speed up operations and give some protection against disk failure.  Your motherboard and operating system would need to support the size of drive or type of RAID you want.
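The capacity trade-offs of the common RAID levels can be sketched as follows.  This is a simplified illustration — `raid_usable_tb` is my own hypothetical helper, and it ignores filesystem overhead:

```python
def raid_usable_tb(level: int, drives: int, drive_tb: float) -> float:
    """Usable capacity for common RAID levels, given identical drives."""
    if level == 0:                  # striping: fast, but no redundancy
        return drives * drive_tb
    if level == 1:                  # mirroring: one drive's worth, duplicated
        return drive_tb
    if level == 5 and drives >= 3:  # striping with parity: lose one drive
        return (drives - 1) * drive_tb
    raise ValueError("unsupported level/drive combination")

# Four 6TB drives in RAID 5: 18TB usable, and it survives one drive failure
print(raid_usable_tb(5, 4, 6.0))
```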

One last thing that may be worth considering is a UPS or Uninterruptible Power Supply, which will protect your PC against power spikes and let you save your work in the event of a loss of power.

(I will consider printers in a separate article).



There is a huge variety of laptops available in all sorts of configurations.  For most photographers the main purpose of a laptop is travelling.  For some, the sole purpose is storing images, in which case RAM and screen resolution are not so important.  Others want a machine they can process images on in available time while they travel, and then RAM and screen resolution become much more important.  In either case, USB 3 ports will make a big difference to the speed of importing images; so will an SSD.  It is possible these days to purchase a laptop with a 4K screen, 32GB RAM and a 1TB SSD, though such machines are not yet readily available in Australian retail outlets and will not be cheap.

The alternative to a travel laptop is lots of SD or CF cards but this may not be practical on a longer trip.

This page from Puget Systems shows what might be possible with a very highly specified custom laptop though at 3.4kg this is a desktop replacement unit rather than one for travel.  (Click the [Customise] button for specification options).



It is common for people to be sanguine about backup until the first time their computer goes down and they lose lots of files.  Ideally you should have two or three backups, and one should be stored offsite in case of fire or other disaster.  These could be single external drives, or you could use a NAS, which combines multiple hard drives in a RAID array and can be configured for access across your home network.

If you rely on SD cards or CF cards while travelling, you may not have your images backed up and would therefore be at risk of losing them.  If using a laptop while travelling, you should also be backing up to external disks.  External SSDs are a much lighter option than conventional drives and can readily fit in a pocket.  They are still more expensive but becoming more affordable.

To back up files you need backup software: Windows’ built-in backup (which I admit I haven’t tried for this purpose), a third-party product such as Acronis, or possibly software that comes with your hard drive.

It can also happen that your C Drive crashes or gets a virus.  To cover for such an event, make a system rescue disk so you can still boot your PC, and save a system image so you can quickly restore your C Drive to a functional state.