HHD Software DupeCare can be fully controlled from the command line. A single executable, dedup.exe, is used to control the product.
To start an OPTIMIZE operation, execute the following command:
.\dedup.exe -o|--optimize [--purge-placeholders] [--purge-blocks] [--compression-level N] <full-path-to-folder>
The server for the specified deduplicated folder must be running. The command starts and dumps its progress to the console window. It also dumps brief operation and storage statistics when it completes.
When the operation completes, it removes local (full) cached copies of deduplicated files. The operating system automatically creates a cached copy of a file the next time it is requested. If the file is modified, it will be re-processed the next time an optimization operation runs. If the file is not modified, its local cached copy will be deleted the next time an optimization operation runs.
The operating system may also store “placeholders”: small cached copies of deduplicated files without the actual file contents. Specifying the optional --purge-placeholders parameter deletes these cached copies as well.
Another optional parameter, --purge-blocks, removes compressed data from the deduplicated storage. This parameter should be used after deleting a large number of files in a deduplicated folder. It is also used by the “garbage collection” scheduled task, which is usually configured to run less often than the “normal” optimization task.
The --compression-level parameter optionally specifies the desired compression strength, a number from 1 to 12. The default is 7, which gives a good balance between compression time and compression ratio.
Note that accessing compressed data is equally fast regardless of the compression level used to compress it.
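For example, assuming a deduplicated folder at C:\Data\Archive (a hypothetical path used only for illustration), an optimization run that also removes placeholders and uses a stronger compression level might look like this:

```shell
# Optimize the deduplicated folder, also deleting placeholder cached copies,
# using compression level 9 (valid range 1-12, default 7).
# C:\Data\Archive is a hypothetical example path.
.\dedup.exe --optimize --purge-placeholders --compression-level 9 C:\Data\Archive
```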
The ESTIMATE command may be used on a normal folder to estimate the benefits of converting it to a deduplicated folder:
.\dedup.exe -e|--estimate <full path to folder>
It executes the full optimize operation without actually storing compressed data and dumps the statistics when the operation completes. This lets you decide whether a given folder is worth deduplicating.
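For example, to check how much a folder would benefit from deduplication before converting it (C:\Data\Photos is a hypothetical path):

```shell
# Dry run: performs the full optimize pass without storing compressed data
# and prints the resulting statistics. The path is a hypothetical example.
.\dedup.exe --estimate C:\Data\Photos
```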
If you have accidentally removed a number of files or folders from a deduplicated folder and have not yet run the optimize operation, you can restore the deleted files by running the RECOVER command:
.\dedup.exe -r|--recover <full-path-to-folder>
The server for the specified deduplicated folder must be running. The command “un-deletes” all recently deleted files and folders in a deduplicated folder.
Please note that if the
OPTIMIZE command has been executed for the deduplicated folder, the
RECOVER operation will not be able to recover deleted files.
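For example, assuming files were just deleted from the hypothetical deduplicated folder C:\Data\Archive and no optimize operation has run since:

```shell
# Un-delete recently deleted files and folders. This only works before
# the next OPTIMIZE run. The path is a hypothetical example.
.\dedup.exe --recover C:\Data\Archive
```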
This command prints brief statistical information about a given deduplicated folder:
.\dedup.exe -i|--info <full-path-to-folder>
The server for the specified deduplicated folder must be running.
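A minimal example, again using a hypothetical folder path:

```shell
# Print brief statistics for the deduplicated folder.
# C:\Data\Archive is a hypothetical example path.
.\dedup.exe --info C:\Data\Archive
```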
Use the DELETE command to delete a deduplicated folder:
.\dedup.exe --delete [--force] <full-path-to-folder>
The server for the specified deduplicated folder must be running. The command deletes the deduplicated folder. It finishes with an error if the target folder is not empty, unless the
--force parameter is specified.
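For example, to delete a deduplicated folder that still contains files (hypothetical path):

```shell
# Delete the deduplicated folder even if it is not empty;
# without --force the command would fail for a non-empty folder.
.\dedup.exe --delete --force C:\Data\Archive
```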
DupeCare can also be used in portable mode. The single executable dedup.exe (which has no external dependencies) can be copied to the target computer and used to create and maintain a deduplicated folder.
To create a new deduplicated folder, convert an existing folder, or start a server for an existing deduplicated folder, use the following command:
.\dedup.exe [--adopt] [--optimize-period PERIOD] [--gc-period PERIOD] --server <full-path-to-folder>
This command creates a new deduplicated folder or opens an existing one. If the target folder exists and is not a deduplicated folder, the command fails unless the --adopt parameter is specified. The --adopt parameter may be used to “convert” an existing folder to a deduplicated folder.
If the command is successful, a server is started. A running server instance controls the deduplicated folder until it is stopped. If you stop the running server, your ability to access deduplicated files will be limited (only locally cached files and folders will be available).
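For example, to convert an existing ordinary folder into a deduplicated folder and start its server (the path is hypothetical):

```shell
# --adopt allows converting the existing non-deduplicated folder;
# without it the command would fail because the folder already exists.
.\dedup.exe --adopt --server C:\Data\Archive
```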
Running a server does not automatically start deduplication of any existing files or folders. You will need to execute the OPTIMIZE command to deduplicate them.
When the server starts, it installs two periodic tasks: the Optimize task runs a default optimization operation, and the GC task runs an optimization operation with the --purge-blocks parameter.
You can customize the period of these tasks using the --optimize-period and --gc-period parameters.
PERIOD can be one of the following:
- A task is disabled
- A task runs every N seconds
- A task runs every N minutes
- A task runs every N hours
- A task runs every N days
N is an integer value.
The default period for the optimization task is 1 hour and the default period for the GC task is 1 day. The minimum allowed period is 15 minutes for both tasks.
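The exact PERIOD syntax is given as a placeholder above, so this sketch keeps it symbolic rather than guessing a concrete format:

```shell
# Start the server with custom task periods. Replace each <PERIOD>
# placeholder with a value from the PERIOD list above
# (minimum 15 minutes for both tasks). The path is a hypothetical example.
.\dedup.exe --optimize-period <PERIOD> --gc-period <PERIOD> --server C:\Data\Archive
```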