clone#
Copy a Digest::SHA object, producing an independent duplicate.
Returns a new object with the same algorithm and the same internal state
— including any data already absorbed via add or addfile. The copy
and the original are independent from that moment on: hashing more data
into one does not affect the other.
Useful when you want several digests that share a common prefix: for example, hashing a large common buffer once, then a short per-record suffix for each record.
Synopsis#
my $base = Digest::SHA->new(256);
$base->add($prefix);
my $copy = $base->clone;
What you get back#
A fresh blessed Digest::SHA object with state copied from the receiver.
Returns undef if called on something that is not a valid
Digest::SHA object.
Examples#
my $base = Digest::SHA->new(256);
$base->add("common-prefix:");
for my $id (1 .. 3) {
    my $h = $base->clone;
    $h->add($id);
    print $h->hexdigest, "\n";    # three distinct digests
}
my $sha = Digest::SHA->new(512);
$sha->add("abc");
my $copy = $sha->clone;
print $copy->hexdigest eq $sha->hexdigest ? "same\n" : "diff\n"; # same
Edge cases#
The copy is a snapshot: a subsequent add on the receiver does not propagate into it. Calling clone on something that is not a Digest::SHA object returns undef rather than croaking.
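The snapshot behavior is easy to verify directly. A minimal sketch using only the documented clone/add/hexdigest calls:

```perl
use strict;
use warnings;
use Digest::SHA;

my $orig = Digest::SHA->new(256);
$orig->add("shared-prefix");
my $snap = $orig->clone;

# Mutating the original after cloning leaves the snapshot untouched.
$orig->add("later-data");

my $expected = Digest::SHA->new(256)->add("shared-prefix")->hexdigest;
my $snap_hex = $snap->hexdigest;
print $snap_hex eq $expected ? "snapshot intact\n" : "unexpected\n";
```

Note that hexdigest resets the object it is called on, so each digest above is read exactly once.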
Differences from upstream#
Fully compatible with upstream Digest::SHA 6.04.
See also#
new — start a fresh context instead of branching from an existing one.
reset — reuse the same object for the next digest.
getstate/putstate — snapshot to a string you can send across processes.
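For branching state across processes rather than within one, the getstate/putstate pair mentioned above serializes the context to a string. A minimal sketch, assuming the standard Digest::SHA API (putstate is a class method):

```perl
use strict;
use warnings;
use Digest::SHA;

my $sha = Digest::SHA->new(256);
$sha->add("common-prefix:");

# getstate serializes the context; putstate restores it into a
# brand-new object, much like clone but via a portable string.
my $frozen = $sha->getstate;
my $thawed = Digest::SHA->putstate($frozen);

my $a = $sha->add("record-1")->hexdigest;
my $b = $thawed->add("record-1")->hexdigest;
print $a eq $b ? "same\n" : "diff\n";
```

Unlike clone, the frozen string can be written to a file or sent over a socket and restored elsewhere.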