You can't do it because it doesn't make any sense. CryptGetHashParam with the HP_HASHVAL option finalizes the hash, so there is no way to add data to it afterwards. If you want to "fork" the hash so that you can finalize it at some point and also keep adding data to it, you must duplicate the hash object (CryptDuplicateHash) before finalizing. Then you add the data to one of the hash objects and finalize the other. For example, you might do this if you wanted to record a cumulative hash after every 1024 bytes of a data stream. You should not call CryptGetHashParam with HP_HASHVAL on the hash object that you are continuing to add data to.
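A minimal sketch of that fork-and-finalize pattern (most error handling omitted; it assumes MD5 via the default RSA provider and uses a dummy 1024-byte buffer in place of real stream data):

```c
#include <windows.h>
#include <wincrypt.h>
#include <stdio.h>

#pragma comment(lib, "advapi32.lib")

int main(void)
{
    HCRYPTPROV hProv = 0;
    HCRYPTHASH hHash = 0, hFork = 0;
    BYTE chunk[1024] = { 0 };          /* stand-in for a block of stream data */
    BYTE digest[16];                   /* MD5 digest is 16 bytes */
    DWORD cbDigest = sizeof(digest);

    if (!CryptAcquireContext(&hProv, NULL, NULL, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT))
        return 1;
    if (!CryptCreateHash(hProv, CALG_MD5, 0, 0, &hHash))
        return 1;

    /* Hash the first block of the stream. */
    CryptHashData(hHash, chunk, sizeof(chunk), 0);

    /* Fork the hash: the duplicate gets finalized, the original keeps going. */
    if (CryptDuplicateHash(hHash, NULL, 0, &hFork))
    {
        if (CryptGetHashParam(hFork, HP_HASHVAL, digest, &cbDigest, 0))
            printf("cumulative hash after first 1024 bytes obtained\n");
        CryptDestroyHash(hFork);       /* the duplicate is finalized and done */
    }

    /* The original hash object was never finalized, so this still works. */
    CryptHashData(hHash, chunk, sizeof(chunk), 0);

    CryptDestroyHash(hHash);
    CryptReleaseContext(hProv, 0);
    return 0;
}
```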
CryptSetHashParam with the HP_HASHVAL option is a brutal hack to overcome a limitation in the CryptoAPI. The CryptoAPI will only sign a hash object, so if you want to sign some data that might have been hashed or generated outside of CAPI, you have to "jam" it into a hash object.
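Something like this, for example (a sketch only: SignExternalMd5 and "MyKeyContainer" are made-up names, and it assumes that container already holds an AT_SIGNATURE key pair):

```c
#include <windows.h>
#include <wincrypt.h>
#include <stdlib.h>

#pragma comment(lib, "advapi32.lib")

/* Sign a 16-byte MD5 digest that was computed outside CryptoAPI. */
BOOL SignExternalMd5(const BYTE externalDigest[16], BYTE **ppbSig, DWORD *pcbSig)
{
    HCRYPTPROV hProv = 0;
    HCRYPTHASH hHash = 0;
    BOOL ok = FALSE;

    *ppbSig = NULL;
    *pcbSig = 0;

    if (!CryptAcquireContext(&hProv, TEXT("MyKeyContainer"), NULL, PROV_RSA_FULL, 0))
        return FALSE;

    /* Create an empty MD5 hash object and "jam" the precomputed value into it. */
    if (CryptCreateHash(hProv, CALG_MD5, 0, 0, &hHash) &&
        CryptSetHashParam(hHash, HP_HASHVAL, externalDigest, 0))
    {
        /* First call gets the required signature size, second call signs. */
        if (CryptSignHash(hHash, AT_SIGNATURE, NULL, 0, NULL, pcbSig))
        {
            *ppbSig = (BYTE *)malloc(*pcbSig);
            if (*ppbSig &&
                CryptSignHash(hHash, AT_SIGNATURE, NULL, 0, *ppbSig, pcbSig))
                ok = TRUE;
        }
    }

    if (hHash) CryptDestroyHash(hHash);
    CryptReleaseContext(hProv, 0);
    return ok;
}
```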
EDIT:
Based on your comment, I think you are looking for a way to serialize the hash object. I cannot find any evidence that CryptoAPI supports this. There are alternatives, however, that are basically variants of my "1024 bytes" example above. If you are hashing a sequence of files, you could simply compute and save the hash of each file. If you really need to boil it down to one value, then you can compute a modified hash where the first piece of data you hash for file i is the finalized hash for files 0, 1, 2, ..., i-1. So:
H(-1) = empty
H(i) = MD5( H(i-1) || file(i) )

As you go along, you can save the last successfully computed H(i) value. In case of interruption, you can restart at file i+1.
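A sketch of one step of that chained scheme (ChainedFileHash is a made-up helper; error checking is kept short):

```c
#include <windows.h>
#include <wincrypt.h>
#include <stdio.h>

#pragma comment(lib, "advapi32.lib")

#define MD5_LEN 16

/* Compute H(i) = MD5( H(i-1) || contents of path ).
   For the first file pass prevDigest == NULL, i.e. H(-1) = empty.
   outDigest can be persisted so the sequence is restartable. */
static BOOL ChainedFileHash(HCRYPTPROV hProv,
                            const BYTE *prevDigest,   /* NULL or MD5_LEN bytes */
                            const char *path,
                            BYTE outDigest[MD5_LEN])
{
    HCRYPTHASH hHash = 0;
    FILE *fp = NULL;
    BYTE buf[4096];
    size_t n;
    DWORD cb = MD5_LEN;
    BOOL ok = FALSE;

    if (!CryptCreateHash(hProv, CALG_MD5, 0, 0, &hHash))
        return FALSE;

    /* Feed the previous cumulative digest first, then the file contents. */
    if (prevDigest && !CryptHashData(hHash, prevDigest, MD5_LEN, 0))
        goto done;

    fp = fopen(path, "rb");
    if (!fp)
        goto done;
    while ((n = fread(buf, 1, sizeof(buf), fp)) > 0)
        if (!CryptHashData(hHash, buf, (DWORD)n, 0))
            goto done;

    ok = CryptGetHashParam(hHash, HP_HASHVAL, outDigest, &cb, 0);

done:
    if (fp) fclose(fp);
    CryptDestroyHash(hHash);
    return ok;
}
```

The caller would acquire the provider once, walk the file list in a fixed order, and persist each returned digest before moving on, so an interruption costs at most one file's worth of rehashing.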
Note that, like any message digest, this chained hash is completely sensitive to both order and content. This is something to consider on a dynamically changing file system: if files can be added or changed during the hashing operation, the meaning of the hash value is affected, and it may be rendered meaningless. You might want to make sure that both the sequence and the content of the files you are hashing are frozen for the entire duration of the hash.