Sixty second explanation: What is Hashing?

Hashing is often loosely described as a kind of one-way encryption, but strictly speaking it isn't encryption at all (and it certainly isn't asymmetric encryption, which is a different thing entirely). A hash function turns data into a fixed-size fingerprint in a way that cannot be reversed: you can never recover the original from its hash, ever. What use is something like that, immediately springs to mind.
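To make that concrete, here's a minimal sketch using Python's standard `hashlib` module (the inputs are just illustrative): whatever you feed in, the digest comes out the same fixed size, and there's no way back from digest to input.

```python
import hashlib

# SHA-256 always produces a 256-bit (32-byte) digest,
# no matter how big or small the input is.
short_digest = hashlib.sha256(b"hi").hexdigest()
long_digest = hashlib.sha256(b"a" * 1_000_000).hexdigest()

# Both are 64 hex characters (64 * 4 bits = 256 bits).
print(len(short_digest), len(long_digest))
```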

As it happens, it’s actually very useful. Hashing is the key (no pun intended) to inviolable digital signatures. When you think about it, nobody actually needs to read a signature; you just need to be able to recognize it. The reason hashing works as a signature mechanism is that it’s always repeatable: the same input produces the same hash, every time. If you know what the signature should be, you rehash the data and check for a match. If something has changed or been tampered with, the hashed value will never match (at least that’s the idea), and the signature has served its purpose. It doesn’t tell you what has changed or who is responsible, just that something has changed.
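The rehash-and-compare idea above can be sketched in a few lines (the document and helper name here are made up for illustration): publish the expected digest once, then anyone can rehash what they received and check it against the expected value.

```python
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 digest of data as a hex string."""
    return hashlib.sha256(data).hexdigest()

document = b"Pay Alice 100 pounds"
expected = digest(document)  # published alongside the document

# Later: rehash what you actually received and compare.
assert digest(b"Pay Alice 100 pounds") == expected  # untouched: match
assert digest(b"Pay Alice 900 pounds") != expected  # tampered: mismatch
```

Note the mismatch tells you only that *something* changed, exactly as described above, not what or by whom.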

That’s not the only use for hashed data. A hash algorithm also reduces whatever you feed it to a fixed, manageable size. Thus the signature for anything, even the complete works of Shakespeare, will be a hash (or message digest to the pedantic) of a fixed size, depending on the hash algorithm used. Popular algorithms are MD5, SHA-1, SHA-256 (and other flavours of SHA-2), and most recently a new SHA-3 algorithm has emerged. There are other variants, but those are the most common. MD5, SHA-1 and SHA-2 share a broadly similar internal design and are successively more secure, with progressively larger digests; SHA-3 is built on an entirely different construction, and deliberately offers the same output sizes as SHA-2.
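You can see those fixed digest sizes directly with `hashlib`, which supports all four families (SHA-3 support assumes Python 3.6 or later; the input string is, of course, just a stand-in for Shakespeare):

```python
import hashlib

# The digest size is a property of the algorithm, not of the input.
for name in ("md5", "sha1", "sha256", "sha3_256"):
    h = hashlib.new(name, b"the complete works of Shakespeare")
    print(f"{name:>9}: {h.digest_size * 8} bits")
# md5: 128 bits, sha1: 160 bits, sha256: 256 bits, sha3_256: 256 bits
```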

So, the more astute of you are probably wondering how one can compress a document the size of the complete works of Shakespeare into a hash of, say, 256 bits (with SHA-256). Sort of TARDIS-like, if you enjoy colourful metaphors like I do. Well, the answer is that it isn’t really compression at all: since there are vastly more possible inputs than 256-bit outputs, many different collections of data must, by the pigeonhole principle, share the same hash. This is called a collision, in true opaque style. Indeed, most attempts to break hash algorithms involve trying to deduce arrangements of data that yield collisions. Consequently, the larger the message digest in bits, the harder it is to find a viable collision; at least, that’s the theory.
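Finding a collision in a real 256-bit hash is computationally out of reach, but the effect of digest size is easy to demonstrate with a deliberately crippled hash. This sketch (a toy of my own construction, not any standard algorithm) truncates SHA-256 to a single byte; with only 256 possible outputs, a collision is guaranteed within 257 inputs:

```python
import hashlib

def tiny_hash(data: bytes) -> bytes:
    # Deliberately weak: keep only the first byte (8 bits) of SHA-256.
    return hashlib.sha256(data).digest()[:1]

seen = {}  # maps each digest to the first message that produced it
for i in range(10_000):
    msg = str(i).encode()
    h = tiny_hash(msg)
    if h in seen:
        print(f"collision: {seen[h]!r} and {msg!r} both hash to {h.hex()}")
        break
    seen[h] = msg
```

Each extra bit of digest doubles the output space, which is why moving from 128-bit MD5 towards 256-bit digests makes this kind of search so much harder.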


