There’s no reason that this chapter shouldn’t or couldn’t be included together with the previous chapter, but here we are. Chapter 27 is about cryptographic algorithms, in slightly more depth than the previous chapter.
This is a continuation of my blog post series on the CompTIA Security+ exam.
Cryptographic systems are composed of an algorithm and a key. The algorithm is the method of converting plaintext to ciphertext and back.
Symmetric Algorithms
Yep, we’re going through these again. Symmetric algorithms use the same key for both encryption and decryption. Common algorithms include DES, 3DES, AES, Blowfish, Twofish, RC2, RC4, RC5, and RC6.
DES
Data Encryption Standard (DES) came out of a 1973 solicitation by the National Bureau of Standards, the agency now known as NIST. DES has since been replaced by AES.
3DES
Triple DES, or 3DES, is like DES, but instead of one round of encryption with one key, there are three rounds of encryption with two or three keys. This makes it more secure.
AES
This is the successor to DES. NIST wanted a block cipher using symmetric key cryptography, and had a handful of options. They went with the Rijndael option, which has three standard key sizes (AES-128, AES-192, and AES-256). It’s secure and computationally efficient.
RC4
“RC” stands for Rivest Cipher, as Ron Rivest designed them. RC4 is a stream cipher (DES, 3DES, and AES are block ciphers). It was popular in TLS and WPA. RC4 can use keys between 8 and 2048 bits, so it’s important to make sure you aren’t using weak (small) keys.
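To make the stream-cipher idea concrete, here’s a rough, educational-only RC4 sketch in Python (my own, not from the book): a key schedule builds a permutation of 0–255, then a keystream generator produces bytes that get XORed with the data. Don’t use this, or RC4 itself, for anything real.

```python
# Toy RC4 for illustration only -- RC4 is considered broken and shouldn't
# be used in new designs.
def rc4(key: bytes, data: bytes) -> bytes:
    # Key-scheduling algorithm (KSA): permute 0..255 based on the key
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]

    # Pseudo-random generation algorithm (PRGA): generate the keystream
    # and XOR it with the data (encryption and decryption are the same op)
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

ciphertext = rc4(b"secret key", b"attack at dawn")
assert rc4(b"secret key", ciphertext) == b"attack at dawn"
```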
Blowfish/Twofish
Blowfish was designed by Bruce Schneier in 1994. It’s a block-mode cipher using 64-bit blocks and a variable key length. Twofish is a block cipher using 128-bit blocks and a variable-length key as well.
Cipher Modes
These are ways of dealing with the problem of multiple blocks of identical plaintext. If these duplicate blocks get encrypted, we want to make sure that the resulting ciphertext doesn’t give away the pattern by also having duplicate blocks.
CBC
Cipher Block Chaining (CBC) is a block mode where each plaintext block is XORed with the previous ciphertext block before being encrypted. The first block is XORed with an initialization vector instead. CBC can’t be parallelized, so it’s not as fast. It’s also subject to POODLE attacks (Padding Oracle On Downgraded Legacy Encryption), which allow someone to deduce the plaintext from two adjacent blocks of ciphertext. Oops.
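Here’s a minimal sketch of the chaining step in Python, using a stand-in “block cipher” (just XOR with a fixed key) that I made up for illustration; a real CBC implementation would use AES or another actual block cipher underneath.

```python
import secrets

BLOCK = 8
KEY = bytes(range(BLOCK))

def toy_encrypt_block(block: bytes) -> bytes:
    # Stand-in block cipher: XOR with a fixed key (not secure, just a placeholder)
    return bytes(b ^ k for b, k in zip(block, KEY))

def xor_blocks(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(plaintext: bytes, iv: bytes) -> bytes:
    previous = iv
    out = bytearray()
    for i in range(0, len(plaintext), BLOCK):
        # Chain: XOR the plaintext block with the previous ciphertext block,
        # then encrypt the result
        ct = toy_encrypt_block(xor_blocks(plaintext[i:i + BLOCK], previous))
        out += ct
        previous = ct
    return bytes(out)

iv = secrets.token_bytes(BLOCK)
# Two identical plaintext blocks now produce two different ciphertext blocks
print(cbc_encrypt(b"SAME BLKSAME BLK", iv).hex())
```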
GCM
Galois Counter Mode (GCM) is recognized by NIST and used in the IEEE 802.1AE standard. It can be parallelized.
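GCM is an authenticated mode, so it protects integrity as well as confidentiality. Here’s a minimal sketch assuming the third-party cryptography package (pip install cryptography) and its AESGCM helper:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)
nonce = os.urandom(12)   # 96-bit nonce; never reuse a nonce with the same key

# The ciphertext includes an authentication tag; tampering makes decryption fail
ciphertext = aesgcm.encrypt(nonce, b"hello", b"associated data")
assert aesgcm.decrypt(nonce, ciphertext, b"associated data") == b"hello"
```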
ECB
Electronic Code Book (ECB) is the simplest cipher mode. The plaintext gets split into blocks, and each block is encrypted separately. This does not protect against the identical blocks problem, and thus, ECB is not recommended.
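You can see the identical-blocks problem with a quick toy demo (same made-up XOR-based stand-in “block cipher” as above): encrypt each block independently, and any repeated plaintext block shows up as a repeated ciphertext block.

```python
BLOCK = 8
KEY = bytes(range(BLOCK))

def toy_encrypt_block(block: bytes) -> bytes:
    # Stand-in block cipher: XOR with a fixed key (not secure, just a placeholder)
    return bytes(b ^ k for b, k in zip(block, KEY))

plaintext = b"SAME BLKSAME BLKDIFFERNT"   # the first two 8-byte blocks are identical
blocks = [plaintext[i:i + BLOCK] for i in range(0, len(plaintext), BLOCK)]
ciphertext_blocks = [toy_encrypt_block(b) for b in blocks]

# ECB leaks the pattern: identical plaintext blocks -> identical ciphertext blocks
print(ciphertext_blocks[0] == ciphertext_blocks[1])   # True
```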
CTM/CTR
Counter Mode (CTM) uses a “counter” function to generate a nonce for each block’s encryption. You take the nonce, encrypt it, and then XOR it with the plaintext. CTM can be multithreaded, and is considered to be secure.
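A sketch of the counter-mode idea, again with a made-up stand-in block cipher: encrypt nonce-plus-counter to get a keystream block, XOR it into the plaintext, and notice that every block is independent, which is what makes it parallelizable. Real counter mode would use AES underneath.

```python
import secrets

BLOCK = 8
KEY = bytes(range(BLOCK))

def toy_encrypt_block(block: bytes) -> bytes:
    # Stand-in block cipher: XOR with a fixed key (not secure, just a placeholder)
    return bytes(b ^ k for b, k in zip(block, KEY))

def ctr_crypt(data: bytes, nonce: bytes) -> bytes:
    out = bytearray()
    for counter in range((len(data) + BLOCK - 1) // BLOCK):
        # Keystream block = E(nonce || counter); each block is independent
        keystream = toy_encrypt_block(nonce + counter.to_bytes(4, "big"))
        chunk = data[counter * BLOCK:(counter + 1) * BLOCK]
        out += bytes(c ^ k for c, k in zip(chunk, keystream))
    return bytes(out)

nonce = secrets.token_bytes(4)
ct = ctr_crypt(b"counter mode demo", nonce)
assert ctr_crypt(ct, nonce) == b"counter mode demo"   # the same operation decrypts
```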
Stream vs. Block
Encryption can operate on blocks or on streams. Block operations are done on fixed-size blocks of data, so both transposition and substitution can be used. Stream operations encrypt data as it arrives, which limits the amount of data available at any given time and limits the cipher to substitution only.
Asymmetric Algorithms
Also known as public key cryptography. Asymmetric functions are built around hard-to-reverse math problems. They’re slower than symmetric encryption so not as well-suited to bulk encryption.
RSA
RSA is named after its inventors, Ron Rivest, Adi Shamir, and Leonard Adleman, and was invented in 1977. The algorithm is based on the product of two very large prime numbers, which is very difficult to factor. It’s almost 100x slower than DES, though.
You can use public key encryption (which is slower) to securely exchange symmetric keys, and then switch over to symmetric key encryption. This is known as electronic key exchange.
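To show the math, here’s textbook RSA with comically small primes (my own sketch, nothing like real key sizes, and with no padding). In a real hybrid scheme, the integer being encrypted would be a symmetric session key being wrapped for exchange.

```python
# Textbook RSA with tiny primes, purely to show the arithmetic
p, q = 61, 53
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)       # Euler's totient (3120)
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e (2753)

session_key = 65              # pretend this is a symmetric key being wrapped
ciphertext = pow(session_key, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)      # decrypt with the private key (d, n)
assert recovered == session_key
```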
DSA
A digital signature is “a cryptographic implementation designed to demonstrate authenticity and identity associated with a message.” By means of public key cryptography, the digital signature algorithm provides traceability to the person signing the message because it uses their private key; the assumption is that whoever is using a key is the owner of that key. If you add hashing, you can ensure message integrity, too.
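A short signing/verification sketch, assuming the third-party cryptography package (pip install cryptography); the message is hashed with SHA-256 as part of signing, which is where the integrity guarantee comes from.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import dsa

private_key = dsa.generate_private_key(key_size=2048)
message = b"I really did send this"

# Sign with the private key; anyone holding the public key can verify
signature = private_key.sign(message, hashes.SHA256())
private_key.public_key().verify(signature, message, hashes.SHA256())  # raises if invalid
print("signature verified")
```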
Diffie-Hellman
Also discussed in chapter 26, and in this blog post. It’s used for electronic key exchange in the SSL and TLS protocols, as well as in SSH and IPSec.
There are some variants and related terms. Diffie-Hellman groups determine the strength of the key used in the key exchange. The higher the group, the more bits in the key. Diffie-Hellman Ephemeral (DHE) is a variant where a temporary key is used, instead of the same key each time. There’s also ECDH, or Elliptic Curve Diffie-Hellman, where ECC is used to generate the keys.
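The core exchange is short enough to sketch in plain Python with a toy prime (real deployments use standardized groups with 2048-bit or larger primes, or ECDH): each side keeps a private exponent, swaps public values, and both arrive at the same shared secret.

```python
import secrets

p = 0xFFFFFFFB                      # toy prime modulus -- far too small for real use
g = 5                               # generator

a = secrets.randbelow(p - 2) + 2    # Alice's private value
b = secrets.randbelow(p - 2) + 2    # Bob's private value

A = pow(g, a, p)                    # Alice sends g^a mod p
B = pow(g, b, p)                    # Bob sends g^b mod p

# Both sides compute the same shared secret without ever transmitting it
assert pow(B, a, p) == pow(A, b, p)
```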
Elliptic Curve
See? We’re going through each topic again. ECC is well-suited for platforms with limited computing power, like cell phones. ECC isn’t as well-tested as other methods, but still is considered secure. It also relies on the difficulty of certain mathematical relationships, so it would likely take a breakthrough in mathematics to weaken ECC.
PGP/GPG
Pretty Good Privacy (PGP) was created by Phil Zimmermann in 1991. It’s now a commercial product offered by Symantec, and it makes clever use of both symmetric and asymmetric encryption methods to get the best of both worlds.
Hashing Algorithms
These are cryptographic methods that are commonly used to store computer passwords and to ensure message integrity.
MD5
MD stands for Message Digest. It’s a series of algorithms, also developed by Ron Rivest, designed to create a message digest or hash from a given input: the input is compressed down to an output of a specified number of bits (128 bits in MD5’s case). MD5 has had some collision issues.
SHA
Secure Hash Algorithm (SHA) is another family of algorithms, developed by NIST and the NSA. There are several variants, including SHA-1, which was developed in 1993 and modeled on MD4. However, SHA-1 is vulnerable to collision attacks.
SHA-2 includes SHA-224, SHA-256, SHA-384, and SHA-512; the number after “SHA-” is the length of the hash output in bits. There’s also SHA-3, but it’s new and not widely adopted yet.
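Python’s hashlib has most of these built in, which makes the digest lengths easy to see for yourself:

```python
import hashlib

data = b"integrity check"
print(hashlib.md5(data).hexdigest())       # 128-bit digest; collision-prone
print(hashlib.sha1(data).hexdigest())      # 160-bit digest; also collision-prone
print(hashlib.sha256(data).hexdigest())    # SHA-2 family, 256-bit digest
print(hashlib.sha3_256(data).hexdigest())  # SHA-3 family, 256-bit digest
```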
HMAC
Also mentioned in the previous chapter. Hash-based Message Authentication Code is a special type of hashing that incorporates a message authentication code to show message integrity. Because the code depends on a shared secret key, HMAC proves authenticity as well.
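A quick sketch with Python’s hmac module: only someone who holds the shared key can produce a matching tag, which is where the authenticity claim comes from.

```python
import hashlib
import hmac

key = b"shared secret"
message = b"wire transfer: $100"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# The receiver recomputes the tag and compares in constant time
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
print(hmac.compare_digest(tag, expected))   # True
```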
RIPEMD and RIPEMD-160
RACE Integrity Primitives Evaluation Message Digest (RIPEMD) is also mentioned; it’s based on MD4 and has collision issues. RIPEMD-160 is a 160-bit revision, but it isn’t necessarily that much stronger.
Key Stretching Algorithms
Key stretching is a way of taking a weak key and “stretching” it to make the system more secure. It does so by adding iterative rounds of computations that make brute forcing infeasible.
BCRYPT
This is a key-stretching mechanism that uses the Blowfish cipher, adds a salt, and runs a configurable number of iterations to slow down brute-force attempts.
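A minimal sketch assuming the third-party bcrypt package (pip install bcrypt); the cost factor roughly doubles the work with each increment, which is the “stretching” part.

```python
import bcrypt

hashed = bcrypt.hashpw(b"correct horse battery staple", bcrypt.gensalt(rounds=12))
print(bcrypt.checkpw(b"correct horse battery staple", hashed))   # True
print(bcrypt.checkpw(b"wrong guess", hashed))                    # False
```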
PBKDF2
Password-Based Key Derivation Function 2 is a pretty unfortunately named method, and acronym. It is meant to produce a key from a password, and utilizes salting and HMAC.
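This one is in Python’s standard library, so a sketch is easy: derive a key from a password with HMAC-SHA-256, a random salt, and a large iteration count.

```python
import hashlib
import secrets

salt = secrets.token_bytes(16)
# 600,000 iterations of HMAC-SHA-256, producing a 32-byte derived key
key = hashlib.pbkdf2_hmac("sha256", b"hunter2", salt, 600_000, 32)
print(key.hex())
```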
Obfuscation
Again, security by obscurity is not a solid strategy by itself. Still, obfuscating, or hiding the meaning or presence of a communication, can be useful.
XOR
This is exclusive OR. It’s a bitwise arithmetic operation. I’m not really sure why the book is pointing this out in this section. Many algorithms are built on XORing data with a key. A potential issue is where the key is shorter than the plaintext and has to be reused.
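A tiny sketch of both points: XOR is its own inverse, and a key that’s shorter than the plaintext just gets repeated, which is exactly the weakness to watch for.

```python
def xor_with_key(data: bytes, key: bytes) -> bytes:
    # The key repeats when it's shorter than the data -- fine for a demo,
    # dangerous in a real cipher
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ciphertext = xor_with_key(b"attack at dawn", b"k")   # one-byte key, reused
print(xor_with_key(ciphertext, b"k"))                # b'attack at dawn' -- XOR undoes itself
```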
ROT13
Not an encryption method, really. This is a type of Caesar substitution cipher where each letter is replaced by a letter 13 places later in the alphabet. Definitely not secure.
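Python even ships a rot_13 codec, which is a decent hint about how seriously to take it; applying it twice gets you back where you started, and there’s no key at all.

```python
import codecs

scrambled = codecs.encode("Attack at dawn", "rot_13")
print(scrambled)                              # Nggnpx ng qnja
print(codecs.encode(scrambled, "rot_13"))     # Attack at dawn
```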
Substitution vs Transposition
Not sure why they didn’t put this earlier in the chapter. Substitution ciphers substitute characters on a character-by-character basis, so the order of the characters doesn’t change. Transposition ciphers change the order of the characters.
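A tiny contrast in Python: substitution swaps each letter for another but keeps the positions, while transposition keeps the letters and shuffles the positions (a simple reversal here, purely for illustration).

```python
text = "HELLO"

# Substitution (Caesar shift of 3): letters change, order stays the same
substituted = "".join(chr((ord(c) - ord("A") + 3) % 26 + ord("A")) for c in text)
print(substituted)   # KHOOR

# Transposition (simple reversal): same letters, different order
transposed = text[::-1]
print(transposed)    # OLLEH
```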