Practice Exams:

CompTIA CASP+ CAS-004 – Chapter 03 – Implementing Advanced Authentication and Cryptographic Techniques Part 3

  1. Hashing

Hashing is another cryptographic concept that is probably familiar to some. It involves running data through a cryptographic function to produce a one-way message digest, also known as a hash value. The size of the message digest is determined by the algorithm that's used, but it represents the data. Hashing is also referred to as nonreversible encryption because the digest is a representation of the data but can't be reversed: you can't use it to determine the original data. It is, however, unique, based on the bits that were in the original value. And so the message digest is used not to support confidentiality, but to support and check data integrity.

So essentially, a one-way hash function is just going to reduce a message to a particular hash value. And the recipient can compare the sender's hash value with its own hash value to determine whether or not the message was compromised. If both the sender and the receiver use the same hash function, then they should get the same values from the same input message. If they get different values, then the message has been altered. In fact, even one bit that is off will produce a completely different hash value. All right, so it's a way to check that integrity.
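As a minimal sketch of that integrity check (using Python's standard hashlib module; the message strings are purely hypothetical), note how changing a single character produces a completely different digest:

```python
import hashlib

# Hypothetical message a sender wants to protect for integrity.
original = b"Transfer $100 to account 12345"
tampered = b"Transfer $900 to account 12345"  # one character changed

# Both sides run the same hash function over the message.
sender_digest = hashlib.sha256(original).hexdigest()
receiver_digest = hashlib.sha256(original).hexdigest()
tampered_digest = hashlib.sha256(tampered).hexdigest()

print(sender_digest == receiver_digest)  # True  - message unaltered
print(sender_digest == tampered_digest)  # False - even a small change flips the digest
print(sender_digest)
print(tampered_digest)
```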

Now, hash functions do have some limitations. If an attacker intercepts a message that contains a hash value, the attacker might be able to alter the original message to create a second invalid message with a new hash value, and then they could send that second invalid message to the intended recipient. At that point, the recipient would have no way of knowing that they had received an incorrect message, because the recipient would perform the hash value calculation and the invalid message would look valid, since the invalid message was appended with the attacker's new hash value, not the original one. So in order to prevent this from occurring, we use a MAC, or message authentication code. It protects the hash value using a symmetric key. Encrypting the hash value with a symmetric key algorithm is what generates that MAC. It doesn't encrypt the message; it's only used to protect the hash value.

Two major vulnerabilities that can occur are collisions and rainbow table attacks. A collision is when a hash function produces the same hash value for different messages. A rainbow table attack occurs when rainbow tables are used to reverse the hash through the computation of all the possible hashes to look up a matching value. So the attacker isn't trying to reverse the hash directly; they're basically feeding many candidate inputs into the rainbow table to find one that produces the hash that was sent, which, in a way, gives them the ability to reverse it. So those are a couple of vulnerabilities, but as we said, hash functions are going to be utilized for data integrity.
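A minimal sketch of the precomputed-lookup idea behind a rainbow table attack (a real rainbow table uses hash chains and reduction functions to save space; the tiny candidate list here is purely illustrative):

```python
import hashlib

# Hypothetical list of candidate passwords an attacker precomputes.
candidates = ["password", "letmein", "qwerty", "summer2024"]

# Precompute a hash -> plaintext lookup table (the "rainbow table").
table = {hashlib.sha256(c.encode()).hexdigest(): c for c in candidates}

# The intercepted or stolen hash the attacker wants to "reverse".
stolen_hash = hashlib.sha256(b"letmein").hexdigest()

# Instead of reversing the hash, just look it up among precomputed values.
print(table.get(stolen_hash, "no match"))  # -> letmein
```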

  1. Message Digest (MD) Algorithms

The algorithms that are used for this are called Message Digest, or just MD, algorithms, and there are a number of them out there. The MD2 algorithm produces a 128-bit hash value. MD2 is still in use today, but it's much slower than the other options. MD4 also produces a 128-bit hash value. It's significantly faster because it only does three rounds of computations in comparison with 18 for MD2, but its use has significantly declined because attacks against it have been very successful.

Like the other MD algorithms, MD5 produces a 128-bit hash value. It was originally created because of the issues with MD4, and it's a lot more complex, but it's not collision free, and for that reason it shouldn't be used for SSL certificates or digital signatures.

The US government requires the use of SHA-2 instead of MD5, but MD5 hash values are still used in a lot of cases by software vendors. The MD6 algorithm produces what's called a variable hash value, so it does a variable number of computations. It was originally introduced as a candidate for SHA-3, but it was withdrawn because of some early issues the algorithm had with differential attacks. It has since been rereleased with the issue fixed, but the release was a little too late, and so it's not included in the SHA-3 standard.

  1. Secure Hash Algorithm (SHA)

The Secure Hash Algorithm, or SHA, is technically a family of four algorithms published by NIST. The original version, SHA-0, produces a 160-bit hash value after performing 80 rounds of computations. It was never very popular because collisions were discovered, and it was quickly replaced by SHA-1.

SHA-1 also produces a 160-bit hash value in the same way, and it corrected the flaw in SHA-0 that made it susceptible to attacks. But we've largely moved on from there. SHA-2 is actually a family of hash functions, each of which provides different functional limits.

SHA-224 produces a 224-bit hash value with 64 rounds of computations. SHA-256 produces a 256-bit hash value, SHA-384 a 384-bit value, and SHA-512 a 512-bit value. Then you have SHA-512/224 and SHA-512/256, which actually produce a 224-bit and a 256-bit hash value; the 512 designation is actually the internal state size, which is the same for both. SHA-3, like SHA-2, is a family of hash functions. It does 120 rounds of computations by default and was formally adopted in May of 2014.
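A quick way to see the different digest lengths in the SHA-2 family (a minimal sketch using Python's standard hashlib module; the sample input is arbitrary):

```python
import hashlib

data = b"CASP+ study notes"  # arbitrary sample input

# Print the digest size (in bits) for several SHA-2 family members.
for name in ["sha224", "sha256", "sha384", "sha512"]:
    digest = hashlib.new(name, data).digest()
    print(f"{name}: {len(digest) * 8}-bit hash value")
```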

  1. Digital Signatures

A digital signature is a hash value that's encrypted with the user's private key. So it's using a hashing algorithm, but it's also using asymmetric encryption. The user's private key is used to secure the digital signature. So digital signatures provide authentication and nonrepudiation, which means you can't claim to have not sent something, as well as validation of integrity.

A blind signature is a form of digital signature where the contents of the message are masked before the message is signed. So here's the process of creating a digital signature. Step one, the signer obtains a hash value for the message, for the data. Then they encrypt the hash value using the private key of that user. They attach the encrypted hash and a copy of the public key, which is the corresponding key in the key pair, to the data and send the message to the recipient.

The recipient then verifies the digital signature by separating the data, the hash and the certificate, obtaining the hash value of the data, and then using the public key to decrypt the encrypted hash value. At that point, the recipient would just compare the two hash values and see that the message had not changed. But in this way, a digital signature is used to protect those hash values.
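A minimal sketch of that sign-and-verify flow, assuming the third-party Python `cryptography` package is installed (the message contents and key size are arbitrary examples):

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

message = b"Quarterly report v1.0"  # hypothetical data to be signed

# Signer's key pair: the private key signs, the public key is shared.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sign: hash the message and protect the hash with the private key.
signature = private_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Verify: the recipient hashes the data and checks it against the signature
# using the sender's public key (raises an exception if anything changed).
public_key.verify(
    signature,
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("Signature valid - message unchanged and sender authenticated")
```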

There are a number of standards for this. The Digital Signature Standard, or DSS, is the federal digital signature standard that governs the Digital Signature Algorithm (DSA). It generates a message digest of 160 bits. It's slower than RSA, and it only provides digital signatures, whereas RSA provides digital signatures, encryption, and secure symmetric key distribution. You also have Elliptic Curve DSA (ECDSA), which is a newer type of DSA standard.

  1. Message Authentication

A message authentication code, or MAC, which we talked about earlier, plays a role similar to code signing because it can provide message integrity and authenticity. You really should be familiar with three different types of MACs. The first is the hashed MAC, or HMAC. It involves a hash function with a symmetric key and provides us with data integrity as well as authentication. Any of the previously mentioned hash functions can be used within HMAC, with HMAC essentially being prepended to the hash function name. So if it was SHA-1, we would just call it HMAC-SHA1.

The strength of HMAC is going to depend on the strength of the hash function, including the hash value size as well as the key size. You also have cipher block chaining MAC, or CBC-MAC, which involves a block cipher that operates in CBC mode and also provides data integrity and authentication. And then there's cipher-based MAC, or CMAC, which operates with a better set of mathematical functions than the CBC version. It addresses some security issues as well, and it's approved to work with the Advanced Encryption Standard and Triple DES.
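A minimal sketch of an HMAC (here HMAC-SHA256, using Python's standard hmac and hashlib modules; the key and messages are hypothetical):

```python
import hashlib
import hmac

key = b"shared-secret-key"                 # symmetric key known to both parties
message = b"Order #4521: ship 10 units"

# Sender computes the MAC over the message with the shared key.
mac = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver recomputes the MAC and compares using a constant-time check.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
print(hmac.compare_digest(mac, expected))  # True - integrity and authenticity hold

# An attacker without the key can't produce a valid MAC for an altered message.
forged = hmac.new(b"wrong-key", b"Order #4521: ship 99 units", hashlib.sha256).hexdigest()
print(hmac.compare_digest(mac, forged))    # False
```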

  1. Code Signing

Code signing is the process of developers and creators of code digitally signing executables and scripts, so that when a user is installing that code, they can be sure it actually comes from a verified author and hasn't been altered in transit. So we know it comes from a trusted publisher and we know it hasn't been altered or corrupted. Java applets, ActiveX controls, and active web and browser scripts often use code signing for their security, and in most cases the signature is going to be verified by a trusted third party like DigiCert, VeriSign, and others.

  1. Additional Concepts

Some additional concepts that we need to be familiar with. Pseudorandom number generation, or PRNG, generates a sequence of numbers that approximates the properties of random numbers using an algorithm. In actuality, the sequence itself is not truly random because it's derived from a relatively small set of initial values. As security professionals, we should be able to recognize that there are some issues that can arise that could be resolved by using pseudorandom number generation. So if an enterprise needs a system that produces a series of numbers that don't have any discernible mathematical progression, say for a Java-based website, pseudorandom numbers should be generated at invocation by Java. Then you have perfect forward secrecy, or PFS, which ensures that a session key derived from a set of long-term keys won't be compromised if one of those long-term keys is compromised in the future, and that a compromised key can't be used to derive any additional keys.

If the key is derived from some other keying material, then that keying material must not be used to derive any more keys. So compromise of a single key in this case would permit access only to data protected by that key. There are two conditions that PFS requires: one, that keys are never reused; and two, that new keys are not derived from any previously used keys. It's incredibly important to know when to implement this. If a security audit has uncovered some encryption keys used to secure, say, financial transactions with a partner and has identified that they may be too weak, then the administrator should implement PFS on all VPN tunnels, because that's going to make sure that financial transactions won't be compromised if a weak encryption key is found. PFS is primarily used in VPNs, but it can also be used by web browsers as well as other services and applications.
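A minimal sketch of the ephemeral-key idea behind PFS, assuming the third-party Python `cryptography` package: each session generates a brand-new X25519 key pair that is never reused and never derived from long-term keys, so compromising one session's secret exposes only that session.

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def new_session_secret():
    """Each session gets fresh ephemeral keys; nothing is derived from old keys."""
    client_ephemeral = X25519PrivateKey.generate()
    server_ephemeral = X25519PrivateKey.generate()
    # Both sides would compute the same shared secret from the exchange.
    # (Real protocols then run this through a KDF before use.)
    return client_ephemeral.exchange(server_ephemeral.public_key())

session1 = new_session_secret()
session2 = new_session_secret()

# Different sessions yield unrelated secrets.
print(session1 != session2)  # True
```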

  1. Protecting Data in Transit

That brings us to data-in-transit encryption. Protecting information is important when the data is in transit, when it's at rest, as well as when it's in memory. Data in transit is going to require transport encryption, so we're making sure the data is protected when it's transmitted over the network or over the Internet. This type of encryption can protect against network sniffing attacks. We have Secure Sockets Layer (SSL)/TLS, HTTP with HTTPS, Secure Electronic Transaction, 3-D Secure, and IPsec. We've already mentioned a number of those, so we don't need to go into more detail on them.
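A minimal sketch of transport encryption in practice, using Python's standard ssl and socket modules to wrap a TCP connection in TLS (the host name is just an example):

```python
import socket
import ssl

host = "example.com"  # hypothetical server to connect to

# Create a TLS context with sensible defaults (certificate and hostname checks on).
context = ssl.create_default_context()

with socket.create_connection((host, 443)) as raw_sock:
    # Wrap the plain TCP socket so everything sent is encrypted in transit.
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        print("Negotiated protocol:", tls_sock.version())   # e.g. TLSv1.3
        print("Cipher suite:", tls_sock.cipher()[0])
```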

Let's focus on a couple that we haven't mentioned, starting with Secure Electronic Transaction, or SET. This was proposed by Visa and Mastercard and intended to secure credit card transaction information over the Internet. It was based on X.509 certificates and asymmetric keys and used an electronic wallet on the user's computer to send encrypted credit card information. But in order to be fully implemented, this would have required the full cooperation of your financial institutions, the credit card users, your retail establishments, and your payment gateways.

Because of that, it was never fully adopted. Visa now promotes the 3-D Secure protocol instead of SET. That's an XML-based protocol designed to give us an additional level of security for online credit card and debit card transactions. You may not recognize 3-D Secure, but you probably recognize the name Verified by Visa, which is the Visa implementation of it; the Mastercard implementation of 3-D Secure is called SecureCode. And then, of course, we did talk in detail about IPsec and the various protocols that are a part of that suite that help us to provide transport layer security.

  1. Protecting Data in Memory

It's also important to encrypt data in memory. Data-in-memory processing is an approach in which all data in a set is processed from memory rather than from the hard drive. It essentially assumes that all the data will be available in memory rather than just what's been most recently used. It's usually done using RAM or cache memory, which results in faster reporting and decision making in business, but securing this requires encrypting data in RAM. Windows offers the Data Protection API, or DPAPI, which lets you encrypt data using the user's login credentials. One of the key questions is just where to store the key, as it's typically not a good idea to store it in the same location as the data. Intel's version of this is Software Guard Extensions (SGX), which ships with Skylake and newer CPUs. It lets you load a program into your processor, verify that its state is correct, and then protect its execution. So the CPU automatically encrypts everything leaving the processor, that is, everything that is offloaded to RAM, and in that way it ensures security.

  1. Protecting Data at Rest

Data at rest refers to data that's stored physically in any digital form that's not active, so it's on the hard drive. This data can be stored in databases, data warehouses, archives, tapes, offsite backups, flash drives, technically any storage medium. And data at rest is most often going to be protected by using data encryption algorithms. The algorithms that are used in computer systems implement very complex mathematical formulas, and they're used to convert plaintext into ciphertext. The two primary components of any encryption system are going to be the algorithm and the key. Now, in some cases, the communicating parties use the same key. This is known as symmetric encryption. In other cases, they use different keys, but the keys are related to one another. That's known as asymmetric encryption.

So we're going to look at these two as well as the hybrid cipher. Symmetric algorithms use a private or secret key that's shared between the two parties. This key has to be kept secret because the same key that's used to encrypt the data is used to decrypt the data. But each party pair has to have a separate private key, and so sometimes this is called a session key. It does provide confidentiality, but it does not provide authentication or nonrepudiation. It goes by a few different names, and those are single key, shared key, or secret key cryptography. We'll go through the different versions of this, but you've got DES, AES, Blowfish, and Twofish. Those are some of the symmetric key algorithms.
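A minimal sketch of symmetric encryption for data at rest, assuming the third-party Python `cryptography` package (its Fernet recipe uses AES under the hood; the file contents here are hypothetical):

```python
from cryptography.fernet import Fernet

# One shared secret key: whoever holds it can both encrypt and decrypt.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"Customer list: Alice, Bob, Carol"  # hypothetical data at rest

ciphertext = cipher.encrypt(plaintext)   # store this on disk instead of the plaintext
recovered = cipher.decrypt(ciphertext)   # only a holder of the same key can do this

print(recovered == plaintext)  # True
```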

Asymmetric key algorithms use both a public and a private key. In the pair, the keys are related to one another, but it is not possible to derive the private key from the public key; if it were, that would completely eliminate the usefulness of the encryption model. The public key is known to both parties; it's made accessible.

The private key, on the other hand, is kept secure. So if I want to send you a secure message and have you be able to decrypt it, I can encrypt that message with my own private key, and then as long as you have access to my public key, you would be able to decrypt that message. Now, that's not typically how it's done, and you should see the flaw in it: technically, anybody who can access my public key would be able to decrypt that message.

Well, we just said my public key is public. Now, it's not public in the sense that anybody can get access to it; they'd have to actually connect to it and locate it. But it's usually done in the reverse. If I want to send you a protected document, I would encrypt the document with your public key, because I have access to that, and that ensures that only you, with the corresponding private key, can actually decrypt that information. So that's typically how we're going to see asymmetric key encryption used.
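A minimal sketch of that typical direction (encrypt with the recipient's public key, decrypt with their private key), again assuming the third-party Python `cryptography` package:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The recipient's key pair; only the recipient ever holds the private key.
recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public = recipient_private.public_key()

message = b"Wire instructions attached"  # hypothetical protected document

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Sender encrypts with the recipient's PUBLIC key...
ciphertext = recipient_public.encrypt(message, oaep)

# ...and only the matching PRIVATE key can decrypt it.
print(recipient_private.decrypt(ciphertext, oaep) == message)  # True
```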

And then, as we said, the third one is the hybrid cipher. Because both symmetric and asymmetric algorithms have weaknesses, there have been solutions that use both types in a hybrid cipher, and in that way it provides confidentiality, authentication, and nonrepudiation in many cases. What happens here is that a symmetric key is generated, and the symmetric key is then passed in a secure fashion using asymmetric algorithms.
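A minimal sketch of a hybrid cipher that combines the two examples above: the bulk data is encrypted with a symmetric Fernet key, and that key is then wrapped with the recipient's RSA public key for delivery. This again assumes the third-party Python `cryptography` package, and the payload is hypothetical.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Recipient's long-term asymmetric key pair.
recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public = recipient_private.public_key()

# 1. Sender generates a fresh symmetric (session) key and encrypts the bulk data with it.
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"Large payload goes here")

# 2. Sender wraps the session key with the recipient's public key for safe delivery.
wrapped_key = recipient_public.encrypt(session_key, oaep)

# 3. Recipient unwraps the session key with the private key, then decrypts the data.
unwrapped_key = recipient_private.decrypt(wrapped_key, oaep)
plaintext = Fernet(unwrapped_key).decrypt(ciphertext)
print(plaintext)  # b'Large payload goes here'
```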