
AWS Certified Security - Specialty: AWS Certified Security - Specialty (SCS-C01) Certification Video Training Course

The complete solution to prepare for your exam with the AWS Certified Security - Specialty: AWS Certified Security - Specialty (SCS-C01) certification video training course. The AWS Certified Security - Specialty: AWS Certified Security - Specialty (SCS-C01) certification video training course contains a complete set of videos that will provide you with thorough knowledge of the key concepts. Top-notch prep including Amazon AWS Certified Security - Specialty exam dumps, study guide & practice test questions and answers.

86 Students Enrolled
165 Lectures
21:41:00 Hours

AWS Certified Security - Specialty: AWS Certified Security - Specialty (SCS-C01) Certification Video Training Course Exam Curriculum

1. Getting started with the course (1 Lecture, 00:05:00)
2. Domain 1 - Incident Response (11 Lectures, 01:28:00)
3. Domain 2 - Logging & Monitoring (29 Lectures, 03:58:00)
4. Domain 3 - Infrastructure Security (41 Lectures, 05:38:00)
5. Domain 4 - Identity & Access Management (47 Lectures, 06:35:00)
6. Domain 5 - Data Protection (31 Lectures, 03:18:00)
7. Important points for Exams (5 Lectures, 00:39:00)

Getting started with the course

  • 04:32

Domain 1 - Incident Response

  • 03:18
  • 07:24
  • 07:28
  • 08:20
  • 04:33
  • 12:19
  • 02:38
  • 15:59
  • 05:01
  • 11:18
  • 06:27

Domain 2 - Logging & Monitoring

  • 05:17
  • 08:02
  • 08:23
  • 08:38
  • 06:05
  • 06:42
  • 09:04
  • 07:53
  • 06:20
  • 12:47
  • 06:06
  • 04:22
  • 10:38
  • 03:38
  • 08:00
  • 04:15
  • 07:53
  • 08:54
  • 11:54
  • 09:21
  • 13:20
  • 09:17
  • 09:17
  • 08:40
  • 02:18
  • 08:39
  • 07:52
  • 07:15
  • 13:41

Domain 3 - Infrastructure Security

  • 11:01
  • 05:45
  • 05:12
  • 03:19
  • 06:15
  • 09:26
  • 13:49
  • 04:43
  • 07:40
  • 12:56
  • 09:17
  • 13:39
  • 10:50
  • 04:51
  • 06:30
  • 12:38
  • 11:57
  • 05:15
  • 13:48
  • 07:54
  • 05:31
  • 07:47
  • 07:22
  • 05:23
  • 09:07
  • 06:53
  • 09:55
  • 08:20
  • 04:05
  • 09:50
  • 15:12
  • 09:32
  • 04:18
  • 05:46
  • 04:24
  • 05:42
  • 10:48
  • 05:36
  • 02:45
  • 10:48
  • 06:15

Domain 4 - Identity & Access Management

  • 06:17
  • 08:34
  • 05:15
  • 11:33
  • 15:06
  • 11:56
  • 07:02
  • 10:02
  • 11:54
  • 10:56
  • 07:56
  • 07:52
  • 09:56
  • 04:15
  • 07:24
  • 03:55
  • 07:08
  • 07:24
  • 04:21
  • 16:55
  • 04:18
  • 06:08
  • 08:02
  • 14:09
  • 07:19
  • 20:33
  • 06:15
  • 11:23
  • 10:28
  • 04:13
  • 04:16
  • 09:17
  • 07:25
  • 10:34
  • 07:31
  • 10:59
  • 09:05
  • 13:47
  • 06:02
  • 07:37
  • 09:38
  • 01:47
  • 04:46
  • 05:28
  • 04:24
  • 01:24
  • 03:32

Domain 5 - Data Protection

  • 12:15
  • 07:07
  • 06:36
  • 09:14
  • 08:20
  • 07:26
  • 09:26
  • 04:00
  • 04:51
  • 03:21
  • 05:35
  • 03:01
  • 04:25
  • 01:56
  • 03:08
  • 00:44
  • 01:41
  • 12:08
  • 07:45
  • 08:20
  • 07:32
  • 11:55
  • 08:42
  • 08:59
  • 05:23
  • 03:08
  • 04:54
  • 07:40
  • 02:57
  • 05:59
  • 10:07

Important points for Exams

  • 05:49
  • 08:16
  • 09:21
  • 05:45
  • 08:50

About AWS Certified Security - Specialty: AWS Certified Security - Specialty (SCS-C01) Certification Video Training Course

AWS Certified Security - Specialty: AWS Certified Security - Specialty (SCS-C01) certification video training course by PrepAway, along with practice test questions and answers, study guide and exam dumps, provides the ultimate training package to help you pass.

Domain 2 - Logging & Monitoring

23. AWS Config - Part 2

Hey everyone, and welcome to Part 2 of AWS Config. In the previous lecture, we discussed the fundamentals of AWS Config and how it can assist us in tracking infrastructure changes. So today we look into more features of AWS Config, and there is one very amazing feature called "compliance check" that Config provides. So let's understand what that means. Again, just monitoring infrastructure-related changes is not enough.

As a security specialist, we should be monitoring the security aspect as well. So there are various use cases related to security best practices: root MFA should be enabled; all S3 buckets should follow best practices such as logging and versioning; security groups should not have unrestricted access to port 22 (or possibly another port such as 3306, etc.); CloudTrail must be enabled; and one more rule: no unused EIP should be present. That last one can be part of the cost factor as well. So these are five points, related to security as well as cost optimization, which are important. Now, how do you actually monitor all of these things? This is only a sample of five.

There can be hundreds of different points. So there should be some kind of centralised dashboard that can tell you whether your account is compliant with all of these rules. And this is what AWS Config allows us to do. So again, based on the use case that you configure, AWS Config can show you the compliance status. Here, the rule restricting SSH shows as compliant, so SSH is restricted; it's not open to all. The rule checking attached EIPs is compliant, so there is no unused EIP. Root MFA is enabled, so that is compliant as well. However, there are certain resources that are noncompliant here. So, simply by inspecting the controls, you can determine whether your infrastructure is compliant or not. And generally, if the auditor comes, you can directly show the auditor this page, provided you have all the documentation over here.
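
If you also want this compliance view outside the console, the CLI can produce a quick summary. A minimal sketch with the AWS CLI:

    # Summarise compliant / noncompliant status for every Config rule
    aws configservice describe-compliance-by-config-rule \
        --query 'ComplianceByConfigRules[].[ConfigRuleName,Compliance.ComplianceType]' \
        --output table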

So this is what AWS Config allows us to do. So let's look at how we can configure these rules. Going back, let's go to the AWS Config console. Now these are the resources in the inventory. Look at the first tab over here, which says Rules. By default, Amazon gives us a lot of rules that we can use within our infrastructure. At the time of writing, there are 32 managed rules included by default in AWS Config. These rules basically check IAM, EC2 instances, root MFA, S3 buckets, and so on. So let's do one thing: let's enable certain rules out here. Let me enable EC2 detailed monitoring. Okay, let me enable this particular rule. Okay, so it is being evaluated. Let's add a few more rules over here. Let's see; let's go to the next part. Okay, s3-bucket-logging-enabled and s3-bucket-versioning-enabled.

We want versioning to be enabled in all S3 buckets, so I'll click on "Save." I'll add this particular rule as well. I'll click on "Add rule." Let's add a few more rules. Let's see: cloudtrail-enabled. This is again a very important rule that should be there, so I'll add this rule. Let me add a few more rules so that our dashboard looks pretty nice. Okay, let me go to the next one, eip-attached. Again, this is very important because, specifically on the free tier, if you have an EIP that is not attached to an EC2 instance, you will be charged for that EIP. A lot of people get charged because they have EIPs that are not attached to any EC2 instance. So just remember that every EIP you allocate should be attached. I'll click on "Save." So we have around four rules here, and you see that it is showing me the compliant as well as the noncompliant status. For EC2 instance detailed monitoring, it is saying noncompliant, and there are three resources that are not compliant. For s3-bucket-versioning-enabled, again, there are two noncompliant resources.
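
For reference, the same managed rules can be enabled from the AWS CLI instead of the console. A minimal sketch (the SourceIdentifier values are AWS's managed-rule identifiers):

    # Enable the S3 versioning check as a Config rule
    aws configservice put-config-rule --config-rule '{
        "ConfigRuleName": "s3-bucket-versioning-enabled",
        "Source": {"Owner": "AWS", "SourceIdentifier": "S3_BUCKET_VERSIONING_ENABLED"}
    }'

    # Same pattern for the CloudTrail and unattached-EIP checks
    aws configservice put-config-rule --config-rule '{
        "ConfigRuleName": "cloudtrail-enabled",
        "Source": {"Owner": "AWS", "SourceIdentifier": "CLOUD_TRAIL_ENABLED"}
    }'
    aws configservice put-config-rule --config-rule '{
        "ConfigRuleName": "eip-attached",
        "Source": {"Owner": "AWS", "SourceIdentifier": "EIP_ATTACHED"}
    }'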

cloudtrail-enabled shows compliant. Yes, we have CloudTrail enabled. And it will also tell me whether or not each EIP is attached. So this is one of the ways in which you can configure the rules of AWS Config. Now, again, as we discussed, there are around 32 default rules that come built in. What happens if you want to add more rules? You can, of course, add more rules: you can write those rules as Lambda functions, and you can connect those functions with the Config service. So here you see that there is one EIP that is not attached. Okay, this is dangerous because I will be charged for this particular unused EIP.
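
As a rough sketch of what registering such a custom rule looks like from the CLI (the Lambda ARN is a placeholder, and the function itself, which must report evaluations back to Config, is not shown):

    # Register a custom Config rule backed by a Lambda function.
    # Config must also be granted permission to invoke the function.
    aws configservice put-config-rule --config-rule '{
        "ConfigRuleName": "my-custom-rule",
        "Source": {
            "Owner": "CUSTOM_LAMBDA",
            "SourceIdentifier": "arn:aws:lambda:us-west-2:111122223333:function:my-config-rule",
            "SourceDetails": [{
                "EventSource": "aws.config",
                "MessageType": "ConfigurationItemChangeNotification"
            }]
        }
    }'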

So I should be removing the EIP, and you should be doing the same if you have an EIP that is not attached. So there is one noncompliant resource that you see. I have four EIPs, among which there is one that is noncompliant. So let me go to this particular EIP. Okay, so this is the EIP.

Let me actually go to the EC2 console, open Elastic IPs, and paste the EIP, and you will see that this EIP is not attached to any of the instances. So why keep it? Just release it; you'll save the cost. And I'll release this particular EIP. So this is the basic information about AWS Config. Now, there is one more important thing that you should remember. We already discussed the CIS benchmark, and there is a very nice GitHub repository that contains a lot of AWS Config rules that you should have within your AWS account, specifically if you're running production services and security is something that's important to you.
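
If you prefer to hunt down unattached EIPs from the CLI rather than the console, something along these lines should work (the allocation ID is a placeholder):

    # List Elastic IPs that have no association (i.e., unattached)
    aws ec2 describe-addresses \
        --query 'Addresses[?AssociationId==`null`].[PublicIp,AllocationId]' \
        --output text

    # Release an unattached EIP by its allocation ID
    aws ec2 release-address --allocation-id eipalloc-0123456789abcdef0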

So if you go to the rules.md file over here, this file basically tells you what rules are present within this particular GitHub repository. So, as you can see, there are a lot of rules related to the IAM password policy, key rotation, whether an IAM user has MFA enabled or not, whether VPC Flow Logs are enabled, and so on.

So there are around 34 rules present over here, and there are around 32 rules present by default within AWS Config. So, if AWS keeps updating its default rule set, you can add those rules; if for the time being it does not, you can write your own rules within a Lambda function. So that's the lowdown on the AWS Config service. I hope this has been useful for you, and I would really encourage you to practise this once. And if you are managing the AWS infrastructure of an organization, I really recommend that you have some kind of dashboard that shows you the compliance status for all the resources. So this is it. I hope this has been useful for you, and I'd like to thank you for viewing.

24. CloudTrail - Log File Integrity Validation

Hey everyone, and welcome back. In today's video, we will be discussing CloudTrail log file validation. CloudTrail logs are a critical asset within your organization, particularly if there is a breach or something unexpected occurs within your AWS account. Typically, if you look at organisations that use AWS, particularly enterprises, you will find multiple teams with admin access to the AWS account. They may include the DevOps team, the SRE team, the security team, and others. In such cases, it is critical to be able to validate whether or not the log files that CloudTrail delivers to the S3 bucket have been tampered with.

And CloudTrail log file integrity validation actually allows us to determine whether a log file was modified, deleted, or left unchanged after CloudTrail delivered it to an S3 bucket. This is achieved with SHA-256 hashing as well as RSA digital signatures. On that note, do remember that we have a great course on PKI and cryptography planned; at the very least, it should be completed this year. So let's go ahead and understand this in a practical way. This is quite an important feature, and in your AWS account you should be enabling it. So let's get started.

So I'm in my CloudTrail console; let's go to the trails, and there are three trails available. Each trail is specific to a region. Since we are in the Oregon region, we'll take the Oregon trail over here. Let me open up this trail, and if you go a bit down, you should see that there is an option to enable log file validation, and it is set to yes. Typically, when you go ahead and create a trail, let me quickly show you how: within the advanced section, there is an option to enable log file validation. Ensure that this is always set to yes. Now let's do one thing. Let's take a look at how this specific trail stores data in a bucket called kplabs-cloudtrail-oregon. So this is the bucket. Now if you go into the us-west-2 prefix here, you should see that there are a lot of log files being delivered. Great. So let's go back to the bucket level, and within AWSLogs, within the account ID, you should see that there are two directories available. One is CloudTrail-Digest.
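
For reference, log file validation can also be switched on from the CLI, either at trail creation or afterwards. A minimal sketch using the bucket and trail names from this demo:

    # Create a trail with log file validation enabled
    aws cloudtrail create-trail \
        --name kplabs-cloudtrail-oregon \
        --s3-bucket-name kplabs-cloudtrail-oregon \
        --enable-log-file-validation

    # Or enable it on an existing trail
    aws cloudtrail update-trail \
        --name kplabs-cloudtrail-oregon \
        --enable-log-file-validation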

And the second is CloudTrail. So the CloudTrail-Digest directory contains the digest files, which are used for validating whether the log files within the trail bucket have been tampered with or not. You can see that there are two digest files present over here. Now, before we go ahead and understand more, let me go ahead and download these digest files. I'll go ahead and download them. Great. So let's quickly open the JSON file; this is what the JSON really looks like, and what we'll do is copy it to a JSON validator. This is a pretty good website that validates the JSON as well as formats it. So let's paste it here and I'll walk you through the steps, and here you will be able to see things in much better detail. So if you see over here, it actually tells you that this is an object file. So this is a log file that CloudTrail has delivered, and this is the hash value associated with the file, computed with SHA-256.

Again, there is one more file over here; there is a hash value, and again it is SHA-256. This maps directly to the log files. Let me go a little higher. So here I am, heading to the CloudTrail directory. Great. So you see, there are a lot of files available, and within the digest, you will see the hash value associated with each of these files over here. So if any of these files have been tampered with, you will be able to detect it pretty quickly. So let's try it out and see how you can detect it. So I'm in my CloudTrail CLI, and there are a few commands that we are more interested in. The first one is describe-trails, and the last one is validate-logs.

So the "describe trail" basically gives you information about your cloud trail, including the ARN. So let's put it within the CLI. So let's do the AWS Cloud Trail. Describe the hiking trails in our area. West two. Now this will give you the trail. So the trail name associated with the USWest Two is KP Lapse Cloud Trail, Oregon. This is the trail ARN, which is useful while we are validating. It also says that log file validation is enabled. Great. So once we have this, let's go ahead and look into the Cloud Trail validation logs. Now within the validate logs menu, there are three options. So you've validated the logs. So this is a CLI command. It has two options, in fact. One is the trailing ARN, and the second is the start time. So let's try it out. So, say, AWS.Cloud trail. Validate logs. I'll say trail ARN.

So we can just copy the trail ARN from here, and the next thing is the start time. So let's get started. If you go into the example, this is the format; let me just copy it, and what we'll do is modify it. I'll change the year to 2019, and the month and date to 01. Now let's press Enter. Oops, we made a little typo; it should be 01. So now it is going ahead and validating the log files that are part of this trail. And it says that two out of two digest files are valid, and seven out of seven log files are valid. Great. Now, in order to verify whether things are working as expected, what we'll do is intentionally delete a specific log file within the S3 bucket. So I'll take the first file and delete it from the S3 bucket.
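
Put together, the two commands from this demo look roughly like this (the account ID in the ARN is a placeholder):

    # Fetch trail details, including the ARN, for the region
    aws cloudtrail describe-trails --region us-west-2

    # Validate all digest and log files delivered since the start time
    aws cloudtrail validate-logs \
        --trail-arn arn:aws:cloudtrail:us-west-2:111122223333:trail/kplabs-cloudtrail-oregon \
        --start-time 2019-01-01T00:00:00Z \
        --region us-west-2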

Great. So once the file is deleted, let's try and rerun the same thing. And now, if you look into this, it says that six out of seven log files are valid and one out of seven log files is invalid. It is also saying that this specific log file was not found. Now, the reason why it is able to find this specific entry is because of the data that is present within the digest file. So these are the details that are associated with it. Now, one smart guy might ask, "Hey, I can in fact modify this specific digest, and the log file validation will still succeed." And the answer is no. Let me quickly show you why: what CloudTrail does here is sign each of the digest files with an associated public and private key pair.

The digital signature of the previous digest file is also included in the next digest file. Now, since we are talking about the public and private keys, let's go to the CloudTrail CLI, where there is an option to list the public keys. In fact, you can validate the digest with the help of the listed public keys if you write a specific function for it. So let's try it out: aws cloudtrail list-public-keys, with the region set to us-west-2. So you see here, it gives the value associated with the public key, through which we'll be able to validate the digests as well. So this is a high-level overview of CloudTrail log file integrity validation. I hope this video has been informative for you, and I look forward to seeing you in the next video.
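
The command used above, for reference:

    # List the public keys used to sign digest files in this region
    aws cloudtrail list-public-keys --region us-west-2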

25. Digest Delivery Times

Hey everyone, and welcome back. Now, in the earlier video, we were discussing CloudTrail file integrity monitoring for the logs. Let me quickly demonstrate something that I intended to show but missed in the previous video. This is not required for the exam, but it matters if you implement this in the real world. I still remember that one of my colleagues came back to me: he had enabled log file validation, he tried deleting a log file, and he said that the validation was still succeeding even though the log file was modified in the S3 bucket. And the question is: how? Let me quickly show you what exactly he meant by this.

So if you quickly do a CloudTrail log file validation here, it says that six out of seven log files are valid and only one out of seven log files is invalid. This is something that we already saw because we had deleted one log file. Now, if you look into the CloudTrail console, there are more than seven files; I'm guessing there are around 20 files. Now the question is: why is it only looking at seven files? The answer is that the digests in the CloudTrail bucket get delivered on an hourly basis. So let's assume that you have a log file over here, delivered at 12:31. If you look at the current time, it's 12:41.

So this log file is currently present. However, no digest has yet been generated for this log file, so you can tamper with it right away. Let me go ahead and delete this, and let's try to run this command again. You see, it is still showing six out of seven log files as valid and one out of seven log files as invalid. The reason is that, although we have tampered with this file, the digest associated with it has not reached the CloudTrail bucket yet. Once the digest has been pushed to the bucket and you then do a log file validation, all of these changes that we have made will be discovered. So this is an important point to keep in mind. So that's about it for this video. I hope this has been informative for you, and I look forward to seeing you in the next video.
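
As a footnote to this video, one way to observe the digest delivery schedule yourself is to list the digest prefix and compare timestamps against the log files (bucket name and account ID are placeholders):

    # Digest files are delivered under a separate CloudTrail-Digest prefix,
    # roughly on an hourly schedule
    aws s3 ls \
        s3://kplabs-cloudtrail-oregon/AWSLogs/111122223333/CloudTrail-Digest/us-west-2/ \
        --recursive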

26. Overview of AWS Macie (New)

Hey, everyone, and welcome back. In today's video, we will be discussing AWS Macie. So AWS Macie is a new security service that makes use of machine learning to identify and protect the sensitive data that is stored in AWS from things like breaches, data leaks, and unauthorised access.

Now, Macie can automatically discover and classify the data it scans, assign it a business value, and monitor it to detect any suspicious activity. If you have worked with user behavioural analytics, you can say it does something similar to that. Now, the great thing about Macie is that it can automatically detect sensitive data in the data stores that it supports. So let's say that S3 can contain a lot of sensitive information. For example, it may contain PII data in an unencrypted form. It can contain database backups, SSL private keys, access and secret keys, and various other things.

So, nowadays, when you talk about enterprises, they store, or I should say dump, massive amounts of data into S3. Now, that data can contain a lot of critical information, and you don't really want critical information to be stored in S3 in a nonencrypted manner. And now the question is: let's say you have 100 accounts, and those 100 accounts have a lot of S3 buckets. How will you go into each and every object within each and every S3 bucket in 100 accounts and validate the sensitivity of the data there? It is a really tedious task, and Macie does a great job there. So let's understand this with a simple example. Let's say that the DevOps team within your organisation has added a static website to your S3 bucket that can be accessed by the public. However, the application developer who created that static website has hard-coded the AWS credentials, that is, the access and secret key, within the static website.

So Macie will be able to detect that quite easily. So let's do one thing. That was the theoretical perspective; let me quickly show you the practical demo so it becomes much clearer. So this is my AWS Macie dashboard, and this is what it really looks like. Now, if you look into the main console over here, it says "Critical Assets," and within it, you have one critical asset. You can see that there is a lot of information in this document, like an AWS access key, AWS credentials context, and an AWS secret key. Now, an event that is detected by Macie generally has a risk score associated with it, so you can basically specify the risk score that you are interested in. Within your dashboard, you see there are a lot of options available. This one shows S3 objects for the selected time range.

Then you have S3 objects, and then you have S3 objects by PII. You have various other options available over here. Let's quickly explore all of them. So within the S3 objects view, you see there are three monitored S3 objects: one is the access key, one is the credential context, and the other is the secret key. We will understand this more in the upcoming screens. The next option is S3 objects by PII. PII here is showing you a few objects that contain credit card information. So let me click on "search" over here, and if you go a bit down, you will see that it is basically giving you the object name that contains the PII data. And each object also has a risk score associated with it. So let's click on one of them. And now it basically tells you the bucket name where the object is present.

The bucket is named demo-public-pii-bucket. It also shows you the name of the object, cc-user.txt. Then it gives you the details, saying that there are three distinct credit card numbers found in the 66 characters of text. So basically, AWS Macie has the capability to open and read through the files that are stored in S3. Again, if a file is encrypted, then it will not be able to do so, but otherwise, there are certain extensions that it supports. And if you go a bit down, you see that this data is subject to the Amazon S3 block public access settings.

So it can even read the ACLs related to S3. So let's go back to the dashboard. I'll quickly show you the demo so that it's easier for us to understand. So this time, let's click on S3 public objects and buckets. Here it basically shows you the list of S3 buckets where objects are open to the public. Now for a few more interesting things; we'll keep the demo short, otherwise it will go quite long. So, next is the activity location. Basically, it shows you the location from which the activity is happening. So, for example, if someone logs in or performs an activity, you can see that certain activities are detected, and it shows you the exact city from where the activity is currently taking place, in this case in India.

So I'm currently in Mumbai for a few meetings, and this is the reason why it detected the location as Mumbai. Great. So now let's go to the alerts. Within alerts, there is one critical-level alert. It appears that AWS credentials have been uploaded to Amazon S3. Now, you should never have AWS credentials uploaded to Amazon S3, especially in plaintext. So if you click on this specific alert, it tells you the specific bucket, that the risk is ten, that the theme is access key and secret key, and it also gives you the ARN. If you go a bit down, you see the risk score, and the file name is access-new.txt. If I click on that file, it basically gives you all the information related to it. So let's do one thing. Let's quickly open up the file and look at what exactly it looks like.

So I'm in my S3 bucket, and this file is called access-new.txt. When I open the file, you'll notice that it essentially contains the AWS access key and the secret key. And this is the reason why Macie was able to detect it, and it also sent us an alert for that. Now, the question is: what are the things that Macie will be able to detect? We saw that it was able to detect the credit card information, and it was able to detect the access and secret key. So, if you want to look into the settings, let's look into the regexes. There are a lot of regexes, and each regex has an associated risk score. So you have the Bank of America routing numbers, you have the CVE numbers, and if you go a bit down, it contains the MySQL database dump. And, you see, this is the interesting one: it can detect the RSA private key.

It can detect SSL certificates. We already discussed the AWS access key, credentials context, and secret key. Now, the secret key here has a risk score of ten, and this is the reason why it was able to alert at this level of criticality. It can even detect a GitHub key, a Facebook key, and various others. So I hope you understand at a high level what Macie is trying to achieve over here. Now, Macie has a limitation: because this is a new service, there is a limitation on the data stores that it can scan. It cannot really scan every data store. Currently, it can scan the data that is present inside S3 buckets and also analyse the CloudTrail logs.

27. Creating our First Alert with AWS Macie (New)

Hey, everyone, and welcome back. Now, in the earlier video, we had a very high-level overview of AWS Macie. We also had a practical demo, which allowed us to get a better understanding of what the service is all about. So in today's video, we'll get to the interesting part of performing the practical. We'll look into how we can enable AWS Macie, as well as various other configurations associated with it. So I'm in my AWS management console. Now let's go to the services, and we'll open up Amazon Macie. This is the dashboard that we saw in the earlier demo; it is for the North Virginia region. Now, Macie currently supports only two regions; more should be coming. However, for our practical purposes, we'll switch to the Oregon region so that we can start from scratch.

So, whenever you launch Amazon Macie, this is what you'll see, and you need to click on "Get Started." Now, Macie will need certain service role permissions. These are the permissions that it can create automatically. We'll leave them as is, and then click Enable Macie. So it went ahead, and the enable process began; it might take a minute or two. So Macie has been enabled. Now, the next thing that we already discussed is that Macie collects data from various sources, which include CloudTrail, and it also has the capability to scan the data within the S3 data store. So currently, if you look into the integrations, this is the account. So this is my current account, which is present over here. Let me click on "select."

And here you can basically specify which S3 buckets Macie should be able to scan. So before we do that, let's go to the S3 service. And here I'll create a new bucket; let's call it kplabs-pii-demo. For simplicity's sake, I'll create it in the Oregon region. All right, so currently, this is the bucket. However, the bucket does not really have any data over here. So we'll go ahead and add some simple text files. Now, there are two text files that we'll be adding. The first text file basically contains an access and secret key, and the second text file also contains an access and secret key. Now, there is a reason why we are uploading two TXT files; we will understand it once Macie goes ahead and scans the text documents. So the first one is called access-key.txt, and then you have access-new.txt.

So let's go ahead and upload both of them. All right, so both of these files have been added to the S3 bucket. So now let's go back to Amazon Macie. We'll quickly refresh the page in case it does not detect the latest changes. Now, within the S3 resources integrated with Macie, we'll click on "Add," and here I'll select the kplabs-pii-demo S3 bucket that we created. So let's go ahead and add that specific bucket. I'll do a review and start the classification. Great. So it says that the setting is updated. Now, if we go to the dashboard over here, within the dashboard you have the critical assets listed, and basically you don't really have any data yet. Macie requires some time to go ahead, initiate the scan, and go through the objects that are present in the S3 bucket. So I'll pause the video for a few minutes, and we'll come back once the data has been scanned. All right, so it has been close to 10 to 15 minutes.
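
Note that this demo uses the original Macie console. In the current Macie 2 API, the rough CLI equivalent of pointing Macie at a bucket is a classification job; a sketch under that assumption, with the account ID as a placeholder:

    # One-time sensitive data discovery job over the demo bucket (Macie 2 API)
    aws macie2 create-classification-job \
        --job-type ONE_TIME \
        --name scan-kplabs-pii-demo \
        --s3-job-definition '{
            "bucketDefinitions": [{
                "accountId": "111122223333",
                "buckets": ["kplabs-pii-demo"]
            }]
        }'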

And, as you can see, our Macie dashboard is now populated. Within the dashboard, you have an access and secret key that have been detected. We already discussed that there are a lot of sub-tabs available. Let's go ahead and select S3 objects by PII, and here you do not really have any S3 objects by PII. PII data can include things like email addresses, usernames, your driving license, and so on. We are more interested in alerts here. Again, you can go ahead and explore all of the sub-tabs that are available over here. So here you have the alert related to AWS credentials uploaded to Amazon S3. If I go ahead and click here, it basically gives you the entire information, and it also gives you the file.

Now, within the file list, it is just showing you access-key.txt; it is not showing you the other file. So, if you remember, we uploaded two files, and the second file also contains an access and secret key. However, it was not identified as a critical-level risk here. Now the question is: why? And the answer to that lies in the Settings tab. Within the Settings tab, you have a lot of options available over here. Currently, we'll explore the regexes to understand the use case. Within Regex, there are a lot of regexes that have been written by Amazon; as of now, you do not have the option to create your own regex. We are more interested in the AWS secret key regex because it has a risk score of 10.

So if we look into the query over here, you will be able to detect all the objects that match the specific regex. As of now, it only detects access-key.txt. So let's go back to the settings; we'll go to the regexes. Let's say AWS secret key, and I'll click on Edit this time. Now, whenever you click on Edit, it basically gives you a lot of information, and the regex over here is one of the useful pieces of information. This specific regex looks for certain text within the document: it is basically looking for aws_secret_access_key. If this portion is not present, the object will not be considered a risk of 10. This is something that is important to understand, even for credit card information: if you do not have a keyword like "credit" or "debit," it will not by default be detected with a higher risk score.
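
You can approximate this keyword check locally to see why only one of the two demo files trips the high-risk regex (file names as reconstructed above; the pattern is illustrative, not Macie's exact regex):

    # Print only the files that contain the keyword the regex requires;
    # a bare secret key value without it will not match
    grep -l 'aws_secret_access_key' access-key.txt access-new.txt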

So this is a little caveat that you should understand. Anyway, once Macie allows us to customise the regexes and various other factors, we will be able to tune the detection of data according to our use case. And last but not least, you have the integrations. We already discussed that, as of now, Macie can look into the data store that is S3 buckets, so it can go ahead and scan the files that are part of an S3 bucket. There may be support for additional data stores in the future, but as of now, it is S3 only. So with this, we'll conclude the video. I hope you have a general understanding of what Amazon Macie does and its associated capabilities. With this, we'll conclude this video, and I look forward to seeing you in the next video.

Prepaway's AWS Certified Security - Specialty: AWS Certified Security - Specialty (SCS-C01) video training course for passing certification exams is the only solution you need.

Free AWS Certified Security - Specialty Exam Questions & Amazon AWS Certified Security - Specialty Dumps
Amazon.selftesttraining.aws certified security - specialty.v2023-09-18.by.khalid.195q.ete
Views: 126
Downloads: 396
Size: 2.18 MB
 
Amazon.passcertification.aws certified security - specialty.v2021-12-14.by.thea.191q.ete
Views: 102
Downloads: 903
Size: 1.5 MB
 
Amazon.passguide.aws certified security - specialty.v2021-04-30.by.callum.149q.ete
Views: 678
Downloads: 1252
Size: 1.13 MB
 
Amazon.testkings.aws certified security - specialty.v2021-02-12.by.gabriel.145q.ete
Views: 340
Downloads: 1224
Size: 1.25 MB
 
Amazon.braindumps.aws certified security - specialty.v2020-09-12.by.lily.99q.ete
Views: 482
Downloads: 1429
Size: 621.68 KB
 
Amazon.test-inside.aws certified security - specialty.v2020-04-09.by.max.84q.ete
Views: 599
Downloads: 1661
Size: 509.5 KB
 
Amazon.examlabs.aws certified security - specialty.v2019-12-05.by.jenson.83q.ete
Views: 792
Downloads: 1845
Size: 514.51 KB
 
Amazon.test-king.aws certified security - specialty.v2019-02-28.by.lala.33q.ete
Views: 1102
Downloads: 2142
Size: 95.24 KB
 

Student Feedback

5 stars: 48%
4 stars: 52%
3 stars: 0%
2 stars: 0%
1 star: 0%

Add Comments

Post your comments about AWS Certified Security - Specialty: AWS Certified Security - Specialty (SCS-C01) certification video training course, exam dumps, practice test questions and answers.

Comment will be moderated and published within 1-4 hours
