SPLK-1003 Splunk Enterprise Certified Admin – Splunk Post Installation Activities : Knowledge Objects Part 4

  1. Tags Creation

The next knowledge object in our discussion is tags. Tags are another knowledge object used to enrich the data in Splunk. Tags can be used only with a field/value combination: as we will see, creating a tag always requires a field name and the value for which the tag should apply. A tag is always created for a field/value pair, and you can assign any number of tags to a field value or event type. In this lab we will see how to create tags, share them, and search using our newly created tags.

You will also see how tags are stored in the configuration files. So let us go to our lab. This is our search head. To see all the tags that are present, go to Settings and click on Tags. You can list them by three different methods; we will list them by tag name, just to check whether any tags are present by default in a plain Splunk installation. As you can see, by default there are no tags, so we will start creating them one by one. As I said, creating a tag almost always requires a field/value combination: even if you write a search that you would like to tag, you will not be able to tag it from this screen. But in our previous lecture we created event types, so let us open them. An event type is simply a new field we added to Splunk, with the field name eventtype and, in our case, values based on the access logs.

We can see we have three event types, which we created previously. Now let us start tagging them. You can click on one of the event types if you want to edit it; I will choose the first one, for the access_combined events. Here we can add tag values, any number of them, separated by commas. I will add tags saying these are Apache logs, these are complete logs, and these include errors; any information that adds value to your event type can go here. I will save it, and those are the three tags that have been added to our access log event type. Next I will go to access_log_200, where the response was OK, that is, the request was processed successfully, so I will add the tag OK.

These are again our Apache logs, and they do not contain errors, so I have also tagged them without_errors and saved that information. I also have access_log_non200: these are again Apache logs, and these might contain some errors, that is, the 4xx and 5xx series of HTTP responses. These tags are specific to your environment. You can also add tags such as prod, prod_machine, prod_apache_logs, staging, or even QA. Any value that you, as a system admin or Splunk admin, gain during the integration, anything that adds more information about these logs, the device, or what this event type represents, can be added here.

  1. Manual Creation of Tags

We have created a couple of tags here. Let us see how they populate additional fields in our search: index=main sourcetype=access_combined_wcookie. This was our base search, without event types or tags. As we can see, the newly added event types show up here, and I will quickly pull up our tags as well: the tags are displayed as a field too. You can use them for quick filtering: if I want a specific tag, I click on it and I get, for example, only the success logs from my Apache server. So you can write your search simply as tag=OK. Similarly, tags can be technology based, such as tag=web, which indicates all the web logs.
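The tag-based searches described above can be sketched as follows; the tag names here are the ones created in this lab (OK) plus hypothetical technology tags (web, windows, error) used for illustration:

```
index=main sourcetype=access_combined_wcookie tag=OK

index=main tag=web

index=main tag=windows tag=error
```

Listing several tag terms in one search, as in the last line, requires every tag to match, which is how the combined "Windows errors" filter works.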

tag=windows indicates all your Windows logs. I can also combine multiple tags: tag=windows tag=error should give me Windows errors. Similarly, we can have multiple tag conditions, such as including errors and without errors. These are some of the ways you can use tags to narrow down the information you are looking for, or to enrich your data so that more value can be derived from it. The last thing about tags is the file that gets created when you define them; I am on the indexer, so let me go back to my search head. As you can see, we have a newly created file, tags.conf. In it we can see a stanza for the field named eventtype with the value access_log, under which the tag apache is enabled, as is the tag we called complete logs including errors.
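Based on the tags created in this lab, the resulting tags.conf might look roughly like the sketch below; the exact stanza and tag names depend on what was typed in the UI, and the underscored spellings here are assumptions:

```ini
# tags.conf -- stanza names follow the pattern [<field>=<value>]
[eventtype=access_log]
apache = enabled
complete_logs_including_errors = enabled

[eventtype=access_log_200]
OK = enabled
without_errors = enabled
```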

These two are also enabled. Tags are not limited to event types. If you go back to Settings > Tags, you can add a new tag by entering its name; I will call it manual_creation_of_tags, and point it at host=<some value we are familiar with>, or sourcetype=access* (I will use a wildcard character), so a new tag called manual_creation_of_tags is added. Similarly, you can add multiple host or field values that match this criterion. I will add one more (sorry, that was not source, it should be sourcetype), and you can also add one for source, that is, the file location: if it contains Apache, tag it as manual_creation_of_tags. I will save this; you can add any number of fields. These are the tag names we have created, and now we will search for our newly created tag: manual_creation*.

I do not remember the complete name we gave, but we have got the results we were expecting. As you can see, our events also match some of the additional tagging we enabled here: we have a rule saying that if source contains Apache, add the tag manual_creation_of_tags, and similarly for sourcetype=access*. We do not have Apache in the source, but the sourcetype matches the access* pattern we mentioned, so the tag has been applied. This is all about tags. We should understand that tags are stored in the file tags.conf, and that they add value by enhancing the information present in our logs.

  1. Lookups Creation in Splunk

The next knowledge object is one of the most widely used and most important knowledge objects in Splunk: lookups. Lookups are another set of knowledge objects that provide data enrichment by mapping selected fields or values in an event to other fields or information. As an example, we will see in our lab how to map each status code in our Splunk access logs to its HTTP response description, and we also have prices.csv, a lookup file downloaded as part of our tutorial data. The sources for lookups can be a CSV file, the KV Store, or scripted lookups from an external source such as a database. In our lab, we will see how to create a lookup, share it, and use it in a search to enhance the data. First, let us go to our search head. To see which lookups are available, go to Settings > Lookups, and you will see all the lookups present in our Splunk installation. You can list them in a specific order; for now I will list them by lookup table files. As you can see, there are a couple of lookups, such as the geo statistics files, that are built into the Splunk installation. To create a new lookup, I have downloaded prices.csv.

This is prices.csv, which is part of our tutorial data, and here is the actual file. As you can see, it has a product ID, product name, price, sale price, and code, so we can check which of these fields already exist in the logs and which we need to add. First, we need to create a lookup table file, so I will upload one: choose the file, prices.csv, and give it the same file name for Splunk to display. Our prices.csv has been successfully added. If this file has to be used by other users or apps, click on Permissions next to the newly added prices.csv. As with the other knowledge objects we have shared, this is quite similar: All apps, that is global sharing; everyone using Splunk will have read access, and admin and power users will have write access. These are some of the best practices, so I will stick to them: admin and power users of Splunk will have the edit permissions for prices.csv. Now, let us go back to our Search app and see which of the CSV's fields are present in our logs.
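For reference, uploading through the UI only creates the lookup table file; to reference the table by name in a search, a lookup definition is also needed (Settings > Lookups > Lookup definitions). That definition corresponds to a transforms.conf stanza roughly like the sketch below, where the stanza name prices_lookup is an assumption made for this lab:

```ini
# transforms.conf -- sketch; filename must match the uploaded table file
[prices_lookup]
filename = prices.csv
```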

That is, index=main sourcetype=access_combined_wcookie, our access logs. This is the tutorial data, so you can run the same queries yourself and see how the data has been parsed and how you can add more information. Let us see what fields are present. I will deselect all other fields and keep only the default selected fields, that is, host, source, and sourcetype. Now, do we have a product ID? Yes, we have productId. Do we have a product name? No. Do we have the price information? No, nor the sale price, nor the code for the product. So we have no product information except the product ID. Let us see how we can add all these fields to our Splunk events.

So, what I will do: I have a lookup file uploaded, prices.csv, which contains all this information, and I will use a command called lookup. With this command I need to mention which lookup to use; while uploading, we gave it the name prices.csv, so that is the file. The next argument is which field in our CSV file to match on, that is the product ID, and it should map to the corresponding field that already exists in our Splunk events: match the product ID from the CSV against the product ID in my logs, and output the new fields that exist in the CSV file. What this command does is, wherever the product ID of an event matches a row in the CSV file, it adds all of that row's values to the event; if the product ID matches a different row, it adds that row's information instead.
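The enrichment just described can be sketched as the search below. Here prices_lookup is an assumed lookup definition name for the uploaded prices.csv, and the field names follow the tutorial data:

```
index=main sourcetype=access_combined_wcookie
| lookup prices_lookup productId OUTPUT product_name price sale_price code
```

Each event whose productId matches a row in the table gains the four OUTPUT fields from that row.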

  1. Searching Using Lookups in Splunk

While writing this lookup, we made a small assumption: that the product ID field in our logs was all lowercase. Actually it is not; the I here is capitalized, so let us change it to productId. Now we should be able to see all the product information we have added. See, we have got a product_name, which was never part of our log information. Similarly we have price and sale_price, which also came from the lookup, and finally code. Once we have this information, we can estimate, for example, how much purchasing was done, considering the price and sale price per IP or user, which code was used for applying these prices, and which products were purchased. Let us find the top product names that were searched or purchased, using these logs available in Splunk. This information was never part of Splunk, but we still got it by adding the lookup file prices.csv, which contains it.
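A "top products" report like the one described can be sketched as follows, again assuming the hypothetical lookup definition name prices_lookup:

```
index=main sourcetype=access_combined_wcookie
| lookup prices_lookup productId OUTPUT product_name
| top limit=10 product_name
```

The top command returns the ten most frequent product names along with their counts and percentages, which is what the "top ten product list" in this lab shows.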

By this we come to know which was the top purchased product, which was the bottom, and the top ten product list. Once we have added these products, you may also want to understand where these files are stored, especially prices.csv, which we uploaded using Splunk Web. Let us go to our search head, into the Splunk directory /opt/splunk. We uploaded the file as part of our Search app, and under the search app we know there is a default directory and a local directory. Along with these two there is another directory; let me come out of local.

Yes: along with these directories you also have a directory named lookups. If you go into the lookups directory, you will see the newly uploaded file prices.csv. This lookup file is the one that contains the product name, price, sale price, and code being used in the search to fetch the information so quickly. We can also check what prices.csv is holding: as you can see, it is the same information that is present in the CSV file we uploaded to enrich our Splunk data. In this tutorial we have learned what lookups are, how to upload them, where the lookup files are stored, and how to get information that is not present in the logs by adding it through lookup files.
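On the search head, the steps above amount to commands like the following sketch; the paths assume a default /opt/splunk installation with the file uploaded into the Search app:

```shell
# list lookup table files belonging to the Search app
ls /opt/splunk/etc/apps/search/lookups/

# inspect the uploaded table's contents
head /opt/splunk/etc/apps/search/lookups/prices.csv
```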

  1. Lookups Use Case Example

As part of our lab exercise we will also build an additional lookup file, to add a description to each HTTP status code in our web server logs. You can repeat the same exercise when you are provided with lab access on the complete purchase of this program. Let us see how we can add this. I will go to Settings > Lookups. For this exercise I have created a file called status.csv.

If we look inside this file, we have just two simple fields: status, referring to the HTTP status code present in the logs, and description, carrying the respective information. Let us see how to add this information into Splunk. Go to Settings > Lookups, and add a new lookup table file; I will name it http_status (you can give it whatever name you choose) for the uploaded status.csv. Once created, it will be completely under the ownership of the user who uploaded it, and if you want this information to be shared, click on Permissions.
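The two-column table described above can be sketched as a CSV like the following; the status codes and wording are illustrative, not the exact contents of the lab file:

```
status,description
200,OK
404,Not Found
500,Internal Server Error
503,Service Unavailable
```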

All apps, read for everybody, and admin and power users can edit this file. Let us quickly go back to our Search app, where we can validate whether our newly created lookup adds new information or not. Once we are in the Search app, I will again run index=main sourcetype=access*, my access logs, where all the status codes are present, and I will search over the last 30 days because I need to see the status field. The field name of the HTTP status in our logs is status. We know by now that we have uploaded a lookup named http_status; the field we look up is called status in our CSV file and status in our logs as well, and we output the new field description, which is not part of any web server log.
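Put together, the enrichment search might look like this, assuming a lookup definition named http_status exists for the uploaded status.csv:

```
index=main sourcetype=access* earliest=-30d
| lookup http_status status OUTPUT description
| stats count by status, description
```

The final stats clause is optional; it simply summarizes how many events carry each status code and its newly attached description.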

As you can see, we have a new field called description. This information was never part of our original logs, but we have added it based on our knowledge, and we have taught Splunk that wherever it sees this status value, it can attach the additional description field. Similarly, for prices we added sale_price, code, price, and other information. A lookup can have any number of columns; you can add any amount of information to your logs, keyed on the same field. Take the host field: I have a universal forwarder installed on my PC, reporting the host as, say, Arun-Kumar-PC. Against that same host value you can add the latitude, longitude, owner, the applications running on the PC, and when the PC was last patched, all as part of your CSV. The options here are unlimited; you can add any number of fields.
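A host-metadata lookup like the one just described could be sketched as follows; the host name, columns, and the lookup definition name host_info are all hypothetical:

```
host,latitude,longitude,owner,last_patched
Arun-Kumar-PC,12.97,77.59,Arun Kumar,2020-01-15
```

With a lookup definition named host_info pointing at that table, enriching every event would be a one-line addition to any search:

```
index=main | lookup host_info host OUTPUT owner last_patched
```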