
Pass the Microsoft Azure Data DP-900 Exam on Your First Attempt, Guaranteed!

Get 100% Latest Exam Questions, Accurate & Verified Answers to Pass the Actual Exam!
30 Days Free Updates, Instant Download!

DP-900 Exam - Verified By Experts
DP-900 Premium Bundle
$69.98 (regular price $109.97)
  • Premium File 283 Questions & Answers. Last update: Jul 17, 2024
  • Training Course 32 Video Lectures
  • Study Guide 672 Pages
 
160 downloads in the last 7 days

Last Week Results!

  • 160 customers passed the Microsoft DP-900 exam
  • 87.4% of students found the test questions almost the same as in the actual exam
  • Average score in actual exam at testing centre
  • "Questions came word for word from this dump"
DP-900 Premium File
283 Questions & Answers

Includes question types found on the actual exam, such as drag and drop, simulation, type-in, and fill-in-the-blank.

DP-900 Video Training Course
32 Video Lectures, Duration: 2h 27m

Based on real-life scenarios similar to those encountered in the exam, allowing you to learn by working with real equipment.

DP-900 PDF Study Guide
672 Pages

Developed by IT experts who have passed the exam in the past. Covers in-depth knowledge required for exam preparation.

Total Cost:
$109.97
Bundle Price:
$69.98
160 downloads in the last 7 days
Microsoft DP-900 Practice Test Questions, Microsoft DP-900 Exam dumps

All Microsoft Azure Data DP-900 certification exam dumps, study guides, and training courses are prepared by industry experts. PrepAway's ETE files provide the DP-900 Microsoft Azure Data Fundamentals practice test questions and answers; the exam dumps, study guide, and training courses help you study and pass hassle-free!

Non-Relational Database Concepts

3. Choose a NoSQL Database

So as you can see, there are a lot of options, whether you're looking at the relational side or the non-relational side. Basically, you're given a buffet of database types to choose from. And if you're sitting there thinking, "Well, I have to store data in Azure, which one should I choose?", the answer, unfortunately, is not going to be a single answer. There is no single database that is going to cure any particular problem you throw at it. Relational databases are good; I've used them for years for enterprise applications, above a certain amount of traffic, if you will, but below a certain amount of traffic. Non-relational databases are good in some instances, and relational databases are good in other instances. In reality, you're just going to have to base it on the particular scenario. Now, it is possible that your solution can contain more than one type of database.

Here's an example of an Internet of Things solution. So this little card in the top left represents an IoT device. The data gets stored in an IoT hub, and you're using Databricks to do some real-time processing and streaming analytics here, and that gets into a Cosmos DB. Cosmos DB is suitable for extremely low latency and global-scale data replication. You can store that data there, and basically, some type of application is querying that data. Then, after some time (90 days is the figure shown here), that data gets pulled off of this application. For one thing, it gets stored in an Azure storage account as a backup. So there we go; our second data option is there. And then it gets stored in a data warehouse, which is going to allow us to run reports. The data warehouse has indexing; it's massive relational database storage, and you can use Power BI to run queries, go deeper, and build some fancy dashboards and things like that.

So in this case, we're using three data storage technologies for three different purposes, right? For highly efficient global usage with sub-ten-millisecond response times, for backup and recovery, and for data analytics. So let's look at the relational side, then. Relational databases do have their benefits, right? As I said, I've used them for years. There's a concept called normalization, which I believe we talked about, that basically makes the data smaller because it reduces duplicate entries: it uses an ID field that points you to another table, where you look up the single entry for that ID. It also helps you reduce mistakes. So if you had a state field, like a province or a state, in your data, people could misspell their state in so many different ways, and they could abbreviate it; there are two-letter and three-letter abbreviations. If you were to just allow it to be a free-text field, then you're going to end up with 16 different ways of saying the same thing. But if you used a relational database, you'd have the state as a lookup table and force people to pick from a drop-down box; there are only a few possible states, so that's going to basically reduce those errors. A related topic is how those data relationships allow you to drive insights. So imagine that you do have the state field.

Sometimes it's an abbreviation; sometimes it's the full state name. But when you're running a query on it, you don't want your data to be littered with three or four variations of the same thing. You want to be able to group by that thing and then see each state in one output row. Data integrity: the database enforces integrity. So you can't insert a row if the things it relies on, the foreign keys, don't exist. And as you're defining the schema, you're saying some things are required and some things are not required. So you can't insert a row if the name is missing. Similarly, you can't delete a parent row if a child row still exists.
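Here's a minimal sketch of all of that in SQL; the table and column names are made up for illustration:

```sql
-- A normalized schema: the state lives in a lookup table,
-- and customers reference it by ID instead of free text.
CREATE TABLE State (
    StateId INT PRIMARY KEY,
    Code    CHAR(2)     NOT NULL UNIQUE,  -- e.g. 'WA'
    Name    VARCHAR(50) NOT NULL          -- e.g. 'Washington'
);

CREATE TABLE Customer (
    CustomerId INT PRIMARY KEY,
    Name       VARCHAR(100) NOT NULL,     -- required: inserts without a name fail
    StateId    INT NOT NULL REFERENCES State (StateId)  -- must point at a real state
);

-- Grouping now gives one output row per state instead of 16 spellings of it.
SELECT s.Name, COUNT(*) AS Customers
FROM Customer c
JOIN State s ON s.StateId = c.StateId
GROUP BY s.Name;

-- Both of these fail, which is the point: the database enforces integrity.
INSERT INTO Customer (CustomerId, Name, StateId) VALUES (1, 'Bob', 999); -- no such state
DELETE FROM State WHERE StateId = 1; -- rejected while a customer still references it
```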

Relational databases have been around for decades. You know, SQL Server has been around for years, and although the code has been refreshed and updated, it's basically proven: there aren't going to be massive bugs in the core product. I'm not saying that non-relational databases have obvious bugs, but relational databases are established, massive products with tons of features. You're going to find examples, documentation, and things like that. We recently saw Power BI; it basically lets you point at your relational database and makes the data accessible to non-data users. And there are other types of constraints you can enforce. So by having a fixed schema, again, you're not allowing people to just freely add new fields without going through the proper process. You have defaults for fields. You have limits: if it's a number, it can only contain numbers, and sometimes there are range limits, etc.
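As a small, hypothetical sketch of those kinds of constraints (not a table from the course):

```sql
-- A fixed schema: defaults, type limits, and range limits enforced by the database.
CREATE TABLE OrderLine (
    OrderLineId INT PRIMARY KEY,
    Quantity    INT NOT NULL DEFAULT 1,                  -- a default for the field
    UnitPrice   DECIMAL(10,2) NOT NULL,                  -- numbers only
    Discount    INT CHECK (Discount BETWEEN 0 AND 100)   -- a range limit, e.g. a percentage
);
```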

So now you see the other side, when we're talking about NoSQL or non-relational data. Again, we saw a graph database; we saw time series. These are optimized for certain data types, and they can outperform relational databases for those data types. They're also designed for performance at Internet scale. So Cosmos DB's sub-ten-millisecond response time was designed specifically in response to the experiences of large businesses like that. Now, this one I hesitated to put there, but you could run some of these open-source databases on your own hardware, and it's not even really a big investment. Now, I know you can probably get a light version of MySQL or PostgreSQL, but basically, it's probably cheaper to run NoSQL at the lower end. And finally, they're often open source. They're not all open source, but you're going to find the source code for a lot of these databases, so there's some faith in that as well. Some companies prefer open-source technologies to closed ones.

4. Azure Non-Relational DB Options

Alright, so let's turn our attention to Microsoft Azure specifically and take a look at the non-relational databases that Microsoft Azure supports. Now, the big daddy of them all, the one that I've mentioned a couple of times, is Cosmos DB. The best way to describe Cosmos DB is as an enterprise-grade, non-relational database. And so if you are a big company and you want to run your data in a non-relational database, you should be thinking about Cosmos DB. It supports many data models, including graph, document (Document DB), table, column-family, etc.

It is also 100% compatible with some established APIs; these are called wire-compatible APIs. So if you're migrating your existing database from one of the established non-relational databases into Cosmos DB, it just works: MongoDB, for instance, has a very specific version number, and Cosmos DB is 100% compatible with that API at that version number. It's got some other features that are really cool, and we'll show you those in the video that comes up. But you can even select the data consistency you need. So if you have multiple Cosmos DBs around the world and you want to enforce a very strict consistency, so that every single database always gives the same results at any specific microsecond, you can set that.

But if you're thinking, "Well, it's just Twitter; it doesn't matter if one user in Russia doesn't see a tweet for a few seconds after a user in America does," then you can have a different level of consistency, a looser consistency (Cosmos DB offers several levels, from strong down to eventual). This also makes scaling very easy. I didn't show you the scaling within the Azure SQL database, which I probably should have, but I'll show you the scaling in Cosmos DB; it's really easy to do. There are also guaranteed single-digit-millisecond latencies, and the service level agreements for Cosmos DB are quite extensive. So Microsoft is really promising the availability, the throughput, the latency, and all that stuff. Next up, we can talk about Azure Table Storage. Azure Table Storage is just one component of an Azure storage account.

Azure storage accounts can handle up to five petabytes at their maximum, which is a huge amount. You're not going to be able to get a Cosmos DB or an Azure SQL database to support five petabytes, so on sheer quantity of data, table storage takes the cake. It's also the cheapest way to store data, at 4.5 cents per gigabyte per month. Let's take an example: when I was working for a company a few years ago, their main marketing database was 800 GB, and so they could have stored that in a table storage account for around $35 a month. Now, they probably wouldn't have, because that was an important database to them; that would be more like an Azure SQL database. But if you do have massive amounts of data and you need to query it and use applications against it and things like that, this is the cheapest option.
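As a quick sanity check on that figure (a rough calculation at the quoted rate, ignoring transaction charges):

$$800 \text{ GB} \times \$0.045 \text{ per GB per month} \approx \$36 \text{ per month}$$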

Keep in mind, though, that storage accounts do charge for operations. So listing, reading, writing, and deleting are all transactions, and you're going to pay three-tenths of a cent per 100 transactions. So it's extremely cheap, but over the course of months, maybe this adds up to some extra money. And if you look at the SLA for a table storage account, it's quite confusing; however, the promise is to return data from a query within 10 seconds, or 2 seconds per megabyte for data transfers. I mean, that's not something an enterprise-grade application could use if it's going to take 10 seconds for a query to reply. Blob Storage is similar to Table Storage, except that it is a blob account. It still scales to five petabytes, but it is cheaper: half the price of table storage, at two cents per gigabyte. You get options for access tiers, which are Premium, Hot, Cool, and Archive, with different pricing levels for each. So Premium is much more expensive for storage, and Cool and Archive are much cheaper than the listed price. It also supports a reservation system; this is relatively new, but you can get that two-cent price down to one and a half cents if you're willing to sign up for a three-year agreement. It also supports blob indexing, a recently released public-preview feature.

This wouldn't be on the test because it's a preview feature, but they are looking at more traditional data access methods, such as indexing, for storage accounts. You have the same issue as with table storage: you will be charged for transactions, and the SLA guarantees that they will not lose your file, but the SLA is 2 seconds per megabyte. So if you have a 100-megabyte file, that's 2 seconds times 100, or 200 seconds, so they only guarantee to get it to you within a little over three minutes. And the question is: can you wait that long? File storage in Azure is a little bit more expensive; there are standard and premium options. A file storage account has the hierarchical structure of folders and subfolders and can be mounted using the SMB protocol, so you could use this as a file server. Anyway, those are the storage options that we'll cover in this course for non-relational data.

Manage Non-Relational Databases

1. Create Cosmos DB

So now it's time to switch over to the Azure Portal, and we're going to take a look at non-relational databases in Azure. We're going to go through the process of creating a Cosmos DB and the things you need to know as you're creating it, like the configuration options you have. And then, once it's created, we're going to go into it, and we'll see how it differs from the relational database, the Azure SQL Database, that we covered a few sections ago.

So this is the Azure Portal; it's at portal.azure.com. Remember, we saw this for the SQL database. This is the dashboard, and I'm going to switch over to the home screen, where we can see all the resources we have and all these things. I'm going to go right into Create a resource, then Databases. And we saw that these are all the relational options at the top, but Cosmos DB is in there as well. Now remember, a part of this story is also the Azure storage account, which contains blob storage, file storage, and table storage, so we could even create a storage account from the Storage section. Go back to Databases, choose Cosmos DB, and click Create. Now, it's a very similar creation wizard compared to SQL databases. Cosmos DB just introduced a free tier, so there is a free option: the Azure Cosmos DB free tier gives you 400 request units per second of throughput and 5 GB of storage. We have to choose a subscription (the same one) and a resource group. Remember, we created one for the relational database.

So I'll call this one AzureCosmos; give it whatever name you wish. Now, the SQL database had a server-and-database model; Cosmos DB has an account model, and you create the database separately. So we do have to give it an account name. I'm calling this azsgdcosmos, and if that's not available, I'll add the word "db". I'm giving it a sort of unique account name because it has to be unique across all of Azure; so if you tried to take this one and I had it, you wouldn't be allowed to. Now, this is the API, which is sort of the first big decision that we have to make here. Remember that we said that Cosmos DB supports many different data models? Well, this is where you have to choose it, and you can't have multiple database models in the same account. So if I choose Core (SQL), that's the Document DB, and that's the JSON data. We've also got MongoDB; Cassandra, which is column-family; Azure Table; and Gremlin, which is graph data. So we do have our database choices here. I'm going to leave it as Document DB, the Core (SQL) option. Next, there is a preview feature for Azure Notebook support.

We are going to leave that off. We physically have to store this data somewhere in the world; we'll begin in the western United States. Like I said, there is a free tier, so if you keep this under 400 request units and 5 GB of storage, you can get it for free. You can only apply that to one account, right? So I'm going to apply that here. Azure will now ask you whether this account is for non-production or production use. You'll see that, by selecting this, it automatically switches to different defaults for some of the other questions. You can change this after, but it does change the UI experience and some of the defaults. Geo-redundancy: we'll activate that later. Multi-region writes: we'll turn that on later. Now, we're going to talk about security separately, but I'm going to leave this open to the public for now. Again, you do need access keys, and so you can't just access the Cosmos DB server without security, but the endpoint will be publicly available. It is encrypted by default, and Azure will manage the key for you. You may be working in an environment where your security team wants you to manage the key.

And so you can switch from Azure-managed keys to customer-managed keys, and then there are a whole bunch of other questions that would be related to that. Skip over tags, and then we'll get to the review screen. So this is one of the few places in Azure where it'll tell you it's going to take longer than you thought: Azure SQL Database took under two minutes, while Cosmos DB is expected to take ten minutes. All of my choices are here, and I can just hit the Create button to create my first Cosmos DB.

2. Query Cosmos DB

Alright, so after some time, our deployment was created. Going into the deployment details and operation details, we can see that the duration was ten minutes and 18 seconds. So, if you recall, it estimated ten minutes at the time of creation, and it's pretty much on track for what it promised. Let's go into the database that we created. Now, there are some similarities with a SQL database. I created this with a public endpoint, and so this unique name at documents.azure.com is a publicly accessible URL. Access keys, just like in a SQL database, are required to access my database. So if you don't have them, the URL alone is not going to help you.

Now we can go under Quick Start, and we can see that there's some sample code based on the language that you want to work with. We haven't created any data, but once we create the data, we can come back and download sample code to work with it. I'm going to scroll down and go into Containers. So, as we said, we have an account; what we don't have is a database, which in this instance is called a container or a collection. So I can say Add Collection, and I do have to give the collection a name, so I'll call it DB1, similar to a SQL database. Remember, to stay on the free plan, we do need to keep this container at the 400 throughput. The container also needs a unique identifier, and I'm not sure why DB1 can't be used as the unique identifier. As for indexing and partitions, we won't go deep into the partition key and other fundamentals here.

So we can partition this off of, let's call it, the "type" field. And so now what we're doing is creating our first container, or collection, inside of our Cosmos DB account. Each one of these that we create is going to have its own throughput, which is a limit on how much data it can go through, et cetera. So I have my collection, but it doesn't have anything in it; there are zero items. Now, remember, we created this as Core (SQL), which is a document database; as a result, it requires a JSON document to be created. So I click New Item, and you'll see that it creates a JSON object for me. I give it an ID; I'll call that "1". We said that we were going to use "type" as the partition key. I can create a JSON document and store it within the Cosmos DB database by clicking the Save button here. Now I have one item in my collection.

Now, you'll notice that Azure has added its own IDs, references, tags, et cetera; there's a timestamp element. So this is stuff Azure has added to our data for its own purposes. All right, so we've created a document here. In fact, we can make another one where the ID is "2" and the name is "Bob". We have to put the type, because that's the partition key, which is "customer", and maybe "orders" equals true, right? So now we've sort of added a new property to the data, and that's okay, right? This is Cosmos DB, NoSQL, a non-relational database: it supports a dynamic schema. So I can start running queries on this, and it won't throw a nasty error message just because the first customer lacks an orders property while the second customer has one. In fact, we can see the queries ourselves. Popping out from this tab, here is the SQL query: SELECT * FROM c. If I create a SQL query, SELECT * FROM c, and execute the query, it returns my two data rows.
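To illustrate, the two documents might look something like this (a sketch: only the properties mentioned above are shown, and Cosmos DB also adds its own system fields such as _rid, _etag, and _ts):

```jsonc
[
  {
    "id": "1",
    "type": "customer"   // "type" is the partition key
  },
  {
    "id": "2",
    "name": "Bob",
    "type": "customer",
    "orders": true       // a property the first document doesn't have
  }
]
```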

Now, you might say, first of all, what is c? This is one of the things that confused me when I first started. c is just an alias, right? It doesn't actually have to be the collection name, which is DB1 in this case; it's not SELECT * FROM DB1, it's SELECT * FROM c, and that works. But then I can say WHERE c.id equals one, and so now I've got a WHERE clause on my query, and I'm able to execute it and return just that single item. Okay, one, two; and if we try an ID that doesn't exist, it says zero results. We can also search by name: we know that Bob has a record in there, and that returns successfully. But what about orders? If we query for a value we know doesn't exist, that returns no records; the value that does exist returns one record.
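Here's a sketch of those queries in the Cosmos DB SQL dialect, matching the documents above:

```sql
SELECT * FROM c                          -- 'c' is just an alias; returns both documents
SELECT * FROM c WHERE c.id = "1"         -- returns a single item
SELECT * FROM c WHERE c.name = "Bob"     -- query by any property
SELECT * FROM c WHERE c.orders = true    -- a dynamically added property: one result
SELECT * FROM c WHERE c.orders = false   -- no documents match: zero results
```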

So we're able to dynamically add properties and then run queries based on those dynamically added properties, which is something that's not typical in a relational database. So that's pretty much it. Now, this is obviously not the typical access method: Data Explorer is a tool for developers and administrators to do some diagnostics here; it's not the way that you're expecting your end users to interact with this database. Cosmos DB does have a public API, and because I created it that way, I've enabled it for public consumption.

When we set this up as a public API, it does require a security key, so we need that key to access this data; you cannot access my database just by calling this API. And we can even enable web-based interactions. Cross-origin resource sharing, or CORS, is a browser restriction that basically stops websites from accessing URLs that are outside of their own domain unless you whitelist those domains. And so let's say I wanted to put softwarearchitect.ca in as a whitelisted domain. Then any applications that are verified as running in a browser from this domain can now access my database. That is something that's set up and available to us. So you'd expect API access to be the most common method, rather than using the Data Explorer.

Microsoft Azure Data DP-900 practice test questions and answers, the training course, and the study guide are uploaded in ETE file format by real users. Study and pass! These DP-900 Microsoft Azure Data Fundamentals certification exam dumps & practice test questions and answers are here to help students.

Exam Comments (the most recent comments are on top)

Allen
Brazil
Apr 29, 2024
This test sounds easy, but there are several questions that can generate several different interpretations.
Get Unlimited Access to All Premium Files
Purchase DP-900 Exam Training Products Individually
  • DP-900 Premium File: 283 Questions & Answers. $59.99 (regularly $65.99)
  • DP-900 Video Training Course: 32 Lectures. $24.99 (regularly $27.49)
  • DP-900 PDF Study Guide: 672 Pages. $24.99 (regularly $27.49)
Why do customers love us?
93% Career Advancement Reports
92% experienced career promotions, with an average salary increase of 53%
93% mentioned that the mock exams were as beneficial as the real tests
97% would recommend PrepAway to their colleagues
What do our customers say?

The resources provided for the Microsoft certification exam were exceptional. The exam dumps and video courses offered clear and concise explanations of each topic. I felt thoroughly prepared for the DP-900 test and passed with ease.

Studying for the Microsoft certification exam was a breeze with the comprehensive materials from this site. The detailed study guides and accurate exam dumps helped me understand every concept. I aced the DP-900 exam on my first try!

I was impressed with the quality of the DP-900 preparation materials for the Microsoft certification exam. The video courses were engaging, and the study guides covered all the essential topics. These resources made a significant difference in my study routine and overall performance. I went into the exam feeling confident and well-prepared.

The DP-900 materials for the Microsoft certification exam were invaluable. They provided detailed, concise explanations for each topic, helping me grasp the entire syllabus. After studying with these resources, I was able to tackle the final test questions confidently and successfully.

Thanks to the comprehensive study guides and video courses, I aced the DP-900 exam. The exam dumps were spot on and helped me understand the types of questions to expect. The certification exam was much less intimidating thanks to their excellent prep materials. So, I highly recommend their services for anyone preparing for this certification exam.

Achieving my Microsoft certification was a seamless experience. The detailed study guide and practice questions ensured I was fully prepared for DP-900. The customer support was responsive and helpful throughout my journey. Highly recommend their services for anyone preparing for their certification test.

I couldn't be happier with my certification results! The study materials were comprehensive and easy to understand, making my preparation for the DP-900 stress-free. Using these resources, I was able to pass my exam on the first attempt. They are a must-have for anyone serious about advancing their career.

The practice exams were incredibly helpful in familiarizing me with the actual test format. I felt confident and well-prepared going into my DP-900 certification exam. The support and guidance provided were top-notch. I couldn't have obtained my Microsoft certification without these amazing tools!

The materials provided for the DP-900 were comprehensive and very well-structured. The practice tests were particularly useful in building my confidence and understanding the exam format. After using these materials, I felt well-prepared and was able to solve all the questions on the final test with ease. Passing the certification exam was a huge relief! I feel much more competent in my role. Thank you!

The certification prep was excellent. The content was up-to-date and aligned perfectly with the exam requirements. I appreciated the clear explanations and real-world examples that made complex topics easier to grasp. I passed DP-900 successfully. It was a game-changer for my career in IT!