


Normalization is the process of reorganizing data in a database so that it meets two basic requirements. The first is that there is no redundancy of data: each piece of data is stored in only one place. The second is that data dependencies are logical: all related data items are stored together. Normalization is important for many reasons, but mainly because it lets a database take up as little disk space as possible, which improves performance. Databases can hold a significant amount of information, perhaps millions or billions of pieces of data, and keeping similar tables and columns with the same data wastes space and slows the database down. Here are some of the benefits of normalizing a database.

Normalizing a database reduces its size and prevents data duplication. It ensures that each piece of data is stored only once.
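To make this concrete, here is a minimal sketch using Python's built-in sqlite3 module. The table and column names (customers, orders, and so on) are invented for illustration and are not taken from the posts above. It contrasts a flat table that repeats a customer's name on every order with a normalized design that stores the name exactly once:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: the customer's name is duplicated on every order row.
cur.execute("CREATE TABLE orders_flat (customer_name TEXT, item TEXT)")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?)",
                [("Alice", "pen"), ("Alice", "ink"), ("Alice", "paper")])

# Normalized: the name is stored once and referenced by id.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (customer_id INTEGER, item TEXT)")
cur.execute("INSERT INTO customers VALUES (1, 'Alice')")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, "pen"), (1, "ink"), (1, "paper")])

# Renaming the customer now touches one row instead of three,
# and there is no risk of the three copies drifting out of sync.
cur.execute("UPDATE customers SET name = 'Alicia' WHERE id = 1")
```

In the flat table, the same rename would require updating every order row, and missing one would leave inconsistent copies of the name.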

Application developers who create applications to “talk” to a database find it easier to deal with a normalized database. The data they access is organized more logically in a normalized database, often similar to the way in which the real-world objects that the data represent are organized. That makes the developers’ applications easier to design, write and change.

Referential integrity is the enforcement of relationships between data in joined tables. Without referential integrity, data in a table can lose its link to other tables where related data is held. This leads to orphaned and inconsistent data in tables. A normalized database, with joins between tables, can prevent this from happening.
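A short sketch of that enforcement, again with sqlite3 (the departments/employees tables are hypothetical examples, not from the posts). With foreign keys enabled, the database itself rejects a row that would point at a parent record that does not exist, so orphaned rows cannot be created:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when this is on
cur = conn.cursor()

cur.execute("CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE employees (
                   id INTEGER PRIMARY KEY,
                   name TEXT,
                   dept_id INTEGER REFERENCES departments(id))""")

cur.execute("INSERT INTO departments VALUES (1, 'Engineering')")
cur.execute("INSERT INTO employees VALUES (1, 'Dana', 1)")  # valid: dept 1 exists

# An employee pointing at a non-existent department is rejected,
# so the employees table can never hold an orphaned link.
try:
    cur.execute("INSERT INTO employees VALUES (2, 'Lee', 99)")
except sqlite3.IntegrityError as err:
    print("rejected:", err)
```

Note the design choice: without the `REFERENCES` clause (or with foreign keys left off, the SQLite default), the second insert would silently succeed and leave an orphaned row.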




Mark Rodriguez | 2 days ago | 278 words

Normalization, with regard to database management, is a term describing basic housecleaning functions. Like any other project or file that hoards information, a database eventually needs to be scrubbed to increase efficiency by deleting unnecessary data and reorganizing processes. There are two goals of the normalization process: eliminating redundant data (storing the same data in more than one table) and ensuring data dependencies make sense (only capturing necessary information) (Chapple, 2016). In the end, an object should be modifiable within one table, with the change disseminated throughout the database. The benefits of normalizing include improved concurrency resolution, faster update performance, faster index creation and sorting, optimized queries, and data integrity (Microsoft, 2017). However, the opposite approach, called denormalization, may benefit a database as well. The intent of that procedure is to add as much duplicated information, or as many duplicated tables, to the database as necessary. In theory this may decrease processing time because the effort required to search for information decreases.

According to Mr. Chapple, a database expert, there are multiple normal forms, from 1NF through 5NF, with several iterations in between (Chapple, 2016). For this discussion, let's just go over the first three. These forms are not step-by-step procedures but guidelines. The first normal form sets rules that help organize the data. The second removes duplicate data and creates relationships. The third fulfills the second normal form and also removes columns that do not depend on the primary key (Chapple, 2016).
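The three guidelines above can be sketched as a schema, one table per step. This is a hypothetical order-tracking example in sqlite3; the table names and columns are assumptions made for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# 1NF: atomic values, no repeating groups -- one row per ordered item
# instead of a single column holding a comma-separated list of items.
cur.execute("""CREATE TABLE order_items (
                   order_id INTEGER, item TEXT,
                   PRIMARY KEY (order_id, item))""")

# 2NF: every non-key column depends on the whole key, so order-level
# facts like the date move out of order_items into their own table.
cur.execute("""CREATE TABLE orders (
                   order_id INTEGER PRIMARY KEY,
                   customer_id INTEGER, order_date TEXT)""")

# 3NF: no column depends on another non-key column, so the customer's
# city (which depends on customer_id, not order_id) gets its own table.
cur.execute("""CREATE TABLE customers (
                   customer_id INTEGER PRIMARY KEY,
                   name TEXT, city TEXT)""")

# Populate and query across the relationships.
cur.execute("INSERT INTO customers VALUES (1, 'Alice', 'Boston')")
cur.execute("INSERT INTO orders VALUES (100, 1, '2017-01-15')")
cur.executemany("INSERT INTO order_items VALUES (?, ?)",
                [(100, "pen"), (100, "ink")])
row = cur.execute("""SELECT c.city, COUNT(*)
                     FROM orders o
                     JOIN customers c ON c.customer_id = o.customer_id
                     JOIN order_items i ON i.order_id = o.order_id
                     WHERE o.order_id = 100
                     GROUP BY c.city""").fetchone()
print(row)  # ('Boston', 2)
```

Each fact (the city, the date, the items) now lives in exactly one table, and the join reassembles them when needed.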

Thanks for reading.

