@zareen said in CS403 GDB1 Solution and discussion:

Now, would you normalize the database or keep it in de-normalized form? A denormalized schema can greatly improve performance under extreme read loads, but updates and inserts become complex: because the data is duplicated, it has to be updated or inserted in more than one place. One clean way to solve this problem is through the use of triggers. For example, in our case, where the %(red)[orders] table has the %(red)[product_name] column as well, an update to %(red)[product_name] can be handled in the following way:

1. Set up a trigger on the %(red)[products] table that fires on any update to %(red)[products] and propagates the new %(red)[product_name].
2. Execute the update query on the %(red)[products] table.
3. The data in the %(red)[orders] table is then updated automatically by the trigger.

However, when de-normalizing the schema, do take into consideration the number of times you would be updating records compared to the number of times you would be executing SELECTs. When mixing normalization and de-normalization, focus on de-normalizing tables that are read-intensive, and keep write-intensive tables normalized.
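The trigger-based sync described above can be sketched in a few lines with SQLite. This is a minimal illustration, not the course's official solution; the exact `products`/`orders` schema (an `order_id` key, an integer `product_id` foreign key) is an assumption inferred from the discussion.

```python
import sqlite3

# Hypothetical denormalized schema: `orders` duplicates `product_name`
# from `products` so that reads need no join.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE products (
    product_id   INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL
);
CREATE TABLE orders (
    order_id     INTEGER PRIMARY KEY,
    product_id   INTEGER NOT NULL REFERENCES products(product_id),
    product_name TEXT NOT NULL        -- duplicated (denormalized) column
);

-- Trigger: keep the duplicated column in sync whenever the name changes.
CREATE TRIGGER sync_product_name
AFTER UPDATE OF product_name ON products
FOR EACH ROW
BEGIN
    UPDATE orders
    SET product_name = NEW.product_name
    WHERE product_id = NEW.product_id;
END;
""")

cur.execute("INSERT INTO products VALUES (1, 'Widget')")
cur.execute("INSERT INTO orders VALUES (100, 1, 'Widget')")

# A single UPDATE against `products` ...
cur.execute("UPDATE products SET product_name = 'Gadget' WHERE product_id = 1")

# ... is propagated to `orders` automatically by the trigger.
name = cur.execute(
    "SELECT product_name FROM orders WHERE order_id = 100"
).fetchone()[0]
print(name)  # -> Gadget
```

Note the trade-off the post mentions: every write to `products` now costs an extra `UPDATE` over `orders`, which is only worth it when SELECTs greatly outnumber updates.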