Aha! Databases again

I was trying to solve yet another hypothetical situation. I love investigations; it's like an addiction.

So, here I am, busy with my own mystery!


What to do when the data is just too much for the database to handle? [MySQL]


1) When designing the application, design for bulk updates: one multi-row statement instead of many single-row ones.
    INSERT INTO t (a, b, c) VALUES (1, 2, 3), (4, 5, 5), (9, 9, 0)
        ON DUPLICATE KEY UPDATE b = VALUES(b), c = VALUES(c);
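
A minimal Python sketch of building that kind of multi-row upsert at the application level. The connection object, the table t(a, b, c), and the unique key on column a are all illustrative placeholders; any MySQL DB-API driver (e.g. PyMySQL) should behave like this:

    # Build ONE multi-row upsert instead of N single-row round trips.
    # Table t(a, b, c) with a unique key on `a` is a hypothetical placeholder.
    def bulk_upsert(conn, rows):
        """rows is a list of (a, b, c) tuples; write them in one statement."""
        if not rows:
            return
        placeholders = ", ".join(["(%s, %s, %s)"] * len(rows))
        sql = (
            "INSERT INTO t (a, b, c) VALUES " + placeholders +
            " ON DUPLICATE KEY UPDATE b = VALUES(b), c = VALUES(c)"
        )
        params = [v for row in rows for v in row]  # flatten for the driver
        with conn.cursor() as cur:
            cur.execute(sql, params)
        conn.commit()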

2) Never write what is already written: check before writing.
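
A rough sketch of that idea, assuming the same hypothetical table t. The in-memory set is just the simplest possible "already written" record; a shared cache or a Bloom filter would play the same role in a real system:

    # Skip writes for keys we have already stored.
    seen = set()  # keys this process has already written (illustrative)

    def write_if_new(conn, key, value):
        if key in seen:
            return  # already written, skip the round trip entirely
        with conn.cursor() as cur:
            # INSERT IGNORE makes the write a no-op if another writer won the race
            cur.execute("INSERT IGNORE INTO t (a, b) VALUES (%s, %s)", (key, value))
        conn.commit()
        seen.add(key)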

3) Queue the writes and batch-process them.
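
A sketch of the queue-and-batch idea using only the Python standard library. The writer thread drains the queue and flushes via the bulk_upsert helper from the first sketch; the batch size is an arbitrary placeholder:

    import queue
    import threading

    write_q = queue.Queue()  # producers put (a, b, c) tuples here
    BATCH_SIZE = 500         # arbitrary; tune for your workload

    def writer_loop(conn):
        """Drain the queue, flushing up to BATCH_SIZE rows per statement."""
        while True:
            batch = [write_q.get()]  # block until at least one row arrives
            while len(batch) < BATCH_SIZE:
                try:
                    batch.append(write_q.get_nowait())
                except queue.Empty:
                    break
            bulk_upsert(conn, batch)

    # threading.Thread(target=writer_loop, args=(conn,), daemon=True).start()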

4) Write to different tables based on the data. For example, all URLs starting with a-m in one table,
n-z in another, so that locks are not held on a single table and writes are faster.
    Queues should be in place, and each queue should have a reader configured to decide which table to write to.
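
A sketch of that routing, with one queue per shard so each table's writes stay serial. The two-way a-m / n-z split and the table names are illustrative, not a recommendation:

    import queue

    SHARDS = ("urls_a_m", "urls_n_z")
    shard_queues = {name: queue.Queue() for name in SHARDS}

    def table_for(url):
        """Pick a table by the first letter after the scheme (illustrative split)."""
        first = url.split("://", 1)[-1][:1].lower()
        # non-letters fall into the second bucket here
        return "urls_a_m" if "a" <= first <= "m" else "urls_n_z"

    def enqueue(url, payload):
        shard_queues[table_for(url)].put(payload)
        # each shard queue gets its own writer_loop draining into its own table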
    

Write ops (roughly per second) of MySQL/Postgres:


2,000-5,000 on a small machine (2 GB RAM, dual core).

5,000-20,000
