action #34531

Updated by szarate about 6 years ago

Currently the table `job_module_needles` has 59M rows and is about 3 GB on disk (plus the indexes, which are also quite large, although a VACUUM FULL might help there).
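
For reference, a minimal sketch of the clean-up mentioned above, assuming it would be run during a maintenance window (VACUUM FULL takes an ACCESS EXCLUSIVE lock and rewrites the whole table):

     -- Rewrites job_module_needles and rebuilds its indexes, reclaiming
     -- the bloat, but locks the table for the duration:
     VACUUM FULL VERBOSE job_module_needles;

     -- If only the indexes are bloated, rebuilding them may be enough:
     REINDEX TABLE job_module_needles;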

The indexes themselves are not the problem; the real need is to figure out a better way to calculate the last time a needle was used without keeping that much information.
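
One possible direction, purely as an illustration (the column name and values below are assumptions, not something decided in this ticket): record a single "last seen" timestamp per needle instead of keeping every job_module/needle pair, so the question becomes a plain column read rather than an aggregate over 59M join-table rows.

     -- Hypothetical sketch: the column name is an assumption.
     ALTER TABLE needles ADD COLUMN last_seen_at timestamp;

     -- Updated whenever a job module uses the needle:
     UPDATE needles SET last_seen_at = now() WHERE id = 42;

     -- "Last time a needle was used" then becomes a cheap lookup:
     SELECT filename, last_seen_at FROM needles ORDER BY last_seen_at DESC;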

Also a point here: use this task to evaluate the feasibility of starting to mix Mojo::Pg with DBIx (since we don't really rely on SQLite anymore).

      

     openqa=# SELECT nspname || '.' || relname AS "relation", 
     openqa-#       pg_size_pretty(pg_relation_size(C.oid)) AS "size" 
     openqa-#     FROM pg_class C 
     openqa-#     LEFT JOIN pg_namespace N ON (N.oid = C.relnamespace) 
     openqa-#     WHERE nspname NOT IN ('pg_catalog', 'information_schema') 
     openqa-#     ORDER BY pg_relation_size(C.oid) DESC 
     openqa-#     LIMIT 20; 
                          relation                     |  size
     --------------------------------------------------+---------
     public.job_module_needles_needle_id_job_module_id | 4716 MB
     public.job_module_needles_idx_needle_id           | 3878 MB
     public.job_module_needles                         | 3021 MB
     public.job_module_needles_idx_job_module_id       | 2501 MB
