Need some help. Search was fine while we were testing standard pages. Then we turned on PDF/Word as file types to search by, and suddenly results were taking a very long time and the postmaster process on the server was spiking the CPU (at times over 100%).
So yes, I disabled indexing, removed the file types and re-indexed just the standard pages.
But the table is still huge, sq_sch_idx (457688), and search is still too slow.
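For anyone wanting to check their own install, I was just eyeballing it from psql with something like this (assuming a Postgres new enough to have the size helper functions):

    -- rough check of the row count and on-disk size of the search index table
    SELECT count(*) FROM sq_sch_idx;
    SELECT pg_size_pretty(pg_relation_size('sq_sch_idx'));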
If I empty this table (toggling off indexing first) and then re-index, are there any issues?
cheers
Did you do a full vacuum on the table after making large changes to it?
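i.e. something along these lines in psql against the Matrix database (table name taken from your post):

    -- reclaim dead space and refresh planner stats on the search index table
    VACUUM FULL VERBOSE ANALYZE sq_sch_idx;

On the 8.x series, a database-wide VACUUM VERBOSE should also report at the end whether your free space map settings are big enough.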
yup.
Managed to reduce a 700MB DB back to its normal 300MB, but I'm still getting slow searches. I really need to remove any leftovers from the 'enable PDF, antiword' event. Looking over the data in that table, it appears to be search indexes.
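For what it's worth, I was checking the overall size and eyeballing the leftover rows with roughly this (again assuming the standard size helpers are available):

    -- overall database size
    SELECT pg_size_pretty(pg_database_size(current_database()));
    -- quick look at what is actually stored in the search index table
    SELECT * FROM sq_sch_idx LIMIT 20;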
Has anyone tried emptying it and re-indexing? I only need the search module externally now for our course listings (standard pages delivering content via metadata).
Enabling the PDF/Word indexing tools does not change the index size unless you do a full reindex of your files. So unless you did that, your table is probably big just because you have a lot of content.
You can truncate it and do a full reindex without any problems if you want to give it a go.
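Roughly this, with indexing toggled off in the Search Manager first and a DB backup taken just in case:

    -- empty the search index table; the site content itself is untouched
    TRUNCATE TABLE sq_sch_idx;

Then kick off the full re-index from the Search Manager as you normally would and the table will be rebuilt from your current content.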
I think my problem was a known bug in 3.20.2 (so I'm told by Squiz support), so I upgraded this morning to 3.22.0.
We have 1,000 pages and 140 PDFs (some of them 5MB+), and I had been keeping the DB hovering around 300MB for some time.
Then it jumped to over 700MB and started maxing out the CPUs while doing a simple keyword search.
After truncating that table and pointing the Search Manager at two sections (standard pages) rather than the whole site, I ended up with a 50MB DB.
After re-indexing I'm back up to 300MB, but I couldn't do a full vacuum without max_fsm_pages errors.
I put max_fsm_pages up to 40,000, vacuumed, then put it back to the default.
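For anyone else hitting this: the change was just bumping max_fsm_pages = 40000 in postgresql.conf (Postgres needed a restart for that one, and the value was picked from the vacuum warnings rather than anything scientific) and then re-running:

    -- full vacuum of the whole database once the free space map was big enough
    VACUUM FULL VERBOSE ANALYZE;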
Of course, we now have a support call in for your sysadmins to auto-tune Postgres for us :ph34r: