I recently migrated two Rails projects from Sphinx to Postgres full-text search, mainly because the applications were small and I didn’t see the benefit of running another service, and hence another point of failure.

In both cases the number of documents and the level of search activity were low, so load on the Postgres server was not a concern.

In one of the applications, searchable content has to be filtered for security, and treating Sphinx as a dumb search-content repository was annoying. By moving it all into Postgres I can have the fuzzy text searching and the necessary security checks (against other tables) all in one place.

My general architecture / flow is:

* For each searchable table: add a `search_content` column of type `tsvector`

* Create a GIN index on the `search_content` column

* If the columns can be indexed as-is and we don’t need content from any other tables, we can use the built-in `tsvector_update_trigger` trigger to keep the search index up to date:
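As a sketch, assuming a hypothetical `contacts` table with `first_name` and `last_name` columns, the whole setup might look like this:

```sql
-- Add the search column and index it
ALTER TABLE contacts ADD COLUMN search_content tsvector;

CREATE INDEX index_contacts_on_search_content
  ON contacts USING gin (search_content);

-- Built-in trigger: concatenates the listed columns (treating NULLs as
-- empty strings) and stores the tsvector on every INSERT or UPDATE
CREATE TRIGGER contacts_search_content_update
  BEFORE INSERT OR UPDATE ON contacts
  FOR EACH ROW EXECUTE PROCEDURE
    tsvector_update_trigger(search_content, 'pg_catalog.english',
                            first_name, last_name);
```

A nice property of `tsvector_update_trigger` is that it handles NULL columns for you, which the hand-written version below does not.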

However, if the search content pulls from other tables then we need to write a trigger function by hand. Don’t forget to use COALESCE if any of your searchable columns can be NULL. If you forget this and allow NULLs to creep in, a single NULL will make the whole tsvector NULL and you’ll wonder why you have no searchable content.
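Here is a sketch of such a trigger function, again assuming a hypothetical `contacts` table that also indexes the name of an associated company from a separate `companies` table:

```sql
CREATE FUNCTION contacts_search_content_trigger() RETURNS trigger AS $$
BEGIN
  -- COALESCE every piece: string concatenation with a single NULL
  -- would otherwise NULL out the entire tsvector
  NEW.search_content :=
    to_tsvector('pg_catalog.english',
      coalesce(NEW.first_name, '') || ' ' ||
      coalesce(NEW.last_name,  '') || ' ' ||
      coalesce((SELECT name FROM companies
                 WHERE id = NEW.company_id), ''));
  RETURN NEW;
END
$$ LANGUAGE plpgsql;

CREATE TRIGGER contacts_search_content_update
  BEFORE INSERT OR UPDATE ON contacts
  FOR EACH ROW EXECUTE PROCEDURE contacts_search_content_trigger();
```

One caveat with this approach: the trigger only fires when the contact row changes, so if the company is renamed you need some other mechanism to re-index the affected contacts.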

Now we need to seed the initial search index. In this case I just leveraged ActiveRecord’s `touch` method, since the resulting UPDATE fires the trigger:

```ruby
Contact.find_each(&:touch)
```

Finally, performing searches is pretty straightforward. I wish there were a way to avoid having Postgres parse the search term and construct the query structure twice, but for these projects it’s a negligible cost.
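For example, a query against the assumed `contacts` table might look like this (search term invented); note that `plainto_tsquery` appears twice, once to match and once to rank, which is the double parsing mentioned above:

```sql
SELECT *
  FROM contacts
 WHERE search_content @@ plainto_tsquery('english', 'acme widgets')
 ORDER BY ts_rank(search_content,
                  plainto_tsquery('english', 'acme widgets')) DESC;
```

The GIN index satisfies the `@@` match in the WHERE clause; `ts_rank` is then computed only for the matching rows.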