I have a large table with a jsonb column. One of the fields in the jsonb is an array of objects, and I need to get a unique list of values from within that array. Here is a simplified equivalent example.
create table objects(
id serial4 not null,
object_data jsonb not null
);
Say it has entries like the ones below:
insert into objects (object_data) values
( '{"id": 1, "color": "green", "alias" : [{"name":"abc", "order":"1"}, {"name":"abcd", "order":"2"}, {"name":"bbc", "order":"3"}]}'::jsonb),
( '{"id": 2, "color": "blue", "alias" : [{"name":"bbc", "order":"1"}, {"name":"abcd", "order":"2"}, {"name":"bbcnn", "order":"3"}]}'::jsonb)
;
I want to get a unique list of values for name in the alias array. The query below works, but it's slow.
SELECT DISTINCT jsonb_array_elements(object_data -> 'alias') ->> 'name'
FROM objects;
There are only about 20 unique values for name and the table has around 350k entries.
We are using Postgres 12. Is there any indexing option that can speed up this query? I tried a B-tree and a GIN index on object_data -> 'alias', but the query always does a seq scan.
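For reference, the indexes I tried were roughly like the following (index names are just placeholders):

```sql
-- Expression B-tree index on the extracted alias array
CREATE INDEX objects_alias_btree_idx
    ON objects ((object_data -> 'alias'));

-- GIN index on the same expression, using the default jsonb_ops
CREATE INDEX objects_alias_gin_idx
    ON objects USING gin ((object_data -> 'alias'));
```

Neither index is used by the planner for the DISTINCT query above — EXPLAIN still shows a sequential scan over all ~350k rows.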