Dealing with large JSON arrays

I am trying to pivot some large JSON arrays into a row-by-row format (Postgres DB). So imagine:

{"searchResults":[{"id":"89705","score":42,"title":"Foo.","properties":{…

However, this array may contain hundreds of nested objects (the largest so far is 350), and I then need to repeat this for each row. I have looked at using a 'for loop', but that does not seem to be supported by Metabase. Are there any other suggestions on how to go about this?

Ideal Output

Array, ID, Score, Title

1, 89705, 42, foo
1, 89706, 34, bar
2, 90003, 54, thing
2, 98594, 53, that
(so 1 and 2 represent different rows in the initial table, each containing an unknown number of objects in the JSON data array)

Cheers

I think the only way is to use the native jsonb PostgreSQL functions in a custom SQL question; there you can create virtual fields, and then Metabase can cook it for you.
https://www.postgresql.org/docs/9.5/static/functions-json.html
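
A minimal sketch of that virtual-fields idea, assuming a hypothetical table artifact with a json column data (both names are placeholders, and the column is assumed to already be of type json or jsonb):

-- Hypothetical names: table "artifact", json column "data".
-- -> navigates into the structure; ->> extracts a value as text.
-- Each expression becomes a virtual column Metabase can filter and chart on.
select data -> 'searchResults' -> 0 ->> 'id'    as first_id,
       data -> 'searchResults' -> 0 ->> 'score' as first_score,
       data -> 'searchResults' -> 0 ->> 'title' as first_title
from artifact;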

It would be great if Metabase could somehow do this job directly from a JSON field, as it does for MongoDB…

Thanks for the suggestion. Solution courtesy of Stack Overflow:

-- Expand the searchResults array into one row per element;
-- WITH ORDINALITY numbers the elements within each array (idx).
select x.idx,
       x.val ->> 'id'    as id,
       x.val ->> 'score' as score,
       x.val ->> 'title' as title
from artifact,
     json_array_elements(artifact.data::json -> 'searchResults')
       with ordinality as x(val, idx);
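
One note on the result: idx from WITH ORDINALITY numbers the elements within each array, not the source rows. To reproduce the Array column from the ideal output above, you can also select an identifier of the source row. A sketch assuming artifact has a primary key column id (a hypothetical name):

-- Assumed: artifact.id is the table's primary key (hypothetical name).
-- Selecting it alongside idx shows which source row each element came from.
select a.id as source_row,
       x.idx,
       x.val ->> 'id'    as id,
       x.val ->> 'score' as score,
       x.val ->> 'title' as title
from artifact a,
     json_array_elements(a.data::json -> 'searchResults')
       with ordinality as x(val, idx);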