Is your feature request related to a problem? Please describe.
We have several fields named `field_name_X`, where X is an integer. This caps us at a fixed number of fields and forces us to manually build an array to iterate over. We would also need awkward logic to keep values packed at the lower indexes (e.g., `field_1` must never be null while `field_2` is non-null).
Describe the solution you'd like
We should use a second table and join it to the users table. This is preferable to making each field its own array, since the fields are strictly related to one another.
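A minimal sketch of the proposed design, assuming the new table is named `user_extra_fields` and carries the same three per-field attributes mentioned in the alternative below (all table and column names here are illustrative, not from the original request):

```sql
-- Hypothetical normalized design: one row per extra field, joined to users.
CREATE TABLE user_extra_fields (
    id       bigserial PRIMARY KEY,
    user_id  bigint  NOT NULL REFERENCES users (id) ON DELETE CASCADE,
    label    text    NOT NULL,
    value    text    NOT NULL,
    verified boolean NOT NULL DEFAULT false
);

-- Fetch a user together with all of their extra fields:
SELECT u.id, f.label, f.value, f.verified
FROM users u
LEFT JOIN user_extra_fields f ON f.user_id = u.id
WHERE u.id = $1
ORDER BY f.id;
```

This removes both problems above: there is no fixed upper bound on the number of fields, and there is no "keep them packed at the lower index" bookkeeping, because rows simply exist or not.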
Describe alternatives you've considered
We could add a single `extra_fields` column of type `JSONB[]` to the current table, containing objects with three keys: `label`, `value`, and `verified`. This looser schema could introduce errors, and it would be preferable to let Postgres enforce a strict schema.
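For comparison, the rejected alternative might look like the sketch below. The per-element shape would be enforced only by application code, since Postgres constrains the column's type (`jsonb[]`) but not the keys inside each object (names are illustrative):

```sql
-- Hypothetical looser design: an array of JSONB objects on the users table.
ALTER TABLE users
    ADD COLUMN extra_fields jsonb[] NOT NULL DEFAULT '{}';

-- Nothing stops a malformed element from being stored, e.g. with typo'd keys:
-- UPDATE users
-- SET extra_fields = extra_fields ||
--     '{"lable": "phone", "value": "555-0100", "verifed": true}'::jsonb
-- WHERE id = $1;
```

This illustrates the "introduction of errors" concern: the misspelled keys above would be accepted silently, whereas in the normalized design they would be rejected as unknown columns.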