How do you delete an entry in an array when using `createStore`?

This case isn't covered in the following documentation: https://docs.solidjs.com/guides/complex-state-management
It might be trivial, but I'm unsure of the right way to do it 🙏
33 Replies
edygar · 7mo ago
An easy way of doing so is using produce
setStore(produce(array => {
  array.splice(indexToRemove, 1)
}))
bigmistqke · 7mo ago
Afaik there is no fine-grained way of removing an index. With object keys there is setting them to undefined, but I don't think there is an equivalent for arrays. produce will look like a mutation, but it will generate a new array under the hood.
setStore((array) => {
  array.splice(indexToRemove, 1);
  return [...array];
})
or
setStore((array) => array.filter((value) => value !== valueToRemove))
are also possible. Relevant issue: https://github.com/solidjs/solid/issues/1748
Alex Lohr · 7mo ago
If you want to be even more performant, consider nulling entries and guarding the rendering with <Show />, and add an effect that filters out the empty entries in a batch once the array grows too large. Alternatively, if you have unique entries, you could also try our ReactiveSet community primitive instead of an Array.
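A rough sketch of what that could look like, assuming a store shaped like { items: (Item | null)[] }; the names and the compaction threshold are illustrative, and <Index> (rather than <For>) is used here so a nulled slot updates in place:

import { createEffect, Index, Show } from "solid-js";
import { createStore } from "solid-js/store";

type Item = { id: string; label: string };

const [store, setStore] = createStore<{ items: (Item | null)[] }>({
  items: [{ id: "a", label: "A" }, { id: "b", label: "B" }],
});

// "Delete" by nulling the slot: only that index's signal updates, no re-indexing.
const removeAt = (index: number) => setStore("items", index, null);

// Compact in one batch once enough empty slots have piled up (threshold is arbitrary).
createEffect(() => {
  const empty = store.items.filter((item) => item === null).length;
  if (empty > 20) setStore("items", (items) => items.filter((item) => item !== null));
});

// Guard the rendering so nulled slots simply render nothing.
const List = () => (
  <ul>
    <Index each={store.items}>
      {(item) => (
        <Show when={item()}>
          <li>{item()!.label}</li>
        </Show>
      )}
    </Index>
  </ul>
);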
edygar · 7mo ago
Also, you could switch to createMutable. ⚠️ Note that splice mutates the original array, which might cause bugs throughout the codebase
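A minimal sketch of that route (names are illustrative, not from the thread):

import { createMutable } from "solid-js/store";

const state = createMutable({ items: ["a", "b", "c"] });

// splice works here because the proxy intercepts the index/length writes,
// but it also mutates the object you originally passed in, hence the warning.
function removeAt(index: number) {
  state.items.splice(index, 1);
}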
bigmistqke · 7mo ago
Could you give an example?
edygar · 7mo ago
Imagine you implemented an Edit mode which reads in an initial state for the array (the original ref) and is not intended to react to any outside changes, but then you spliced the SAME array elsewhere. Your edit mode will not react immediately to the splice, and any future updates within the edit mode will be affected by it (they will no longer have the element in question).
bigmistqke · 7mo ago
But everything else in the store is mutated when using the default API. It's only arrays that you can't mutate/update in a fine-grained way.
edygar · 7mo ago
I'm recommending avoiding mutability just to keep the unidirectional data flow, as mutation might introduce unexpected behavior
bigmistqke · 7mo ago
but createStore is inherently mutable.
edygar · 7mo ago
It's not...
bigmistqke · 7mo ago
setStore(0, 1) is mutating the first index of the array. setStore('key', undefined) is mutating the object by deleting the key 'key'.
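Spelled out as a minimal sketch (the types and names are illustrative):

import { createStore } from "solid-js/store";

// Array store: writing to an index updates just that index.
const [list, setList] = createStore(["a", "b", "c"]);
setList(0, "z"); // list is now ["z", "b", "c"]

// Object store: setting a key to undefined removes that key.
const [obj, setObj] = createStore<{ key?: string; other: number }>({
  key: "value",
  other: 1,
});
setObj("key", undefined); // obj is now { other: 1 }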
edygar · 7mo ago
@bigmistqke WTF?! 🤣 TIL okokok https://playground.solidjs.com/anonymous/71ee47f5-8f18-48cc-be3a-e27fded91e1b I keep being tricked by my React premises
bigmistqke · 7mo ago
Ye man, been there! It's a bit of a mental shift. In React they need the immutability because otherwise they only have the diff to figure out what changed, so they need a path to follow. In Solid's store that's generally not needed because each node in the store is a signal under the hood. And mutation is a lot more GC-friendly than re-creating all these objects and arrays on each update.
edygar · 7mo ago
But it's confusing then: why do we have to return a different ref from the setStore updater callback?
bigmistqke · 7mo ago
Because that's how a createSignal([]) would work too. On an individual signal you can pass { equals: false }, but all signals in a store work with referential equality.
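For comparison, the same behaviour on a plain signal (a sketch; the names are made up):

import { createSignal } from "solid-js";

const [items, setItems] = createSignal<string[]>(["a", "b"]);

// Mutating in place and returning the same reference notifies nobody,
// because the default comparison is referential equality.
setItems((prev) => {
  prev.push("c");
  return prev; // same ref: subscribers don't re-run
});

// Returning a new reference does notify...
setItems((prev) => [...prev, "d"]);

// ...and { equals: false } opts a single signal out of the check entirely.
const [loose, setLoose] = createSignal<string[]>([], { equals: false });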
edygar · 7mo ago
This is very inconsistent :/ The fact that we can't configure this on createStore.
bigmistqke · 7mo ago
Ye, I agree. It is consistent in the sense that undefined actually does remove the index; it just does not re-index the array. The indices become empty instead.
edygar · 7mo ago
Does it actually apply delete ref? If not, it's just an assignment xD
bigmistqke · 7mo ago
No idea tbh
binajmen (OP) · 7mo ago
I was not expecting this level of "complexity" 😅 This is my current situation:
export type Fields = {
  title: string;
  fields: Array<Field>;
};

type Field = {
  id: string;
  label: string;
  inputs: Array<Input>;
};

type Input = {
  id: string;
  label: string;
  unit: string;
};

...

const [store, setStore] = createStore<Fields>(
  props.fields ?? { title: "", fields: [] },
);

...

setStore(
  "fields",
  fieldIndex,
  "inputs",
  store.fields[fieldIndex].inputs.filter((input) => input.id !== inputId),
);
I'm wondering if the filter() trick is acceptable, or is there a side effect I'm unaware of?
bigmistqke · 7mo ago
I would probably do setStore('fields', fieldIndex, 'inputs', inputs => inputs.filter(...)). It's a bit more compact, and because you don't read store.fields[fieldIndex].inputs yourself, you don't accidentally subscribe to it and end up with endlessly repeating createEffects.
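Filled in against the types from the snippet above, that would look something like this (removeInput is just an illustrative name):

// assumes the store/setStore declared in the earlier snippet
function removeInput(fieldIndex: number, inputId: string) {
  setStore("fields", fieldIndex, "inputs", (inputs) =>
    inputs.filter((input) => input.id !== inputId),
  );
}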
peerreynders · 7mo ago
I'd be inclined to go with:
setStore('fields', fieldIndex, 'inputs', (inputs) =>
  reconcile(
    inputs.filter((input) => input.id !== inputId),
    { merge: true }
  )
);
reconcile
edygar · 7mo ago
Honest question though: if you need indexed access to the array elements, why not make it an index? Like, inputs as an object? You can always iterate over all elements of this index with Object.entries or Object.values, but you wouldn't have to filter to remove an element; all it would take would be to refer to its index:
export type Fields = {
  title: string;
  fields: Record<string, Field>;
};

type Field = {
  id: string;
  label: string;
  inputs: Record<string, Input>;
};

type Input = {
  id: string;
  label: string;
  unit: string;
};

//…

const [store, setStore] = createStore<Fields>(
  props.fields ?? { title: "", fields: {} },
);

//…

setStore(
  "fields",
  fieldId,
  "inputs",
  inputId,
  undefined
);
peerreynders · 7mo ago
I could be wrong, but I suspect that Object.entries wouldn't give <For> the referential stability that it needs to operate correctly; Object.values should work however, and shouldn't be a show stopper as long as the key is also stored in the value.
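For example, with a flattened Record-shaped store (a simplified shape, not the exact one from above):

import { For } from "solid-js";
import { createStore } from "solid-js/store";

type Input = { id: string; label: string; unit: string };

const [store, setStore] = createStore<{ inputs: Record<string, Input> }>({
  inputs: {
    a: { id: "a", label: "Width", unit: "px" },
    b: { id: "b", label: "Height", unit: "px" },
  },
});

// Object.values returns a fresh array each run, but the item references inside it
// stay stable across unrelated updates, and each value carries its own id.
const InputList = () => (
  <ul>
    <For each={Object.values(store.inputs)}>
      {(input) => <li>{input.label} ({input.unit})</li>}
    </For>
  </ul>
);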
peerreynders · 7mo ago
Hypothetically the deterministic traversal order could also be leveraged
The traversal order, as of modern ECMAScript specification, is well-defined and consistent across implementations. Within each component of the prototype chain, all non-negative integer keys (those that can be array indices) will be traversed first in ascending order by value, then other string keys in ascending chronological order of property creation.
ref
peerreynders · 7mo ago
My learning progression:
- signals are great!
- stores are better!
- wait a minute, signals just work if you follow “data that is observed together, stays together”.
So stores are fine at the entry point of the reactive graph, but when it comes to derived (observed) values (on the way to the effects on the other end of the graph) the signal (memo) mindset should prevail. To some degree I see parallels to the relational DB domain, where initially data normalization (store shape) is king but then the real-life access patterns drive the de-normalization efforts.
binajmen (OP) · 7mo ago
I could lose the order, no?
edygar · 7mo ago
True, but if you are not intending to reorder and you are just appending, it works fine
binajmen (OP) · 7mo ago
Not sure how it will behave when saving in Postgres and then retrieving it back.
The actual operation should be a filter, slice etc...
I will trust Ryan on this one, I guess. I breathed fresh air for 5 minutes before re-reading all your responses. Thank you for your inputs! I will stick with setStore('fields', fieldIndex, 'inputs', inputs => inputs.filter(...)) for now. I have to read a little bit more about reconcile, as I'm not sure what the added value is; I'm sure there is one, I just want to understand it before applying it 🙂
edygar · 7mo ago
Yeah, this is a very safe option. You're right: without referential stability, any change would cause For to re-iterate all items. But with Index or with a createMemo it works fine.
peerreynders · 7mo ago
as I'm not sure what is the added value
You may be right in the filter case, but if you are merging Postgres results back in, it maintains referential stability for the items which didn't change in the interim; it essentially diffs the state, changing only what needs to be changed. It was my impression that Index was only an optimization for primitive values because, rather than swapping entire DOM blocks around (<For> behaviour), Index just swaps the associated text nodes around while still using strict equality on the (primitive value) items to track their position. None of that would help with the volatile entry tuple wrappers that Object.entries produces. And I'm not sure where createMemo could mitigate that volatility.
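A sketch of what merging server results back in could look like (the endpoint and names are hypothetical):

import { createStore, reconcile } from "solid-js/store";

type Input = { id: string; label: string; unit: string };

const [store, setStore] = createStore<{ inputs: Input[] }>({ inputs: [] });

async function refreshInputs() {
  const fresh: Input[] = await (await fetch("/api/inputs")).json();
  // reconcile diffs against the current value, keyed by "id", so items that
  // didn't change keep their references (and the DOM nodes keyed off them).
  setStore("inputs", reconcile(fresh, { key: "id" }));
}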
peerreynders · 7mo ago
There are differences. When using reconcile, subscriptions to the array itself will not be triggered, presumably because it uses splice to stabilize the identity of the array. Subscriptions to the length of the array are triggered in both scenarios, as the length will change regardless. What I wouldn't have predicted is the impact on the <For> behaviour. The plain filter case results in the deletion of the affected list item (in the middle of the list); this much I would have expected. In the reconcile case, however, the last list item is deleted; only the text node values were updated. This surprised me.
https://playground.solidjs.com/anonymous/894295bd-9ec5-4af0-bbc1-9fb134945e14
bigmistqke · 7mo ago
Ye, reconcile isn't too smart... it's a pretty simple diff that it performs.