GetNextIdValue should return the next id based on the largest id in the collection, not based on the last item's id #3

GetNextIdValue checks only the last item in the collection and generates a new id from that item's id. Instead, the method should find the item with the largest id in the whole collection and derive the new id from that.

The function is in JsonFlatFileDataStore/DocumentCollection.cs.
Tests are in JsonFlatFileDataStore.Test/CollectionTests.cs.
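
As a rough illustration of the bug and the proposed fix, here is a minimal, self-contained C# sketch. It does not use the library's real internals; `SampleCollection` and its members are hypothetical stand-ins, with plain integer ids for simplicity.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical stand-in for the real DocumentCollection; names and
// storage are illustrative only.
public class SampleCollection
{
    private readonly List<Dictionary<string, object>> _items = new();

    public void Insert(int id) => _items.Add(new Dictionary<string, object> { ["id"] = id });

    // Current behavior described in the issue: only the last item is checked.
    public int NextIdFromLastItem()
        => _items.Count == 0 ? 1 : Convert.ToInt32(_items[^1]["id"]) + 1;

    // Proposed behavior: find the largest id in the whole collection.
    public int NextIdFromLargestItem()
        => _items.Count == 0 ? 1 : _items.Max(i => Convert.ToInt32(i["id"])) + 1;
}

public static class Demo
{
    public static void Main()
    {
        var collection = new SampleCollection();
        collection.Insert(3);
        collection.Insert(1);
        collection.Insert(2); // ids are not in ascending order

        Console.WriteLine(collection.NextIdFromLastItem());    // 3 -> duplicates the existing id 3
        Console.WriteLine(collection.NextIdFromLargestItem()); // 4 -> safe
    }
}
```

Note that scanning with `Max` is O(n) per call, which is the efficiency concern raised in the comments below.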
Comments
I would like to take this item. Can I?

@loneshark99 sure! If you have any questions, just ask. I will add the "In Progress" label, but there is no hurry to complete this task.

@loneshark99 Hi. Do you have any questions regarding this task? If you are no longer interested in implementing it, just tell me and I will remove the "In Progress" label.

Hi. I must admit that it is probably not the most efficient solution, but I couldn't come up with a better one.

Thanks! Just make the PR and we can then think together about whether there is a more efficient way.

Hi there, I noticed that a PR was created for this issue, and I see that the issue has been around for a while. I was wondering if we could explore potential solutions to the efficiency concern. Would it make sense to add a property to the Datastore class that keeps track of the largest ID? This property could be updated whenever an item is inserted or updated, based on a comparison with the current largest value. This approach might eliminate the need to search the entire document, especially for datastores that start out empty. A sketch of this idea is shown below. What do you think? I'd love to hear your thoughts! 🚀
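
A minimal sketch of that cached-max idea, assuming integer ids; `CachedMaxIdStore` and its members are hypothetical and do not mirror the library's actual code:

```csharp
using System.Collections.Generic;

// Hypothetical sketch of the comment's idea: cache the largest id seen
// so far, so the next id is computed in O(1) without scanning.
public class CachedMaxIdStore
{
    private readonly List<Dictionary<string, object>> _items = new();
    private int _largestId; // 0 until the first item with an integer id arrives

    public void Insert(Dictionary<string, object> item)
    {
        _items.Add(item);

        // Update the cache on every insert (an update path would do the same).
        if (item.TryGetValue("id", out var raw) && raw is int id && id > _largestId)
            _largestId = id;
    }

    public int GetNextIdValue() => _largestId + 1;
}
```

One caveat with this approach: deleting the item that holds the largest id leaves the cache stale, so a delete would either need a rescan or the id sequence would simply keep gaps.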