I am working on a project where I need to manage complex data binding scenarios, and I’d like some advice on best practices. I am dealing with a dynamic data grid where users can add, remove, and edit rows. Each row contains various input fields, and the data structure is somewhat nested.
What are the best practices for optimizing performance when dealing with a large amount of dynamic data in DotVVM? Are there any strategies for minimizing re-rendering or managing large datasets efficiently?
How can I implement validation rules that apply to individual rows as well as to the overall dataset? What approaches work best for ensuring that validation errors are clearly communicated to the user?
What techniques would you suggest for managing the state of such complex forms to ensure data integrity and a good user experience?
That is a very general question, so it’s hard to answer specifically. It depends on what you consider to be a “large” dataset and what kind of performance is acceptable to you (is it an internal site on a gigabit local network, or should it also work over a crappy mobile connection, etc.?). Generally, there are a few common performance bottlenecks in DotVVM applications:
- Network roundtrips with a large viewmodel, which can be partially mitigated with the server-side viewmodel cache (there’s a short config sketch below this list). Good practice is to keep viewmodels in the single-digit megabytes at most; DotVVM will start printing warnings when you get over 5 MB. This obviously matters less on a fast network, but serialization and deserialization still aren’t free.
- Too many client-side components - if you display 1000 table rows at once, even a plain HTML table isn’t that fast; if you put 5 Business Pack ComboBoxes in each row, it’s going to be really slow.
- Unnecessary server-side processing - DotVVM commands will run the Init, Load, and PreRender methods on your viewmodel, so if you run tens of expensive DB queries in them, all commands will be slow.
  - You can load some data only in the initial request by checking Context.IsPostBack (see the viewmodel sketch below this list).
  - Server-side timings can be easily monitored using MiniProfiler (a minimal setup sketch is below as well).
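
To make the first point concrete, here is a minimal sketch of turning on the server-side viewmodel cache. It assumes a reasonably recent DotVVM version (the switch lives under ExperimentalFeatures), and the route name "DataGridPage" is just a placeholder:

```csharp
// Inside your existing DotvvmStartup.Configure(DotvvmConfiguration config, string applicationPath):

// Cache serialized viewmodels on the server so that postbacks only transfer a diff.
config.ExperimentalFeatures.ServerSideViewModelCache.EnableForAllRoutes();

// ...or enable it only for the heavy pages ("DataGridPage" is a placeholder route name):
// config.ExperimentalFeatures.ServerSideViewModelCache.EnableForRoutes("DataGridPage");
```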
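For the IsPostBack point, here is a sketch of how the viewmodel could look. RowModel and the Load* methods are placeholders for your own row type and data access; the Bind direction on the lookup list is an extra trick to shrink what gets transferred on postbacks:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using DotVVM.Framework.ViewModel;

// Placeholder row type, just for illustration.
public class RowModel
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
}

public class DataGridViewModel : DotvvmViewModelBase
{
    // Editable rows keep the default Bind direction (Both), so edits travel back on postbacks.
    public List<RowModel> Rows { get; set; } = new();

    // Read-only lookup data (e.g. combobox options) doesn't need to be re-sent on every
    // postback; transferring it only with the initial GET keeps the viewmodel payload smaller.
    [Bind(Direction.ServerToClientFirstRequest)]
    public List<string> Categories { get; set; } = new();

    public override async Task PreRender()
    {
        if (!Context.IsPostBack)
        {
            // Expensive queries run only on the initial GET request; commands skip them,
            // because the client sends the edited Rows back anyway.
            Rows = await LoadRowsAsync();
            Categories = await LoadCategoriesAsync();
        }
        await base.PreRender();
    }

    private Task<List<RowModel>> LoadRowsAsync()
        => Task.FromResult(new List<RowModel>());      // replace with a real DB query

    private Task<List<string>> LoadCategoriesAsync()
        => Task.FromResult(new List<string>());        // replace with a real DB query
}
```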
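And a minimal MiniProfiler setup for ASP.NET Core, assuming the MiniProfiler.AspNetCore.Mvc package (there is also a DotVVM tracing integration for MiniProfiler which I’m not showing here); the step name is arbitrary:

```csharp
using StackExchange.Profiling;

var builder = WebApplication.CreateBuilder(args);

// Register MiniProfiler; the results UI will be served at /profiler/results.
builder.Services.AddMiniProfiler(options =>
{
    options.RouteBasePath = "/profiler";
});

var app = builder.Build();
app.UseMiniProfiler();
// ... the usual DotVVM middleware registration goes here ...
app.Run();

// Anywhere in server-side code (viewmodel methods, services), wrap suspicious code
// in a named step to see it in the profiler output:
// using (MiniProfiler.Current.Step("Load grid rows"))
// {
//     rows = await LoadRowsAsync();
// }
```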
I’d primarily suggest making a quick prototype with synthetic data to get a feel for it. I don’t know what exactly you mean by dynamic grids; in general this is a bit hard in DotVVM, as it likes statically typed viewmodels, UI defined in dothtml markup, and so on. The default GridView doesn’t support interactive column reordering, although you can quite easily change its Columns collection during page loads.
Client-side DotVVM is based on knockout.js, which has the annoying property that all values in the viewmodel have to be wrapped in ko.observable, which is a bit expensive in large quantities. However, DotVVM initializes these ko.observable objects lazily, only when they are needed in the UI. If you only access parts of the viewmodel from custom JS code through the dotvvm.state API, or from a JsComponent, you can save quite a bit of client-side performance. Given that React is also IMHO better suited to very dynamic layouts, you can write that one part of the application as a React component, and only use DotVVM for the communication with the server and all the boring CRUDs, where it shines.
Feel free to ask further questions, and let us know if you encounter unexpected problems in your experiments. It’s quite possible that you’ll hit some performance issue which would be easy to fix in the framework.