Performance impact of lots of elements

Our department makes heavy use of a central URI repository, so I’ve recreated it using UD with added functionality. It consists of a search field, plus all the URIs listed on the page under their various sub/categories.

The main loop iterates through the links, populating UDCollection items, and thus creates a lot of nodes that are evaluated by the JavaScript.

I understand that this wasn’t how UD was intended to be used, but is there any way I can reduce the overhead? Either by a config change, a different way of calling UD elements, or even by restructuring the code in some way that reduces the number of DIVs created.

Cheers!

Hi @fromtheinternet, I’m not sure I fully understand the question, as I can’t see how you are doing this. If you post a snippet of your code, that will help me and others see what you are trying to accomplish, and I’m sure someone will chip in with a solution.

1 Like

What kind of performance impact are you seeing?
Long time to load?
Slow once loaded?
Other / mixed bag?

You’d need to produce a snippet so we can look at it, maybe generating a similar quantity of elements using dummy data so we can observe.

Otherwise, without more context, poor performance could be due to:

  • Generic inefficiencies in your code (e.g. using += to build arrays and things like that)
  • Inefficiencies in the way you are creating UD components (e.g. not using cache variables and reloading everything every time)
  • Other…
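As an illustration of the first point, here is a minimal sketch (dummy data, not from the original code) of why += on arrays gets slow at scale:

```powershell
# Building a large collection with += re-allocates and copies the whole
# array on every iteration (roughly O(n^2) overall).
$slow = @()
foreach ($i in 1..5000) { $slow += $i }

# A generic List appends in place with amortised O(1) cost per item.
$fast = [System.Collections.Generic.List[int]]::new()
foreach ($i in 1..5000) { $fast.Add($i) }
```

The difference is barely noticeable for a handful of items but grows quickly with the kind of element counts described above.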

Ultimately, if nothing works with the current components, you always have the option of building custom components that are better suited to what you want to do.

2 Likes

Probably a bit of a stretch trying to explain things without a code example!

The following code extract ‘works’, in that it outputs the correct data in the correct format. Briefly, it loops through each category, and then its respective subcategories, before putting each link on the page.

$Cache:Links = Import-Csv -Path $PSScriptRoot\links.csv

New-UDCollapsible -Items {
    foreach ($category in ($Cache:Links | Select-Object Category, Icon -Unique)) {
        New-UDCollapsibleItem -Title $category.Category -Icon $category.Icon -Content {
            New-UDLayout -Columns 3 -Content {
                foreach ($subcategory in ($Cache:Links | Where-Object Category -EQ $category.Category | Select-Object -ExpandProperty Subcategory -Unique)) {
                    New-UDCollection -Content {
                        New-UDCollectionItem -Content { New-UDMuTypography -Text $subcategory }
                        foreach ($link in ($Cache:Links | Where-Object Subcategory -EQ $subcategory)) {
                            New-UDCollectionItem -Url $link.URL -Content { New-UDMuTypography -Text $link.Name }
                        }
                    }
                }
            }
        }
    }
}

As for the CSV, it consists of Name,Icon,Category,Subcategory,Keywords and URL fields. Things would be a lot neater if I could use a relational database, but never mind.

All the piping in the script above isn’t going to be very fast, but that’s server-side and not where the delay lies. As an aside, I’ve tried normalising the CSV into an array of objects with nested properties so that I can just reference them directly, but for some reason ReactJS throws error #31 if I try to use it. That’s a different issue entirely, though.
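For what it’s worth, one way that repeated piping could be reduced is by grouping the CSV rows once up front with Group-Object; this is only a sketch ($byCategory and $bySubcategory are illustrative names), and it addresses the server-side work rather than the client-side node count:

```powershell
# Sketch: scan $Cache:Links once per level and group, instead of
# re-piping the whole CSV for every category and subcategory.
$byCategory = $Cache:Links | Group-Object Category -AsHashTable -AsString

New-UDCollapsible -Items {
    foreach ($category in ($Cache:Links | Select-Object Category, Icon -Unique)) {
        New-UDCollapsibleItem -Title $category.Category -Icon $category.Icon -Content {
            New-UDLayout -Columns 3 -Content {
                # One grouping per category replaces two Where-Object scans
                $bySubcategory = $byCategory[$category.Category] |
                    Group-Object Subcategory -AsHashTable -AsString
                foreach ($subcategory in $bySubcategory.Keys) {
                    New-UDCollection -Content {
                        New-UDCollectionItem -Content { New-UDMuTypography -Text $subcategory }
                        foreach ($link in $bySubcategory[$subcategory]) {
                            New-UDCollectionItem -Url $link.URL -Content { New-UDMuTypography -Text $link.Name }
                        }
                    }
                }
            }
        }
    }
}
```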

In the screenshot below is some of the output from the Chrome performance recorder for the page load; each of those columns is a script evaluation and load event. Removing the code above clears these events, so the delay is evidently caused by the JS evaluation of each element/node.