Well, high-performance graphing libraries draw on canvas instead of using the DOM; see, e.g., what netdata is capable of. Canvas has the drawback that much more code has to be written, though, so I understand that avoiding it is desirable in a general-purpose tool.
Sure, I understand that some limits make sense. I am trying to learn where exactly the bottleneck is at the 2k level, because I doubt it's the database, the network, or the back-end; for those components, 2k records should be a breeze.
Right now I'm using custom ggplot-based diagrams for some performance analyses. My goal here is to put them in a form that a person unfamiliar with ggplot can use and modify for their own purposes, as opposed to loading the data into a spreadsheet.
I've reproduced a stacked area chart displaying ~30 series of daily measurements over a year, one that was particularly useful in the past. Except at that scale I had to aggregate it to weekly values to get below 2k data points: 30 series × 365 days is about 11,000 points, while weekly it's roughly 1,600.
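To make the arithmetic concrete, here is a minimal sketch of that daily-to-weekly downsampling. The data is synthetic and the series names are hypothetical; the point is just that ISO-week bucketing takes ~11k daily points down to well under the 2k limit.

```python
from datetime import date, timedelta
from collections import defaultdict
import random

# Hypothetical data: 30 series of daily measurements over one year.
random.seed(0)
start = date(2023, 1, 1)
days = [start + timedelta(days=i) for i in range(365)]
series = {f"series_{i}": [random.random() for _ in days] for i in range(30)}

daily_points = sum(len(v) for v in series.values())  # 30 * 365 = 10,950

def weekly_mean(dates, values):
    """Bucket daily values by ISO (year, week) and average each bucket."""
    buckets = defaultdict(list)
    for d, v in zip(dates, values):
        buckets[d.isocalendar()[:2]].append(v)  # key: (ISO year, ISO week)
    return {wk: sum(vs) / len(vs) for wk, vs in sorted(buckets.items())}

weekly = {name: weekly_mean(days, vals) for name, vals in series.items()}
weekly_points = sum(len(v) for v in weekly.values())  # ~30 * 53 buckets
```

Mean is just one choice of aggregate; for a stacked area chart of measurements, a sum or max per week may be more faithful depending on what the metric represents.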