it is a very strange world when my terminal emulator program is taking up 1.2GB of memory.
@lgw4 pretty much yeah.
Electron everything was such a mistake.
2010s: the browser is the new thin client
2020s: what if we bundled the browser separately into each application
@aurynn I still remember being concerned when my terminal emulator was taking up nearly 16kiB of RAM. Admittedly that was 1/4 of the address map on the system, and the screen RAM took another 16kiB. But even so, a terminal emulator taking over a GiB of RAM is very ādo you have a memory leak? Sure seems like a memory leakā.
@ewenmcneill well, I have, uh, 40 open? And each one has over 10k lines of back buffer. And a full GPU display buffer.
@aurynn @ewenmcneill Even if every line in those buffers took up a whole kilobyte, I think that's still rather a lot
@aurynn 40 * 10k lines * 255 bytes/line is still under 100 MiB. Plus say a generous 25 MiB for code.
Your terminal program is taking at least an order of magnitude more RAM than I'd expect :-/
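(A quick sanity check of that back-of-the-envelope estimate; the 40 windows, 10k-line scrollback, 255 bytes/line and 25 MiB for code are the figures from the thread, not measurements:)

```python
# Back-of-the-envelope check of the numbers quoted in the thread.
windows = 40                 # open terminal windows (from the thread)
lines_per_window = 10_000    # scrollback lines per window (from the thread)
bytes_per_line = 255         # assumed plain-text bytes per line (from the thread)

scrollback = windows * lines_per_window * bytes_per_line
code = 25 * 2**20            # "a generous 25 MiB for code"

print(f"scrollback: {scrollback / 2**20:.1f} MiB")           # ~97.3 MiB
print(f"total:      {(scrollback + code) / 2**20:.1f} MiB")  # ~122.3 MiB
```

Either way the estimate comes out roughly an order of magnitude below 1.2 GB, which is the point being made.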
@buherator @aurynn Allegedly for faster rendering. And to be fair, alacritty does feel much faster than urxvt, or xterm. (I didn't measure anything, though.)
@buherator @aurynn Tools that produce a lot of terminal output run faster. Still not as fast as 2>&1 mylog, though. I think we can agree on "1.2GB RAM for a terminal emulator is bad".
@buherator @aurynn while Alacritty doesn't currently support it, I could imagine that offloading Sixel rendering to the GPU might be a good idea.
More generally, one might argue that on many desktop systems the GPU sits around idling anyway, and that offloading graphics-related computations to it is "the right thing to do". But I do see that this also adds complexity, possible bugs/vulns, etc.
I'm personally kinda undecided on what's the right take here.