Not using all of the available RAM is not a good thing…
E: my PC doesn’t have anywhere near 40GB of RAM and yet still runs Photoshop just fine. Why do you think that is? 🤔
This is only remotely true if you have a box dedicated to doing one single thing and nothing else. That is almost certainly not the case for the vast majority of Photoshop users
This is not remotely true.
Consumer software running on a consumer OS should not be grabbing all available RAM just because it can. Doing so will cause other applications to be moved to swap and have to be loaded back into RAM when the user goes to use them. In a server environment, such as a box running a SQL server, it would make more sense to grab all available RAM and aggressively cache frequently accessed data so it can be served sooner, on the assumption that the server’s primary role is to perform SQL operations as quickly as possible.
Specifically with Photoshop, what would be the benefit of it aggressively reserving RAM beyond what it needs to function?
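To make the contrast concrete, here is a rough sketch in C of the kind of policy decision involved (entirely made up, not taken from Photoshop or any real database server): a tiny read cache with an explicit RAM budget. The point is that “grab lots of RAM and cache aggressively” is a deliberate choice the application makes, sized to its role on the machine, not something the OS does for it.

```c
/* Minimal sketch: a direct-mapped read cache with an explicit byte budget.
 * All names and sizes are hypothetical, purely for illustration. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define BLOCK_SIZE 4096                  /* hypothetical block/page size */

struct cache {
    size_t budget_bytes;                 /* how much RAM we allow ourselves */
    size_t nslots;
    long  *keys;                         /* block id per slot, -1 = empty */
    char  *data;                         /* nslots * BLOCK_SIZE cached bytes */
};

static struct cache cache_create(size_t budget_bytes)
{
    struct cache c;
    c.budget_bytes = budget_bytes;
    c.nslots = budget_bytes / BLOCK_SIZE;
    c.keys = malloc(c.nslots * sizeof *c.keys);
    c.data = malloc(c.nslots * BLOCK_SIZE);
    for (size_t i = 0; i < c.nslots; i++)
        c.keys[i] = -1;
    return c;
}

/* Stand-in for a slow disk read. */
static void slow_disk_read(long block, char *out)
{
    memset(out, (int)(block & 0xff), BLOCK_SIZE);
}

static const char *cache_get(struct cache *c, long block)
{
    size_t slot = (size_t)block % c->nslots;   /* direct-mapped: one slot per key */
    if (c->keys[slot] != block) {              /* miss: evict and refill from disk */
        slow_disk_read(block, c->data + slot * BLOCK_SIZE);
        c->keys[slot] = block;
    }
    return c->data + slot * BLOCK_SIZE;
}

int main(void)
{
    /* Dedicate 64 MiB of RAM to caching; a server would pick a number close
     * to the machine's total RAM, a desktop app a much smaller one. */
    struct cache c = cache_create(64u * 1024u * 1024u);
    cache_get(&c, 42);                   /* first access: goes to "disk" */
    cache_get(&c, 42);                   /* second access: served from RAM */
    printf("cached up to %zu blocks\n", c.nslots);
    free(c.keys);
    free(c.data);
    return 0;
}
```

The only difference between the “server” and “consumer” behavior in this sketch is the budget passed to cache_create; the aggressive version is a choice, not a requirement.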
This is nonsense. Any modern OS will allocate RAM as necessary. If another application needs it, the OS will allocate some to it.
this is not true.
it entirely depends on the specific application.
there is no OS-level, standardized, dynamic allocation of RAM (definitely not on windows, i assume it’s the same for OSX).
this is because most programming languages handle RAM allocation within the individual program, so the OS can’t allocate RAM however it wants.
the OS could put processes to “sleep”, but that’s basically just the previously mentioned swap memory and leads to HD degradation and poor performance/hiccups, which is why it’s not used much…
so, no.
RAM is usually NOT dynamically allocated by the OS.
it CAN be dynamically allocated by individual programs, IF they are written in a way that supports dynamic allocation of RAM, which some languages do well, others not so much…
it’s certainly not universally true.
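to make that concrete, here’s a tiny sketch in C (a made-up example, not from any real program) of what “the program dynamically allocates its own RAM” looks like in practice: the app asks for more memory as its data grows and gives it back when it’s done. the OS hands out the pages when asked, but the program decides when to ask:

```c
/* Minimal sketch of in-program dynamic allocation: grow a buffer on demand
 * with malloc/realloc, then release it.  Purely illustrative. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t capacity = 1024;
    size_t used = 0;
    double *samples = malloc(capacity * sizeof *samples);
    if (!samples) return 1;

    for (int i = 0; i < 100000; i++) {
        if (used == capacity) {                       /* out of room: ask for more */
            capacity *= 2;
            double *bigger = realloc(samples, capacity * sizeof *samples);
            if (!bigger) { free(samples); return 1; }
            samples = bigger;
        }
        samples[used++] = i * 0.5;
    }

    printf("holding %zu values in %zu bytes\n", used, capacity * sizeof *samples);
    free(samples);                                    /* hand the memory back */
    return 0;
}
```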
also, what you describe when saying:
“Any modern OS will allocate RAM as necessary. If another application needs it, the OS will allocate some to it.”
…is literally swap. that’s exactly what the previous user said.
and swap is not the same as “allocating RAM when a program needs it”, instead it’s the OS going “oh shit! I’m out of RAM and need more NOW, or I’m going to crash! better be safe and steal some memory from disk!”
what happens is:
the OS runs out of RAM and needs more, so it marks a portion of the nearest available disk as swap space and starts using that instead.
HDs are not built for this use case, so whichever processes use the swap space become slooooooow and responsiveness suffers greatly.
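if you want to see how much swap is configured and in use on your own machine, here’s a small Linux-only sketch using sysinfo(2) that just prints free vs. total RAM and swap (purely illustrative, the numbers depend entirely on your system):

```c
/* Print free vs. total RAM and swap on Linux via sysinfo(2). */
#include <stdio.h>
#include <sys/sysinfo.h>

int main(void)
{
    struct sysinfo si;
    if (sysinfo(&si) != 0) {
        perror("sysinfo");
        return 1;
    }
    unsigned long long unit = si.mem_unit;   /* size of each counted unit in bytes */
    printf("RAM : %llu / %llu MiB free\n",
           (unsigned long long)si.freeram  * unit / (1024 * 1024),
           (unsigned long long)si.totalram * unit / (1024 * 1024));
    printf("swap: %llu / %llu MiB free\n",
           (unsigned long long)si.freeswap  * unit / (1024 * 1024),
           (unsigned long long)si.totalswap * unit / (1024 * 1024));
    return 0;
}
```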
on top of that, memory of any kind is built for a certain amount of read/write operations. this is also considered the “lifespan” of a memory component.
RAM is built for a LOT of (very fast) R/W operations.
hard drives are NOT built for that.
RAM is designed to handle at least an order of magnitude more R/W operations than a hard drive or SSD, so when a computer uses swap excessively, instead of as a very last resort as intended, the disk’s lifespan is vastly shortened.
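rough back-of-the-envelope math with completely made-up numbers (an assumed 300 TBW endurance rating and an assumed 1.5 TB of swap writes per day), just to show how the arithmetic works:

```c
/* Illustrative wear arithmetic only; both inputs are hypothetical. */
#include <stdio.h>

int main(void)
{
    double endurance_tbw   = 300.0;   /* assumed drive rating: 300 TB written */
    double swap_tb_per_day = 1.5;     /* assumed heavy-swap workload */

    double days = endurance_tbw / swap_tb_per_day;
    printf("%.0f TBW / %.1f TB per day = about %.0f days (~%.1f years)\n",
           endurance_tbw, swap_tb_per_day, days, days / 365.0);
    return 0;
}
```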
for an example of a VERY stupid, VERY poor implementation of this behavior, look up the apple M1’s rapid SSD degradation.
short summary:
apple only put 8 GB of RAM into the first-gen M1 machines, which made the OS use swap memory almost continuously, which wore out the SSD MUCH faster than expected.
…and since the SSD is soldered onto the mainboard, that effectively bricks the device in about half a year to a year, depending on usage.
TL;DR: you’re categorically and objectively wrong about this. sorry :/
hope you found this explanation helpful tho!
for whom? as a power user, I’d keep affinity photo or photoshop, maya, max, blender and godot/unity open at the same time. I DO NOT WANT PS EATING UP ALL THE RESOURCES. Affinity so far (only 4 months into it) has been a delight.
For everyone?
And any modern OS will allocate the necessary amount of memory to each task.
You speak from the perspective of someone who’s either always had enough RAM, or not enough work to do.
I speak as someone who has used a computer before and paid attention to dynamic memory allocation.
Not using available RAM is only a bad thing when using it could actually offer a performance benefit. Many applications can’t be sped up by using more RAM. Using more RAM for no obvious reason is stupid, especially on a machine that has to do other things at the same time.
I mean, what difference does it make whether it’s needed or not, if it’s not in use?
Bad memory management can actually slow down applications significantly. Allocating memory is a fairly expensive operation, so much so that high-performance software uses a bunch of tricks to avoid extra allocations where possible. Additionally, accessing main memory is kind of slow for a CPU, and the CPU often has to sit around for many clock cycles waiting for data to be retrieved if it isn’t already in the CPU’s cache. If your main data can be stored more compactly, more of it can fit in the CPU’s cache, reducing that idle time.
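As a rough illustration of both points (a made-up example, not from any real codebase), here’s a short C sketch: reuse one allocation instead of calling malloc in a hot loop, and keep the hot data compact so more of it fits in the CPU cache.

```c
/* Illustrative only: avoid per-iteration allocations, prefer compact layout. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define N 1000000

/* "Fat" element: the one field we loop over is buried in padding, so every
 * element drags 128 bytes through the cache for the 8 bytes actually used. */
struct fat { double value; char padding[120]; };

int main(void)
{
    /* 1. Allocate once, reuse: malloc/free have real per-call cost, so
     *    high-performance code keeps them out of hot loops. */
    char *scratch = malloc(4096);
    if (!scratch) return 1;
    for (int frame = 0; frame < 100; frame++) {
        memset(scratch, 0, 4096);            /* same buffer reused every iteration */
        /* ... per-frame work would go here ... */
    }
    free(scratch);

    /* 2. Compact layout: a plain array of doubles is cache-friendly,
     *    the padded struct is not. */
    double *compact = malloc(N * sizeof *compact);
    struct fat *padded = malloc(N * sizeof *padded);
    if (!compact || !padded) return 1;
    for (int i = 0; i < N; i++) { compact[i] = i; padded[i].value = i; }

    double s1 = 0, s2 = 0;
    for (int i = 0; i < N; i++) s1 += compact[i];       /* cache-friendly */
    for (int i = 0; i < N; i++) s2 += padded[i].value;  /* cache-hostile  */

    printf("sums: %.0f %.0f\n", s1, s2);
    free(compact);
    free(padded);
    return 0;
}
```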
Who said anything about “bad” memory management?
Bad memory management includes allocating memory you aren’t actually making use of.
How is that bad?
Try reading my post two comments up, where I explained it.
we all need a little swap here and there, right
Adobe can’t be bothered to fix it; they ended up adding a “Scratch Disk”, a.k.a. their own virtual memory on disk, instead of fixing the problem.