Remember the very first Mac back in 1984? I suspect many of you weren’t around then, or were too young to notice or care, but it had a singular flaw that received its share of complaints. You couldn’t upgrade the RAM. It was meant to be a computing appliance, and you would no more open the Mac’s case to upgrade anything than you would open your refrigerator to swap out the compressor for a more powerful one.
Later Macs could be upgraded, though that clearly wasn’t Steve Jobs’s vision, and it didn’t mean that RAM was easy or quick to replace. Several Mac minitowers in the 1990s forced you to remove the logic board and disconnect some flimsy cable assemblies just to reach the RAM slots. When the first iMac arrived, the computer that signaled Apple’s resurrection, you had to pull out the entire electronic assembly to get at the RAM. Why did they design those things this way?
You could easily get the impression that Apple was hostile to people who wanted to upgrade RAM. Rather than let you go to an outside supplier to save money, they simply made the process impossible. Even when you could replace RAM, the job remained difficult, with the original Mac mini being the worst offender. Some companies even produced specially shaped putty knives to ease the process of prying open its delicate case, and forget about replacing the hard drive in any convenient fashion.