In my first post about Ubuntu 7.10 (Gutsy Gibbon), I described my efforts to set up a test installation inside VirtualBox, hosted on my old-but-stable Ubuntu 6.06 system. By accident, that post went under Here and There instead of Computers ( http://www.frihost.com/users/SonLight/blog/vp-82999.html ).
Unfortunately, I ran into some snags. I only have 512 MB of RAM, which isn't much for running virtual machines. Nevertheless, I expected the install to succeed, because I used the alternate install CD -- actually a virtual image, not a real CD -- which claims to require only 32 MB of memory for an install. I first tried giving the VM 192 MB, and it bombed. I increased the memory to 320 MB. That seemed like it might work, but it became apparent that memory was overflowing: data was being swapped back and forth between disk and RAM without any real work getting done.
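One way to confirm that swapping, rather than slow CPU or disk work, is what's eating the machine is to watch the kernel's swap counters while the VM runs. This is a rough sketch for a Linux host like mine; vmstat comes with the procps package that Ubuntu installs by default.

```shell
# Sample memory/swap activity every 2 seconds, 3 samples.
# Sustained nonzero values in the "si" (swap-in) and "so"
# (swap-out) columns mean the system is thrashing.
vmstat 2 3

# The raw counters behind those columns can also be read
# directly from the kernel:
grep -E '^pswp(in|out)' /proc/vmstat
```

If the pswpin/pswpout numbers climb steadily between readings while the install makes no visible progress, the guest has been given more memory than the host can actually spare.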
I might try running the install again by itself, but the idea that I could install while still working on the computer was one of the features that encouraged me to try this in the first place. More memory would probably do the trick, but how much is needed? Can today's operating systems really schedule programs well when memory is limited? I remember operating systems that would run only one program at a time if they decided the program's "working set" was too large to share, and would simply "thrash" uselessly if no runnable program could fit in available memory. Perhaps the processes in a modern OS are too interconnected to take that approach today.