PC Survival and Maintenance Part 3
More horror stories (cont.)..T.O.M.S.
Of all the pessimists' many horror stories about PCs which we considered in Parts 1 and 2, we didn't find one which - when put into perspective - turned out to be anything like as bad as it was painted, so let's see if we can do the same for more of a similar ilk.
'Spurious' hard disc activity
Some PC users get very nervous when, seemingly without reason, the hard disc drive starts thrashing of its own accord, and naturally they wonder if some malign being has taken control of the machine and all sorts of nasties are being downloaded from the net.
A far more likely reason is that Windows is simply responding to some sort of 'legal' command and so is perfectly OK. In many cases, if desired, a simple reconfiguration can reduce this activity in future. So let's have a look at some of the possible reasons for the drive activity; there may well be more, but likely candidates include System Restore quietly creating a restore point, an anti-virus or anti-spyware application downloading its latest 'signature' files, or one of those applications running its scheduled scan.
To summarise this topic, there's rarely any need to be concerned if the hard disc seems to be running for no apparent reason. Windows and/or its applications can often be reconfigured to reduce the activity but, if not (e.g. System Restore), the machine is actually beavering away for your overall benefit. And don't forget that a routine update to one of the anti-nasty applications can re-enable its automatic scan routine, so you may again need to disable the feature.
'Spurious' hard disc content
It follows from the previous topic that all the disc activity - whether controlled or uncommanded - can result in additional data files being stored on your hard disc drive. So the disc contents will gradually expand, perhaps to a surprising degree (gigabytes even!) and may cause you some concern.
Are they batches of virus or spyware files? Highly improbable; they're far more likely to be newly-downloaded 'signature' files, needed specifically to counteract the said viruses and spyware.
After some weeks and months of use, with the System Restore function self-generating restore points on a near-daily basis, the relevant folder starts to become quite sizeable and shows up in the disc's total contents. By default, the folder size will not exceed 12% of the drive's total capacity, and the limit can be reduced by simple reconfiguration if desired.
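As a rough illustration of that 12% default ceiling, the sums are simple enough to sketch (this only restates the percentage figure quoted above; it is not a reproduction of Windows' own internal accounting):

```python
# Sketch: estimate the default System Restore ceiling (12% of capacity).
# The 12% figure is the default quoted above; Windows lets you reduce it.

def restore_point_ceiling_gb(drive_capacity_gb, limit_percent=12):
    """Return the maximum space (in GB) System Restore may use."""
    return drive_capacity_gb * limit_percent / 100

# A 30GB laptop drive, as mentioned later in the article:
print(restore_point_ceiling_gb(30))     # 3.6GB at the 12% default
print(restore_point_ceiling_gb(30, 5))  # 1.5GB if reconfigured to 5%
```

As the second call shows, winding the percentage down is the easy way to stop the restore-point folder dominating a smaller drive.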
With current hard disc drive capacities far exceeding whatever is realistically needed in practice, you're unlikely to hit problems. But if, say, you have an earlier laptop, typically with a 20-40GB hard disc drive, it is possible to start nudging the limits, so some reconfiguration and/or judicious file pruning may be needed.
Hard disc defragmentation
Fragmentation of individual files across a hard disc surface does not happen under RISC OS, and this is undoubtedly a Good Thing.
So naturally, the topic of file defragmentation under Windows is a favourite source of horror stories. Yet again, this must be put into proper perspective and, when looked at in detail, we find most of the reported problems are often self-inflicted.
Firstly, how often do you need to defragment your hard disc drive? This depends very much on its contents and on how much it's used. In general, we find the main drives in our most heavily-used desktop machines need to be defragmented every 12-15 months or so.
The back-up drives have never needed to be done in the 3+ years since installation and, interestingly, the 30GB drive in our daily-use laptop didn't need defragmenting until well past its second birthday.
Even with a drive which is getting to the point of requiring defragmentation, this doesn't normally impact heavily on operation. You may notice a little more drive activity in day-to-day use (e.g. loading or saving a large file which may be fragmented over different parts of the disc) but performance is barely affected, especially if the drive is of the modern SATA variant.
On the basis of all these experiences, we do a brief check only every six months or thereabouts to see if any drive needs to be defragmented. If one does, we set aside an hour or so to do it when it's next convenient. At a push you can carry on working while the defragmenter multitasks, but it's probably best to leave it to get on with the job.
To put all this into context, the last defragmentation we did - a year after the previous one on the most heavily-used disc drive - took 38 mins to complete. In other words, in practice, defragmentation is simply nothing like the very regular and onerous task which the horror story writers would have us believe.
That said, there are a couple of points to watch out for. Firstly, the algorithms which determine whether or not a defragmentation is required are very capable. So if they say it's not required then don't do it!
If you decline to accept their good advice and do it anyway, you will achieve very little in the way of savings but will quite unnecessarily thrash the drive, probably for longer than a routine defragmentation.
Secondly, if the defragmentation is not to take an inordinately long time, there needs to be a minimum of 15% free space on the drive for file 'shuffling'. So a relatively small drive in, say, a laptop may first require some files to be moved onto temporary storage such as a backup drive or data DVD.
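If you want to check that 15% hurdle before starting, it can be scripted in a few lines. This is a minimal sketch using Python's standard library; the 15% threshold is the figure from the paragraph above, and the drive path you pass in is yours to adapt:

```python
import shutil

# Sketch: check whether a drive has the ~15% free space that
# defragmentation needs for file 'shuffling'.

def enough_room_to_defrag(path, min_free_percent=15):
    """Return True if the drive holding 'path' has enough free space."""
    usage = shutil.disk_usage(path)   # (total, used, free) in bytes
    free_percent = usage.free / usage.total * 100
    return free_percent >= min_free_percent

# On a Windows machine you might check the system drive like this:
# print(enough_room_to_defrag("C:\\"))
```

If the check fails, that's the cue to prune files or shift some onto temporary storage first, as described above.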
Font antialiasing

Fact: The RISC OS font manager has the best outline font antialiasing system in the business. Fig 1 is a screenshot of a typical example (intentionally expanded slightly so that the antialiasing pixels are just beginning to show).
Windows does not have integral font antialiasing as such and takes a lot of stick because of it. But that isn't to say that it doesn't have any font smoothing and, with a trivial, one-off reconfiguration, very respectable working results can be achieved.
Let's go through the options. To start with, Fig 2 shows just how grotty Windows screen fonts will appear without any smoothing. Bleghh...
Note the serious 'jaggies' round the headline font and, in particular, just how difficult the Times italic face is to read. (As an aside, it also shows another trap for players: note in Figs 1 and 2 the considerable differences between the headline font character shapes, both of which are (supposedly) Gill Sans ExtraBold, and which will show up on screen and the printed page.)
With Normal smoothing selected (Fig 3), which is the default condition for Windows, the headline font edges are tidied up quite noticeably. But frankly, the body text is little improved and the italicised text is still quite difficult to read. So at this level, arguably, some of the criticism may be understandable, even if not justified.
However, if we select the other option, ClearType, although (oddly) the headline font seemingly loses some of its smoothing, the body text including the italics is greatly improved (Fig 4).
Please accept that what you see in the screenshots in Figs 1-4 is only representative of what you will see on your screen, using your choice of outline fonts, and whether you are looking at a Windows or RISC OS display. Yet in practice, with the Windows ClearType font smoothing selected, if you were to look at the same block of identical text, side-by-side in Windows and RISC OS windows (courtesy of VirtualRPC) subjectively there isn't a great deal of visible difference (compare Fig 1 with Fig 4).
But how to select ClearType font smoothing? This is a one-off configuration change, achieved by selecting Start-Control Panel-ClearType Tuning and ticking the Turn On ClearType box. As you do so, you should immediately see the clarity of the text improve significantly (try toggling between ticked and unticked to preview the effect).
To further 'fine tune' ClearType, start the wizard and follow through the self-explanatory instructions to set your personal preferences. But note that, if you're using an LCD monitor, it must be in its native resolution to do this tuning.
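For reference, the same on/off switch can also be flicked directly in the registry. The following is a sketch of the relevant values as a .reg fragment, based on the standard Windows font-smoothing settings; as ever with registry edits, apply with care:

```
Windows Registry Editor Version 5.00

; Sketch of the font-smoothing settings that ClearType Tuning adjusts.
; FontSmoothing "2" turns smoothing on; FontSmoothingType 2 selects
; ClearType (1 gives the 'Normal' smoothing described above).
[HKEY_CURRENT_USER\Control Panel\Desktop]
"FontSmoothing"="2"
"FontSmoothingType"=dword:00000002
```

A log-off and back on (or a reboot) may be needed before the change takes effect everywhere.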
Reportedly, on some monitors text may appear slightly fuzzy and/or show slight RGB 'colouring' at the edges of monochrome text; this is due to the way the ClearType algorithms work, but it is not intrusive.
Some Windows applications such as Photoshop have true antialiasing built in, improving both text and graphics, while the more powerful graphics cards may offer universal antialiasing, affecting the whole display.
As with antialiasing under RISC OS, all these font smoothing features for Windows affect only what you see on screen (including the 'colouring' phenomenon) and do not affect what you will see on the printed page.
Memory (RAM) requirements

A favourite way to knock a Windows-based machine is to be critical of the amount of memory (RAM) it needs to operate effectively. But when we look into the facts of RAM requirements, they reveal considerable misconceptions which undoubtedly - and very unfairly - lead to misrepresentations when they appear in print or in forums.
Taking Microsoft's own published data, we find that Windows XP for example - as a complete operating system (not just the equivalent 4MB core of RISC OS) - requires a minimum of 64MB RAM, with 128MB recommended. But if that's the case, where on earth do the extraordinary (quote) "minimum" figures of 1GB or more come from?!
In fact, these nonsensical (mis-)quotes have got nothing whatever to do with the Windows operating system's needs, but have everything to do with running the superb but unavoidably memory-hungry application software titles under Windows. They refer to recommended amounts of RAM, not minimum requirements.
To put this into proper perspective, we ought to point out that no RISC OS application software exists which can provide anything like the same productivity as many Windows applications. If it did, RISC OS would of course require, and benefit equally from, these very large doses of RAM.
A typical and very popular example is the wide choice of video editing suites available for Windows, including the built-in Movie Maker. It has to be appreciated that video footage - in whatever format - can result in very large data files; 0.5-1.0GB is by no means unusual.
But to edit video footage, we first need to load each file into memory and, if we are not to rely heavily on virtual memory, the more RAM we have installed, the easier it will be for the machine to cope.
That said, it is perfectly feasible to edit very large video or other files on a PC with 'limited' memory - we've done it, on our laptop (512MB RAM) - but you necessarily have to accept lots of disc-thrashing while virtual memory is accessed and, therefore, an element of reduced performance.
Another example - and of direct relevance to VirtualRPC - is that, if you wish to configure it to provide the maximum 256MB RAM (described in Part 1), the underlying PC will require a minimum of 512MB RAM.
When it comes to the latest iteration of Windows - Vista - and again using Microsoft's quoted data, the minimum RAM for the entry-level Basic version, which is often installed in a typical, off-the-shelf PC, is 128MB. The recommended amount is 512MB and even a modest laptop is very likely to have at least that.
It's only when you come to the other Vista variants - Home Premium, Business and Ultimate - that 1GB RAM is recommended (no minimum requirement is specified) and, as described earlier, this is aimed specifically at supporting the often pre-installed application software's requirements, not the underlying Vista operating system.
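Pulling together the figures quoted in the last few paragraphs, the minimum-versus-recommended distinction can be made explicit with a toy lookup. To be clear, the numbers below only restate what the article quotes from Microsoft's data; this is an illustration, not an official source:

```python
# Sketch: minimum vs recommended RAM (in MB) as quoted in the text.
# None means no minimum is specified for that variant.
RAM_FIGURES_MB = {
    "Windows XP":         {"minimum": 64,   "recommended": 128},
    "Vista Home Basic":   {"minimum": 128,  "recommended": 512},
    "Vista Home Premium": {"minimum": None, "recommended": 1024},
}

def meets_recommended(version, installed_mb):
    """Does a machine meet the recommended RAM for a Windows version?"""
    return installed_mb >= RAM_FIGURES_MB[version]["recommended"]

print(meets_recommended("Windows XP", 512))          # True
print(meets_recommended("Vista Home Premium", 512))  # False
```

The point the article is making falls straight out: a 512MB machine sails past the operating system's own figures, and it's only the application-driven 1GB recommendation that it misses.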
In practice, any PC purchased today will most likely be labelled 'Vista Capable' or 'Vista Premium Ready', depending on the version of Windows installed, and will contain at least the amount of RAM which is recommended for that variant of Vista, i.e. 512MB or 1GB as appropriate.
As proof of this particular pudding, we recently upgraded our 512MB RAM laptop from Windows XP (Home) to Vista Home Premium and recorded some before-and-after performance data on typical, day-to-day tasks such as accounting, desktop publishing, graphic work and colour/photo printing, primarily under RISC OS (set to use 256MB RAM) but also under Windows.
The slow-down due to the reportedly 'memory-hungry' Vista was less than 10% and most Windows and RISC OS applications stormed along, as usual. With a 2.66GHz Pentium 4 processor under the bonnet (which, by today's standards, is quite a modest-sized engine), RISC OS applications were running at many times the speed any current ARM-powered machine can achieve.
It was only when we came to editing a 1+GB video file under Windows that - not surprisingly - the hard disc started thrashing noticeably and performance fell by some 25-30%. It still does the job perfectly well though; it just takes rather longer than under Windows XP.
However, further RAM can be bought relatively cheaply and easily installed to suit the user's purposes. But it's important to ensure that the replacement or additional modules are of the same or a better standard than the existing chip(s) as, otherwise, it is possible to reduce performance rather than enhance it.
*Erratum: Please note that, in Part 2, the screenshots for Figs 1 and 2 were inadvertently transposed.
Coming in Part 4..
We aim to use the fourth and (probably) final part to finish off with a few more areas where Windows is quite unfairly criticised and where simple configuration changes can make a big difference. We'll deal with your feedback and, based on the various topics we've discussed, put together a 'checklist' of suggested actions for setting up a Windows-based PC as a reliable and - above all - safe foundation for running VirtualRPC.