The eye (and the brain, which is part of the picture) is analog; you may be able to perceive something you can't see under close examination. CRT flicker is more noticeable in your peripheral vision than in your direct vision. Your eye does not "capture" at 25 Hz: it has a response time, some persistence in light detection, and the brain also applies its own interpretation. You can detect a flash of 1/4000 of a second, but you would not be able to see a light that is oscillating on and off at 1/60th of a second.
LED lights demonstrate this. Look at the light directly and it is 100% solid and unwavering; scan your eyes to the left or right, and you will see a trail of discrete images as you "capture" the light being on and off in different parts of your field of view.
I agree that there may be *some* aliasing between the local power source (as applied to lighting) and CRTs, but almost all domestic lighting has some form of persistence: phosphors in fluorescent tubes and CFLs, and in the case of tungsten lighting, a filament that doesn't cool down enough before it heats up again on the next AC cycle.
Also bear in mind that the VESA "flicker-free" move from 60 to 72 Hz applied just as much in Australia (50 Hz mains) as it did in North America (60 Hz mains).
The aliasing you are talking about when filming is a related, but more or less distinct, effect. Each frame captures light for a finite time, and the CRT monitors on the set will scan out a fixed number of scanlines in that period. As a result the captured image will never contain full frames; this, coupled with phosphor decay, results in the historically familiar dark horizontal bands on TVs in movies. With LCDs, this simplifies down to tearing on screen. Framelock and genlock provide a solution, by ensuring that all displays on a set are synchronized to a common clock signal that is in sync with the camera.
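To put rough numbers on that: the dark band rolls because each camera frame catches the CRT raster at a slightly different phase. A minimal sketch (the helper name and the exact model are my own, assuming an idealized shutter and a constant refresh rate):

```python
# Sketch: estimate how fast the dark band drifts when a CRT is filmed
# without genlock. Idealized model -- real shutters and phosphor decay
# complicate the picture, but the beat-frequency idea is the same.
def band_drift_hz(crt_refresh_hz: float, camera_fps: float) -> float:
    """Fractional CRT cycles slipped per camera frame, expressed as a
    drift rate in Hz. 0.0 means the two clocks are locked (genlocked)."""
    cycles_per_frame = crt_refresh_hz / camera_fps
    frac = cycles_per_frame % 1.0
    frac = min(frac, 1.0 - frac)  # the band can appear to roll either way
    return frac * camera_fps

# 60 Hz CRT filmed at 24 fps: half a refresh of slip per frame -> 12 Hz roll.
print(band_drift_hz(60.0, 24.0))  # 12.0
# 50 Hz PAL CRT filmed at 25 fps: exactly 2 refreshes per frame -> no roll.
print(band_drift_hz(50.0, 25.0))  # 0.0
```

The second case is effectively what framelock/genlock buys you: by forcing an integer ratio with zero phase slip, the band stays put (or vanishes entirely if the phase is also aligned with the shutter).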