It depends on the codec - VC-1 is much less demanding to decode than H.264, for example, and MPEG-2 is less demanding than VC-1.
It depends on the bitrate - a higher bitrate requires more CPU power. There's a world of difference between typical 1080p rips found on the net (5-10 Mbps) and full Blu-ray content (20-30 Mbps).
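If you want a rough idea of where your own file sits, you can estimate its average bitrate from file size and runtime. This is just a back-of-the-envelope sketch; the sizes and the 120-minute runtime below are made-up illustrative numbers, not measurements:

```python
# Rough average-bitrate estimate: total bits divided by duration in seconds.
# The file sizes and runtime are hypothetical examples.
def avg_bitrate_mbps(size_gib: float, duration_min: float) -> float:
    bits = size_gib * 1024**3 * 8          # file size in bits
    return bits / (duration_min * 60) / 1e6  # megabits per second

print(avg_bitrate_mbps(4.4, 120))    # ~5.2 Mbps  - typical "net rip" territory
print(avg_bitrate_mbps(22.0, 120))   # ~26 Mbps   - typical Blu-ray main movie
```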
If H.264 is used, it also depends on the encode settings, i.e. the profile and the precise features used. H.264 is very scalable: "baseline" profile content optimized for fast decoding can need only a fraction of the CPU power that "high" profile content with every H.264 feature enabled requires.
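If you want to see which profile and level a particular file actually uses, ffprobe (which ships with FFmpeg) can report it. Here is a minimal sketch that wraps it in Python; "movie.mkv" is a placeholder file name:

```python
# Query codec, profile, level and declared bitrate of the first video stream
# using ffprobe. "movie.mkv" is a placeholder for your own file.
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,profile,level,bit_rate",
     "-of", "default=noprint_wrappers=1", "movie.mkv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
# Output like "codec_name=h264 / profile=High / level=41" means High@L4.1,
# which is noticeably heavier to decode than Baseline profile content.
```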
Of course, it also depends on the efficiency of the player/decoder, on whether postprocessing (for example deinterlacing) is needed, and on how the video output device/driver performs.
So, if you ever see a claim like "my $SLOW_CPU plays 1080p just fine!", take it with a grain of salt.