A console will probably stream video to a TV that, I assume, will also be on. A TV draws somewhere in the range of 50-300 W (depending on screen size, age, and technology). At the low end of that range, the power used to stream video becomes significant; at the high end, it becomes almost irrelevant.
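A quick back-of-envelope in Python, using the ~89 W PS4 streaming draw cited in a reply below (treat both figures as illustrative):

    console_w = 89.0  # assumed PS4 draw while streaming, from the reply below
    for tv_w in (50.0, 300.0):
        extra = 100 * console_w / tv_w
        print(f"TV at {tv_w:.0f} W: console adds {extra:.0f}% on top")
    # TV at 50 W:  console adds 178% on top
    # TV at 300 W: console adds 30% on top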
That doesn't make sense. The watt already defines a rate,
i.e. joules per second. Perhaps what you meant was energy over a
fixed time, such as the kWh (or kW·h for the fussy), in which case
a PS4 drawing 89 W for one hour would have used 0.089 kWh.
If you're describing a device's continuous usage where the time period
is irrelevant, then there's no need to mention 'per hour', as it's the same
no matter how long the device is connected; e.g. just say a PS4 in
standby draws 8.4 W.
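To make the units concrete, a minimal sketch of the conversion (the 89 W and 8.4 W figures are the ones above; the helper function is just for illustration):

    def kwh(watts, hours):
        # Energy = power * time; watts are joules per second,
        # so a constant draw in W times hours gives watt-hours.
        return watts * hours / 1000.0

    print(kwh(89, 1))          # 89 W for one hour      -> 0.089 kWh
    print(kwh(8.4, 24 * 365))  # 8.4 W standby all year -> ~73.6 kWh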
If you read the article, a PS4 will use 181 kWh per year. For me, at 11 cents per kWh, that's less than $2 a month, and not really a big part of my electric bill. If one really wants to save energy, maybe we could focus on something significant.
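The arithmetic, spelled out (181 kWh/year is the article's figure; 11 cents/kWh is my rate, yours will differ):

    annual_kwh = 181                   # article's yearly PS4 consumption
    rate = 0.11                        # $ per kWh
    annual_cost = annual_kwh * rate    # $19.91 per year
    print(annual_cost / 12)            # ~$1.66 per month, i.e. under $2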