Traditionally, the marketing of gadgets has used quasi-technical terms, such as gigahertz and terabytes, to provide easily comparable indicators of product quality. The technical specialists who originally conceived these parameters, and who used them in a very specific context, were often unhappy with this tendency. While the chip frequency in GHz simply denotes the clock rate used to synchronize specific units in some limited area of the CPU, and is therefore only non-trivially correlated with overall performance, marketing speak has often used it as the one and only indicator of device performance. Ordinary consumers were satisfied, because it was an easy concept to grasp. Engineers were unhappy and pointed out that different CPU instructions require different numbers of cycles to complete, that different parts of the CPU are clocked at different frequencies, and that there are plenty of use cases where CPU performance is irrelevant to the overall performance… But nobody cared.
Apple has changed that a little by using qualitative and emotional parameters in its marketing speak. Perhaps this is part of the answer to the question of why so many good software developers (at Google, Facebook, Yandex, etc.) choose Apple devices. Maybe it is because Apple's marketing doesn't piss them off? That simple.
Unfortunately, other manufacturers are not ready to switch, and are still using quasi-technical terms to describe their devices. They have a problem though: Intel and AMD have stopped increasing CPU clock frequencies. The new CPUs do have a new advantage, though: the number of cores. Again, technical specialists know very well that the number of cores is only non-trivially correlated with overall performance. If anything, having more cores is better for system responsiveness than for system performance. And again, nobody cares how the engineers feel, and the number of cores is starting to be used as a substitute for performance.
This marketing speak looks especially grotesque in the TV set area. In the last couple of months I have seen at least two large TV manufacturers announce dual-core Smart TV sets, implying they would be quicker than the usual TV sets.
It is grotesque because a typical modern TV set already has seven to eight cores:
- An all-purpose MIPS or ARM core to execute the application performing the traditional TV tasks (tuning to a channel, switching between inputs, controlling the display, changing the volume, reacting to the remote control, displaying the user interface, and so on).
- Another all-purpose MIPS or ARM core to implement the Smart-TV functions: the TCP/IP stack, the UPnP/DLNA stack, a WebKit browser, a media pipeline for Internet formats, the PVR function, and so on.
- A dedicated MPEG-1, 2, 4, and AVC decoder.
- Sometimes, an additional dedicated MPEG-2-only decoder. It might not be strictly needed, but can be a leftover from previous hardware platforms that supported only MPEG-1 and 2.
- A dedicated AAC/AC3/DD decoder.
- A dedicated transport stream engine with real-time demuxing, filtering, and PSI processing of several TS streams in parallel.
- A slave MIPS or ARM core implementing graphics acceleration, typically some kind of OpenGL ES. This is needed to efficiently blend the TV user interface, the web-page shown by WebKit, and the running full HD video together.
- A slave MIPS or ARM core implementing real-time full HD video enhancements, including complicated de-interlacing algorithms, high-quality (bicubic?) down- and up-scaling, noise reduction, adaptive dynamic contrast, color and sharpness correction, and, very important on large screens, motion estimation and compensation (including detecting and fixing the 3:2 pull-down).
There is also a stand-by controller, which is always on even when the other cores are powered off, and is responsible for interfacing with the IR remote control and for the boot sequence. If you count it as a core, there are nine different cores. And I'm not even counting the FPGAs or other interfacing logic needed to handle CI+ modules.
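The inventory above can be sketched as a simple data structure. This is purely illustrative: the names are mine, not real part numbers or vendor designations.

```python
# A rough inventory of the cores in a typical modern TV set, as
# described above. All names and groupings are illustrative.
tv_cores = [
    ("main CPU (MIPS/ARM)",     "traditional TV tasks: tuning, inputs, UI"),
    ("Smart-TV CPU (MIPS/ARM)", "TCP/IP, UPnP/DLNA, WebKit, media pipeline, PVR"),
    ("video decoder",           "MPEG-1/2/4 and AVC decoding"),
    ("legacy decoder",          "optional MPEG-2-only decoder"),
    ("audio decoder",           "AAC/AC3/DD decoding"),
    ("TS engine",               "real-time demuxing/filtering/PSI of TS streams"),
    ("GPU core (MIPS/ARM)",     "OpenGL ES blending of UI, web page, and video"),
    ("video enhancer (MIPS/ARM)", "de-interlacing, scaling, noise reduction, MEMC"),
    ("stand-by controller",     "always on: IR remote and boot sequence"),
]

# Counting the stand-by controller as a core gives nine in total.
print(len(tv_cores))
```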
Now, when a TV manufacturer announces a dual-core TV set, what exactly does that mean? If it means that their TV user interface is no longer displayed by the same core that renders HTML pages, this is hardly an innovation and is more or less a virtual marketing bubble. If it means that their WebKit or Opera has two cores available for rendering, that is a different story. On the other hand, it is hard to believe that those two cores would be dedicated to WebKit alone; other tasks, like the Internet media pipeline or the PVR function, would typically reuse them as well. It is therefore hard to compare the performance and responsiveness of a "single-core TV set" (whatever that means) with a "dual-core TV set" (whatever that means).
I believe TV manufacturers should follow Apple's example and talk about qualitative parameters instead. What percentage of users perceive a significant lag between action and reaction when using Smart TV functions? A simple, beautiful scalar quality parameter.
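The proposed metric is easy to compute. A minimal sketch, assuming a hypothetical lag threshold and made-up measurements from a usability test (one reaction time per user):

```python
# A minimal sketch of the proposed quality metric: the percentage of
# users who perceive a significant lag between action and reaction.
# The threshold and the sample data are hypothetical assumptions.

PERCEPTIBLE_LAG_MS = 200  # assumed threshold for a "significant" lag

def lag_percentage(reaction_times_ms):
    """Return the share of measurements above the lag threshold, in percent."""
    laggy = sum(1 for t in reaction_times_ms if t > PERCEPTIBLE_LAG_MS)
    return 100.0 * laggy / len(reaction_times_ms)

# Hypothetical per-user reaction times, in milliseconds:
samples = [120, 340, 90, 510, 180, 250, 60, 400, 150, 220]
print(f"{lag_percentage(samples):.0f}% of users perceive a significant lag")
```

The point of the sketch is that the result is a single scalar a buyer can compare across TV sets, unlike a core count whose meaning varies from vendor to vendor.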