Well, to be honest, I don't think anyone actually makes a good TV set anymore. The colours on most (all?) modern TV sets are deliberately over-saturated to varying degrees to appeal to the masses; on many, the brightness is essentially fixed (moving the control labelled "brightness" either artificially darkens the picture or washes it out into white); and they don't seem to offer any functionality you can't get with a TV tuner for your PC. (I don't have one of my own yet, but I have considered getting one in the future, provided Linux drivers are available for it.) Let's not forget that they either omit headphone outputs or, if they do have them, use low-quality circuitry with high noise; any half-decent PC audio codec will put them to shame. They do have some advantage in convenience, but to me that's minor compared to their current drawbacks.
Of course, we can't forget that much of the programming on TV has also gone downhill — although some sensible stuff remains. But that's not the main topic here.
(Protip: most new desktop PC monitors ship with the contrast control set to 70 by default; again, this is about marketability and nothing else. At least you can still reduce it to 50 to get a realistic image, and while you're at it, adjust the brightness to a comfortable level, since the mid-point is generally uncomfortably bright. You do have to choose wisely to get a good one, but aside from one dead green subpixel, I'm pretty satisfied with the image on my current Dell monitor after those initial adjustments. What I would really like from a monitor is higher pixel density, say a pixel pitch of 0.1mm, but the industry is still being held back by the yet-to-end stream of software that was never written with graphical scalability in mind.)
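Incidentally, if you're on Linux and your monitor speaks DDC/CI, you don't even have to fumble through the on-screen menu to make those adjustments. Here's a minimal sketch of doing it in Python, assuming the ddcutil tool is installed and the panel implements the standard MCCS feature codes (0x12 for contrast, 0x10 for brightness); the target values below are just examples matching my own preferences, not anything authoritative:

    # Sketch only: assumes ddcutil is installed and the monitor supports DDC/CI.
    import subprocess

    def set_vcp(feature: str, value: int) -> None:
        """Write one VCP feature on the first detected display via ddcutil."""
        subprocess.run(["ddcutil", "setvcp", feature, str(value)], check=True)

    set_vcp("12", 50)   # contrast: down from the factory 70 to a realistic 50
    set_vcp("10", 25)   # brightness: pick whatever suits your room lighting

Running "ddcutil getvcp 12" afterwards should read the value back if you want to confirm the panel actually applied it.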