Luminaires (or “lamps” to the rest of us) are looking a lot like pixels lately, offering ever larger, tunable patches of the CIE color space. This raises the question: will display pixels ever be suitable for general illumination? Discussion of excess blue light from these sleep slayers is already widespread, and the rise of micro-LEDs, with their potential to knock LCDs and OLEDs off the thrones they presently occupy, suggests combining forces and making a wall display the ultimate tunable white lamp. At least when no one is watching “My 600 Pound Life!”
But not so fast. The natural-light-advocating Human Centric Lighting groups (do plants and pets also have an advocacy group?) are inclined to use CRI as one measure of suitability, along with a circadian shift in color and brightness that tracks local sunlight.
In reviewing the drive for efficient green LEDs to enable 100+ lumen-per-Watt RGB sources, the drawbacks of phosphors (8–20 µm particles on a 3 µm micro-LED junction) seem to direct us to narrow-band alternatives such as quantum dots and wells. Following that path, we see a different set of challenges: heavy metals, low efficiency, and severe current-density problems. David Wyatt of Pixel Display describes these issues well here. The primary task of a display is to send three separate images to our red, green, and blue cone cells with minimal cross talk, not to make the pasta on our TV tray look more appetizing.
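To see why the green emitter dominates the efficacy budget of an RGB source, a back-of-envelope sketch helps: the eye’s photopic sensitivity V(λ) peaks near 555 nm, so a green line carries several times more lumens per radiated Watt than red or blue lines. The wavelengths below (465/530/630 nm) and the Gaussian fit to V(λ) are my illustrative assumptions, not figures from the article.

```python
import math

def photopic(lam_nm):
    """Gaussian approximation to the CIE photopic luminosity function V(lambda).
    This is a common closed-form fit, not the tabulated CIE data."""
    um = lam_nm / 1000.0
    return 1.019 * math.exp(-285.4 * (um - 0.559) ** 2)

def mono_efficacy(lam_nm):
    """Luminous efficacy (lm per radiated W) of an ideal monochromatic source."""
    return 683.0 * photopic(lam_nm)

# Representative (assumed) RGB emission lines for a micro-LED display
for name, lam in [("blue 465 nm", 465), ("green 530 nm", 530), ("red 630 nm", 630)]:
    print(f"{name}: {mono_efficacy(lam):.0f} lm/W")
```

The green line lands in the hundreds of lm/W while blue sits far below, which is why a lossy green channel (whether from the “green gap” or from phosphor conversion) drags the whole white point’s efficacy down.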
Those of us attempting to recreate a natural hole in the ceiling with LEDs dream of tunable white, but with true blue skies and moving clouds that actually follow the present wind direction overhead, and twinkling local stars and planets at night. Sure, an HDTV can do this, but until the brightness rivals that of a real sky, the color matches its roughly 6500 K daylight white, the efficiency reaches 120 lumens per Watt, and the output is Planckian across the entire visible spectrum, we’ll keep making our own sky/star pixels, and will probably find watching clouds and stars more interesting than what’s on TV.
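That 120 lm/W target is worth putting next to the physics: a truly Planckian 6500 K source radiates most of its power outside the visible band, so its luminous efficacy of radiation is limited to roughly 90–100 lm/W before any electrical losses. The sketch below estimates this with Planck’s law and a Gaussian fit to the photopic curve; the integration limits and fit are my assumptions, not the author’s analysis.

```python
import math

H, C, K = 6.62607e-34, 2.99792e8, 1.38065e-23  # Planck, speed of light, Boltzmann

def planck(lam, T):
    """Blackbody spectral radiance; overall scale cancels in the ratio below."""
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * K * T)) - 1.0)

def photopic(lam):
    """Gaussian approximation to the CIE photopic curve; lam in metres."""
    um = lam * 1e6
    return 1.019 * math.exp(-285.4 * (um - 0.559) ** 2)

def blackbody_efficacy(T, lo=100e-9, hi=10e-6, n=20000):
    """Luminous efficacy of radiation (lm/W) of a Planckian source at T kelvin,
    by midpoint integration over essentially all of the emitted spectrum."""
    dlam = (hi - lo) / n
    lumens = watts = 0.0
    for i in range(n):
        lam = lo + (i + 0.5) * dlam
        b = planck(lam, T)
        lumens += photopic(lam) * b
        watts += b
    return 683.0 * lumens / watts

print(f"6500 K Planckian: {blackbody_efficacy(6500):.0f} lm per radiated W")
```

In other words, a lamp that is both fully Planckian and 120 lm/W at the wall is asking for more than an ideal 6500 K radiator can deliver; practical sources get there only by trimming the non-visible tails of the spectrum.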