24/09/2013

Back from IBC - Cutting Edge Sessions 1 & 2

It was on almost everyone's lips during IBC this year: you have to improve the immersive experience for the viewers. Not only does the story or TV program have to be original, addictive and, if possible, an endless money generator; no, it also has to be more immersive.

But what is an immersive experience? Different people, different interpretations. The idea is to offer a spectator - whether at the cinema or in front of their TV - an experience that is as realistic as possible; the viewer should forget they are watching a display (e.g. watching a football game from your sofa and feeling that you are in the stadium). The question then becomes: how do we improve the display to make it more immersive? That is what the conference speakers (#ibcconf, hosted during the show #ibcshow in early September in Amsterdam) were trying to answer in their presentations, and in the "Cutting Edge 1 & 2" sessions in particular, new technologies were discussed. I took part in session 2, and we were all presenting what is coming and which technology will be predominant in the near future - hopefully the one we propose will be it.

Starting from a rectangular display with a fixed resolution (e.g. HD), a fixed frame rate (e.g. 25fps) and a fixed color dynamic range (e.g. each color channel coded on 8 bits), what do we do? How do we introduce new "things" without completely breaking the existing workflow (from image acquisition, through encoding and compression, to distribution and display)?

Increasing the resolution to 4K was written everywhere this year: roughly four times the pixel count of HD (which is what Blu-ray offers; DVD is much smaller). But so far there is no real content and no affordable consumer product for the home. Professional movie cameras (or the latest GoPro) can shoot at this resolution, but only movie theaters could really follow on that (I don't remember the official/recommended resolution for film distribution). Compared with a few years ago, the switch to fully digital projection is a reality in many movie theaters, and even though the quality of digital is said to be lower than film, two things are interesting: an average viewer can hardly see/perceive a difference in quality above 2K (so do we need to go that high?), and depending on where you sit in a theater you will perceive a different effective resolution. In case you don't know this already, the human eye filters the signal it receives; the details perceived in an image depend on the viewing distance (this is used in many image compression algorithms, where you remove what would not be seen anyway).
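To make that viewing-distance argument a bit more concrete, here is a small back-of-the-envelope sketch (my own illustration, not something shown at IBC; the screen width and viewing distance are just assumed example values). It estimates how many pixels fall within one degree of visual angle at the centre of the screen, a number you can compare against the roughly 60 pixels per degree that normal visual acuity (about one arcminute per detail) can resolve:

```python
import math

def pixels_per_degree(h_resolution, screen_width_m, viewing_distance_m):
    """Horizontal pixels falling within one degree of visual angle
    at the centre of the screen (small-angle region)."""
    pixel_width = screen_width_m / h_resolution
    # Angle subtended by a single pixel, in degrees.
    pixel_angle = math.degrees(2 * math.atan(pixel_width / (2 * viewing_distance_m)))
    return 1.0 / pixel_angle

# Assumed example: a 1.2 m wide screen viewed from 2.5 m (a typical living room).
for name, h_res in [("HD (1920)", 1920), ("4K (3840)", 3840)]:
    ppd = pixels_per_degree(h_res, screen_width_m=1.2, viewing_distance_m=2.5)
    print(f"{name}: {ppd:.0f} pixels per degree")
```

With these example numbers HD already lands around 70 pixels per degree, above what the eye resolves; 4K only pays off when you sit closer or the screen gets much bigger.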

Researchers at the BBC were showing their latest work on frame rates: what frame rate is achievable/needed in order to increase viewing comfort. They said that above 100fps it gets better; to let you appreciate the difference, they were showing the same sequence (athletes doing the high jump) at a "normal" frame rate and at a higher frame rate, side by side.

About HDR, the display market is almost reduced to one company (Dolby). The technique for capturing HDR (high dynamic range) images is known: basically you combine several pictures taken at different exposures. One can see the issues when you want to record a movie and not a single image. Researchers at Fraunhofer IIS proposed a very neat solution in which sensor pixels with different sensitivities, pseudo-randomly distributed across the sensor, are used; by exploiting sparsity they can obtain an HDR image in a single shot. A brilliant friend suggested the following to me: "but why don't they read the sensor continuously to get values at different exposures and build the HDR data on the fly?" - hmm, why not indeed? And HDR must not be confused with the RAW file format, even though data in that format may be available at more than 8 bits per channel.
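For the classical multi-exposure approach, the merging step itself is simple once the shots are aligned and converted to linear light. Here is a minimal sketch of such a merge (my own simplification, assuming a linear sensor response and a static scene; load_linear is a hypothetical helper, not a real library call): each shot is divided by its exposure time to estimate scene radiance, and a hat-shaped weight down-weights clipped shadows and highlights.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge a bracketed exposure stack into one HDR radiance map.

    images: list of float arrays in [0, 1] (same shape), assumed linear;
    exposure_times: corresponding shutter times in seconds.
    """
    acc = np.zeros_like(images[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, t in zip(images, exposure_times):
        # Hat weighting: trust mid-tones, down-weight clipped pixels.
        w = 1.0 - np.abs(2.0 * img - 1.0)
        acc += w * (img / t)          # radiance estimate from this shot
        weight_sum += w
    return acc / np.maximum(weight_sum, 1e-6)

# Usage (hypothetical helper): three shots of a static scene at -2 / 0 / +2 EV.
# imgs = [load_linear(p) for p in ("ev_minus2.png", "ev_0.png", "ev_plus2.png")]
# hdr = merge_exposures(imgs, exposure_times=[1/500, 1/125, 1/30])
```

The single-shot sensor sidesteps exactly the weakness of this scheme: anything that moves between the bracketed shots produces ghosts.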

And then there is our approach, which consists in physically surrounding the viewers: the screen is curved (e.g. a cylinder, a digital dome or a spherical display). To do so we combine several projectors to cover the whole curved surface with images. This year we presented a case study where 3D stereoscopic content was streamed to a digital dome. The installation we described uses two fisheye cameras and a dome (about 2 m in diameter) tilted at 90 degrees so that, while standing, you look straight at the middle of the curved screen (the field of view offered is wider than your natural field of view, so you should feel immersed). 3D and immersive experiences go together, and research is actively being done to make the 3D experience more seamless for the user: no glasses to wear, auto-stereoscopy, multi-view displays. Here as well our colleagues at Fraunhofer HHI and IIS have a strong presence. The matter of 3D on an immersive display (or surround cinema) is tricky - the whole 3D thing is tricky, which makes it very interesting - because looking with one eye in real life doesn't make you see a flat image; you still perceive depth. And coming back to immersive displays/surround cinema: the way you generate 3D images has to take into account where the viewer may be looking, and the freedom you are offering (the possibility to look in every direction) is a challenge for generating a proper stereoscopic effect. And I probably lost 98% of my friends after these lines.
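Our dome pipeline itself is not detailed here, but to give an idea of the geometry involved, here is a rough sketch (an illustration on my part, not the system we presented) of the equidistant fisheye model commonly used for such lenses: the distance of a pixel from the image centre is proportional to its angle away from the optical axis, so every pixel maps to a viewing direction that can then be re-projected onto the dome surface.

```python
import numpy as np

def fisheye_to_direction(u, v, width, height, fov_deg=180.0):
    """Map a pixel (u, v) of an equidistant ("f-theta") fisheye image to a
    unit viewing direction; the optical axis is +z and the image circle is
    assumed to fill the frame."""
    # Normalised coordinates in [-1, 1], centred on the optical axis.
    x = 2.0 * (u + 0.5) / width - 1.0
    y = 2.0 * (v + 0.5) / height - 1.0
    r = np.hypot(x, y)
    if r > 1.0:
        return None                         # outside the image circle
    theta = r * np.radians(fov_deg) / 2.0   # equidistant model: angle proportional to radius
    phi = np.arctan2(y, x)
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])
```

For stereoscopy on a dome you would do this once per eye, with the two (real or virtual) cameras offset, which is exactly where the "where is the viewer looking" problem bites.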

All these large, curved, multi-projector blended displays need to be accurately color calibrated on top of their geometric calibration (we also do that at Fraunhofer FOKUS, in our department VISCOM). The viewers need to have the feeling that they are looking at a single light source. The more you forget about the technique during the show you are watching/attending, the more immersive the experience is.
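As a small illustration of the blending side of this (a generic sketch, not our FOKUS calibration code): in the band where two projectors overlap, each projector is attenuated by a smooth ramp designed in linear light, so that the summed light output stays constant across the seam; the gamma term converts the ramp back into the code values the projector actually expects.

```python
import numpy as np

def blend_ramp(overlap_px, gamma=2.2):
    """Per-pixel attenuation for the overlap band shared by two projectors.

    Each projector fades out across the overlap so the total projected light
    stays constant; the ramp is designed in linear light, hence the gamma."""
    t = np.linspace(0.0, 1.0, overlap_px)            # 0 = start of overlap, 1 = end
    linear_weight = 0.5 * (1.0 + np.cos(np.pi * t))  # smooth fall-off for the left projector
    return linear_weight ** (1.0 / gamma)            # back to gamma-encoded code values

left = blend_ramp(200)   # multiply the left projector's overlap columns by this
right = left[::-1]       # the right projector gets the mirrored ramp (they sum to 1 in linear light)
```

The color side is harder: each projector has its own gamut and white point, so on top of the ramps you typically need a per-projector color transform measured with a colorimeter.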

Last but not least, sound is of course part of the game. You can't talk about surround cinema without surround sound. It wasn't mentioned much in the sessions, but image and sound work together - beautiful, isn't it? That is not my specialty, though; mine is color and image processing (and more), so I leave the sound to my specialist colleagues.
