Browsing through the television or mobile device sections of a big box store today might give you the impression that 4k has landed. While it has in some respects, there are many more pieces of the 4k puzzle that are years away from falling into place. First of all, what is 4k? Simply put, 4k, formally known as Ultra High Definition (UHD), is a screen resolution composed of roughly 4000 horizontal pixels (4096x2160 for digital cinema and 3840x2160 for television). It's essentially 4 times the resolution of 1920x1080, the high end of HD resolutions. Oddly enough, all of the formats that came before it were known by their vertical resolutions (480, 720, 1080), while the UHD generation is known by its horizontal resolution, which can make things harder to keep straight. In addition to the larger frame size, there are also improvements in color depth, color gamut, and other areas that improve the viewing experience.

In an age where more is better, 4k seems like it should explode onto the scene. However, there are quite a few reasons we won't see it take hold overnight. For starters, it's big, really big. It's essentially four times the size of the content broadcasters are making and delivering today. This means that most of the existing infrastructure in broadcast facilities will have to be upgraded or replaced, and infrastructure that can be retained will see its capacity reduced by a factor of four. Given that many broadcasters feel they just completed the transition from Standard Definition to High Definition, most will be unable to jump quickly into UHD. Some will adopt it in stages, starting with cameras and edit systems, because there are many advantages to beginning to build a 4k asset library - but that content will be scaled down to HD resolutions for broadcast. So traditional broadcasters may not jump into the fray right away, but what about Over the Top (OTT) providers like Netflix?
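As a quick aside, the pixel arithmetic behind that "4 times" claim is easy to check. Here's a minimal sketch using the resolutions quoted above:

```python
def pixels(width, height):
    """Total pixel count of a frame."""
    return width * height

hd = pixels(1920, 1080)    # 2,073,600 pixels
uhd = pixels(3840, 2160)   # 8,294,400 pixels (UHD television)
dci = pixels(4096, 2160)   # 8,847,360 pixels (digital cinema)

print(uhd / hd)            # 4.0 -- UHD is exactly four 1080p frames
print(round(dci / hd, 2))  # 4.27 -- the cinema flavor is slightly larger still
```

So every UHD frame carries the payload of four full 1080p frames, which is why the rest of the pipeline feels the squeeze.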
After all, Netflix is already delivering its hit original series, House of Cards, in 4k, right? While this is true, the number of people capable of viewing the series in 4k likely numbers only in the hundreds as of this writing. There are a few reasons for this. First of all, as previously mentioned, the 4k footprint is big. Using the standard compression formats in use today (typically h.264 for video), a 4k video stream could require upwards of 25 Mbps, which can be 2 to 3 times the bitrate delivered at the high end today. This would choke most Internet Service Providers (ISPs) and ensure that very few viewers could tune in at that resolution. And that doesn't even take into account the delivery costs inherent in pushing all that data around. For this reason, Netflix has wisely chosen to encode and deliver 4k content using h.265, a more efficient compression spec that can decrease the required bandwidth by 30 to 50% compared to h.264. Therefore, to watch House of Cards in 4k, you need a television with 4k resolution that is capable of decoding h.265 and running the Netflix application natively. The problem is, there are few televisions on the market that are able to do this.

I should note, there are others hacking at this issue from different angles. For example, Beamr is a technology that claims to filter media in a way that allows 4k content to be encoded in h.264 at bandwidths equivalent to content encoded in h.265, while maintaining the same perceived level of quality. They claim to do this by filtering out information that human vision cannot perceive during the encoding process. It has promise, but this type of approach is still on the fringe, and it remains to be seen whether it will be implemented broadly or whether the industry will skip stopgap measures like this in favor of pushing forward with h.265. If solutions of this nature are adopted in the near term, they may help speed up 4k's arrival.
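To put rough numbers on that savings claim: taking the 25 Mbps h.264 figure and the quoted 30 to 50% efficiency gain at face value (this is illustrative arithmetic, not a codec benchmark), the h.265 bitrate lands in a range like this:

```python
H264_4K_MBPS = 25.0  # rough h.264 4k bitrate quoted above

def h265_mbps(h264_mbps, savings):
    """Estimated h.265 bitrate for a given fractional savings over h.264."""
    return h264_mbps * (1.0 - savings)

best = h265_mbps(H264_4K_MBPS, 0.50)   # 12.5 Mbps at 50% savings
worst = h265_mbps(H264_4K_MBPS, 0.30)  # 17.5 Mbps at 30% savings
print(f"h.265 4k estimate: {best:.1f}-{worst:.1f} Mbps")
```

Even the best case sits well above the bitrates most homes stream comfortably today, which is why compression alone doesn't solve the problem.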
One might also ask: what about my Xbox, PS4, or Roku, can't I view 4k on those devices? Again, we run into an obstacle. Currently, most shipping consoles and set-top boxes have HDMI 1.4 ports, which are technically capable of delivering 4k resolutions but don't have the bandwidth to support them at high frame rates. To achieve the true promise of 4k, the console and the connected television and/or receiver will all need to support HDMI 2.0. To my knowledge, it has not yet been announced whether the existing game consoles and set-top boxes will be firmware upgradable to HDMI 2.0. Televisions, on the other hand, are starting to ship with HDMI 2.0 today. There is a great CNET article by Geoffrey Morrison that captures a snapshot of where the major manufacturers sit with HDMI 2.0 support. Regardless, Netflix is limiting its 4k content to UHD televisions with the built-in Netflix app for now.

So let's assume we start adopting UHD televisions, and the consoles and set-top boxes get upgraded to support HDMI 2.0 and h.265 decoding, or the industry chooses to embrace a technology like Beamr's to make encoding more efficient… now can we watch our 4k OTT content? We'll be much closer, but the fact remains that even at the bandwidths of more efficient h.265 encoding, 4k is a big data hog. When adoption starts to reach a tipping point, Content Delivery Networks (CDNs) and ISPs will feel the congestion on their networks. CDNs are already exploring ways to deliver content to homes more efficiently. According to Tom Leighton, CEO of Akamai (one of the world's largest CDNs), speaking on CNET's For the Record podcast, the company has been exploring many techniques for tackling this problem, including broader use of multicast (where applicable), peer-to-peer and client-assisted delivery, and more. In fact, the company already has a client product called NetSession that aims to create download efficiencies.
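The common thread in all of these techniques is moving content closer to the viewer so fewer bytes travel the long haul. Here's a minimal sketch of that edge-caching idea (the names and data are hypothetical illustration, not any vendor's actual API):

```python
def serve(title, edge_cache, origin):
    """Return (source, content), preferring a cache inside the ISP's network."""
    if title in edge_cache:
        return "edge", edge_cache[title]  # short hop; cheap for everyone
    content = origin[title]               # long haul back to the content provider
    edge_cache[title] = content           # warm the cache for the next viewer
    return "origin", content

origin = {"show-s01e01": b"<4k stream>"}
cache = {}
print(serve("show-s01e01", cache, origin)[0])  # origin (cold cache)
print(serve("show-s01e01", cache, origin)[0])  # edge (warm cache)
```

Once one viewer in a neighborhood has pulled a popular title, everyone after them can be served from nearby, which is exactly the economics a 4k-sized stream demands.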
Netflix has sought to improve the situation on its own behalf as well. Today, Netflix offers ISPs a cache server product called "Open Connect" that essentially allows an ISP to put a Netflix cache server inside its own network infrastructure. This means end users don't have to source media content all the way back from Netflix; instead, they can get it from within the walls of the ISP, where bandwidth is less constrained. Netflix even rates how well different ISPs do at delivering its media, likely in an effort to publicly shame ISPs into incorporating these cache servers to improve overall performance and lower the cost of Netflix content delivery. While this approach is good for Netflix and for Netflix customers, it's really not scalable to build unique caching methodologies for every content provider. I believe CDNs, like Akamai, will start to build similar solutions into ISPs and potentially even incorporate technology into televisions, devices, and set-top boxes to aid in end-to-end delivery. By going this route, most or all content providers will be served, rather than just a few top players like Netflix. Just as we see Dolby, DTS, and other monikers on our electronics today, perhaps we will someday see references to delivery brands that give consumers confidence they will spend less time in buffering states. Is it time for Akamai Inside?

In addition to all of the above-mentioned obstacles that will hamper rapid 4k adoption, you also can't overlook the industry's temptation to take shortcuts in the interim. One such shortcut will come in the form of Dolby Vision. With Dolby Vision, Dolby intends to enhance the richness of video in our media experiences the same way it did with audio. Dolby Vision increases brightness levels by 40 times over conventional television, expands the color depth, and enhances contrast in ways that create a dramatic perceptual difference in image quality.
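There's some simple geometry behind betting on richer pixels instead of more of them. Using the common rule of thumb that 20/20 vision resolves detail down to about one arcminute, you can estimate the distance beyond which individual pixels blur together. This is a back-of-envelope sketch, and the 50-inch display is just an illustration:

```python
import math

ARCMINUTE = math.radians(1 / 60)  # ~0.00029 rad, the 20/20 acuity rule of thumb

def blend_distance_ft(diagonal_in, vertical_pixels, aspect=(16, 9)):
    """Distance beyond which one pixel subtends less than one arcminute."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)  # screen height from diagonal
    pixel_in = height_in / vertical_pixels          # height of a single pixel
    return (pixel_in / ARCMINUTE) / 12.0            # small-angle approx., in feet

for rows in (1080, 2160):
    print(f'50" {rows}p: pixels blend beyond ~{blend_distance_ft(50, rows):.1f} ft')
```

By this rule of thumb, a viewer sitting more than about six and a half feet from a 50-inch set can't resolve 1080p pixels in the first place, so quadrupling the pixel count buys nothing the eye can see, while brighter, richer pixels remain visible at any distance.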
Dolby is a strong advocate of the notion that better pixels beat more pixels. In the short term, content creators and broadcasters may choose to make their content richer by leveraging Dolby Vision rather than trying to jump straight into higher resolutions. Research in human visual acuity (above) suggests that this is a smart strategy, because improved color volume and dynamic range are shown to have a higher impact on how we perceive the quality of an image than resolution improvements do. In fact, depending on the size of the display and the distance at which the viewer is sitting, the argument could be made that a jump in pixel count may make no perceptible difference for many viewers (click here for a great resource on resolution, display size, and viewing distance, which is also the source of the graphic to the left).

All of this suggests that we are a few years away from 4k media consumption being the norm for a significant portion of media audiences, but it will eventually emerge. Ironically, one of the areas where we may see 4k most prevalent in the near term is the user-generated space. Tablets and phones are quickly beginning to support 4k photography and video recording. That type of content is often created and viewed locally, bypassing many of the issues presented above. When it's not, it's usually short-form and relevant only to small audiences, making it a fairly light load on network resources. This means many people may find themselves viewing their home videos and YouTube channels in 4k while they wait for the broadcasters and delivery systems to catch up. How long do you think it will take for mainstream adoption of 4k media consumption? Leave a comment and let me know your thoughts.