Tag: HTPC

Gabriel Dusil • Information Technology • 3 • Oak Case Mod™

16.Jul.3 - Prague · Oak Case Mod™ (office desk, 2, smaller)

https://youtu.be/hBY39kyss14

 20 minutes 5 seconds

• Creatively Inspiring Technology

  • This corporate server is the main workstation for Euro Tech Startups s.r.o. It’s mainly used to render client projects for 3D Motion Graphics. All data is completely mirrored to ensure high availability, and the motherboard’s RAID controller is used for improved data throughput. This computer is also used for photo restoration services: restored photos are saved as uncompressed TIFFs, and the process is recorded with video capture software.

• Build

  • The first day was focused mainly on preparing the workspace. Then I cut the side panels and filled the routed grooves left over from a previous HiFi project. Next came routing the 6mm grooves needed for the rails that hold the 20x 1TB hard drives.
  • The second day focused on drilling dowel holes, sanding the case, beveling the edges, and oiling the case. A Dremel was used to carve my signature into the front panel.
  • The third day was spent mounting the components, tie-wrapping the cables, and making general fixes. For example, I had mistakenly mounted the power supply upside down, so that needed to be corrected. Then the PC was booted for the first time.

• Snags

  • The first attempt at booting the case ran into a major panic attack. As I turned on the power there was a spark and a zap from the motherboard! A seminal WTF moment for an IT geek! Dread set in as the smell of burning plastic filled the air, and I watched a hint of smoke leave my motherboard. Sweat ran down my face: “WTF did I do wrong?!” I wasn’t sure what had burnt out, but I had seen the general location of the spark around the PCI-e slots. In any case, the PC still booted successfully. But over a series of subsequent boots the PC would POST, yet it wouldn’t make it past the Windows 10 spinning dots in the OS boot process. I tried countless restore points as well as switching my operating system SSD with my backup. Nothing worked. After six hours of no success I was exhausted and decided to call it a night.
  • As many PC geeks know too well – we hate going to bed when our PCs aren’t working. But I thought some rest might clear my mind and give me new ideas. That night I actually dreamt of a few solutions: strip the whole PC to its bare components; buy a new video card; or start all over again and buy a new motherboard, CPU and RAM. That last option I was hoping to avoid, since it meant scrapping a perfectly good Intel 4770 processor and 32GB of Corsair RAM, unless I could source a similar Z87 motherboard.
  • As I woke the next day I had the sinking realization that my PC was still broken. Rather than going through the arduous effort of stripping the whole computer, I decided to start with the easiest step: the video card. The Asus Z87 Deluxe has built-in video, so I removed the nVidia card and booted the machine using the onboard graphics. Success – wow! I then closely examined the PCI-e 90-degree 16x 2U riser and finally saw the culprit – a burnt capacitor! In the end, a $3 component was my problem. The boot process was getting stuck because the system couldn’t communicate with the video card once it entered a higher graphics state and tried to draw more data.

16.May.11 - Prague · PC Case Oak Mod (90° Graphics riser card, 5, smaller)

 

• Inventory

  • 1x Motherboard • Asus Z87 Deluxe
  • 2x RAM • 8GB Corsair Vengeance
  • 2x RAM • 8GB Corsair Dominator
  • 1x CPU • Intel i7-4770K processor
  • 1x CPU closed-loop water cooler • Corsair Cooling Hydro Series H100i GTX
  • 1x Graphics card • Gigabyte Windforce nVidia GTX 770
  • 1x PCI-e 16x Riser • 90° 2U (to turn the graphics card 90°, so that it is oriented flat against the case)
  • 1x Power Supply • Corsair HX1000i
  • 4x SATA controllers • HighPoint Rocket 620
  • 20x Hard Drive fans • Scythe Ita Kaze
  • 1x Hard Drive • Samsung 1TB 850 Evo SSD (for the operating system)
  • 1x Hard Drive • Western Digital Green WD60EZRX, 6TB, 64MB Cache, SATA 6.0Gb/s, 3.5″, 7200 RPM (for 3D motion graphics)
  • 19x Hard Drive • Seagate 1TB, 3.5″ Internal SATA, 6.0Gb/s, 64MB Cache, 7200 RPM. These are refurbished drives to keep the price low and to test the overall concept of the multi-drive design (used for client projects, mirrored backups, and personal files)
  • 2x Wood Panels • 50cm x 50cm x 4cm Oak
  • 5x Wood Brackets • 42mm diameter x 18cm Indonesian Tamarind
  • 160x Brass Spacers • 6mm diameter x 10mm (for spacing the hard drives between the wooden panels)
  • 80x Brass Washers • 15mm diameter (between the brass spacers)

Oak Case Mod™

If you missed the first part of this project, you can find the link here:

If you like this post, please leave a comment below.

Gabriel Dusil • Information Technology • 2 • Oak Case Mod™

16.May.7 - Prague · Oak Case Mod™ (2, smaller)

 

• Design • Form

• For years I struggled with traditional PC cases. I never had enough room at the back to plug in my devices. Under my desk the tower was cramped, and every time I needed to service a component my heart sank at the thought of going back there. I usually couldn’t move the PC because the abundance of cables prevented any rotating or tilting. I hated plugging in SATA drives or USB devices when I couldn’t see the port and had to guess its orientation.

• Design • Function

• A compact design was important. Among consumer and industrial PC cases it’s rare to find one that holds more than 8 hard drives. This case needed to hold 20 hard drives within the confines of a standard PC tower. The case would also only accommodate modern components, so there was no need for a DVD or BluRay drive.

• No screws were used in the core construction of the Oak Case Mod™. The main structural dowels are hidden from view, and the only hole necessary is for routing cables. I was inspired by the design principles of wood craftsmen who build their art so that the structure holds together by the design itself. In this way it can easily be taken apart without any tools.

• All ports needed to be easily accessible, including USB, SATA, Ethernet, audio, and power, as well as internal cabling, the hard drives, and the motherboard. For this reason, the entire motherboard is rotated 180° from traditional case designs, so that all ports face the front.

• Design • Quiet

• The case design needed to be silent and cool. At a room temperature of 22°C, all component temperatures are continuously monitored using Corsair Link:

  1. Hard drives • This open concept allows all components to ‘breathe’. Each hard drive has its own fan – a Scythe Ita Kaze spinning at 1,000rpm (rated at 14.5 dBA). Temperatures range between 30°C and 44°C.
  2. Graphics Processing Unit (GPU) • A Gigabyte Windforce GTX 770 video card is used, with fans spinning at 700rpm to 800rpm and maintaining a GPU temperature of 28°C to 40°C. During Cinema 4D rendering, the GPU fans spin at 1,200rpm to 1,400rpm with temperatures between 50°C and 52°C.
  3. Power supply • A Corsair HX1000i is the PSU, and it’s completely silent. The fan does not spin under standard working conditions, nor when the CPU is running at 100% utilization while encoding 3D motion graphics or video.
  4. Motherboard • Temperatures range between 26°C and 40°C. Under 100% load the motherboard ranges between 28°C and 60°C.
  5. CPU cooling • This design accommodates a double-length 280mm CPU closed-loop water-cooling radiator, ensuring that CPU cooling is maximized (without venturing into custom water-cooling territory). The Corsair Cooling Hydro Series H100i GTX is a CPU water cooler with its fans spinning between 1,000rpm and 1,300rpm. They maintain a CPU temperature between 36°C and 40°C (across all four cores) under normal working conditions. Under 100% load, CPU temperatures range between 58°C and 64°C with the Hydro fan speeds at 1,100rpm to 2,000rpm.

• Design • Beautiful

• The choice of oak as the base material was inspired by my new company, Euro Tech Startups, whose motto is Creatively Inspiring Technology. I wanted to combine beauty and technological excellence in this Oak Case Mod™. The case also aesthetically matches my office desk, built out of 4cm-thick solid walnut, and follows the same design principles. It’s all about merging art and technology.

Oak Case Mod™

If you missed the first part of this project, you can find the link here:

If you like this post, please leave a comment below.

Gabriel Dusil • Information Technology • 1 • Oak Case Mod™

16.May.7 - Prague · Oak Case Mod™ (1, smaller)

 

https://youtu.be/GH7oK7JU7Rc

 1 minute 3 seconds

• Introduction

  • I’ve been building custom PCs for over twenty years. This iteration began back in 2003 as a Home Theatre PC (HTPC) project, and has evolved significantly. It’s now my work computer, home PC, and entertainment center. Over the years all components have been upgraded again and again. Last year I decided to do away with the computer case completely and lay out all components in an open concept. This was mainly to improve access to every component and to keep them as cool as possible. I found that the constraints of a standard PC tower tended to act as a convection oven, regardless of how well I tweaked the fans. In an open concept the hard drives and motherboard have easy access to the natural flow of ambient air.
  • But it quickly turned into a mess. Cables were everywhere and it was embarrassing to look at. Friends and family would visit and basically ask, “What is that monstrosity?” So I began to look around the Internet for a suitable PC case. One approach was an industrial or rack-mount solution, but that added new complications in space, noise and aesthetics. There was nothing practical that accommodated at least 20 hard drives – that was my gating factor. So I decided to build one instead. I wanted to demonstrate how far a standard PC and motherboard could go – the ability to accommodate an enterprise-level storage system in the size of a desktop tower. Even though I use 1TB hard drives, it was important in this experiment to prove that storage scalability can be built at a consumer level, and still look pretty.

• Production

  • This is the first part in a pair of blog posts that outline my design goals, and the entire process leading to the final product. I hope it inspires other builders with new and innovative designs. Enjoy!
  • In this motion graphic we animated the end-to-end production of the Oak Case Mod™. It demonstrates that motion graphics can be used not only for precise, photo-realistic 3D modelling, but also to animate every element of the creative process into a wicked video.

• Contact Us

  • Do you want a 3D model of your new idea to test your design and validate your specifications?
  • Are you interested in a sensational 3D video to showcase your products?
  • Would you like to use 3D motion graphics to animate the unique design of your product?

If you are interested in creating your own 3D product animation, contact us today for a quote: g@eurostartups.tech.

If you like this post, please leave a comment below.


• Additional Projects

Here are quick links to some of our other videos and motion graphics projects:

OTT & Multiscreen • Digital Video Series • 3 • Benchmarking the H.265 Video Experience

Graphic - Benchmarking the H.265 Video Experience (title)
Creating a compelling and engaging video experience has been an ongoing mission for content owners and distributors: from the introduction of CinemaScope[1] in 1953 to stifle the onslaught of color TV[2], to 3D films[3] in the 50’s and 80’s and their re-introduction in 2009 with the launch of Avatar[4], to 4K Ultra High Definition (UHD[5]) TV and retina[6]-quality video. In every era, gauging video quality has been a subjective exercise for consumers and experts alike.

 Graphic - Benchmarking the Video Experience (i. Calculating Qf)

Figure i – Visual Representation of calculating Qf

Beyond the signal to noise ratio (SNR[7]) measurement used to compare different compression ratios or codecs, in many cases only a trained eye would notice errors such as compression artifacts[8], screen tearing[9], or telecine judder[10] – unless they were persistent.

A modest metric to assess a video file’s compression density is the Quality factor (Qf[11]). In fact, the name is misleading, since it is not actually a measure of quality but an indication of video compression using three parameters: bitrate, the number of pixels in the frame, and the overall frame rate of the video. Qf is essentially a measure of “the amount of data allocated to each pixel in the video”[12]. This metric doesn’t take into account the type of compression profile used, the number of passes originally utilized in the encoding process[13], or any tweaks implemented by the encoding engineer to optimize the video quality. So Qf, or compression density, is just a baseline guide for an administrator who is responsible for transcoding or managing large video libraries.
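As a rough illustration of the metric, Qf can be computed directly from those three parameters. Below is a minimal sketch in Python; the function name and the sample numbers (a 1080p encode at 24fps with a ~16.5 Mbps video bitrate) are my own, chosen only to land near the MPEG2 row of the accompanying table.

```python
def quality_factor(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Qf = bits allocated per pixel per frame: bitrate / (pixels * frame rate)."""
    return bitrate_bps / (width * height * fps)

# Hypothetical 1080p example at 24 fps with a ~16.5 Mbps video bitrate
qf = quality_factor(16_500_000, 1920, 1080, 24)
print(f"Qf = {qf:.2f}")  # ~0.33, in line with the MPEG2 1080p figure quoted below
```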

The accompanying table shows a comparison of Qf using nominal figures for DVD, Blu-Ray and the recently ratified H.265 codec (aka High Efficiency Video Coding, HEVC[14]). As the compression standard used to encode the video improves, the Qf needed for the same perceived quality drops.

Although Qf may be considered an inaccurate measure of video compression quality, it becomes valuable during the video encoding[15] or transcoding[16] stage – especially when multiple videos need to be processed and an administrator wants consistency in the profile used and all related sub-parameters. Choosing a single Qf in this case will ensure global uniformity of compression density across the entire library. There are several internet forum discussions on the optimum quality that should be used for encoding (or an ideal Qf). Realistically, every video has its own unique and optimum settings, and finding this balance for each individual video would be impractical. For this reason, grouping video libraries by genre or content type, then using one Qf for each group, is a more reasonable compromise. For instance, corporate presentations, newscasts, medical procedures – basically any type of recording with a lot of static images – could be compressed with the same Qf. The corresponding file for these videos could be as small as 1/20th the size of a typical Blu-Ray movie, with no perceivable loss in video quality.
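Turning the metric around, an administrator who has settled on a Qf for a content group can derive the target bitrate (and a rough video-only file size) to hand to the transcoder. The sketch below assumes hypothetical Qf values for two groups and a 1080p, 24fps output; none of these figures are prescriptions from this article.

```python
def target_bitrate(qf: float, width: int, height: int, fps: float) -> float:
    """Invert Qf to get the video bitrate (bits per second) for a chosen compression density."""
    return qf * width * height * fps

def file_size_gb(bitrate_bps: float, duration_s: float) -> float:
    """Approximate video-only file size in gigabytes (ignores audio and container overhead)."""
    return bitrate_bps * duration_s / 8 / 1e9

# Hypothetical groups: static 'presentation' content vs. film-like content, 1080p at 24 fps
for group, qf in {"presentations": 0.02, "movies": 0.12}.items():
    bps = target_bitrate(qf, 1920, 1080, 24)
    print(f"{group}: {bps / 1e6:.1f} Mbps, ~{file_size_gb(bps, 2 * 3600):.1f} GB for 2 hours")
```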

Table I – Comparing Qf for MPEG2, H.264 & H.265[17]

As shown in the table, the Qf metric is useful in showing that a 1080p movie using the MPEG2 codec (aka H.262 under the ITU definition) at 16.7GB (Gigabytes[18]) of storage (with a Qf = 0.33) compares equally to 10GB using H.264 (Qf = 0.20). Or, in the case of H.265, a file size of 6GB (Qf = 0.12) again maintains the same quality. This is because each of these codecs significantly improves on the efficiency of the previous one, while maintaining the same level of perceived video quality.

Figure ii – Visual representation of Video Compression standards & relative bandwidth requirements[19]

Ascertaining a video’s compression density can be achieved using MediaInfo[20], an open-source software package. This utility is an excellent resource for determining the formatting and structure of a given video file. MediaInfo displays a plethora of metadata and related details of the media content in a well laid-out overview. This includes the granular structure of the audio, video, and subtitles of a movie. The layout of the data can even be customized using HTML, and entire directories can be exported as part of a media library workflow. It’s an indispensable resource for content owners and subscribers who are managing large multimedia databases.
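For scripted workflows, the same fields MediaInfo exposes can be read programmatically. The sketch below uses the pymediainfo Python bindings (a wrapper around the MediaInfo library; using it here is my own choice, not something prescribed above) to pull the video track’s bitrate, resolution and frame rate and compute Qf for a file.

```python
from pymediainfo import MediaInfo  # pip install pymediainfo (requires the MediaInfo library)

def video_qf(path: str):
    """Return Qf for the first video track, or None if the needed fields are missing."""
    for track in MediaInfo.parse(path).tracks:
        if (track.track_type == "Video" and track.bit_rate
                and track.width and track.height and track.frame_rate):
            return float(track.bit_rate) / (track.width * track.height * float(track.frame_rate))
    return None

print(video_qf("movie.mkv"))  # hypothetical file; e.g. ~0.20 for a typical H.264 Blu-Ray
```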

Figure iii – Snapshot of MediaInfo showing a video’s Structural Metadata

The H.264 codec (MPEG-4 AVC[21], alongside Microsoft’s comparable VC-1[22]) improved on the efficiency of the MPEG2[23] codec, developed in 1995, by around 40% to 50%. Although H.264 was created in 1998, it didn’t reach the mainstream until Blu-Ray was officially launched in 2006. The H.265 standard currently promises a similar 35% to 50% improvement in efficiency[24]. So where MPEG2 needs 10Mbps to transmit a video, an H.264 codec could send the same file at the same quality at 6Mbps, and H.265 can achieve the same at 3.6Mbps. The trade-off in using H.265 is two to ten times higher computational power over H.264 for encoding, so expect video encoding to take up to ten times longer on today’s processors. Thankfully, devices will need only a two- to three-fold increase in CPU strength to decode the video.

The new H.265 standard ushers in multiple levels of cost savings. At the storage level, cost savings of 40% would be significant for video libraries hosted in any cloud. Content hosting facilities and CDNs (content delivery networks[25]) are a costly endeavor for many clients at the moment. It may be argued that storage is a commodity, but when media libraries are measured in Petabytes[26], the capital cost savings from newer and more efficient codecs help the bottom line. Bandwidth costs will also play an important role in further savings. Many online video platforms charge subscribers for the number of gigabytes leaving their facilities, and halving those costs by using H.265 would have a significant impact on monthly operational costs. On the flip side, video processing costs will increase in the short term, due to the stronger and more expensive CPU power needed at both the encoding and decoding stages. Existing hardware will likely be used to encode H.265 in the short term, at the expense of time, but dedicated hardware will be needed for any extensive transcoding exercises or real-time transcoding services.

Subscription-based internet services significantly compress their video content compared to their Blu-Ray counterparts. It’s a practical trade-off between video quality and bandwidth savings. But video quality only becomes a factor on certain consumer devices that can reveal the deficiencies of a highly compressed video. For example, a 60” (inches diagonal) plasma screen has the resolution to reveal a codec’s compression artifacts, but on a TV of less than 40” these artifacts would be hardly noticeable to the average consumer. For the most part, a 1080p title is barely distinguishable in quality from 720p on even a medium-sized television. Likewise, for many viewers watching on the majority of mobile devices, high-resolution content is both overkill and costly.

Subscribers with bandwidth caps are charged for all streaming data reaching their smartphone, whether they experience the highest-quality video or not. Any video data sent beyond the capability of a consumer device is a waste of money.

Graphic - Benchmarking the Video Experience (iii.a. If H.265 lives up to its hype, then it is destined to be the de facto encoding standard for digital video)

At the moment, high-definition video playback on mobile devices still poses a challenge. Thanks to multi-core processing on smartphones, consumers are on the brink of having enough power to play full HD video while running other processor-intensive tasks in the background. Although quad-core[28] processors such as the Cortex A15 from ARM[29] and nVidia’s Tegra 4[30] (also based on the ARM architecture) have the ability to play some 1080p video, they will still struggle to play a wide library of full HD content without requiring some level of transcoding to lower profiles. 2013 is ushering in a wide range of handsets claiming 1080p support from HTC, Huawei, Sony, Samsung, and ZTE[31]. Multicore GPUs and CPUs running at ultra-low power are establishing mobile devices as a viable platform for 1080p.

In the meantime, the resilience of H.264 and H.265 lies in their use of encoding profiles (eg. baseline, main, or high, and all associated sub-levels). The use of different profiles ensures that the best quality video experience is delivered within the limitations of the device playing the video. Low profiles such as baseline require minimal processing power but do not compress the video efficiently. High-profile modes are highly efficient and squeeze the video file size as small as possible; bandwidth is used efficiently, but decoding requires more processing power on the end device. Although the latest Apple iOS[32] devices support high profile, most smartphones still use lower profiles to ensure wider device compatibility. In the interim, internet video providers continue to encode titles into multiple profiles to suit a wide range of subscriber devices, accommodate their limitations in decoding capabilities, and maximize each individual viewing experience.
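To make the profile trade-off concrete, the sketch below drives ffmpeg’s libx264 encoder to produce three renditions at different profiles. The ffmpeg flags and profile names are standard, but the particular ladder (names, heights, bitrates) is purely illustrative and not taken from this article.

```python
import subprocess

# Illustrative renditions: low-power devices get 'baseline', capable devices get 'high'
RENDITIONS = [
    {"name": "mobile",  "profile": "baseline", "height": 360,  "bitrate": "800k"},
    {"name": "tablet",  "profile": "main",     "height": 720,  "bitrate": "2500k"},
    {"name": "desktop", "profile": "high",     "height": 1080, "bitrate": "5000k"},
]

def encode(src: str) -> None:
    for r in RENDITIONS:
        subprocess.run([
            "ffmpeg", "-y", "-i", src,
            "-c:v", "libx264",
            "-profile:v", r["profile"],        # trade decoder capability for compression efficiency
            "-vf", f"scale=-2:{r['height']}",  # keep aspect ratio, force an even width
            "-b:v", r["bitrate"],
            "-c:a", "aac", "-b:a", "128k",
            f"{r['name']}.mp4",
        ], check=True)

encode("master.mov")  # hypothetical mezzanine source file
```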

Higher profiles in H.265 will also have an effect on consumer electronics (CE[33]) equipment. Current iterations of these appliances are not equipped to handle the processing demands of H.265. The next generation of Home Theater PC (HTPC[34]), Set Top Box (STB[35]), or Media Player[36] will require upgrades to their processing engines to accommodate these next-generation codecs. Lab testing is still required to show that next-generation processors will have the ability to decode H.265 at higher bit depths (eg. 10-bit) and at resolutions as high as 4K. Some estimates state that 4K using H.265 will require 80 times more horsepower compared to HD using H.264[45].

To further compensate for the vast differences in mobile coverage and best-effort internet communications, Over the Top (OTT)[37] providers and Online Video Providers (OVP)[38] are offering advanced video optimization features such as Adaptive Bitrate Streaming (ABS)[39], which optimizes the video quality sent in real time. Protocols such as Apple’s HLS[40] and, more recently, MPEG-DASH[41] have been developed to provide a universal approach to implementing adaptive bitrates.
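The adaptive part is ultimately expressed in a manifest that lists every rendition so the player can switch between them as throughput changes. Below is a sketch that emits a minimal HLS master playlist; the bandwidth/resolution ladder and variant URIs are assumed for illustration (MPEG-DASH expresses the same idea in an XML manifest).

```python
# Illustrative bitrate ladder: (peak bandwidth in bits/s, resolution, variant playlist URI)
LADDER = [
    (800_000,   "640x360",   "360p.m3u8"),
    (2_500_000, "1280x720",  "720p.m3u8"),
    (5_000_000, "1920x1080", "1080p.m3u8"),
]

def hls_master_playlist(ladder) -> str:
    """Build a minimal HLS master playlist listing each rendition."""
    lines = ["#EXTM3U"]
    for bandwidth, resolution, uri in ladder:
        lines.append(f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={resolution}")
        lines.append(uri)
    return "\n".join(lines) + "\n"

print(hls_master_playlist(LADDER))  # the player picks the variant that matches current throughput
```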

The need for Adaptive Bitrate Streaming and related techniques is just a stop-gap. As quality of service improves and bandwidth speeds increase, the need for optimization techniques will diminish, and in some regions these techniques may disappear completely. Certainly, during the days of the analog modem, bandwidth was at a premium, so compression techniques and sophisticated error-correction methods were used to maximize data throughput while also saving costs on the last mile[42]. As bandwidth increased, these line-adaptation features were no longer deemed necessary. Similarly, the need for bandwidth optimization techniques will be diluted in regions where mobile 4G LTE[43] (Long-Term Evolution) becomes ubiquitous. Speeds will become so reliable that even the internet’s best-effort[44] delivery will be sufficient to deliver multiple 4K videos, in real time, to any device.

Read Additional Articles in this Series

  • I. Consumption is Personal

    In the days of linear television, broadcasters had a difficult task in understanding their audience. Without a direct broadcasting and feedback mechanism like the Internet, gauging subscriber behavior was slow. Today, online video providers have the ability to conduct a one-to-one conversation with their audience. Viewing habits of consumers will continue to rapidly change in the next ten years. This will require changes in advertising expenditure and tactics.

    II. Granularity of Choice

    The evolution from traditional TV viewing to online video has been swift. This has significantly disrupted disc sales such as DVD and Blu-Ray, as well as cable and satellite TV subscriptions. With the newfound ability to consume content anytime, anywhere, and on any device, consumers are re-evaluating their spending habits. In this paper we will discuss these changes in buying behavior, and identify the turning point of these changes.

    III. Benchmarking the H.265 Video Experience

    Transcoding large video libraries is a time consuming and expensive process. Maintaining consistency in video quality helps to ensure that storage costs and bandwidth are used efficiently. It is also important for video administrators to understand the types of devices receiving the video so that subscribers can enjoy an optimal viewing experience. This paper discusses the differences in quality in popular video codecs, including the recently ratified H.265 specification.

    IV. Search & Discovery Is a Journey, not a Destination

    Television subscribers have come a long way from the days of channel hopping. The arduous days of struggling to find something entertaining to watch are now behind us. As consumers look to the future, the ability to search for related interests and discover new interests is now established as common practice. This paper discusses the challenges that search and discovery engines face in refining their services in order to serve a truly global audience.

    V. Multiscreen Solutions for the Digital Generation

    Broadcasting, as a whole, is becoming less about big powerful hardware and more about software and services. As these players move to online video services, subscribers will benefit from the breadth of content they provide. As the world’s video content moves online, solution providers will contribute to the success of Internet video deployments. Support for future technologies such as 4K video, advancements in behavioral analytics, and the accompanying processing and networking demands will follow. Migration to a multiscreen world requires thought leadership and forward-thinking partnerships to help clients keep pace with the rapid march of technology. This paper explores the challenges that solution providers will face in assisting curators of content to address their subscribers’ needs and changing market demands.

    VI. Building a Case for 4K, Ultra High Definition Video

    Ultra High Definition technology (UHD), or 4K, is the latest focus in the ecosystem of video consumption. For most consumers this advanced technology is considered out of their reach, if at all necessary. In actual fact, 4K is right around the corner and will be on consumer wish lists by the end of this decade. From movies filmed in 4K, to archive titles scanned in UHD, there is a tremendous library of content waiting to be released. Furthermore, today’s infrastructure is evolving and converging to meet the demands of 4K, including Internet bandwidth speeds, processing power, connectivity standards, and screen resolutions. This paper explores the next generation in video consumption and how 4K will stimulate the entertainment industry.

    VII. Are You Ready For Social TV?

    Social TV brings viewers to content via effective brand management and social networking. Users recommend content as they consume it, consumers actively follow what others are watching, and trends drive viewers to subject matters of related interests. The integration of Facebook, Twitter, Tumblr and other social networks has become a natural part of program creation and the engagement of the viewing community. Social networks create an environment where broadcasters have unlimited power to work with niche groups without geographic limits. The only limitations are those dictated by content owners and their associated content rights, as well as those entrenched in corporate culture who are preventing broadcasters from evolving into a New Media world.

    VIII. Turning Piratez into Consumers

    IX. Turning Piratez into Consumers, I

    IX. Turning Piratez into Consumers, II

    X. Turning Piratez into Consumers, III

    XI. Turning Piratez into Consumers, IV

    XII. Turning Piratez into Consumers, V

Content Protection is a risk-to-cost balance. At the moment, the cost of piracy is low and the risk is low. There are no silver bullets to solving piracy, but steps can be taken to reduce levels to something more acceptable. It is untrue that everyone who pirates would be unwilling to buy the product legally. It is equally evident that every pirated copy does not represent a lost sale. If the risk is too high and the cost is set correctly, then fewer people will steal content. This paper explores how piracy has evolved over the past decades, and investigates issues surrounding copyright infringement in the entertainment industry.

About the Author

Home - Signature, Gabriel Dusil ('12, shadow, teal)Gabriel Dusil was recently the Chief Marketing & Corporate Strategy Officer at Visual Unity, with a mandate to advance the company’s portfolio into next generation solutions and expand the company’s global presence. Before joining Visual Unity, Gabriel was the VP of Sales & Marketing at Cognitive Security, and Director of Alliances at SecureWorks, responsible for partners in Europe, Middle East, and Africa (EMEA). Previously, Gabriel worked at VeriSign & Motorola in a combination of senior marketing & sales roles. Gabriel obtained a degree in Engineering Physics from McMaster University, in Canada and has advanced knowledge in Online Video Solutions, Cloud Computing, Security as a Service (SaaS), Identity & Access Management (IAM), and Managed Security Services (MSS).

All Rights Reserved

©2013, All information in this document is the sole ownership of the author. This document and any of its parts should not be copied, stored in the document system or transferred in any way including, but not limited to electronic, mechanical, photographs, or any other record, or otherwise published or provided to the third party without previous express written consent of the author. Certain terms used in this document could be registered trademarks or business trademarks, which are in sole ownership of its owners.

References


[2] Color television, Wikipedia, http://en.wikipedia.org/wiki/Color_TV

[5] Ultra high definition television, Wikipedia, http://en.wikipedia.org/wiki/Ultra_High_Definition_Television

[8] Compression artifact, Wikipedia, http://en.wikipedia.org/wiki/Compression_artifact

[9] Screen tearing, Wikipedia, http://en.wikipedia.org/wiki/Video_tearing

[11] Originally used in Gordian Knot, http://sourceforge.net/projects/gordianknot/, an open-source project for encoding videos into DivX and XviD formats. This software is no longer being developed.

[12] “The Secret to Encoding High Quality Web Video: Tutorial”, by Jan Ozer, ReelSEO.com, http://www.reelseo.com/secret-encoding-web-video/

[13] With multi-pass encoding, the encoder becomes aware that some static parts of the video can be encoded with lower bitrates compared to complex scenes requiring higher bitrates. This knowledge encodes the video more efficiently, but requires higher processing resources and time to complete the task.

[14] High Efficiency Video Coding, Wikipedia, http://en.wikipedia.org/wiki/H.265

[15] Data compression – Video, Wikipedia, http://en.wikipedia.org/wiki/Video_encoding#Video

[17] This table shows the typical frame size for MPEG2, H.264 and H.265. For consistency and for the sake of comparison, a frame aspect ratio of 16:9 is shown. The CinemaScope frame size of 2.39:1 or 2.35:1 would further alter the figures. The table also does not take into account the audio channel, which roughly amounts to a 10% increase in bitrate and file size (when a similar-quality codec is used in each instance). Also not under consideration are pixel bit depths higher than 8, such as in professional video recording, or the common frame rates of 25, 29.97, 30 or 50fps.

[19] In the context of this article, ½HD is referred to as 1280×720 resolution.

[24] Studies have shown a 39-44% improvement in efficiency over H.264. Joint Collaborative Team on Video Coding (JCT-VC), “Comparison of Compression Performance of HEVC Working Draft 4 with AVC High Profile”

[25] Content Delivery Network, Wikipedia, http://en.wikipedia.org/wiki/Content_Delivery_Network

[27] “If H.265 lives up to its hype, then it is destined to be the de facto encoding standard for digital video.”

[28] Multi-core processor, Wikipedia, http://en.wikipedia.org/wiki/Multi-core_processor

[31] “Top 5 smartphones with 1080p displays”, by Jacqueline Seng, cnet, http://asia.cnet.com/top-5-smartphones-with-1080p-displays-62220194.htm

[33] Consumer electronics, Wikipedia, http://en.wikipedia.org/wiki/Consumer_electronics

[34] Home theater PC, Wikipedia, http://en.wikipedia.org/wiki/HTPC

[37] Over the Top content, Wikipedia, http://en.wikipedia.org/wiki/Over-the-top_content

[39] Adaptive bitrate streaming, Wikipedia, http://en.wikipedia.org/wiki/Adaptive_bitrate_streaming

[40] HTTP Live Streaming, Wikipedia, http://en.wikipedia.org/wiki/HTTP_Live_Streaming

[41] Dynamic Adaptive Streaming over HTTP, Wikipedia, http://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP

[44] Best-effort delivery, Wikipedia, http://en.wikipedia.org/wiki/Best-effort

[45] “HEVC Update, Beyond the Main Profile”, Matthew Goldman, Ericsson Television, 26th February, 2013