HAP works by applying a lightweight secondary compressor to standard compressed texture formats. Playback involves retrieving frames from the container (demuxing) and decompressing them to their compressed texture data (decoding), which is then uploaded directly to OpenGL or Direct3D.
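As a concrete sketch of that last step (assuming a HAP Q frame whose second-stage decompression has already produced DXT5/BC3 data, and an OpenGL context with the GL_EXT_texture_compression_s3tc extension), the compressed texture data is handed to the GPU as-is:

```c
#include <GL/gl.h>
#include <GL/glext.h> /* header names vary by platform */

/* Upload already-decoded HAP Q frame data to an OpenGL texture.
 * `dxtData` is the output of the HAP second-stage decode: DXT5/BC3
 * blocks, not raw pixels, so no pixel conversion happens here. */
void upload_hap_q_frame(GLuint texture, int width, int height,
                        const void *dxtData)
{
    /* DXT5 stores each 4x4 pixel block in 16 bytes. */
    GLsizei dataSize = ((width + 3) / 4) * ((height + 3) / 4) * 16;

    glBindTexture(GL_TEXTURE_2D, texture);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                           width, height, 0, dataSize, dxtData);
}
```

Plain HAP frames decode to DXT1/BC1 instead, uploaded with GL_COMPRESSED_RGB_S3TC_DXT1_EXT at 8 bytes per block.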
Below you will find information to help you add support for HAP to your own software. In many cases you can simply use an existing framework or library to enable both playback and encoding. For more information you can also visit the HAP project page on GitHub.
The following addons enable HAP support in popular creative coding environments:
If you are already using AVFoundation, FFmpeg, LibAV, DirectShow or QuickTime for movie playback or encoding, it is relatively easy to add native support for the HAP video codecs to your own applications. Additional information can be found in the sections below.
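For example, recent FFmpeg builds include a native HAP decoder, so checking for support can be as simple as the sketch below (public libavcodec API; note that FFmpeg's decoder also performs the DXT stage itself, returning ordinary pixel data rather than compressed texture data):

```c
#include <libavcodec/avcodec.h>
#include <stdio.h>

int main(void)
{
    /* Recent FFmpeg builds ship a native HAP decoder. */
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_HAP);
    if (!codec) {
        fprintf(stderr, "this FFmpeg build has no HAP decoder\n");
        return 1;
    }
    printf("found decoder: %s\n", codec->long_name);

    /* In a real player: demux with libavformat, copy the stream's
     * parameters with avcodec_parameters_to_context(), open the
     * context with avcodec_open2(), then feed packets through
     * avcodec_send_packet() / avcodec_receive_frame(). */
    return 0;
}
```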
The route you choose to achieve playback will depend on the platforms you target and your existing codebase:
The following fragment shader performs YCoCg conversion and can be used for drawing HAP Q frames:
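Below is a sketch of such a shader, assuming the scaled-YCoCg layout HAP Q frames use (Co and Cg in the red and green channels, a per-block scale factor in blue and luma in alpha of a DXT5 texture). It is given as a C string ready for glShaderSource(); the uniform name is illustrative:

```c
static const char *hap_q_fragment_shader =
    "uniform sampler2D cocgsy_src;\n"
    /* Co and Cg are stored biased by 128/255; the offsets undo that. */
    "const vec4 offsets = vec4(-0.50196078431373, -0.50196078431373, 0.0, 0.0);\n"
    "void main()\n"
    "{\n"
    "    /* x = Co, y = Cg, z = scale, w = Y */\n"
    "    vec4 CoCgSY = texture2D(cocgsy_src, gl_TexCoord[0].xy);\n"
    "    CoCgSY += offsets;\n"
    "    float scale = (CoCgSY.z * (255.0 / 8.0)) + 1.0;\n"
    "    float Co = CoCgSY.x / scale;\n"
    "    float Cg = CoCgSY.y / scale;\n"
    "    float Y  = CoCgSY.w;\n"
    "    gl_FragColor = vec4(Y + Co - Cg, Y + Cg, Y - Co - Cg, 1.0);\n"
    "}\n";
```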
The following fragment shader performs YCoCg conversion and combines the colour and alpha planes for HAP Q Alpha frames:
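A sketch of the two-plane variant follows; it assumes the alpha plane arrives as a separate single-channel (RGTC1/BC4) texture whose red channel carries alpha, and the uniform names are again illustrative:

```c
static const char *hap_q_alpha_fragment_shader =
    "uniform sampler2D cocgsy_src;\n"
    "uniform sampler2D alpha_src;\n"
    "const vec4 offsets = vec4(-0.50196078431373, -0.50196078431373, 0.0, 0.0);\n"
    "void main()\n"
    "{\n"
    "    /* Colour plane as for HAP Q, plus a separate alpha plane. */\n"
    "    vec4 CoCgSY = texture2D(cocgsy_src, gl_TexCoord[0].xy);\n"
    "    float alpha = texture2D(alpha_src, gl_TexCoord[0].xy).r;\n"
    "    CoCgSY += offsets;\n"
    "    float scale = (CoCgSY.z * (255.0 / 8.0)) + 1.0;\n"
    "    float Co = CoCgSY.x / scale;\n"
    "    float Cg = CoCgSY.y / scale;\n"
    "    float Y  = CoCgSY.w;\n"
    "    gl_FragColor = vec4(Y + Co - Cg, Y + Cg, Y - Co - Cg, alpha);\n"
    "}\n";
```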
For many applications, enabling playback is sufficient, and users can be directed to one of the available HAP codecs to encode their media.
If you want to perform encoding within your application, you must first compress your RGB(A) frames to the appropriate compressed texture format, and then encode that data into HAP frames.
A number of libraries are available to perform texture compression.
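As one example of that first step (a sketch, assuming the single-header stb_dxt library is used; width and height are assumed to be multiples of 4 for brevity), RGBA pixels can be compressed into the DXT1 blocks that HAP's second-stage compressor is then applied to:

```c
#include <string.h>

#define STB_DXT_IMPLEMENTATION
#include "stb_dxt.h" /* https://github.com/nothings/stb */

/* Compress an RGBA image to DXT1, one 4x4 block at a time. The output
 * buffer must hold (width / 4) * (height / 4) * 8 bytes. */
void compress_rgba_to_dxt1(const unsigned char *rgba, int width, int height,
                           unsigned char *dxt1_out)
{
    unsigned char block[4 * 4 * 4]; /* one 4x4 block of RGBA pixels */

    for (int by = 0; by < height; by += 4) {
        for (int bx = 0; bx < width; bx += 4) {
            /* Gather the block's four rows into contiguous memory. */
            for (int row = 0; row < 4; row++)
                memcpy(block + row * 16,
                       rgba + ((by + row) * width + bx) * 4, 16);

            stb_compress_dxt_block(dxt1_out, block, 0 /* no alpha */,
                                   STB_DXT_NORMAL);
            dxt1_out += 8; /* DXT1 blocks are 8 bytes */
        }
    }
}
```

The resulting texture data still needs to be wrapped as HAP frames (second-stage compression plus the HAP frame header); the reference hap.c library on the HAP GitHub page implements that step.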
The following resources may help:
If you have questions or problems, visit the GitHub issues page. The HAP codecs email list is used to make occasional announcements.