DirectShow filter built with Visual Studio 2019 Preview to run on Windows Server 2012

I pushed a few commits to my fork of DirectShow Win7 Samples (BaseClasses library specifically).

One of the small problems I happened to deal with is that a filter built with a current/recent toolset produces code incompatible with legacy operating systems, which are still widely present in the wild. This could be solved by using outdated versions of Visual Studio, an outdated Windows SDK etc. This is, however, not really necessary, because even Visual Studio 2019 Preview builds DirectShow code perfectly well (including with the v142 toolset), and you are generally not limited to the 30 year old codebase alone. I had a filter using Windows Implementation Libraries (WIL) helpers, C++17 code and C++/WinRT for the COM object implementation; however, a few rough places in BaseClasses resulted in a filter binary incompatible with the Windows Server 2012 runtime.

A bit of massaging of BaseClasses fixed the problem. I also enabled SDL checks (this made me fix something in the COutputQueue implementation – not really a bug, but the code could be more accurate) and got rid of strsafe.h in favor of safe CRT string functions.
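For context – and this is an assumed illustration, not the actual contents of the commits – targeting down-level systems from a current toolset typically includes pinning the Windows API surface the code compiles against, for example in a shared property sheet (Windows Server 2012 is NT version 6.2):

```xml
<!-- Hypothetical .vcxproj/.props fragment: pin the API surface to
     Windows 8 / Windows Server 2012 (NT 6.2) -->
<ItemDefinitionGroup>
  <ClCompile>
    <PreprocessorDefinitions>WINVER=0x0602;_WIN32_WINNT=0x0602;%(PreprocessorDefinitions)</PreprocessorDefinitions>
  </ClCompile>
</ItemDefinitionGroup>
```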

While doing that, I noticed that the ancient lstrcmpW API function is documented incorrectly on MSDN.

Internal E_UNEXPECTED in dxgi.dll

Someone asked a question on StackOverflow recently about suspicious debug output messages associated with DXGI/Direct3D initialization: DirectX12: dxgi dll catastrophic failure when creating IDXGIFactory.

onecore\windows\directx\database\helperlibrary\lib\perappusersettingsqueryimpl.cpp(121)\dxgi.dll!00007FFBA0D6D7F8: (caller: 00007FFBA0D4D167) ReturnHr(1) tid(64d8) 8000FFFF Catastrophic failure
onecore\windows\directx\database\helperlibrary\lib\perappusersettingsqueryimpl.cpp(98)\dxgi.dll!00007FFBA0D6D4D0: (caller: 00007FFBA0D3E221) ReturnHr(2) tid(64d8) 8000FFFF Catastrophic failure
onecore\windows\directx\database\helperlibrary\lib\directxdatabasehelper.cpp(999)\dxgi.dll!00007FFBA0D6D4FC: (caller: 00007FFBA0D3E221) ReturnHr(3) tid(64d8) 8000FFFF Catastrophic failure

This problem is not fatal or severe, but it is a long standing one, and Microsoft folks should look into it because – as the StackOverflow question suggests – it confuses people.

It is also a widespread one, and — for instance — it can be easily repro’d by one of the apps I posted earlier:

If you start the application in self-debugging mode with -Debug command line parameter, the debug output is redirected to console and those messages are immediately visible:

In the referenced StackOverflow answer I also advertise Microsoft's Windows Implementation Libraries (WIL), which I like and use myself where appropriate; I think it is a good piece of software, and an underrated one. No wonder it is used internally in the DXGI implementation.

DirectShow VCam source code

Introducing another popular DirectShow project: Vivek’s source filter which emulates a video capture device. For a long time the code was hosted on P “The March Hare” W’s website, which was eventually taken down.

I placed the recovered project, binaries, README and an upgrade to Visual Studio 2019 on GitHub: roman380/ – Vivek's original popular sample project implementing a virtual DirectShow video source filter.

The project itself is described in the repository, so I will not duplicate the text here.

See also: How to build and run Vivek’s Virtual Camera on Windows 10? on StackOverflow.

Windows SDK DirectShow Samples adapted for Visual Studio 2019

Over 20+ years there has been a steady flow of “how to build these projects” questions. Back in the day the problem was mostly about having exactly matching settings in the application/library projects and the mandatory dependent static library. At some point Microsoft abandoned the samples, then removed them from the SDK completely. Luckily, at some point the samples were returned to the public as “Win7Samples” under “Windows Classic Samples” published on GitHub.

The DirectShow samples there, however, remain in the state in which they were dropped years ago. Still functioning and in good standing, but not prepared for building out of the box. So the flow of “how to build” questions continues.

I made a fork of the repository (branch “directshow” on a fork of the Microsoft repository; “Samples/Win7Samples/multimedia/directshow” from the root of the repository) and upgraded a few of the most popular projects (including AmCap, PushSource, EzRGB24 and the beginner’s DShowPlayer application):

Windows-classic-samples/Samples/Win7Samples/multimedia/directshow at directshow · roman380/Windows-classic-samples

The code requires Microsoft Visual Studio 2019 (Community version is okay) and current Windows 10 SDK.

To start, clone the fork and locate the README in the directshow folder, then open the solution and build the code in Debug or Release configuration, for the Win32 or x64 platform.

Support for FLAC in ISO BMFF with MSE in StreamingServer application

The Chrome platform has supported FLAC in ISO BMFF (fragmented MP4) media since version 62 (October 2017); however, support for FLAC (and Opus) overall has still not become standard and comprehensive since then.

I hooked up the Microsoft FLAC Audio Encoder MFT into a media streaming application to produce such media and check browser compatibility.

  • /audio.mp4?flac – produces FLAC in ISO BMFF media on the fly, resulting in streamable media of the kind used in Google’s demo
  • /audio.mp4?flac&duration=50 – allows overriding the duration to generate longer content; there is no chunked HTTP transfer [yet], so the content needs to be fully generated before it is sent – beware of requesting really long files
  • /audio.mp4 – without the “flac” specifier, with or without duration, results in single AAC track media
  • /audio.mp4.html?flac – produces a wrapper HTML page offering playback of the media (see below)

The HTML repeats Google’s demo, feeding the FLAC MP4 audio into an HTML5 media element:

std::string Text;
Text.append(Format("<p>Audio MIME type: audio/mp4; codecs=\"%hs\"</p>", Codecs.c_str()));
Text.append("<audio controls></audio>");
Text.append("<script>");
Text.append(Format("const uri = 'audio.mp4%ls';", Request.GetQueryString().c_str()));
Text.append(Format("const mimeType = 'audio/mp4; codecs=\"%hs\"';", Codecs.c_str()));
auto constexpr g_Script = R"script(
    const audio = document.querySelector('audio');
    if (MediaSource.isTypeSupported(mimeType)) {
        const mediaSource = new MediaSource();
        audio.src = URL.createObjectURL(mediaSource);
        mediaSource.addEventListener('sourceopen', function () {
            const sourceBuffer = mediaSource.addSourceBuffer(mimeType);
            console.log('Fetching audio file...');
            fetch(uri)
                .then(response => response.arrayBuffer())
                .then(data => {
                    sourceBuffer.addEventListener('updateend', function () {
                        if (!sourceBuffer.updating && mediaSource.readyState === 'open') {
                            mediaSource.endOfStream();
                            console.log('Audio is ready to play!');
                        }
                    });
                    sourceBuffer.appendBuffer(data);
                });
        });
    } else {
        console.log('MIME type ' + mimeType + ' is not supported on this platform with MSE.');
    }
)script";
Text.append(g_Script);
Text.append("</script>");
Response->Initialize(HTTP_STATUS_OK, "OK");
Response->AddHeader(HttpHeaderContentType, "text/html; charset=utf-8");

The generated FLAC MP4 asset is playable not just in Chrome; it can also be played by:

  • new Edge (obviously, it’s Chromium based)
  • VLC player on Windows
  • MPC-HC player on Windows (libav backed)
  • Safari on macOS (but not on iOS, neither from HTML wrapper because of MSE absence nor directly)
  • Some Android phone (“Samsung Internet” browser? WTF; both directly and via MSE interface from HTML wrapper)
  • UWP MediaPlayerElement control

The ISO BMFF content is styled for low-latency progressive streaming and is simply concatenated into a complete file. For this reason the FLAC content can also be put into an adaptive bitrate streaming media asset such as HLS, and the application does that as well, but that deserves a separate blog post – and support for FLAC in HLS is not as good.

Download links


  • 64-bit: StreamingServer.exe (in .ZIP archive)
  • License: This software is free to use; builds have time based expiration

Reference HTTP Live Streaming (HLS) server application

StreamingServer is the application I am using as an internal testbed for various media processing and encoding primitives. As an application (or service) it is capable of streaming HLS assets, preparing them on the fly without the need to keep and host real media files. The functionality includes:

  1. Encodes and multiplexes ISO BMFF Byte Stream Format media segments with AAC audio and H.264/AVC, H.265/HEVC video, exposing them as HLS assets (see also RFC 8216 “HTTP Live Streaming”)
  2. Supports video only, audio only, video and audio assets
  3. Supports parts of ISO/IEC 23001-7 “Common encryption in ISO base media file format files” specification and implements ‘cenc’ and ‘cbcs’ encryption schemes with AES-CTR-128 and AES-CBC-128 encryption modes of operation respectively
  4. Implements encryption layouts as supported by Microsoft PlayReady DRM implementations, and specifically the Microsoft PlayReady sample
  5. Supports live HLS assets, including live finite and live infinite assets
  6. Encoding services are provided by underlying Media Foundation encoders; due to the state of Media Foundation and, specifically, the awful quality of third party vendor integrations, the application (a) might have issues with specific video cards, (b) implements built-in encoding based on the NVIDIA Video Codec SDK for NVIDIA GPUs, (c) offers a software only mode for GPU agnostic operation

The application assumes just one client, and its streaming services are, generally speaking, limited by a trivial HTTP serving loop. Still, multiple clients should be able to request data in parallel.

It is possible to dump produced responses as files for retrospective review. Unless responses are written to files, they are streamed in HTTP chunked mode at the lowest latency.

Quick start

Start the application with privilege elevation to enable its initialization of HTTP Server API services. Unless overridden with command line parameters, the application uses the first available DXGI device for hardware assisted video encoding and exposes its HTTP functionality under the http://localhost/hls base. Open http://localhost/hls/about to get up to date syntax for the command line and URIs, and to check the status of the application.

Problem resolution

The application is best suited for use with NVIDIA GPUs doing hardware H.264 video encoding. In the case of video encoding issues, it makes sense to start the application with the “-Software” switch to put it into software only mode: video frames will be generated by Direct2D into WIC bitmaps instead of DXGI and Direct3D 11 textures, and video encoders will use system memory backed Media Foundation media buffers and samples.

Download links


  • 64-bit: StreamingServer.exe (in .ZIP archive)
  • License: This software is free to use; builds have time based expiration

Microsoft HEVCVideoExtension software H.265/HEVC encoder

The engineering quality of Microsoft’s most recent work around Media Foundation is terrible. It surely passes some internal tests to make sure that software items meet the requirements of the use cases needed by internal products, but the published work gives the impression that there is no one left who cares about the API offering to the wider audience.

One new example of this is how the H.265/HEVC video encoder implemented by the respective Windows Store extension in mfh265enc.dll works.

I have been putting the component into an existing code base in order to extend it with reference software video encoding, now in H.265/HEVC format – hence the stock software encoder, regardless of its performance and quality metrics.

The encoder started giving nonsensical exceptions and errors, in particular rejecting obviously valid input. After sorting out a few things, I started seeing the MFT produce E_FAIL on the very first video frame it receives.

The suspected problem was (and there were not many other things left to suspect) that the output media type was set twice. Both calls were valid, with good arguments, and happened before any payload processing. The second call supplied the same media type, with EXACTLY the same attributes. Both media type setting calls succeeded. The whole media type setting story did not produce any errors at the stage of handling streaming start messages.

Still, the second call apparently ruined internal state because – and there can be no other explanation – of the shitty quality of the MFT itself.

A code fragment that discards the second media type setting call at the wrapping level gets the MFT back to processing. What can I say…